CN113739696B - Coordinate measuring machine with vision probe for performing point self-focusing type measuring operation - Google Patents


Info

Publication number: CN113739696B
Application number: CN202110574684.6A
Authority: CN (China)
Prior art keywords: axis, image, vision probe, image stack, workpiece
Inventor: T.M.埃尔斯
Assignee: Mitutoyo Corp
Legal status: Active
Other versions: CN113739696A (Chinese)
Publication of application: CN113739696A
Publication of grant: CN113739696B


Classifications

    • G01B11/005 — Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates: coordinate measuring machines
    • G01B11/007 — Coordinate measuring machines: feeler heads therefor
    • G01B11/0608 — Height gauges
    • G01B11/24 — Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G06T1/0014 — Image feed-back for automatic industrial control, e.g. robot with camera
    • G06T7/571 — Depth or shape recovery from multiple images from focus
    • G06T2207/10148 — Varying focus (special mode during image acquisition)
    • G06T2207/30164 — Workpiece; Machine component (industrial image inspection)
    • H04N23/695 — Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects


Abstract

A coordinate measuring machine (CMM) system is provided that utilizes a vision probe (e.g., for performing operations for determining and/or measuring a surface profile of a workpiece, etc.). The angular orientation of the vision probe may be adjusted using a rotation mechanism such that the optical axis of the vision probe is directed toward an angled surface of the workpiece (e.g., in some embodiments the optical axis may be approximately perpendicular to the angled workpiece surface). The x-axis, y-axis, and z-axis slide mechanisms (e.g., moving in mutually orthogonal directions) may together move the vision probe along an image stack acquisition axis (which may substantially coincide with the optical axis) to acquisition positions to acquire an image stack of the angled workpiece surface. Focus curve data, indicating the 3-dimensional positions of surface points on the angled surface of the workpiece, may be determined from an analysis of the image stack.

Description

Coordinate measuring machine with vision probe for performing point self-focusing type measuring operation
Technical Field
The present disclosure relates to precision metrology, and more particularly to a coordinate measuring machine having a movement mechanism that is capable of moving a measurement probe along multiple axes and at a desired angle/direction relative to a workpiece surface.
Background
Coordinate measuring machines (CMMs) are generally known that include a probe, a movement mechanism, and a controller. The probe may be a tactile measurement probe having a probe tip that physically contacts a workpiece (i.e., an object) to be measured. Some examples of tactile probes include contact probes and scanning probes (e.g., a probe tip that is positioned to contact and slide along a surface of a workpiece in order to "scan" it). In operation, the movement mechanism of the CMM holds and moves the probe, and the controller controls the movement mechanism. The movement mechanism typically enables the probe to move in mutually orthogonal X, Y, and Z directions.
An exemplary CMM is disclosed in U.S. Patent No. 7,660,688, which is incorporated herein by reference in its entirety. As described therein, a CMM with a movement mechanism moves the contact point of a tactile scanning probe along the surface of the workpiece. During the movement, the probe is pressed against the workpiece, displacements of the movement mechanism and the probe are obtained, and the CMM then combines the displacements to detect the position of the contact point (a measurement value), thereby measuring/determining the surface profile of the workpiece based on the detected surface points.
While such CMMs with tactile probes enable measurement of the surface profile of a workpiece, such processes still present certain limitations (e.g., related to the amount of time required for the process, the physical contact of the probe tip with the workpiece, etc.). Techniques that could improve or otherwise enhance the use of a CMM to measure or otherwise determine the surface profile of a workpiece would be desirable.
Disclosure of Invention
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
According to one aspect, a coordinate measuring machine system is provided that includes a vision probe, a slide mechanism configuration, a rotation mechanism, one or more processors, and a memory coupled to the one or more processors. The vision probe includes a light source and an objective lens that inputs image light arising from a workpiece surface illuminated by the light source and transmits the image light along an imaging optical path, wherein the objective lens defines an optical axis of the vision probe that extends at least between the objective lens and the workpiece surface. The vision probe also includes a camera that receives the image light transmitted along the imaging optical path and provides an image of the workpiece surface. The slide mechanism configuration includes an x-axis slide mechanism, a y-axis slide mechanism, and a z-axis slide mechanism, each configured to move the vision probe in mutually orthogonal x-axis, y-axis, and z-axis directions, respectively, within a machine coordinate system. The rotation mechanism is coupled between the z-axis slide mechanism and the vision probe and is configured to rotate the vision probe to different angular orientations relative to the z-axis of the machine coordinate system. The memory stores program instructions that, when executed by the one or more processors, cause the one or more processors to perform the following:
adjusting the direction of the vision probe using the rotation mechanism such that the optical axis of the vision probe is directed toward the surface of the workpiece, wherein the optical axis of the vision probe is not parallel to the z-axis of the machine coordinate system and corresponds to an image stack acquisition axis;
acquiring an image stack comprising a plurality of images, each image corresponding to a focal position of the vision probe along the image stack acquisition axis; and
determining focus curve data based at least in part on an analysis of the images of the image stack, wherein the focus curve data is indicative of the 3-dimensional locations of a plurality of surface points on the workpiece surface.
Further, the above-mentioned acquiring of the image stack includes:
adjusting the plurality of slide mechanisms to move the vision probe along the image stack acquisition axis from a first image acquisition position to a second image acquisition position, wherein the vision probe acquires a first image and a second image of the plurality of images at the first and second image acquisition positions, respectively; and
adjusting the plurality of slide mechanisms to move the vision probe, also along the image stack acquisition axis, from the second image acquisition position to a third image acquisition position, wherein the vision probe acquires a third image of the plurality of images at the third image acquisition position.
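The combined slide movements described above, in which the x-, y-, and z-axis slide mechanisms move together so that the probe translates along an image stack acquisition axis that is not parallel to the machine z-axis, can be illustrated with a minimal sketch. The function name and parameterization below are illustrative assumptions, not part of the patent:

```python
import math

def acquisition_positions(start_xyz, axis_dir, step, n_images):
    """Illustrative sketch: compute the sequence of machine-coordinate
    positions for an image stack acquired along a tilted axis.

    start_xyz : (x, y, z) of the first image acquisition position (MCS).
    axis_dir  : direction of the image stack acquisition axis (need not
                be parallel to the machine z-axis).
    step      : spacing between focal positions along the axis.
    n_images  : number of images in the stack.
    """
    norm = math.sqrt(sum(c * c for c in axis_dir))
    d = [c / norm for c in axis_dir]   # unit vector along the acquisition axis
    # Reaching each position requires a combined x/y/z slide movement,
    # since no single slide alone can follow a tilted axis.
    return [tuple(s + i * step * c for s, c in zip(start_xyz, d))
            for i in range(n_images)]

# Example: axis tilted 45 degrees in the x-z plane, 0.1 mm spacing.
positions = acquisition_positions((10.0, 0.0, 50.0), (1.0, 0.0, -1.0), 0.1, 3)
```

Each returned tuple would then be issued as a simultaneous x/y/z slide target, so the probe steps along the tilted axis between image exposures.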
According to another aspect, a method of measuring a surface of a workpiece is provided. The method generally comprises four steps:
operating a coordinate measuring machine system, the system comprising: (i) A vision probe configured to image a surface of a workpiece based on image light transmitted along an optical axis of the vision probe; (ii) A slide mechanism arrangement comprising an x-axis slide mechanism, a y-axis slide mechanism, and a z-axis slide mechanism, each configured to move the vision probe in mutually orthogonal x-axis, y-axis, and z-axis directions, respectively, within a machine coordinate system; and (iii) a rotation mechanism coupled between the z-axis slide mechanism and the vision probe and configured to rotate the vision probe to different angular orientations relative to a z-axis of the machine coordinate system;
adjusting the direction of the vision probe using the rotation mechanism such that an optical axis of the vision probe is directed toward the surface of the workpiece, wherein the optical axis of the vision probe is not parallel to a z-axis of the machine coordinate system and corresponds to an image stack acquisition axis;
acquiring an image stack comprising a plurality of images, each image corresponding to a focal position of the vision probe along the image stack acquisition axis, wherein acquiring the image stack comprises: (i) adjusting the plurality of slide mechanisms to move the vision probe along the image stack acquisition axis from a first image acquisition position to a second image acquisition position, wherein the vision probe acquires a first image and a second image of the plurality of images at the first and second image acquisition positions, respectively; and (ii) adjusting the plurality of slide mechanisms to move the vision probe, also along the image stack acquisition axis, from the second image acquisition position to a third image acquisition position, wherein the vision probe acquires a third image of the plurality of images at the third image acquisition position; and
determining focus curve data based at least in part on an analysis of the images of the image stack, wherein the focus curve data is indicative of the 3-dimensional locations of a plurality of surface points on the workpiece surface.
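The determination of focus curve data from an image stack can be sketched as a points-from-focus style analysis. The contrast metric (gray-level variance) and the parabolic interpolation of the focus-curve peak below are common choices assumed here for illustration; the patent does not commit to a particular metric, and `pff_height` is a hypothetical name:

```python
def focus_metric(pixels):
    """Simple focus metric for one region of interest: variance of
    the gray levels (higher variance = sharper focus), one common choice."""
    n = len(pixels)
    mean = sum(pixels) / n
    return sum((p - mean) ** 2 for p in pixels) / n

def pff_height(stack, z_positions):
    """Hedged sketch: evaluate the focus metric for one region of interest
    in every image of the stack (giving a focus curve), then take the focal
    position where the curve peaks, refined by a parabolic fit."""
    metrics = [focus_metric(img) for img in stack]
    i = metrics.index(max(metrics))
    if 0 < i < len(metrics) - 1:
        # Parabolic interpolation of the focus curve around its peak.
        y0, y1, y2 = metrics[i - 1], metrics[i], metrics[i + 1]
        denom = y0 - 2.0 * y1 + y2
        offset = 0.5 * (y0 - y2) / denom if denom != 0.0 else 0.0
        step = z_positions[1] - z_positions[0]
        return z_positions[i] + offset * step
    return z_positions[i]

# Synthetic stack: contrast peaks in the middle image (flattened 2x2 ROIs).
base = [0.0, 1.0, 1.0, 0.0]
stack = [[a * p for p in base] for a in (1.0, 2.0, 5.0, 2.0, 1.0)]
height = pff_height(stack, [0.0, 0.1, 0.2, 0.3, 0.4])
```

Repeating this for every region of interest in the image frame would yield the 3-dimensional locations of a plurality of surface points, as the claim describes.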
Drawings
FIG. 1A is a diagram illustrating various components of a Coordinate Measuring Machine (CMM) according to an embodiment;
FIG. 1B is a diagram schematically illustrating a vision probe coupled to a probe head of a CMM such as that shown in FIG. 1A;
FIG. 2 is a block diagram illustrating various control elements of a CMM such as that of FIG. 1A;
FIG. 3A is a schematic diagram of a vision probe with its Optical Axis (OA) oriented generally perpendicular to the surface on which the workpiece WP is placed (i.e., OA parallel to the z-axis of the machine coordinate system);
FIG. 3B is a schematic diagram of the vision probe of FIG. 3A with its Optical Axis (OA) oriented at an angle (i.e., not parallel to the z-axis of the machine coordinate system);
FIG. 4 is a 2-dimensional perspective view showing movement of the vision probe along the x-axis and z-axis directions of the machine coordinate system to acquire an image at image acquisition location (I);
FIG. 5 is a 3-dimensional perspective view showing movement of the vision probe along the x-axis, y-axis, and z-axis directions of the machine coordinate system to acquire an image at image acquisition location (I);
FIGS. 6A and 6B illustrate an exemplary method of determining the relative position/coordinates of a point on a surface of a workpiece along the z-axis of a probe coordinate system using a stack of images obtained by a vision probe;
FIG. 7A illustrates a sample workpiece including a plurality of workpiece surfaces and workpiece features;
FIG. 7B is a schematic diagram of a vision probe with its Optical Axis (OA) and Image Stack Acquisition Axis (ISAA) oriented substantially in a vertical direction (i.e., OA/ISAA parallel to the z-axis of the machine coordinate system);
FIG. 7C is a schematic view of a vision probe with its Optical Axis (OA) and Image Stack Acquisition Axis (ISAA) oriented at an angle so as to be substantially perpendicular to the angled surface of the workpiece of FIG. 7A; and
FIG. 8 is a flow chart of a method of measuring a workpiece surface using a CMM system including a movement mechanism configuration such as that described in FIGS. 1A-7C to acquire an image stack by moving a vision probe at a desired angle/direction relative to the workpiece surface.
Detailed Description
Fig. 1A is a diagram illustrating various components of a Coordinate Measuring Machine (CMM) 100. As shown in fig. 1A, the coordinate measuring machine 100 includes a machine body 200 that moves a vision probe 300, an operation unit 105 having a manual joystick 106, and a processing device configuration 110. The machine body 200 includes a faceplate 210, a movement mechanism arrangement 220 (see also fig. 2), and the vision probe 300. The movement mechanism arrangement 220 includes an X-axis slide mechanism 225, a Y-axis slide mechanism 226, and a Z-axis slide mechanism 227 (fig. 2) that are disposed on the faceplate 210 to hold and move the vision probe 300 in three dimensions relative to the workpiece WP to be measured, as shown in fig. 1A. The movement mechanism arrangement 220 also includes a rotation mechanism 214.
Specifically, the movement mechanism arrangement 220 includes beam mounts 221 movable in the Ym direction in a Machine Coordinate System (MCS), a beam 222 bridging between the beam mounts 221, a column 223 movable on the beam 222 in the Xm direction in the machine coordinate system, and a Z-axis moving member 224 movable within the column 223 in the Zm direction in the machine coordinate system, as shown in fig. 1A. The X-axis slide mechanism 225, the Y-axis slide mechanism 226, and the Z-axis slide mechanism 227 shown in fig. 2 are provided between the beam 222 and the column 223, between the faceplate 210 and the beam mounts 221, and between the column 223 and the Z-axis moving member 224, respectively. The vision probe 300 is attached to a probe head 213 that includes the rotation mechanism 214 and is attached to and supported by the end of the Z-axis moving member 224. The rotation mechanism 214 enables the vision probe 300 to rotate relative to the Z-axis moving member 224, as will be described in more detail below. The X-axis slide mechanism 225, the Y-axis slide mechanism 226, and the Z-axis slide mechanism 227 are each configured to move the vision probe 300 in mutually orthogonal X-, Y-, and Z-axis directions, respectively, within the MCS.
As shown in fig. 2, the X-axis slide mechanism 225, the Y-axis slide mechanism 226, and the Z-axis slide mechanism 227 are provided with an X-axis scale sensor 228, a Y-axis scale sensor 229, and a Z-axis scale sensor 230, respectively. Accordingly, the amounts of movement of the vision probe 300 in the X-axis, Y-axis, and Z-axis directions in the Machine Coordinate System (MCS) can be obtained from the outputs of the X-axis scale sensor 228, the Y-axis scale sensor 229, and the Z-axis scale sensor 230. In the illustrated embodiment, the directions of movement of the X-axis slide mechanism 225, the Y-axis slide mechanism 226, and the Z-axis slide mechanism 227 coincide with the Xm direction, the Ym direction, and the Zm direction, respectively, in the Machine Coordinate System (MCS). In various embodiments, these relatively direct correspondences and the associated components may help ensure a high level of accuracy of motion and position control/sensing in the Xm, Ym, and Zm directions, as well as relatively simplified processing. The probe head 213 with the rotation mechanism 214 includes one or more rotation sensors 215 (see fig. 2) for sensing the angular rotation/position/orientation of the vision probe 300, as will be described in more detail below.
In various embodiments, the vision probe 300 can be used to perform operations for determining and/or measuring the surface profile of the workpiece WP. As will be described in greater detail below, the angular orientation of the vision probe 300 may be adjusted using the rotation mechanism 214 such that the optical axis OA of the vision probe is directed toward an angled surface of the workpiece WP (e.g., in some embodiments, the optical axis OA may be made substantially perpendicular to the angled workpiece surface). The x-axis, y-axis, and z-axis slide mechanisms 225, 226, and 227 (e.g., moving in mutually orthogonal directions) may in combination move the vision probe 300 along an image stack acquisition axis (which may generally coincide with the optical axis OA) to image acquisition positions to acquire an image stack of the angled workpiece surface. Focus curve data, indicating the 3-dimensional locations of surface points on the angled surface of the workpiece WP, may be determined from an analysis of the image stack (e.g., as part of a points-from-focus (PFF) type measurement operation).
As shown in fig. 2, the operation unit 105 is connected to a command section 402 of the processing device configuration 110. Various commands may be input to the machine body 200 and the processing device configuration 110 via the operation unit 105. As shown in fig. 1A, the processing device configuration 110 includes a motion controller 140 and a host computer system 115. In various embodiments, the processing device configuration 110 may calculate shape coordinates of the workpiece WP to be measured based at least in part on the amount of movement of the vision probe 300 as moved by the movement mechanism configuration 220 and/or on analysis of data (e.g., including image stacks) obtained by the vision probe 300, as will be described in more detail below. In various embodiments, the shape coordinates may correspond to a depth map and/or surface topography of the workpiece and/or workpiece surface, and may be based on the relative positions of the surface points on the workpiece (e.g., as indicated by the coordinates).
The motion controller 140 of fig. 1A primarily controls the motion of the vision probe 300. The host computer system 115 processes the motion and position data produced and obtained by the machine body 200. In this embodiment, the processing device configuration 110, having the combined functionality of the motion controller 140 and the host computer system 115, is shown in the block diagram of fig. 2 and will be described below. The host computer system 115 includes a computer 120, an input unit 125 such as a keyboard, and an output unit 130 such as a display and a printer.
Those skilled in the art will appreciate that the host computer system 115 and/or other computing systems and/or control systems described or available with the elements and methods described herein may generally be implemented using any suitable computing system or device, including distributed or networked computing environments and the like. Such computing systems or devices may include one or more general-purpose or special-purpose processors (e.g., non-custom or custom devices) that execute software to perform the functions described herein. The software may be stored in a memory, such as Random Access Memory (RAM), read Only Memory (ROM), flash memory, etc., or a combination of these components. The software may also be stored in one or more storage devices, such as an optically-based disk, a flash memory device, or any other type of non-volatile storage medium for storing data. Software may include one or more program modules including procedures, routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. In a distributed computing environment, the functionality of program modules may be combined or distributed across multiple computing systems or devices and accessed in a wired or wireless configuration via service calls.
As shown in fig. 2, the processing device configuration 110 includes a command portion 402, a slide mechanism controller 404, a position determination portion 406, a vision probe controller 408, a vision probe data portion 410, an analyzer portion 412, and a storage portion 414.
The command section 402 shown in fig. 2 supplies a predetermined command to the slide mechanism controller 404 (for example, based on a command input by the operation unit 105 or the input unit 125). The command section 402 generates coordinate values in a machine coordinate system as position commands for the movement mechanism configuration 220 for each control cycle in consideration of, for example, a movement direction, a movement distance, a movement speed, and the like, to move the vision probe 300 to a plurality of positions (for example, image pickup positions, and the like).
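The generation of per-control-cycle position commands described above can be sketched as follows. The function name, signature, and fixed-speed trajectory are illustrative assumptions; the patent only states that the command section considers movement direction, distance, and speed:

```python
import math

def position_commands(start, end, speed, cycle_s):
    """Hypothetical sketch: generate machine-coordinate position commands,
    one per control cycle, moving the probe from start to end at the
    requested speed along a straight line."""
    delta = [e - s for s, e in zip(start, end)]
    dist = math.sqrt(sum(d * d for d in delta))
    # Number of control cycles needed to cover the distance at this speed.
    n = max(1, math.ceil(dist / (speed * cycle_s)))
    return [tuple(s + d * k / n for s, d in zip(start, delta))
            for k in range(1, n + 1)]

# Example: move 1 mm along z at 10 mm/s with a 10 ms control cycle.
cmds = position_commands((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), 10.0, 0.01)
```

Each tuple would be handed to the slide mechanism controller as the target coordinate for one control cycle.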
The slide mechanism controller 404 shown in fig. 2 performs drive control by outputting a drive control signal D in response to a command from the command section 402, thereby causing current to flow through the motors of the x-axis, y-axis, and z-axis slide mechanisms 225, 226, and 227 in the moving mechanism configuration 220.
The position latches 216 in one embodiment communicate with various sensors and/or drive mechanisms to ensure that the coordinates of the CMM 100 and vision probe 300 are properly synchronized when the image is acquired. More specifically, in various embodiments, the position latch 216 may be used to help ensure the accuracy of measurements derived from images in an image stack. In various embodiments, the operation of the position latch 216 enables the CMM machine coordinates (which reflect the position of the connection point or other reference point of the vision probe 300 during a particular measurement) to be appropriately combined with position data (e.g., position and orientation relative to the vision probe 300) determined from the vision probe image. In some embodiments, the position latch 216 may be used to trigger measurements from CMM position sensors (e.g., sensors 215 and 228-230, etc.) that may include graduations, encoders, or other sensing elements that track the overall position and orientation (e.g., including the base position) of the vision probe 300 in a machine coordinate system. In some implementations, the position latch 216 may also trigger image acquisitions from the vision probe 300 (e.g., as part of an image stack, a trigger signal may be provided for each image in the image stack, with the respective position of the vision probe 300 also synchronized and tracked accordingly for each image acquisition).
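The latching of probe pose against each image trigger can be illustrated with a small sketch. The class and function names are hypothetical, not from the patent; the point shown is only the pairing of each image with the sensor readings captured at its exposure instant:

```python
from dataclasses import dataclass

@dataclass
class LatchedSample:
    """One image of the stack paired with the probe pose latched at exposure."""
    image_index: int
    xyz: tuple      # latched x/y/z scale-sensor readings (Xm, Ym, Zm)
    angles: tuple   # latched rotation-sensor readings of the probe head

def latch_stack(scale_readings, angle_readings):
    """Illustrative sketch: for each image trigger, record the scale- and
    rotation-sensor values captured at the same instant, so measurements
    derived from any image can later be combined with the probe pose that
    was valid when that image was acquired."""
    return [LatchedSample(i, xyz, ang)
            for i, (xyz, ang) in enumerate(zip(scale_readings, angle_readings))]

# Two images along a tilted acquisition axis, probe head tilted 45 degrees.
samples = latch_stack([(0.0, 0.0, 50.0), (0.07, 0.0, 49.93)],
                      [(0.0, 45.0), (0.0, 45.0)])
```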
Fig. 1B is a diagram schematically illustrating certain components of the machine body 200 and a vision probe 300' of the CMM 100, which may be similar to the vision probe 300 of fig. 1A. As shown in fig. 1B, the machine body 200 includes the probe head 213. The probe head 213 receives and transmits probe signals through a probe head cable 211. The probe head 213 is fixed to a coordinate measuring machine quill 217 attached to an end of the Z-axis moving member 224 (or a sliding element such as a spindle) that moves in the Z-axis direction of the Machine Coordinate System (MCS). The probe head 213 is connected to the vision probe 300' at a probe auto-joint connection 231. One implementation of a probe auto-joint connection is described in more detail in U.S. Patent No. 9,115,982, which is incorporated herein by reference in its entirety.
The probe head 213 includes the rotation mechanism 214 (fig. 2), which in some embodiments rotates 360 degrees in a horizontal plane (a first rotation sensor 215 may sense its angular movement/position/orientation) and may include a U-shaped joint (which enables the connected probe to rotate about a corresponding axis located in the horizontal plane, with a second rotation sensor 215 sensing its angular movement/position/orientation, as will be described in more detail below with respect to fig. 3B). Thus, in the particular example of fig. 1B, the rotation mechanism 214 of the probe head 213 supports rotation of the vision probe 300' about two different axes: first, rotation of the vision probe 300' about the Z-axis in its current direction, and second, rotation of the vision probe 300' about a horizontal axis (i.e., an axis in the XY plane of the machine coordinate system). The rotation mechanism 214 embodied in the probe head 213 in figs. 3A and 3B described below is illustrated as being substantially circular in shape, although in various embodiments it may be illustrated as spherical in a three-dimensional representation. The rotation mechanism 214, including a spherical (or ball) joint, allows the vision probe 300' to rotate relative to the Z-axis moving member 224 within the column 223 and/or about any horizontal axis, thereby positioning the optical axis of the vision probe 300' at a desired angle/direction relative to the workpiece surface (e.g., the workpiece surface may be at an angle relative to a horizontal plane). In general, the rotation mechanism 214 is a mechanism for changing the direction of the vision probe 300 (i.e., the posture of the vision probe 300), as shown in figs. 3A and 3B.
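Given the two rotation axes described above (rotation about the machine z-axis and rotation about a horizontal axis), the direction of the probe's optical axis in the machine coordinate system can be sketched as a unit vector. The pan/tilt parameterization and the convention that zero tilt points the axis straight down (-Zm, as in the vertical orientation of fig. 3A) are assumptions made here for illustration:

```python
import math

def optical_axis_direction(pan_deg, tilt_deg):
    """Hedged sketch: unit vector of the vision probe's optical axis in
    the machine coordinate system, given the probe head's rotation about
    the machine z-axis (pan) and about a horizontal axis (tilt)."""
    pan = math.radians(pan_deg)
    tilt = math.radians(tilt_deg)
    # tilt = 0 -> straight down (-Zm); tilt = 90 -> horizontal, pointing
    # in the direction selected by the pan angle.
    return (math.sin(tilt) * math.cos(pan),
            math.sin(tilt) * math.sin(pan),
            -math.cos(tilt))
```

A vector computed this way could serve as the `axis_dir` along which the slide mechanisms move the probe when acquiring an image stack perpendicular to an angled workpiece surface.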
The probe auto-joint connection 231 is an electromechanical connection that rigidly and mechanically secures the probe head 213 to the vision probe 300' in a manner that allows the probe head to be disconnected from one probe and attached to another. In one embodiment, the probe auto-joint connection 231 may comprise first and second mating auto-exchange joint elements 234 and 236, wherein the first auto-exchange joint element 234 is mounted to the probe head 213 and the second auto-exchange joint element 236 is mounted to the vision probe 300'. In one embodiment, the probe auto-joint connection 231 has mating electrical contacts or connections 235 such that when a probe is attached, the contacts automatically engage and form electrical connections.
The vision probe 300' may receive at least some of its power and control signals through the auto-joint connection 231, which signals in turn pass through the probe head cable 211. The signals transmitted to the vision probe 300' through the auto-joint connection 231 pass through the connections 235. As shown in fig. 1B, the vision probe 300' includes the auto-exchange joint element 236 and a probe assembly 237 mounted to the auto-exchange joint element 236 for automatic connection to the CMM 100 via the probe auto-joint connection 231.
In various embodiments, vision probe 300 'may also or alternatively have at least some of its power and control signals transferred through cable 211'. In some embodiments, the cable 211' may be used due to a standard automatic joint connection 231 having a limited number of available wired connections, and more connections (e.g., as provided by the optional cable 211 ') may be desired/utilized for the vision probe 300 '. In various embodiments, the vision probe 300 'may have additional features/capabilities in addition to certain standard power and/or communication signals that may be needed and/or benefit from additional power and/or communication signals that may be provided through the optional cable 211' and/or through other transmission mechanisms. In various embodiments, power and/or communication signals for the vision probe 300 '(e.g., as communicated over the cable 211 and/or the cable 211') may be to and from the vision probe controller 408 and the vision probe data portion 410, as will be described in more detail below with respect to fig. 2.
As shown in fig. 2, in some embodiments the machine body 200 of the CMM 100 may include, in addition to the vision probe 300 (300'), an optional tactile measurement probe 390 that includes an XYZ sensor 392. The tactile measurement probe 390 may be a contact probe, a scanning probe, or the like, which typically has a probe tip that physically contacts the workpiece under test. In some embodiments, such a tactile measurement probe 390 may be used in addition to/in combination with the vision probe 300. For example, after the vision probe 300 is used to obtain the image stack and determine the 3-dimensional contour of the workpiece surface, the vision probe 300 may be detached/removed from the CMM 100 (e.g., detached from the probe head 213 in fig. 1B). The tactile measurement probe 390 may then be attached to the CMM 100 (e.g., to the probe head 213). To this end, in some examples, the CMM 100 may store different probes (e.g., 300, 390, etc.) on a probe rack and move the probe head 213 into position for attaching and detaching the different probes. The tactile measurement probe 390 can then be used to physically contact and verify certain measurements or surface points (e.g., surface points that may be difficult to view/determine using the vision probe 300). In various embodiments, if there are surface points on the workpiece surface that may be difficult to image/capture with the vision probe 300 and/or that are partially hidden from its view by other portions of the workpiece, the tactile measurement probe 390 may be used to physically contact these surface points for measurement.
Still referring to fig. 2, the vision probe 300 may include an illumination configuration 302, an objective lens 304, and a camera 306. In the illustrated embodiment, the objective lens 304 and the camera 306 are internal to the vision probe 300, and in some figures (e.g., figs. 3A and 3B) are illustrated with dashed boxes. In various embodiments, the objective lens 304 may be a multi-lens optical element and may be selected from a range of magnifications. For example, different objectives with different magnifications may be available, and the objective to be used in the vision probe may be selected based on the desired magnification for certain applications (e.g., a relatively higher magnification objective may be selected, with a trade-off of a smaller imaging range for the PFF image stack, etc.).
In the embodiment of figs. 3A and 3B, the illumination configuration 302 may be a ring light (e.g., formed by an arrangement of LEDs) disposed at the distal end of the vision probe 300, although the arrangement of the illumination configuration 302 is not limited to the illustrated embodiment. For example, the illumination configuration 302 may alternatively be provided as a coaxial light. In some embodiments, providing coaxial light may require a different configuration, with a beam splitter in the optical axis path within the vision probe 300 for directing light downward through the objective lens 304, a light source offset sideways or otherwise located within the vision probe 300 for directing light to the beam splitter, and so forth. In some embodiments, an illumination configuration 302 formed from a ring light (e.g., an arrangement of LEDs) may have a lighter weight and smaller volume and size than an illumination configuration 302 formed from a coaxial light (which may require a beam splitter and a sideways-oriented light source).
As described above with reference to fig. 1B, optional probe head cable 211 'may be used to carry other signals (e.g., to control illumination configuration 302, camera 306, etc. and/or provide power to vision probe 300), or alternatively, cable 211' need not be included, in which case all of the required wires/signals may pass through probe head 213 (e.g., and thus through cable 211).
When the vision probe 300 is used alone, the CMM movement mechanism configuration 220, and in particular its sensors (215 and 228-230), may provide measurement output M to a position determination portion 406 that determines the position of the probe head 213 (or other connection point or reference position of the vision probe 300) within the Machine Coordinate System (MCS) of the CMM. For example, the position determination portion 406 may provide X, Y, and Z coordinates for the probe head 213 or other connection or reference points of the vision probe 300 within the machine coordinate system. When the tactile measurement probe 390 is attached, it may include a mechanism that allows (a small amount of) movement of the probe tip relative to the rest of the tactile measurement probe 390, and a corresponding sensor (e.g., XYZ sensor 392) that provides sensor data indicative of the position of the probe tip (i.e., the probe stylus tip that actually contacts the workpiece surface) in the local coordinate system of the tactile measurement probe 390. Measurement synchronization trigger signals (e.g., provided in relation to operation of the position latches 216, etc.) trigger both the measurement of the overall position and orientation of the tactile measurement probe 390 (e.g., of the probe head 213) in the machine coordinate system and the local surface measurement using the tactile measurement probe 390 in its local coordinate system. The position determination portion 406 may use and combine the coordinates measured in the local coordinate system and the position of the tactile measurement probe 390 measured in the machine coordinate system to determine the overall position of the probe tip, and thus of the measured/detected surface point on the workpiece.
In comparison to such a determination with the tactile measurement probe 390, the position determination portion 406 may determine only the position (or other reference or connection position) of the probe head 213 at the top of the vision probe 300 when the vision probe 300 is utilized as described herein with respect to various exemplary embodiments. To determine the coordinates of the surface points on the workpiece, information from the analysis of the image stack may be used. For example, an image stack (of images at different focus positions) may be acquired by vision probe 300, where the relative position/focus position of the images in the image stack is in terms of a Probe Coordinate System (PCS), which in some embodiments may be related to a reference position of the probe within the MCS. To determine the overall position of a surface point within a Machine Coordinate System (MCS), in some embodiments, the PCS position data of the surface point may be converted and/or otherwise combined with the MCS position data to determine the overall position of the surface point.
When the vision probe 300 is oriented at an angle (e.g., as shown in fig. 3B), and thus the Z-axis of the Probe Coordinate System (PCS) (i.e., corresponding to the optical axis of the vision probe 300) is oriented at an angle, the acquired image stack indicates the relative distances of the surface points of the workpiece in the direction of that angled probe Z-axis. In some embodiments, those Probe Coordinate System (PCS) coordinates may be referred to as a local coordinate system, which may then be combined with (e.g., converted and added to) the MCS coordinates determined for the probe head 213 (or other reference location) to determine the overall location of surface points on the workpiece in the MCS. For example, if it is desired to determine the coordinates of a surface point in the MCS, the measurement points determined in the probe coordinate system PCS may be converted to MCS coordinates and added to the MCS coordinates of the probe head 213 (or other reference location) of the vision probe 300. Alternatively, if the workpiece itself is assigned its own Local Coordinate System (LCS), the MCS coordinates determined for the probe head 213 (or other reference location) of the vision probe 300 may be transformed to or combined with the LCS of the workpiece. As yet another example, in some cases, other local coordinate systems (e.g., for the images of an image stack, etc.) may also or alternatively be established. Typically, the MCS covers the coordinates of the overall general volume of the CMM 100, while an LCS (e.g., such as the PCS) typically covers a smaller volume and may in some cases be contained within the MCS. In various embodiments, in addition to X, Y, and Z coordinates, certain types of cylindrical coordinates, Cartesian coordinates, or other coordinates may also or alternatively be utilized in connection with the orientation of the vision probe 300 and the determination of measured surface point coordinates on the workpiece WP.
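The PCS-to-MCS combination described above can be sketched as a rigid transform: the PCS point is rotated according to the probe's orientation angles and then added to the MCS position of the probe head reference. The following is an illustrative sketch only; the function name, the argument conventions, and the choice of a tilt rotation about the horizontal axis followed by a pan rotation about the vertical axis are assumptions for illustration, not details taken from this disclosure:

```python
import math

def pcs_to_mcs(point_pcs, probe_head_mcs, tilt_deg, pan_deg=0.0):
    """Illustrative conversion of a surface point from the Probe Coordinate
    System (PCS) to the Machine Coordinate System (MCS).

    point_pcs      : (x, y, z) in the PCS; z lies along the probe optical axis.
    probe_head_mcs : (X, Y, Z) reference position of the probe head in the MCS.
    tilt_deg       : assumed rotation about a horizontal axis (cf. angle a-H).
    pan_deg        : assumed rotation about the vertical MCS Z-axis (cf. RA1).
    """
    x, y, z = point_pcs
    t = math.radians(tilt_deg)
    p = math.radians(pan_deg)
    # Rotate about the horizontal (X) axis by the tilt angle.
    x1 = x
    y1 = y * math.cos(t) - z * math.sin(t)
    z1 = y * math.sin(t) + z * math.cos(t)
    # Rotate about the vertical (Z) axis by the pan angle.
    x2 = x1 * math.cos(p) - y1 * math.sin(p)
    y2 = x1 * math.sin(p) + y1 * math.cos(p)
    z2 = z1
    # Add the probe head's MCS reference position for the overall location.
    X0, Y0, Z0 = probe_head_mcs
    return (X0 + x2, Y0 + y2, Z0 + z2)
```

For example, with zero tilt and pan, a point 10 units along the optical axis in the PCS simply offsets the probe head's MCS Z coordinate by 10 units.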
In some implementations, the PCS position data from the image stack can be utilized relatively independently (e.g., with limited or no conversion to, or combination with, coordinates from the MCS or another coordinate system). For example, the position data determined from the analysis of the image stack may provide 3D coordinates indicative of the 3D positions of surface points on the workpiece surface in terms of the PCS or another LCS, which thus represent/correspond to the 3D contour/surface topography of the workpiece surface. As described above, in some embodiments, such data may be combined with other location data represented in the MCS to indicate the overall location of the workpiece surface and surface points within the MCS. However, for certain implementations/analyses/representations, etc., it may be desirable to utilize primarily or solely the position data determined from the image stack. For example, if the analysis or inspection is primarily intended to determine the relative positions and/or properties of workpiece features on the workpiece surface (e.g., the distances between such workpiece features and/or the 3D dimensions of the workpiece features on the surface, etc.), such data may in some embodiments be determined primarily from the analysis of the image stack. More specifically, if the desired analysis/inspection does not require the overall location of the workpiece surface and/or workpiece features within the MCS, the data determined from the image stack may be utilized with limited or no combination with MCS or other coordinate-system coordinates. In addition to analyzing such data, it will be appreciated that a 3D representation of the workpiece surface (e.g., on a display or the like) may similarly be provided in accordance with the analyzed data from the image stack.
As shown in fig. 2, the vision probe controller 408 controls the vision probe 300 (e.g., controls the illumination configuration 302 and the camera 306, etc., to obtain the images of the image stack, etc.). In various embodiments, the vision probe controller 408 need not control the movement or focusing of the vision probe 300. Rather, these aspects may be controlled by the CMM movement mechanism configuration 220, which moves the vision probe 300 closer to and/or farther from the workpiece to obtain an image stack (i.e., moves the vision probe 300 to each image acquisition position, as illustrated/described below with respect to figs. 4 and 5), wherein the rotation mechanism 214 may be used to rotate the vision probe 300 to a desired angle/direction. In various embodiments, the focal distance of the vision probe 300 may be primarily determined by the objective lens 304 (e.g., the focal distance in front of the vision probe 300 during a measurement operation may be constant, corresponding to the objective lens 304 selected/used in the vision probe 300). The vision probe data portion 410 receives the output of the vision probe 300 (i.e., the image data for the images of the image stack). The analyzer portion 412 may be used to perform a related analysis (e.g., a points from focus (PFF) analysis or other analysis of the image stack to determine the relative position of each surface point on the workpiece surface along the probe Z-axis, to determine a 3-dimensional surface profile of the workpiece surface, etc.), as will be described in more detail below with respect to figs. 6A and 6B. The storage portion 414 may comprise a portion of computer memory for storing certain software, routines, data, etc. for the operation of the processing device configuration 110, etc.
Figures 3A and 3B illustrate certain components corresponding to those of figures 1A-2, including certain components of the movement mechanism configuration 220 of the machine body 200 of the CMM 100, which includes a rotation mechanism 214' (embodied in a probe head 213'). Fig. 3A illustrates the vision probe 300 in a vertical orientation (e.g., similar to some prior art systems, such as some vision systems that are primarily operated to move the focal position up and down only along the Z-axis of the machine coordinate system to obtain an image stack containing images of the workpiece). As shown in fig. 3A, the workpiece WP has a workpiece surface WPS1 with an angular orientation (at an angle A1). Note that in the illustration of fig. 3A, the Z-axis of the machine coordinate system is parallel to the optical axis OA of the vision probe 300. It will be appreciated that if the vision probe 300 is simply moved up and down along the Z-axis of the MCS by the Z-axis slide mechanism 227 (including movement of the Z-axis moving member 224 within the post 223), the optical axis (Z-axis) of the vision probe 300 may be in the same direction as the Z-axis of the machine coordinate system and the image stack acquisition axis ISAA. The workpiece surface WPS1 is shown at an angle A1 relative to the horizontal plane of the MCS. In comparison, the workpiece surface WPS2 of the workpiece WP is shown as being substantially parallel to the horizontal plane.
Fig. 3B illustrates the vision probe 300 angled with respect to the horizontal plane of the MCS (at angle "a-H") and the vertical plane of the MCS (at angle "a-V"), as may be implemented by the disclosed CMM 100, in accordance with various embodiments of the disclosure. As will be described in greater detail below, the CMM 100 is capable of operating its three slide mechanisms (i.e., the X-axis, Y-axis, and Z-axis slide mechanisms 225-227, which are orthogonal to each other and each produce motion only along the respective orthogonal X, Y, and Z axes/directions of the MCS) and the rotation mechanism 214' (embodied in the probe head 213') to move/orient the vision probe 300. Thus, the CMM 100 can freely move the vision probe 300 relative to the workpiece WP along multiple axes, including with simultaneous rotation about any axis, in order to obtain an image stack at a specified angle. More generally, the movement mechanism configuration 220 (including the X, Y, and Z slide mechanisms 225-227 and the rotation mechanism 214') supports the vision probe 300 and enables it to move in mutually orthogonal X, Y, and Z directions and to be oriented at a desired angle relative to the workpiece surface to be measured.
In the illustrated example of fig. 3B, the vision probe 300 has been rotated (e.g., by a U-joint or other component of the rotation mechanism 214' of the probe head 213') about a horizontal rotational axis RA2 passing through the rotation point R2, so as to be directed at angle a-H, such that the optical axis OA of the vision probe 300 is substantially perpendicular to the workpiece surface WPS1. In fig. 3B, the ability of the rotation mechanism 214' of the probe head 213' to rotate the vision probe 300 about the Z-axis of the machine coordinate system is illustrated by the rotation axis RA1 passing through the rotation point R1 at the top of the probe head 213'/rotation mechanism 214'. Rotation about a horizontal axis is illustrated by the rotation axis RA2 (represented as a single point because it points into the page) passing through the rotation point R2 at the center of the probe head 213'/rotation mechanism 214' (e.g., by operation of a U-joint as illustrated in fig. 1B).
In fig. 3B, an example image stack range SR-3B for determining a 3-dimensional surface profile of the workpiece surface WPS1 is shown. The workpiece surface WPS1 may have various workpiece features (e.g., surface features) that may be above or below the average planar position of the workpiece surface WPS1, as will be described in more detail below with respect to fig. 7A. In some embodiments, it may be desirable to extend the range of focus positions of the image stack a distance above and below the workpiece surface. As shown in fig. 3B, the example image stack range SR-3B may be significantly smaller than the image stack range SR-3A of fig. 3A (i.e., the image stack range required to cover all surface points of the workpiece surface WPS1 in the orientation illustrated in fig. 3A), because the vision probe 300 in fig. 3B is oriented such that its optical axis OA is substantially perpendicular to the workpiece surface WPS1, as opposed to the orientation in fig. 3A. In fig. 3B, the angle of the optical axis OA (and image stack acquisition axis ISAA) relative to at least a portion of the workpiece surface WPS1 is denoted as "a-P", which in the illustrated example is approximately 90 degrees/perpendicular. Fig. 3B also shows the angle "a-W" of the workpiece surface WPS1 relative to the horizontal plane (e.g., corresponding to angle A1 of fig. 3A). Depending on the particular angle a-W in each embodiment, the rotation mechanism 214' may be adjusted to ensure that the optical axis OA (and ISAA) of the vision probe 300 is substantially perpendicular to at least a portion of the workpiece surface WPS1, as will be described in more detail below with reference to figs. 7A-7C.
Fig. 4 shows a 2-dimensional perspective view of the motion of vision probe 300, and fig. 5 shows a 3-dimensional perspective view thereof for obtaining a stack of images (e.g., including eleven images, as one example, as shown and described in more detail below with respect to fig. 6A and 6B). As shown in fig. 4 and 5, in one particular example, the vision probe 300 may be moved through at least eleven axial image acquisition positions I1-I11 to acquire eleven images with corresponding axial focus positions F1-F11. It should be appreciated that each axial focus position F1-F11 may be located along an Image Stack Acquisition Axis (ISAA) of the vision probe 300.
Figs. 4 and 5 show 2-dimensional and 3-dimensional coordinates, respectively, of each of the axial image acquisition positions I1-I11 and the axial focus positions F1-F11. Typically, in some prior art systems, the acquisition of an image stack is performed only in the vertical direction (i.e., only along the Z-axis of the machine coordinate system). More specifically, according to the prior art, an imaging system (e.g., a machine vision system, etc.) may be configured to move the focal position of the system up and down along a vertical Z-axis that corresponds to the Z-axis of the machine coordinate system. By contrast, according to the present disclosure, the specified direction for acquiring the image stack is not limited thereto. As shown herein, the image stack may now be acquired at an angle using the components of the CMM 100 in combination with the disclosed vision probe 300. Thus, in accordance with the present disclosure, instead of referring to the "Z-axis" of the machine coordinate system as the default optical axis for image acquisition as in the prior art, the optical axis of the vision probe 300, which may be arranged and oriented in any direction and at any angle to acquire an image stack, may correspond to and/or be referred to in some cases as the "image stack acquisition axis" (ISAA or ISA axis).
Referring to fig. 4, in general, an ISA axis (ISAA) may be established at the beginning of a process for acquiring an image stack. The vision probe 300 can then be moved to each new position along the ISA axis to capture each additional image. As each additional image of the image stack is acquired, the optical axis OA of the vision probe 300 may be coaxial with the ISA axis. Because movement of the vision probe 300 between image acquisition positions typically requires separate adjustment of the X, Y, and Z slide mechanisms 225-227 (e.g., which may or may not all move simultaneously or proportionally in various embodiments), the movement of the vision probe 300 during such fine adjustment may not always be precisely along the ISA axis. However, once the movement is completed such that the vision probe 300 has moved to the next axial image acquisition position to acquire the next image, that axial image acquisition position lies along the ISA axis. Furthermore, each axial focus position F1-F11 (i.e., the focus position corresponding to each acquired image) may also lie along the ISA axis.
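The sequence of axial image acquisition positions can be illustrated as evenly spaced steps along a unit vector defining the ISA axis, with each step decomposing into the separate X, Y, and Z slide adjustments described above. This is a simplified sketch under stated assumptions (the function name and parameters are illustrative, not from this disclosure):

```python
import math

def acquisition_positions(start_mcs, isaa_dir, step, count):
    """Return MCS coordinates of image acquisition positions spaced evenly
    along an image stack acquisition axis (ISAA).

    start_mcs : (X, Y, Z) MCS coordinates of the first position (e.g., I1).
    isaa_dir  : direction vector of the ISAA (need not be normalized).
    step      : distance along the ISAA between consecutive positions.
    count     : number of positions (e.g., 11 for positions I1-I11).
    """
    # Normalize the ISAA direction so that 'step' is a true distance.
    mag = math.sqrt(sum(c * c for c in isaa_dir))
    ux, uy, uz = (c / mag for c in isaa_dir)
    x0, y0, z0 = start_mcs
    # Each coordinate of each position is what the corresponding X-, Y-, or
    # Z-axis slide mechanism would be commanded to reach.
    return [(x0 + i * step * ux, y0 + i * step * uy, z0 + i * step * uz)
            for i in range(count)]
```

For an ISAA tilted 45 degrees in the X-Z plane, for example, each step splits into equal X and Z slide increments with no Y motion.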
The prior art imaging systems described above that use only a single slide mechanism (e.g., a Z-axis slide mechanism) to acquire an image stack may in some cases be configured to perform specialized imaging, and thus may be less common and more expensive. In contrast, CMMs that include X-axis, Y-axis, and Z-axis slide mechanisms are relatively common and widely used. In accordance with the present disclosure, in various exemplary embodiments, the CMM is used to move the vision probe 300 to acquire an image stack in any direction or at any angle, which provides greater flexibility with a standard CMM. In addition, the configuration with the X, Y, and Z-axis slide mechanisms 225-227 and the rotation mechanism 214 may be highly accurate, due to the high accuracy X, Y, and Z-axis scale sensors 228-230 of each slide mechanism and the rotation sensors 215 (e.g., including rotary encoders and/or other types of relative position sensors) of the rotation mechanism 214. In various exemplary embodiments, due in part to the direct correlation of each X, Y, and Z sensor with a single coordinate axis (and corresponding single coordinate) in the MCS, the overall position determination within the MCS for each of the corresponding X, Y, and Z coordinates may be performed relatively simply while being highly accurate.
Figs. 4 and 5 show examples of X, Y, and Z coordinates in the machine coordinate system for each movement of the vision probe 300 to the image acquisition positions I1-I11. In various embodiments, the machine coordinate system X-axis, Y-axis, and Z-axis may be referred to as the XS axis, YS axis, and ZS axis, respectively. The image acquisition positions I1-I11 at which the vision probe 300 is positioned to capture the eleven images of the image stack (images (1)-(11) in fig. 6B) correspond to the axial focus positions F1-F11 at which the vision probe 300 is focused when capturing the eleven images of the image stack. In the example shown, all image acquisition positions I1-I11 and axial focus positions F1-F11 are along an Image Stack Acquisition Axis (ISAA). For the image stack 650 of fig. 6B, when the vision probe 300 is in the image acquisition position I1, it will focus on the axial focus position F1 to capture image (1) of the image stack.
More specifically, as shown in fig. 4, for image acquisition position I1, the corresponding MCS coordinates of the reference position of the vision probe 300 are at IX1 and IZ1. At the next image acquisition position I2, the MCS coordinates may be IX2 and IZ2. At the next image acquisition position I3, the MCS coordinates may be IX3 and IZ3. For the remaining image acquisition positions I4-I11, the corresponding MCS coordinates of the reference position of the vision probe 300 are similarly at IX4-IX11 and IZ4-IZ11, respectively. To move the vision probe 300 from image acquisition position I1 to image acquisition position I2, the X-axis slide mechanism 225 is adjusted to move from IX1 to IX2. Similarly, the Z-axis slide mechanism 227 is adjusted to move from IZ1 to IZ2. With respect to fig. 5, for image acquisition position I1, the corresponding MCS coordinates are at IX1, IY1, and IZ1. At the next image acquisition position I2, the MCS coordinates may be IX2, IY2, and IZ2. At the next image acquisition position I3, the MCS coordinates may be IX3, IY3, and IZ3. To move the vision probe 300 from image acquisition position I1 to image acquisition position I2, the X-axis slide mechanism 225 is adjusted to move from IX1 to IX2. Similarly, the Y-axis slide mechanism 226 is adjusted to move from IY1 to IY2, and the Z-axis slide mechanism 227 is adjusted to move from IZ1 to IZ2. Similar movements are performed to reach the remaining image acquisition positions.
In some embodiments, such adjustments of the slide mechanisms 225-227 may be relatively simultaneous, such that the vision probe 300 may move substantially along the Image Stack Acquisition Axis (ISAA) in its movement between the image acquisition positions I1 and I2. However, the movement of each slide mechanism 225-227 need not be precisely proportional or simultaneous throughout the movement, and the movement of the vision probe 300 between positions may not be precisely along the ISA axis. That is, unlike prior art systems that utilize a single slide mechanism, in which the movement is always precisely along the image stack acquisition axis, movement according to various embodiments of the present disclosure may require the determination and/or combination of movements of each slide mechanism 225-227 along multiple axes. Nevertheless, in various exemplary embodiments, at the end of the entire motion from position I1, the vision probe 300 will be positioned at position I2 along the ISA axis, and/or will otherwise have the optical axis of the vision probe 300 coaxial with the ISA axis.
As will be described in more detail below, in some embodiments it may be desirable for the focal position of at least a portion of the workpiece surface WPS1 to approximately correspond to a focal position in the middle of the focal position range of the image stack. For example, in the illustrated image stack of eleven (11) images with respective focal positions F1-F11, it may be desirable for at least a portion of the workpiece surface to be approximately in focus at the axial focus position F6, which corresponds approximately to the middle of the range of the image stack, as will be described in more detail below with respect to figs. 6A and 6B. As described herein, it may also be desirable for at least a portion of the workpiece surface WPS1 (and/or the general or average angular orientation of the workpiece surface or portion thereof) to be approximately/nominally perpendicular to the ISA axis, as shown in figs. 4 and 5. Such features were also described previously with respect to the potential scan ranges SR in figs. 3A and 3B, and are described in more detail below with respect to the scan ranges SR1 and SR2 of figs. 7B and 7C. More specifically, by orienting the vision probe 300 such that the Image Stack Acquisition Axis (ISAA) is approximately perpendicular to at least a portion of the workpiece surface (WPS1) being imaged, the range of the image stack may be relatively short while still covering, with high accuracy, the full range of surface points of the 3-dimensional workpiece features (i.e., corresponding to 3-dimensional surface features and deviations) in accordance with the present disclosure.
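The benefit of a perpendicular orientation can be illustrated with a simplified geometric estimate: the stack range must cover both the depth of the surface features and the projection of the imaged surface patch onto the ISA axis, and the latter term vanishes when the ISAA is perpendicular to the surface. This is a rough illustrative model only, not a formula from this disclosure:

```python
import math

def required_stack_range(surface_extent, feature_depth, misalignment_deg):
    """Rough estimate of the image stack range needed to cover a surface patch.

    surface_extent   : length of the imaged surface patch (same units throughout).
    feature_depth    : height range of features above/below the mean surface plane.
    misalignment_deg : angle between the ISAA and the surface normal
                       (0 means the ISAA is perpendicular to the surface).
    """
    tilt = math.radians(misalignment_deg)
    # A tilted patch spans surface_extent * sin(tilt) along the ISAA;
    # the feature depth must be covered regardless of orientation.
    return surface_extent * math.sin(tilt) + feature_depth
```

Under this model, a 10-unit patch with 0.5 units of feature depth needs only a 0.5-unit stack when imaged perpendicularly (analogous to range SR-3B), but a 5.5-unit stack when the axis is misaligned by 30 degrees (analogous to the larger range SR-3A).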
Figs. 6A and 6B illustrate how an image stack obtained by the vision probe 300 according to the present disclosure may be used to determine the ZP position of a point on the workpiece surface along a ZP axis, which may be approximately/nominally perpendicular to the workpiece surface. As used herein, the "ZP axis" may correspond to the Z-axis of the probe coordinate system and/or the optical axis of the vision probe 300, which may not coincide with the Z-axis of the MCS when the vision probe 300 is angled or tilted. In various embodiments, the image stack is obtained by the CMM 100 operating in a points from focus (PFF) mode (or similar mode) to determine the ZP height (ZP position) of the workpiece surface along an axis approximately perpendicular to the workpiece surface. The PFF image stack may be processed to determine or output a ZP height coordinate map (e.g., a point cloud) that quantitatively indicates a set of 3-dimensional surface coordinates (e.g., corresponding to the surface shape or contour of the workpiece).
In particular, figs. 6A and 6B illustrate operations related to determining, for a point on the workpiece surface, a relative ZP position along the direction of the image stack acquisition axis (e.g., parallel to the ZP axis of the vision probe 300 or the Probe Coordinate System (PCS)). In a configuration where the image stack acquisition axis ISAA is parallel to the Z-axis of the machine coordinate system, this relative position has in some existing systems been referred to as corresponding to the Z-height of the surface point, although more generally the image stack acquisition axis ISAA may be oriented in any direction, as disclosed herein.
As shown in figs. 6A and 6B, the focus position may be moved through a range of positions Zp(i) along the direction of the image stack acquisition axis ISAA, which may correspond to the focusing axis at each image acquisition position. The vision probe 300 may capture an image(i) at each position Zp(i). For each captured image(i), a focus metric fm(k,i) may be calculated based on a region or sub-region of interest ROI(k) (e.g., a set of pixels) in the image (e.g., with the corresponding surface point at the center of the region or sub-region of interest ROI(k)). The focus metric fm(k,i) is related to the corresponding position Zp(i) of the vision probe 300, and the corresponding focus position along the direction of the image stack acquisition axis ISAA, at the time image(i) was captured. This produces focus curve data (e.g., a set of focus metrics fm(k,i) at the positions Zp(i), which is a type of focus peak determination data set), which may be referred to simply as a "focus curve" or an "autofocus curve". In an embodiment, the focus metric may involve a calculation of the contrast or sharpness of the region of interest in the image.
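One common contrast-style focus metric is the variance of pixel intensities over the region of interest: the image in which the ROI is sharpest exhibits the largest intensity variance. The disclosure does not specify a particular metric, so the following variance-based fm(k,i) is an illustrative assumption:

```python
def focus_metric(image, roi):
    """Contrast-based focus metric: variance of pixel intensities in the ROI.

    image : 2D list (rows of pixel values) of grayscale intensities.
    roi   : (row0, col0, height, width) describing a region of interest ROI(k).
    """
    r0, c0, h, w = roi
    # Collect the pixel intensities inside the region of interest.
    pixels = [image[r][c] for r in range(r0, r0 + h) for c in range(c0, c0 + w)]
    mean = sum(pixels) / len(pixels)
    # Variance of the intensities; larger values indicate sharper focus.
    return sum((p - mean) ** 2 for p in pixels) / len(pixels)
```

Evaluating this metric on the same ROI across all images of the stack, and pairing each value with its position Zp(i), yields the focus curve described above.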
The ZP position corresponding to the peak of the focus curve (e.g., Zpk 601 in fig. 6A), which corresponds to the best focus position along the image stack acquisition axis, is the ZP position of the region of interest used to determine the focus curve. It will be appreciated that while the image stack is shown as including eleven images (image(1)-image(11)) for purposes of illustration, in practical embodiments a smaller or greater number of images (e.g., 100 or more) may be utilized.
As indicated by the focus curve generated for images (1)-(11), in the example shown, image (6) appears to be close to or at optimal focus (e.g., images farther from image (6) appear increasingly out of focus with respect to the workpiece surface, and may appear increasingly blurred). Features (not shown) in the middle of ROI(1) appear to be most in focus in image (6). When the focus metric is based on contrast as described above, one method includes comparing the center pixel of the ROI with its neighboring pixels in the ROI in terms of color/brightness, etc. By finding the image with the highest overall contrast (corresponding to the focus position at the time that image was acquired), an indication/measurement of the relative ZP position of the surface point (e.g., at the center of the ROI) can be obtained along the optical axis OA and the image stack acquisition axis ISAA.
In fig. 6B, as described above, the central region of interest ROI(1) is considered to be approximately in focus in image (6), which corresponds to the position Zp(6) along the optical axis of the vision probe 300. The optical axis corresponds to the Zp axis in the Probe Coordinate System (PCS), and may also be coaxial with the image stack acquisition axis ISAA when each image is acquired using the vision probe 300. In this way, the relative position Zp(6) of the surface point on the workpiece surface corresponding to the center of ROI(1) can be determined, since this position approximately corresponds to the focal position of image (6) in the image stack. It will be appreciated that in some cases the determined peak focus position may fall between two images in the image stack, in which case the focus peak position may be determined by interpolation or another technique based on fitting a focus curve to the focus metric values determined for the images.
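The interpolation mentioned above is often implemented as a three-point parabolic fit through the best-focus sample and its two neighbors, yielding a peak estimate finer than the image spacing. This sketch assumes evenly spaced Zp(i) values; the disclosure does not mandate this particular technique:

```python
def interpolate_focus_peak(zp, fm):
    """Locate the peak of a focus curve by fitting a parabola through the
    best-focus sample and its two neighbors.

    zp : evenly spaced focus positions Zp(i) along the ISAA.
    fm : focus metric fm(k, i) for each image (i), same length as zp.
    """
    i = max(range(len(fm)), key=fm.__getitem__)  # index of the best-focus image
    if i == 0 or i == len(fm) - 1:
        return zp[i]  # peak at a stack edge; no neighbors to interpolate with
    d = zp[1] - zp[0]  # spacing between focus positions
    # Standard parabolic vertex offset from the three surrounding samples.
    num = fm[i - 1] - fm[i + 1]
    den = 2.0 * (fm[i - 1] - 2.0 * fm[i] + fm[i + 1])
    return zp[i] + d * (num / den)
```

For example, for samples of an exact parabola peaking at Zp = 1.25 taken at positions 0, 1, and 2, the function recovers 1.25 even though no image was acquired there.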
In some implementations, it may be desirable to have the images of the image stack be substantially evenly spaced within the image stack, which may help ensure even distribution of data points along the focal curve and/or to simplify certain calculations (e.g., interpolation, etc.), or otherwise to assist/improve certain algorithmic operations. However, in some cases, the focus curve may also be determined relatively accurately from the image stack when the images are not all uniformly spaced apart (e.g., may be due to X, Y and Z-axis slide mechanisms 225-227 having certain parameters/limitations in relative motion, e.g., how small incremental movements may be accurately performed, etc.).
If uniform spacing of all images in the image stack is desired, in some embodiments it may be desirable to utilize certain orientations of the vision probe 300 for which the movements for acquiring the image stack are supported by the limitations/characteristics of the particular CMM's X, Y and Z-axis slide mechanisms 225-227. For example, if the X, Y and Z-axis slide mechanisms 225-227 each have a minimum increment of motion (e.g., 1 um), and if a 45 degree angle is used for the ISAA, then in one example embodiment each of the slide mechanisms to be moved may be moved by the same increment (e.g., 1 um) for each image acquisition position, so that the spacing between each image in the image stack is the same. According to a similar principle, each of the X, Y and Z-axis slide mechanisms 225-227 may be moved by a different amount than the others for each image acquisition position, provided that the X movement increment is the same for the movement between each pair of image acquisition positions, the Y movement increment is the same for the movement between each pair, and the Z movement increment is the same for the movement between each pair. In accordance with such movements, the image acquisition positions will correspond to and/or define an image stack acquisition axis ISAA, and the probe orientation may be set such that the optical axis OA of the vision probe 300 at each image acquisition position is substantially/nominally coaxial with the image stack acquisition axis ISAA.
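The uniform-spacing scheme above can be sketched as follows, where each axis moves by its own fixed increment (possibly zero) between consecutive acquisitions; the helper names and the 1 um (0.001 mm) increments are illustrative assumptions:

```python
import math

def acquisition_positions(start, step_xyz, n_images):
    """Evenly spaced image acquisition positions along a diagonal ISAA:
    each of the X, Y and Z slides moves by its own fixed increment
    between consecutive images, so the positions lie on one line."""
    x0, y0, z0 = start
    dx, dy, dz = step_xyz
    return [(x0 + k * dx, y0 + k * dy, z0 + k * dz) for k in range(n_images)]

def stack_spacing(step_xyz):
    """Euclidean spacing between consecutive images along the ISAA."""
    return math.sqrt(sum(s * s for s in step_xyz))
```

With equal X and Z increments and no Y motion, the ISAA lies at 45 degrees in the XZ plane and the image spacing is the increment times the square root of two.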
In accordance with similar principles, if there is a minimum increment for adjustment of the angular orientation of the vision probe 300 (e.g., a minimum obtainable increment/adjustment of the rotation mechanism 214 used for the angular orientation of the vision probe 300), the movements of the X, Y and Z-axis slide mechanisms 225-227 may also be made to produce an ISAA corresponding to such an angular orientation. In some implementations, for the overall system, a desired orientation of the vision probe 300 that optimally/most accurately aligns the optical axis OA of the vision probe 300 with the image stack acquisition axis ISAA may be found based at least in part on the minimum increments of movement of the rotation mechanism 214 and/or the X, Y and Z-axis slide mechanisms 225-227. In particular, such a desired orientation of the vision probe 300 for capturing an image stack may be found from the motion/adjustment capabilities of the CMM 100 for adjusting the position/angular orientation of the vision probe 300. In one particular example implementation, in accordance with the principles/examples described above, the vision probe 300 may in some cases be oriented at a 45 degree angle (or a similar diagonal angle, such as a 135 degree, 225 degree, or 315 degree angle) with respect to a horizontal or vertical plane, such as one or more of the XY plane, the XZ plane, and/or the YZ plane of the MCS.
With further regard to fig. 6B, the region of interest ROI (2) is shown positioned diagonally with respect to the region of interest ROI (1). As an example, if the region of interest ROI (2) is not in focus at any point within the 11 images of the example image stack 650, additional images may need to be evaluated and/or the range of the image stack may need to be expanded in order to find the focus position of the surface point corresponding to the ROI (2) (e.g., in order to acquire an image stack with a greater number of images and a greater corresponding focus position range). In some embodiments, an image stack having 100 or more images may often be acquired/utilized. For example, referring to fig. 7A, a surface point centered in the middle of ROI (1) may be at the bottom of a cylindrical hole that is workpiece feature WPF1, while a surface point corresponding to ROI (2) may be at the top edge of the cylindrical hole for which a larger image stacking range may be needed/utilized as well as additional images (e.g., all surface points of workpiece feature WPF1 for covering workpiece surface WPS 1).
Fig. 7A shows a sample workpiece WP1 having respective workpiece surfaces WPS1, WPS2, WPS3 and workpiece features WPF1 (a hole defined in workpiece surface WPS1), WPF2 and WPF3C (certain geometric features defined on the edge interface between workpiece surfaces WPS2 and WPS3), and WPF3A and WPF3B (two holes defined in workpiece surface WPS3). As described above with reference to fig. 3B, the workpiece surface or workpiece feature to be measured will lie within the image stacking range SR-3B in order to determine the 3-dimensional surface profile of the workpiece surface or workpiece feature. As shown in fig. 7A, each workpiece feature includes surfaces that are higher or lower than the general or average plane of the workpiece surface on which the workpiece feature is defined. Thus, in various embodiments, imaging of workpiece features may require the use of a sufficiently large image stacking range (or scan range SR) to cover all surfaces/surface points of workpiece features at different ZP heights.
Fig. 7B is a schematic diagram showing the distal end of a vision probe 300 having its optical axis OA and image stack acquisition axis ISAA oriented in a generally vertical direction (i.e., parallel to the Z-axis of the MCS) relative to the surface on which the workpiece WP1 is placed, the workpiece WP1 having an angled workpiece surface WPS1 that includes workpiece feature WPF1. Fig. 7C is a schematic view of the distal end of the vision probe 300 with its optical axis OA and image stack acquisition axis ISAA oriented at an angle so as to be generally/nominally perpendicular to the angled workpiece surface WPS1 of the workpiece WP1.
In general, figs. 7B and 7C can be understood to illustrate how the scan range needed to cover the 3-dimensional surface topography of the workpiece surface WPS1 depends on the orientation of the vision probe 300 relative to the workpiece surface WPS1 to be measured (e.g., comparing the range of fig. 7C to that of fig. 7B). For example, to cover the 3-dimensional surface topography of the workpiece surface WPS1 (e.g., including the workpiece feature WPF1), the scan range SR1 required with the orientation of fig. 7B is significantly larger than the scan range SR2 required with the orientation of fig. 7C. Thus, adjusting the angle/orientation of the vision probe 300 as in fig. 7C so that the optical axis OA is approximately perpendicular to the workpiece surface WPS1 and/or the workpiece feature WPF1 is technically advantageous in reducing the required scan range, which in turn may reduce the scan time and/or reduce the number of images required to form an image stack (e.g., having a desired image density).
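The scan-range effect can be illustrated with a deliberately simplified geometric model (an assumption of a planar surface with features of a given depth, not a formula from the disclosure): the required range grows as the acquisition axis tilts away from the surface normal.

```python
import math

def required_scan_range(tilt_deg, fov_extent, feature_depth):
    """Simplified scan-range model.  tilt_deg is the angle between the
    image stack acquisition axis and the surface normal (0 = the ISAA
    is perpendicular to the surface, as in fig. 7C).  The range is the
    feature depth measured along the axis plus the apparent height
    spread of the tilted field of view."""
    t = math.radians(tilt_deg)
    return feature_depth / math.cos(t) + fov_extent * math.sin(t)
```

Under this model, a perpendicular orientation needs only the feature depth itself, while a 45 degree misalignment (as between figs. 7B and 7C) inflates the range by both terms.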
As shown in fig. 7B, in addition to the scan range SR1 for image stacking being significantly greater than the scan range SR2 of fig. 7C, the vision probe 300 is also oriented at a relatively acute angle relative to the workpiece surface WPS1, which may reduce imaging quality or prevent imaging of certain portions/aspects of the workpiece feature WPF 1. For example, sharp angles may reduce imaging quality due to less imaging light being reflected back into vision probe 300, etc. As another example, in fig. 7B, the upper corner at the surface point SP3 of the bottom of the cylindrical hole workpiece feature WPF1 is shown as not visible through the vision probe 300 (i.e., the upper edge of the cylindrical hole blocks a view of the corner at the surface point SP3 of the cylindrical hole in the direction shown). In contrast, in fig. 7C, by orienting the vision probe 300 substantially perpendicular to the workpiece surface WPS1 and/or at least a portion of the workpiece feature WPF1, the vision probe 300 may have a better angle to image various workpiece features (e.g., WPF 1) on the workpiece surface WPS1 (e.g., better angle for reflected imaging light, ability to view corners at surface point SP3, etc.). Thus, in some embodiments, the vision probe 300 in the direction of fig. 7C may be able to provide a more accurate 3-dimensional surface profile of the workpiece surface WPS1 in addition to having a smaller scan range SR2 than the scan range SR1 of fig. 7B.
In various embodiments, it may also be desirable to perform different scans (including acquisition of different image stacks) with the vision probe 300 in different orientations. For example, consider that workpiece WP1 includes workpiece surfaces WPS1, WPS2, and WPS3. In one embodiment, the vision probe 300 may be positioned as shown in fig. 7B (e.g., tilted 0 degrees relative to vertical) to acquire an image stack for scanning the workpiece surface WPS2, then oriented as shown in fig. 7C (e.g., tilted 45 degrees relative to vertical) to acquire an image stack for scanning the workpiece surface WPS1, and then oriented at a further angle (e.g., tilted 90 degrees relative to vertical) to acquire an image stack for scanning the workpiece surface WPS3.
In some embodiments, the scan/image stack may be made to include all or part of a plurality of workpiece surfaces. For example, the image (and view) for scanning the workpiece surface WPS2 at a 0 degree tilt may also include all or part of the workpiece surface WPS1 (and/or the workpiece surface WPS 3). Such a process (where multiple image stacks may include at least some common surface points scanned/imaged from different directions) may help to further verify the 3D position of each surface point and/or enable accurate alignment/recombination of various 3D data corresponding to different workpiece surfaces to form a full or partial 3D representation of workpiece WP 1. For example, in various embodiments, the 3D contours of the various surfaces may be "stitched together" or otherwise combined to form a full or partial 3D representation of the work piece WP 1. In addition, certain workpiece features (e.g., WPF2/WPF 3C) may have certain dimensions/aspects that may be included in scans of multiple surfaces (e.g., WPS2 and WPS 3), and the scans of each surface may be utilized/combined to determine the overall characteristics/dimensions/3D profile of the workpiece features WPF2/WPF 3C. Such possible operations and processes illustrate another advantage of the present disclosure because certain prior art systems are generally only able to acquire image stacks from a single direction (e.g., along the Z-axis of the MCS), while the present disclosure is able to enable a CMM system to acquire multiple image stacks from multiple directions using a vision probe to analyze/measure/determine 3D contours of multiple workpiece surfaces and/or features that may be in different directions. Such 3D data for various surfaces/features of the workpiece may then be combined or otherwise utilized to determine the overall 3D profile of all or part of the workpiece and/or certain workpiece features.
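Combining or "stitching together" 3D data from stacks acquired in different orientations requires expressing all surface points in one machine coordinate system. A minimal sketch, assuming the probe tilt is a single rotation about the machine Y axis (the actual CMM kinematics may be more involved):

```python
import math

def probe_to_mcs(points_pcs, tilt_deg, probe_origin):
    """Map surface points from the probe coordinate system (Zp along
    the optical axis OA) into the machine coordinate system by rotating
    about the machine Y axis by the probe tilt and translating by the
    probe origin, so point sets from differently oriented image stacks
    can be merged into one 3D representation."""
    t = math.radians(tilt_deg)
    ox, oy, oz = probe_origin
    out = []
    for xp, yp, zp in points_pcs:
        out.append((ox + xp * math.cos(t) + zp * math.sin(t),
                    oy + yp,
                    oz - xp * math.sin(t) + zp * math.cos(t)))
    return out
```

Common surface points imaged in two stacks should map to (nearly) the same machine coordinates, which is what enables the alignment/verification described above.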
In the PFF type analysis described above with reference to fig. 6A and 6B, each focus curve (as shown in fig. 6A) corresponds to a single point on the workpiece surface. That is, the peak of each focus curve represents the Zp position of a single point along the direction of the optical axis OA of the vision probe 300. In various embodiments, the PFF type analysis repeats the process for a plurality of surface points across the workpiece surface (e.g., each having a corresponding region of interest), so that the overall profile of the workpiece surface can be determined. In general, this process may be performed for a plurality of surface points within the field of view (i.e., as captured within the images of the image stack), with a particular ROI (i) corresponding to a particular point on the workpiece surface (the point preferably being at the center of the ROI) for each image of the image stack. Referring additionally to fig. 7B, as one illustrative example, if ROI (1) corresponds to a surface point at the bottom edge of cylindrical hole workpiece feature WPF1 (e.g., adjacent to surface point SP 3), and if ROI (2) corresponds to a surface point on workpiece surface WPS1 that is not in a cylindrical hole (e.g., at surface point SP 2), then the focus curves corresponding to the two illustrative surface points will be different and will have different focus peaks. For example, for surface point SP2, a focus curve such as that of fig. 6A would be shifted, with the peak at a different location (e.g., indicating that the focus location is closer to vision probe 300, and thus higher in the portion of the image stack shown in fig. 6B, or even higher in a portion of the image stack not shown in fig. 6B, such as in an embodiment where the image stack has other images than the 11 images shown).
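Repeating the per-ROI peak search over a grid of regions of interest, as described above, yields a depth map of the surface; a compact sketch using local variance as the focus metric (the tiling scheme and metric are illustrative assumptions):

```python
import numpy as np

def depth_map(image_stack, z_positions, roi_size=5):
    """PFF sketch: tile each image into roi_size x roi_size regions of
    interest, and for each tile report the Zp position of the stack
    image with the maximum local variance (the best-focused image)."""
    stack = np.asarray(image_stack, dtype=float)
    n, h, w = stack.shape
    zmap = np.zeros((h // roi_size, w // roi_size))
    for r in range(h // roi_size):
        for c in range(w // roi_size):
            tiles = stack[:, r * roi_size:(r + 1) * roi_size,
                             c * roi_size:(c + 1) * roi_size]
            scores = tiles.reshape(n, -1).var(axis=1)  # one score per image
            zmap[r, c] = z_positions[int(np.argmax(scores))]
    return zmap
```

Each cell of the returned map is one surface point's position along the ISAA; a production implementation would refine each cell with the peak interpolation discussed earlier.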
With regard to the total available movement of the CMM 100 provided by the X, Y and Z-axis slide mechanisms 225-227, if the focus of the vision probe 300 is adjusted only by movement along the Z-axis (i.e., using only the Z-axis slide mechanism 227, similar to techniques used to acquire image stacks in some existing machine vision systems), the total range of motion of any image stack acquisition process will be limited to the maximum range of motion of the Z-axis slide mechanism 227. In contrast, in accordance with the techniques of the present disclosure, the vision probe 300 may be moved diagonally across the entire available movement volume of the CMM 100 to acquire image stacks, which may generally provide a longer potential scan range and greater flexibility for acquiring image stacks to scan workpiece surfaces from different angles.
As previously described, in some embodiments, in addition to using the vision probe 300 to obtain an image stack for determining a 3-dimensional surface profile, it may also be useful in some cases to use a tactile measurement probe 390 in combination with the vision probe 300 (i.e., a probe having a probe tip that physically contacts the workpiece to determine a measurement, such as a touch probe, or a scanning probe whose probe tip is positioned to contact and slide along the workpiece surface in order to "scan" it). For example, after use of the vision probe 300, the vision probe 300 may be detached from the CMM 100, and the tactile measurement probe 390 may be attached to the CMM 100 and used to verify the locations of certain surface points, and/or to measure certain surface points, such as points that the vision probe 300 may not image well. For example, in the embodiment of fig. 7C, if surface point SP3 is directly below surface point SP2 along the optical axis OA of the vision probe 300 in the orientation shown, it may be difficult to determine the exact location of surface point SP3 from the image stack captured by the vision probe 300. In this case, the tactile measurement probe 390 may be used to verify the locations of certain surface points (e.g., along the edge of the cylindrical hole workpiece feature WPF1 and/or at the bottom corners thereof, such as surface points SP3 and SP4, etc.).
As previously described, in some embodiments, it may be desirable to have the optical axis of the vision probe 300 substantially perpendicular to the surface of the workpiece being scanned (i.e., the surface for which the image stack is captured). It should be noted that the optical axis of the vision probe 300 may be perpendicular to only a portion of the workpiece surface, or in some cases may not be substantially perpendicular to any particular portion of the workpiece surface, but rather only perpendicular to the general or average, etc., orientation of the workpiece surface. For example, if the workpiece surface is particularly uneven and/or includes a large number of workpiece features that form complex or non-uniform 3-dimensional contours/surface topography, the optical axis/image stack acquisition axis (OA/ISAA) may not be exactly perpendicular to any particular portion of the workpiece surface, but may be generally/nominally perpendicular to the overall, average, and/or principal orientation of the workpiece surface.
Still referring to figs. 7A-7C, additional examples of implementations of the CMM 100 for obtaining and using image stacks to determine a "depth map" and/or a "surface topography" of a workpiece surface will be described. In some cases, the entire workpiece surface may be described as lying at a "primary angle", which corresponds to the workpiece angle A-W described with respect to fig. 3B, i.e., the angle formed between the workpiece surface and the horizontal plane on which the workpiece is located. As previously described, in some embodiments it may be advantageous or otherwise desirable to have the image stack acquisition axis ISAA substantially perpendicular to the workpiece surface at the primary angle (A-W). Even if the image stack acquisition axis ISAA is not exactly perpendicular to the general orientation of the workpiece surface, this issue can be partially addressed by processing the image data according to the specific application (e.g., including how the user wishes the image data to be presented, etc.). More specifically, when determining a depth map and/or surface topography of a workpiece surface using an image stack, if it is determined that the workpiece surface is not perfectly aligned with the ISAA but is at an angle to it (i.e., the workpiece surface is not perfectly perpendicular to the ISAA), such angular offset may be subtracted or otherwise compensated for as part of the image data processing, so that the depth map or surface topography may be determined/displayed with the overall workpiece surface generally level (e.g., as may be desired for certain presentations and/or analyses, etc.).
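The tilt compensation described above can be sketched as a least-squares plane fit that is subtracted from the depth map (an illustrative approach; the disclosure does not prescribe a specific algorithm):

```python
import numpy as np

def level_depth_map(zmap):
    """Fit a plane z = a*x + b*y + c to the depth map by least squares
    and subtract it, levelling out any residual tilt between the
    workpiece surface and the image stack acquisition axis."""
    h, w = zmap.shape
    yy, xx = np.mgrid[0:h, 0:w]
    A = np.column_stack([xx.ravel(), yy.ravel(), np.ones(h * w)])
    coeffs, *_ = np.linalg.lstsq(A, zmap.ravel(), rcond=None)
    plane = (A @ coeffs).reshape(h, w)
    return zmap - plane
```

After levelling, defects stand out as local height deviations from an otherwise flat map, which supports the defect-inspection use case described next.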
In some embodiments, if a user or system is visually or otherwise evaluating a workpiece surface for defects, it may be preferable to have a generally level presentation of the workpiece surface, for which a defect may be more readily discerned/determined as a height deviation relative to the otherwise level workpiece surface (e.g., a defect and/or other deviation having coordinates above or below the coordinates of the generally level surface).
In various exemplary embodiments of the CMM 100, the primary angle (A-W) of the workpiece surface to be measured may first be determined, in order to know at what angle to orient the vision probe 300 to image the workpiece surface. In various exemplary embodiments, the dimensions and characteristics of the workpiece to be measured (including its primary angle) may already be known, with the CMM 100 being used to perform precision measurements and/or inspections of the workpiece. Once the primary angle is known or determined, a desired angular orientation of the vision probe 300 may be determined (e.g., so as to be substantially perpendicular to at least a portion of the workpiece surface). As described above, by orienting the vision probe 300 relative to the workpiece surface in this manner, the required range of the image stack may be made relatively smaller/shorter, thereby allowing the image stack to be acquired faster than an image stack with the same number of more widely spaced images covering a greater scan range (i.e., with a greater corresponding focal position spacing between images), and/or allowing the same number of images to be acquired in a denser image stack (i.e., with a smaller corresponding focal position spacing between images).
Referring to figs. 7B and 7C, the program instructions, when executed by the one or more processors of the CMM 100, may cause the one or more processors to acquire a first image stack as shown in fig. 7C and a second image stack as shown in fig. 7B. In fig. 7C, the workpiece surface WPS1 may be designated as a first workpiece surface, and the vision probe 300 in a first orientation may be used to acquire a first image stack of the first workpiece surface of the workpiece WP1. In fig. 7B, the workpiece surface WPS2 may be designated as a second workpiece surface oriented at a different angle than the first workpiece surface WPS1, and the vision probe 300 in a second orientation different from the first orientation may be used to acquire a second image stack of the second workpiece surface of the workpiece WP1. In the orientation shown in fig. 7B, the second image stack may have a field of view that primarily includes the workpiece surface WPS2 (the second workpiece surface) but may also include all or part of the workpiece surface WPS1 (the first workpiece surface). Similarly, in the orientation shown in fig. 7C, the first image stack may have a field of view that primarily includes the workpiece surface WPS1 (the first workpiece surface) but may also include all or part of the workpiece surface WPS2 (the second workpiece surface).
In various embodiments, in addition to the focal curve data that may be determined based at least in part on the analysis of the first image stack (e.g., with respect to fig. 7C), additional focal curve data may be determined based at least in part on the analysis of the second image stack (e.g., with respect to fig. 7B), wherein the additional focal curve data is indicative of a 3-dimensional position of a plurality of surface points on a second workpiece surface of the workpiece (e.g., workpiece surface WPS 2). In various embodiments, the system may determine and/or display a 3-dimensional representation of at least a portion of the first workpiece surface WPS1 based at least on focus curve data determined based on analysis of the first image stack but not based on analysis of the second image stack. Similarly, the system may determine and/or display a 3-dimensional representation of at least a portion of the second workpiece surface WPS2 based at least on focus curve data determined based on analysis of the second image stack but not based on analysis of the first image stack.
For example, in some embodiments where the first image stack and the second image stack may each contain a portion or all of the workpiece surfaces WPS1 and WPS2, the focus curve data determined for the first workpiece surface WPS1 based on the analysis of the first image stack (for which the first image stack acquisition axis ISAA1 is approximately perpendicular to at least a portion of the first workpiece surface WPS1) may be considered or determined to be more accurate and/or of higher quality/certainty than the focus curve data determined for the first workpiece surface WPS1 based on the analysis of the second image stack (for which the second image stack acquisition axis ISAA2 is not approximately perpendicular to at least a portion of the first workpiece surface WPS1, and in particular is farther from perpendicular than the first image stack acquisition axis ISAA1).
As described in more detail in U.S. patent No. 8,581,162, the entire contents of which are incorporated herein by reference, certain focus peak certainty and/or Z-height quality metadata analyses may indicate the reliability and/or quality of certain determined 3-dimensional data (e.g., relating to a region of interest in an image stack). While the '162 patent performs such analysis with respect to determining the quality/reliability of coordinates of adjacent workpiece surface points in a single image stack (i.e., acquired only along the Z-axis direction of the machine coordinate system), in accordance with the present disclosure, certain similar principles may be applied to assessing the quality/reliability of coordinates of workpiece surface points in one image stack relative to another image stack (e.g., acquired at a different angle). For example, in some embodiments, focus curve data determined for an image stack acquired with an image stack acquisition axis ISAA that is approximately perpendicular to a portion of a workpiece surface may be relatively more accurate for determining coordinates of workpiece surface points on that portion of the workpiece surface than data from an image stack with an image stack acquisition axis that is less perpendicular. In some embodiments, as described above, the higher accuracy is due at least in part to the more perpendicular orientation of the vision probe/optical axis relative to the workpiece surface, which results in more imaging light being reflected back to the vision probe 300 (e.g., which may result in higher focus peaks and higher corresponding reliability and/or quality of the 3-dimensional data).
In some embodiments, the relative accuracy may also be due in part to the fact that, in a given image of an image stack acquired from a more perpendicular orientation, more adjacent workpiece surface points/pixels are simultaneously in focus, thus allowing a higher focus metric value to be determined (e.g., based on contrast or another focus metric). In contrast, focus curve data determined for an image stack acquired with an image stack acquisition axis ISAA that is relatively far from perpendicular to a portion of a workpiece surface may be relatively less accurate for determining coordinates of workpiece surface points on that portion of the workpiece surface. In some embodiments, as described above, the lower accuracy is due at least in part to the less perpendicular orientation of the vision probe/optical axis relative to the workpiece surface, which results in less imaging light being reflected back to the vision probe 300 (e.g., which may result in lower focus peaks and lower corresponding reliability and/or quality of the 3-dimensional data). In various embodiments, the relative inaccuracy may also be due in part to fewer adjacent workpiece surface points/pixels being simultaneously in focus (i.e., due to the inclination of that portion of the workpiece surface relative to the image stack acquisition axis); in some cases, only a "stripe" of a relatively inclined workpiece surface at the same Z-distance in the probe coordinate system may be precisely in focus at one time, resulting in fewer "focused" pixels/surface points contributing strongly to the overall focus metric for a region of interest having a corresponding center surface point/pixel in a given image of the image stack.
More specifically, in some cases, having more pixels of the region of interest simultaneously in focus in a given image may result in a higher focus peak, for which the determination of the focus peak position may be more accurate (e.g., less sensitive to noise or other factors), resulting in better focus peak certainty.
As another example, in various embodiments, the CMM 100 may image a commonly imaged surface point, such as surface point SP2 on a first portion of the first workpiece surface WPS1, in both the first and second image stacks, wherein the first image stack acquisition axis ISAA1 (of fig. 7C) is more perpendicular to the first portion of the first workpiece surface WPS1 than the second image stack acquisition axis ISAA2 (of fig. 7B). The focus curve data determined based at least in part on the analysis of the first image stack may be indicative of a first 3-dimensional position of the commonly imaged surface point SP2 (e.g., which in one example may correspond to a first determined set of coordinates, such as (XP2C, YP2C, ZP2C)). On the other hand, the focus curve data determined based at least in part on the analysis of the second image stack may indicate a second 3-dimensional position of the commonly imaged surface point SP2 (e.g., which in one example may correspond to a second determined set of coordinates, such as (XP2B, YP2B, ZP2B)).
Note that, in various embodiments, the second determined 3-dimensional position (e.g., at determined coordinates XP2B, YP2B, ZP2B) may be different from the first determined 3-dimensional position (e.g., at determined coordinates XP2C, YP2C, ZP2C). The first 3-dimensional position may be indicated and/or determined to be more reliable/accurate than the second 3-dimensional position, and may be used instead of the second 3-dimensional position as part of a set of 3-dimensional data for the workpiece. As described above, such techniques may be advantageous because the focus curve data determined for an image stack acquired with an image stack acquisition axis ISAA that is substantially perpendicular to the workpiece surface and/or a portion of a workpiece feature may more accurately indicate the 3-dimensional positions of surface points on that portion of the workpiece surface.
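Selecting between such redundant measurements can be sketched as keeping, for each commonly imaged surface point, the candidate from the stack with the strongest focus peak (peak height is used here as a simple stand-in for the reliability/quality metadata discussed above; an actual system might combine several quality indicators):

```python
def select_most_reliable(candidates):
    """Given (xyz, focus_peak_height) candidates for one commonly
    imaged surface point, each determined from a different image stack,
    keep the position whose focus curve peaked highest."""
    best = max(candidates, key=lambda c: c[1])
    return best[0]
```

In the SP2 example above, the candidate from the first (more perpendicular) stack would typically carry the higher peak and therefore be retained in the 3-dimensional data set.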
Figure 8 is a flow chart of a method of measuring a surface of a workpiece by using a CMM system including a movement mechanism arrangement as described in figures 1-7C to move a vision probe along a plurality of axes and at a desired angle/direction relative to the surface of the workpiece. The method generally comprises four steps.
Block 802 includes the steps of operating a Coordinate Measuring Machine (CMM) system comprising: (i) A vision probe 300 configured to image a surface of a work piece WP based on image light transmitted along an optical axis OA of the vision probe 300; (ii) A slide mechanism arrangement comprising an x-axis slide mechanism, a y-axis slide mechanism, and z-axis slide mechanisms 225-227, each configured to move vision probe 300 in mutually orthogonal x-axis, y-axis, and z-axis directions, respectively, within a machine coordinate system MCS; and (iii) a rotation mechanism 214 coupled between the z-axis slide mechanism and the vision probe 300 and configured to rotate the vision probe 300 to different angular orientations relative to the z-axis of the machine coordinate system.
Block 804 includes the step of adjusting the orientation of the vision probe 300 using a rotation mechanism such that the optical axis OA of the vision probe 300 is directed toward the surface of the work piece WP, wherein the optical axis OA of the vision probe 300 is not parallel to the z-axis of the machine coordinate system and corresponds to the image stack acquisition axis ISAA. As described above, in various embodiments, the optical axis OA may be approximately/nominally perpendicular to at least a portion of the workpiece surface, for which purpose the workpiece surface may be angled (e.g., may not be horizontal within the machine coordinate system).
Block 806 includes the step of acquiring an image stack comprising a plurality of images, each image having a corresponding focal position of the vision probe 300 along an image stack acquisition axis. The acquisition of the image stack includes: (i) Adjusting the plurality of slide mechanisms 225-227 to move the vision probe 300 from a first image acquisition position to a second image acquisition position, each along an image stack acquisition axis, wherein the vision probe 300 acquires a first image and a second image of the plurality of images at the first image acquisition position and the second image acquisition position, respectively; and (ii) adjusting the plurality of slide mechanisms 225-227 to move the vision probe 300 from the second image acquisition position to a third image acquisition position that is also along the image stack acquisition axis, wherein the vision probe 300 acquires a third image of the plurality of images at the third image acquisition position.
Block 808 includes the step of determining focus curve data based at least in part on an analysis of the images of the image stack, wherein the focus curve data is indicative of 3-dimensional locations of a plurality of surface points on the workpiece surface.
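The analysis of block 808 can be sketched as follows: for each region of interest, a focus (sharpness) metric is evaluated in every image of the stack, yielding a focus curve whose peak identifies the focus position, and hence the 3-dimensional location, of the corresponding surface point. The variance metric and function names below are illustrative stand-ins, not the probe's actual algorithm:

```python
def focus_metric(roi_pixels):
    """Contrast-based focus metric for one region of interest:
    variance of pixel intensities (higher = sharper). A stand-in for
    whatever metric a real points-from-focus implementation uses."""
    n = len(roi_pixels)
    mean = sum(roi_pixels) / n
    return sum((p - mean) ** 2 for p in roi_pixels) / n

def peak_focus_index(metric_values):
    """Index of the image in the stack where the region of interest is
    sharpest; the focus position at that index locates the surface
    point. (A real system would interpolate around the peak for
    sub-step resolution.)"""
    return max(range(len(metric_values)), key=lambda i: metric_values[i])
```

Running `focus_metric` on one region of interest across every image of the stack produces one focus curve; repeating this for many regions of interest yields the plurality of 3-dimensional surface points.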
While the preferred embodiments of the present disclosure have been shown and described, numerous variations in the illustrated and described arrangements of features and sequences of operations will be understood by those skilled in the art based on this disclosure. Various alternatives may be used to implement the principles disclosed herein. In addition, the various embodiments described above may be combined to provide further embodiments.

Claims (21)

1. A coordinate measuring machine system, comprising:
a vision probe, comprising:
a light source;
an objective lens that inputs image light arising from a workpiece surface illuminated by the light source and transmits the image light along an imaging optical path, wherein the objective lens defines an optical axis of the vision probe, the optical axis extending at least between the objective lens and the workpiece surface;
a camera that receives the image light transmitted along the imaging optical path and provides an image of the workpiece surface;
a slide mechanism arrangement comprising an x-axis slide mechanism, a y-axis slide mechanism, and a z-axis slide mechanism, each configured to move the vision probe in mutually orthogonal x-axis, y-axis, and z-axis directions, respectively, within a machine coordinate system;
a rotation mechanism coupled between the z-axis slide mechanism and the vision probe and configured to rotate the vision probe to different angular orientations relative to a z-axis of the machine coordinate system;
one or more processors; and
a memory coupled with the one or more processors and storing program instructions that, when executed by the one or more processors, cause the one or more processors to at least:
adjust an orientation of the vision probe using the rotation mechanism to direct the optical axis of the vision probe toward a surface of the workpiece, wherein the optical axis of the vision probe is not parallel to a z-axis of the machine coordinate system and corresponds to an image stack acquisition axis;
acquire an image stack comprising a plurality of images, each image corresponding to a focus position of the vision probe along the image stack acquisition axis, wherein acquiring the image stack comprises:
adjusting a plurality of the slide mechanisms to move the vision probe from a first image acquisition position to a second image acquisition position, both along the image stack acquisition axis, wherein the vision probe acquires first and second images of the plurality of images at the first and second image acquisition positions, respectively; and
adjusting the plurality of slide mechanisms to move the vision probe from the second image acquisition position to a third image acquisition position that is also along the image stack acquisition axis, wherein the vision probe acquires a third image of the plurality of images at the third image acquisition position; and
determine focus curve data based at least in part on analysis of the images of the image stack, wherein the focus curve data indicates 3-dimensional locations of a plurality of surface points on the surface of the workpiece.
2. The system of claim 1, wherein, as part of the analysis of the image stack, each surface point of the plurality of surface points corresponds to a center of a region of interest in the image stack, the analysis includes determining a focus curve for each region of interest as part of the focus curve data, and a peak of each focus curve indicates the 3-dimensional location of the respective surface point.
3. The system of claim 1, wherein the program instructions, when executed by the one or more processors, further cause the one or more processors to display a 3-dimensional representation of the workpiece surface based at least in part on the focus curve data.
4. The system of claim 1, wherein adjusting the plurality of slide mechanisms to move the vision probe from the first image acquisition position to the second image acquisition position comprises adjusting the x-axis slide mechanism by an x-axis image interval, adjusting the y-axis slide mechanism by a y-axis image interval, and adjusting the z-axis slide mechanism by a z-axis image interval.
5. The system of claim 4, wherein a spacing between successive images of the plurality of images in the image stack corresponds to the x-axis image interval, the y-axis image interval, and the z-axis image interval.
6. The system of claim 1, wherein a distance along the image stack acquisition axis between the first image acquisition position and the second image acquisition position is the same as a distance along the image stack acquisition axis between the second image acquisition position and the third image acquisition position.
7. The system of claim 1, wherein the rotation mechanism is configured to orient the vision probe in a plurality of directions, including at least:
the optical axis of the vision probe is oriented at an angle of 0 degrees relative to the z-axis of the machine coordinate system; and
the optical axis of the vision probe is oriented at a 45 degree angle relative to the z-axis of the machine coordinate system.
8. The system of claim 7, wherein the system is configured to acquire the image stack with the optical axis of the vision probe oriented at the 45 degree angle relative to the z-axis of the machine coordinate system.
9. The system of claim 1, wherein the rotation mechanism is not adjusted during the adjustment of the plurality of slide mechanisms that moves the vision probe between the image acquisition positions, such that the orientation of the vision probe remains constant.
10. The system of claim 1, wherein during acquisition of the image stack, an optical axis of the vision probe that is not parallel to a z-axis of the machine coordinate system and the image stack acquisition axis are substantially perpendicular to at least a portion of the workpiece surface.
11. The system of claim 1, wherein the workpiece surface is a first workpiece surface, the orientation of the vision probe is a first orientation, and the image stack is a first image stack acquired with the vision probe in the first orientation, and wherein the program instructions, when executed by the one or more processors, further cause the one or more processors to acquire a second image stack with the vision probe in a second orientation different from the first orientation and with the optical axis of the vision probe directed toward a second workpiece surface of the workpiece, the second workpiece surface oriented at a different angle in the machine coordinate system than the first workpiece surface.
12. The system of claim 11, wherein the program instructions, when executed by the one or more processors, further cause the one or more processors to determine focus curve data based at least in part on analysis of the second image stack, wherein the focus curve data indicates a 3-dimensional location of a plurality of surface points on a second workpiece surface of the workpiece.
13. The system of claim 12, wherein the program instructions, when executed by the one or more processors, further cause the one or more processors to:
displaying a 3-dimensional representation of at least a portion of the first workpiece surface based at least on the focus curve data determined from the analysis of the first image stack but not from the analysis of the second image stack; and
displaying a 3-dimensional representation of at least a portion of the second workpiece surface based at least on the focus curve data determined from the analysis of the second image stack but not from the analysis of the first image stack.
14. The system of claim 12, wherein a commonly imaged surface point on a first portion of the first workpiece surface is imaged in both the first and second image stacks; wherein the first image stack acquisition axis is closer to perpendicular to the first portion of the first workpiece surface than the second image stack acquisition axis; wherein the focus curve data determined based at least in part on the analysis of the first image stack indicates a first 3-dimensional position of the commonly imaged surface point, and the focus curve data determined based at least in part on the analysis of the second image stack indicates a second 3-dimensional position of the commonly imaged surface point that is different from the first 3-dimensional position; and wherein the first 3-dimensional position is indicated and/or determined to be more reliable than the second 3-dimensional position and is used, in place of the second 3-dimensional position, as part of a set of 3-dimensional data for the workpiece.
15. The system of claim 1, wherein the objective lens has a particular magnification and is selected for the vision probe from a series of objective lenses having different magnifications.
16. A method of measuring a surface of a workpiece, comprising:
operating a coordinate measuring machine system, the system comprising: (i) a vision probe configured to image a surface of a workpiece based on image light transmitted along an optical axis of the vision probe; (ii) a slide mechanism arrangement comprising an x-axis slide mechanism, a y-axis slide mechanism, and a z-axis slide mechanism, each configured to move the vision probe in mutually orthogonal x-axis, y-axis, and z-axis directions, respectively, within a machine coordinate system; and (iii) a rotation mechanism coupled between the z-axis slide mechanism and the vision probe and configured to rotate the vision probe to different angular orientations relative to a z-axis of the machine coordinate system;
adjusting an orientation of the vision probe using the rotation mechanism to direct the optical axis of the vision probe toward the surface of the workpiece, wherein the optical axis of the vision probe is not parallel to the z-axis of the machine coordinate system and corresponds to an image stack acquisition axis;
acquiring an image stack comprising a plurality of images, each image corresponding to a focus position of the vision probe along the image stack acquisition axis, wherein acquiring the image stack comprises:
adjusting a plurality of the slide mechanisms to move the vision probe from a first image acquisition position to a second image acquisition position, both along the image stack acquisition axis, wherein the vision probe acquires first and second images of the plurality of images at the first and second image acquisition positions, respectively; and
adjusting the plurality of slide mechanisms to move the vision probe from the second image acquisition position to a third image acquisition position that is also along the image stack acquisition axis, wherein the vision probe acquires a third image of the plurality of images at the third image acquisition position; and
determining focus curve data based at least in part on analysis of the images of the image stack, wherein the focus curve data indicates 3-dimensional locations of a plurality of surface points on the surface of the workpiece.
17. The method of claim 16, further comprising:
displaying a 3-dimensional representation of the workpiece surface on a screen.
18. The method of claim 16, wherein adjusting the orientation of the vision probe using the rotation mechanism comprises setting the optical axis of the vision probe substantially perpendicular to at least a portion of the workpiece surface, and wherein the orientation of the vision probe is not further adjusted and remains approximately constant during acquisition of the image stack.
19. The method of claim 16, wherein the workpiece surface is a first workpiece surface and the orientation of the vision probe is a first orientation, and the image stack is a first image stack acquired with the vision probe in the first orientation, and the method further comprises:
acquiring a second image stack with the vision probe in a second orientation different from the first orientation and with the optical axis of the vision probe directed toward a second workpiece surface of the workpiece, the second workpiece surface oriented at a different angle than the first workpiece surface.
20. The method of claim 19, further comprising:
determining second focus curve data based at least in part on analysis of the second image stack, wherein the second focus curve data indicates 3-dimensional locations of a plurality of surface points on the second workpiece surface of the workpiece.
21. The method of claim 20, further comprising:
displaying a 3-dimensional representation of at least a portion of the first workpiece surface based at least on the focus curve data determined from the analysis of the first image stack but not from the analysis of the second image stack; and
displaying a 3-dimensional representation of at least a portion of the second workpiece surface based at least on the second focus curve data determined from the analysis of the second image stack but not from the analysis of the first image stack.
CN202110574684.6A 2020-05-29 2021-05-25 Coordinate measuring machine with vision probe for performing point self-focusing type measuring operation Active CN113739696B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US16/887,589 2020-05-29
US16/887,589 US11499817B2 (en) 2020-05-29 2020-05-29 Coordinate measuring machine with vision probe for performing points-from-focus type measurement operations

Publications (2)

Publication Number Publication Date
CN113739696A CN113739696A (en) 2021-12-03
CN113739696B true CN113739696B (en) 2024-02-06

Family

ID=78509267

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110574684.6A Active CN113739696B (en) 2020-05-29 2021-05-25 Coordinate measuring machine with vision probe for performing point self-focusing type measuring operation

Country Status (4)

Country Link
US (1) US11499817B2 (en)
JP (1) JP2021189179A (en)
CN (1) CN113739696B (en)
DE (1) DE102021113391A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR3087254B1 (en) * 2018-10-16 2021-01-29 Commissariat Energie Atomique CONFIGURATION OF A NON-DESTRUCTIVE CONTROL DEVICE
JP6856607B2 (en) * 2018-11-12 2021-04-07 ファナック株式会社 Imaging equipment and machine tools
US11714051B2 (en) 2021-11-30 2023-08-01 Mitutoyo Corporation Metrology system configured to measure apertures of workpieces

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4498778A (en) * 1981-03-30 1985-02-12 Technical Arts Corporation High speed scanning method and apparatus
US6335787B1 (en) * 1996-03-04 2002-01-01 Nikon Corporation Projection exposure apparatus
CN101738728A (en) * 2008-11-04 2010-06-16 株式会社三丰 Optical aberration correction for machine vision inspection systems
JP2011194498A (en) * 2010-03-18 2011-10-06 Denso Wave Inc Visual inspection system
CN102803893A (en) * 2009-06-04 2012-11-28 瑞尼斯豪公司 Vision measurement probe and method of operation
CN105547147A (en) * 2014-10-23 2016-05-04 康耐视公司 System and method for calibrating a vision system with respect to a touch probe
CN106895778A (en) * 2015-12-17 2017-06-27 赫克斯冈技术中心 Optical probe with integrally formed interface and protection location

Family Cites Families (52)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS5140469B2 (en) * 1973-10-04 1976-11-04
DE3806686A1 (en) 1988-03-02 1989-09-14 Wegu Messtechnik MULTICOORDINATE MEASURING AND TESTING DEVICE
DE69032005T2 (en) * 1990-04-13 1998-09-17 Hitachi Ltd Method for controlling the thickness of a thin film during its manufacture
JP3395280B2 (en) * 1993-09-21 2003-04-07 株式会社ニコン Projection exposure apparatus and method
US5982489A (en) * 1996-01-29 1999-11-09 Nikon Corporation Method and apparatus for measuring depth of a depression in a pattern by light interference from crossed light beams
US5849375A (en) * 1996-07-17 1998-12-15 Minnesota Mining & Manufacturing Company Candle filter
US5781302A (en) * 1996-07-22 1998-07-14 Geneva Steel Non-contact shape meter for flatness measurements
JPH11211617A (en) * 1998-01-22 1999-08-06 Topcon Corp Lens specifying apparatus
JP3610569B2 (en) * 1999-03-23 2005-01-12 株式会社高岳製作所 Active confocal imaging device and three-dimensional measurement method using the same
JP2002141609A (en) * 2000-11-02 2002-05-17 Furukawa Electric Co Ltd:The Semiconductor laser module, laser unit, and raman amplifier
US20040191786A1 (en) * 2001-02-16 2004-09-30 Yue David T. Three cube fret method (3-fret) for detecting fluorescence energy transfer
DE10319798A1 (en) * 2003-04-30 2004-11-25 Werth Messtechnik Gmbh Transmitted illumination arrangement
US7141802B2 (en) * 2003-12-01 2006-11-28 Olympus Corporation Optical device and imaging method
US8422127B2 (en) * 2005-03-17 2013-04-16 Hamamatsu Photonics K.K. Microscopic image capturing device
US7652275B2 (en) * 2006-07-28 2010-01-26 Mitutoyo Corporation Non-contact probe control interface
JP5189806B2 (en) 2006-09-07 2013-04-24 株式会社ミツトヨ Surface shape measuring device
US8323183B2 (en) * 2006-10-12 2012-12-04 Board Of Supervisors Of Louisiana State University And Agricultural And Mechanical College Forward looking optical coherence tomography endoscope
US8085295B2 (en) * 2007-10-26 2011-12-27 Mitutoyo Corporation Controllable micro light assembly
US7878651B2 (en) * 2007-12-26 2011-02-01 Carl Zeiss Meditec, Inc. Refractive prescription using optical coherence tomography
US8116005B2 (en) * 2008-04-04 2012-02-14 Texas Instruments Incorporated Light combiner
JP5192283B2 (en) 2008-05-13 2013-05-08 株式会社ミツトヨ CMM
US20090312859A1 (en) * 2008-06-16 2009-12-17 Electro Scientific Industries, Inc. Modifying entry angles associated with circular tooling actions to improve throughput in part machining
JP5297818B2 (en) * 2009-01-06 2013-09-25 株式会社ミツトヨ CMM
US8111905B2 (en) 2009-10-29 2012-02-07 Mitutoyo Corporation Autofocus video tool and method for precise dimensional inspection
US8581162B2 (en) 2009-12-08 2013-11-12 Mitutoyo Corporation Weighting surface fit points based on focus peak uncertainty
JP5410317B2 (en) 2010-02-05 2014-02-05 株式会社ミツトヨ CMM
AT509884B1 (en) 2010-07-27 2011-12-15 Alicona Imaging Gmbh Microscopy method and device
JP5738564B2 (en) * 2010-09-30 2015-06-24 オリンパス株式会社 Image processing system
JP5100916B2 (en) * 2010-09-30 2012-12-19 パナソニック株式会社 Signal processing apparatus and video display apparatus using the same
US9804082B2 (en) * 2011-03-08 2017-10-31 Magee Scientific Corporation Method for automatic performance diagnosis and calibration of a photometric particle analyzer
CN104137147B (en) * 2011-12-23 2017-05-31 株式会社三丰 The focusing set using multiple light in NI Vision Builder for Automated Inspection is operated
US8736817B2 (en) 2012-05-25 2014-05-27 Mitutoyo Corporation Interchangeable chromatic range sensor probe for a coordinate measuring machine
US8817240B2 (en) 2012-05-25 2014-08-26 Mitutoyo Corporation Interchangeable optics configuration for a chromatic range sensor optical pen
US8995749B2 (en) 2013-03-28 2015-03-31 Mitutoyo Corporation Enhanced edge detection tool for edges of irregular surfaces
US20160252451A1 (en) * 2013-10-15 2016-09-01 National Institute Of Advanced Industrial Science And Technology Optical measuring device and device having optical system
US9846122B2 (en) * 2013-11-26 2017-12-19 Nanometrics Incorporated Optical metrology system for spectral imaging of a sample
US9639083B2 (en) 2013-12-18 2017-05-02 Mitutoyo Corporation System and method for programming workpiece feature inspection operations for a coordinate measuring machine
EP2986957B1 (en) * 2014-05-01 2021-09-29 Bio-Rad Laboratories, Inc. Imaging assembly for emitted light
AT515745A1 (en) 2014-05-05 2015-11-15 Alicona Imaging Gmbh lighting device
US9291447B2 (en) 2014-07-09 2016-03-22 Mitutoyo Corporation Method for controlling motion of a coordinate measuring machine
EP2977720B1 (en) 2014-07-25 2019-06-05 Mitutoyo Corporation A method for measuring a high accuracy height map of a test surface
US9646425B2 (en) 2015-04-09 2017-05-09 Mitutoyo Corporation Inspection program editing environment with editing environment automatically globally responsive to editing operations in any of its portions
US9952586B2 (en) 2015-04-09 2018-04-24 Mitutoyo Corporation Inspection program editing environment with simulation status and control continually responsive to selection operations
US11520472B2 (en) 2015-09-24 2022-12-06 Mitutoyo Corporation Inspection program editing environment including integrated alignment program planning and editing features
CN108291801B (en) 2015-12-22 2020-11-10 株式会社三丰 Sensor signal offset compensation system for CMM touch probe
CN109416237B (en) 2016-07-01 2021-05-18 株式会社三丰 Power transfer arrangement for providing power to a detachable probe for a coordinate measuring machine
JP6341962B2 (en) 2016-08-26 2018-06-13 株式会社ミツトヨ Three-dimensional measuring apparatus and coordinate correction method
JP6825884B2 (en) * 2016-11-15 2021-02-03 株式会社ミツトヨ CMM
US10352679B2 (en) 2017-03-31 2019-07-16 Mitutoyo Corporation Compact coordinate measurement machine configuration with large working volume relative to size
JP7245839B2 (en) 2017-12-29 2023-03-24 株式会社ミツトヨ Inspection program editing environment with automatic transmission behavior for occluded workpiece features
JP2019168419A (en) 2018-03-26 2019-10-03 株式会社ミツトヨ Three-dimensional measuring device
US11328409B2 (en) * 2020-09-30 2022-05-10 Mitutoyo Corporation System and method utilizing multi-point autofocus to align an optical axis of an optical assembly portion to be normal to a workpiece surface

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Automatic recognition system for part positions in an intelligent coordinate measuring machine; Ma Xinhui, Zhang Guoxiong, Wang Jianli, Liu Shugui; Chinese Journal of Scientific Instrument (02); full text *

Also Published As

Publication number Publication date
CN113739696A (en) 2021-12-03
JP2021189179A (en) 2021-12-13
US20210372769A1 (en) 2021-12-02
US11499817B2 (en) 2022-11-15
DE102021113391A1 (en) 2021-12-02

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant