US20180286027A1 - Surface profiling and imaging system including optical channels providing distance-dependent image offsets
- Publication number: US 2018/0286027 A1 (application Ser. No. 15/471,811)
- Authority: US (United States)
- Prior art keywords: optical, array, imaging, profiling system
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
- G06T7/001—Industrial image inspection using an image reference approach
-
- H—ELECTRICITY
- H01—ELECTRIC ELEMENTS
- H01L—SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
- H01L27/00—Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
- H01L27/14—Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
- H01L27/144—Devices controlled by radiation
- H01L27/146—Imager structures
- H01L27/14601—Structural or functional details thereof
- H01L27/14625—Optical elements or arrangements associated with the device
- H01L27/14627—Microlenses
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/60—Analysis of geometric attributes
-
- H—ELECTRICITY
- H01—ELECTRIC ELEMENTS
- H01L—SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
- H01L27/00—Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
- H01L27/14—Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
- H01L27/144—Devices controlled by radiation
- H01L27/146—Imager structures
- H01L27/14601—Structural or functional details thereof
- H01L27/14603—Special geometry or disposition of pixel-elements, address-lines or gate-electrodes
- H01L27/14605—Structural or functional details relating to the position of the pixel elements, e.g. smaller pixel elements in the center of the imager compared to pixel elements at the periphery
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30108—Industrial image inspection
- G06T2207/30164—Workpiece; Machine component
Definitions
- This disclosure relates generally to precision metrology, and more particularly to surface profiling and imaging systems that may be utilized for determining surface height measurement coordinates for points on a surface of a workpiece that is being inspected.
- Various types of surface profiling systems are known that may be utilized for acquiring data regarding a surface (e.g., a surface of a workpiece that is being inspected).
- Various bore imaging systems are known that use a bore surface imaging arrangement for imaging the interior of a bore, for example a cylinder bore of an engine.
- Exemplary bore inspection systems are disclosed in U.S. Pat. Nos. 4,849,626; 7,636,204; 8,334,971; 8,570,505; U.S. Patent Publication Nos. 2013/0112881; 2016/0178534; and U.S. patent application Ser. No. 15/186,231, filed Jun. 17, 2016, each of which is hereby incorporated herein by reference in its entirety.
- Such bore imaging systems may be configured to provide a 360-degree view (also referred to as a panoramic view and/or image) of the interior of a bore in order to inspect for form errors or surface defects.
- These systems may use signal processing to map image pixel signals or detector element signals to coordinates on the interior surface of the bore.
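The pixel-to-surface mapping described above can be sketched in a few lines. This is an illustrative model only: the function and parameter names are hypothetical, and it assumes a simple geometry in which the detector pixels run parallel to the bore axis and the carrier rotation angle directly gives the circumferential coordinate.

```python
import math

def pixel_to_bore_coords(pixel_index, pixel_pitch_mm, rotation_deg,
                         bore_radius_mm, z_offset_mm=0.0):
    """Map a detector pixel index and a carrier rotation angle to cylindrical
    bore-surface coordinates (Z_A, R, phi). Assumes the pixel array runs
    parallel to the bore axis (hypothetical geometry for illustration)."""
    z_a = z_offset_mm + pixel_index * pixel_pitch_mm  # axial coordinate Z_A
    phi = math.radians(rotation_deg)                  # circumferential angle
    return (z_a, bore_radius_mm, phi)
```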
- Challenges may exist for determining highly accurate surface height measurement coordinates for workpiece surface points (e.g., due in part to the constrained spaces in which such systems may operate).
- A high-resolution, metrology-grade surface profiling system that is able to operate in constrained spaces and determine highly accurate surface height measurement coordinates for workpiece surface points would therefore be desirable.
- A surface profiling system is disclosed that includes an imaging detector array and an optical imaging array.
- The imaging detector array includes at least one set of detector pixels arrayed generally along the direction of a first array axis.
- The optical imaging array includes at least one set of optical channels, wherein the optical channels in a set are arrayed generally along the direction of the first array axis, and each optical channel is configured with its optical axis arranged transverse to the first array axis to input image light from a workpiece surface region located at an object distance along its optical axis in its field of view (FOV) and transmit the image light to a plurality of pixels of the imaging detector array located in its imaged field of view (IFOV).
- The detector pixels of the imaging detector array are arranged at a back imaging distance from the optical channels.
- Each optical channel includes a lens arrangement (e.g., including a GRIN lens) configured to provide an erect image in its IFOV for a workpiece surface located in its FOV and within a measuring range along the direction of the optical axes of the surface profiling system.
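The erect-image property of a GRIN lens arrangement can be illustrated with the standard paraxial ray-transfer (ABCD) matrix of a radial-gradient rod. This sketch is background optics rather than the patent's specific lens design, and the function and parameter names are illustrative.

```python
import math

def grin_matrix(n0, g, length):
    """Paraxial ABCD matrix of a radial-gradient (GRIN) rod with on-axis
    index n0, gradient constant g (radians per unit length), and the given
    physical length (standard textbook result, not the patent's design)."""
    c, s = math.cos(g * length), math.sin(g * length)
    return [[c, s / (n0 * g)], [-n0 * g * s, c]]
```

A full-pitch rod (g x length = 2*pi) yields the identity matrix, i.e., an erect, unit-magnification relay, whereas a half-pitch rod (g x length = pi) yields the negative identity, an inverted image.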
- The optical channels in a set are configured in the optical imaging array to have overlapping FOVs and overlapping IFOVs.
- A workpiece surface point that is located in the measuring range may be simultaneously imaged in N overlapping IFOVs of N optical channels included in the set of optical channels, where N is an integer that is at least 2.
- A surface point that is located at a defined object reference distance from the N optical channels in the measuring range may be imaged at the same respective position along the imaging detector array in each of the N overlapping IFOVs.
- Otherwise, the surface point may be imaged at different respective positions at least along the direction of the first array axis in each of the N overlapping IFOVs, the difference between at least two of the respective positions defining a respective image offset for that surface point.
- The surface profiling system includes a signal processing and control portion configured to perform various operations. Such operations may include acquiring image data provided by the imaging detector array and analyzing the image data to determine the respective image offset. The operations may also include providing a surface height measurement coordinate for the workpiece surface point along the direction of the optical axes based at least in part on the determined respective image offset.
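The offset-determination and height-measurement operations above might be sketched as follows. The patent does not prescribe a particular algorithm, so the cross-correlation approach, the linear offset-to-height conversion, and all names here are illustrative assumptions.

```python
import numpy as np

def image_offset(signal_a, signal_b):
    """Estimate the relative shift (in pixels) between two 1-D pixel signals
    from overlapping IFOVs via discrete cross-correlation (one of many
    possible offset estimators; not prescribed by the patent)."""
    a = np.asarray(signal_a, dtype=float)
    b = np.asarray(signal_b, dtype=float)
    corr = np.correlate(a - a.mean(), b - b.mean(), mode="full")
    return int(np.argmax(corr)) - (len(b) - 1)

def height_from_offset(offset_px, pixel_pitch_um, sensitivity):
    """Convert a measured image offset to a surface height coordinate
    relative to the object reference distance; 'sensitivity' (height change
    per unit of lateral image displacement) is a hypothetical calibration
    constant that a real system would determine experimentally."""
    return offset_px * pixel_pitch_um * sensitivity
```

In practice the calibration from offset to height would be established with reference artifacts rather than a single constant, but the linear form conveys the idea.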
- FIG. 1 is a diagram of a first exemplary implementation of a surface profiling system;
- FIG. 2 is a diagram of a second exemplary implementation of a surface profiling system;
- FIG. 3 is a diagram showing various operational aspects of one exemplary implementation of an imaging detector array and an optical imaging array as included in a surface profiling system;
- FIG. 4 is a diagram illustrating operation of a gradient-index (GRIN) lens as included in a lens arrangement of an optical channel of an optical imaging array;
- FIGS. 5A and 5B are diagrams of images illustrating discrete features of a workpiece surface as imaged at different respective object distances by a surface profiling system;
- FIGS. 6A and 6B are diagrams of one exemplary implementation of an imaging and detector configuration of a surface profiling system as including an imaging detector array and an optical imaging array; and
- FIG. 7 is a flow diagram illustrating one exemplary implementation of a routine for acquiring and analyzing image data provided by a surface profiling system to determine a surface height measurement coordinate for a workpiece surface point.
- FIG. 1 is a diagram of a first exemplary implementation of a surface profiling system 100 .
- The surface profiling system 100 includes a signal processing and control portion 101, an imaging and detector configuration 105, a carrier 170 and a support portion 180.
- The carrier 170 may include an arm portion 172 that is attached to the imaging and detector configuration 105, and a central portion 174 that may be rotated about a central axis, as will be described in more detail below.
- The imaging and detector configuration 105 includes an imaging detector array and an optical imaging array, as will be described in more detail below with respect to FIGS. 3, 6A and 6B.
- The signal processing and control portion 101 includes a motion control portion 102 and processing circuits 104.
- The carrier 170 may be mounted to or include a motion control system or the like (e.g., as controlled by the motion control portion 102) for rotating so as to scan the imaging and detector configuration 105 at the end of the arm portion 172 along a scanning direction SD (e.g., corresponding to an X axis direction as will be described in more detail below with respect to FIG. 3) to acquire image data for a desired workpiece surface region WSR (e.g., of a bore surface 160).
- The carrier 170 may also be moved axially by a motion control system for positioning the imaging and detector configuration 105 for scanning higher or lower workpiece surface regions of the bore surface 160.
- The imaging and detector configuration 105 may be stationary for certain operations and/or adjustments, and the bore surface may be moved (e.g., on a stage for axial and/or rotary movement, etc.) in a manner measured and controlled by the surface profiling system 100, according to known methods.
- The support portion 180 (e.g., including a central portion attached to the carrier 170) may be utilized to support and hold the carrier 170 in a steady centered relationship relative to the bore surface 160 while the carrier 170 is rotated.
- The arm portion 172 and central portion 174 of the carrier 170 may consist of hollow tubes that carry wires or other electrical connections (e.g., from the imaging and detector configuration 105 to the signal processing and control portion 101, etc.).
- The carrier 170 and/or support portion 180 may be lowered into the cylindrical bore and held by a structure from above (e.g., in a probe-type configuration).
- The elements of FIG. 1 may be arranged according to cylindrical coordinates Z A, R and φ, which are aligned with the cylindrical bore, wherein Z A corresponds to the axial direction and φ corresponds to the circumferential direction on the bore surface 160.
- As described above, a surface profiling system may comprise an imaging detector array including at least one set of detector pixels arrayed generally along the direction of a first array axis and an optical imaging array including at least one set of optical channels, wherein the optical channels in a set are arrayed generally along the direction of the first array axis and each optical channel is configured with its optical axis arranged transverse to the first array axis to input image light from a workpiece surface region located at an object distance along its optical axis in its field of view (FOV) and transmit the image light to a plurality of pixels of the imaging detector array located in its imaged field of view (IFOV).
- In the implementation of FIG. 1, the first array axis FA is straight, the optical imaging array of the imaging and detector configuration 105 may include at least one set of optical channels having approximately parallel optical axes, and the at least one set of optical channels may be arrayed generally along the direction of a first array axis FA, which may generally be in the same direction as the axial direction Z A of the cylindrical bore.
- The surface profiling system 100 may be configured to rotate the carrier 170 to cause the imaging and detector configuration 105 to scan along a workpiece surface region WSR of the bore surface 160 in the scan direction SD (i.e., which is transverse to the first array axis FA).
- The optical channels of the imaging and detector configuration 105 are each configured to input image light IL from a workpiece surface region WSR of the bore surface 160 and transmit the image light IL to a plurality of pixels of the imaging detector array of the imaging and detector configuration 105.
- The imaging detector array includes at least one set of detector pixels arrayed generally along the direction of the first array axis FA.
- The pixels are formed of photodetector elements that provide image data (e.g., intensity values), which may be output individually, or in parallel, or multiplexed, or serialized, or otherwise processed before being output on designated connections (e.g., provided through the carrier 170, etc.).
- The signal processing and control portion 101 may include processing circuits 104 that are provided as part of the imaging detector array and/or on the carrier 170 and which provide signals to other processing circuits 104 and/or other portions of the signal processing and control portion 101 via designated connections.
- Alternatively, the processing circuits 104 may be provided separately from the imaging detector array and/or carrier, and may receive pixel signals from the imaging detector array via the designated connections.
- An illumination portion (not shown) is connected to an illumination power and control element, which may be provided as part of or in connection with the signal processing and control portion 101 (e.g., via an illumination/control line).
- The illumination portion is arranged to provide source light SL directed toward a workpiece surface region WSR of the bore surface 160, wherein the source light SL may be reflected from the workpiece surface region WSR as image light IL that is received by the optical channels of the imaging and detector configuration 105.
- More specifically, the image light IL is received by a lens arrangement of each optical channel that is configured to provide an erect image of at least a portion of the workpiece surface region WSR in its imaged field of view IFOV.
- The illumination portion may be provided as part of the imaging and detector configuration 105 (e.g., as will be described in more detail below with respect to FIGS. 6A and 6B), or on the carrier 170, or as an independent element (e.g., a ring of illumination elements as will be described in more detail below with respect to FIG. 2), or in any other convenient form.
- The optical channels of the imaging and detector configuration 105 may have overlapping fields of view FOV and overlapping imaged fields of view IFOV.
- A workpiece surface point in the workpiece surface region WSR may be simultaneously imaged in two or more overlapping IFOVs of two or more optical channels.
- A surface point that is not at an object reference distance may be imaged at different respective positions in the IFOVs, and the difference between the respective positions may define a respective image offset for that surface point.
- Corresponding image data may be analyzed by the signal processing and control portion 101 (e.g., by processing circuits 104) to determine the respective image offset.
- A surface height measurement coordinate for the surface point (e.g., as related to a distance between the surface point and the imaging and detector configuration 105) may be determined based on the determined image offset.
- A workpiece surface point that is imaged in this manner may be one of multiple surface points that are imaged along the direction of the first array axis FA, and the signal processing and control portion 101 may further be configured to perform operations comprising determining respective coordinates for each of the multiple surface points. For example, respective surface height measurement coordinates for each of the multiple surface points may be determined based at least in part on determined respective image offsets for each of the multiple surface points.
- The signal processing and control portion 101 may further be configured to perform operations including constructing a synthetic image of the workpiece surface region WSR of the bore surface 160, wherein respective image offsets corresponding to a plurality of surface points are compensated and/or removed, and a majority of the synthetic image appears substantially focused.
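Constructing such a synthetic image could be sketched, for a single row of 1-D sub-images, by shifting each channel's data to cancel its measured image offset and averaging the results. This is a simplified illustration (np.roll wraps around at the array edges, which a real implementation would need to handle), and the function and parameter names are hypothetical.

```python
import numpy as np

def synthesize_focused_row(sub_images, offsets_px):
    """Shift each channel's 1-D sub-image to cancel its measured image
    offset, then average the shifted copies into one substantially
    focused synthetic row (edge wrap-around ignored in this sketch)."""
    acc = np.zeros_like(np.asarray(sub_images[0], dtype=float))
    for img, off in zip(sub_images, offsets_px):
        # np.roll wraps at the edges -- acceptable for this sketch only
        acc += np.roll(np.asarray(img, dtype=float), -int(off))
    return acc / len(sub_images)
```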
- The signal processing and control portion 101 may be further configured to perform operations comprising determining contrast-based Z-height information for at least a portion of the workpiece surface region WSR, as will be described in more detail below.
- FIG. 2 is a diagram of a second exemplary implementation of a surface profiling system 200 .
- the surface profiling system 200 may have certain characteristics that are similar to those of the surface profiling system 100 of FIG. 1 , and will be understood to operate similarly except as otherwise described below.
- The surface profiling system 200 includes a signal processing and control portion 201, an imaging and detector configuration 205, a curved carrier 270, a support portion 280 and an illumination portion 285.
- The imaging and detector configuration 205 is carried on the curved carrier 270.
- The imaging and detector configuration 205 includes an imaging detector array and an optical imaging array, portions or all of which may correspondingly be carried on the curved carrier 270, which may hold them in a stable form.
- Certain portions or all of the surface profiling system 200 may include and/or be carried on a schematically represented support portion 280, which holds certain portions in proper relationships and which may be mounted to or include a motion control system or the like (e.g., as controlled by the motion control portion 202) for scanning the imaging and detector configuration 205 along an axial scanning direction SD to image a desired axial workpiece surface region of a bore surface 260.
- Either the imaging and detector configuration 205 or the bore surface 260 may be stationary, and the other may be moved in a manner measured and controlled by the surface profiling system 200, according to known methods.
- The axial scanning direction SD may generally correspond to the axial direction Z A.
- The axial scanning direction SD may also correspond to an X or Y axis direction with respect to the relative coordinates of the imaging and detector configuration 205.
- The illumination portion 285 may include a strobe light source, controllable to determine an exposure duration and timing (e.g., a timing triggered at a particular imaging position).
- The illumination portion 285 is connected to an illumination power and control element, which may be provided as part of or in connection with the signal processing and control portion 201, via an illumination/control line 286.
- The illumination portion 285 is arranged to provide source light SL to a workpiece surface region WSR on the bore surface 260.
- In other implementations, an illumination portion may be omitted, or provided as part of the imaging and detector configuration 205, or otherwise provided on the carrier 270, or in any other convenient form.
- The source light SL is reflected from the workpiece surface region WSR as image light IL that is received by optical channels 236 of the optical imaging array of the imaging and detector configuration 205.
- As described above, a surface profiling system may comprise an imaging detector array including at least one set of detector pixels arrayed generally along the direction of a first array axis and an optical imaging array including at least one set of optical channels, wherein the optical channels in a set are arrayed generally along the direction of the first array axis and each optical channel is configured with its optical axis arranged transverse to the first array axis to input image light from a workpiece surface region located at an object distance along its optical axis in its field of view (FOV) and transmit the image light to a plurality of pixels of the imaging detector array located in its imaged field of view (IFOV).
- In the implementation of FIG. 2, the first array axis FA follows a circular path, and the optical imaging array of the imaging and detector configuration 205 may include at least one set of optical channels configured with their optical axes aligned along respective radial directions transverse to the circular path.
- The at least one set of optical channels may be arrayed generally along the direction of the circular first array axis FA, which may generally correspond to the direction of the circumference of the cylindrical bore.
- The surface profiling system 200 may be configured to move the carrier 270 to cause the imaging and detector configuration 205 to scan along a workpiece surface region WSR of the bore surface 260 in the scan direction SD (along the direction of the cylinder axis).
- The optical imaging array of the imaging and detector configuration 205 includes optical channels 236-1 to 236-n having optical axes radially aligned transverse to the circular first array axis FA, for which the optical channels 236-1 to 236-n are arrayed generally along the circular direction of the first array axis FA.
- FIG. 2 illustrates the ends of the optical channels 236 as facing toward the bore surface 260.
- Each optical channel 236 is configured to input image light IL from a workpiece surface region WSR of the bore surface 260 located at an object distance along its optical axis in its field of view FOV and transmit the image light IL to a plurality of pixels of the imaging detector array of the imaging and detector configuration 205 located in its imaged field of view IFOV.
- The imaging detector array of the imaging and detector configuration 205 includes at least one set of detector pixels (not shown) arrayed generally along the circular direction of the first array axis FA.
- The pixels are formed of photodetector elements that provide image data (e.g., intensity values), which may be output individually, or in parallel, or multiplexed, or serialized, or otherwise processed before being output on connections 233.
- The signal processing and control portion 201 may include processing circuits 204 that are provided as part of the imaging detector array (e.g., the imaging detector array 310 of FIG. 3) and/or on the carrier 270.
- Alternatively, the processing circuits 204 may be provided separately from the carrier 270 and/or the imaging detector array of the imaging and detector configuration 205, and may receive pixel signals from the imaging detector array via the connections 233.
- The imaging and detector configuration 205 is curved in an approximately circular shape on the carrier 270.
- The carrier 270 may be a portion of the support portion 280.
- Portions of the imaging and detector configuration 205 may be formed on a substrate that may be a flex print, an elastomer, a thinned semiconductor substrate, or another curvable substrate that provides the required properties (e.g., for providing a curved optical imaging array, etc.).
- The optical imaging array of the imaging and detector configuration 205 may be formed of a flexible IC sensor, such as a FleX™ Silicon-on-Polymer™ CMOS sensor available from American Semiconductor of Boise, Id., or a curved high-resolution CCD sensor provided by Andanta of Olching, Germany.
- Various other usable alternatives for fabricating a curved optical imaging array configuration are disclosed in U.S. Pat. Nos. 6,791,072; 6,849,843; 7,786,421; 8,372,726; 8,742,325; and U.S. Patent Publications 2010/0264502 and 2012/0261551, all of which are hereby incorporated herein by reference in their entirety.
- The optical imaging array of the imaging and detector configuration 205 may comprise multiple imaging arrays which are each nominally flat over a limited span, but are arranged along a curved form of the optical imaging array.
- For example, a plurality of nominally flat imaging arrays may be provided on a flexible substrate that extends along the curved form of the optical imaging array.
- In such a configuration, each of the imaging arrays should not receive an unacceptably blurred image of its corresponding portion of a workpiece surface region WSR.
- Any corresponding portions of the optical imaging array and corresponding optical channels (e.g., each including a lens arrangement) should be designed to have complementary curvatures to the extent required to maintain each pixel within a desirable image focus depth or range.
- FIG. 3 is a diagram of one exemplary implementation of a surface profiling system 300 including an imaging and detector configuration 305 having an imaging detector array 310 and an optical imaging array 330 .
- the imaging detector array 310 includes at least one set 315 of detector pixels 316 arrayed generally along a direction of a first array axis FA, which may be understood to represent a straight first array axis (as in FIG. 1 ), or a segment of a curved or circular first array axis (as in FIG. 2 ).
- The optical imaging array 330 includes at least one set 335 of optical channels 336, wherein the optical channels in a set are arrayed generally along the direction of the first array axis FA, and each optical channel is configured with its optical axis 338 arranged transverse to the first array axis to input image light from a workpiece surface region located at an object distance along its optical axis in its field of view (FOV) and transmit the image light to a plurality of pixels of the imaging detector array located in its imaged field of view (IFOV).
- The optical axes may be approximately parallel. In the implementation of FIG. 3, example optical channels 336-1 to 336-5 having optical axes 338-1 to 338-5 are illustrated as part of the optical imaging array 330. It will be appreciated that the numbers of detector pixels 316 and optical channels 336 in the configuration of FIG. 3 are for purposes of illustration only, and that other implementations may have more or fewer detector pixels and/or optical channels, depending on the implementation.
- Each optical channel 336 is configured to input image light IL from a workpiece surface region WSR (e.g., of a bore surface) located at an object distance OD along a direction of an optical channel's optical axis 338 in the optical channel's field of view FOV, and transmit the image light IL to a plurality of pixels 316 of the imaging detector array 310 located in the optical channel's imaged field of view IFOV.
- The surface profiling system 300 further includes one or more light sources (not shown) that provide source light directed toward the workpiece surface region WSR, wherein the source light is reflected from the workpiece surface region WSR as the image light IL that is received by the lens arrangement of each optical channel 336, which is configured to provide an erect image of at least a portion of the workpiece surface region WSR in its imaged field of view IFOV (e.g., the optical channel 336-1 receives image light IL-1, the optical channel 336-2 receives image light IL-2, etc.).
- The imaging detector array 310 is configured to provide at least M pixels 316 that are located in an optical channel 336 imaged field of view IFOV, where M is an integer that is at least a minimum amount (e.g., 10, 25, 50, 100, etc.). It will be appreciated that the larger M is, the better the resolution with which the respective image offset IO corresponding to an imaged surface point can be determined. Generally speaking, the better the resolution of the determination of the image offset IO, the better the resolution of the corresponding surface point height measurement.
- For clarity, only the optical channel 336-5 is illustrated with its field of view FOV-5 and imaged field of view IFOV-5.
- Each of the optical channels has a similar corresponding field of view FOV and imaged field of view IFOV, which overlap with one another, as will be described in more detail below.
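Beyond increasing M, the resolution of an offset determination can be pushed below one pixel by interpolating around a correlation peak. The three-point parabolic fit below is one common, illustrative choice rather than anything prescribed by the patent.

```python
def subpixel_peak(c_minus, c_peak, c_plus):
    """Refine an integer correlation-peak location to sub-pixel precision by
    fitting a parabola through the peak sample and its two neighbors; the
    return value is the fractional shift of the true peak from the integer
    peak index (illustrative refinement step, names hypothetical)."""
    denom = c_minus - 2.0 * c_peak + c_plus
    if denom == 0.0:
        return 0.0
    return 0.5 * (c_minus - c_plus) / denom
```

The refined offset is the integer lag from the correlation search plus this fractional correction.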
- the detector pixels 316 of the imaging detector array 310 are arranged at a back imaging distance BD from the optical channels 336 .
- the back imaging distance BD may be made to be approximately equal to a back focal length of the optical channels 336
- an object reference distance RD may be made to be approximately equal to a front focal length of the optical channels 336 , such as may provide certain advantages depending on the implementation.
- the front and back focal lengths and/or the back imaging distance BD and the object reference distance RD may be equal.
- other combinations of back imaging distance BD and object reference distance RD may operate according to the principles outlined herein, and that these examples are illustrative only, and not limiting.
- Each optical channel 336 includes a lens arrangement (e.g., as will be described in more detail below with respect to FIG. 4 ) configured to provide an erect image in its imaged field of view IFOV for a workpiece surface WS located in its FOV and within a measuring range along the direction of the optical axes of the surface profiling system 300 .
- the optical channels 336 are configured in the optical imaging array 330 to have overlapping fields of view FOV and overlapping imaged fields of view IFOV.
- the optical channels 336 may be adjacent to one another along the direction of the first array axis FA.
- a measuring range of the surface profiling system 300 may be at least 50 micrometers along the direction of the optical axis 338 (e.g., along a Z axis), and the optical imaging array may have a dimension of at least 5 mm along the first array axis FA (e.g., along a Y axis).
- the measuring range may be any operable range less than or more than 50 micrometers (e.g., 100 micrometers or more, in some implementations) and the optical imaging array and imaging detector array may be much longer than 5 mm along the first array axis FA (e.g., 1 meter or more, in some implementations), if desired for a particular application.
- a dimension along the first array axis FA that is at least as long as the bore height is advantageous for throughput, such that a single revolution of the scanning mechanism allows imaging of an entire bore surface.
- a dimension of the surface profiling system perpendicular to the first array axis FA may be minimal in a one-dimensional system (e.g., comprising only 1 row or set of optical channels and 1 row or set of detector pixels). It will be appreciated that the surface profiling system and/or workpiece surface may be scanned relative to one another along a direction transverse to the first array axis, in order to create a two-dimensional profile map of a workpiece surface. Therefore, it is a design choice whether or not to include a plurality of sets (rows) of optical channels and/or detector pixels along a direction perpendicular to the first array axis FA.
- the measuring range may include the object reference distance RD of the N optical channels 336 .
- the measuring range (which may be defined by operating specifications of the surface profiling system, and/or by inherent operating limitations of the optical configuration) may be asymmetrical about the object reference distance RD, if desired.
- a workpiece surface point SP (e.g., located at coordinates X1, Y1, Z1) that is located in the measuring range of the surface profiling system 300 may be simultaneously imaged in at least N overlapping imaged fields of view IFOVs of N optical channels 336 , where N is an integer that is at least a minimum amount (e.g., 2, 3, etc.).
- the workpiece surface point SP is illustrated as being imaged in the overlapping imaged fields of view IFOVs of at least the optical channels 336 - 1 , 336 - 2 , and potentially 336 - 3 .
- the surface profiling system is configured according to known optical principles such that a surface point that is located at a defined object reference distance RD from the N optical channels 336 in the measuring range is imaged at the same respective position (e.g., the same pixel position) along the imaging detector array 310 in each of the N overlapping imaged fields of view IFOVs.
- the surface point SP is imaged at different respective positions at least along the direction of the first array axis FA (e.g., corresponding to the Y axis) in each of the N overlapping imaged fields of view IFOVs, wherein the difference between at least two respective imaged positions defines a respective image offset IO for the surface point SP.
- the surface point SP is thus shown as being imaged at the detector array 310 at the positions PN- 1 , PN- 2 (and potentially PN- 3 ) by the optical channels 336 - 1 , 336 - 2 (and potentially 336 - 3 ), respectively, and a corresponding image offset IO is correspondingly determined (e.g., indicated by a difference between the positions PN- 1 and PN- 2 ).
- corresponding image data provided by the imaging detector array 310 may be acquired, and the image data may be analyzed to determine the respective image offset IO.
- a surface height measurement coordinate (e.g., Z1) for the workpiece surface point SP along the direction of the optical axis (e.g., along the direction of the Z axis) may be determined and provided based at least in part on the determined respective image offset IO. Further examples of image offsets will be illustrated and described in more detail below with respect to FIGS. 5A and 5B .
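The conversion from a determined image offset IO to a surface height coordinate can be sketched with a simple first-order model. The Python sketch below assumes near-unit magnification, an adjacent-channel pitch, and a linear proportionality IO ≈ pitch × dz / RD; the function names and all numeric values are illustrative assumptions for this sketch, not values from this disclosure.

```python
# Illustrative first-order model (an assumption for illustration, not this
# disclosure's exact optics): for near-unit-magnification channels at pitch p,
# a surface point displaced by dz from the object reference distance RD images
# through adjacent channels with an offset IO of roughly p * dz / RD.

def image_offset(dz_um, pitch_um=500.0, rd_um=5000.0):
    """Predicted adjacent-channel image offset for a height deviation dz (um)."""
    return pitch_um * dz_um / rd_um

def height_from_offset(io_um, pitch_um=500.0, rd_um=5000.0):
    """Invert the model: height deviation from the object reference distance."""
    return io_um * rd_um / pitch_um

io = image_offset(250.0)                 # surface 250 um beyond reference
assert abs(height_from_offset(io) - 250.0) < 1e-9
```

In practice the offset-to-height relationship would be established by calibration of the actual optics rather than by this linear approximation.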
- the surface profiling system 300 may be configured to have the imaging and detector configuration 305 scan along the workpiece surface region WSR in a direction that is transverse to the first array axis FA (e.g., in an X axis direction, which may correspond to a “P” circumferential direction on a bore surface as described above with respect to FIG. 1 ).
- FIG. 4 is a diagram illustrating operation of a gradient-index (GRIN) lens 400 such as may be included in the lens arrangement of an optical channel of an optical imaging array (e.g., in each of the optical channels 336 of FIG. 3 ).
- the travel of light rays through the GRIN lens 400 is illustrated and may be referenced according to a 1/4 pitch, 1/2 pitch and 1 pitch. As illustrated, at a 1/2 pitch length of the GRIN lens 400 , an inverted image is produced, while at a 1 pitch length of the GRIN lens 400 , an erect image is produced.
- a GRIN lens 400 may be cut to a desired length to produce a desired image for a specific application (e.g., to produce an erect image).
- in various implementations, a GRIN lens that is included as the lens arrangement of an optical channel (e.g., an optical channel 336 of FIG. 3 ) may have its pitch length chosen to set a back imaging distance BD that achieves the 1× imaging system needed to minimize optical stitching effects resulting from the array of optical channels 336 .
- the nominal object reference distance RD may be equal to BD.
- RD and/or BD may have example design values in the range 0.5 mm to 15 mm to allow noncontact measurement in small bore sizes.
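For a GRIN rod, one full pitch corresponds to a physical length of 2π/g, where g is the lens gradient constant, so a rod can be cut to a desired fractional pitch as sketched below. The gradient constant value used is a hypothetical placeholder; actual values come from the lens manufacturer's data.

```python
import math

# One full pitch of a GRIN rod corresponds to a physical length of 2*pi/g,
# where g is the gradient constant (1/mm). The default g below is a
# hypothetical placeholder, not a value from this disclosure.
def grin_length_for_pitch(pitch, g=0.3):
    """Physical length (mm) of a GRIN rod for a desired fractional pitch."""
    return pitch * 2.0 * math.pi / g

half_pitch = grin_length_for_pitch(0.5)   # produces an inverted image
full_pitch = grin_length_for_pitch(1.0)   # produces an erect image
assert abs(full_pitch - 2.0 * half_pitch) < 1e-12
```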
- the lens arrangement of an optical channel 336 may be made to include additional or alternative lenses (e.g., microlenses, etc.) arranged along the optical axis, which may be configured to produce an erect image at the detector array 310 .
- FIGS. 5A and 5B are diagrams of images 500 A and 500 B illustrating discrete features of a workpiece as imaged at different respective object distances OD by a surface profiling system.
- the images 500 A and 500 B may be provided by a surface profiling system that includes a plurality of sets of optical channels and many sets of detector pixels, in order to provide a two-dimensional image.
- the two-dimensional image could be provided by scanning a surface profiling system relative to the workpiece along a direction transverse to the first array axis of a single set of optical channels and/or detector pixels, and forming a composite image from multiple scanned “one-dimensional” image data.
- a workpiece surface point may be one of a number of surface points located along a discrete feature of a workpiece surface region.
- such edges may be oriented transverse to the first array axis, such that as described above when the discrete feature is located at an object distance OD that is different than the object reference distance of the N optical channels (e.g., optical channels 336 of FIG. 3 ), each workpiece surface point of the discrete feature may be imaged at different respective positions along the imaging detector array in each of the N overlapping IFOVs.
- the images 500 A and 500 B are produced by a surface profiling system that is approximately 250 um and 500 um, respectively, away from best focus relative to the workpiece surface. More specifically, the illustrated discrete features in the images 500 A and 500 B are located at an object distance that is different than the object reference distance of the optical channels of an imaging and detector configuration by approximately 250 um and 500 um, respectively.
- the two-dimensional images 500 A and 500 B may be reconstructed images formed by combining image data from a scan by a one-dimensional imaging and detector configuration (e.g., the configurations of FIGS. 3, 6A and 6B ), as will be described in more detail below.
- it may be desirable for the system to utilize a measuring range wherein a difference between the object distance and the object reference distance of the N optical channels (e.g., an out of focus amount) produces relatively clear images of the surface points/discrete features at the different respective positions (e.g., so that a respective image offset may be more accurately determined, etc.).
- contrast-based or other evaluation techniques may be utilized to evaluate image data and/or images (e.g., such as images 500 A and 500 B) produced by the system to determine a desirable out of focus level for the system to utilize (e.g., one that produces clear discrete feature images and/or separations for use in determining respective image offsets, etc.).
- the image 500 B may represent a preferred out of focus level of 500 um in the measuring range at which the system may operate. It is noted that in the example of the image 500 B, the difference between the respective imaged positions is relatively clear, such that the respective image offset may be determined with relatively high accuracy.
- a surface profiling system formed in accordance with the principles disclosed herein may provide sensitivity for the determination of the surface height measurement coordinate in the range of 1 um as compared to 10 um, or approximately 10× accuracy improvement (e.g., due to the larger triangulation angles from the multiple spatially arranged optical channels 336 ) as compared to prior techniques (e.g., points from focus operations based on blurriness and utilizing a single large lens). It will further be appreciated that such characteristics allow a surface profiling system to be formed in a more compact configuration that may operate in constrained spaces (e.g., for scanning an interior bore surface) with a higher degree of accuracy for the determination of surface height measurement coordinates.
- a discrete feature is imaged at different respective “image offset” positions PN- 1 a , PN- 2 a corresponding to the image offset IOa, between two adjacent optical channels.
- the image offset amount IOa is due to a “relatively lesser” difference between an object distance (or surface height) of the indicated edge feature and the object reference distance of the surface profiling system.
- the same discrete feature is imaged at different respective “image offset” positions PN- 1 b , PN- 2 b corresponding to the image offset IOb, between two adjacent optical channels.
- the image offset amount IOb is due to a “relatively greater” difference between the object distance (or surface height) of the indicated edge feature and the object reference distance of the surface profiling system.
- the various image offset amounts IO corresponding to various object distances in the measuring range can be calibrated for the surface profiling systems, such that any image offset amount IO is quantitatively indicative of the difference between the object distance (or surface height) of a feature and the object reference distance of the surface profiling system.
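Such a calibration may be represented as a lookup table mapping measured image offsets to surface heights, as in the following hedged sketch; the table values and function name are invented for illustration.

```python
import numpy as np

# Hypothetical calibration table: image offsets (um) recorded while a flat
# reference surface is stepped through known heights (um, relative to the
# object reference distance RD). All values are invented for illustration.
cal_offsets = np.array([-50.0, -25.0, 0.0, 25.0, 50.0])
cal_heights = np.array([-500.0, -250.0, 0.0, 250.0, 500.0])

def height_from_calibration(io_um):
    """Interpolate a measured image offset into a surface height coordinate."""
    return float(np.interp(io_um, cal_offsets, cal_heights))

assert height_from_calibration(12.5) == 125.0   # linear between table entries
```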
- a surface height measurement coordinate (e.g., Z1) for the corresponding surface height of an imaged workpiece surface point or feature along the direction of the optical axis (e.g., along the direction of the Z axis) may be determined and provided based (at least in part) on its determined respective image offset IO.
- the two images 500 A and 500 B show a workpiece feature at two different heights corresponding to the image offsets IOa and IOb, respectively, in order to illustrate the operative measurement principle disclosed herein.
- only one image of a surface is needed, and each respective feature in that image will have a respective image offset IO, such that a three-dimensional surface map may be determined for the surface region included in that image.
- the analyzing of the image data (e.g., by the signal processing and control portion 101 of FIG. 1 ) to determine the respective image offset IO may include at least one of Fourier analysis of spatial frequencies, auto-correlation operations, etc.
- an analysis technique may be utilized that defines a point spread function as a function of surface height (e.g., as a function of Z). For example, for a one-dimensional function, a point spread function may be based on a specified number of imaged positions (e.g., two imaged positions) increasing in distance and size with Z.
- an extended depth of field image and Z profile map may be constructed utilizing known techniques.
- the resulting process allows increased speed through use of one-dimensional point spread functions, with a one-dimensional imaging and detector configuration in a line scan type system (e.g., utilizing the imaging and detector configuration of FIG. 3 for performing line scan type operations such as those described above).
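The auto-correlation approach mentioned above can be illustrated for a one-dimensional line image: a feature imaged at two offset positions produces an autocorrelation side peak whose lag equals the image offset. The feature positions in this NumPy sketch are invented for illustration.

```python
import numpy as np

# Sketch of the auto-correlation approach: a feature imaged twice (offset)
# in a 1-D line image yields an autocorrelation side peak at the offset lag.
line = np.zeros(200)
line[60] = 1.0      # feature as imaged by one optical channel
line[75] = 1.0      # same feature as imaged by an adjacent channel

ac = np.correlate(line, line, mode="full")
lags = np.arange(-len(line) + 1, len(line))
positive = lags > 0                       # ignore the zero-lag self peak
offset = int(lags[positive][np.argmax(ac[positive])])
assert offset == 15                       # recovered image offset, in pixels
```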
- the analyzing of the image data to determine the respective image offset may also or alternatively include utilizing a video tool to determine a distance between the respective positions that the surface point/discrete feature is imaged at.
- a video tool may be utilized to determine the positions PN- 1 b and PN- 2 b of the imaged discrete feature and/or the corresponding image offset distance between the positions PN- 1 b and PN- 2 b .
- some systems may include GUI features and predefined image analysis “video tools” such that operation and programming can be performed by “non-expert” operators. For example, U.S. Pat. No.
- Exemplary video tools may include edge location tools, which are sometimes referred to as “box tools,” which may be used to locate an edge that defines a discrete feature of a workpiece (e.g., utilized to determine the positions PN- 1 b and PN- 2 b of the imaged discrete feature).
- a signal processing and control portion may further be configured to perform operations comprising determining contrast based Z-height information for at least a portion of the workpiece surface region. For example, in addition to determining surface height measurement coordinates for surface points/discrete features based on determined image offsets, more traditional contrast based Z-height determination techniques may also or alternatively be utilized for determining the Z-height of a portion of a workpiece surface region. In various implementations, such contrast based Z-height determination techniques may be implemented through utilization of certain types of video tools (e.g., autofocus and/or focus height determination video tools, etc.). For example, commonly assigned U.S. Pat. No. 8,111,938, which is hereby incorporated herein by reference in its entirety, teaches various applications of autofocus video tools that utilize contrast based Z-height determination techniques.
- certain of the above noted techniques may be utilized in combination with motion control that may be utilized to scan at known and desired spacing transverse to the first array axis, such that the “pixel coordinates” in combination with the motion control position coordinates allow the reconstruction of a two-dimensional image if desired (e.g., such as may be utilized to form the images 500 A and 500 B).
- an image offset may be analyzed along either or both directions of the reconstructed image, and the Z coordinate for a particular XY coordinate can be determined from the X or Y offset, or a combination thereof.
- the Y coordinate along the first array axis and the associated Z coordinate based on the image offset can be determined for each scan position (e.g., the X coordinate position, as determined by rotary encoder on scan arm 174 or other external sensor), and that data for multiple scan positions may be combined into a three-dimensional surface profile or map, without the intermediate step of assembling the individual scan image data into two-dimensional image data.
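Assembly of a three-dimensional profile directly from successive line scans, without an intermediate two-dimensional image, might be sketched as follows. The pixel pitch, the linear offset-to-Z factor, and the function name are illustrative assumptions.

```python
# Hypothetical assembly of a 3-D profile from successive line scans: for each
# scan position x (e.g., from an external encoder), each pixel yields a Y
# coordinate along the first array axis and a Z from its measured image
# offset, with no intermediate 2-D image reconstruction.

def assemble_profile(scans, pixel_pitch_um=10.0, offset_to_z=10.0):
    """scans: list of (x_um, [offset per pixel]) -> list of (x, y, z) points."""
    points = []
    for x_um, offsets in scans:
        for pixel_index, io_px in enumerate(offsets):
            y_um = pixel_index * pixel_pitch_um
            z_um = io_px * offset_to_z        # assumed linear calibration
            points.append((x_um, y_um, z_um))
    return points

profile = assemble_profile([(0.0, [0.0, 1.0]), (50.0, [2.0, 0.0])])
assert profile[1] == (0.0, 10.0, 10.0)
```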
- a two-dimensional image that is assembled may be displayed by the surface profiling system.
- the two-dimensional image may also be “de-blurred” if desired, by compensating or removing the local offset at local regions throughout the two-dimensional reconstructed image.
- in implementations where a two-dimensional imaging and detector configuration (e.g., having multiple columns of optical channels) is utilized, a detected two-dimensional image that corresponds to the size of the imaging detector array may be acquired without requiring motion and reconstruction.
- once a detected or constructed two-dimensional image is available, it may be analyzed using any method disclosed herein or otherwise known.
- the two-dimensional image may be analyzed to provide surface height information (e.g., resulting in three-dimensional surface profile data).
- the image may also be deblurred as described herein, to provide an extended depth of field (EDOF) image for viewing.
- a two-dimensional image may not be required depending on the nature of the designated output of the system. For example, if a cylindrical bore is being inspected to determine if there are defects in the cylindrical bore (e.g., inspecting for form errors or surface defects along the interior surface, etc.), an output provided by the system may primarily indicate whether or not the current cylindrical bore is free of defects or otherwise passes a designated inspection analysis. In such an implementation, as the surface height measurement coordinates are determined for the workpiece surface points on the interior bore surface, a warning or other indicator may be provided if a certain number of the surface height measurement coordinates are determined to deviate from expected values (e.g., if the interior surface of the cylindrical bore includes an unexpected number or depth of form errors or surface defects, etc.).
- such determinations may also be made based on a relative comparison between determined surface height measurement coordinates (e.g., wherein deviations among certain of the determined surface height measurement coordinates, such as in a given column, may indicate an unsmooth or otherwise defective interior bore surface, etc.).
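A pass/fail inspection output of the kind described above can be sketched as a simple outlier count against a nominal surface height; the tolerance, outlier threshold, and function name are illustrative, not from this disclosure.

```python
# Hedged sketch of the pass/fail bore inspection described above: flag the
# part when too many measured surface heights deviate from nominal. The
# thresholds below are invented for illustration.

def inspect_bore(heights_um, nominal_um=0.0, tol_um=5.0, max_outliers=3):
    """Return (passes, outlier_count) for a list of surface height coords."""
    outliers = sum(1 for h in heights_um if abs(h - nominal_um) > tol_um)
    return outliers <= max_outliers, outliers

ok, n = inspect_bore([0.2, -1.0, 0.5, 12.0, 0.1])   # one deep defect
assert ok and n == 1
```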
- FIGS. 6A and 6B are diagrams of one exemplary implementation of an imaging and detector configuration 605 of a surface profiling system 600 . It will be appreciated that the surface profiling system 600 may have similar characteristics and will be understood to operate similarly to the surface profiling systems 100 , 200 and 300 , except as otherwise described below. As shown in FIGS. 6A and 6B , the imaging and detector configuration 605 includes an imaging detector array 610 , an optical imaging array 630 and light source arrays 685 and 685 ′.
- the light source arrays 685 and 685 ′ may each include an array of individual light sources (e.g., LEDs, etc.).
- each of the light source arrays 685 and 685 ′ may provide source light SL directed toward a workpiece surface region WSR, wherein the source light SL may be reflected from the workpiece surface region WSR as image light IL that is received by the lens arrangement of each optical channel of the optical imaging array 630 that is configured to provide an erect image of at least a portion of the workpiece surface region WSR in its imaged field of view IFOV.
- the surface profiling system 600 may be configured to have the imaging and detector configuration 605 scan along the workpiece surface region WSR in a direction that is transverse to a first array axis FA (e.g., in an X axis direction, which may correspond to a “P” circumferential direction on a bore surface as described above with respect to FIG. 1 ).
- the imaging and detector configuration 605 may include only a single set of detector pixels in the imaging detector array 610 and a single set of optical channels in the optical imaging array 630 in a one-dimensional configuration of the surface profiling system 600 .
- an imaging and detector configuration may include multiple sets of detector pixels in an imaging detector array and/or multiple sets of optical channels in an optical imaging array in a multi-dimensional configuration of a surface profiling system.
- an imaging detector array may include multiple sets (e.g., columns) of detector pixels arrayed generally along the direction of a first array axis FA.
- a corresponding optical imaging array may include multiple sets (e.g., columns) of optical channels having parallel optical axes, wherein each of the sets of optical channels may be arrayed generally along the direction of a first array axis FA.
- Each optical channel may be configured to input image light from a workpiece surface region located at an object distance along its optical axis in its field of view FOV and transmit the image light to a plurality of pixels of the imaging detector array located in its imaged field of view IFOV.
- the multiple sets of detector pixels and the multiple sets of optical channels may each be arrayed along a second array axis (e.g., in an X axis direction) that is transverse to the first array axis FA (e.g., each set being arranged in a respective column wherein the columns are arranged along the second array axis, etc.).
- the imaging detector array 610 may include a plurality of similar sets of detector pixels
- the optical imaging array 630 may include a plurality of similar sets of optical channels
- the plurality of sets of detector pixels may be arranged adjacent to one another and the plurality of sets of optical channels may be arranged adjacent to one another, along a second array axis that is transverse to the first array axis FA.
- multiple surface profiling systems and/or imaging and detector configurations may be utilized in combination.
- multiple imaging and detector configurations may be arranged relative to one another in a specified configuration.
- an arrangement of first and second imaging and detector configurations may correspond to a V-shape (e.g., in a “triangulation” three-dimensional imaging configuration).
- the optical imaging array of the first imaging and detector configuration may be arranged relative to the optical imaging array of the second imaging and detector configuration in a V-shape, as illustrated.
- a surface profiling system may comprise a first profiling subsystem that includes a first imaging detector array and a first optical imaging array corresponding to a first array axis that is approximately straight. It may further comprise a second profiling subsystem similar to the first profiling subsystem.
- the optical axes of the first profiling subsystem may approximately align with a first plane
- the optical axes of the second profiling subsystem approximately align with a second plane
- the first and second planes may intersect at a line approximately parallel to their respective first array axes, and form an angle in a plane transverse to their respective first array axes.
- the signal processing and control portion may be configured to use image data provided by the first and second profiling subsystems in combination to perform a three-dimensional measurement operation. It will be appreciated that in accordance with principles disclosed herein, such surface profiling systems may be made smaller than traditional surface profiling systems, which also allows for smaller combined implementations to be produced.
- FIG. 7 is a flow diagram illustrating one exemplary implementation of a routine 700 for acquiring and analyzing image data provided by a surface profiling system to determine a surface height measurement coordinate for a workpiece surface point.
- image data is acquired as provided by a surface profiling system for a workpiece surface point.
- the surface profiling system may include an imaging detector array and an optical imaging array with at least one set of optical channels, wherein the optical channels in a set are arrayed generally along the direction of a first array axis, and each optical channel is configured with its optical axis arranged transverse to the first array axis to input image light from a workpiece surface region located at an object distance along its optical axis in its field of view and transmit the image light to a plurality of pixels of the imaging detector array located in its imaged field of view, and wherein each optical channel includes a lens arrangement configured to provide an erect image and the workpiece surface point is simultaneously imaged in at least two overlapping imaged fields of view of at least two of the optical channels.
- the image data is analyzed (e.g., by a signal processing and control portion) to determine a respective image offset according to the surface point being located at an object distance that is different than an object reference distance, for which the surface point is imaged at different respective positions in each of the at least two overlapping imaged fields of view.
- the difference between at least two of the respective positions may define a respective image offset for the surface point (e.g., in an implementation utilizing a point spread function, etc.).
- a surface height measurement coordinate is provided (e.g., by a signal processing and control portion) for the surface point along the direction of the optical axes based at least in part on the determined respective image offset.
- the analyzing of the image data to determine the respective image offset may include at least one of Fourier analysis of spatial frequencies, auto-correlation operations, point spread functions, video tool operations, etc.
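The three steps of routine 700 (acquire image data, analyze it to determine an image offset, and provide a surface height based on that offset) can be sketched end to end; the offset-to-height scale factor and the toy acquire/analyze callables below are hypothetical stand-ins for the operations described above.

```python
# Minimal end-to-end sketch of routine 700. The scale factor and the toy
# acquire/analyze functions are assumptions for illustration only.

def routine_700(acquire, find_offset, scale_um_per_px=10.0):
    """acquire() -> image data; find_offset(data) -> offset in pixels."""
    data = acquire()                     # step 1: acquire image data
    io_px = find_offset(data)            # step 2: analyze for image offset
    return io_px * scale_um_per_px       # step 3: surface height coordinate

# Toy data: a feature imaged at indices 1 and 3, i.e., a 2-pixel offset.
z = routine_700(lambda: [0, 1, 0, 1],
                lambda d: d.index(1, 2) - d.index(1))
assert z == 20.0
```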
Abstract
Description
- This disclosure relates generally to precision metrology, and more particularly to surface profiling and imaging systems that may be utilized for determining surface height measurement coordinates for points on a surface of a workpiece that is being inspected.
- Various types of surface profiling systems are known that may be utilized for acquiring data regarding a surface (e.g., a surface of a workpiece that is being inspected). For example, various bore imaging systems are known that use a bore surface imaging arrangement for imaging the interior of a bore, for example in a cylinder bore of an engine. Exemplary bore inspection systems are disclosed in U.S. Pat. Nos. 4,849,626; 7,636,204; 8,334,971; 8,570,505; U.S. Patent Publication Nos. 2013/0112881; 2016/0178534; and U.S. patent application Ser. No. 15/186,231, filed Jun. 17, 2016, each of which is hereby incorporated herein by reference in its entirety. Such bore imaging systems may be configured to provide a 360-degree view (also referred to as a panoramic view and/or image) of the interior of a bore in order to inspect for form errors or surface defects. These systems may use signal processing to map image pixel signals or detector element signals to coordinates on the interior surface of the bore. In such systems, challenges may exist for determining highly accurate surface height measurement coordinates for workpiece surface points (e.g., due in part to the constrained spaces in which such systems may operate, etc.).
- A high-resolution metrology-grade surface profiling system which is able to operate in constrained spaces and determine highly accurate surface height measurement coordinates for workpiece surface points would be desirable.
- This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This summary is not intended to identify key features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
- A surface profiling system is provided including an imaging detector array and an optical imaging array. The imaging detector array includes at least one set of detector pixels arrayed generally along the direction of a first array axis. The optical imaging array includes at least one set of optical channels, wherein the optical channels in a set are arrayed generally along the direction of the first array axis, and each optical channel is configured with its optical axis arranged transverse to the first array axis to input image light from a workpiece surface region located at an object distance along its optical axis in its field of view (FOV) and transmit the image light to a plurality of pixels of the imaging detector array located in its imaged field of view (IFOV). The detector pixels of the imaging detector array are arranged at a back imaging distance from the optical channels. Each optical channel includes a lens arrangement (e.g., including a GRIN lens) configured to provide an erect image in its IFOV for a workpiece surface located in its FOV and within a measuring range along the direction of the optical axes of the surface profiling system. The optical channels in a set are configured in the optical imaging array to have overlapping FOVs and overlapping IFOVs.
- In various implementations, a workpiece surface point that is located in the measuring range may be simultaneously imaged in N overlapping IFOVs of N optical channels included in the set of optical channels, where N is an integer that is at least 2. In addition, a surface point that is located at a defined object reference distance from the N optical channels in the measuring range may be imaged at the same respective position along the imaging detector array in each of the N overlapping IFOVs. Furthermore, when a surface point is located at an object distance that is different than the object reference distance, then the surface point may be imaged at different respective positions at least along the direction of the first array axis in each of the N overlapping IFOVs, the difference between at least two of the respective positions defining a respective image offset for that surface point.
- In various implementations, the surface profiling system includes a signal processing and control portion configured to perform various operations. Such operations may include acquiring image data provided by the imaging detector array and analyzing the image data to determine the respective image offset. The operations may also include providing a surface height measurement coordinate for the workpiece surface point along the direction of the optical axes based at least in part on the determined respective image offset.
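The acquire/analyze/measure sequence above can be sketched as follows. This is a minimal, hypothetical illustration rather than the patented implementation: the image offset is found by brute-force cross-correlation of the intensity profiles seen by two overlapping channels, and `CAL_UM_PER_PIXEL` is an assumed calibration constant relating image offset to surface height.

```python
# Hypothetical sketch of the signal-processing operations: acquire image
# data from two overlapping channels, determine the image offset of a
# feature, and convert the offset to a surface height coordinate.
# CAL_UM_PER_PIXEL is an assumed calibration constant, not from the patent.

CAL_UM_PER_PIXEL = 10.0  # assumed: height change (um) per pixel of image offset

def image_offset(profile_a, profile_b):
    """Return the shift (in pixels) of profile_b relative to profile_a
    that maximizes their overlap (a brute-force cross-correlation)."""
    n = len(profile_a)
    best_shift, best_score = 0, float("-inf")
    for shift in range(-n + 1, n):
        score = sum(
            profile_a[i] * profile_b[i + shift]
            for i in range(n) if 0 <= i + shift < n
        )
        if score > best_score:
            best_shift, best_score = shift, score
    return best_shift

def height_from_offset(offset_px, reference_z_um=0.0):
    """Map a determined image offset to a surface height coordinate."""
    return reference_z_um + CAL_UM_PER_PIXEL * offset_px

# A feature imaged by two channels: identical profile, displaced 3 pixels.
a = [0, 0, 1, 5, 1, 0, 0, 0, 0, 0]
b = [0, 0, 0, 0, 0, 1, 5, 1, 0, 0]
off = image_offset(a, b)     # 3 pixels
z = height_from_offset(off)  # 30.0 um with the assumed calibration
```

In practice the offset would be determined with sub-pixel interpolation; the integer-shift search here only illustrates the structure of the computation.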
-
FIG. 1 is a diagram of a first exemplary implementation of a surface profiling system; -
FIG. 2 is a diagram of a second exemplary implementation of a surface profiling system; -
FIG. 3 is a diagram showing various operational aspects of one exemplary implementation of an imaging detector array and an optical imaging array as included in a surface profiling system; -
FIG. 4 is a diagram illustrating operation of a gradient-index (GRIN) lens as included in a lens arrangement of an optical channel of an optical imaging array; -
FIGS. 5A and 5B are diagrams of images illustrating discrete features of a workpiece surface as imaged at different respective object distances by a surface profiling system; -
FIGS. 6A and 6B are diagrams of one exemplary implementation of an imaging and detector configuration of a surface profiling system as including an imaging detector array and an optical imaging array; and -
FIG. 7 is a flow diagram illustrating one exemplary implementation of a routine for acquiring and analyzing image data provided by a surface profiling system to determine a surface height measurement coordinate for a workpiece surface point. -
FIG. 1 is a diagram of a first exemplary implementation of a surface profiling system 100. As shown in FIG. 1, the surface profiling system 100 includes a signal processing and control portion 101, an imaging and detector configuration 105, a carrier 170 and a support portion 180. In various implementations, the carrier 170 may include an arm portion 172 that is attached to the imaging and detector configuration 105, and a central portion 174 that may be rotated about a central axis, as will be described in more detail below. In various implementations, the imaging and detector configuration 105 includes an imaging detector array and an optical imaging array, as will be described in more detail below with respect to FIGS. 3, 6A and 6B. The signal processing and control portion 101 includes a motion control portion 102 and processing circuits 104. - The
carrier 170 may be mounted to or include a motion control system or the like (e.g., as controlled by the motion control portion 102) for rotating so as to scan the imaging and detector configuration 105 at the end of the arm portion 172 along a scanning direction SD (e.g., corresponding to an X axis direction as will be described in more detail below with respect to FIG. 3) to acquire image data for a desired workpiece surface region WSR (e.g., of a bore surface 160). In various implementations, the carrier 170 may also be moved axially by a motion control system for positioning the imaging and detector configuration 105 for scanning higher or lower workpiece surface regions of the bore surface 160. In various alternative implementations, the imaging and detector configuration 105 may be stationary for certain operations and/or adjustments, and the bore surface may be moved (e.g., on a stage for axial and/or rotary movement, etc.) in a manner measured and controlled by the surface profiling system 100, according to known methods. - In various implementations, the support portion 180 (e.g., including a central portion attached to the carrier 170) may be utilized to support and hold the
carrier 170 in a steady centered relationship relative to the bore surface 160 while the carrier 170 is rotated. In various implementations, the arm portion 172 and central portion 174 of the carrier 170 may consist of hollow tubes that carry wires or other electrical connections (e.g., from the imaging and detector configuration 105 to the signal processing and control portion 101, etc.). In various implementations, the carrier 170 and/or support portion 180 may be lowered into the cylindrical bore and held by a structure from above (e.g., in a probe-type configuration). In various implementations, various sensors and/or trial scanning techniques (e.g., utilizing image data from the imaging and detector configuration 105 while it scans along the bore surface 160) may be utilized to determine if the support portion 180 and/or carrier 170 are properly centered within the cylindrical bore or if adjustments to the positioning are needed. In various implementations, FIG. 1 may be arranged according to cylindrical coordinates ZA, R and ϕ which are aligned with the cylindrical bore, wherein ZA corresponds to an axial direction and ϕ (or “P”) corresponds to the circumferential direction on the bore surface 160. - As will be described in more detail below, generally speaking a surface profiling system according to principles disclosed herein may comprise a set of detector pixels arrayed generally along the direction of a first array axis and an optical imaging array including at least one set of optical channels, wherein the optical channels in a set are arrayed generally along the direction of the first array axis and each optical channel is configured with its optical axis arranged transverse to the first array axis to input image light from a workpiece surface region located at an object distance along its optical axis in its field of view (FOV) and transmit the image light to a plurality of pixels of the imaging detector array located in its imaged field of view (IFOV).
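The cylindrical coordinates ZA, R and ϕ described above can be related to scan data with a short sketch; the bore radius and the pixel pitch along the first array axis below are assumed example values, not values from the disclosure.

```python
import math

# Hypothetical mapping from a rotary scan position and a pixel index along
# the first array axis FA to cylindrical bore-surface coordinates (ZA, phi).
# BORE_RADIUS_MM and PIXEL_PITCH_MM are assumed example values.

BORE_RADIUS_MM = 40.0    # assumed bore radius R
PIXEL_PITCH_MM = 0.05    # assumed detector pixel pitch along FA

def surface_coordinates(scan_angle_deg, pixel_index):
    """Return (z_axial_mm, phi_rad, arc_mm) for one detector pixel sample."""
    phi = math.radians(scan_angle_deg)
    z_axial = pixel_index * PIXEL_PITCH_MM  # position along the axial direction ZA
    arc = BORE_RADIUS_MM * phi              # unwrapped circumferential arc length
    return z_axial, phi, arc

z, phi, arc = surface_coordinates(90.0, 100)  # quarter turn, 100th pixel
```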
In the specific example shown in
FIG. 1, the first array axis FA is straight and the optical imaging array of the imaging and detector configuration 105 may include at least one set of optical channels having approximately parallel optical axes, and the at least one set of optical channels may be arrayed generally along the direction of a first array axis FA which may generally be in the same direction as the axial direction ZA of the cylindrical bore. As noted above, the surface profiling system 100 may be configured to rotate the carrier 170 to cause the imaging and detector configuration 105 to scan along a workpiece surface region WSR of the bore surface 160 in the scan direction SD (i.e., which is transverse to the first array axis FA). - In various implementations, the optical channels of the imaging and
detector configuration 105 are each configured to input image light IL from a workpiece surface region WSR of the bore surface 160 and transmit the image light IL to a plurality of pixels of the imaging detector array of the imaging and detector configuration 105. As will be described in more detail below with respect to FIG. 3, the imaging detector array includes at least one set of detector pixels arrayed generally along the direction of the first array axis FA. In various implementations, the pixels are formed of photodetector elements that provide image data (e.g., intensity values), which may be output individually, or in parallel, or multiplexed, or serialized, or otherwise processed before being output on designated connections (e.g., provided through the carrier 170, etc.). That is, in some implementations, the signal processing and control portion 101 may include processing circuits 104 that are provided as part of the imaging detector array and/or on the carrier 170 and which provide signals to other processing circuits 104 and/or other portions of the signal processing and control portion 101 via designated connections. In other implementations, the processing circuits 104 may be provided separately from the imaging detector array and/or carrier, and may receive pixel signals from the imaging detector array via the designated connections. - In various implementations, an illumination portion (not shown) is connected to an illumination power and control element, which may be provided as part of or in connection with the signal processing and control portion 101 (e.g., via an illumination/control line). In operation, the illumination portion is arranged to provide source light SL directed toward a workpiece surface region WSR of the
bore surface 160, wherein the source light SL may be reflected from the workpiece surface region WSR as image light IL that is received by the optical channels of the imaging and detector configuration 105. More specifically, as will be described in more detail below with respect to FIGS. 3, 6A and 6B, the image light IL is received by a lens arrangement of each optical channel that is configured to provide an erect image of at least a portion of the workpiece surface region WSR in its imaged field of view IFOV. In various implementations, the illumination portion may be provided as part of the imaging and detector configuration 105 (e.g., as will be described in more detail below with respect to FIGS. 6A and 6B), or on the carrier 170, or as an independent element (e.g., a ring of illumination elements as will be described in more detail below with respect to FIG. 2), or in any other convenient form. - As will be described in more detail below with respect to
FIG. 3, the optical channels of the imaging and detector configuration 105 may have overlapping fields of view FOV and overlapping imaged fields of view IFOV. A workpiece surface point in the workpiece surface region WSR may be simultaneously imaged in two or more overlapping IFOVs of two or more optical channels. A surface point that is not at an object reference distance may be imaged at different respective positions in the IFOVs, and the difference between the respective positions may define a respective image offset for that surface point. Corresponding image data may be analyzed by the signal processing and control portion 101 (e.g., by processing circuits 104) to determine the respective image offset. A surface height measurement coordinate for the surface point (e.g., as related to a distance between the surface point and the imaging and detector configuration 105) may be determined based on the determined image offset. - In various implementations, a workpiece surface point that is imaged in this manner may be one of multiple surface points that are imaged along the direction of the first array axis FA, and the signal processing and
control portion 101 may further be configured to perform operations comprising determining respective coordinates for each of the multiple surface points. For example, respective surface height measurement coordinates for each of the multiple surface points may be determined based at least in part on determined respective image offsets for each of the multiple surface points. In various implementations, the signal processing and control portion 101 may further be configured to perform operations including constructing a synthetic image of the workpiece surface region WSR of the bore surface 160 wherein respective image offsets corresponding to a plurality of surface points are compensated and/or removed, and a majority of the synthetic image appears substantially focused. In various implementations, the signal processing and control portion 101 may be further configured to perform operations comprising determining contrast-based Z-height information for at least a portion of the workpiece surface region WSR, as will be described in more detail below. -
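The synthetic-image construction described above can be sketched in one dimension: each channel's intensity profile is shifted back by its determined image offset before the profiles are combined, so that the composite appears registered and substantially focused. This is a hypothetical illustration; the shift-and-average scheme is assumed for simplicity.

```python
# Hypothetical sketch of constructing a synthetic (offset-compensated)
# image: each channel's intensity profile is shifted back by its determined
# image offset before the profiles are averaged, so the combined result
# appears registered even though the raw profiles were displaced.

def compensate(profile, offset_px):
    """Shift a 1-D intensity profile left by offset_px (zero-fill edges)."""
    n = len(profile)
    return [profile[i + offset_px] if 0 <= i + offset_px < n else 0.0
            for i in range(n)]

def synthetic_profile(profiles_and_offsets):
    """Average several channel profiles after compensating their offsets."""
    compensated = [compensate(p, off) for p, off in profiles_and_offsets]
    n = len(compensated[0])
    return [sum(c[i] for c in compensated) / len(compensated) for i in range(n)]

# Two channels see the same feature, displaced by 0 and 2 pixels.
ch1 = [0, 0, 4, 8, 4, 0, 0, 0]
ch2 = [0, 0, 0, 0, 4, 8, 4, 0]
result = synthetic_profile([(ch1, 0), (ch2, 2)])
# After compensation both peaks align at index 3.
```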
FIG. 2 is a diagram of a second exemplary implementation of a surface profiling system 200. It will be appreciated that the surface profiling system 200 may have certain characteristics that are similar to those of the surface profiling system 100 of FIG. 1, and will be understood to operate similarly except as otherwise described below. As shown in FIG. 2, the surface profiling system 200 includes a signal processing and control portion 201, an imaging and detector configuration 205, a curved carrier 270, a support portion 280 and an illumination portion 285. The imaging and detector configuration 205 is carried on the curved carrier 270. In various implementations, the imaging and detector configuration 205 includes an imaging detector array and an optical imaging array, portions or all of which may correspondingly be carried on the curved carrier 270 which may hold them in a stable form. - In various implementations, certain portions or all of the
surface profiling system 200 may include and/or be carried on a schematically represented support portion 280 which holds certain portions in proper relationships and which may be mounted to or include a motion control system or the like (e.g., as controlled by the motion control portion 202) for scanning the imaging and detector configuration 205 along an axial scanning direction SD to image a desired axial workpiece surface region of a bore surface 260. In various implementations, either the imaging and detector configuration 205 or the bore surface 260 may be stationary, and the other may be moved in a manner measured and controlled by the surface profiling system 200, according to known methods. In various implementations, FIG. 2 is arranged according to cylindrical coordinates ZA, R and ϕ which are aligned with the cylindrical bore, wherein ZA corresponds to an axial direction and ϕ (or “P”) corresponds to the circumferential direction on the bore surface 260, and the axial scanning direction SD may generally correspond to the axial direction ZA. As will be described in more detail below with respect to FIG. 3, the axial scanning direction SD may also correspond to an X or Y axis direction with respect to the relative coordinates of the imaging and detector configuration 205. - In various implementations the
illumination portion 285 may include a strobe light source, controllable to determine an exposure duration and timing (e.g., a timing triggered at a particular imaging position). The illumination portion 285 is connected to an illumination power and control element, which may be provided as part of or in connection with the signal processing and control portion 201, via an illumination/control line 286. In operation, the illumination portion 285 is arranged to provide source light SL to a workpiece surface region WSR on the bore surface 260. In alternative implementations, an illumination portion may be omitted, or provided as part of the imaging and detector configuration 205, or otherwise provided on the carrier 270, or in any other convenient form. In any case, the source light SL is reflected from the workpiece surface region WSR as image light IL that is received by optical channels 236 of the optical imaging array of the imaging and detector configuration 205. - As previously indicated, generally speaking a surface profiling system according to principles disclosed herein may comprise a set of detector pixels arrayed generally along the direction of a first array axis and an optical imaging array including at least one set of optical channels, wherein the optical channels in a set are arrayed generally along the direction of the first array axis and each optical channel is configured with its optical axis arranged transverse to the first array axis to input image light from a workpiece surface region located at an object distance along its optical axis in its field of view (FOV) and transmit the image light to a plurality of pixels of the imaging detector array located in its imaged field of view (IFOV). In the specific example shown in
FIG. 2, the first array axis FA follows a circular path, and the optical imaging array of the imaging and detector configuration 205 may include at least one set of optical channels configured with their optical axes aligned along respective radial directions transverse to the circular path. The at least one set of optical channels may be arrayed generally along the direction of the circular first array axis FA which may generally correspond to the direction of the circumference of the cylindrical bore. As noted above, the surface profiling system 200 may be configured to move the carrier 270 to cause the imaging and detector configuration 205 to scan along a workpiece surface region WSR of the bore surface 260 in the scan direction SD (along the direction of the cylinder axis). - In various implementations, the optical imaging array of the imaging and
detector configuration 205 includes optical channels 236-1 to 236-n having optical axes radially aligned transverse to the circular first array axis FA, wherein the optical channels 236-1 to 236-n are arrayed generally along the circular direction of the first array axis FA. FIG. 2 illustrates the ends of the optical channels 236 as facing toward the bore surface 260. As will be described in more detail below with respect to FIG. 3, each optical channel 236 is configured to input image light IL from a workpiece surface region WSR of the bore surface 260 located at an object distance along its optical axis in its field of view FOV and transmit the image light IL to a plurality of pixels of the imaging detector array of the imaging and detector configuration 205 located in its imaged field of view IFOV. - As will further be described in more detail below with respect to
FIG. 3, the imaging detector array of the imaging and detector configuration 205 includes at least one set of detector pixels (not shown) arrayed generally along the circular direction of a first array axis FA. In various implementations, the pixels are formed of photodetector elements that provide image data (e.g., intensity values), which may be output individually, or in parallel, or multiplexed, or serialized, or otherwise processed before being output on connections 233. That is, in some implementations, the signal processing and control portion 201 may include processing circuits 204 that are provided as part of the imaging detector array 310 (FIG. 3) and/or on the carrier 270 and which provide signals to other processing circuits 204 and/or other portions of the signal processing and control portion 201 via the connections 233. In other implementations, the processing circuits 204 may be provided separately from the carrier 270 and/or the imaging detector array of the imaging and detector configuration 205, and may receive pixel signals from the imaging detector array via the connections 233. - In the implementation shown in
FIG. 2, the imaging and detector configuration 205 is curved in an approximately circular shape on the carrier 270. In some implementations, the carrier 270 may be a portion of the support portion 280. In various implementations, portions of the imaging and detector configuration 205 may be formed on a substrate that may be a flex print, an elastomer or a thinned semiconductor substrate, or another curvable substrate that provides the required properties (e.g., for providing a curved optical imaging array, etc.). In some implementations, the optical imaging array of the imaging and detector configuration 205 may be formed of a flexible IC sensor such as a FleX™ Silicon-on-Polymer™ CMOS sensor available from American Semiconductor of Boise, Id., or a curved high-resolution CCD sensor provided by Andanta of Olching, Germany. Various other usable alternatives for fabricating a curved optical imaging array configuration are disclosed in U.S. Pat. Nos. 6,791,072; 6,849,843; 7,786,421; 8,372,726; 8,742,325; and U.S. Patent Publications 2010/0264502 and 2012/0261551, all of which are hereby incorporated herein by reference in their entirety. - In some implementations, the optical imaging array of the imaging and
detector configuration 205 may comprise multiple imaging arrays which are each nominally flat over a limited span, but are arranged along a curved form of the optical imaging array. For example, a plurality of nominally flat imaging arrays may be provided on a flexible substrate that extends along the curved form of the optical imaging array. One design consideration in such an implementation is that each of the imaging arrays should not receive an unacceptably blurred image of its corresponding portion of a workpiece surface region WSR. Thus, any corresponding portions of the optical imaging array and corresponding optical channels (e.g., each including a lens arrangement) should be designed to have complementary curvatures to the extent required to maintain each pixel within a desirable image focus depth or range. -
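The design consideration above (a nominally flat segment on a curved form must stay within the image focus depth) reduces to a sag calculation: a chord of length L on a circle of radius R departs from the circle by s = R − √(R² − (L/2)²) ≈ L²/(8R). The numeric values in this sketch are assumed examples.

```python
import math

# Hypothetical check of the design consideration above: a nominally flat
# imaging segment of chord length L placed on a circular form of radius R
# departs from the circle by the sag s = R - sqrt(R^2 - (L/2)^2), which is
# approximately L^2 / (8R). The segment is acceptable if s stays within the
# usable image focus depth. All numeric values are assumed examples.

def segment_sag_mm(chord_mm, radius_mm):
    return radius_mm - math.sqrt(radius_mm**2 - (chord_mm / 2.0) ** 2)

def segment_ok(chord_mm, radius_mm, focus_depth_mm):
    return segment_sag_mm(chord_mm, radius_mm) <= focus_depth_mm

# A 2 mm flat segment on a 40 mm radius carrier, with a 50 um focus depth:
sag = segment_sag_mm(2.0, 40.0)   # ~0.0125 mm (12.5 um) of sag
ok = segment_ok(2.0, 40.0, 0.050) # within the assumed focus depth
```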
FIG. 3 is a diagram of one exemplary implementation of a surface profiling system 300 including an imaging and detector configuration 305 having an imaging detector array 310 and an optical imaging array 330. It will be appreciated that the surface profiling system 300 may have similar characteristics and will be understood to operate similarly to the surface profiling systems described above, except as otherwise described below. In various implementations, the imaging detector array 310 includes at least one set 315 of detector pixels 316 arrayed generally along a direction of a first array axis FA, which may be understood to represent a straight first array axis (as in FIG. 1), or a segment of a curved or circular first array axis (as in FIG. 2). In the implementation of FIG. 3, example detector pixels 316-1 to 316-35 are illustrated as part of the imaging detector array 310. In various implementations, the optical imaging array 330 includes at least one set 335 of optical channels 336, wherein the optical channels in a set are arrayed generally along the direction of the first array axis FA, and each optical channel is configured with its optical axis 338 arranged transverse to the first array axis to input image light from a workpiece surface region located at an object distance along its optical axis in its field of view (FOV) and transmit the image light to a plurality of pixels of the imaging detector array located in its imaged field of view (IFOV). In some embodiments, the optical axes may be approximately parallel. In the implementation of FIG. 3, example optical channels 336-1 to 336-5 having optical axes 338-1 to 338-5 are illustrated as part of the optical imaging array 330. It will be appreciated that the numbers of detector pixels 316 and optical channels 336 in the configuration of FIG. 3 are for purposes of illustration only, and that other implementations may have fewer or more detector pixels and/or optical channels, depending on the implementation. - Each
optical channel 336 is configured to input image light IL from a workpiece surface region WSR (e.g., of a bore surface) located at an object distance OD along a direction of an optical channel's optical axis 338 in the optical channel's field of view FOV, and transmit the image light IL to a plurality of pixels 316 of the imaging detector array 310 located in the optical channel's imaged field of view IFOV. In various implementations, the surface profiling system 300 further includes one or more light sources (not shown) that provide source light directed toward the workpiece surface region WSR, wherein the source light is reflected from the workpiece surface region WSR as the image light IL that is received by the lens arrangement of each optical channel 336 that is configured to provide an erect image of at least a portion of the workpiece surface region WSR in its imaged field of view IFOV (e.g., the optical channel 336-1 receives image light IL-1, the optical channel 336-2 receives image light IL-2, etc.). In various implementations, the imaging detector array 310 is configured to provide at least M pixels 316 that are located in an optical channel's imaged field of view IFOV, where M is an integer that is at least a minimum amount (e.g., 10, 25, 50, 100, etc.). It will be appreciated that the larger M is, the better the resolution with which the respective image offset IO corresponding to an imaged surface point can be determined. Generally speaking, the better the resolution of the determination of the image offset IO, the better the resolution of the corresponding surface point height measurement. - In
FIG. 3, for simplicity a single field of view FOV and imaged field of view IFOV are illustrated for only a single optical channel (i.e., optical channel 336-5 is illustrated with the FOV-5 and IFOV-5), although it will be appreciated that each of the optical channels has a similar corresponding field of view FOV and imaged field of view IFOV, which overlap with one another, as will be described in more detail below. - The detector pixels 316 of the
imaging detector array 310 are arranged at a back imaging distance BD from the optical channels 336. In various implementations, the back imaging distance BD may be made to be approximately equal to a back focal length of the optical channels 336, and/or an object reference distance RD may be made to be approximately equal to a front focal length of the optical channels 336, such as may provide certain advantages depending on the implementation. In some embodiments, the front and back focal lengths and/or the back imaging distance BD and the object reference distance RD may be equal. However, it will be appreciated that other combinations of back imaging distance BD and object reference distance RD may operate according to the principles outlined herein, and that these examples are illustrative only, and not limiting. - Each
optical channel 336 includes a lens arrangement (e.g., as will be described in more detail below with respect to FIG. 4) configured to provide an erect image in its imaged field of view IFOV for a workpiece surface WS located in its FOV and within a measuring range along the direction of the optical axes of the surface profiling system 300. - The
optical channels 336 are configured in the optical imaging array 330 to have overlapping fields of view FOV and overlapping imaged fields of view IFOV. In various implementations, the optical channels 336 may be adjacent to one another along the direction of the first array axis FA. In various implementations, it may be advantageous if the optical channels 336 have a nominal channel dimension along the direction of the first array axis of at most 500 micrometers (although this dimension is exemplary only, and not limiting). - In various desirable implementations, a measuring range of the
surface profiling system 300 may be at least 50 micrometers along the direction of the optical axis 338 (e.g., along a Z axis), and the optical imaging array may have a dimension of at least 5 mm along the first array axis FA (e.g., along a Y axis). Of course, the measuring range may be any operable range less than or more than 50 micrometers (e.g., 100 micrometers or more, in some implementations) and the optical imaging array and imaging detector array may be much longer than 5 mm along the first array axis FA (e.g., 1 meter or more, in some implementations), if desired for a particular application. In bore inspection operations using a configuration such as that shown in FIG. 1, for example, a dimension along the first array axis FA that is at least as long as the bore height is advantageous for throughput, such that a single revolution of the scanning mechanism allows imaging of an entire bore surface. -
- In various implementations, the measuring range may include the object reference distance RD of the N
optical channels 336. In various implementations, the measuring range (which may be defined by operating specifications of the surface profiling system, and/or by inherent operating limitations of the optical configuration) may be asymmetrical about the object reference distance RD, if desired. - In various implementations, a workpiece surface point SP (e.g., located at coordinates X1, Y1, Z1) that is located in the measuring range of the
surface profiling system 300 may be simultaneously imaged in at least N overlapping imaged fields of view IFOVs of N optical channels 336, where N is an integer that is at least a minimum amount (e.g., 2, 3, etc.). In the example of FIG. 3, the workpiece surface point SP is illustrated as being imaged in the overlapping imaged fields of view IFOVs of at least the optical channels 336-1, 336-2, and potentially 336-3. In various implementations, the surface profiling system is configured according to known optical principles such that a surface point that is located at a defined object reference distance RD from the N optical channels 336 in the measuring range is imaged at the same respective position (e.g., the same pixel position) along the imaging detector array 310 in each of the N overlapping imaged fields of view IFOVs. For example, with respect to the configuration of FIG. 3, if the workpiece surface point SP were at the object reference distance RD relative to the ends of the optical channels 336, the workpiece surface point SP would be imaged by each of the N optical channels 336 at the same position (e.g., the same pixel position) along the detector array 310. In contrast, as illustrated in FIG. 3, when the workpiece surface point SP is located at an object distance OD that is different than the object reference distance RD, then the surface point SP is imaged at different respective positions at least along the direction of the first array axis FA (e.g., corresponding to the Y axis) in each of the N overlapping imaged fields of view IFOVs, wherein the difference between at least two respective imaged positions defines a respective image offset IO for the surface point SP. - In the example of
FIG. 3, the surface point SP is thus shown as being imaged at the detector array 310 at the positions PN-1, PN-2 (and potentially PN-3) by the optical channels 336-1, 336-2 (and potentially 336-3), respectively, and a corresponding image offset IO is correspondingly determined (e.g., indicated by a difference between the positions PN-1 and PN-2). In various implementations, corresponding image data provided by the imaging detector array 310 may be acquired, and the image data may be analyzed to determine the respective image offset IO. In accordance with principles described herein, a surface height measurement coordinate (e.g., Z1) for the workpiece surface point SP along the direction of the optical axis (e.g., along the direction of the Z axis) may be determined and provided based at least in part on the determined respective image offset IO. Further examples of image offsets will be illustrated and described in more detail below with respect to FIGS. 5A and 5B. In various implementations, the surface profiling system 300 may be configured to have the imaging and detector configuration 305 scan along the workpiece surface region WSR in a direction that is transverse to the first array axis FA (e.g., in an X axis direction which may correspond to a ϕ or “P” circumferential direction on a bore surface as described above with respect to FIG. 1). -
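The offset-to-height relationship described above can be illustrated with a simple geometric model. For 1× erect imaging at the object reference distance RD, the magnification m is 1 and a surface point images at the same detector position through both channels; if m varies with object distance, two channels whose axes are separated by a baseline b image the same point at positions differing by IO = b·(m − 1). The linear model m(OD) = 1 − K·(OD − RD) and the constant K below are assumptions for illustration; a real system would use a calibrated relationship.

```python
# Hypothetical geometric model of the image offset IO. At the object
# reference distance RD the channel magnification m is 1 and a surface
# point images at the same detector position through both channels. If m
# varies with object distance OD, two channels with axes separated by a
# baseline b image the same point at positions differing by IO = b*(m - 1).
# The linear model m(od) = 1 - K*(od - RD) and the constant K are assumed.

RD_UM = 5000.0   # assumed object reference distance (um)
K = 1.0e-4       # assumed magnification change per um of defocus

def magnification(od_um):
    return 1.0 - K * (od_um - RD_UM)

def image_offset_um(od_um, baseline_um):
    return baseline_um * (magnification(od_um) - 1.0)

def object_distance_um(offset_um, baseline_um):
    """Invert the model: recover OD from a measured image offset."""
    return RD_UM - offset_um / (baseline_um * K)

b = 500.0                        # assumed channel baseline (um)
io = image_offset_um(5100.0, b)  # point 100 um beyond RD -> -5.0 um offset
od = object_distance_um(io, b)   # recovers 5100.0 um
```

Note how a larger baseline b (i.e., comparing positions across more widely separated channels) scales the offset for the same defocus, which is consistent with the larger effective triangulation angles discussed below.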
FIG. 4 is a diagram illustrating operation of a gradient-index (GRIN) lens 400 such as may be included in the lens arrangement of an optical channel of an optical imaging array (e.g., in each of the optical channels 336 of FIG. 3). The travel of light rays through the GRIN lens 400 is illustrated and may be referenced according to a ¼ pitch, ½ pitch and 1 pitch. As illustrated, at a ½ pitch length of the GRIN lens 400, an inverted image is produced, while at a 1 pitch length of the GRIN lens 400, an erect image is produced. In various implementations, a GRIN lens 400 may be cut to a desired length to produce a desired image for a specific application (e.g., to produce an erect image). In various implementations, a GRIN lens that is included as the lens arrangement of an optical channel (e.g., an optical channel 336 of FIG. 3) may have a pitch length between ¾ and 1 along the optical axis (e.g., so as to produce an erect real image at the detector array 310). In some implementations, the choice of pitch length in this range allows setting a back imaging distance BD to achieve a 1× imaging system necessary to minimize optical stitching effects resulting from the array of optical channels 336. In some implementations, the nominal object reference distance RD may be equal to BD. In some implementations, RD and/or BD may have example design values in the range 0.5 mm to 15 mm to allow noncontact measurement in small bore sizes. It will be appreciated that in various alternative implementations, the lens arrangement of an optical channel 336 may be made to include additional or alternative lenses (e.g., microlenses, etc.) arranged along the optical axis, which may be configured to produce an erect image at the detector array 310. -
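The pitch fractions above correspond to physical rod lengths. For a GRIN rod with radial index profile n(r) = n0·(1 − (g²/2)·r²), rays follow a sinusoidal path with period 2π/g, so one full pitch corresponds to a length of 2π/g; the gradient constant g in this sketch is an assumed example value.

```python
import math

# Length of a GRIN rod for a chosen pitch fraction. For a radial index
# profile n(r) = n0 * (1 - (g**2 / 2) * r**2), rays follow a sinusoidal
# path with period 2*pi/g, so one full pitch corresponds to that length.
# The gradient constant g below is an assumed example value.

G_PER_MM = 0.6  # assumed gradient constant g (1/mm)

def grin_length_mm(pitch_fraction, g_per_mm=G_PER_MM):
    """Physical rod length for a given pitch fraction (e.g., 0.25, 0.5, 1)."""
    return pitch_fraction * 2.0 * math.pi / g_per_mm

quarter = grin_length_mm(0.25)  # 1/4 pitch length
erect = grin_length_mm(1.0)     # one full pitch: erect image
```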
FIGS. 5A and 5B are diagrams of images 500A and 500B, respectively. - With respect to the images 500A and 500B, when a workpiece surface point of a discrete feature is simultaneously imaged in N overlapping imaged fields of view of N optical channels (e.g., the optical channels 336 of FIG. 3), each workpiece surface point of the discrete feature may be imaged at different respective positions along the imaging detector array in each of the N overlapping IFOVs. In the particular examples of FIGS. 5A and 5B, the images 500A and 500B are two-dimensional images such as may be formed by scanning with an imaging and detector configuration (e.g., the imaging and detector configurations of FIGS. 3, 6A and 6B), as will be described in more detail below. - In various implementations, it may be desirable for the system to utilize a measuring range wherein a difference between the object distance and the object reference distance of the N optical channels (e.g., an out of focus amount) produces relatively clear images of the surface points/discrete features at the different respective positions (e.g., so that a respective image offset may be more accurately determined, etc.).
- It will be appreciated that if, in the examples of
FIGS. 5A and 5B, the out of focus amount were increased further (e.g., above the amount of approximately 500 um corresponding to FIG. 5B), the corresponding imaged feature separation or image offset IO between the imaged positions may continue to increase, although the blurriness of the imaged discrete features may also increase beyond a desirable level. In various implementations, contrast-based or other evaluation techniques may be utilized to evaluate image data and/or images (e.g., such as the images 500A and 500B). For example, the image 500B may represent a preferred out of focus level of 500 um in the measuring range at which the system may operate. It is noted that in the example of the image 500B, the difference between the respective imaged positions is relatively clear, such that the respective image offset may be determined with relatively high accuracy. For example, in various implementations, a surface profiling system formed in accordance with the principles disclosed herein may provide sensitivity for the determination of the surface height measurement coordinate in the range of 1 um as compared to 10 um, or an approximately 10× accuracy improvement (e.g., due to the larger triangulation angles from the multiple spatially arranged optical channels 336) as compared to prior techniques (e.g., points from focus operations based on blurriness and utilizing a single large lens). It will further be appreciated that such characteristics allow a surface profiling system to be formed in a more compact configuration that may operate in constrained spaces (e.g., for scanning an interior bore surface) with a higher degree of accuracy for the determination of surface height measurement coordinates. - As shown in the
image 500A, a discrete feature is imaged at different respective “image offset” positions PN-1a, PN-2a corresponding to the image offset IOa between two adjacent optical channels. The image offset amount IOa is due to a “relatively lesser” difference between an object distance (or surface height) of the indicated edge feature and the object reference distance of the surface profiling system. In comparison, as shown in the image 500B, the same discrete feature is imaged at different respective “image offset” positions PN-1b, PN-2b corresponding to the image offset IOb, between two adjacent optical channels. The image offset amount IOb is due to a “relatively greater” difference between the object distance (or surface height) of the indicated edge feature and the object reference distance of the surface profiling system. - In any case, the various image offset amounts IO (e.g., in micrometers, or in pixel units) corresponding to various object distances in the measuring range can be calibrated for the surface profiling system, such that any image offset amount IO is quantitatively indicative of the difference between the object distance (or surface height) of a feature and the object reference distance of the surface profiling system. Thus, in various implementations, a surface height measurement coordinate (e.g., Z1) for the corresponding surface height of an imaged workpiece surface point or feature along the direction of the optical axis (e.g., along the direction of the Z axis) may be determined and provided based (at least in part) on its determined respective image offset IO.
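The calibration idea just described can be sketched as a lookup: a monotonic table of measured image offsets versus known height differences within the measuring range, interpolated at run time. The table values below are invented placeholders for illustration, not data from this disclosure:

```python
import numpy as np

# Hypothetical calibration pairs: image offsets IO (pixels) measured for
# known differences between the object distance and the object reference
# distance RD (micrometers). In a real system these would come from
# imaging a calibration target at known heights.
cal_dz_um = np.array([0.0, 100.0, 200.0, 300.0, 400.0, 500.0])
cal_io_px = np.array([0.0, 2.1, 4.0, 6.2, 8.1, 10.0])

def height_from_offset(io_px):
    """Map a measured image offset to a surface height coordinate (um)
    relative to the reference distance, by linear interpolation of the
    calibration table (which is monotonic over the measuring range)."""
    return float(np.interp(io_px, cal_io_px, cal_dz_um))

dz = height_from_offset(5.0)  # an offset of 5 px falls between the
                              # 4.0 px and 6.2 px calibration points
```

Because the offset grows monotonically with the out-of-focus amount over the measuring range, the interpolation is single-valued and any measured IO maps to one height coordinate.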
- It should be appreciated that the two images 500A and 500B are illustrative examples only. - In various implementations, the analyzing of the image data (e.g., by the signal processing and
control portion 101 of FIG. 1) to determine the respective image offset IO may include at least one of Fourier analysis of spatial frequencies, auto-correlation operations, etc. In various implementations, an analysis technique may be utilized that defines a point spread function as a function of surface height (e.g., as a function of Z). For example, for a one-dimensional function, a point spread function may be based on a specified number of imaged positions (e.g., two imaged positions) increasing in distance and size with Z. By deconvolving the image with known or predetermined point spread functions at different Z values (e.g., to make an image stack), and then finding the best-contrast image portions from the image stack, an extended depth of field image and a Z profile map may be constructed utilizing known techniques. The resulting process allows increased speed through the use of one-dimensional point spread functions, with a one-dimensional imaging and detector configuration in a line scan type system (e.g., utilizing the imaging and detector configuration of FIG. 3 for performing line scan type operations such as those described above). - In various implementations, the analyzing of the image data to determine the respective image offset may also or alternatively include utilizing a video tool to determine a distance between the respective positions at which the surface point/discrete feature is imaged. For example, with respect to the
image 500B, one or more video tools may be utilized to determine the positions PN-1b and PN-2b of the imaged discrete feature and/or the corresponding image offset distance between the positions PN-1b and PN-2b. More specifically, some systems may include GUI features and predefined image analysis “video tools” such that operation and programming can be performed by “non-expert” operators. For example, U.S. Pat. No. 6,542,180, which is hereby incorporated herein by reference in its entirety, teaches a system that uses automated video inspection including the use of various video tools. Exemplary video tools include edge location tools, sometimes referred to as “box tools,” which may be used to locate an edge that defines a discrete feature of a workpiece (e.g., utilized to determine the positions PN-1b and PN-2b of the imaged discrete feature). For example, commonly assigned U.S. Pat. No. 7,627,162, which is hereby incorporated herein by reference in its entirety, teaches various applications of box tools.
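An edge location operation of the box-tool kind referenced above can be sketched as a gradient-peak search along a one-dimensional intensity profile, with the image offset taken as the difference between the edge positions found in two adjacent channels' images. This is an illustrative reimplementation under simplified assumptions, not the code of the patented tools:

```python
import numpy as np

def edge_position(profile):
    """Locate an edge along a 1-D intensity profile, box-tool style:
    take the position of the strongest gradient, refined to subpixel
    accuracy by a parabolic fit around the gradient peak."""
    grad = np.abs(np.diff(profile.astype(float)))
    k = int(np.argmax(grad))
    if 0 < k < len(grad) - 1:
        y0, y1, y2 = grad[k - 1], grad[k], grad[k + 1]
        denom = y0 - 2 * y1 + y2
        if denom != 0:
            k = k + 0.5 * (y0 - y2) / denom
    return k + 0.5  # diff() samples sit between pixel centers

# Two hypothetical channel profiles of the same step edge, shifted by 3 px:
x = np.arange(40)
chan_a = (x > 15).astype(float)
chan_b = (x > 18).astype(float)
offset = edge_position(chan_b) - edge_position(chan_a)  # ~3.0 px
```

The resulting offset (in pixels) would then feed the calibrated offset-to-height mapping described earlier.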
FIG. 1) may further be configured to perform operations comprising determining contrast-based Z-height information for at least a portion of the workpiece surface region. For example, in addition to determining surface height measurement coordinates for surface points/discrete features based on determined image offsets, more traditional contrast-based Z-height determination techniques may also or alternatively be utilized for determining the Z-height of a portion of a workpiece surface region. In various implementations, such contrast-based Z-height determination techniques may be implemented through utilization of certain types of video tools (e.g., autofocus and/or focus height determination video tools, etc.). For example, commonly assigned U.S. Pat. No. 8,111,938, which is hereby incorporated herein by reference in its entirety, teaches various applications of autofocus video tools that utilize contrast-based Z-height determination techniques. - In various implementations, certain of the above noted techniques may be utilized in combination with motion control to scan at known and desired spacing transverse to the first array axis, such that the “pixel coordinates” in combination with the motion control position coordinates allow the reconstruction of a two-dimensional image if desired (e.g., such as may be utilized to form the
images 500A and 500B of FIGS. 5A and 5B). - In various implementations, a two-dimensional image that is assembled may be displayed by the surface profiling system. The two-dimensional image may also be “de-blurred” if desired, by compensating for or removing the local offset at local regions throughout the two-dimensional reconstructed image. In various implementations, if a two-dimensional imaging and detector configuration is utilized (e.g., having multiple columns of optical channels), then a detected two-dimensional image that corresponds to the size of the imaging detector array may be acquired without requiring motion and reconstruction. Once a detected or constructed two-dimensional image is provided, it may be analyzed using any method disclosed herein or otherwise known. As previously noted, the two-dimensional image may be analyzed to provide surface height information (e.g., resulting in three-dimensional surface profile data). The image may also be deblurred as described herein, to provide an extended depth of field (EDOF) image for viewing. Such an EDOF image may be helpful for defect inspection, in various implementations.
- In various implementations, a two-dimensional image may not be required depending on the nature of the designated output of the system. For example, if a cylindrical bore is being inspected to determine if there are defects in the cylindrical bore (e.g., inspecting for form errors or surface defects along the interior surface, etc.), an output provided by the system may primarily indicate whether or not the current cylindrical bore is free of defects or otherwise passes a designated inspection analysis. In such an implementation, as the surface height measurement coordinates are determined for the workpiece surface points on the interior bore surface, a warning or other indicator may be provided if a certain number of the surface height measurement coordinates are determined to deviate from expected values (e.g., if the interior surface of the cylindrical bore includes an unexpected number or depth of form errors or surface defects, etc.). In various implementations, such determinations may also be made based on a relative comparison between determined surface height measurement coordinates (e.g., wherein deviations among certain of the determined surface height measurement coordinates, such as in a given column, may indicate an unsmooth or otherwise defective interior bore surface, etc.)
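The pass/fail style of output described above can be sketched as a simple deviation count over the determined surface height measurement coordinates. The nominal value, tolerance, and outlier limit below are hypothetical placeholders chosen purely for illustration:

```python
def bore_passes(z_coords_um, nominal_um=0.0, tol_um=5.0, max_outliers=3):
    """Simple pass/fail check: flag the bore as failing if too many
    surface height coordinates deviate from the expected value by more
    than a tolerance. All thresholds here are illustrative placeholders."""
    outliers = sum(1 for z in z_coords_um if abs(z - nominal_um) > tol_um)
    return outliers <= max_outliers

# A smooth surface region and one with several deep deviations (um):
smooth = [0.5, -1.2, 2.0, 0.1, -0.8]
pitted = [0.5, 9.0, -7.5, 8.2, 0.1, -6.9]
```

A relative comparison within a column, as also described above, could be implemented the same way by measuring deviations from the column mean rather than from a fixed nominal value.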
-
FIGS. 6A and 6B are diagrams of one exemplary implementation of an imaging and detector configuration 605 of a surface profiling system 600. It will be appreciated that the surface profiling system 600 may have similar characteristics and will be understood to operate similarly to the surface profiling systems described above. As shown in FIGS. 6A and 6B, the imaging and detector configuration 605 includes an imaging detector array 610, an optical imaging array 630 and light source arrays. - In various implementations, the light source arrays may provide illumination for the workpiece surface region WSR, which is imaged by the optical imaging array 630 that is configured to provide an erect image of at least a portion of the workpiece surface region WSR in its imaged field of view IFOV. In various implementations, the surface profiling system 600 may be configured to have the imaging and detector configuration 605 scan along the workpiece surface region WSR in a direction that is transverse to a first array axis FA (e.g., in an X axis direction, which may correspond to a ϕ or “P” circumferential direction on a bore surface as described above with respect to FIG. 1). -
detector configuration 605 may include only a single set of detector pixels in the imaging detector array 610 and a single set of optical channels in the optical imaging array 630, in a one-dimensional configuration of the surface profiling system 600. In various alternative implementations, an imaging and detector configuration may include multiple sets of detector pixels in an imaging detector array and/or multiple sets of optical channels in an optical imaging array, in a multi-dimensional configuration of a surface profiling system. For example, in a two-dimensional configuration, an imaging detector array may include multiple sets (e.g., columns) of detector pixels arrayed generally along the direction of a first array axis FA. In addition, a corresponding optical imaging array may include multiple sets (e.g., columns) of optical channels having parallel optical axes, wherein each of the sets of optical channels may be arrayed generally along the direction of the first array axis FA. Each optical channel may be configured to input image light from a workpiece surface region located at an object distance along its optical axis in its field of view FOV and transmit the image light to a plurality of pixels of the imaging detector array located in its imaged IFOV. In such a configuration, the multiple sets of detector pixels and the multiple sets of optical channels may each be arrayed along a second array axis (e.g., in an X axis direction) that is transverse to the first array axis FA (e.g., each set being arranged in a respective column, wherein the columns are arranged along the second array axis, etc.). 
Stated another way, the imaging detector array 610 may include a plurality of similar sets of detector pixels, and the optical imaging array 630 may include a plurality of similar sets of optical channels, wherein the plurality of sets of detector pixels may be arranged adjacent to one another and the plurality of sets of optical channels may be arranged adjacent to one another, along a second array axis that is transverse to the first array axis FA. - It will be appreciated that in various implementations, multiple surface profiling systems and/or imaging and detector configurations (e.g., such as the
surface profiling system 600 and/or the imaging and detector configuration 605) may be utilized in combination. For example, multiple imaging and detector configurations may be arranged relative to one another in a specified configuration. In one specific implementation, an arrangement of first and second imaging and detector configurations may correspond to a V-shape (e.g., in a “triangulation” three-dimensional imaging configuration). For example, the optical imaging array of the first imaging and detector configuration may be arranged relative to the optical imaging array of the second imaging and detector configuration in a V-shape, as illustrated. Stated another way, a surface profiling system may comprise a first profiling subsystem that includes a first imaging detector array and a first optical imaging array corresponding to a first array axis that is approximately straight. It may further comprise a second profiling subsystem similar to the first profiling subsystem. The optical axes of the first profiling subsystem may approximately align with a first plane, and the optical axes of the second profiling subsystem may approximately align with a second plane, and the first and second planes may intersect at a line approximately parallel to their respective first array axes and form an angle in a plane transverse to their respective first array axes. The signal processing and control portion may be configured to use image data provided by the first and second profiling subsystems in combination to perform a three-dimensional measurement operation. It will be appreciated that in accordance with principles disclosed herein, such surface profiling systems may be made smaller than traditional surface profiling systems, which also allows for smaller combined implementations to be produced. -
FIG. 7 is a flow diagram illustrating one exemplary implementation of a routine 700 for acquiring and analyzing image data provided by a surface profiling system to determine a surface height measurement coordinate for a workpiece surface point. At a block 710, image data is acquired as provided by a surface profiling system for a workpiece surface point. As described above, the surface profiling system may include an imaging detector array and an optical imaging array with at least one set of optical channels, wherein the optical channels in a set are arrayed generally along the direction of a first array axis, and each optical channel is configured with its optical axis arranged transverse to the first array axis to input image light from a workpiece surface region located at an object distance along its optical axis in its field of view and to transmit the image light to a plurality of pixels of the imaging detector array located in its imaged field of view, and wherein each optical channel includes a lens arrangement configured to provide an erect image and the workpiece surface point is simultaneously imaged in at least two overlapping imaged fields of view of at least three of the optical channels. - At a
block 720, the image data is analyzed (e.g., by a signal processing and control portion) to determine a respective image offset according to the surface point being located at an object distance that is different than an object reference distance, for which the surface point is imaged at different respective positions in each of the at least two overlapping imaged fields of view. As described above, a difference between two (or more) of the respective positions may define a respective image offset for the surface point (e.g., in an implementation utilizing a point spread function, etc.). At a block 730, a surface height measurement coordinate is provided (e.g., by a signal processing and control portion) for the surface point along the direction of the optical axes, based at least in part on the determined respective image offset. In various implementations, the analyzing of the image data to determine the respective image offset may include at least one of Fourier analysis of spatial frequencies, auto-correlation operations, point spread functions, video tool operations, etc. - While preferred implementations of the present disclosure have been illustrated and described, numerous variations in the illustrated and described arrangements of features and sequences of operations will be apparent to one skilled in the art based on this disclosure. Various alternative forms may be used to implement the principles disclosed herein. Although bore scanning implementations have been emphasized in various system figures, it will be appreciated that these examples are illustrative only, and not limiting. For example, “planar” or flat panel surface scanning implementations may be provided based on the principles disclosed herein. 
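One way to realize the correlation-based analysis mentioned for block 720 is to locate the peak of the cross-correlation between two rows of image data from adjacent channels. The sketch below is illustrative (the Gaussian test feature is synthetic, and the integer-pixel result is a simplification; a practical implementation would refine the peak to subpixel accuracy):

```python
import numpy as np

def correlation_offset(sig_a, sig_b):
    """Estimate the shift of sig_b relative to sig_a (in samples) by
    locating the peak of their cross-correlation. Mean subtraction
    reduces the influence of a constant background level."""
    a = sig_a - np.mean(sig_a)
    b = sig_b - np.mean(sig_b)
    corr = np.correlate(b, a, mode="full")
    # In full mode, output index (len(a) - 1) corresponds to zero lag.
    return int(np.argmax(corr)) - (len(a) - 1)

x = np.arange(64, dtype=float)
row_a = np.exp(-0.5 * ((x - 30.0) / 2.0) ** 2)   # feature centered at 30
row_b = np.exp(-0.5 * ((x - 34.0) / 2.0) ** 2)   # same feature shifted +4
shift = correlation_offset(row_a, row_b)
```

The recovered shift plays the role of the image offset at block 720, which the calibrated offset-to-height mapping then converts into the surface height measurement coordinate of block 730.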
In addition, the various implementations described above can be combined to provide further implementations. All of the U.S. patents and U.S. patent applications referred to in this specification are incorporated herein by reference, in their entirety. Aspects of the implementations can be modified, if necessary to employ concepts of the various patents and applications to provide yet further implementations.
- These and other changes can be made to the implementations in light of the above-detailed description. In general, in the following claims, the terms used should not be construed to limit the claims to the specific implementations disclosed in the specification and the claims, but should be construed to include all possible implementations along with the full scope of equivalents to which such claims are entitled.
Claims (21)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/471,811 US10593718B2 (en) | 2017-03-28 | 2017-03-28 | Surface profiling and imaging system including optical channels providing distance-dependent image offsets |
Publications (2)
Publication Number | Publication Date |
---|---|
US20180286027A1 true US20180286027A1 (en) | 2018-10-04 |
US10593718B2 US10593718B2 (en) | 2020-03-17 |
Family
ID=63670897
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/471,811 Active 2037-12-22 US10593718B2 (en) | 2017-03-28 | 2017-03-28 | Surface profiling and imaging system including optical channels providing distance-dependent image offsets |
Country Status (1)
Country | Link |
---|---|
US (1) | US10593718B2 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111521617A (en) * | 2020-04-30 | 2020-08-11 | 上海御微半导体技术有限公司 | Optical detection apparatus, control method of optical detection apparatus, and storage medium |
US11118899B2 (en) * | 2017-11-02 | 2021-09-14 | Kawasaki Jukogyo Kabushiki Kaisha | Work support system and work support method |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8619082B1 (en) * | 2012-08-21 | 2013-12-31 | Pelican Imaging Corporation | Systems and methods for parallax detection and correction in images captured using array cameras that contain occlusions using subsets of images to perform depth estimation |
US9906771B2 (en) * | 2015-02-17 | 2018-02-27 | Samsung Electronics Co., Ltd. | Light-field camera |
US9945988B2 (en) * | 2016-03-08 | 2018-04-17 | Microsoft Technology Licensing, Llc | Array-based camera lens system |
Family Cites Families (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4849626A (en) | 1987-11-13 | 1989-07-18 | The Babcock & Wilcox Company | Fiber optic bore inspection probe |
US6542180B1 (en) | 2000-01-07 | 2003-04-01 | Mitutoyo Corporation | Systems and methods for adjusting lighting of a part based on a plurality of selected regions of an image of the part |
DE10004891C2 (en) | 2000-02-04 | 2002-10-31 | Astrium Gmbh | Focal area and detector for optoelectronic image recording systems, manufacturing process and optoelectronic image recording system |
US6791072B1 (en) | 2002-05-22 | 2004-09-14 | National Semiconductor Corporation | Method and apparatus for forming curved image sensor module |
US7786421B2 (en) | 2003-09-12 | 2010-08-31 | California Institute Of Technology | Solid-state curved focal plane arrays |
US7627162B2 (en) | 2005-01-31 | 2009-12-01 | Mitutoyo Corporation | Enhanced video metrology tool |
US7792423B2 (en) | 2007-02-06 | 2010-09-07 | Mitsubishi Electric Research Laboratories, Inc. | 4D light field cameras |
US7968959B2 (en) | 2008-10-17 | 2011-06-28 | The United States Of America As Represented By The Secretary Of The Navy | Methods and systems of thick semiconductor drift detector fabrication |
US7636204B1 (en) | 2007-10-30 | 2009-12-22 | LumenFlow Corp. | 360 degree view imaging system |
KR101733443B1 (en) | 2008-05-20 | 2017-05-10 | 펠리칸 이매징 코포레이션 | Capturing and processing of images using monolithic camera array with heterogeneous imagers |
US8372726B2 (en) | 2008-10-07 | 2013-02-12 | Mc10, Inc. | Methods and applications of non-planar imaging arrays |
US8111938B2 (en) | 2008-12-23 | 2012-02-07 | Mitutoyo Corporation | System and method for fast approximate focus |
DE102009019459B4 (en) | 2009-05-04 | 2012-02-02 | Hommel-Etamic Gmbh | Device for imaging the inner surface of a cavity in a workpiece |
US9442285B2 (en) | 2011-01-14 | 2016-09-13 | The Board Of Trustees Of The University Of Illinois | Optical component array having adjustable curvature |
DE102011117618B4 (en) | 2011-11-04 | 2019-07-18 | Jenoptik Industrial Metrology Germany Gmbh | Device for imaging the inner surface of a cavity in a workpiece |
US8570505B2 (en) | 2012-03-06 | 2013-10-29 | Siemens Energy, Inc. | One-dimensional coherent fiber array for inspecting components in a gas turbine engine |
US8754829B2 (en) | 2012-08-04 | 2014-06-17 | Paul Lapstun | Scanning light field camera and display |
US9412172B2 (en) | 2013-05-06 | 2016-08-09 | Disney Enterprises, Inc. | Sparse light field representation |
US8742325B1 (en) | 2013-07-31 | 2014-06-03 | Google Inc. | Photodetector array on curved substrate |
US9759670B2 (en) | 2014-12-23 | 2017-09-12 | Mitutoyo Corporation | Bore imaging system |
Non-Patent Citations (2)
Title |
---|
Broxton Wave optics theory and 3-D deconvolution for the light field microscope, 2013, IDS submitted on 07/31/2017 * |
Grin Tech Gradient Index Optics Technology, 12/2015, IDS submitted on 07/31/2017 * |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10154187B2 (en) | Apparatus and method for adjusting and / or calibrating a multi-camera module as well as the use of such an apparatus | |
CN109859272B (en) | Automatic focusing binocular camera calibration method and device | |
TWI665444B (en) | Defect inspection device and method | |
CN102239384B (en) | Optical apparatus for non-contact measurement or testing of body surface | |
US9880108B2 (en) | Bore imaging system | |
WO2001037025A1 (en) | Confocal imaging | |
KR101545186B1 (en) | method of correction of defect location using predetermined wafer image targets | |
US10593718B2 (en) | Surface profiling and imaging system including optical channels providing distance-dependent image offsets | |
CN109952526B (en) | System and method for dimensional calibration of analytical microscopes | |
US8937654B2 (en) | Machine vision inspection system comprising two cameras having a rotational offset | |
US10731971B1 (en) | Method of measuring 3D profile | |
JP6955910B2 (en) | Super resolution bore imaging system | |
CN101673043A (en) | Wide-angle distortion testing system and method | |
US20110292200A1 (en) | Scanning microscope | |
CN113614617A (en) | Collimator | |
US7551296B2 (en) | Method for determining the focal position of at least two edges of structures on a substrate | |
US9759670B2 (en) | Bore imaging system | |
US10794679B2 (en) | Method and system for measuring geometric parameters of through holes | |
CN106461382B (en) | Five-axis optical detection system | |
CN115127483A (en) | Detection method for measuring coaxiality and system for detecting coaxiality | |
JP2008145121A (en) | Three-dimensional shape measuring apparatus | |
JP2016095243A (en) | Measuring device, measuring method, and article manufacturing method | |
US20180276836A1 (en) | End face inspection device and focused image data acquisition method therefor | |
WO2023182095A1 (en) | Surface shape measurement device and surface shape measurement method | |
US7807951B1 (en) | Imaging sensor system with staggered arrangement of imaging detector subelements, and method for locating a position of a feature in a scene |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: MITUTOYO CORPORATION, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:TOBIASON, JOSEPH DANIEL;REEL/FRAME:041770/0571 Effective date: 20170327 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |