EP1040449A1 - Method and apparatus for calibrating a non-contact range sensor - Google Patents

Method and apparatus for calibrating a non-contact range sensor

Info

Publication number
EP1040449A1
Authority
EP
European Patent Office
Prior art keywords
deviations
data
random noise
sensor
estimating
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP99935950A
Other languages
English (en)
French (fr)
Inventor
Van-Duc Nguyen
Victor Nzomigni
Charles Vernon Stewart
Kishore Bubna
Lila Abdessemed
Nathalie Poirier
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
General Electric Co
Original Assignee
General Electric Co
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US09/360,032 (US6411915B1)
Application filed by General Electric Co filed Critical General Electric Co
Publication of EP1040449A1
Current legal status: Withdrawn

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T 7/35 Determination of transform parameters for the alignment of images, i.e. image registration using statistical methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Definitions

  • the present invention relates generally to digital image processing techniques and in particular to a method and apparatus for calibrating a non-contact range sensor of the type used to obtain digital images of objects for the purpose of inspecting the objects for defects.
  • Airfoils, e.g. turbine blades, are examples of objects to be inspected in this manner.
  • digital data and images of the objects are inspected for deformation parameters such as platform orientation, contour cross-section, bow and twist along stacking axis, thickness and chord length at given cross-sections.
  • CMMs (coordinate measurement machines)
  • Non-contact full-field sensors scan the external surfaces of opaque objects using laser or white light. These sensors are capable of scanning the part quickly while obtaining large quantities of data. Examples of non-contact sensors include sensors based on a laser line grating and stereo triangulation, and sensors based on a single laser line scan combined with rotation of the part. Another non-contact sensor is based on phase-shift Moiré and white light.
  • a method for calibrating a non-contact range sensor comprises the steps of scanning the object to be inspected with the range sensor to obtain a range image of the object.
  • the range image is registered with a reference image of the object to provide registered data. Normal deviations between the registered data and the reference image are computed. Noise is filtered from the normal deviations.
  • a plurality of bias vectors and a covariance matrix are estimated and used to generate a lookup table comprising geometric correction factors.
  • the range image of the object is compensated in accordance with the lookup table.
  • Figure 1 is a flowchart of the method of the present invention for calibrating a full-field non-contact range sensor;
  • Figure 2 is a perspective view of a plurality of projections of a bias vector disposed along the z-axis;
  • Figure 3 is a perspective view of the x, y and z components of the bias vector illustrated in FIG. 2; and
  • Figure 4 is a flowchart of one method of registering a range image of an object with a reference image of the object.
  • sensor calibration is defined as the process of identifying bias errors in measurements obtained by the sensor such that later-obtained measurements can be compensated for the errors.
  • the sensor is calibrated in a sequence of steps as follows.
  • the sensor is employed to obtain a range image of an object to be inspected.
  • the sensor scans the object as the object translates and rotates through its working volume.
  • range data points are obtained.
  • the data points are arranged in data sets, also referred to herein as a scanned image, and the data sets are registered to one or more reference data sets.
  • Reference data comprises geometric ideal models of the object and is also referred to herein as a reference image.
  • the scanned image and the reference image are registered to each other such that both are represented in a common coordinate system. Normal deviations are then computed from the scanned image to the registered image. Random noise, which describes precision, and bias error, which describes accuracy, are estimated from the normal deviations and stored in a lookup table.
  • the lookup table is employed for 3D geometric correction and multiple-view integration of images obtained by the sensor (illustrative code sketches of these steps appear at the end of this list).
  • Step 42 of method 40 obtains a scanned image of an object to be inspected by translating the object on a translation stage and rotating the object on a rotation stage along a plurality of directions throughout the three-dimensional working volume of the object. This step is accomplished using typical scanning and digital imaging systems and techniques.
  • a reference image comprising data points corresponding to the ideal shape of the object, and the scanned image of the actual object are registered.
  • One method of obtaining a reference image with which to register the scanned image is to grind a planar object to achieve flatness.
  • the object itself is measured using a more accurate sensor like a CMM, and the measurements stored as reference data.
  • computer-aided design (CAD) data or other engineering or design specifications may be used to provide a reference image.
  • one embodiment of the invention uses a Robust-Closest-Patch (RCP) algorithm in the registration process.
  • the RCP algorithm is the subject of co-pending application SN 09/303,241, entitled "Method and Apparatus for Image Registration", filed 4/30/99, assigned to the assignee of the present invention, and hereby incorporated by reference.
  • in the RCP algorithm, given an initial rigid pose, each scanned image data point is matched to its nearest data point on the reference image, as indicated at 62 and 64 of FIG. 4.
  • the RCP algorithm matches reference image data points and scanned image data points based on the current pose estimate, and then refines the pose estimate based on these matches.
  • pose is solved by singular value decomposition (SVD) of a linear system in six parameters.
  • One embodiment of the invention employs a linear and symmetric formulation of the rotation constraint using Rodrigues' formula rather than quaternions or orthonormal matrices. This is a simplified method of solving for pose.
  • a typical M-estimator is employed to estimate both the rigid pose parameters and the error standard deviation to ensure robustness to gross errors in the data.
  • the 3D rigid transformation that best aligns these matches is determined at step 66 and applied in step 68 of FIG. 4.
  • the foregoing steps are repeated using the most recently estimated pose until convergence. Accordingly, registration step 44 of FIG. 1 solves for the translation direction and the axis of rotation by which the reference image may be represented in the scanned image coordinates.
  • at step 46 of FIG. 1, normal deviations from registered data points to data points of the scanned image are found.
  • an approximate normal distance between model data points, also referred to herein as patches, and data surface points is used, avoiding the need to estimate local surface normals and curvatures from noisy data.
  • one embodiment of the invention computes the raw error e_ij along the normal n_k, for the given direction D_k, at the point q_ij, according to the relationship:
  • Random noise is filtered from the normal deviations using a Gaussian or mean filter.
  • the projection b_ijk of the bias vector b_ij at each point is determined by:
  • where M is a small region surrounding the data point.
  • FIG. 2 illustrates a portion of step 50, the estimation of bias vectors.
  • Bias values 56 are determined for a set of planes 58, from the plane 59 closest to the sensor, to the plane furthest from the sensor, along the z axis.
  • a 3D geometric correction table is built by finding the projection of the bias vectors at regularly (evenly) spaced grid points p in the 3D volume of the object. To accomplish this, the projections of the bias vectors found at each data point q are down-sampled and interpolated.
  • One embodiment of the invention employs Gaussian interpolation to interpolate the projections of the bias vectors.
  • step 50 estimates bias vectors as a function of surface location and orientation based upon the projection of the bias vector along a plurality of normals, all projections stemming from the same grid point.
  • the previous steps provide, for each direction D_k, the normal to the planes n_k, and the projections of the bias vector, b_pk, at each grid point p.
  • the following system of relationships is solved for each grid point p of the 3D working volume of the object to be inspected:
  • N is the k x 3 matrix of the normal vectors;
  • v_p is the k x 1 vector of the bias projections at grid point p.
  • the system described above is solved by applying a typical singular value decomposition (SVD) algorithm when the rank of N is 3.
  • Figure 3 illustrates the X, Y and Z components of the bias vectors in a 3D grid.
  • a covariance matrix is estimated.
  • the covariance matrix provides a model of the random noise, and hence the precision, of the range sensor, based on data points corrected for bias as described above. Principal eigenvalues and eigenvectors of the covariance matrix are found and analyzed to simplify the noise model. To accomplish this, the variance in the normal direction n_k at the grid point p is determined by the relationship:
  • where n is the vector:
  • s_p, which contains the independent terms of the covariance matrix S_p, is estimated by forming the k x 6 matrix as follows:
  • the foregoing is solved using a typical least squares approach.
  • An alternative embodiment of the invention employs an SVD or pseudo-inverse approach, as long as the rank of T_p is at least 6.
  • at step 54, the calibration obtained by the process of the invention is verified using an independent data source relating to known features of the object.
  • use of a corner artifact allows independent verification of the sensor calibration.
  • sphere artifacts of different radii can be employed to reveal the effect of curvature on the 3D geometric correction and the sensor noise model.
  • the method 40 can be used to implement on-site calibration checks and re-calibration procedures that can extend the working life of the sensor.
  • calibration checks are performed at least at 8-hour intervals, and re-calibration procedures at least every week.
  • the present invention can be embodied in the form of computer-implemented processes and apparatuses for practicing those processes.
  • the present invention can also be embodied in the form of computer program code containing instructions embodied in tangible media, such as floppy diskettes, CD-ROMs, hard drives, or any other computer-readable storage medium, wherein, when the computer program code is loaded into and executed by a computer, the computer becomes an apparatus for practicing the invention.
  • the present invention can also be embodied in the form of computer program code, for example, whether stored in a storage medium, loaded into and/or executed by a computer, or transmitted over some transmission medium, such as over electrical wiring or cabling, through fiber optics, or via electromagnetic radiation, wherein, when the computer program code is loaded into and executed by a computer, the computer becomes an apparatus for practicing the invention.
  • computer program code segments configure the microprocessor to create specific logic circuits.
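
Illustrative code sketches

The sketches below illustrate the calibration steps summarized in the Definitions above. They are minimal Python/NumPy examples written for this page, not the implementation claimed in the patent; every function name, parameter value, and simplification is an assumption made for illustration only.

The first sketch covers the registration step: each scanned point is matched to its closest reference point and the rigid pose is refined, with the pose solve reduced to a singular value decomposition. The robust M-estimator and the Rodrigues-formula linearization described above are omitted, so this is generic iterative-closest-point registration rather than the referenced RCP algorithm.

    import numpy as np
    from scipy.spatial import cKDTree

    def best_rigid_transform(src, dst):
        """Least-squares rigid transform (R, t) mapping src onto dst, via SVD."""
        src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
        H = (src - src_c).T @ (dst - dst_c)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:      # guard against a reflection solution
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = dst_c - R @ src_c
        return R, t

    def register(scan, reference, max_iters=50, tol=1e-9):
        """Iteratively match scan points to their nearest reference points and
        refine the rigid pose until the mean residual stops improving."""
        tree = cKDTree(reference)
        R, t = np.eye(3), np.zeros(3)
        prev_err = np.inf
        for _ in range(max_iters):
            moved = scan @ R.T + t
            dist, idx = tree.query(moved)              # closest-point matches
            R_step, t_step = best_rigid_transform(moved, reference[idx])
            R, t = R_step @ R, R_step @ t + t_step     # compose incremental pose
            err = dist.mean()
            if abs(prev_err - err) < tol:
                break
            prev_err = err
        return R, t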
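
Once registered, normal deviations from the data to the reference are computed and random noise is filtered from them with a Gaussian (or mean) filter; what survives the filtering is treated as systematic bias. The sketch below assumes the simple case of a flat reference artifact scanned on a regular grid; the grid size, plane, and noise level are made-up example values.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def normal_deviations(points, plane_point, plane_normal):
        """Signed distance of each 3D point from a reference plane, measured
        along the plane normal."""
        n = plane_normal / np.linalg.norm(plane_normal)
        return (points - plane_point) @ n

    # Example: an H x W grid of registered range points scanned against a
    # nominally flat artifact, with a small tilt bias and random noise added.
    H, W = 64, 64
    rng = np.random.default_rng(0)
    xs, ys = np.meshgrid(np.linspace(0.0, 1.0, W), np.linspace(0.0, 1.0, H))
    zs = 0.02 * xs + rng.normal(0.0, 0.001, (H, W))
    points = np.stack([xs, ys, zs], axis=-1).reshape(-1, 3)

    dev = normal_deviations(points,
                            plane_point=np.zeros(3),
                            plane_normal=np.array([0.0, 0.0, 1.0])).reshape(H, W)

    # Low-pass filter the deviation map: random noise is suppressed and the
    # remaining smooth component is taken as the bias for this view direction.
    bias_map = gaussian_filter(dev, sigma=2.0)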
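
The projection of the bias vector at each data point is an average of the filtered deviations over a small region M around that point, and the 3D correction table is built by down-sampling those projections onto evenly spaced grid points using Gaussian interpolation. One possible reading of that resampling step, with the kernel width sigma as an assumed parameter:

    import numpy as np

    def gaussian_resample(sample_pts, sample_vals, grid_pts, sigma):
        """Gaussian-weighted average of scattered bias projections at each
        regularly spaced grid point (a stand-in for the 'Gaussian
        interpolation' named above).

        sample_pts  : (N, 3) measured data point positions
        sample_vals : (N,) bias projections at those points
        grid_pts    : (M, 3) evenly spaced grid points of the correction table
        """
        out = np.empty(len(grid_pts))
        for i, p in enumerate(grid_pts):
            d2 = np.sum((sample_pts - p) ** 2, axis=1)
            w = np.exp(-0.5 * d2 / sigma ** 2)
            out[i] = np.sum(w * sample_vals) / np.sum(w)
        return out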
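
With the projections v_p of the bias onto the k scan-direction normals collected at a grid point, the 3D bias vector b_p is recovered by solving the k x 3 system N b_p = v_p, which the text says is done with a typical SVD algorithm when the rank of N is 3. A sketch using NumPy's SVD-based least-squares solver, with toy data for the check:

    import numpy as np

    def bias_vector(normals, projections):
        """Recover the 3D bias vector b_p at a grid point from its projections
        onto the scan-direction normals by a least-squares (SVD-based) solve
        of N @ b_p = v_p."""
        N = np.asarray(normals, dtype=float)       # k x 3 matrix of unit normals
        v = np.asarray(projections, dtype=float)   # k projections of the bias
        if np.linalg.matrix_rank(N) < 3:
            raise ValueError("normals must span 3D to recover the bias vector")
        b_p, *_ = np.linalg.lstsq(N, v, rcond=None)
        return b_p

    # Toy check: projections of a known bias vector are recovered exactly.
    true_b = np.array([0.010, -0.020, 0.005])
    N = np.array([[1.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0],
                  [0.0, 0.0, 1.0],
                  [0.577, 0.577, 0.577]])
    assert np.allclose(bias_vector(N, N @ true_b), true_b)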
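
For the noise model, the variance measured along each normal n_k at a grid point is related to the symmetric covariance matrix S_p by the standard quadratic form sigma_k^2 = n_k' S_p n_k, which appears to be what the k x 6 matrix above encodes; the six independent terms s_p are then found by least squares (or SVD / pseudo-inverse) when the rank of that matrix is at least 6, and the principal eigenvalues and eigenvectors summarize the noise. A sketch under that reading:

    import numpy as np

    def covariance_from_normal_variances(normals, variances):
        """Estimate the 3x3 noise covariance S_p at a grid point from the
        variances measured along each scan-direction normal, assuming
        sigma_k^2 = n_k^T S_p n_k."""
        N = np.asarray(normals, dtype=float)            # k x 3 unit normals
        nx, ny, nz = N[:, 0], N[:, 1], N[:, 2]
        # k x 6 design matrix for the six independent terms of S_p
        T = np.column_stack([nx**2, ny**2, nz**2,
                             2 * nx * ny, 2 * nx * nz, 2 * ny * nz])
        if np.linalg.matrix_rank(T) < 6:
            raise ValueError("need at least 6 independent scan directions")
        s, *_ = np.linalg.lstsq(T, np.asarray(variances, dtype=float), rcond=None)
        S_p = np.array([[s[0], s[3], s[4]],
                        [s[3], s[1], s[5]],
                        [s[4], s[5], s[2]]])
        # Principal eigenvalues/eigenvectors simplify the noise model.
        eigvals, eigvecs = np.linalg.eigh(S_p)
        return S_p, eigvals, eigvecs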
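
Finally, a range image of an inspected object is compensated in accordance with the lookup table of geometric correction factors. How the table is applied is not spelled out above; one plausible reading, sketched here, is to interpolate the stored bias vector at each scanned point over the regular grid (trilinear interpolation via SciPy) and subtract it. Multiple-view integration would then merge several compensated range images, which is beyond this sketch.

    import numpy as np
    from scipy.interpolate import RegularGridInterpolator

    def compensate(points, grid_axes, bias_grid):
        """Subtract the interpolated bias vector from each scanned 3D point.

        points    : (N, 3) scanned coordinates
        grid_axes : tuple of 1D arrays (xs, ys, zs) defining the lookup grid
        bias_grid : (len(xs), len(ys), len(zs), 3) bias vectors at grid points
        """
        corrected = np.asarray(points, dtype=float).copy()
        for axis in range(3):
            interp = RegularGridInterpolator(grid_axes, bias_grid[..., axis],
                                             bounds_error=False, fill_value=0.0)
            corrected[:, axis] -= interp(points)
        return corrected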

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Probability & Statistics with Applications (AREA)
  • Image Processing (AREA)
  • Length Measuring Devices By Optical Means (AREA)
EP99935950A 1998-07-28 1999-07-27 Verfahren und gerät zur kalibrierung eines berührungslosen abstandssensors Withdrawn EP1040449A1 (de)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US9444498P 1998-07-28 1998-07-28
US94444P 1998-07-28
US360032 1999-07-23
US09/360,032 US6411915B1 (en) 1998-07-28 1999-07-23 Method and apparatus for calibrating a non-contact range sensor
PCT/US1999/016941 WO2000007146A1 (en) 1998-07-28 1999-07-27 Method and apparatus for calibrating a non-contact range sensor

Publications (1)

Publication Number Publication Date
EP1040449A1 true EP1040449A1 (de) 2000-10-04

Family

ID=26788890

Family Applications (1)

Application Number Title Priority Date Filing Date
EP99935950A Withdrawn EP1040449A1 (de) 1998-07-28 1999-07-27 Verfahren und gerät zur kalibrierung eines berührungslosen abstandssensors

Country Status (3)

Country Link
EP (1) EP1040449A1 (de)
JP (1) JP2002521699A (de)
WO (1) WO2000007146A1 (de)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060008178A1 (en) * 2004-07-08 2006-01-12 Seeger Adam A Simulation of scanning beam images by combination of primitive features extracted from a surface model
CN105144241B (zh) 2013-04-10 2020-09-01 皇家飞利浦有限公司 图像质量指数和/或基于其的成像参数推荐
US11227414B2 (en) 2013-04-10 2022-01-18 Koninklijke Philips N.V. Reconstructed image data visualization
ES2537783B2 (es) * 2013-06-14 2015-09-29 Universidad De Sevilla Procedimiento para la obtención de una imagen teledetectada a partir de fotografía

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
BE790830A (fr) 1971-11-02 1973-02-15 Nordson Corp Pulverisateur electrostatique
US4051981A (en) 1975-12-15 1977-10-04 Louis John Mandlak Powder gun
GB1588393A (en) 1978-04-27 1981-04-23 Bushnell R B Electrostatic powder spraying
DE3811309C2 (de) 1988-04-02 1997-09-04 Weitmann & Konrad Fa Vorrichtung zum Zerstäuben von Puder
EP0381067A3 (de) * 1989-01-31 1992-08-12 Schlumberger Technologies, Inc. Verfahren zur C.A.D.-Modellspeicherung auf Videobildern mit Addierung der Unordnung

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO0007146A1 *

Also Published As

Publication number Publication date
JP2002521699A (ja) 2002-07-16
WO2000007146A1 (en) 2000-02-10

Similar Documents

Publication Publication Date Title
JP4815052B2 (ja) 滑らかな表面を有する物体の形状変形探索装置及びその方法
US6504957B2 (en) Method and apparatus for image registration
US12001191B2 (en) Automated 360-degree dense point object inspection
US7327857B2 (en) Non-contact measurement method and apparatus
US6822748B2 (en) Calibration for 3D measurement system
US8526705B2 (en) Driven scanning alignment for complex shapes
EP3963414A2 (de) Automatisierte 360-grad-inspektion von objekten mit hoher punktdichte
CN112082491A (zh) 基于点云的高度检测方法
EP1612734A2 (de) Verfahren und Vorrichtung für Grenzschätzung mit CT Metrologie
US7840367B2 (en) Multi-modality inspection system
CN112686961B (zh) 一种深度相机标定参数的修正方法、装置
Jokinen Self-calibration of a light striping system by matching multiple 3-d profile maps
US6411915B1 (en) Method and apparatus for calibrating a non-contact range sensor
WO2000007146A1 (en) Method and apparatus for calibrating a non-contact range sensor
CN112116665A (zh) 一种结构光传感器标定方法
US20030012449A1 (en) Method and apparatus for brightness equalization of images taken with point source illumination
Di Leo et al. Uncertainty of line camera image based measurements
US20230274454A1 (en) Correction mapping
Haig et al. Lens inclination due to instable fixings detected and verified with VDI/VDE 2634 Part 1
US20230385479A1 (en) Making a measurement relating to an object depending on a derived measurement surface
CN113409367A (zh) 条纹投影测量点云逐点加权配准方法、设备和介质
CN114596341A (zh) 面向大视场运动目标的多相机高精度三维位姿跟踪方法
CN118314217A (zh) 一种用于工件形位尺寸测量的多目相机标定方法及装置
D'Argenio et al. Metrological performances of camera self-calibration techniques
CN116576882A (zh) 一种惯性激光扫描仪系统误差辨识及标定方法

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): DE FR GB IT

17P Request for examination filed

Effective date: 20000810

17Q First examination report despatched

Effective date: 20080814

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20150203