JPS6221011A - Measuring apparatus by light cutting method - Google Patents

Measuring apparatus by light cutting method

Info

Publication number
JPS6221011A
Authority
JP
Japan
Prior art keywords
image data
light
image
memory
intensity
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP15837485A
Other languages
Japanese (ja)
Inventor
Kazunori Noso
千典 農宗
Hiroshi Saito
浩 斎藤
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nissan Motor Co Ltd
Original Assignee
Nissan Motor Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nissan Motor Co Ltd filed Critical Nissan Motor Co Ltd
Priority to JP15837485A priority Critical patent/JPS6221011A/en
Publication of JPS6221011A publication Critical patent/JPS6221011A/en
Pending legal-status Critical Current

Landscapes

  • Length Measuring Devices By Optical Means (AREA)

Abstract

PURPOSE: To enable recognition of 3-D shape information and high-accuracy measurement by imaging an object while varying the intensity of the slit light projected onto it in steps and processing the plurality of image data thus obtained. CONSTITUTION: Grayscale image data output from a camera 4 is written into an image memory 6 through a capture section 5. A CPU 7 varies the intensity of a slit light 2 in three steps for the same part of the same object 3, without changing the position of a light cutting line 2a, and stores in memories 6a-6c the three kinds of image data obtained from the camera 4 at each light intensity. The density values of the image data are compared at each image address (x, y), and the density value that is brightest while still darker than a specified threshold is taken as the representative value at that address, so that one set of image data is synthesized from the partially extracted light cutting line images 2b. The synthesized image data is stored in a memory 6d and then processed, thereby enabling recognition of 3-D shape information and high-accuracy measurement.

Description

DETAILED DESCRIPTION OF THE INVENTION

[Technical Field of the Invention] The present invention relates to a measuring apparatus that obtains three-dimensional shape information of an object by the light cutting method (also called the light sectioning method).

[Technical Background of the Invention and Problems] FIG. 5(A) shows an outline of the optical system used in the light cutting method.

Slit light 2 from a light source 1 is projected onto an object 3, and the reflected light is imaged by a television camera 4.

The television camera 4 outputs grayscale image data with gradation that contains an image of the bright line 2a formed by the slit light 2 on the surface of the object 3 (hereinafter referred to as the light cutting line).

An example of the image data is shown in FIG. 5(B). In this figure, 2b is the light cutting line image corresponding to the light cutting line 2a. The three-dimensional shape information of the object 3 is obtained by appropriately processing this image data.

To perform highly accurate measurement with the light cutting method, it is necessary to obtain a light cutting line image 2b that is as narrow as possible and whose density is clearly distinguishable from the background. However, because the reflectance of the surface of the object 3 varies from part to part, and the distance between the light cutting line 2a and the camera 4 also varies from part to part, the density of the light cutting line image 2b in the image data is not uniform, and its dynamic range is generally very wide.

If the density of the light cutting line image 2b is too dark, it becomes difficult to distinguish it from the background density, which causes erroneous measurement. If the brightness of the light cutting line 2a on the object 3 is too high, the camera 4 saturates in that part and blooming, smear, and the like occur. In that case, not only does the density of the corresponding part of the light cutting line image 2b become brighter, but its line width also becomes thicker, making it difficult to detect the three-dimensional shape information of the object 3 accurately. Normally, the image data obtained from the camera 4 is first subjected to a thinning process that reduces the line width of the light cutting line image 2b contained in it, and the three-dimensional shape recognition processing is performed afterwards. As this thinning process, for example, the technique disclosed in Japanese Patent Application Laid-Open No. 56-70407 is known.

Whatever thinning process is applied, if the dynamic range of the density of the light cutting line image 2b is too large, the expected processing result cannot be obtained: the light cutting line image 2b partially disappears, or is thinned in a deformed shape.

[Object of the Invention] The present invention has been made in view of the conventional problems described above. Its object is to provide a measuring apparatus based on the light cutting method that obtains image data in which the density of the light cutting line image has a small dynamic range and is uniformly at an appropriate level, thereby enabling highly accurate recognition and measurement of three-dimensional shape information.

[Summary of the Invention] The measuring apparatus based on the light cutting method of the present invention comprises: imaging means for imaging a light cutting line generated by the light cutting method using slit light and obtaining grayscale image data with gradation; means for temporarily storing a plurality of image data obtained by the imaging means while varying the slit light intensity for the same part of the same object; and means for comparing the density values at the same pixel address in the plurality of image data and extracting, as the representative value for that pixel address, the density value that is darker than a specified threshold and is the brightest among those. FIG. 1 is a block diagram illustrating these constituent elements.
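The selection rule in this summary can be sketched in a few lines of code. The following is a rough illustration only, not part of the patent text: the function name, the use of NumPy, and the fallback for fully saturated pixels are our assumptions.

```python
import numpy as np

def composite_below_threshold(images, threshold):
    """Keep, at each pixel address, the brightest density value that is
    still darker than `threshold`.

    images    : 2-D grayscale frames of the same scene, taken at different
                slit-light intensities and ordered from strongest to weakest
    threshold : density value set slightly below the camera saturation level
    """
    stack = np.stack(images).astype(np.int16)        # shape (n_levels, H, W)
    masked = np.where(stack < threshold, stack, -1)  # hide (near-)saturated pixels
    best = masked.max(axis=0)                        # brightest non-saturated value
    # Where every frame is at or above the threshold, fall back to the
    # weakest-intensity frame (mirroring the final branch of the embodiment).
    return np.where(best >= 0, best, stack[-1]).astype(np.uint8)
```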

[Embodiments of the Invention] The configuration of the optical system shown in FIG. 5, consisting of the slit light source 1, the slit light 2, the light cutting line 2a, the object 3, and the television camera 4, is the same in the present invention.

As shown in FIG. 2, the grayscale image data output from the camera 4 is written into the image memory 6 via the capture section 5, and the computer 7 applies the processing described in detail below to the image data stored in the memory 6. The slit light source 1 uses a laser, and the intensity of the slit light 2 is variably controlled by the computer 7 via the control section 8.
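A minimal sketch of how the blocks of FIG. 2 might be wired together in software follows; the class name, attribute names, and method signatures are hypothetical and only illustrate the data flow (computer 7 → control section 8 → light source 1, and camera 4 → capture section 5 → image memory 6).

```python
from dataclasses import dataclass, field
import numpy as np

@dataclass
class LightCutSystem:
    """Hypothetical stand-in for the hardware blocks of FIG. 2."""
    light_source: object            # expected to offer set_intensity(level)
    camera: object                  # expected to offer grab() -> 2-D uint8 array
    image_memory: dict = field(default_factory=dict)  # plays the role of memory 6

    def capture_at(self, level: str) -> np.ndarray:
        # Computer 7 adjusts the slit-light intensity via control section 8.
        self.light_source.set_intensity(level)
        # The camera 4 output is fetched through the capture section 5 ...
        frame = self.camera.grab()
        # ... and written into the image memory 6 under the chosen level.
        self.image_memory[level] = frame
        return frame
```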

The computer 7 varies the intensity of the slit light 2 in three steps, "strong", "medium", and "weak", for the same part of the same object 3 (without changing the position of the light cutting line 2a), and stores in the image memory 6 the three kinds of image data obtained from the camera 4 at each light intensity.
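Continuing the hypothetical sketch above, the three-level acquisition could look like this; the level names and the example threshold of 250 (for 8-bit data) are assumptions, since the patent only states that the threshold Th is slightly below the saturation level.

```python
# Capture the same scene (same light cutting line position) at three
# slit-light intensities; frames[0..2] correspond to memories (a), (b), (c).
frames = [system.capture_at(level) for level in ("strong", "medium", "weak")]

# Composite image for memory (d), using the earlier sketch;
# 250 is only an assumed example value for the threshold Th.
composite = composite_below_threshold(frames, threshold=250)
```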

As shown in FIG. 2, the image memory 6 contains four image memories (a), (b), (c), and (d) corresponding to the x-y plane of the image data. The image data at the "strong" light intensity is stored in the image memory (a), the image data at the "medium" light intensity is stored in the image memory (b), and the image data at the "weak" light intensity is stored in the image memory (c). Examples of these three kinds of image data are shown in FIG. 4.

In the image data in memory (a), many parts of the light cutting line image 2b are saturated and thick, but no part is too dark to have disappeared. In the image data in memory (b), some parts of the light cutting line image 2b are saturated and some parts have disappeared. In the image data in memory (c), many parts of the light cutting line image 2b have disappeared, but no part is saturated.

In the present invention, the best parts of the light cutting line image 2b are extracted from the above three kinds of image data, and one set of image data is synthesized from the partially extracted light cutting line images 2b. The synthesized image data is stored in memory (d), and this data is used for the recognition and measurement processing of the three-dimensional shape information.

In the composite image illustrated in FIG. 4(d), the bottom line is extracted from memory (a), the middle line is extracted from memory (b), and the top line is extracted from memory (c).

In creating the above composite image data, data is extracted according to the following processing criterion: the density values at the same pixel address in the three kinds of image data are compared, and the density value that is darker than a predetermined threshold Th (set to a value slightly lower than the saturation level) and is the brightest among those is extracted as the representative value for that pixel address. The flowchart of FIG. 3 shows the processing procedure when the above processing is performed by the computer 7.

In steps 100, 101, and 102, the intensity of the slit light 2 is changed in three steps, and the three kinds of image data are stored in memories (a), (b), and (c), respectively.

To create the above composite image data, the following processing is performed for each pixel in the x-y plane of the image data. First, in steps 103 and 104, the pixel address (x, y) to be processed is initialized.

In the next step 105, the density value Va(x, y) at the pixel address (x, y) is read from the image memory (a) and compared in magnitude with the threshold Th described above.

If the density value Va(x, y) is smaller than the threshold Th, then, since the data in memory (a) is the data at the "strong" light intensity, Va(x, y) is judged to be a density value that is darker than the threshold Th and yet brighter than those in the other memories (b) and (c), and in step 106 the same density value as Va(x, y) is written to the pixel address (x, y) of the image memory (d).

Then, in step 110, the pixel address y is updated; in step 111, it is confirmed that y is not the final address M, and the process returns to step 105 described above.

If it is determined in step 105 that Va(x, y) is larger (brighter) than the threshold Th, this means that the density value at this pixel address in memory (a) is nearly at the saturation level. In this case, the density value Vb(x, y) at the pixel address (x, y) is read from the image data in memory (b) obtained at the "medium" light intensity and compared in magnitude with the threshold Th. If Vb(x, y) is smaller (darker) than the threshold Th, that density value is written to the pixel address (x, y) of the image memory (d) in step 108. If Vb(x, y) is larger (brighter) than the threshold Th, then in step 109 the density value Vc(x, y) at the pixel address (x, y) of memory (c), in which the image data at the "weak" light intensity is stored, is written to the pixel address (x, y) of the image memory (d). This processing is carried out up to the final address M of y, and further up to the final address N of x, so that the entire x-y plane of the image data is processed. Composite image data as shown in FIG. 4 is thus generated in memory (d).
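Transcribed as explicit loops, the flowchart logic reads as below (a sketch assuming the three images are NumPy arrays indexed as [x, y] with final addresses N and M). Because the strong-intensity value is normally the brightest and the weak-intensity value the darkest, taking the first value below Th in this order amounts to taking the brightest value below Th.

```python
import numpy as np

def synthesize_memory_d(a, b, c, th):
    """a, b, c: images at 'strong', 'medium', 'weak' slit-light intensity
    (memories (a), (b), (c)); th: threshold Th, slightly below saturation."""
    n, m = a.shape                     # final addresses N (x) and M (y)
    d = np.empty_like(a)               # plays the role of image memory (d)
    for x in range(n):
        for y in range(m):
            if a[x, y] < th:           # steps 105-106: Va below Th -> use Va
                d[x, y] = a[x, y]
            elif b[x, y] < th:         # memory (a) saturated -> use Vb (step 108)
                d[x, y] = b[x, y]
            else:                      # step 109: both saturated -> use Vc
                d[x, y] = c[x, y]
    return d
```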

[Effects of the Invention] As described in detail above, in the measuring apparatus based on the light cutting method of the present invention, the object is imaged while the intensity of the slit light projected onto it is varied stepwise, and the best density parts of the light cutting line image are extracted from the plurality of image data obtained. Therefore, even if the reflectance of the object surface varies greatly from part to part, or the height difference relative to the camera is large, the dynamic range of the density of the light cutting line image becomes very small and is brought almost uniformly to an appropriate level. Accordingly, the three-dimensional shape information can be recognized and measured with high accuracy.

[Brief Description of the Drawings]

FIG. 1 is a block diagram illustrating the constituent elements recited in the claims; FIG. 2 is a block diagram showing the outline of the hardware configuration of an apparatus according to one embodiment of the present invention; FIG. 3 is a flowchart showing the data processing procedure executed by the computer of FIG. 2; FIG. 4 is a diagram showing the operation of the apparatus of the present invention; and FIG. 5 shows the basic configuration (A) of the optical system of the light cutting method and the image data (B) obtained thereby. 1: slit light source; 2: slit light; 2a: light cutting line; 2b: light cutting line image; 3: object; 4: television camera; 6: image memory; 7: computer; 8: light intensity control section. [FIG. 4 panels: (a) strong, (b) medium, (c) weak, (d) composite.]

Claims (1)

[Claims] (1) A measuring apparatus based on the light cutting method, comprising: imaging means for imaging a light cutting line generated by the light cutting method using slit light to obtain grayscale image data with gradation; means for temporarily storing a plurality of image data obtained by the imaging means while varying the slit light intensity for the same part of the same object; and means for comparing the density values at the same pixel address in the plurality of image data and extracting, as the representative value for that pixel address, the density value that is darker than a predetermined threshold and is the brightest among those.
JP15837485A 1985-07-19 1985-07-19 Measuring apparatus by light cutting method Pending JPS6221011A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP15837485A JPS6221011A (en) 1985-07-19 1985-07-19 Measuring apparatus by light cutting method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP15837485A JPS6221011A (en) 1985-07-19 1985-07-19 Measuring apparatus by light cutting method

Publications (1)

Publication Number Publication Date
JPS6221011A true JPS6221011A (en) 1987-01-29

Family

ID=15670305

Family Applications (1)

Application Number Title Priority Date Filing Date
JP15837485A Pending JPS6221011A (en) 1985-07-19 1985-07-19 Measuring apparatus by light cutting method

Country Status (1)

Country Link
JP (1) JPS6221011A (en)


Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0312046A2 (en) * 1987-10-14 1989-04-19 Hitachi, Ltd. Apparatus and method for inspecting defect of mounted component with slit light
US5076697A (en) * 1987-10-14 1991-12-31 Hitachi, Ltd. Apparatus and method for inspecting defect of mounted component with slit light
JPH02114110A (en) * 1988-10-24 1990-04-26 Kumamoto Techno Porisu Zaidan Method for detecting three-dimensional object
JP2005315821A (en) * 2003-04-28 2005-11-10 Steinbichler Optotechnik Gmbh Method and instrument for measuring contour and/or deformation, in particular, interference of object
JP2007127431A (en) * 2005-11-01 2007-05-24 Fuji Xerox Co Ltd Method and apparatus for detecting end position
JP2008215975A (en) * 2007-03-02 2008-09-18 Pulstec Industrial Co Ltd Three-dimensional shape measuring device and three-dimensional shape measuring method
JP2008216098A (en) * 2007-03-06 2008-09-18 Pulstec Industrial Co Ltd Three-dimensional shape measuring device and three-dimensional shape measuring method
JP2008281391A (en) * 2007-05-09 2008-11-20 Pulstec Industrial Co Ltd Apparatus and method for measuring three-dimensional shape
JP2009085775A (en) * 2007-09-28 2009-04-23 Sunx Ltd Measuring apparatus
WO2011145319A1 (en) * 2010-05-19 2011-11-24 株式会社ニコン Shape measuring device and shape measuring method
CN102906536A (en) * 2010-05-19 2013-01-30 株式会社尼康 Shape measuring device and shape measuring method
US9194697B2 (en) 2010-05-19 2015-11-24 Nikon Corporation Apparatus and method for measuring three-dimensional objects
JP5825254B2 (en) * 2010-05-19 2015-12-02 株式会社ニコン Shape measuring apparatus and shape measuring method
US10378886B2 (en) 2015-06-11 2019-08-13 Canon Kabushiki Kaisha Three dimensional measurement apparatus, control method for three dimensional measurement apparatus, and storage medium
CN110462688A (en) * 2017-03-26 2019-11-15 康耐视公司 System and method is determined using the three-D profile of the peak value selection based on model
JP2020512536A (en) * 2017-03-26 2020-04-23 コグネックス・コーポレイション System and method for 3D profile determination using model-based peak selection
CN110462688B (en) * 2017-03-26 2023-08-01 康耐视公司 Three-dimensional contour determination system and method using model-based peak selection
JP2021099251A (en) * 2019-12-20 2021-07-01 株式会社豊田中央研究所 Height distribution measuring device and height distribution measuring method

Similar Documents

Publication Publication Date Title
EP1462992B1 (en) System and method for shape reconstruction from optical images
US20070176927A1 (en) Image Processing method and image processor
CN110390719B (en) Reconstruction equipment based on flight time point cloud
EP0864134B1 (en) Vector correlation system for automatically locating patterns in an image
US9007457B2 (en) Acquisition of 3D topographic images of tool marks using non-linear photometric stereo method
JP5043023B2 (en) Image processing method and apparatus
US5889582A (en) Image-directed active range finding system
JPS6221011A (en) Measuring apparatus by light cutting method
US7149345B2 (en) Evaluating method, generating method and apparatus for three-dimensional shape model
CN108377380A (en) Image scanning system and method thereof
TW201337839A (en) Segmentation for wafer inspection
CN107869968A (en) A kind of quick three-dimensional scan method and system suitable for complex object surface
GB2281165A (en) Identifying flat orthogonal objects using reflected energy signals
JPH02148279A (en) Mark detecting system
JPH08292014A (en) Measuring method of pattern position and device thereof
JPH1079029A (en) Stereoscopic information detecting method and device therefor
Cardillo et al. 3-D position sensing using a passive monocular vision system
EP0356727A2 (en) Symmetrie-based target position measurement
US5373567A (en) Method and apparatus for pattern matching
JPH05135155A (en) Three-dimensional model constitution device using successive silhouette image
EP0265769A1 (en) Method and apparatus for measuring with an optical cutting beam
CN112816995B (en) Target detection method and device, fusion processing unit and computer readable medium
CN111259991B (en) Under-sampling single-pixel imaging target identification method in noise environment
CN112815846A (en) 2D and 3D composite high-precision vision device and measuring method
CN114689604A (en) Image processing method for optical detection of object to be detected with smooth surface and detection system thereof