JPH03289505A - Three-dimensional shape measuring apparatus - Google Patents

Three-dimensional shape measuring apparatus

Info

Publication number
JPH03289505A
Authority
JP
Japan
Prior art keywords
distribution data
image
light
measured
brightness distribution
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP9252390A
Other languages
Japanese (ja)
Inventor
Fumiaki Fujie
藤江 文明
Yasuyuki Ito
靖之 伊藤
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Denso Corp
Original Assignee
NipponDenso Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NipponDenso Co Ltd filed Critical NipponDenso Co Ltd
Priority to JP9252390A priority Critical patent/JPH03289505A/en
Publication of JPH03289505A publication Critical patent/JPH03289505A/en
Pending legal-status Critical Current


Landscapes

  • Length Measuring Devices By Optical Means (AREA)

Abstract

PURPOSE: To measure the shape and position of an object to be measured with high accuracy by projecting slit light onto the surface of the object, forming a fringe image on the surface, picking up the image, and correcting the resulting brightness distribution data with the brightness distribution data obtained under full illumination.

CONSTITUTION: Light from a lamp 11 passes through a condenser lens 12, a liquid-crystal shutter 13, and a projection lens to become striped light, which illuminates the surface of the object to be measured M. An image of the surface of the object M is formed on an image sensing element 24 of a camera 20, whose output is fed to a CPU 30a through an A/D converter 31 and an interface 30d, yielding brightness distribution data for the object M. A pattern control part 32 then switches the shutter 13 on command from the CPU 30a so that the object M is fully illuminated, and the camera 20 captures this state as well; its image signal is likewise read into the CPU 30a as brightness distribution data. The striped brightness distribution data are corrected with these data, and the shape of the object M is determined and displayed on a surface-shape displaying device 40.

Description

Detailed Description of the Invention

Object of the Invention

[Field of Industrial Application]

The present invention relates to a three-dimensional shape measuring apparatus, and more specifically to a three-dimensional shape measuring apparatus that irradiates the surface of an object to be measured with striped light and measures the shape and position of the object from the state of the fringe image formed on that surface.

[Prior Art]

A conventional three-dimensional shape measuring apparatus of this type is known, for example, from Japanese Patent Application Laid-Open No. 64-54206. That apparatus measures the shape of an object from brightness distribution data representing the light-and-dark intensity of the fringe image on the object's surface. To prevent the brightness distribution data from being distorted by disturbance light (background light) entering from around the object, which would lower the measurement accuracy, the following configuration was adopted.

First, the surface of the object is imaged with no light projected onto it at all, and brightness distribution data representing the light-and-dark intensity of the unilluminated surface are created. These data indicate the change that background light imposes on the brightness distribution data of the fringe image. Next, the fringe-image brightness distribution data are corrected with the unilluminated data: the illuminance level value Sb of each pixel of the unilluminated image is subtracted from the illuminance level value Sa of the corresponding pixel of the fringe image, producing fringe-image brightness distribution data free of background-light distortion (illuminance difference data D = Sa − Sb). The phase and shape of each fringe line are then detected from the data created in this way, improving the accuracy of this type of measurement.
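The prior-art correction described above is a plain per-pixel subtraction. A minimal NumPy sketch (the function name, array shapes, and the clipping of negative differences are our own illustrative choices; the patent text specifies only the subtraction D = Sa − Sb):

```python
import numpy as np

def subtract_background(fringe: np.ndarray, dark: np.ndarray) -> np.ndarray:
    """Prior-art correction: per-pixel difference D = Sa - Sb between the
    fringe image (Sa) and the image captured with no illumination (Sb)."""
    # Work in a signed type so the subtraction cannot wrap around in uint8,
    # then clip negatives (where the background estimate exceeds the pixel).
    d = fringe.astype(np.int16) - dark.astype(np.int16)
    return np.clip(d, 0, 255).astype(np.uint8)

fringe = np.array([[120, 30], [200, 45]], dtype=np.uint8)  # Sa
dark   = np.array([[ 20, 20], [ 25, 20]], dtype=np.uint8)  # Sb
print(subtract_background(fringe, dark))  # [[100  10] [175  25]]
```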

[Problems to be Solved by the Invention]

The conventional apparatus described above can remove the influence of background light, but it cannot remove distortion of the fringe-image brightness distribution data caused by variations in the reflectance distribution of the object's surface or by non-uniformity of the illuminance distribution of the light source, and this limits the attainable measurement accuracy.

The three-dimensional shape measuring apparatus of the present invention solves this problem. Its purpose is to make it possible to remove distortion of the brightness distribution data caused by variations in the surface reflectance distribution of the object and by non-uniformity of the illuminance distribution of the light source, and thereby to achieve high-accuracy measurement.

Structure of the Invention

[Means for Solving the Problems]

As illustrated in Fig. 1, the three-dimensional shape measuring apparatus of the present invention projects light from a light source through a slit onto an object to be measured so as to form a fringe image on the object's surface, images the fringe image with an imaging means, and detects the state of each fringe line from the resulting brightness distribution data to determine the shape or position of the object. The apparatus is characterized by comprising: full-illumination means for irradiating the entire surface of the object with light having the same illuminance distribution as the light source; and brightness-distribution-data correction means for correcting the brightness distribution data of the fringe image using the brightness distribution data obtained by imaging under full illumination by the full-illumination means.

[Operation]

In the three-dimensional shape measuring apparatus of the present invention configured as above, the brightness distribution data of the fringe image are corrected as follows. First, light with the same illuminance distribution as the light source that projects the striped light is cast by the full-illumination means onto the entire surface of the object, and the surface is imaged to obtain brightness distribution data under full illumination. These data indicate the change that variations in the surface reflectance distribution of the object, non-uniformity of the illuminance distribution of the light source, and the like impose on the brightness distribution data of the fringe image. The brightness-distribution-data correction means corrects the fringe-image data using the full-illumination data, removing from the fringe image the distortion caused by reflectance variations, illuminance non-uniformity, and the like. From the fringe-image brightness distribution data corrected in this way, the phase of each fringe line is detected accurately, and the shape or position of the object is measured with high precision.

[Embodiment]

A three-dimensional shape measuring apparatus as one embodiment of the present invention is described below with reference to the drawings.

As shown in Fig. 2, the three-dimensional shape measuring apparatus comprises a projector 10, a camera 20, a signal processing device 30, and a surface-shape display device 40.

The projector 10 comprises a lamp 11, a condenser lens 12, a liquid-crystal shutter 13, and a projection lens 14.

As shown in Fig. 3, the liquid-crystal shutter 13 is constructed by forming transparent electrodes 17 in stripes at predetermined intervals on the inner surface of a polarizing plate 15, forming a planar transparent electrode 18 on the inner surface of an opposing polarizing plate 16, and filling the space between the electrodes 17 and 18 with liquid crystal 19.

As described later, electric power is supplied individually to the transparent electrodes 17 from a pattern control unit 32, creating various projection patterns made up of transparent and light-blocking regions of the liquid crystal 19. The transparent electrode 18 serves as a common electrode for all of the transparent electrodes 17. In the liquid-crystal shutter 13 of this embodiment, the two polarizing plates 15 and 16 are arranged with their polarization directions perpendicular to each other, so that a region becomes light-blocking when an electric field is applied. Alternatively, the liquid-crystal shutter 13 may be of the type that becomes transmitting when an electric field is applied.

Accordingly, as shown in Fig. 2, light from the lamp 11 passes through the condenser lens 12, becomes (for example) striped light through the liquid-crystal shutter 13, and is projected by the projection lens 14 onto the surface of the object to be measured.

As shown in Fig. 2, the camera 20 in this embodiment is a CCD camera having an objective lens 22 and an image sensing element 24. An image of the surface of the object to be measured M is formed on the image sensing element 24.

The signal processing device 30 is configured as a computer comprising a CPU 30a, a ROM 30b, a RAM 30c, and an interface 30d, interconnected by a bus 30e so that signals can be exchanged among them.

An A/D converter 31, the pattern control unit 32, a camera control unit 33, and the surface-shape display device 40 are further connected to the interface 30d.

The A/D converter 31 converts the analog signal from the camera 20 into a digital signal on command from the CPU 30a and outputs it to the interface 30d.

The pattern control unit 32 supplies power to specified transparent electrodes 17 of the liquid-crystal shutter 13 on command from the CPU 30a, creating various projection patterns composed of transparent and opaque regions.

In this embodiment, as shown in Fig. 4, three patterns are created: the all-blocking pattern of Fig. 4(A), the all-transparent pattern of Fig. 4(B), and the striped pattern of Fig. 4(C). In the figures, the light-blocking regions are indicated by hatching for convenience.
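The three drive patterns can be modeled as boolean transmission masks over the stripe electrodes. The sketch below is illustrative only; the stripe count, band width, and 512-column resolution are assumptions of ours, not dimensions taken from the patent:

```python
import numpy as np

def shutter_pattern(kind: str, n_stripes: int = 8, width: int = 512) -> np.ndarray:
    """Boolean transmission mask for the liquid-crystal shutter:
    True = transmitting column, False = blocked.
    'dark' / 'bright' are the all-blocking / all-transparent patterns;
    'stripes' alternates equal transmitting and blocking bands."""
    if kind == "dark":
        return np.zeros(width, dtype=bool)
    if kind == "bright":
        return np.ones(width, dtype=bool)
    if kind == "stripes":
        band = width // (2 * n_stripes)          # width of one band
        return (np.arange(width) // band) % 2 == 0
    raise ValueError(kind)

for kind in ("dark", "bright", "stripes"):
    print(kind, int(shutter_pattern(kind).sum()))  # transmitting columns
```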

The camera control unit 33 controls the electronic shutter of the camera 20 on command from the CPU 30a; this control is described in detail later.

The surface-shape display device 40 displays the finally determined shape of the object to be measured M, for example as a three-dimensional image.

Next, the shape measurement processing executed by the signal processing device 30 is described with reference to the flowcharts of Figs. 5 and 6.

The shape measurement processing routine (Fig. 5) is executed each time a measurement start command is issued after the three-dimensional shape measuring apparatus has been started up.

When this routine starts, a pattern-image capture process (step 100) is first performed, in which the surface of the object to be measured M is imaged while the three projection patterns are switched in turn.

The details of the pattern-image capture process (step 100) are shown in the flowchart of Fig. 6. When the capture process starts, the variable L that specifies the projection pattern is first initialized to the value 1 (step 110); the routine then waits for the vertical synchronization signal of the camera 20 (step 120) and closes the electronic shutter of the camera 20 (step 130).

After this initialization, the pattern formed by the liquid-crystal shutter 13 is changed to the pattern specified by the value of the variable L (step 140).

First, with the variable L at the value 1, the shutter is switched to the all-blocking pattern (Fig. 4(A)). In this embodiment, the value 2 selects the all-transparent pattern (Fig. 4(B)) and the value 3 selects the striped pattern (Fig. 4(C)).

Next, the electronic shutter of the camera is opened (step 150) and exposure begins with the light of the lamp 11 blocked by the all-blocking pattern.

When the vertical synchronization signal arrives from the camera 20 after a predetermined time (step 160), the electronic shutter of the camera 20 is closed (step 170); the captured video signal is then read out, digitized by the A/D converter 31, and stored in a predetermined area of the RAM 30c as the brightness distribution data for the unilluminated state (step 180).

When storage of the unilluminated brightness distribution data is complete, the value 1 is added to the variable L (step 190).

The variable L is now 2, which does not exceed the number of projection patterns, 3 (step 195), so the routine returns to step 140 and switches the liquid-crystal shutter 13 to the pattern corresponding to the value 2, i.e. the all-transparent pattern. The processing from step 150 onward is then executed: the light of the lamp 11 passes through the all-transparent pattern and illuminates the entire surface of the object to be measured M, this state is imaged, and the digitized video signal is stored in a predetermined area of the RAM 30c as the brightness distribution data under full illumination (step 180).

Finally, the variable L is incremented to the value 3 (step 190) and the projection pattern of the liquid-crystal shutter 13 is switched to the striped pattern (step 140). The state in which the light of the lamp 11 passes through the striped pattern and illuminates the surface of the object to be measured M in stripes is imaged, and the digitized video signal is stored in a predetermined area of the RAM 30c as the brightness distribution data of the fringe image (step 180).

Imaging with the three projection patterns is now complete; when the value 1 is added to the variable L (step 190), its value 4 exceeds the predetermined value 3 (step 195), and the process ends.
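The capture loop of Fig. 6 can be sketched as follows. The camera and shutter interfaces (`wait_vsync`, `set_pattern`, and so on) are hypothetical stand-ins for the hardware control described above, not a real API; the fakes at the bottom exist only so the sketch runs:

```python
def capture_three_patterns(camera, shutter):
    """Sketch of the Fig. 6 routine: for L = 1..3 switch the shutter to the
    L-th pattern (all-blocking, all-transparent, stripes), expose one frame
    while that pattern is projected, and store the frame by pattern name."""
    patterns = {1: "dark", 2: "bright", 3: "stripes"}
    frames = {}
    camera.wait_vsync()                  # step 120: align with vertical sync
    camera.shutter_close()               # step 130
    L = 1                                # step 110
    while L <= 3:                        # step 195 bounds the loop
        shutter.set_pattern(patterns[L])           # step 140
        camera.shutter_open()                      # step 150: start exposure
        camera.wait_vsync()                        # step 160: exposure interval
        camera.shutter_close()                     # step 170
        frames[patterns[L]] = camera.read_frame()  # step 180: store in RAM
        L += 1                                     # step 190
    return frames

class FakeCamera:
    """Minimal stub so the sketch runs without hardware."""
    def __init__(self):
        self.frame_id = 0
    def wait_vsync(self):
        pass
    def shutter_open(self):
        pass
    def shutter_close(self):
        pass
    def read_frame(self):
        self.frame_id += 1
        return f"frame{self.frame_id}"

class FakeShutter:
    def set_pattern(self, name):
        self.current = name

print(capture_three_patterns(FakeCamera(), FakeShutter()))
# {'dark': 'frame1', 'bright': 'frame2', 'stripes': 'frame3'}
```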

The timing chart of Fig. 7 shows how the liquid-crystal shutter 13 and the camera 20 are controlled by execution of the pattern-image capture process (Fig. 6).

As illustrated, both the pattern switching of the liquid-crystal shutter 13 (step 140) and the exposure of the camera 20 (step 150) take place within a single imaging cycle of the camera 20 (the interval between vertical synchronization signals). Imaging with the all-blocking, all-transparent, and striped patterns is therefore completed within three imaging cycles.

When the pattern-image capture process (Fig. 6) described above is finished, the brightness distribution data of the fringe image are corrected pixel by pixel (Fig. 5).

First, as shown in Fig. 5, the variables i and j, whose values specify the coordinates of the pixel to be corrected, are both reset to the value 0 (step 200); the brightness distribution data of the fringe image at the pixel specified by i and j are then corrected (step 210).

In the correction process (step 210), as shown in Fig. 8, the fringe-image brightness distribution data SCij of the pixel specified by i and j are corrected according to the following equation, using the unilluminated brightness distribution data SAij and the full-illumination brightness distribution data SBij of the same pixel.

Sij = (SCij − SAij) / (SBij − SAij)   … (1)

where Sij is the corrected brightness distribution data of the fringe image, i = 0 to 511, and j = 0 to 511 (the image in this embodiment consists of 512 × 512 pixels).

Equation (1) was derived as follows.

As the graph of Fig. 9 illustrates, the video signal g(Xc) obtained under striped illumination (corresponding to the fringe-image brightness distribution data SCij) lies between the video signal f(Xc) obtained under full illumination (corresponding to SBij) and the video signal b(Xc) obtained with no illumination (corresponding to SAij). That is, the full-illumination video signal f(Xc) reflects the variations in the reflectance distribution of the surface of the object to be measured M and the non-uniformity of the illuminance distribution of the lamp 11, while the no-illumination video signal b(Xc) reflects the background light.

Now let G(Xc) denote the mapping of the fringe image onto the camera 20 under the assumption that the surface of the object to be measured M is optically ideal (perfectly diffusing) and the illuminance distribution of the lamp 11 is uniform; this corresponds to the corrected brightness distribution data Sij. Similarly, let F(Xc) be the mapping of the lamp light onto the camera 20 under full illumination, and B(Xc) the mapping of the background light onto the camera with no illumination. For an actual measurement, let Rf(Xc) denote the mapping onto the camera of the non-uniform reflection of the lamp light caused by variations in the surface reflectance distribution of the object M and by the non-uniform illuminance distribution of the lamp, and Rb(Xc) the corresponding mapping for the background light.

The video signals g(Xc), f(Xc), and b(Xc) can then each be expressed as follows.

g(Xc) = F(Xc)・G(Xc)・Rf(Xc) + B(Xc)・Rb(Xc)   … (2)
f(Xc) = F(Xc)・Rf(Xc) + B(Xc)・Rb(Xc)   … (3)
b(Xc) = B(Xc)・Rb(Xc)   … (4)

Substituting equation (4) into equations (2) and (3) gives the following two equations.

g(Xc) = F(Xc)・G(Xc)・Rf(Xc) + b(Xc)   … (5)
f(Xc) = F(Xc)・Rf(Xc) + b(Xc)   … (6)

From equation (5), the following is obtained.

G(Xc) = (g(Xc) − b(Xc)) / (F(Xc)・Rf(Xc))   … (7)

From equation (6), the following is obtained.

F(Xc)・Rf(Xc) = f(Xc) − b(Xc)   … (8)

Substituting equation (8) into equation (7) yields equation (9).

G(Xc) = (g(Xc) − b(Xc)) / (f(Xc) − b(Xc))   … (9)

Equation (9) contains neither the term Rf(Xc) nor the term Rb(Xc), and it is equivalent to equation (1). In other words, from the fringe-image brightness distribution data SCij, the unilluminated brightness distribution data SAij, and the full-illumination brightness distribution data SBij, it is possible to obtain fringe-image brightness distribution data Sij from which the variations in the surface reflectance distribution of the object M, the non-uniformity of the lamp's illuminance distribution, and the influence of background light have all been removed.

When the brightness distribution data of the fringe image at the single pixel specified by i and j have been corrected by the process based on equation (1) (step 210), the routine checks whether the value of the row variable i has reached the last row of column j (here the value 511) (step 220); if not, the value 1 is added to i (step 230) and the above processing is repeated until all pixels of column j (initially the value 0) have been corrected. When the pixels of column j are done, the routine checks whether j is the last column (here the value 511) (step 240); if not, the value 1 is added to j and i is reset to the value 0 (step 250).

When the above processing has been repeated and the correction of all pixels of the column with j at the value 511 is finished (step 240), the correction of the fringe-image brightness distribution data is complete. The variable j is then reset to the value 0 again (step 260), and the shape measurement computations from step 270 onward are performed column by column.

The shape measurement computation is performed as follows.

First, the group of 512 brightness data values for the pixels of column j (the pixels in the Xc direction of Fig. 8) is read in (step 270).

In this embodiment the brightness data are expressed in 256 gradations. A fundamental wave f is then computed by interpolation from the set of points formed by this brightness data group, and the peak position of each wave of the fundamental wave f is detected (step 280). Since interpolation is a well-known process, its details are omitted.

Next, the phase at each peak position is detected (step 290). Once the peak positions and their phases have been obtained, the shape of the portion corresponding to column j is calculated by triangulation (step 300). Since the shape calculation is well known, its details are omitted.

After this, the routine checks whether the value of the variable j is the last value, 511 (step 310); if not, the value 1 is added to j (step 320), and the shape of the portion corresponding to the newly specified column is calculated from that column's brightness data group as described above.

In this way, each time the value 1 is added to the variable j, the shape of the corresponding column is calculated; when the shape of the column with j at the value 511 has been calculated, the process ends.
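The patent leaves the interpolation and peak detection of steps 280–290 open as well-known techniques. One common stand-in (not necessarily the method intended here) is quadratic interpolation around each local maximum of a column's brightness profile, which yields sub-pixel peak positions:

```python
import numpy as np

def column_peaks(column: np.ndarray) -> list[float]:
    """Sub-pixel peak positions of a 1-D brightness profile (one image
    column), found by fitting a parabola through each local maximum and
    its two neighbours.  Stands in for steps 280-290; the patent does not
    specify the interpolation method."""
    peaks = []
    for i in range(1, len(column) - 1):
        y0, y1, y2 = column[i - 1], column[i], column[i + 1]
        if y1 > y0 and y1 > y2:                    # strict local maximum
            denom = y0 - 2 * y1 + y2               # parabola curvature
            offset = 0.5 * (y0 - y2) / denom if denom != 0 else 0.0
            peaks.append(i + offset)               # vertex of the parabola
    return peaks

x = np.arange(64)
profile = 0.5 + 0.5 * np.cos(2 * np.pi * x / 16)   # synthetic fringe, period 16
print(column_peaks(profile))                        # peaks near 16, 32, 48
```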

The shape of the object to be measured M calculated in this way is displayed on the surface-shape display device 40 as required.

According to the three-dimensional shape measuring apparatus of the embodiment described above, the brightness distribution data of the fringe image are corrected using the brightness distribution data under full illumination and under no illumination, so that the variations in the surface reflectance distribution of the object, the non-uniformity of the lamp's illuminance distribution, and the influence of background light can all be removed from the fringe-image data. This has the excellent effect of markedly improving measurement accuracy.

Furthermore, in this embodiment the pattern switching of the liquid-crystal shutter 13 and the exposure are performed within a single imaging cycle of the camera, so the pattern images are captured in a short time and the measurement time is shorter than before. In addition, because the capture time is short, there is little chance of a shift in the imaging position or a change in the background light, and even if such a change occurs it is minute, so the shape measurement can be performed with higher accuracy. It is also possible to use the conventional control shown in Fig. 10, in which the pattern of the liquid-crystal shutter 13 is switched during one imaging cycle of the camera and the camera is exposed in the next imaging cycle; with the control of the embodiment described above, however, the pattern-image capture time is halved.

As another configuration for shortening the pattern-image capture time, a camera whose imaging cycle can be set freely may be used, with the first imaging cycle set to the time required to switch the pattern of the liquid-crystal shutter 13 and the next imaging cycle set to the exposure time of the camera, eliminating the idle time.

Although an embodiment of the present invention has been described above, the present invention is in no way limited to this embodiment and can of course be practiced in various forms without departing from its gist. For example, instead of a liquid-crystal shutter, a PLZT electro-optic shutter array may be used, or a CRT may be used as the projector to project the various patterns.

The three patterns may be projected in any order. The no-irradiation pattern may also be realized by turning off the lamp 11.

Effects of the Invention

As detailed above, according to the three-dimensional shape measuring device of the present invention, distortion caused by variations in the reflectance distribution of the surface of the object to be measured, bias in the illuminance distribution of the light source, and the like can be removed from the brightness distribution data of the fringe image. The device therefore has the excellent effect of significantly improving the accuracy with which the shape and position of the object to be measured are determined.

Brief Explanation of the Drawings

Fig. 1 is a block diagram illustrating the basic configuration of the present invention; Fig. 2 is a block diagram of a three-dimensional shape measuring device as one embodiment of the present invention; Fig. 3(A) is a front view of the liquid crystal shutter; Fig. 3(B) is a cross-sectional view of the liquid crystal shutter; Figs. 4(A), (B), and (C) are front views showing the three projection patterns produced by the liquid crystal shutter; Fig. 5 is a flowchart showing an example of the shape measurement processing executed in the signal processing device; Fig. 6 is a flowchart showing the pattern-image capture processing; Fig. 7 is a timing chart showing the pattern-image capture control; Fig. 8 is a perspective view of images explaining the correction processing; Fig. 9 is a graph of video signals explaining the correction processing; and Fig. 10 is a timing chart of pattern-image capture control performed by the conventional control.

10... projector; 13... liquid crystal shutter; 13a... fully light-blocking pattern; 13b... fully transparent pattern; 13c... fringe pattern

Claims (1)

1. A three-dimensional shape measuring device that irradiates an object to be measured with light from a light source in stripes through slits to form a fringe image on its surface, captures the fringe image with imaging means, and detects the state of each fringe from the brightness distribution data of the captured fringe image to determine the shape or position of the object to be measured, the device comprising: full-irradiation means for irradiating the entire surface of the object to be measured with light having the same illuminance distribution as the light source; and brightness-distribution-data correction means for correcting the brightness distribution data of the fringe image using the brightness distribution data at full irradiation obtained by imaging with the light fully irradiated by the full-irradiation means.
JP9252390A 1990-04-06 1990-04-06 Three-dimensional shape measuring apparatus Pending JPH03289505A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP9252390A JPH03289505A (en) 1990-04-06 1990-04-06 Three-dimensional shape measuring apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP9252390A JPH03289505A (en) 1990-04-06 1990-04-06 Three-dimensional shape measuring apparatus

Publications (1)

Publication Number Publication Date
JPH03289505A true JPH03289505A (en) 1991-12-19

Family

ID=14056701

Family Applications (1)

Application Number Title Priority Date Filing Date
JP9252390A Pending JPH03289505A (en) 1990-04-06 1990-04-06 Three-dimensional shape measuring apparatus

Country Status (1)

Country Link
JP (1) JPH03289505A (en)


Cited By (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002213931A (en) * 2001-01-17 2002-07-31 Fuji Xerox Co Ltd Instrument and method for measuring three-dimensional shape
EP1500904A4 (en) * 2002-04-30 2008-08-27 Jfe Steel Corp Method and instrument for measuring bead cutting shape of electric welded tube
WO2003093761A1 (en) 2002-04-30 2003-11-13 Jfe Steel Corporation Method and instrument for measuring bead cutting shape of electric welded tube
EP1500904A1 (en) * 2002-04-30 2005-01-26 JFE Steel Corporation Method and instrument for measuring bead cutting shape of electric welded tube
US7619750B2 (en) 2002-04-30 2009-11-17 Jfe Steel Corporation Measurement method and device for bead cutting shape in electric resistance welded pipes
US7471400B2 (en) 2002-04-30 2008-12-30 Jfe Steel Corporation Measurement method and device for bead cutting shape in electric resistance welded pipes
US7583391B2 (en) 2004-11-19 2009-09-01 School Juridical Person Of Fukuoka Kogyo Daigaku Three-dimensional measuring apparatus, three-dimensional measuring method, and three-dimensional measuring program
JP2006145405A (en) * 2004-11-19 2006-06-08 Fukuoka Institute Of Technology Three-dimensional measuring instrument, method, and program
WO2006054425A1 (en) * 2004-11-19 2006-05-26 School Juridical Person Of Fukuoka Kogyo Daigaku Three-dimensional measuring instrument, three-dimensional measuring method, and three-dimensional measuring program
JP2007114071A (en) * 2005-10-20 2007-05-10 Omron Corp Three-dimensional shape measuring apparatus, program, computer-readable storage medium, and three-dimensional shape measuring method
JP2008170279A (en) * 2007-01-11 2008-07-24 Omron Corp Three-dimensional shape measuring device, method for correction therefor, program, and computer-readable recording medium
WO2008120457A1 (en) * 2007-03-29 2008-10-09 School Juridical Person Of Fukuoka Kogyo Daigaku Three-dimensional image measurement apparatus, three-dimensional image measurement method, and three-dimensional image measurement program of non-static object
JP2008249432A (en) * 2007-03-29 2008-10-16 Fukuoka Institute Of Technology Three-dimensional image measuring device, method, and program of non-static object
JP2009180689A (en) * 2008-02-01 2009-08-13 Nikon Corp Three-dimensional shape measuring apparatus
JP2009186217A (en) * 2008-02-04 2009-08-20 Nikon Corp Three-dimensional shape measuring device and method
WO2009098803A1 (en) * 2008-02-08 2009-08-13 Dainippon Screen Mfg.Co., Ltd. Image information generation device and image information generation method
JP2010133735A (en) * 2008-12-02 2010-06-17 Makino Milling Mach Co Ltd Shape measurement method and device
JP2011064617A (en) * 2009-09-18 2011-03-31 Fukuoka Institute Of Technology Three-dimensional information measuring device and three-dimensional information measuring method
JP2011133328A (en) * 2009-12-24 2011-07-07 Roland Dg Corp Method and apparatus for measurement of three-dimensional shape
US10121246B2 (en) 2015-03-04 2018-11-06 Canon Kabushiki Kaisha Measurement apparatus that obtains information of a shape of a surface using a corrected image and measurement method
US20170008169A1 (en) * 2015-07-09 2017-01-12 Canon Kabushiki Kaisha Measurement apparatus for measuring shape of object, system and method for producing article
WO2017006544A1 (en) * 2015-07-09 2017-01-12 Canon Kabushiki Kaisha Measurement apparatus for measuring shape of target object, system and manufacturing method
US10223575B2 (en) 2015-07-09 2019-03-05 Canon Kabushiki Kaisha Measurement apparatus for measuring shape of object, system and method for producing article
US10068350B2 (en) 2015-12-15 2018-09-04 Canon Kabushiki Kaisha Measurement apparatus, system, measurement method, determination method, and non-transitory computer-readable storage medium
US10726569B2 (en) 2017-05-16 2020-07-28 Canon Kabushiki Kaisha Information processing apparatus, information processing method, and non-transitory computer-readable storage medium
EP3492861A1 (en) * 2017-12-01 2019-06-05 Omron Corporation Image processing system and image processing method
JP2019100852A (en) * 2017-12-01 2019-06-24 オムロン株式会社 Image processing system and image processing method
US11118901B2 (en) 2017-12-01 2021-09-14 Omron Corporation Image processing system and image processing method

Similar Documents

Publication Publication Date Title
JPH03289505A (en) Three-dimensional shape measuring apparatus
JP3525964B2 (en) 3D shape measurement method for objects
US7151560B2 (en) Method and apparatus for producing calibration data for a digital camera
KR20040084697A (en) Image processing system, projector, information storage medium and image processing method
JPS63276676A (en) Detecting system for interpicture corresponding area
JP3999505B2 (en) camera
JP7310606B2 (en) Two-dimensional flicker measuring device and two-dimensional flicker measuring method
JP2019158351A (en) Object recognition method, height measurement method, and three-dimensional measurement method
TW201723420A (en) Three-dimensional measurement device
CN108871206B (en) Surface measuring method and surface measuring device
JPH0875542A (en) Method for measuring quantity of light for display pixel, and method and apparatus for inspecting display screen
WO2021053852A1 (en) Appearance inspection device, appearance inspection device calibration method, and program
JP3538009B2 (en) Shape measuring device
JPH08254499A (en) Displaying/appearance inspection device
JP2003004425A (en) Optical shape-measuring apparatus
JP2008170282A (en) Shape measuring device
JPH09153139A (en) Video signal processor
JP3495570B2 (en) Tone characteristic measuring device
JP3760217B2 (en) CRT convergence measurement method
JP2650684B2 (en) Convergence error detection method for projection display
JPH05308554A (en) Displacement detector for solid-state image pickup device
JPS61198014A (en) Object information processor
JP2024065550A (en) PATTERN IMAGE OUTPUT METHOD, PROJECTOR, AND PROGRAM
KR0183736B1 (en) Method of judging photo sensor arrangement of cathode ray tube picture
JPH07113623A (en) Method and device for alignment of three dimensional shape data