JPH01250177A - Image input method - Google Patents

Image input method

Info

Publication number
JPH01250177A
JPH01250177A (application JP63077707A / JP7770788A)
Authority
JP
Japan
Prior art keywords
image
distance
size
photographic lens
focusing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP63077707A
Other languages
Japanese (ja)
Inventor
Toshio Inada
俊生 稲田
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ricoh Co Ltd
Original Assignee
Ricoh Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ricoh Co Ltd filed Critical Ricoh Co Ltd
Priority to JP63077707A priority Critical patent/JPH01250177A/en
Publication of JPH01250177A publication Critical patent/JPH01250177A/en
Pending legal-status Critical Current

Landscapes

  • Image Input (AREA)

Abstract

PURPOSE: To obtain an image input method that can serve high-level image processing, by detecting the distance to an object under examination, its size, and the direction in which it lies, and repeating these detections a plurality of times while varying the focusing distance of the photographic lens in steps.

CONSTITUTION: The focusing state on the image plane of the photographic lens is observed two-dimensionally. The distance to the object, its size, and the direction in which it lies are detected from, respectively, the focusing state on the image plane, the size and position of the focused image on that plane, and the focusing distance and angle of view of the lens. This detection is performed repeatedly by a focusing operation that varies the focusing distance of the lens in steps. The positional relationship between the object and the TV camera is thus given many degrees of freedom: beyond a mere image-capture function, the method detects the distance, size, and direction of the object and can be used for high-level image processing.

Description

DETAILED DESCRIPTION OF THE INVENTION

TECHNICAL FIELD

The present invention relates to image input methods for pattern recognition, object recognition, and the like.

PRIOR ART

Pattern recognition systems currently used in the FA (factory automation) field and elsewhere generally use a TV camera as the image input device. The positional relationship between the test object and the TV camera is fixed, however, and the input device offers nothing more than an image-capture function. The same is true of arrangements using two TV cameras, such as that shown in Japanese Laid-Open Patent Publication No. 58-114172: the position of the test object relative to the two cameras is restricted.

Various methods also exist for giving robots three-dimensional vision, but each has its own problems, as surveyed in the technical review "Three-Dimensional Visual Functions of Robots" in the Journal of the Institute of Image Electronics Engineers of Japan, Vol. 14, No. 3 (1985). For example, binocular stereoscopic (stereo) vision sensors rely on techniques such as the correlation method, point-feature matching, and edge-feature matching; they obtain three-dimensional information indirectly and can be applied at long range, but correspondence search is difficult, distance can be determined only where corresponding points are found, and processing takes time. The binocular stereoscopy disclosed in Japanese Laid-Open Patent Publication No. 60-27085 suffers likewise: finding corresponding points between the two eyes is difficult, so both visual systems are driven to assist the correspondence search.

Visual sensors based on optical distance measurement use reflected-light processing (for example spot, slit, moiré, or pattern-light projection) or direct-light processing. They actively project light onto the target and are suited to short range, with a light-emitting element mounted on the robot, but the reflected light may not always be recoverable. Range images, in addition, pose problems of resolution and processing time, and an element may have to be attached to the target.

Monocular visual sensors, finally, are based on the photometric stereo method: they determine the surface normal direction and three-dimensional information of an object by exploiting knowledge of the object and by varying the environment, but they require control of the environment, such as the illumination conditions.

OBJECT

The present invention has been made in view of these points, and its object is to provide an image input method that allows freedom in the positional relationship between the test object and the TV camera while still being able to detect the distance to the object, the direction in which it lies, and its size, so that the method can serve high-level image processing.

CONSTITUTION

To achieve the above object, the present invention observes two-dimensionally the focusing state on the image plane of the photographic lens of a TV camera, and detects the distance to the test object, its size, and the direction in which it lies from the focusing state on the image plane, the size and position of the focused image on that plane, and the focusing distance and angle of view of the lens. This detection is performed a plurality of times while the focusing distance of the lens is varied in steps.

An embodiment of the present invention will now be described with reference to the drawings.

First, the concept of the image information processing system addressed by the present invention will be explained with reference to FIG. 2. The system is modeled on human visual information processing and basically comprises an image input section 1, an information processing section 2, and an output section 3.

The image input section 1 captures general images with a TV camera or the like, detecting at the same time the position, size, and color of the test object of interest. This information is sent to the information processing section 2 together with the image data.

The image input section 1 also receives from the information processing section 2 designations of the object of interest, for example "attend to the red object", "attend to the nearby object", or "attend to the object at the center of the visual field". The information processing section 2 realizes the processing performed in the human cerebrum: using functions such as recognition of incomplete images, associative memory, learning, and associative recall, it processes the information from the image input section 1, learns and stores it, and makes judgments, comparing against previously learned and stored content 4. An active processing function also operates during processing, designating the object of interest to the image input section 1. The output section 3 receives the processing results or judgments from the information processing section 2 and produces output such as CRT display, audio signal generation, manipulator operation, and printout.

The present embodiment concerns the functions of this image input section 1 for detecting the distance to the test object, the position at which it lies, and its size, and performs these detections by focusing the photographic lens in the TV camera. The distance, direction, and size detection methods are described below.

Distance detection does not focus on one particular test object. Instead, with the photographic lens set to a given focusing distance, the method finds which parts of the image plane are in focus, then varies the focusing distance of the lens in steps and finds the in-focus parts at each setting. That is, for each focusing distance of the photographic lens, the image-plane positions that are in focus are determined in turn.
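The stepwise scan just described can be sketched as a loop over focus settings. The following is only an illustrative outline, not the patent's implementation: `capture`, `sharpness`, the threshold, and the focus-distance values are hypothetical stand-ins.

```python
# Hypothetical sketch of the stepwise depth-from-focus scan described above.
# `sharpness` stands in for a per-pixel focus criterion; the focus
# distances are arbitrary example values, not taken from the disclosure.

def scan_focus(capture, sharpness, focus_distances, threshold=0.5):
    """For each focusing distance, record which image regions are sharp.

    capture(d)    -> 2-D image taken with the lens focused at distance d
    sharpness(im) -> 2-D map of a per-pixel focus measure
    Returns a dict: focus distance -> list of (row, col) in-focus pixels.
    """
    in_focus = {}
    for d in focus_distances:
        image = capture(d)
        s = sharpness(image)
        in_focus[d] = [(i, j)
                       for i, row in enumerate(s)
                       for j, v in enumerate(row)
                       if v >= threshold]
    return in_focus

# Toy stand-ins: a 2x2 scene whose upper-left pixel is sharp only at d=1.0.
def capture(d):
    return [[1.0 if d == 1.0 else 0.0, 0.0], [0.0, 0.0]]

result = scan_focus(capture, lambda im: im, [0.5, 1.0, 2.0])
print(result[1.0])   # [(0, 0)]
```

Pixels that become sharp at focus distance d are then taken to lie roughly at distance d, which is the essence of the scanning procedure.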

FIG. 1 shows the optical system in the image input section 1 used to find these in-focus parts. A photographic lens 5 is provided, followed by a beam splitter 6 that amplitude-divides the light passing through the lens into four equal parts. Four two-dimensional light-receiving elements 7a, 7b, 7c, and 7d are placed at predetermined positions on the exit side of the beam splitter 6, each receiving one of the four divided beams.

The photographic lens 5 is displaceable along the optical axis, as indicated by arrow A, so that its focusing distance can be varied arbitrarily.

FIG. 3, and FIG. 4 in enlargement, show the relationship between the two-dimensional light-receiving elements 7a, 7b, 7c, 7d and the optical path length to each installation position. In the figures, a, b, c, d denote the receiving positions of elements 7a, 7b, 7c, 7d respectively, and F denotes the focal plane of the photographic lens 5. Specifically, receiving position a lies a distance ΔD from the focal plane F toward the lens 5, and receiving position d lies a distance ΔD from F in the opposite direction. Likewise, receiving position b lies a distance Δd from F toward the lens 5, and receiving position c lies a distance Δd from F in the opposite direction.

Let the received-light output at two-dimensional coordinates (i, j) of element 7a be Sa(i, j), and likewise let the outputs of elements 7b, 7c, 7d at coordinates (i, j) be Sb(i, j), Sc(i, j), Sd(i, j). If coordinate (i, j) is in focus, then the (i, j) coordinate of the focal plane F is in focus, and if the values of ΔD and Δd are chosen appropriately, Sa(i, j) and Sd(i, j) are blurred to the same degree, as are Sb(i, j) and Sc(i, j).

Now consider the focus measure

F = h − |α(Sb(i, j) − Sc(i, j)) + β(Sa(i, j) − Sd(i, j))|

where F is a value representing the focusing state, h is a value setting the tolerance of the in-focus state, and α and β are positive constants. When (i, j) is in focus, Sb(i, j) and Sc(i, j), and likewise Sa(i, j) and Sd(i, j), are equally blurred, so the term |α(Sb(i, j) − Sc(i, j)) + β(Sa(i, j) − Sd(i, j))| on the right-hand side is small. Away from focus, that term grows. Choosing h appropriately, the state is taken to be in focus when F is positive and out of focus when F is negative; each coordinate is thus judged from the value of F to be an in-focus part or not.
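As a rough illustration of the per-pixel criterion (not part of the disclosure), the computation can be written out as follows; the array names Sa to Sd and the constants `alpha`, `beta`, `h` are assumed placeholders.

```python
import numpy as np

# Illustrative sketch of the per-pixel focus criterion described above.
# Sa, Sb, Sc, Sd are the outputs of the four 2-D sensors (same shape);
# alpha, beta, h are tuning constants as in the text.

def focus_map(Sa, Sb, Sc, Sd, alpha=1.0, beta=1.0, h=0.1):
    """Return F(i,j) = h - |alpha*(Sb-Sc) + beta*(Sa-Sd)|.

    F > 0 is treated as 'in focus' at (i, j); F <= 0 as defocused.
    """
    return h - np.abs(alpha * (Sb - Sc) + beta * (Sa - Sd))

# A point imaged in focus on plane F yields equally blurred pairs
# (Sa, Sd) and (Sb, Sc), so the bracketed term is small and F stays positive.
sharp = np.full((4, 4), 0.5)
Sa = Sb = Sc = Sd = sharp                 # perfectly matched pairs
in_focus = focus_map(Sa, Sb, Sc, Sd) > 0
print(in_focus.all())                     # every pixel classified in focus
```

When one pair of outputs drifts apart (unequal blur), the absolute term exceeds h and F turns negative, marking the pixel as out of focus.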

The outputs Sa(i, j), Sb(i, j), Sc(i, j), Sd(i, j) correspond directly, however, only when the coordinate (i, j) lies near the optical axis. As (i, j) moves away from the axis, correction values must be added to the coordinates of each of the elements 7a to 7d when evaluating the above expression. The amount of correction depends on the coordinate (i, j), on ΔD and Δd, on the angle of view of the photographic lens 5, on the light-receiving element density, and on the lateral magnification of the imaging.

FIGS. 5 and 6 show how this coordinate shift arises. FIG. 5 shows the relationship between the direction of an object point O and the incidence positions on the receiving positions a to d of the elements 7a to 7d. FIG. 5(a) shows the case where the object point O lies near the optical axis: the chief ray from O passes near the optical axis of the system, and its incidence coordinates at the receiving positions a, b, c, d are nearly the same, all near the axis. FIG. 5(b), in contrast, shows the case where O lies away from the optical axis: as is clear from the figure, the chief ray strikes each of the receiving positions a, b, c, d at a different coordinate. FIG. 6 shows the state of FIG. 5(b) in enlargement.

That is, when the image is in focus at the i-coordinate of the focal plane F, and the incidence positions of the chief ray from the object point O on the receiving positions a to d are denoted a′, b′, c′, d′, differences arise among the positions a′ to d′. As stated above, this coordinate difference depends on the coordinate (i, j) at the receiving positions a to d, on ΔD and Δd, on the angle of view of the photographic lens 5, on the light-receiving element density, and on the lateral magnification of the imaging.

Next, the method of detecting the direction in which the test object lies is described. The direction of the object is calculated from the image-plane coordinates and from the distance between the principal point of the photographic lens 5 and the image plane.

As shown in FIG. 7, the distance x between the principal point of the photographic lens 5 and the image plane depends on the angle of view θ of the lens 5 and the distance X to the test object 8.
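A minimal sketch of this direction computation, under the assumption that the object's ray passes through the principal point: the direction follows from the image-point offset (u, v) from the optical axis and the principal-point-to-image-plane distance x. The function name and units are illustrative, not from the disclosure.

```python
import math

# Hypothetical sketch: recover the direction of the object from the
# position (u, v) of its focused image on the image plane and the
# principal-point-to-image-plane distance x (all in the same units).

def object_direction(u, v, x):
    """Return (azimuth, elevation) angles in radians for an image point.

    A point imaged at offset (u, v) from the optical axis lies along the
    ray through the principal point, so its angular offsets from the
    axis are atan(u/x) and atan(v/x).
    """
    return math.atan2(u, x), math.atan2(v, x)

# An image point offset by 10 units, with x = 10 units, lies 45° off-axis.
az, el = object_direction(10.0, 0.0, 10.0)
print(round(math.degrees(az)))   # 45
```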

The size W of the test object 8 is calculated, as shown in FIG. 8, from the size w of its image on the image plane and the magnification of the image. The magnification is obtained as the ratio x / X (= w / W) of the principal-point-to-image-plane distance x of the photographic lens 5 to the distance X to the test object 8.
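The size relation above amounts to W = w · X / x. A one-line sketch, with invented example numbers for illustration only:

```python
# Sketch of the size computation in the text: the magnification is
# w/W = x/X, so the object size is W = w * X / x.

def object_size(w, X, x):
    """w: image size on the image plane, X: distance to the object,
    x: principal-point-to-image-plane distance. Returns object size W."""
    return w * X / x

# An image 5 mm tall, formed 50 mm behind the principal point, of an
# object 2000 mm away, corresponds to a 200 mm object.
print(object_size(5.0, 2000.0, 50.0))   # 200.0
```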

In this way, according to the method of this embodiment, the distance, direction, and size of the test object can be detected three-dimensionally by focusing the photographic lens 5.

In other words, freedom can be allowed in the positional relationship between the test object and the TV camera, and the method can serve as a bridge to advanced image processing, for example toward the processing capabilities possessed by humans.

EFFECTS

As described above, according to the present invention, the focusing state on the image plane of the photographic lens is observed two-dimensionally, and the distance to the test object, its size, and the direction in which it lies are detected from the focusing state on the image plane, the size and position of the focused image on that plane, and the focusing distance and angle of view of the lens, this detection being performed a plurality of times by a focusing operation that varies the focusing distance of the lens in steps. Freedom is thus allowed in the positional relationship between the test object and the TV camera, and beyond a mere image-capture function, the distance, size, and direction of the object can be detected, so that the method can serve high-level image processing.

BRIEF DESCRIPTION OF THE DRAWINGS

The drawings show an embodiment of the present invention. FIG. 1 is a configuration diagram of the optical system of the image input section; FIG. 2 is a schematic block diagram of the image information processing system; FIG. 3 is an explanatory diagram showing the relationship between light-receiving element position and optical path length; FIG. 4 is an explanatory diagram showing the state of FIG. 3 in enlargement; FIG. 5 is an explanatory diagram showing the relationship between object-point position and the incidence position on the light-receiving elements; FIG. 6 is an explanatory diagram showing the state of FIG. 5(b) in enlargement; FIG. 7 is an explanatory diagram explaining detection of the direction of the test object; and FIG. 8 is an explanatory diagram explaining detection of the size of the test object. 5: photographic lens; 8: test object.

Claims (1)

[Claim 1] An image input method characterized in that a process of observing two-dimensionally the focusing state on the image plane of the photographic lens of a TV camera, and of detecting the distance to a test object, its size, and the direction in which it lies from the focusing state on the image plane, the size and position of the focused image on that plane, and the focusing distance and angle of view of the photographic lens, is performed a plurality of times while the focusing distance of the photographic lens is varied in steps.
JP63077707A 1988-03-30 1988-03-30 Image input method Pending JPH01250177A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP63077707A JPH01250177A (en) 1988-03-30 1988-03-30 Image input method


Publications (1)

Publication Number Publication Date
JPH01250177A true JPH01250177A (en) 1989-10-05

Family

ID=13641368

Family Applications (1)

Application Number Title Priority Date Filing Date
JP63077707A Pending JPH01250177A (en) 1988-03-30 1988-03-30 Image input method

Country Status (1)

Country Link
JP (1) JPH01250177A (en)
