JPS6050407A - Recognizing apparatus of object - Google Patents

Recognizing apparatus of object

Info

Publication number
JPS6050407A
JPS6050407A JP58159767A JP15976783A
Authority
JP
Japan
Prior art keywords
processing device
ultrasonic wave
image
ultrasonic
processing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP58159767A
Other languages
Japanese (ja)
Inventor
Hiroshi Takenaga (武長 寛)
Nobuyoshi Tsuboi (坪井 信義)
Hiroshi Okubo (大窪 弘)
Masahito Suzuki (鈴木 優人)
Morio Kanezaki (金崎 守男)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hitachi Ltd
Original Assignee
Hitachi Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi Ltd filed Critical Hitachi Ltd
Priority to JP58159767A priority Critical patent/JPS6050407A/en
Publication of JPS6050407A publication Critical patent/JPS6050407A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B21/00Measuring arrangements or details thereof, where the measuring technique is not covered by the other groups of this subclass, unspecified or not relevant

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices Characterised By Use Of Acoustic Means (AREA)
  • Measurement Of Velocity Or Position Using Acoustic Or Ultrasonic Waves (AREA)

Abstract

PURPOSE: To enable high-speed processing by providing a picture processing device and a distance measuring device, letting the two process independently, and synthesizing the results to recognize an object in three dimensions.

CONSTITUTION: The apparatus comprises an ITV 1, a distance-measuring ultrasonic element 2, a servo motor with its controller 3 that rotates the ITV 1 and the ultrasonic element 2, a picture processing device 4 that processes the image captured by the ITV 1, and an ultrasonic processing device 5 that controls the emission timing of the ultrasonic element 2 and derives distance from the received echo signal. A composite processing device 6 synthesizes the results from the picture processing device 4 and the ultrasonic processing device 5 to perform three-dimensional recognition at high speed.

Description

[Detailed Description of the Invention]

[Field of Application of the Invention]

The present invention relates to an improvement in object recognition apparatus.

[Background of the Invention]

When a robot performs assembly work, or when a mobile robot plans its route, it must recognize objects and their three-dimensional positions and orientations, and this in turn requires determining the distance and direction to each object.

Conventionally, the methods of Fig. 1 or Fig. 2 have therefore been used. In Fig. 1, 11 and 12 are industrial television cameras (hereinafter, ITVs) installed a predetermined distance apart, 13 is an image processing device for processing the images captured by ITVs 11 and 12, and 14 is an object. In Fig. 2, 21 is an ITV; 22 is a light-shielding plate, set a certain distance from the ITV 21 and rotatable about axis O, in which a slit 23 is formed; 24 is an object; and 25 is an image processing device that processes the images captured by the ITV 21. Both methods operate on the principle of triangulation, but to obtain the coordinates of a point P on the object 14 with the method of Fig. 1, the images from ITVs 11 and 12 must be processed by the image processing device 13 and brought into correspondence with each other.
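For concreteness, the depth computation that triangulation yields once the correspondence has been established can be sketched as follows; the parallel-camera geometry and the baseline, focal-length and pixel values are illustrative assumptions, not taken from the patent.

    # Minimal sketch of two-camera triangulation (parallel ITVs, fixed baseline).
    def stereo_depth(xi_left: float, xi_right: float,
                     baseline: float, focal: float) -> float:
        """Depth of a point P seen at image coordinates xi_left / xi_right
        on two cameras mounted `baseline` metres apart."""
        disparity = xi_left - xi_right      # shift of P between the two images
        if disparity == 0:
            raise ValueError("zero disparity: point at infinity")
        return focal * baseline / disparity  # similar triangles

    # Example: cameras 0.5 m apart, 8 mm lens, P imaged 0.4 mm / 0.1 mm off-axis.
    z = stereo_depth(0.4e-3, 0.1e-3, baseline=0.5, focal=8e-3)
    print(f"distance to P: {z:.2f} m")       # -> 13.33 m

The costly step the text goes on to criticize is not this arithmetic but finding which pixel in one image corresponds to P in the other.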

This correspondence processing, however, is complicated, so its processing time is long and its reliability somewhat poor. It is therefore unsuitable as a visual system for robots, where high speed is required. Moreover, under some lighting conditions the correspondence becomes extremely difficult to establish.

Next, the method of Fig. 2 directs a linear beam of slit light, passed through the slit 23 in the light-shielding plate 22, onto the object 24. The ITV 21 captures how the line bends across the object, the image processing device 25 computes the distance of each point by triangulation, and the object is thereby recognized. The drawbacks of this method are that in bright surroundings the contrast produced by the slit light is weak, and that on a large object the slit line does not bend, so object and background cannot be distinguished.

In addition, the method of Fig. 1 requires two expensive ITVs, and that of Fig. 2 requires building a light source to project the slit light, so the cost is high. Furthermore, in the methods of Figs. 1 and 2 the distance calculation and the image processing are closely interrelated, so they also have the drawback that neither can be carried out on its own.

[Purpose of the Invention]

Accordingly, the object of the present invention is to provide an object recognition apparatus that is economical and capable of high-speed processing.

[Summary of the Invention]

The invention is characterized in that an image processing device and a distance measuring device are provided; after the two have processed independently, their results are combined to recognize the object in three dimensions.

[Embodiments of the Invention]

An embodiment of the present invention is described in detail below with reference to Fig. 3.

In Fig. 3, 1 is an ITV; 2 is an ultrasonic element for distance measurement; 3 is a servo motor, with its controller, for rotating the ITV 1 and the ultrasonic element 2; 4 is an image processing device that processes the images captured by the ITV 1; 5 is an ultrasonic processing device that controls the emission timing of the ultrasonic element 2 and derives distance from the received echo signal; and 6 is a composite processing device that synthesizes the results from the image processing device 4 and the ultrasonic processing device 5 to perform three-dimensional recognition. Ultrasound is used here for distance measurement, but light such as a laser beam or infrared rays may be used instead.

The operation of this embodiment is now described in detail, taking Fig. 4 as an example. Suppose an object is placed as in (a). The scene is captured with the ITV 1, and from the resulting image the image processing device 4 performs processing such as noise removal, which eliminates the brightness noise contained in the image, and boundary-line enhancement, which is needed to separate the object. The result is shown in Fig. 4(b).

Fig. 5 shows an example configuration of the image processing device 4 that realizes this. When the central processing unit 51 receives an image processing request through the external interface 52, it issues a command to the clock generator 53. The clock generator 53 generates the clock signal for A/D conversion of the video signal from the ITV and supplies it to the A/D converter 54; the horizontal and vertical synchronizing signals contained in the video signal are excluded. The image information converted into a digital signal by the A/D converter 54 is stored in the image memory 55 or passed to the image processor 56.

The noise removal, boundary-line enhancement and similar image processing mentioned above are performed by the image processor 56. As shown in Fig. 6, these operations multiply the pixel of interest x_ij and its eight surrounding pixels by weight coefficients and take the sum, a so-called product-sum operation. That is, the output pixel y_ij is obtained as

    y_ij = Σ(k=-1..1) Σ(l=-1..1) w_kl · x_(i+k),(j+l)

where the w_kl are the weight coefficients. This product-sum operation may be performed either in software or in hardware.
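As an illustration, the following sketch implements this 3 x 3 product-sum over a grayscale image held as a list of lists; the edge-enhancement kernel values are an assumption, since the patent does not specify the weights.

    # 3x3 product-sum (convolution) as in Fig. 6: each output pixel is the
    # weighted sum of the input pixel and its 8 neighbours.
    def product_sum(image, weights):
        h, w = len(image), len(image[0])
        out = [[0] * w for _ in range(h)]
        for i in range(1, h - 1):
            for j in range(1, w - 1):
                out[i][j] = sum(
                    weights[k + 1][l + 1] * image[i + k][j + l]
                    for k in (-1, 0, 1) for l in (-1, 0, 1)
                )
        return out

    edge_kernel = [[-1, -1, -1],
                   [-1,  8, -1],
                   [-1, -1, -1]]   # illustrative boundary-line enhancement
    # smoothing (noise removal) would instead use, e.g., all weights = 1/9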

The image processor 56 can either process the image coming from the A/D converter 54 and store the result in the image memory 55, or process an image taken from the image memory 55 and store it back into the image memory. This switching is controlled by the central processing unit 51.

Next, the method of determining the position (distance and direction) of the object in Fig. 4(a) is described. As the ultrasonic sensor for distance measurement, one whose ultrasonic pulses are of short duration is used; for example, a capacitor-type sensor made with a plastic film is conceivable.

The principle of position measurement is shown in Fig. 7(a): the ultrasonic sensor 2 is rotated, for example through directions (1) → (2) → (3), and the waves reflected from the object 72 are processed. Observing the echoes from directions (1), (2) and (3) gives, as in Fig. 7(b), the reflections from the corners A, B and C of the object 72 (here a square prism).

Because the echoes from the corners A, B and C can be detected clearly, the direction of the object is taken from the direction in which the ultrasonic sensor 2 points when the echo is at its maximum, and the distance is obtained from the time between transmitting the ultrasonic pulse and the return of the echo. Fig. 8 shows one configuration of the ultrasonic processing device 5 (Fig. 3) for realizing this. In Fig. 8, when the arithmetic processing unit 81 receives an object position measurement request from the external interface 82, it triggers ultrasonic transmission from the ultrasonic sensor 2 and issues a signal acquisition request to the A/D conversion unit 83.

On receiving the request, the A/D conversion unit 83 supplies a clock signal to its A/D converter, converts the echo received by the ultrasonic sensor 2 into a digital signal, and stores it in the memory 84. This continues for a predetermined time. The arithmetic processing unit 81 then examines the contents of the memory 84 and determines the distance to the object and the peak value of the echo. That is, in Fig. 9 the peak values are P1, P2 and P3, and the distance r is computed from the sound velocity v and the time t_a up to the point α, found when the echo crosses a predetermined threshold by backing up to the preceding zero crossing, as

    r = v · t_a / 2

the factor of two accounting for the round trip.

After the above processing is completed, a rotation command is issued to the servo motor over the control line 800 to turn the ultrasonic sensor 2 through a predetermined angle, after which the echo recording, distance calculation and peak-value detection described above are performed again.

When this is repeated over a predetermined angular range, the directivity of the ultrasonic sensor makes the peak value P_i a function of the rotation angle, as shown in Fig. 10. The angle at which the observed peak value is largest is taken as the direction of the object. To improve accuracy further, it is also conceivable, as in Fig. 10, to interpolate the angle θ_max of the maximum peak from several peak values, for example by the method of least squares.
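One way to realize the suggested least-squares interpolation is to fit a parabola to the sampled (angle, peak) pairs and take its vertex; the sample values below are illustrative.

    import numpy as np

    def peak_direction(angles_deg, peaks):
        # least-squares parabola p(t) = a*t^2 + b*t + c through the samples
        a, b, c = np.polyfit(angles_deg, peaks, 2)
        return -b / (2.0 * a)            # vertex = interpolated theta_max

    angles = [-10.0, -5.0, 0.0, 5.0, 10.0]
    peaks  = [0.42, 0.66, 0.81, 0.74, 0.50]
    print(peak_direction(angles, peaks))  # direction of the maximum echo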

These processes therefore yield the position of the object as a plan view, as in Fig. 4(c). Since this processing can run independently of the subsequent image processing once the capture of the image into the image memory has finished, the overall processing time is shortened.

Next, the image information and position information obtained as in Figs. 4(b) and (c) are jointly processed by the composite processing device 6. As shown in Fig. 11, the composite processing device 6 consists of a central processing unit 111, an external interface 112 to the image processing device, an external interface 113 to the ultrasonic processing device, and a memory 114 for storing the three-dimensional model.

First, as preprocessing for the composite processing, the image information (b) and the position information (c) of Fig. 4 must be brought into correspondence. From the position information (c), the distances r_A, r_B, r_C to the object and the directions θ_A, θ_B, θ_C are known, as shown in Fig. 12, so the coordinates of the corners of the object are given by the following equations (the y-axis is taken perpendicular to the drawing, positive from the back toward the front):

    (x_A, z_A) = (r_A cos θ_A, r_A sin θ_A)
    (x_B, z_B) = (r_B cos θ_B, r_B sin θ_B)
    (x_C, z_C) = (r_C cos θ_C, r_C sin θ_C)

Since the rotation of the ultrasonic sensor takes place in a plane parallel to the x-z plane, y is a constant value y_0.

The correspondence between the coordinates obtained above and the coordinates on the ITV's image sensor can be established by proportional calculation using Fig. 13. That is, with F the focal length of the ITV lens, the image-sensor coordinates (ξ_A, η_A) corresponding to the point A of Fig. 12 are obtained as

    ξ_A = F · x_A / z_A,  η_A = F · y_0 / z_A
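The two coordinate steps just derived can be sketched together as follows; the focal length and the sample range/angle are illustrative, and the projection is the pinhole relation implied by the proportional calculation of Fig. 13.

    from math import cos, sin, radians

    F  = 8e-3     # assumed lens focal length [m]
    Y0 = 0.0      # height of the sensor's scan plane

    def corner_xyz(r, theta_deg):
        # sensor range/angle -> Cartesian coordinates in the scan plane
        t = radians(theta_deg)
        return (r * cos(t), Y0, r * sin(t))          # (x, y, z)

    def project(x, y, z):
        # pinhole projection onto the ITV image sensor
        return (F * x / z, F * y / z)                # (xi, eta)

    x, y, z = corner_xyz(r=1.5, theta_deg=60.0)      # e.g. corner A
    xi, eta = project(x, y, z)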

In this way each point obtained from the position information (c) of Fig. 4 can be associated with the image information (b). As a composite operation it is then possible, for example as shown in Fig. 14, to erase every part of the image lying at a distance l_0 or more from the origin O and perform object recognition on the object 142 alone, or to carry out the reverse operation.
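A sketch of this depth-gated masking, assuming the per-pixel ranges have already been obtained from the correspondence above; the cutoff l_0 and the zero fill value are illustrative.

    L0 = 2.0   # cutoff distance [m], illustrative

    def mask_by_distance(image, depth, l0=L0, keep_near=True):
        # zero out pixels whose range exceeds l0 (or the reverse operation)
        return [[px if ((d <= l0) == keep_near) else 0
                 for px, d in zip(img_row, depth_row)]
                for img_row, depth_row in zip(image, depth)]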

Fig. 15 shows the procedure described above as a flowchart.

In this embodiment the ultrasonic sensor is rotated to obtain the position (distance and direction). As another embodiment, however, a plurality of ultrasonic sensors 161 to 16n may be arranged at suitable angles, as shown in Fig. 16, and made to emit ultrasonic waves in turn by a scanning device 160. With this method the position measurement can proceed in parallel with the process of capturing the image.
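A sketch of this scanned-array variant, reusing echo_range and echo_peak from the pulse-echo sketch above; fire_and_listen is a hypothetical stand-in for one transmit/receive cycle of a single sensor.

    def sweep(sensor_angles, fire_and_listen, threshold=0.1):
        # sensors 161..16n sit at fixed, known angles; firing them in turn
        # builds the angle-vs-range profile with no mechanical rotation
        profile = []
        for idx, angle in enumerate(sensor_angles):
            echo = fire_and_listen(idx)          # one sensor at a time
            profile.append((angle, echo_range(echo, threshold), echo_peak(echo)))
        return profile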

As yet another embodiment, the ITV 171 and the ultrasonic sensor 172 may be installed tilted at a predetermined angle θ, as shown in Fig. 17.

[Effects of the Invention]

As described above, the present invention enables object recognition against a complex background, which is very difficult and time-consuming with image processing alone, to be performed with a simple configuration and in a short time.

[Brief Description of the Drawings]

Fig. 1 shows a conventional three-dimensional recognition apparatus using two ITVs; Fig. 2, a conventional three-dimensional recognition apparatus using slit light; Fig. 3, an embodiment of the present invention; Fig. 4, a model for explaining the method of three-dimensional recognition according to the invention; Fig. 5, an example configuration of the image processing device; Fig. 6, a diagram explaining image processing by the image processor; Fig. 7, the principle of locating an object by ultrasound and the waveform reflected from the object; Fig. 8, an example configuration of the ultrasonic processing device; Fig. 9, the ultrasonic waveform recorded in memory; Fig. 10, the relation between the emission direction of the ultrasound and the intensity of the echo; Fig. 11, an example configuration of the composite processing device; Fig. 12, a diagram explaining the method of calculating the three-dimensional coordinates of an object; Fig. 13, a diagram explaining how an object position obtained by the ultrasonic sensor is associated with a position in the image; Fig. 14, a diagram explaining one model of processing by the composite processing device; Fig. 15, a flowchart of three-dimensional recognition by the image processing device, the ultrasonic processing device and the composite processing device; Fig. 16, another embodiment of the ultrasonic sensor portion of the invention; and Fig. 17, a further embodiment of the invention.

Reference numerals: 11, 12, 21, 1: ITV; 13, 25, 4: image processing device; 14, 24, 72: object; 22: light-shielding plate; 23: slit; 2: ultrasonic sensor; 3: servo motor and controller; 5: ultrasonic processing device; 6: composite processing device; 51: central processing unit; 52: external interface; 53: clock generator; 54: A/D converter; 55: image memory; 56: image processor; 81: arithmetic processing unit; 82: external interface; 83: A/D conversion unit; 84: memory; 111: central processing unit; 112, 113: external interfaces; 114: memory; 141, 142, 143: objects; 160: scanning device; 161 to 16n: ultrasonic sensors; 171: ITV.

Claims (1)

[Claims]

1. An object recognition apparatus comprising first means for measuring the coordinates of an object and second means for imaging the object, the apparatus being configured to recognize the image captured by the second means in three dimensions by associating the coordinates measured by the first means with points on the image captured by the second means.

2. An object recognition apparatus according to claim 1, wherein the first means for measuring the coordinates of the object is an ultrasonic sensor, the ultrasonic sensor being mounted on a rotating device so that it can be rotated through an arbitrary angle.

3. An object recognition apparatus according to claim 1, wherein the first means for measuring the coordinates of the object comprises a plurality of ultrasonic sensors.

4. An object recognition apparatus according to claim 2, wherein the second means for imaging the object is mounted on the rotating device.
JP58159767A 1983-08-30 1983-08-30 Recognizing apparatus of object Pending JPS6050407A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP58159767A JPS6050407A (en) 1983-08-30 1983-08-30 Recognizing apparatus of object

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP58159767A JPS6050407A (en) 1983-08-30 1983-08-30 Recognizing apparatus of object

Publications (1)

Publication Number Publication Date
JPS6050407A true JPS6050407A (en) 1985-03-20

Family

ID=15700818

Family Applications (1)

Application Number Title Priority Date Filing Date
JP58159767A Pending JPS6050407A (en) 1983-08-30 1983-08-30 Recognizing apparatus of object

Country Status (1)

Country Link
JP (1) JPS6050407A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0887005A1 (en) 1997-06-27 1998-12-30 Honda Giken Kogyo Kabushiki Kaisha Knapsack type working machine
US6053259A (en) * 1997-06-27 2000-04-25 Honda Giken Kogyo Kabushiki Kaisha Knapsack type working machine

Similar Documents

Publication Publication Date Title
US11320536B2 (en) Imaging device and monitoring device
KR100380819B1 (en) Method of determining relative camera orientation position to create 3-d visual images
EP1530123A2 (en) Projector and projector accessory
EP0782100A2 (en) Three-dimensional shape extraction apparatus and method
KR20040030081A (en) 3D video conferencing system
O'Donovan et al. Microphone arrays as generalized cameras for integrated audio visual processing
US9208565B2 (en) Method and apparatus for estimating three-dimensional position and orientation through sensor fusion
JP7038729B2 (en) Image compositing device and image compositing method
WO2006015236A3 (en) Audio-visual three-dimensional input/output
US6839081B1 (en) Virtual image sensing and generating method and apparatus
KR20180121259A (en) Distance detecting device of camera mounted computer and its method
KR101868549B1 (en) Method of generating around view and apparatus performing the same
US9261974B2 (en) Apparatus and method for processing sensory effect of image data
JP2000134537A (en) Image input device and its method
JP2000304508A (en) Three-dimensional input device
JPS6050407A (en) Recognizing apparatus of object
KR20130135016A (en) An apparatus and a method for recognizing material of the objects
CN111176337A (en) Projection device, projection method and computer storage medium
JP2010190793A (en) Apparatus and method for measuring distance
US20180278902A1 (en) Projection device, content determination device and projection method
Wang et al. Active stereo vision for improving long range hearing using a laser Doppler vibrometer
WO2022228461A1 (en) Three-dimensional ultrasonic imaging method and system based on laser radar
WO2016202111A1 (en) Audio output method and apparatus based on photographing
JP4451968B2 (en) 3D image input device
JP2970835B2 (en) 3D coordinate measuring device