JPS6159507A - Guiding device of selfcontrol running truck - Google Patents

Guiding device of selfcontrol running truck

Info

Publication number
JPS6159507A
JPS6159507A (application JP59181029A / JP18102984A)
Authority
JP
Japan
Prior art keywords
sign
vehicle
running
camera
drive
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP59181029A
Other languages
Japanese (ja)
Inventor
Katsumi Kubo
久保 克己
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Toshiba Corp
Original Assignee
Toshiba Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Toshiba Corp filed Critical Toshiba Corp
Priority to JP59181029A priority Critical patent/JPS6159507A/en
Publication of JPS6159507A publication Critical patent/JPS6159507A/en
Pending legal-status Critical Current

Links

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02 - Control of position or course in two dimensions
    • G05D1/021 - Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 - Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246 - Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02 - Control of position or course in two dimensions
    • G05D1/021 - Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 - Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0234 - Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using optical markers or beacons

Abstract

PURPOSE: To make it easy to change the travel route of a traveling vehicle and to control its movement without applying any processing to the floor, by providing an ITV camera, an image processing circuit, and a travel control computer. CONSTITUTION: The video signal from an ITV camera 1 mounted on a pan head 2 carried by a vehicle traveling indoors is input to an image processing circuit 4 through an image interface 3. The circuit 4 performs binarization under the control of a computer 5, and the result is input to a travel control computer 7. The computer 7 measures the side lengths of a sign, identifies the character or figure written on it, compares the result with a stored map, and performs distance calculations within the map; the deviation from the target position is input to drive conversion circuits 8a and 8b, which operate drive systems 9a and 9b. Thus the movement of the traveling vehicle is controlled without applying any processing to the floor, its travel route can be changed easily, and the vehicle is guided to the destination by the information obtained from the camera 1.

Description

DETAILED DESCRIPTION OF THE INVENTION

[Technical Field of the Invention]

The present invention relates to a guidance device for an autonomous traveling vehicle used for inspection and maintenance work robots and the like.

[Technical Background of the Invention and Its Problems]

Industrial robots are currently being developed in many fields, and they are required to move about and perform inspection, monitoring, maintenance, and other work according to their purpose. The means of locomotion used for such work can be broadly divided into robots that travel on a monorail and robots that travel on the floor.

However, a robot that travels on a monorail cannot change its travel route. Even a robot that travels on the floor requires some kind of processing of the floor surface or the like, so that its route cannot easily be changed.

Two methods have generally been used to guide conventional industrial robots. In the first, a guide wire is laid on the floor and a current of a fixed frequency is passed through it; pickup coils are mounted on the robot at a fixed spacing, and the robot is controlled so that the ratio of the voltages induced in the pickup coils becomes zero, causing it to travel along the laid wire.
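As an illustration only, the wire-following control law just described could be sketched as below. The function and parameter names (v_left, v_right, steer_gain) are assumptions for the sketch and do not come from the patent; the patent only states that the coil-voltage imbalance is driven toward zero.

```python
# Minimal sketch (not from the patent) of steering on the imbalance between two
# pickup-coil voltages, as in the wire-guidance prior art described above.
def wire_follow_steering(v_left: float, v_right: float, steer_gain: float = 1.0) -> float:
    """Return a steering command that drives the coil-voltage imbalance toward zero."""
    total = v_left + v_right
    if total == 0.0:
        return 0.0                               # no wire signal detected; hold course
    imbalance = (v_left - v_right) / total       # zero when the robot is centred over the wire
    return -steer_gain * imbalance               # steer toward the side with the weaker signal
```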

In the second, a tape is laid on the floor and the steering is controlled so that reflected light from the tape is obtained.

Both of these methods require processing of the floor surface: time must be spent on the installation work, and the floor surface must be kept clean at all times. For these reasons, a method of guiding a traveling vehicle other than by processing the floor surface has been sought.

[Object of the Invention]

An object of the present invention is to provide a guidance device for an autonomous traveling vehicle that can control the movement of the vehicle without applying any processing to the floor surface and that allows the travel route of the vehicle to be changed easily.

The gist of the present invention is to detect a target object in the information about the outside world obtained by an ITV camera, to determine its distance and direction from the detection result with good accuracy, and to convert the measurement result into movement control quantities for the traveling vehicle.

[Summary of the Invention]

To achieve the above object, the present invention provides, for example, a guidance device for a traveling vehicle that moves inside and outside a reactor containment vessel for inspection and monitoring. A target object, for example a sign, is extracted by an ITV camera mounted on a pan head carried by the traveling vehicle; the tilt angle with respect to the sign is obtained from the ratio of the measured side lengths of the sign, and the distance is obtained from the stored absolute side length of the sign and the measured side length. The position of the vehicle itself is then found in a stored map, and the deviation from the designated destination is converted into control quantities for steering control.
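The two geometric relations stated above can be written out as follows. This is only a reconstruction from the text: the symbol f, standing for the constant of the camera lens system, is an assumption, since the description refers only to the lens magnification, and the exact form of the distance relation is a standard pinhole approximation rather than a formula given by the patent.

```latex
% Reconstruction of the relations stated in the summary (f is an assumed lens constant):
%   tilt of the sign from the ratio of projected side lengths,
%   distance from the stored absolute side length and the side length measured in the image.
\[
  \theta \;=\; \cos^{-1}\!\left(\frac{b}{a}\right),
  \qquad
  d \;\approx\; f\,\frac{L}{\ell}
\]
% a: true side length, b: projected side length seen by the camera,
% L: stored absolute side length of the sign, \ell: side length measured in the image.
```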

[Effects of the Invention]

According to the vehicle guidance device of the present invention, the traveling vehicle can guide itself from information about the outside world in order to control its movement, so only the destination needs to be given to the vehicle, and operability is greatly improved. In addition, no processing of the floor surface or the like is required for guidance, and the route can be freely selected simply by changing the sign database.

[Embodiments of the Invention]

The details of the present invention will now be described with reference to the illustrated embodiments.

FIG. 1 is a block diagram showing the schematic configuration of an embodiment of the present invention. The video signal from an ITV camera 1 mounted on a pan head 2 carried by a traveling vehicle that moves indoors is fed through a cable to an interface 3, which converts it into a digital image divided into, for example, 512 x 512 pixels, and is then input to an image processing circuit 4. The image processing circuit 4 is hardware having processing functions such as binarization, spatial filtering, and inter-pixel operations, and it operates under a computer 5 that supervises the processing control. The computer 5 also has a magnetic disk or the like that stores the target objects. The methods of extracting a target object, measuring distance, and so on are described later.
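The following is a minimal sketch of the front-end processing described above: a 512 x 512 digitized frame, extraction of the red sign background by color thresholding, and masking of the sign region. The use of NumPy, the threshold values, and the synthetic test frame are all illustrative assumptions, not part of the patent.

```python
# Sketch of digitization + colour segmentation of the sign background (assumed thresholds).
import numpy as np

def extract_sign_mask(frame_rgb: np.ndarray, red_thresh: int = 150, other_max: int = 100) -> np.ndarray:
    """Return a binary mask of pixels whose colour is dominated by red (the sign background)."""
    r, g, b = frame_rgb[..., 0], frame_rgb[..., 1], frame_rgb[..., 2]
    mask = (r > red_thresh) & (g < other_max) & (b < other_max)
    return mask.astype(np.uint8)

frame = np.zeros((512, 512, 3), dtype=np.uint8)   # placeholder for the digitized camera frame
frame[200:260, 240:320] = (220, 30, 30)           # synthetic red sign patch for the example
mask = extract_sign_mask(frame)
print(mask.sum())                                  # number of pixels attributed to the sign region
```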

The image processing result is input to a control computer 7, which determines the side lengths of the sign and identifies the character or numeral written on it. The computer 7 internally compares the result with a map and performs distance calculations and the like (described in detail later), and the deviation from the destination is input to drive conversion circuits 8a and 8b, which operate drive systems 9a and 9b.
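A rough sketch of the control step performed by the travel control computer 7 is given below: the heading deviation toward the current target sign is converted into commands for the two drive systems 9a and 9b. The differential-drive mapping, the function name, and the gains are assumptions; the patent only states that the deviation is converted by drive conversion circuits 8a and 8b.

```python
# Sketch (assumed names and gains) of converting a heading deviation into drive commands.
def deviation_to_drive(deviation_rad: float, base_speed: float = 0.3, gain: float = 0.8):
    """Return (left, right) drive commands that turn the vehicle so the deviation shrinks."""
    turn = gain * deviation_rad
    return base_speed - turn, base_speed + turn

left_cmd, right_cmd = deviation_to_drive(0.1)      # small deviation to the left of the sign
print(round(left_cmd, 2), round(right_cmd, 2))     # -> 0.22 0.38
```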

Next, the relationship between the traveling vehicle and the target objects will be described. Assuming, for example, a maze such as that shown in FIG. 2, signs 11a to 11h are placed at the respective corners. The mobile robot 12 detects sign 11a with the ITV camera 13 mounted on its pan head, determines the distance and angle, and is controlled so as to stay within a fixed deviation angle until it reaches the first turning point. There it changes direction, rotating until it squarely faces the next sign 11b, and then heads toward the next point under the same deviation-angle control. The same control is repeated from one sign to the next until the robot reaches the final destination.
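The sign-to-sign traversal just described can be summarized as the sketch below. The route, the plan representation, and every name in it are assumptions made for illustration; the patent describes the cycle only in prose.

```python
# Loose sketch (not from the patent) of repeating the same detect / align / turn cycle
# at every corner sign on the route until the destination sign is reached.
route = ["11a", "11b", "11c", "11h"]                 # corner signs along a chosen path (illustrative)

def traversal_plan(route_signs):
    plan = []
    for sign in route_signs:
        plan.append(f"detect sign {sign} with the pan-head ITV camera")
        plan.append(f"hold heading within the allowed deviation angle toward {sign}")
        plan.append(f"turn at the corner under sign {sign}")
    plan[-1] = "stop: final destination reached"     # no turn is needed at the last sign
    return plan

for step in traversal_plan(route):
    print(step)
```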

Next, the image processing method and control method will be described in detail with reference to FIGS. 3 to 5. Assume a room such as that shown in FIG. 4, in which the target signs 11a, 11b, ... are placed and the corresponding travel routes are 18a, 18b, ..., and suppose that the robot is located at point O.

First, the image obtained at point O is input as shown in FIG. 3(a). Consider, for example, the case of going to the place marked "F". From the image obtained at point O, the sign backgrounds 14a, 14c, and 14e are first separated from the surrounding scene by color. For example, if the sign background is red, an image is formed from the red component of the video signal, region segmentation is performed, the clustering of the pixels is evaluated, and the sign region is masked. Next, as shown in FIG. 3(c), the distance is obtained from the ratio of the lengths of sides 16a and 16b to the stored side length of the sign. Here the distance 16d is determined by the magnification of the camera lens system and the side length of the image, and 16a and 16b are obtained from the ratio of the lengths. In this way the distance from the robot to the target sign is found. Next, in order to go to the place marked "F", the sign must be identified. Since the length of each side has already been obtained, a coordinate transformation based on the side lengths is applied to the image shown at 16 to produce the image 15 shown in FIG. 3(b), normalizing the distortion; a correlation is then computed with the stored signs, and the character or numeral is determined from the stored sign giving the maximum correlation value, thereby identifying the sign. The positional relationship between the sign identified in this way and the robot is then obtained. In the case shown in FIG. 5, the actual length a of the object is observed by the ITV camera as a projected length b, so the positional relationship between the robot and the object, that is, the angle θ, is given by θ = cos⁻¹(b/a).
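The distance and angle computations just described can be illustrated by the short sketch below. The function names, the pinhole-style constant focal_scale_px, and the numerical example are assumptions made for illustration; only the relation θ = cos⁻¹(b/a) and the use of the stored and measured side lengths come from the description.

```python
# Illustrative sketch of estimating distance and viewing angle of a rectangular sign
# from its measured side lengths, following the relations described above.
import math

def sign_distance(stored_side_m: float, measured_side_px: float, focal_scale_px: float) -> float:
    """Distance from the stored absolute side length and the side length seen in the image."""
    return focal_scale_px * stored_side_m / measured_side_px

def sign_angle(true_side_px: float, projected_side_px: float) -> float:
    """Viewing angle theta = cos^-1(b / a) from the projected (b) and true (a) side lengths."""
    ratio = max(-1.0, min(1.0, projected_side_px / true_side_px))
    return math.acos(ratio)

# Example: a sign side foreshortened to 80% of its true length is seen about 37 degrees off the normal.
theta = sign_angle(100.0, 80.0)
print(round(math.degrees(theta), 1))   # -> 36.9
```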

The robot then travels toward the target sign using this angle and distance. For example, when the robot is to change direction at distance d1, it turns through an angle of 90° + θ at point O, rotates the pan head 90° so that the camera squarely faces the target object, and travels straight ahead until the sign comes to the center of the obtained image; it thereby reaches the turning point, turns 90° there, and then travels straight ahead to reach the target object.
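The corner manoeuvre described above is sketched below as a simple command sequence. The "vehicle" here is a trivial simulation built only for the example; the function name, the step size, and the centring tolerance are assumptions, not quantities given in the patent.

```python
# Hedged sketch of the corner manoeuvre: turn through 90 deg + theta, pan the camera 90 deg
# toward the next sign, drive straight until the sign is centred in the image, then turn 90 deg.
def corner_manoeuvre(theta_deg: float, sign_offset_px: float, step_px: float = 2.0, tol_px: float = 5.0):
    commands = [f"rotate vehicle {90.0 + theta_deg:.1f} deg", "rotate pan head 90.0 deg"]
    offset = sign_offset_px
    while abs(offset) > tol_px:                      # drive straight until the sign reaches the centre
        commands.append("drive straight")
        offset -= step_px if offset > 0 else -step_px
    commands.append("rotate vehicle 90.0 deg")       # turn at the corner toward the target sign
    return commands

print(corner_manoeuvre(theta_deg=12.0, sign_offset_px=9.0))
```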

As described above, according to this embodiment the traveling vehicle can guide itself using the information about the outside world obtained by the ITV camera, and it can be guided to the destination merely by being given the destination. Operability is therefore extremely good. Furthermore, no processing of the floor surface or the like is required for guidance, and the route can be freely selected simply by changing the sign database.

The present invention is not limited to the embodiment described above and can be carried out with various modifications without departing from its gist. For example, the configuration of the drive control section comprising the image processing circuit, the control computer, the drive conversion circuits, and so on is not limited to that shown in FIG. 1; any configuration may be used that can determine the position of the traveling vehicle from the video information obtained by the ITV camera and control the drive system of the vehicle on the basis of that position information. The shape, number, placement, and so on of the signs may also be changed as appropriate according to the desired travel route.

[Brief Description of the Drawings]

FIG. 1 is a block diagram showing the schematic configuration of an embodiment of the present invention; FIG. 2 is a diagram showing the positional relationship between the mobile robot and the signs; FIG. 3 is a schematic diagram showing examples of input images; FIG. 4 is a diagram showing the relationship between the routes and the signs; and FIG. 5 is a diagram for explaining the method of guiding the traveling vehicle to a target sign.

1: ITV camera; 2: pan head; 3: image interface; 4: image processing circuit; 5: image processing computer; 6: input keyboard; 7: travel control computer; 8: drive conversion circuits; 9: drive systems; 10: sign board; 11a to 11h: signs; 12: traveling vehicle; 13: ITV camera; 14 to 16: input images; 18a to 18h: routes.

Claims (3)

[Claims]

(1) A guidance device for a traveling vehicle that moves about to perform inspection, monitoring, and work, comprising: signs placed at predetermined positions for guiding the traveling vehicle; an ITV camera mounted on a pan head carried by the traveling vehicle for photographing the signs; and a drive control section that determines the position of the traveling vehicle on the basis of the video signal obtained by the camera and controls the drive system of the traveling vehicle on the basis of that position information.

(2) A guidance device for an autonomous traveling vehicle according to claim 1, wherein the drive control section comprises: an image processing circuit that processes the video signal from the camera; a storage circuit that stores the signs in advance; a sign input section for registering and changing the signs stored in the storage circuit; an arithmetic processing section that determines the detected sign by comparing the image information obtained by the image processing circuit with the signs stored in the storage circuit and computes the position of the traveling vehicle by comparing the determined sign with the image information; and a drive conversion circuit that sends drive signals to the drive system of the traveling vehicle on the basis of the output of the arithmetic processing section.

(3) A guidance device for an autonomous traveling vehicle according to claim 1, wherein the arithmetic processing section obtains the tilt with respect to a detected sign from the ratio of the side lengths of the detected sign, obtains the distance to the detected sign from the absolute side lengths of the detected sign and the stored sign, and obtains the position of the traveling vehicle from a stored map.
JP59181029A 1984-08-30 1984-08-30 Guiding device of selfcontrol running truck Pending JPS6159507A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP59181029A JPS6159507A (en) 1984-08-30 1984-08-30 Guiding device of selfcontrol running truck

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP59181029A JPS6159507A (en) 1984-08-30 1984-08-30 Guiding device of selfcontrol running truck

Publications (1)

Publication Number Publication Date
JPS6159507A true JPS6159507A (en) 1986-03-27

Family

ID=16093521

Family Applications (1)

Application Number Title Priority Date Filing Date
JP59181029A Pending JPS6159507A (en) 1984-08-30 1984-08-30 Guiding device of selfcontrol running truck

Country Status (1)

Country Link
JP (1) JPS6159507A (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS62272307A (en) * 1986-05-21 1987-11-26 Komatsu Ltd Guide position correcting device for unattended moving body
JPS62285115A (en) * 1986-06-03 1987-12-11 Shinko Electric Co Ltd Drive control method for unmanned carrier
JPH0281105A (en) * 1988-09-17 1990-03-22 Nippon Yusoki Co Ltd Automatic steering control system
JPH04333903A (en) * 1991-05-09 1992-11-20 Fujita Corp Travel controller for self-traveling vehicle
US6058339A (en) * 1996-11-18 2000-05-02 Mitsubishi Denki Kabushiki Kaisha Autonomous guided vehicle guidance device
JP2006004175A (en) * 2004-06-17 2006-01-05 Toshiba Corp Self-position identification device and self-position identification method
JP2009080807A (en) * 2007-09-12 2009-04-16 Pepperl & Fuchs Gmbh Method and apparatus for determining position of vehicle, computer program and computer program product
JP2009211666A (en) * 2008-02-29 2009-09-17 Ind Technol Res Inst Zone identification system

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS62272307A (en) * 1986-05-21 1987-11-26 Komatsu Ltd Guide position correcting device for unattended moving body
JPS62285115A (en) * 1986-06-03 1987-12-11 Shinko Electric Co Ltd Drive control method for unmanned carrier
JPH0281105A (en) * 1988-09-17 1990-03-22 Nippon Yusoki Co Ltd Automatic steering control system
JPH04333903A (en) * 1991-05-09 1992-11-20 Fujita Corp Travel controller for self-traveling vehicle
US6058339A (en) * 1996-11-18 2000-05-02 Mitsubishi Denki Kabushiki Kaisha Autonomous guided vehicle guidance device
JP2006004175A (en) * 2004-06-17 2006-01-05 Toshiba Corp Self-position identification device and self-position identification method
US7489255B2 (en) 2004-06-17 2009-02-10 Kabushiki Kaisha Toshiba Self-position identification apparatus and self-position identification method
JP2009080807A (en) * 2007-09-12 2009-04-16 Pepperl & Fuchs Gmbh Method and apparatus for determining position of vehicle, computer program and computer program product
US8385594B2 (en) 2007-09-12 2013-02-26 Pepperl +Fuchs Method and apparatus for determining the position of a vehicle, computer program and computer program product
JP2009211666A (en) * 2008-02-29 2009-09-17 Ind Technol Res Inst Zone identification system
TWI384407B (en) * 2008-02-29 2013-02-01 Ind Tech Res Inst Region identification system by using braille code design

Similar Documents

Publication Publication Date Title
CN106774335B (en) Multi-view vision and inertial navigation based guiding device, landmark layout and guiding method
CN106950972B (en) Automatic Guided Vehicle (AGV) and route correction method thereof
JP4409035B2 (en) Image processing apparatus, singular part detection method, and recording medium recording singular part detection program
US5911767A (en) Navigation system for an autonomous mobile robot
US4566032A (en) Visually guided vehicle
JP4798450B2 (en) Navigation device and control method thereof
JPH02143309A (en) Operation method and apparatus
JP2005070043A (en) Self-position recognizing device and method of intelligent system using artificial mark, and the intelligent system using the same
JP2003015739A (en) External environment map, self-position identifying device and guide controller
CN108919810A (en) The localization for Mobile Robot and navigation system of view-based access control model teaching
CN110473414B (en) Vehicle driving path determining method, device and system
Behringer et al. Autonomous road vehicle guidance from autobahnen to narrow curves
JPH11272328A (en) Color mark, moving robot and method for guiding moving robot
CN115857504A (en) DWA-based robot local path planning method, equipment and storage medium in narrow environment
JPS6159507A (en) Guiding device of selfcontrol running truck
De Lima et al. Sensor-based control with digital maps association for global navigation: a real application for autonomous vehicles
Asada et al. Representing global world of a mobile robot with relational local maps
Gilg et al. Landmark-oriented visual navigation of a mobile robot
Morra et al. Visual control through narrow passages for an omnidirectional wheeled robot
JPH01197808A (en) Guidance system for unmanned vehicle
Han et al. Navigation control for a mobile robot
JPS59111508A (en) Automatic car guiding method using point follow-up system
JPS63134912A (en) Detecting method for position of moving robot
Stella et al. Self-location of a mobile robot by estimation of camera parameters
Lee et al. Model‐based location of automated guided vehicles in the navigation sessions by 3d computer vision