JPS59112311A - Guiding method of unmanned moving body - Google Patents

Guiding method of unmanned moving body

Info

Publication number
JPS59112311A
Authority
JP
Japan
Prior art keywords
moving body
unmanned moving
markers
fixed station
unmanned
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP57223353A
Other languages
Japanese (ja)
Inventor
Yokichi Nishi
西 洋吉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Komatsu Ltd
Original Assignee
Komatsu Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Komatsu Ltd filed Critical Komatsu Ltd
Priority to JP57223353A priority Critical patent/JPS59112311A/en
Publication of JPS59112311A publication Critical patent/JPS59112311A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0234Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using optical markers or beacons

Abstract

PURPOSE: To detect the position of an unmanned moving body precisely and guide it to a fixed station, by projecting three markers fitted to the fixed station onto the image receiving surface of a TV camera mounted on the moving body and detecting the relative intervals between their images. CONSTITUTION: Three markers 4a-4c are fitted to the front surface of the fixed station 2 fixed on a floor surface 1, with the marker 4c protruding forward in a direction parallel to the floor surface 1. A TV camera 6 is arranged on the unmanned moving body 5 to be guided to the station 2 so as to be rotatable in a plane parallel to the floor surface 1. The images 4a'-4c' of the markers 4a-4c on the station 2 are simultaneously projected on the image receiving surface 7 of the camera 6. A control device detects the relative relation between the images 4a'-4c' and controls the travel of the moving body 5 on the basis of the result. The position of the moving body 5 is thus detected precisely and easily.

Description

DETAILED DESCRIPTION OF THE INVENTION

The present invention relates to a guidance method for guiding an unmanned moving body that travels on a floor surface toward a fixed station.

For example, in the case of an unmanned moving body such as a work robot that travels on a floor surface using a storage battery as its power source, when the battery must be recharged during work, the moving body has to approach, from a predetermined direction, a fixed station equipped with a charging device installed at a suitable place on the floor, and couple with that station.

For this reason, a guidance method has conventionally been adopted in which a guide wire is buried in the floor in advance and the signal current flowing through the guide wire is received electromagnetically by a receiver on the unmanned moving body, so that the body moves along the guide wire, approaches the fixed station from a predetermined direction, and couples with it.

However, such a conventional guidance method for an unmanned vehicle has the drawback that the guide wire must be buried in the floor, which requires considerable expense.

For this reason, instead of burying a guide wire in the floor as described above, another guidance method has been used in which a tape having a clear contrast with the floor is stuck on the floor surface and detected by an optical detector mounted on the unmanned moving body, so that the body moves along the tape. In this case, however, the tape is easily soiled and malfunctions therefore occur readily, so the tape must be replaced frequently.

The present invention has been made in view of the points described above, and its object is to provide a novel guidance method that can reliably guide an unmanned moving body to a fixed station without laying a guide wire or optical tape on the floor.

To this end, the present invention provides three markers on the fixed station, projects these three markers onto the image receiving surface of a television camera rotatably mounted on the unmanned moving body, and detects the relative spacing between the images of the three markers on that surface. The position of the unmanned moving body relative to the fixed station is thereby detected, and on the basis of this detection the body is guided toward the fixed station.

An embodiment of the present invention will now be described in detail with reference to the accompanying drawings.

Figs. 1 and 2 are a front view and a plan view of the fixed station. On the front surface 3 of the fixed station 2 installed on the floor surface 1, three markers 4a, 4b and 4c consisting of light emitters are arranged at a predetermined height above the floor surface 1. The markers 4a and 4b are provided on the front surface 3 of the fixed station 2, while the marker 4c alone is provided at a position protruding from that front surface.

If the centre positions of the markers 4a, 4b and 4c are denoted A, B and C respectively, the points A, B and C lie in one common plane P parallel to the floor surface 1. With D the midpoint of the line segment AB (of length 2t), the point C lies on the perpendicular bisector of AB drawn in the plane P through D, offset from D by a distance m.

Fig. 3 is a plan view showing the fixed station 2 and the unmanned moving body 5 moving toward it. A television camera 6 is mounted on the unmanned moving body 5 so as to be rotatable within the plane P about a point O. The television camera 6 is controlled by a suitable automatic tracking device, known in itself, so that it always points at the marker 4c. In this embodiment the fixed station 2 and the unmanned moving body 5 are provided with a device that charges the storage battery on the body 5 when the two are coupled, but to keep the description simple the charging device is neither shown nor described.

Fig. 4 is a plan view explaining the positional relationship of the images 4a', 4b' and 4c' of the markers 4a, 4b and 4c projected on the image receiving surface 7 of the television camera 6; 8 is the lens of the camera. If the positions of the images 4a', 4b' and 4c' on the image receiving surface 7 are denoted A', B' and C' respectively, then A'C' = B'C' when the unmanned moving body 5 is on the extension of the line segment DC, A'C' < B'C' when it is to the right of that extension, and A'C' > B'C' when it is to the left of it. By detecting the difference in length between A'C' and B'C', therefore, it can be determined in which region the unmanned moving body 5 lies with respect to the extension of the line segment DC. Accordingly, in this embodiment of the guidance method according to the present invention, the difference in length between A'C' and B'C' is detected, the unmanned moving body 5 is moved so that this difference becomes zero, thereby bringing it onto the extension of the line segment DC, and the body 5 is then made to approach the fixed station 2 while the difference between A'C' and B'C' is kept at zero.
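This side test can be reproduced numerically with a simple pinhole-camera model. The sketch below is not part of the patent: the numeric values, coordinate convention and function names are assumptions chosen only to illustrate the relation between A'C', B'C' and the position of the body relative to line DC.

```python
import math

# Assumed geometry (not from the patent): station front on y = 0, marker C
# offset toward the vehicle by M; the camera is a 1-D pinhole that always
# points at C, standing in for the automatic tracking described above.
T = 0.30        # half of the A-B spacing (the "2t" in the text), metres
M = 0.10        # forward offset of marker C, metres
FOCAL = 0.008   # focal length of the toy camera, metres

A, B, C = (-T, 0.0), (T, 0.0), (0.0, M)

def image_positions(camera_xy):
    """Project A, B, C onto the 1-D image line of a pinhole camera aimed at C."""
    cx, cy = camera_xy
    aim = math.atan2(C[1] - cy, C[0] - cx)           # camera tracks marker C
    def project(p):
        dx, dy = p[0] - cx, p[1] - cy
        depth = dx * math.cos(aim) + dy * math.sin(aim)    # along optical axis
        lateral = -dx * math.sin(aim) + dy * math.cos(aim)  # across optical axis
        return FOCAL * lateral / depth
    return [project(p) for p in (A, B, C)]

def side_test(camera_xy):
    """Compare |A'C'| and |B'C'| exactly as the text describes."""
    a, b, c = image_positions(camera_xy)
    diff = abs(a - c) - abs(b - c)
    if abs(diff) < 1e-9:
        return "on the extension of line DC"
    return "off the line; the sign of the difference tells which side"

print(side_test((0.0, 3.0)))   # on the centre line -> A'C' = B'C'
print(side_test((1.0, 3.0)))   # displaced sideways -> A'C' != B'C'
```

Running the two example calls shows equal image distances only when the camera sits on the extension of DC, which is the property the embodiment exploits.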

An example of how the unmanned moving body 5 located at the position shown in Fig. 3 is guided onto the extension of the line segment DC will now be described. Let the axis passing through the centre of rotation O of the television camera 6 and taken in the direction of travel of the unmanned moving body 5 be the y-axis, let the axis perpendicular to it be the x-axis, and assume that the y-axis is parallel to the extension of the line segment DC. The television camera 6 is then at a position rotated to the left by an angle θ with respect to the y-axis. In this case the images 4a', 4b' and 4c' of the markers 4a, 4b and 4c on the image receiving surface 7 of the television camera 6 are detected to be in the relation A'C' < B'C'. On the basis of this, the unmanned moving body 5 is turned 90° to the left, giving the state of Fig. 5 in which the y-axis is parallel to the line segment AB. The unmanned moving body 5 is then advanced, turned 90° to the right at the position where A'C' = B'C' as shown in Fig. 6, giving the state of Fig. 7, and finally advanced toward the fixed station 2.
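As a concrete illustration of this manoeuvre, the self-contained sketch below simulates the sequence of Figs. 3 and 5 to 7 with a toy kinematic model. All numbers and helper names are assumptions, and whether a given sign of the image-distance difference corresponds to a left or a right turn depends on the image-axis convention, so the turn direction here may be mirrored relative to the patent's figures.

```python
import math

# Toy simulation of the described docking sequence (assumed geometry and
# step sizes; nothing below is taken verbatim from the patent).
T, M, FOCAL = 0.30, 0.10, 0.008          # marker half-spacing, C offset, focal length
A, B, C = (-T, 0.0), (T, 0.0), (0.0, M)  # station front on y = 0, vehicle at y > 0

def image_x(p, pos, aim):
    """1-D pinhole projection of point p for a camera at pos aimed along angle aim."""
    dx, dy = p[0] - pos[0], p[1] - pos[1]
    depth = dx * math.cos(aim) + dy * math.sin(aim)
    lateral = -dx * math.sin(aim) + dy * math.cos(aim)
    return FOCAL * lateral / depth

def diff(pos):
    """|A'C'| - |B'C'| with the camera tracking marker C."""
    aim = math.atan2(C[1] - pos[1], C[0] - pos[0])
    a, b, c = (image_x(p, pos, aim) for p in (A, B, C))
    return abs(a - c) - abs(b - c)

pos, heading, step = [1.0, 3.0], -math.pi / 2, 0.02   # start off the line, facing the station
d0 = diff(pos)
if abs(d0) > 1e-6:
    turn = -math.pi / 2 if d0 < 0 else math.pi / 2    # 90-degree turn toward line DC
    heading += turn
    while abs(diff(pos)) > 1e-6:                      # advance until A'C' = B'C'
        pos[0] += step * math.cos(heading)
        pos[1] += step * math.sin(heading)
    heading -= turn                                   # turn back to face the station
while pos[1] > 0.5:                                   # approach along the line
    pos[0] += step * math.cos(heading)
    pos[1] += step * math.sin(heading)
print("final position near the station:", [round(v, 3) for v in pos])
```

The simulated body ends up on the extension of DC directly in front of the station, mirroring the four-step sequence of the figures.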

The lengths of A'C' and B'C' on the image receiving surface 7 of the television camera 6 can be detected simply by counting the number of picture elements on the image receiving surface 7. On the basis of this detection, the rotation speeds of the motors MR and ML, which individually drive the left and right wheels of the unmanned moving body 5, are controlled so as to move the body 5 to the prescribed position.
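A rough sketch of this pixel-counting step is given below, under assumptions not taken from the patent: a synthetic camera frame, a simple brightness threshold, and the left-to-right ordering of the marker images. The frame is thresholded, bright columns are grouped into blobs, and the separations between the blob centroids give the pixel counts for A'C' and B'C'.

```python
import numpy as np

def marker_columns(frame, threshold=200):
    """Return the column (x) centroids of bright blobs, ordered left to right."""
    ys, xs = np.nonzero(frame >= threshold)
    if xs.size == 0:
        return []
    xs = np.sort(xs)
    # split into blobs wherever consecutive bright columns are far apart
    breaks = np.where(np.diff(xs) > 5)[0] + 1
    return [float(np.mean(chunk)) for chunk in np.split(xs, breaks)]

# synthetic 240x320 frame with three bright spots standing in for 4a', 4c', 4b'
frame = np.zeros((240, 320), dtype=np.uint8)
for col in (60, 150, 250):
    frame[118:122, col - 1:col + 2] = 255

cols = marker_columns(frame)
if len(cols) == 3:
    a, c, b = cols   # assumes 4a' is leftmost and 4c' lies between 4a' and 4b'
    print("A'C' pixels:", abs(c - a), " B'C' pixels:", abs(b - c))
```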

Fig. 8 is a block diagram showing the guidance system for an unmanned moving body according to the present invention, in which 9 is an oscillator that drives the television camera 6, 10 is an A-D converter, 11 is a data memory, and 12 is a CPU. The calculation block diagram inside the CPU 12 is shown in Fig. 9. GA is a signal representing the number of picture elements between A' and C' on the image receiving surface 7 of the television camera 6, and GB is a signal representing the number of picture elements between B' and C'. The signals GA and GB are subtracted from each other, the resulting difference is amplified by the amplifiers 13R and 13L, and each amplified signal is added to the reference speed command signal vF to produce the rotation speed command signals vR and vL for the right and left motors MR and ML. These signals vR and vL are applied to the motor drive amplifiers 14R and 14L respectively; when vR = vL the rotation speeds of the motors MR and ML are equal and the unmanned moving body 5 moves straight ahead, when vR > vL it turns to the left, and when vR < vL it turns to the right. The output of the CPU 12 is also applied to an amplifier 15, which drives the motor M for rotating the television camera.
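The arithmetic performed in this signal path can be summarised in a few lines. In the sketch below the gain values, the symmetric plus/minus form of the correction, and the sign convention are assumptions; the text only states that the difference of GA and GB is amplified and combined with the reference speed command vF for each wheel channel.

```python
# Minimal numerical sketch of the Fig. 8/9 signal path (assumed gains and signs).
V_F = 0.30      # reference forward speed command (assumed units)
K_R = 0.002     # gain standing in for amplifier 13R (assumed)
K_L = 0.002     # gain standing in for amplifier 13L (assumed)

def wheel_commands(g_a, g_b):
    """Return (v_R, v_L) from the pixel counts of A'C' and B'C'."""
    error = g_a - g_b                 # zero when the body sits on line DC
    v_r = V_F + K_R * error           # one wheel is sped up ...
    v_l = V_F - K_L * error           # ... and the other slowed, so the body
    return v_r, v_l                   # steers until the counts match again

print(wheel_commands(100, 100))   # equal counts   -> straight ahead
print(wheel_commands(90, 100))    # unequal counts -> differential turn
```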

In the embodiment described above the markers 4a, 4b and 4c are light emitters, but these markers need not necessarily be light emitters as long as they can be clearly identified on the image receiving surface 7 of the television camera 6. Further, the method of controlling the movement of the unmanned moving body 5 is not limited to the above embodiment; various other modes are possible.

The guidance method for an unmanned moving body according to the present invention has been made clear by the above description. According to the invention, three markers are projected onto the image receiving surface of a television camera and the movement of the unmanned moving body is controlled on the basis of the relative spacing between the marker images on that surface, so that it is no longer necessary to bury a guide wire in the floor or to stick optical tape on it as in the conventional methods.

Furthermore, since the centre marker is mounted offset, with respect to the line connecting the markers on either side of it, toward the side on which the unmanned moving body is present, the difference in distance between the marker images on the image receiving surface of the television camera appears more markedly than when the three markers are arranged on a single line, so that the position of the unmanned moving body can be detected more easily and more reliably.

BRIEF DESCRIPTION OF THE DRAWINGS

Figs. 1 and 2 are a front view and a plan view of the fixed station to which the present invention is applied; Fig. 3 is a plan view showing the fixed station and the unmanned moving body to which the present invention is applied; Fig. 4 is an explanatory diagram thereof; Figs. 5 to 7 are plan views showing an example of the movement of the unmanned moving body; Fig. 8 is a block diagram of the present invention; and Fig. 9 is a block diagram of the calculations inside its CPU.

1: floor surface; 2: fixed station; 4a, 4b, 4c: markers; 4a', 4b', 4c': marker images; 5: unmanned moving body; 6: television camera; 7: image receiving surface; 8: lens; 12: CPU.

Claims (1)

A guidance method for guiding an unmanned moving body toward a fixed station on a floor surface, characterized in that: a first, a second and a third marker are arranged side by side on the fixed station in a direction parallel to the floor surface, the third marker lying between the first and second markers and being offset, from the line connecting the first and second markers, toward the side on which the unmanned moving body is present; the first, second and third markers are simultaneously projected onto the image receiving surface of a television camera provided on the unmanned moving body so as to be rotatable in a plane parallel to the floor surface; the relative spacing between the images of the first, second and third markers on the image receiving surface is detected; and, on the basis of this detection, the unmanned moving body is controlled so as to be guided to approach the fixed station from a predetermined direction.
JP57223353A 1982-12-20 1982-12-20 Guiding method of unmanned moving body Pending JPS59112311A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP57223353A JPS59112311A (en) 1982-12-20 1982-12-20 Guiding method of unmanned moving body

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP57223353A JPS59112311A (en) 1982-12-20 1982-12-20 Guiding method of unmanned moving body

Publications (1)

Publication Number Publication Date
JPS59112311A true JPS59112311A (en) 1984-06-28

Family

ID=16796827

Family Applications (1)

Application Number Title Priority Date Filing Date
JP57223353A Pending JPS59112311A (en) 1982-12-20 1982-12-20 Guiding method of unmanned moving body

Country Status (1)

Country Link
JP (1) JPS59112311A (en)


Cited By (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0344322B2 (en) * 1984-07-11 1991-07-05 Oki Electric Ind Co Ltd
JPS6123221A (en) * 1984-07-11 1986-01-31 Oki Electric Ind Co Ltd Guiding system of mobile truck
JPS61259308A (en) * 1985-05-10 1986-11-17 Komatsu Ltd Guiding method of luminescent point follow-up type unmanned vehicle
JPH04241605A (en) * 1991-01-14 1992-08-28 Daifuku Co Ltd Device for detecting deviated amount of moving vehicle
US9622635B2 (en) 2001-01-24 2017-04-18 Irobot Corporation Autonomous floor-cleaning robot
US9104204B2 (en) 2001-06-12 2015-08-11 Irobot Corporation Method and system for multi-mode coverage for an autonomous robot
KR100437159B1 (en) * 2001-08-06 2004-06-25 삼성광주전자 주식회사 External charging apparatus and robot cleaner system employing and method of rejoining the same
US9949608B2 (en) 2002-09-13 2018-04-24 Irobot Corporation Navigational control system for a robotic device
US8295955B2 (en) 2004-03-29 2012-10-23 Evolutions Robotics, Inc. Methods and apparatus for position estimation using reflected light sources
US7996097B2 (en) 2004-03-29 2011-08-09 Evolution Robotics, Inc. Methods and apparatus for position estimation using reflected light sources
US7720554B2 (en) 2004-03-29 2010-05-18 Evolution Robotics, Inc. Methods and apparatus for position estimation using reflected light sources
JP2007530978A (en) * 2004-03-29 2007-11-01 エヴォリューション ロボティクス インコーポレイテッド Position estimation method and apparatus using reflected light source
WO2005098476A1 (en) * 2004-03-29 2005-10-20 Evolution Robotics, Inc. Method and apparatus for position estimation using reflected light sources
US8874264B1 (en) 2004-07-07 2014-10-28 Irobot Corporation Celestial navigation system for an autonomous robot
US7706917B1 (en) 2004-07-07 2010-04-27 Irobot Corporation Celestial navigation system for an autonomous robot
US8855813B2 (en) 2005-02-18 2014-10-07 Irobot Corporation Autonomous surface cleaning robot for wet and dry cleaning
US10470629B2 (en) 2005-02-18 2019-11-12 Irobot Corporation Autonomous surface cleaning robot for dry cleaning
US9002511B1 (en) 2005-10-21 2015-04-07 Irobot Corporation Methods and systems for obstacle detection using structured light
US9632505B2 (en) 2005-10-21 2017-04-25 Irobot Corporation Methods and systems for obstacle detection using structured light
US10524629B2 (en) 2005-12-02 2020-01-07 Irobot Corporation Modular Robot
US9320398B2 (en) 2005-12-02 2016-04-26 Irobot Corporation Autonomous coverage robots
US10244915B2 (en) 2006-05-19 2019-04-02 Irobot Corporation Coverage robots and associated cleaning bins
US9955841B2 (en) 2006-05-19 2018-05-01 Irobot Corporation Removing debris from cleaning robots
US10299652B2 (en) 2007-05-09 2019-05-28 Irobot Corporation Autonomous coverage robot
US10070764B2 (en) 2007-05-09 2018-09-11 Irobot Corporation Compact autonomous coverage robot
US11498438B2 (en) 2007-05-09 2022-11-15 Irobot Corporation Autonomous coverage robot
US8632376B2 (en) 2007-09-20 2014-01-21 Irobot Corporation Robotic game systems and methods
US9188983B2 (en) 2009-11-06 2015-11-17 Irobot Corporation Methods and systems for complete coverage of a surface by an autonomous robot
US9895808B2 (en) 2009-11-06 2018-02-20 Irobot Corporation Methods and systems for complete coverage of a surface by an autonomous robot
US9026302B2 (en) 2009-11-06 2015-05-05 Irobot Corporation Methods and systems for complete coverage of a surface by an autonomous robot
US8930023B2 (en) 2009-11-06 2015-01-06 Irobot Corporation Localization by learning of wave-signal distributions
US10583562B2 (en) 2009-11-06 2020-03-10 Irobot Corporation Methods and systems for complete coverage of a surface by an autonomous robot
US11052540B2 (en) 2009-11-06 2021-07-06 Irobot Corporation Methods and systems for complete coverage of a surface by an autonomous robot
US9310806B2 (en) 2010-01-06 2016-04-12 Irobot Corporation System for localization and obstacle detection using a common receiver
US10314449B2 (en) 2010-02-16 2019-06-11 Irobot Corporation Vacuum brush
US11058271B2 (en) 2010-02-16 2021-07-13 Irobot Corporation Vacuum brush

Similar Documents

Publication Publication Date Title
JPS59112311A (en) Guiding method of unmanned moving body
CN105446344A (en) Mobile robot homing charge and payment system
CN105487543A (en) Movable robot homing and charging system
CN111090284A (en) Method for returning from traveling equipment to base station and self-traveling equipment
CN111476166A (en) Self-service charging alignment method for ground inspection robot
JPH0614408A (en) Charger for electric motor vehicle
JPH0344322B2 (en)
JP2001005525A (en) Unmanned carriage system
JPS63134912A (en) Detecting method for position of moving robot
JPS622843Y2 (en)
JPS62272307A (en) Guide position correcting device for unattended moving body
JPH07329743A (en) Runaway detector of unmanned transport vehicle
JPS59116813A (en) Carrier vehicle
JPH06149369A (en) Stop position detector for moving vehicle
JPS60239810A (en) Control equipment of travelling of moving car
Gardel et al. Detection and tracking vehicles using a zoom camera over a pan-and-tilt unit
JPS59212916A (en) Guiding method of unmanned traveling object
JPS6081609A (en) Supporting device of travelling course of omnidirectional movable truck
JP2023152070A (en) Control system for automatic traveling vehicle
JP2564127B2 (en) Unmanned vehicles that can make detours
JPH04303212A (en) Position detector for unmanned carrier
JPH04137015A (en) Automatic travelling carriage guiding device using itv
JPS61196306A (en) Guide controller for traveling object
JP2665559B2 (en) Moving vehicle stop state detection device
JPH01300311A (en) Unmanned vehicle guiding device