JPS62122507A - Steering control apparatus of self-running working machine - Google Patents

Steering control apparatus of self-running working machine

Info

Publication number
JPS62122507A
JPS62122507A JP61229124A JP22912486A
Authority
JP
Japan
Prior art keywords
boundary
differential value
vehicle
image information
brightness
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP61229124A
Other languages
Japanese (ja)
Other versions
JPH0646886B2 (en)
Inventor
正彦 林 (Masahiko Hayashi)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Kubota Corp
Original Assignee
Kubota Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Kubota Corp filed Critical Kubota Corp
Priority to JP61229124A priority Critical patent/JPH0646886B2/en
Publication of JPS62122507A publication Critical patent/JPS62122507A/en
Publication of JPH0646886B2 publication Critical patent/JPH0646886B2/en
Anticipated expiration legal-status Critical
Expired - Lifetime legal-status Critical Current

Abstract

(57) [Abstract] This publication contains application data filed before electronic filing, so no abstract data is recorded.

Description

[Detailed Description of the Invention]

[Industrial Field of Application]
The present invention relates to a steering control device for an automatic traveling work vehicle, comprising: boundary detection means that images the working-ground condition over a predetermined range ahead of the vehicle body in its direction of travel by means of an imaging device, and detects the boundary between untreated working ground and treated working ground by binarizing the captured image information on the basis of its average brightness difference; and control means that, on the basis of the boundary detection result of this boundary detection means, steers the vehicle body so that it automatically travels along the boundary.

[Prior Art]
In conventional automatic traveling work vehicles of this type, the boundary detection means was constructed using a photo-interrupter type or light-reflection type copying sensor, a contact sensor, or the like, and detected the boundary position that the vehicle body should follow by sensing whether the working-ground condition at one specific point, such as the point immediately ahead in the direction of travel or the current vehicle position, was untreated or treated working ground.

However, since the conventional boundary detection means described above detects only the deviation from the boundary at a single point on the working ground, and that detection result was used directly as a steering control parameter, the following disadvantages arose.

That is, when the working ground is in poor condition and the boundary is discontinuous, the boundary to be followed is easily lost, so processing such as averaging the boundary detection signal over a predetermined travel section must be performed to prevent malfunction; as a result, the control response becomes slow.

Moreover, since only the deviation from the boundary at the point actually being traveled, or at a point immediately ahead of it, can be detected, unnecessary steering control is performed in an attempt to follow a discontinuous detected boundary even in cases where the actual boundary runs straight and simply driving straight ahead would suffice; as a result, the vehicle meanders widely or the control hunts, and straight-running performance deteriorates.

Furthermore, there was also the structural drawback that the state of the boundary, such as its direction, could not be detected until after the vehicle had actually traveled along it.

Therefore, the present applicant has previously proposed, as boundary detection means capable of detecting even the shape of the boundary that the vehicle body should follow without actually traveling along it, a means for detecting that boundary as a continuous line by binarizing image information obtained by imaging a predetermined range of the working ground ahead of the vehicle body in its direction of travel, on the basis of its average brightness difference (Japanese Patent Application No. Sho 59-211637).

[Problems to Be Solved by the Invention]
In boundary detection means that detect the boundary by binarizing the image information, the binarization is performed on the basis of the brightness difference, so unless the brightness difference between the untreated and treated working ground is distinct, false detections become frequent. Therefore, in order to make this brightness difference effectively larger, processing was performed such as ignoring the brightness changes of the image-information component along the vehicle traveling direction, or averaging the image to remove fine brightness differences in advance; the brightness change between adjacent pixels of the image information was then differentiated, and the portions where the differential value is large were judged to be the boundary.
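As a rough illustration, the adjacent-pixel differentiation and large-differential-value test described here can be sketched in Python over a single scan line of brightness values (the function name, sample values, and threshold are illustrative assumptions, not taken from the patent):

```python
def binarize_by_gradient(row, threshold):
    """Mark the pixels where the brightness change between adjacent pixels
    (a simple first difference standing in for the differential value)
    is large in either direction."""
    out = [0] * len(row)
    for i in range(1, len(row)):
        diff = row[i] - row[i - 1]
        if abs(diff) >= threshold:   # large change of either sign -> boundary candidate
            out[i] = 1
    return out

# One scan line: dark unmown grass (20) on the left, bright mown ground (80)
# on the right, plus a bright patch (75) inside the unmown region whose two
# edges also pass the threshold and mimic the boundary.
row = [20, 20, 75, 20, 20, 80, 80, 80]
print(binarize_by_gradient(row, 30))  # → [0, 0, 1, 1, 0, 1, 0, 0]
```

Note that both edges of the bright patch survive this magnitude-only test, which is exactly the kind of spurious boundary information the following paragraphs address.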

Consequently, if there is, for example, a dark-looking part within the treated working ground, or a bright-looking part within the untreated working ground, the brightness change at that part becomes large and yields information resembling that of the boundary; in particular, when such boundary-like information appears continuously in the same direction as the vehicle traveling direction, it can no longer be distinguished from the true boundary.

Incidentally, since the automatic traveling work vehicle described above travels while performing work, it travels on untreated working ground adjacent to treated working ground. That is, treated working ground lies on either the left or the right outer side of the vehicle body according to its direction of travel, while untreated working ground lies ahead of the working implement.

The present invention has been made in view of the above circumstances, and its object is to detect the boundary condition accurately by making effective use of the relationship between the vehicle traveling direction and the working-ground condition, removing in advance from the image information of the working ground the information that is difficult to distinguish from the boundary.

[Means for Solving the Problems]
To achieve the above object, the steering control device for an automatic traveling work vehicle according to the present invention is characterized in that, for binarizing the captured image information on the basis of its average brightness difference, it comprises: means for computing the differential value of the brightness change of the image information; means for discriminating the positive or negative sign of that differential value; means for detecting the vehicle traveling direction; and means for binarizing the image information, on the basis of the detection result of the traveling-direction detection means, according to whether the change in the differential value having only one of the signs discriminated by the differential-value sign discrimination means is equal to or greater than a predetermined value. Its operation and effects are as follows.

[Operation]
That is, unnecessary information is removed by limiting in advance, on the basis of the vehicle traveling direction and the sign (positive or negative) of the differential value of the brightness change, the direction of brightness change of the image information to be binarized. In other words, from the relationship between the vehicle traveling direction and the working-ground condition, the parts of the untreated working ground that are difficult to distinguish from treated working ground, and the parts of the treated working ground that are difficult to distinguish from untreated working ground, are removed in advance, thereby making the true boundary distinct.

[Effects of the Invention]
Owing to the above features, the following excellent effects are achieved.

That is, according to the vehicle traveling direction, only the image information whose brightness-change differential value has one of the two signs, positive or negative, is binarized, so that information in which the brightness increases from the untreated-ground side toward the treated-ground side is detected as the boundary; the noise components contained in the untreated working ground and in the treated working ground can therefore be removed in advance. Accordingly, information that is easily mistaken for the boundary and is difficult to remove after binarization can be efficiently removed from the image information beforehand, and as a result, accurate boundary information is obtained.

[Embodiments]
Embodiments of the present invention will be described below with reference to the drawings.

As shown in FIGS. 5 and 6, a lawn mowing vehicle serving as an automatic traveling work vehicle capable of automatically traveling along a boundary (L) is constructed as follows: a mower unit (4) incorporating a disc-type cutting blade is suspended, movable up and down, at the middle of a vehicle body (1) whose front wheels (2), (2) and rear wheels (3), (3) are all steerable; a copying sensor (A), of the construction described later, is provided as boundary detection means for detecting the boundary (L) between the unmown grassland (B) and the mown grassland (C) that indicates the travel course of each working pass; and steering is controlled on the basis of the boundary detection result of this copying sensor (A).

To construct the copying sensor (A), a monitor camera (5) serving as the imaging means is mounted at the tip of a sensor support frame (6) extending forward and downward from the vehicle body (1), so that its imaging field covers a predetermined range of grassland (D) centered on the target boundary (L0) along which the vehicle should travel, ahead of the vehicle body (1) in its direction of travel. The image captured by this monitor camera (5) is binarized on the basis of its average brightness difference, and the deviation amount (x) and deviation angle (θ) of the detected boundary (L) with respect to the target boundary (L0) are obtained; the steering operation is then controlled to correct the traveling direction so that the detected boundary (L) coincides with the target boundary (L0), that is, so that the detected boundary (L) lies at the center of the field of view of the monitor camera (5) in the fore-and-aft direction of the vehicle body (1).

The means for binarizing the image captured by the monitor camera (5) and detecting the boundary (L) will now be described with reference to the block diagram shown in FIG. 1, the explanatory diagrams of the image signals shown in FIG. 2, and the flowchart shown in FIG. 3.

First, the image signal (S0) captured by the monitor camera (5) is temporarily stored in a frame memory (7). The control device (8), constituted by a microcomputer, then averages the original image signal (S0) stored in the frame memory (7) (shown in FIG. 2(a)), for example by sequentially replacing the brightness of the pixel at the center of a region delimited by a predetermined number of surrounding dots with the average value of that region, thereby obtaining a blurred averaged image (S1) as shown in FIG. 2(b).

Incidentally, as shown in FIGS. 5 and 6, the field of view (D) of the monitor camera (5) looks obliquely down from above the vehicle body (1) onto the working ground ahead and below, and therefore spreads out in a fan shape ahead of the vehicle body (1). In the captured original image signal (S0), the pixel density of the portions far from the vehicle body (1) (the upper side in FIG. 2) is therefore coarser than that of the portions near the vehicle body (1) (the lower side in FIG. 2). If the image were averaged uniformly as it is, the upper part of the image would be over-averaged relative to the lower part, with the result that the lower part would remain noisy while the upper part would become unnecessarily blurred; the averaging is therefore processed so that its effective density becomes coarser from the lower part of the image toward the upper part.
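One way to sketch such position-dependent averaging is a horizontal window that simply grows linearly from the top row toward the bottom row; the linear window law and the names below are illustrative assumptions, not specified by the patent:

```python
def average_rows(image, max_half_window):
    """Horizontal averaging whose window widens toward the bottom rows
    (near the vehicle, fine pixel density) and narrows toward the top rows
    (far from the vehicle, coarse pixel density)."""
    h = len(image)
    out = []
    for r, row in enumerate(image):
        # Half-window grows linearly from 0 at the top row to
        # max_half_window at the bottom row (an assumed window law).
        k = round(max_half_window * r / (h - 1)) if h > 1 else 0
        averaged = []
        for c in range(len(row)):
            lo, hi = max(0, c - k), min(len(row), c + k + 1)
            window = row[lo:hi]
            averaged.append(sum(window) / len(window))
        out.append(averaged)
    return out
```

The top row is passed through unchanged while lower rows are smoothed over progressively wider neighborhoods, mirroring the non-uniform averaging density described above.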

Next, by binarizing the portions of the averaged image (S1) where the brightness change is large, that is, the pixel portions with a large brightness change, a binarized image (S2) is obtained in which, as shown in FIG. 2(c), only the boundary (L) portion between the unmown ground (B) and the mown ground (C) is bright and the other portions are dark. As described above, there is a fixed relationship between the traveling direction of the vehicle body (1) and the working-ground condition; in this embodiment the vehicle travels on the untreated working ground (B) with the treated working ground (C) adjacent on the right side of the vehicle body (1). Accordingly, only the information of the averaged image (S1) for which a negative (−) differential value is obtained, corresponding to a dark-to-bright brightness change from the left side toward the right side in FIG. 2, is binarized. In this way, the bright-to-dark portions whose differential value is positive (+), such as the sparse or bare patches of grass on the working ground (B) indicated by (L′) in FIG. 2(b), as well as weeds and slight uncut remnants on the treated working ground (C), are removed in advance as noise.
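The one-sided binarization described here can be sketched by keeping only first differences of a single numeric sign. Which numeric sign corresponds to the patent's "negative" differential depends on its scanning convention, so the mapping below (+1 rising, -1 falling, scanning left to right) is an assumption:

```python
def binarize_one_sign(row, threshold, sign):
    """Keep a first difference only when it has the selected sign (+1 rising,
    -1 falling when scanning left to right) and a magnitude at or above the
    threshold; changes of the opposite sign are discarded as noise before
    binarization."""
    out = [0] * len(row)
    for i in range(1, len(row)):
        diff = row[i] - row[i - 1]
        if sign * diff >= threshold:
            out[i] = 1
    return out

# A scan line with dark unmown grass (20), a bright patch (75) inside it,
# and bright mown ground (80) on the right: selecting one sign suppresses
# the patch edge whose transition runs opposite to the boundary transition.
row = [20, 20, 75, 20, 20, 80, 80, 80]
print(binarize_one_sign(row, 30, +1))  # → [0, 0, 1, 0, 0, 1, 0, 0]
print(binarize_one_sign(row, 30, -1))  # → [0, 0, 0, 1, 0, 0, 0, 0]
```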

As shown in FIG. 4, when the vehicle body (1) travels back and forth over adjacent passes, reversing direction at the end of each pass, the left-right positional relationship between the vehicle body (1) and the treated working ground (C) is reversed at every pass; accordingly, the sign of the differential value used to discriminate the brightness change is switched between positive and negative each time a turn is completed. When the treated working ground (C) adjoins the right side of the vehicle body (1) as shown in FIG. 4(a), only the information of the negative (−) differential values, where the brightness change is dark-to-bright as shown in FIG. 2(b), is binarized in the horizontal direction; when the treated working ground (C) adjoins the left side of the vehicle body (1) as shown in FIG. 4(b), only the information of the positive (+) differential values, where the brightness change is bright-to-dark, is binarized in the horizontal direction on the basis of the original image signal (S0′) shown in FIG. 2(d). Unnecessary noise components are thereby removed.
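The pass-by-pass sign switching can be sketched as a small selection rule; the +1/-1 convention (rising versus falling first differences) and the names are illustrative assumptions:

```python
def derivative_sign_for_pass(pass_index, mown_on_right_first=True):
    """In a back-and-forth pattern the mown strip swaps sides of the vehicle
    at every headland turn, so the differential sign to binarize flips on
    each pass. +1 selects rising (dark-to-bright) first differences and -1
    falling ones, under an assumed numeric convention."""
    mown_on_right = mown_on_right_first if pass_index % 2 == 0 else not mown_on_right_first
    return 1 if mown_on_right else -1

print([derivative_sign_for_pass(i) for i in range(4)])  # → [1, -1, 1, -1]
```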

Then, after the boundary (L) detected in the binarized image (S2) has been processed into a continuous line, for example an approximate straight line, the deviation amount (x) and deviation angle (θ) of the detected boundary (L) with respect to the target boundary (L0) that the vehicle body (1) should follow are computed, as shown in FIG. 2.
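A least-squares computation of the deviation amount (x) and deviation angle (θ) from the detected boundary pixels might look like the following sketch; the coordinate convention (x across the image, y along the travel direction), the sample points, and the units are assumptions:

```python
import math

def boundary_deviation(points, target_x):
    """Least-squares fit of a line x = a*y + b through detected boundary
    pixels, returning the lateral deviation of the fitted line from the
    target boundary at y = 0 and the deviation angle between the two lines
    in degrees."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    syy = sum(y * y for _, y in points)
    sxy = sum(x * y for x, y in points)
    a = (n * sxy - sx * sy) / (n * syy - sy * sy)  # slope dx/dy
    b = (sx - a * sy) / n                          # lateral position at y = 0
    deviation = b - target_x
    theta = math.degrees(math.atan(a))             # angle relative to the target line
    return deviation, theta

# Boundary pixels lying on x = 0.5*y + 12, target boundary at x = 10:
dev, theta = boundary_deviation([(12, 0), (12.5, 1), (13, 2), (13.5, 3)], 10)
print(dev, theta)  # deviation 2.0, angle about 26.6 degrees
```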

Then, on the basis of the deviation amount (x) obtained by this computation, the front and rear wheels (2), (3) are steered in the same direction so that the vehicle body (1) is shifted in parallel without changing its orientation; thereafter, on the basis of the deviation angle (θ), the front and rear wheels (2), (3) are steered in relatively opposite directions to correct the orientation of the vehicle body (1) with respect to the target boundary (L0). Control is thus performed so that the detected boundary (L) and the target boundary (L0) coincide within the field of view of the monitor camera (5), that is, so that the vehicle body (1) follows the target boundary (L0).
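The two-stage correction (parallel shift, then heading correction) can be sketched as a simple command sequencer; the unit gains, dead band, and command representation are assumptions for illustration only:

```python
def steering_commands(deviation, angle, dead_band=0.5):
    """Two-stage correction: first a same-direction command for the front
    and rear wheels (parallel shift of the body without changing its
    heading), then an opposite-direction command (rotation of the body to
    correct the heading angle). Returns (front_steer, rear_steer) tuples."""
    commands = []
    if abs(deviation) > dead_band:
        s = -deviation                # steer against the lateral offset
        commands.append((s, s))       # same direction: translate sideways
    if abs(angle) > dead_band:
        t = -angle                    # steer against the heading error
        commands.append((t, -t))      # opposite directions: rotate the body
    return commands

print(steering_commands(2.0, -5.0))  # → [(-2.0, -2.0), (5.0, -5.0)]
```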

Consequently, even if a discontinuous boundary, a locally curved boundary, or a boundary made indistinct by unevenness of the grass is detected, it can be accurately corrected as a continuous boundary (L), and steering control becomes possible without the unnecessary steering operations of the conventional art and without the traveling direction straying outside the true boundary.

In FIG. 1, (9) and (10) are hydraulic cylinders that actually steer the front wheels (2), (2) and the rear wheels (3), (3), respectively; (11) and (12) are electromagnetic valves that drive the hydraulic cylinders (9) and (10), respectively; and (13) is a motor that operates the shift position of a continuously variable transmission (14). (R1) and (R2) are potentiometers for detecting the steering amounts of the front wheels (2), (2) and the rear wheels (3), (3), respectively, and feeding them back to the control device (8), and (R3) is likewise a potentiometer for detecting the shift position of the transmission (14) and feeding it back to the control device (8).

The original image signal (S0) is averaged in order to reduce the detection error of the boundary (L) in advance, by making the system less sensitive to large brightness changes caused by light reflected from cut grass blades and the like, and to small bare or uneven patches of grass within the unmown ground (B) that are difficult to distinguish from the mown ground (C).

This averaging may also be performed, in place of the arithmetic processing described above, by using an optical blurring filter, or simply by using the camera (5) in a so-called out-of-focus state. Furthermore, when an optical filter is used, employing a so-called soft-focus filter as the blurring filter and making the image gradually more soft-focused from the upper part of the field of view toward the lower part makes it possible to correct, at the same time, the variation in image density caused by the distortion of the field of view (D) of the camera (5).

The vehicle traveling direction may be determined automatically from direction changes, or it may be set manually. Incidentally, in a circuit travel pattern in which the vehicle travels successively around the outer periphery of the untreated working ground (B) toward its inside, or in a forward-and-reverse travel pattern in which the vehicle repeatedly moves forward and backward while shifting sideways at each pass, the left-right positional relationship between the vehicle body (1) and the treated working ground (C), that is, the boundary (L), does not change, so the sign of the differential value need only be set once at the start of work.

Further, the boundary (L) can be obtained from the adjoining pixels (dots) of the binarized image (S2) by the least-squares method or the like; alternatively, instead of binarizing, the differential values of the brightness change may be processed as weighting coefficients.

Furthermore, in obtaining the boundary (L) as a continuous line, it may be computed by the least-squares method or by the Hough transform, using the image coordinates of the boundary-line portion.

[Brief Description of the Drawings]

The drawings show an embodiment of the steering control device for an automatic traveling work vehicle according to the present invention. FIG. 1 is a block diagram showing the configuration of the control system; FIGS. 2(a) to 2(d) are explanatory diagrams of the image signals; FIG. 3 is a flowchart showing the operation of the control device; FIG. 4 is an explanatory diagram of the positional relationship between the vehicle body and the boundary; FIG. 5 is a plan view showing the overall configuration of the lawn mowing vehicle; and FIG. 6 is a side view thereof.

(1) ... vehicle body; (5) ... imaging means; (A) ... boundary detection means; (B) ... untreated working ground; (C) ... treated working ground; (L) ... boundary.

Agent: Patent Attorney Osamu Kitamura

Claims (1)

[Claims]
1. A steering control device for an automatic traveling work vehicle, comprising: boundary detection means (A) for detecting the boundary (L) between untreated working ground (B) and treated working ground (C) by imaging, with imaging means (5), the working-ground condition over a predetermined range ahead of a vehicle body (1) in its direction of travel, and binarizing the captured image information on the basis of its average brightness difference; and control means for steering the vehicle body (1), on the basis of the boundary (L) detection result of the boundary detection means (A), so that the vehicle body (1) automatically travels along the boundary (L); wherein, for binarizing the captured image information on the basis of its average brightness difference, the device comprises: means for computing the differential value of the brightness change of the image information; means for discriminating the positive or negative sign of the differential value; means for detecting the traveling direction of the vehicle body (1); and means for binarizing the image information, on the basis of the detection result of the traveling-direction detection means, according to whether the change in the differential value having only one of the signs discriminated by the differential-value sign discrimination means is equal to or greater than a predetermined value.
JP61229124A 1986-09-27 1986-09-27 Steering control device for automated vehicle Expired - Lifetime JPH0646886B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP61229124A JPH0646886B2 (en) 1986-09-27 1986-09-27 Steering control device for automated vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP61229124A JPH0646886B2 (en) 1986-09-27 1986-09-27 Steering control device for automated vehicle

Publications (2)

Publication Number Publication Date
JPS62122507A true JPS62122507A (en) 1987-06-03
JPH0646886B2 JPH0646886B2 (en) 1994-06-22

Family

ID=16887124

Family Applications (1)

Application Number Title Priority Date Filing Date
JP61229124A Expired - Lifetime JPH0646886B2 (en) 1986-09-27 1986-09-27 Steering control device for automated vehicle

Country Status (1)

Country Link
JP (1) JPH0646886B2 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS6451504A (en) * 1987-08-21 1989-02-27 Iseki Agricult Mach Travel control system for farmwork machine
JPH01187015A (en) * 1988-01-22 1989-07-26 Kubota Ltd Mowing, working vehicle of automatic steering type
EP0878121B1 (en) * 1997-05-13 2003-06-04 CLAAS KGaA Harvesting machine with automatic steering


Also Published As

Publication number Publication date
JPH0646886B2 (en) 1994-06-22

Similar Documents

Publication Publication Date Title
JPS62122507A (en) Steering control apparatus of self-running working machine
JPS61139304A (en) Steering controller of self-propelling working machine
JPS6270916A (en) Boundary detecting method for self-traveling truck
JPH04126004A (en) Boundary detecting device for automatic traveling working vehicle
JPH0759407A (en) Traveling controller of automatic traveling working car
JPS6190215A (en) Automatic steering working wagon
JPH0460242B2 (en)
JPH01161403A (en) Image pickup type steering control device for automatic traveling working vehicle
JPH01231809A (en) Photographing type travel control device for automatic travel working car
JPH069011B2 (en) Travel control system for automated guided vehicles
JP2510660B2 (en) Image-capturing type traveling control device for automated guided vehicles
JPS63298103A (en) Image sensing type boundary detector
JPS63293402A (en) Image pickup type border detecting device
JPS5972523A (en) Unmanned traveling truck
JPS63277908A (en) Image pickup type border detection device
JPH0465402B2 (en)
JPH0437443B2 (en)
JPH023B2 (en)
JPH01187015A (en) Mowing, working vehicle of automatic steering type
JPS61115405A (en) Steering controller of automatic running working vehicle
JPS6261507A (en) Boundary detector for self-propelling type working vehicle
JPS6196906A (en) Automatic propelling working machine
JPH04325003A (en) Sensor for crop row
JPH0575334B2 (en)
JPH01235507A (en) Image sensing boundary detecting device