JPH07222505A - Method for controlling travel of self-traveling working vehicle - Google Patents

Method for controlling travel of self-traveling working vehicle

Info

Publication number
JPH07222505A
JPH07222505A (application JP6015562A)
Authority
JP
Japan
Prior art keywords
boundary
image
work
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP6015562A
Other languages
Japanese (ja)
Inventor
Takayuki Sogawa
能之 十川
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Subaru Corp
Original Assignee
Fuji Heavy Industries Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fuji Heavy Industries Ltd filed Critical Fuji Heavy Industries Ltd
Priority to JP6015562A priority Critical patent/JPH07222505A/en
Publication of JPH07222505A publication Critical patent/JPH07222505A/en
Pending legal-status Critical Current

Abstract

PURPOSE: To obtain highly reliable boundary information by removing the influence of non-work objects in a captured image when the boundary is detected from that image in order to control the travel of a self-traveling working vehicle along the boundary between a work target region and a non-work target region.

CONSTITUTION: A green image is subjected to two-dimensional moving-average processing to remove noise and singular points, and the amount of intensity change in the horizontal-axis direction is then computed to obtain a differential-value image. From the R, G, and B images, pixels showing non-turf objects, namely pixels whose color intensity ratios fall outside a preset range or whose absolute green intensity falls within a preset range, are extracted as mask information. The differential values of the pixels showing non-turf objects, and of their peripheral pixels, are rewritten to zero, producing a binarized image from which the non-turf data have been removed from the green differential-value image. The gradient and intercept of a straight line approximating the work boundary are then determined by the Hough transformation of the binarized image. The influence of non-work objects in the captured image is thereby removed, and highly reliable boundary information is obtained.

Description

Detailed Description of the Invention

[0001]

BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention relates to a traveling control method for an autonomous traveling work vehicle that processes a captured image of the vehicle's surroundings to detect the boundary between a work target area and a non-work target area, and that controls the steering system so that the vehicle travels autonomously along the detected boundary.

[0002]

2. Description of the Related Art
Conventionally, for unmanned autonomous vehicles, a self-localization technique has been proposed in which an electric wire is buried underground and the magnetic field generated by the wire is detected with a magnetic sensor. However, for an autonomous traveling work vehicle that performs unmanned grass and lawn mowing in fields such as golf courses, riverbank embankments, and parks, the autonomous travel area is vast, so burying wires under the entire area is difficult and the installation cost is high.

[0003] For this reason, many such autonomous traveling work vehicles detect the boundary between worked and unworked ground in the field using a non-contact sensor such as an optical sensor, or a mechanical contact sensor, and perform contour-following travel along the detected boundary. More recently, as disclosed in Japanese Patent Laid-Open No. 61-139304, a technique has been proposed in which an on-board camera images the surrounding area including the work target and the boundary between the work target area and the non-work target area is detected from the amount of brightness change in the captured image.

[0004]

[Problems to be Solved by the Invention] However, when performing contour-following travel along the boundary, the technique that detects the boundary between the work target area and the non-work target area from the amount of brightness change in the captured image also picks up the brightness changes of objects other than the work target that appear in the image, so the position of the boundary may be misrecognized and contour-following travel may become difficult.

[0005] The present invention has been made in view of the above circumstances, and its object is to provide a traveling control method for an autonomous traveling work vehicle that, when detecting the boundary from a captured image in order to perform travel control along the boundary between the work target area and the non-work target area, removes the influence of non-work objects in the captured image and obtains highly reliable boundary information.

[0006]

[Means for Solving the Problems] A first aspect of the present invention is a traveling control method for an autonomous traveling work vehicle that detects the boundary between a work target area and a non-work target area from a captured image of the vehicle's surroundings and controls the steering system so that the vehicle travels autonomously along the detected boundary, wherein, when the amount of brightness change is calculated from the captured image to generate a binarized image, the binarized image is generated with the amount of brightness change of pixels that do not satisfy a color condition for identifying the work target, and of their peripheral pixels, set to zero, and a straight line approximating the boundary is calculated from the binarized image.

[0007] A second aspect of the present invention is characterized in that, in the first aspect, the color condition for identifying the work target is set from an image captured in response to a specific operation input.

[0008]

[Operation] In the first aspect of the invention, a binarized image is generated from an image of the vehicle's surroundings with the amount of brightness change of pixels that do not satisfy the color condition for identifying the work target, and of their peripheral pixels, set to zero, and a straight line approximating the boundary between the work target area and the non-work target area is calculated from the generated binarized image.

[0009] In the second aspect of the invention, the color condition for identifying the work target is set from an image captured in response to a specific operation input, and the binarized image is generated with the amount of brightness change of pixels that do not satisfy this condition, and of their peripheral pixels, set to zero.

[0010]

[Embodiments] Embodiments of the present invention will be described below with reference to the drawings, which relate to one embodiment of the invention. FIG. 1 is a flowchart of the boundary detection routine; FIG. 2 is a flowchart of the traveling control routine; FIG. 3 is a schematic explanatory view showing the appearance of a lawn-mowing work vehicle; FIG. 4 is a basic configuration diagram of the steering control system; FIG. 5 is a circuit block diagram of the boundary detection unit; FIG. 6 is an explanatory view showing an original image of the work area; FIG. 7 is an explanatory view showing the averaged image; FIG. 8 is an explanatory view showing the differential-value image; FIG. 9 is an explanatory view showing the mask information; FIG. 10 is an explanatory view showing the differential-value image after masking; FIG. 11 is an explanatory view showing the histograms for slice level calculation; FIG. 12 is an explanatory view showing the binarized images; FIGS. 13 and 14 are explanatory views showing two-dimensional histograms for the Hough transform; FIG. 15 is an explanatory view showing the straight-line extraction results; and FIG. 16 is an explanatory view showing the recognition result of the boundary approximation line.

[0011] In FIG. 3, reference numeral 1 denotes an autonomous traveling work vehicle capable of unmanned autonomous travel; in this embodiment it is a lawn-mowing work vehicle for grass and lawn mowing on a golf course or the like. This lawn-mowing work vehicle 1 is provided, at the lower part of the vehicle body, with a cutting blade mechanism 2 such as a mower for grass and lawn mowing. As imaging means for imaging the work area, a CCD camera 3 for color imaging, for example a solid-state image sensor (CCD) with a color mosaic filter arranged on its imaging surface, is provided on the front side of the vehicle, and a dead-reckoning sensor consisting of a geomagnetic sensor 4 and a wheel encoder 5 is further provided for calculating the travel history.

[0012] The lawn-mowing work vehicle 1 is engine-driven and, as shown in FIG. 4, its front wheel steering mechanism 9a and rear wheel steering mechanism 9b are driven independently by a front wheel hydraulic cylinder 8a and a rear wheel hydraulic cylinder 8b, respectively. A hydraulic pump 6 driven by the engine 1a is connected to the hydraulic cylinders 8a and 8b via a front wheel steering hydraulic control valve 7a and a rear wheel steering hydraulic control valve 7b, respectively, and these are controlled by a control device 20.

[0013] The control device 20 comprises a boundary detection unit 30, to which the CCD camera 3 and a condition setting switch 70 described later are connected, and a traveling control unit 40, to which the front wheel steering hydraulic control valve 7a and the rear wheel steering hydraulic control valve 7b are connected and which receives signals from the steering angle sensors 10a and 10b mounted on the front wheel steering mechanism 9a and the rear wheel steering mechanism 9b, the geomagnetic sensor 4, the wheel encoder 5, and so on. Each unit is built around a microcomputer.

[0014] The boundary detection unit 30 detects, from a captured image of the vehicle's surroundings, the boundary between the work target area and the non-work target area (hereinafter referred to as the work boundary), for example the boundary between a fairway and the rough on a golf course, or the boundary between an already-mowed area and a not-yet-mowed area during grass and lawn mowing. The traveling control unit 40 controls the steering system so that the vehicle performs contour-following travel along the work boundary detected by the boundary detection unit 30.

[0015] A concrete circuit configuration of the boundary detection unit 30 is shown in FIG. 5. It comprises a microcomputer with a CPU 50, a RAM 51 for holding work data, a ROM 52 storing fixed control data and the control program, and an input/output (I/O) interface 53 to which the condition setting switch 70 described later is connected. A video memory 67, consisting of three frame memories corresponding to the three primary colors of light, R (red), G (green), and B (blue), each of, for example, 512 × 512 pixels per frame, is connected to the microcomputer's data bus 54 and address bus 55 via a switching circuit 66.

[0016] AD converters 59, 60, and 61 and an address control circuit 63 are also connected to the video memory 67 via the switching circuit 66. Amplifiers 56, 57, and 58, which amplify the R, G, and B signals from the CCD camera 3, are connected to the AD converters 59, 60, and 61, respectively, and a synchronizing circuit 62, which generates timing signals from the synchronizing signal of the CCD camera 3, is connected to the address control circuit 63.

[0017] The AD converters 59, 60, and 61 convert the R, G, and B signals amplified by the amplifiers 56, 57, and 58 into digital data at a sample rate of, for example, about 8 MHz, in synchronization with the timing signal from the synchronizing circuit 62, and output the data to the switching circuit 66 via a data bus 64. The address control circuit 63 generates address data in synchronization with the timing signal from the synchronizing circuit 62 and supplies it to the switching circuit 66 via an address bus 65.

[0018] The switching circuit 66 selectively connects either the data bus 54 and address bus 55 on the CPU 50 side, or the data bus 64 and address bus 65 on the AD converter 59, 60, 61 side, to the video memory 67. While address data is being supplied from the address control circuit 63 to the switching circuit 66, the data bus 64 on the AD converter side is connected to the video memory 67 and image data is written; during this time, access to the video memory 67 by the CPU 50 is prohibited.

[0019] When the supply of the R, G, and B signals from the CCD camera 3 stops and the CPU 50 is able to access the video memory 67, the CPU 50 reads the image data from the video memory 67, processes the read image data to obtain the slope and intercept of a straight line approximating the work boundary, and then outputs them to the traveling control unit 40 via the I/O interface 53.

[0020] Work boundary detection by the boundary detection unit 30, and then contour-following control by the traveling control unit 40, will be described below.

[0021] First, in the boundary detection routine of FIG. 1 executed by the boundary detection unit 30, image data is input from the video memory 67 in step S101, and then processing of the green image and processing of the R, G, and B color images are executed in parallel.

[0022] That is, in step S102, two-dimensional moving-average processing for removing noise and singular points is performed on the green image, and then, in step S103, the amount of intensity (brightness) change in the horizontal-axis direction is calculated in order to extract brightness change points (boundaries) on the screen. In parallel with these steps, in step S102', mask information for removing non-turf objects is calculated from the intensities of the R, G, and B colors.

[0023] The two-dimensional moving-average processing in step S102 is performed, for example on the original image of FIG. 6 showing the boundary between a fairway and the rough, by computing the average intensity of 20 × 20 pixel blocks while shifting the block by 4 pixels at a time in the X and Y directions, yielding the green averaged image shown in FIG. 7. Then, in step S103, the amount of intensity change in the horizontal-axis direction is calculated for this averaged image, yielding the differential-value image shown in FIG. 8.
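The block averaging of step S102 and the horizontal differentiation of step S103 can be sketched as follows. This is a minimal illustration rather than the patent's implementation: the 20 × 20 window, the 4-pixel shift, and the horizontal intensity difference come from the text, while the NumPy formulation and all function names are assumptions.

```python
import numpy as np

def moving_average(green, win=20, stride=4):
    """Step S102: mean intensity of win x win blocks, shifted by
    `stride` pixels at a time in the X and Y directions."""
    h, w = green.shape
    rows = (h - win) // stride + 1
    cols = (w - win) // stride + 1
    out = np.empty((rows, cols), dtype=np.float32)
    for i in range(rows):
        for j in range(cols):
            y, x = i * stride, j * stride
            out[i, j] = green[y:y + win, x:x + win].mean()
    return out

def horizontal_gradient(avg):
    """Step S103: amount of intensity change along the horizontal axis,
    yielding the differential-value image."""
    diff = np.zeros_like(avg)
    diff[:, 1:] = np.abs(avg[:, 1:] - avg[:, :-1])
    return diff
```

Applied to the 512 × 512 green frame, these two steps would produce images corresponding to FIG. 7 and FIG. 8.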

[0024] WD in FIG. 6 denotes a window; in the processing described below, a straight line approximating the work boundary is obtained within this window.

[0025] The mask information in step S102' is calculated by extracting pixels whose color intensity ratios fall outside a preset range, or whose absolute green intensity falls within a preset range; this identifies the pixels of the green image in which non-turf objects appear.

[0026] For example, letting the green, red, and blue intensities be Gi, Ri, and Bi, respectively, pixels satisfying any of conditions (1), (2), and (3) below are judged not to be turf, and the extracted pixels are applied to the green image as mask information. As shown in FIG. 9, these are: pixels under condition (1), shown stippled, where the red intensity Ri is stronger than the green intensity Gi; pixels under condition (2), shown hatched, where the green intensity Gi is smaller than a set value relative to the blue intensity Bi; and pixels under condition (3), such as the grove of trees, where the absolute value of the green intensity Gi is smaller than a set range.

[0027]
Gi/Ri < 1.0 … (1)
Gi/Bi < 1.15 … (2)
Gi < 30% of full scale … (3)

These conditions can be set from the average values of R, G, and B in the image data captured at the moment the externally provided condition setting switch 70 is turned ON with the CCD camera 3 pointed at the turf to be worked in the field. By adapting the conditions to the actual field situation, for example the difference in the green intensity of turf between summer and winter, non-turf objects can be reliably excluded.
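A sketch of this masking rule is given below, using the three conditions above; the thresholds (1.0, 1.15, 30% of full scale) come from the text, while the function name, the NumPy formulation, and the 8-bit full scale are illustrative assumptions.

```python
import numpy as np

def non_turf_mask(r, g, b, full_scale=255.0):
    """Step S102': True where a pixel is judged NOT to be turf.
    Conditions (1)-(3) from the text; eps guards against division by zero.
    r, g, b are float arrays of equal shape."""
    eps = 1e-6
    cond1 = g / (r + eps) < 1.0        # (1) red stronger than green
    cond2 = g / (b + eps) < 1.15       # (2) green too weak relative to blue
    cond3 = g < 0.30 * full_scale      # (3) absolute green intensity too low
    return cond1 | cond2 | cond3
```

In practice the thresholds themselves would be recomputed from the R, G, and B averages captured when the condition setting switch 70 is turned ON, as described above.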

[0028] In this embodiment, since the work target is turf, binarization is performed on the green image, but binarization may be performed on another image. For example, if a camera that outputs an NTSC composite color video signal is used as the imaging means instead of the CCD camera 3 that outputs separate R, G, and B signals, the mask information may be calculated from the color information obtained from the composite color video signal and applied to the color image to obtain the binarized image. Furthermore, depending on the color characteristics of the work target, it is also possible to use two cameras: a color camera for obtaining the mask information and a monochrome camera for obtaining the binarized image.

[0029] Subsequently, in step S104, based on the mask information, the differential values of the pixels in which non-turf objects appear, and of their peripheral pixels, are rewritten to zero, thereby excluding non-turf data from the green differential-value image and yielding the differential-value image shown in FIG. 10. The routine then proceeds to step S105, where the slice level for binarizing this masked differential-value image is calculated. As shown in FIG. 11(a), this slice level is calculated from a histogram with the intensity differential on the horizontal axis and the frequency on the vertical axis; it is set, for example, to about 50% of the average of the 10 largest intensity-change values within the set screen area.
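Steps S104 through S106 might look like the following sketch, reusing the arrays from the earlier sketches. The radius used to zero out "peripheral pixels", and the assumption that the mask has the same resolution as the differential-value image, are not specified in the text.

```python
import numpy as np

def apply_mask(diff, mask, radius=1):
    """Step S104: rewrite to zero the differential values of non-turf
    pixels and of their peripheral pixels (box dilation, radius assumed)."""
    out = diff.copy()
    for y, x in zip(*np.nonzero(mask)):
        out[max(0, y - radius):y + radius + 1,
            max(0, x - radius):x + radius + 1] = 0.0
    return out

def slice_level(diff, top_n=10):
    """Step S105: about 50% of the average of the top-10 intensity changes."""
    top = np.sort(diff, axis=None)[-top_n:]
    return 0.5 * top.mean()

def binarize(diff, level):
    """Step S106: extract the pixels exceeding the slice level."""
    return diff > level
```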

[0030] Thereafter, the routine proceeds to step S106, where the pixels exceeding the slice level set in step S105 are extracted from the differential-value image generated in step S103 to produce the binarized image shown in FIG. 12(a), and the routine proceeds to step S107.

[0031] In this case, if the image is instead binarized using the slice-level calculation histogram of FIG. 11(b), obtained from the differential-value image of FIG. 8 in which non-turf data has not been excluded, then, as shown in FIG. 12(b), even the grove of trees and the person appearing at the edge of the screen, which are outside the work target, are erroneously binarized.

[0032] By contrast, FIG. 12(a) shows the binarized image obtained from the differential-value image of FIG. 10 using the slice-level calculation histogram of FIG. 11(a). In FIG. 10, based on the mask information, the person, the tree trunks, and the non-turf gaps between leaves have been excluded, and the leaves of the trees themselves have also been excluded as peripheral pixels of non-turf pixels, leaving only the grass and turf portions; it can be seen that only the grass and turf that is the work target is appropriately binarized.

[0033] Next, in step S107, a two-dimensional histogram as shown in FIG. 13 is created from the binarized image obtained in step S106, and a Hough transform is performed using this histogram to obtain the slope and intercept of a straight line approximating the work boundary. In an X-Y coordinate system whose origin O is at the bottom center of the window WD, with the X axis running horizontally and the Y axis vertically across the screen, the straight line connecting each pair of pixels contained in the window WD is expressed as x = ay + b, and the slope tan⁻¹a and the X-axis intercept b are computed for every combination of two pixels. The histogram is then built with the slope tan⁻¹a on the horizontal axis and the intercept b on the vertical axis; in the figure, the stippled portion labeled Rh has the highest frequency, followed by the cross-hatched portion labeled Gh and then the hatched portion labeled Bh, and the frequency is zero elsewhere.
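The pairwise voting into the (slope, intercept) histogram, and the peak extraction described in the following paragraph, might be sketched as below. The x = ay + b parameterization is from the text, while the bin widths and the brute-force pair enumeration (quadratic in the number of boundary pixels) are assumptions.

```python
import numpy as np
from itertools import combinations

def fit_boundary_line(binary):
    """Step S107: vote the line x = a*y + b through every pair of
    binarized pixels into a 2-D histogram; the peak gives the work
    boundary's slope and X-axis intercept."""
    ys, xs = np.nonzero(binary)
    votes = {}
    for (y1, x1), (y2, x2) in combinations(zip(ys, xs), 2):
        if y1 == y2:
            continue                        # line parallel to the X axis
        a = (x2 - x1) / (y2 - y1)           # slope of x = a*y + b
        b = x1 - a * y1                     # X-axis intercept
        key = (round(a, 2), round(b))       # assumed bin widths
        votes[key] = votes.get(key, 0) + 1
    return max(votes, key=votes.get)        # (a, b) with the highest frequency
```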

[0034] By extracting the peak points from the created histogram, a plurality of straight lines are obtained as shown in FIG. 15(a). When the line with the highest frequency (strongest correlation) among these is extracted, the boundary approximation line L indicating the boundary between the fairway and the rough, shown by the broken line in FIG. 16, is obtained, and the boundary between the fairway and the rough can be accurately recognized.

[0035] On the other hand, the two-dimensional histogram obtained from a binarized image generated without excluding the data of non-turf objects is shown in FIG. 14. With the straight line extracted from this histogram, as shown in FIG. 15(b), the boundary between the fairway and the rough would lie inside the forest at the edge of the screen, which is clearly a misrecognition.

[0036] Thereafter, when the slope and intercept of the straight line obtained in step S107 are output to the traveling control unit 40 in step S108, the routine returns to step S101, a new image is input, and the above processing is repeated.

[0037] When a work boundary at which the height of the grass or turf differs, such as the boundary between the fairway and the rough or the boundary between a mowed area and an unmowed area, is detected by the boundary detection unit 30 in this way, the traveling control unit 40 controls contour-following travel along the work boundary by means of the traveling control routine of FIG. 2.

[0038] This traveling control routine is the basic routine for traveling according to prestored travel route information, such as a pattern of mowing while reciprocating stroke by stroke or a pattern of mowing in loops, and the travel history calculated from the signals of the geomagnetic sensor 4 and the wheel encoder 5. In step S201, the boundary approximation line L on the screen detected by the boundary detection unit 30 is input; in step S202, this line L is compared with a preset target line L0 (y = px + q) on the screen to obtain the deviation amount C of the vehicle body of the lawn-mowing work vehicle 1 from the target line L0 and the deviation angle θ of the traveling direction with respect to the target line L0.
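A sketch of the step S202 comparison follows. The detected line is taken in the x = ay + b form output by the boundary detection routine, and the target line as y = px + q per the text; evaluating the lateral offset at the bottom edge of the window (y = 0) and measuring angles from the vertical axis are assumptions.

```python
import math

def line_deviation(a, b, p, q):
    """Step S202: deviation amount C and deviation angle theta between
    the detected line x = a*y + b and the target line y = p*x + q."""
    x_detected = b                          # detected line at y = 0
    x_target = -q / p if p != 0 else 0.0    # target line crosses y = 0 here
    C = x_detected - x_target               # lateral deviation amount
    # angles measured from the vertical (Y) axis
    theta = math.atan(a) - (math.atan(1.0 / p) if p != 0 else 0.0)
    return C, theta
```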

[0039] Next, the routine proceeds to step S203, where the front wheel target steering angle and rear wheel target steering angle that hold the deviation amount C and the deviation angle θ constant are calculated. In step S204, the signals from the front and rear wheel steering angle sensors 10a and 10b are input to obtain the front wheel steering angle and the rear wheel steering angle, and in step S205 the front wheel steering angle is compared with the front wheel target steering angle.

[0040] As a result of the comparison in step S205, if the front wheel steering angle ≥ the front wheel target steering angle, the routine proceeds to step S206 and the front wheel steering hydraulic control valve 7a is turned OFF; if the front wheel steering angle < the front wheel target steering angle, the routine branches to step S207 and the front wheel steering hydraulic control valve 7a is turned ON. The routine then proceeds to step S208.

[0041] In step S208, the rear wheel steering angle is compared with the rear wheel target steering angle. If the rear wheel steering angle ≥ the rear wheel target steering angle, the routine proceeds to step S209 and the rear wheel steering hydraulic control valve 7b is turned OFF; if the rear wheel steering angle < the rear wheel target steering angle, the routine branches to step S210 and the rear wheel steering hydraulic control valve 7b is turned ON.

[0042] Thereafter, the routine proceeds to step S211, where it is determined whether or not a preset time (for example, a control interval of several seconds) has elapsed. If the control interval has not yet elapsed, the routine returns to step S204, detects the front and rear wheel steering angles from the signals of the steering angle sensors 10a and 10b, and repeats the same process so that the front and rear wheel steering angles converge to the target steering angles. If the control interval has elapsed, the routine returns to step S201 and the front and rear wheel target steering angles are corrected by inputting the boundary approximation line equation again.
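Steps S204 through S211 amount to ON/OFF (bang-bang) valve control nested inside a slower boundary-update loop. The sketch below assumes hypothetical read_angle and set_valve interfaces and timing constants; none of these names appear in the patent.

```python
import time

def steering_interval(target_front, target_rear, read_angle, set_valve,
                      interval=2.0, period=0.05):
    """Steps S204-S211: drive the front and rear steering angles toward
    their targets by switching the hydraulic valves ON/OFF for one
    control interval, then return so the caller re-enters step S201."""
    start = time.time()
    while time.time() - start < interval:                    # step S211
        front, rear = read_angle("front"), read_angle("rear")  # step S204
        set_valve("front", front < target_front)             # steps S205-S207
        set_valve("rear", rear < target_rear)                # steps S208-S210
        time.sleep(period)
```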

[0043]

[Effects of the Invention] As described above, according to the present invention, when a binarized image is generated by calculating the amount of brightness change from an image of the vehicle's surroundings, the binarized image is generated with the amount of brightness change of pixels that do not satisfy the color condition for identifying the work target, and of their peripheral pixels, set to zero, and a straight line approximating the boundary between the work target area and the non-work target area is calculated from this binarized image. The influence of non-work objects in the captured image is therefore removed, and highly reliable boundary information can be obtained. Moreover, by setting the color condition for identifying the work target from an image captured in response to a specific operation input, the condition can be adapted to the actual field situation, and excellent effects are obtained, such as the influence of non-work objects being removed still more reliably.

[Brief Description of the Drawings]

FIG. 1 is a flowchart of the boundary detection routine.

FIG. 2 is a flowchart of the traveling control routine.

FIG. 3 is a schematic explanatory view showing the appearance of the lawn-mowing work vehicle.

FIG. 4 is a basic configuration diagram of the steering control system.

FIG. 5 is a circuit block diagram of the boundary detection unit.

FIG. 6 is an explanatory view showing an original image of the work area.

FIG. 7 is an explanatory view showing the averaged image.

FIG. 8 is an explanatory view showing the differential-value image.

FIG. 9 is an explanatory view showing the mask information.

FIG. 10 is an explanatory view showing the differential-value image after masking.

FIG. 11 is an explanatory view showing the histograms for slice level calculation.

FIG. 12 is an explanatory view showing the binarized images.

FIG. 13 is an explanatory view showing a two-dimensional histogram for the Hough transform.

FIG. 14 is an explanatory view showing a two-dimensional histogram for the Hough transform.

FIG. 15 is an explanatory view showing the straight-line extraction results.

FIG. 16 is an explanatory view showing the recognition result of the boundary approximation line.

[Explanation of Symbols]

1 Lawn-mowing work vehicle (autonomous traveling work vehicle)
3 CCD camera (imaging means)

Continuation of the front page: (51) Int. Cl.6 G06T 7/00

Claims (2)

[Claims]

1. A traveling control method for an autonomous traveling work vehicle that detects a boundary between a work target area and a non-work target area from a captured image of the vehicle's surroundings and controls a steering system so that the vehicle travels autonomously along the detected boundary, wherein, when an amount of brightness change is calculated from the captured image to generate a binarized image, the binarized image is generated with the amount of brightness change of pixels that do not satisfy a color condition for identifying the work target, and of their peripheral pixels, set to zero, and a straight line approximating the boundary is calculated from the binarized image.

2. The traveling control method for an autonomous traveling work vehicle according to claim 1, wherein the color condition for identifying the work target is set from an image captured in response to a specific operation input.
JP6015562A 1994-02-09 1994-02-09 Method for controlling travel of self-traveling working vehicle Pending JPH07222505A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP6015562A JPH07222505A (en) 1994-02-09 1994-02-09 Method for controlling travel of self-traveling working vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP6015562A JPH07222505A (en) 1994-02-09 1994-02-09 Method for controlling travel of self-traveling working vehicle

Publications (1)

Publication Number Publication Date
JPH07222505A true JPH07222505A (en) 1995-08-22

Family

ID=11892200

Family Applications (1)

Application Number Title Priority Date Filing Date
JP6015562A Pending JPH07222505A (en) 1994-02-09 1994-02-09 Method for controlling travel of self-traveling working vehicle

Country Status (1)

Country Link
JP (1) JPH07222505A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09133762A (en) * 1995-11-10 1997-05-20 Nec Corp Target detection device
CN113822094A (en) * 2020-06-02 2021-12-21 苏州科瓴精密机械科技有限公司 Method, system, robot and storage medium for identifying working position based on image
CN113822094B (en) * 2020-06-02 2024-01-16 苏州科瓴精密机械科技有限公司 Method, system, robot and storage medium for identifying working position based on image

Similar Documents

Publication Publication Date Title
CN110243372B (en) Intelligent agricultural machinery navigation system and method based on machine vision
US5706355A (en) Method of analyzing sequences of road images, device for implementing it and its application to detecting obstacles
JP4930046B2 (en) Road surface discrimination method and road surface discrimination device
CN113761970B (en) Method, system, robot and storage medium for identifying working position based on image
US20090010482A1 (en) Diagrammatizing Apparatus
CN113822095B (en) Method, system, robot and storage medium for identifying working position based on image
CN113822094B (en) Method, system, robot and storage medium for identifying working position based on image
JP3585948B2 (en) Travel control method for autonomous traveling work vehicle
JPH07222505A (en) Method for controlling travel of self-traveling working vehicle
JP3757502B2 (en) Moving object travel control device
JP2000099896A (en) Traveling path detecting device and vehicle traveling controller and recording medium
JP2815760B2 (en) Crop row detector
JP3502652B2 (en) Travel control method for autonomous traveling work vehicle
JP3488279B2 (en) Travel control method for autonomous traveling work vehicle
JPH0628032A (en) Traveling control device for automatic traveling working vehicle
JP2520104B2 (en) Boundary detection device for autonomous vehicles
JPH096949A (en) Method for discriminating and removing weeds
JP2585471B2 (en) Boundary detection device for autonomous vehicles
JP3020734B2 (en) Boundary detection device for autonomous vehicles
CN115147714A (en) Method and system for identifying non-working area based on image
CN115147713A (en) Method, system, device and medium for identifying non-working area based on image
JPS63293402A (en) Image pickup type border detecting device
JPH05165519A (en) Crop string detector
JP2624390B2 (en) Crop row detector
JPH05265546A (en) Crop string detecting device