JPH09168315A - Boundary detector, display device for running state and running control device in working vehicle - Google Patents

Boundary detector, display device for running state and running control device in working vehicle

Info

Publication number
JPH09168315A
JPH09168315A JP7333296A JP33329695A
Authority
JP
Japan
Prior art keywords
image
boundary
screen
work site
work
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP7333296A
Other languages
Japanese (ja)
Inventor
Muneyuki Kawase
宗之 河瀬
Yuichi Yamazaki
祐一 山崎
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Kubota Corp
Original Assignee
Kubota Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Kubota Corp filed Critical Kubota Corp
Priority to JP7333296A priority Critical patent/JPH09168315A/en
Publication of JPH09168315A publication Critical patent/JPH09168315A/en
Pending legal-status Critical Current

Landscapes

  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Guiding Agricultural Machines (AREA)
  • Image Processing (AREA)

Abstract

PROBLEM TO BE SOLVED: To judge accurately, on the image screen, which side is untreated ground and which is treated ground, to detect the boundary between the untreated and treated ground accurately, to display the running state relative to the detected boundary on a screen, and to make the working machine run automatically along the detected boundary.
SOLUTION: A working vehicle V which runs along a boundary between untreated ground and treated ground is provided with an imaging means S1 for picking up an image of the area around the boundary. From the information in the image picked up by the imaging means S1, it is decided which side of the image on the screen, in the direction corresponding to the lateral direction of the machine body, is the untreated ground and which is the treated ground. For example, the picked-up image is divided into two parts along the screen direction corresponding to the lateral direction of the machine body, the ratio of the screen area corresponding to the treated ground to the total screen area of each divided screen is compared, and the side with the larger ratio is decided to be the treated-ground side. Based on this decision and the image information, the line segment L corresponding to the boundary is determined by an image processing means. Besides the imaging means S1 and the image processing means, the machine body is provided with an image display means for displaying the running state and a running control means for controlling automatic running.

Description

Detailed Description of the Invention

[0001]

BACKGROUND OF THE INVENTION

1. Field of the Invention

The present invention relates to a boundary detection device for a work vehicle that travels along the boundary between an unprocessed work site and a processed work site, the vehicle being provided with imaging means for capturing an image of the boundary area and image processing means for obtaining a line segment corresponding to the boundary based on the captured image information and on work-site identification information specifying which side of the captured image, in the screen direction along the lateral direction of the machine body, is the unprocessed work site and which is the processed work site. The invention also relates to a traveling-state display device for displaying the traveling state relative to the boundary detected by this boundary detection device, and to a travel control device for traveling automatically along the detected boundary.

[0002]

2. Description of the Related Art

In the boundary detection device for a work vehicle described above, consider, for example, a planting work vehicle (such as a rice transplanter) that plants seedlings, as crops, hill by hill at set intervals adjacent to an already planted work site (corresponding to the processed work site). To obtain control information for driving the vehicle along the planted seedling row, that is, along the boundary between the unplanted work site (corresponding to the unprocessed work site) and the planted work site, the line segment corresponding to the boundary is obtained by straight-line or curve approximation, for example using the Hough transform, from the information of a plurality of crop regions extracted in the captured image, one for each seedling. The information this processing requires, namely which side of the current screen, in the screen direction along the lateral direction of the machine body, is the planted side and which is the unplanted side (hereinafter also called work-site identification information), was conventionally obtained by having the operator look at the field and manually set an initial value at the start of travel, and then switching that initial value each time the vehicle turned and changed direction at the end of the field, so as to decide which side was planted and which was unplanted (see, for example, Japanese Patent Laid-Open No. 6-149362).

[0003]

Problems to Be Solved by the Invention

In the prior art described above, however, it is troublesome for the operator to manually set the planted-side/unplanted-side information at the start of travel, and after travel has started, the switching of the work-site identification information may fail depending on conditions such as the turning angle; as a result, the boundary between the planted work site and the unplanted work site could not always be detected properly.

[0004] The present invention has been made in view of the above circumstances. Its object is to eliminate the drawbacks of the prior art: to make it unnecessary for the operator to manually set the work-site identification information at the start of travel, while still obtaining accurate information about the unprocessed or processed work-site side even after travel has started so that the boundary can be detected properly; and then either to display the traveling state relative to the detected boundary on a screen so that an operator steering manually can judge it, or to make the work vehicle travel automatically along the detected boundary.

[0005]

Means for Solving the Problems

According to the configuration of claim 1, based on the image information obtained when the imaging means on the work vehicle, which travels along the boundary between the unprocessed work site and the processed work site, captures the boundary area, it is specified which side of the captured image, in the screen direction along the lateral direction of the machine body, is the unprocessed work site and which is the processed work site, and the line segment corresponding to the boundary is obtained from this work-site identification information and the captured image information.

[0006] Accordingly, since the information as to which side of the screen is the unprocessed work site and which is the processed work site is determined from the captured image information itself, the operator no longer needs to look at the field and initialize the unprocessed or processed side at the start of travel, as was required conventionally. Moreover, after travel has started, there is no risk, as there is when the setting is switched at every turn, that the switching fails because of the turning angle or other conditions. Accurate information about the unprocessed and processed work-site sides is therefore obtained, and a boundary detection device for a work vehicle is provided that can properly detect the boundary between the two work sites.

[0007] According to the configuration of claim 2, in claim 1, a processed region corresponding to the processed work site is extracted from the captured image information, the imaging screen is divided into two parts on either side in the screen direction along the lateral direction of the machine body, and the side whose divided screen has the larger ratio of processed-region area to screen area is judged to be the processed-work-site side. Alternatively, an unprocessed region corresponding to the unprocessed work site is extracted from the captured image information, and the side whose divided screen has the larger ratio of unprocessed-region area to screen area is judged to be the unprocessed-work-site side.

[0008] Accordingly, compared with methods that extract the processed or unprocessed region in the screen and then recognize, for example, its shape or orientation to identify the processed or unprocessed side, which requires time-consuming pattern recognition, the processed or unprocessed side can be determined quickly and appropriately by the relatively simple operation of computing the area ratio of the processed or unprocessed region to the screen. A preferred means for the configuration of claim 1 is thus obtained.

[0009] According to the configuration of claim 3, in claim 1 or 2, when the boundary area is not captured in the image taken at the proper imaging position of the imaging means relative to the machine body, the imaging direction of the imaging means is shifted from the proper imaging position along the lateral direction of the machine body, within a range that keeps the proper imaging position on its central side, until the boundary area appears in the captured image, and the work-site identification information is then obtained from the image in which the boundary area is captured.

[0010] Accordingly, even when the work vehicle is tilted so far relative to the direction of the boundary that the boundary between the unprocessed and processed work sites does not appear in the image taken at the proper imaging position relative to the machine body, the information about the direction in which the imaging direction was changed until the boundary appeared makes it possible to specify which side of the screen, in the direction along the lateral direction of the machine body, is the unprocessed or the processed work site. A preferred means for the configuration of claim 1 or 2 is thus obtained.

[0011] According to the configuration of claim 4, in claim 2 or 3, at each position in the screen direction along the longitudinal direction of the machine body, the processed region that lies closest to the unprocessed-work-site side in the screen direction along the lateral direction of the machine body is found, and the line segment corresponding to the boundary is obtained by linearly approximating a line connecting the processed regions arrayed in a row along the longitudinal direction. Alternatively, at each position in the screen direction along the longitudinal direction of the machine body, the unprocessed region that lies closest to the processed-work-site side in the screen direction along the lateral direction is found, and the line segment corresponding to the boundary is obtained by linearly approximating a line connecting the unprocessed regions arrayed in a row along the longitudinal direction.

[0012] Accordingly, compared with processing the information of all the processed or unprocessed regions in the imaging screen, which would take too much processing time, the amount of information to be processed is reduced by using only the regions located near the boundary; and since the boundary is nearly straight over a relatively narrow range, approximating the corresponding line segment by a straight line allows the boundary to be detected quickly and appropriately. A preferred means for the configuration of claim 2 or 3 is thus obtained.

[0013] According to the configuration of claim 5, in claim 2, 3 or 4, a plurality of crops are planted on the unplanted work site, serving as the unprocessed work site, spaced by a set planting interval in the travel direction of the machine body and by a set planting width in its lateral direction; a plurality of crop regions corresponding to the individual crops are extracted as processed regions; and the work-site identification information and the line segment corresponding to the boundary are obtained from the information of the extracted crop regions.

[0014] Accordingly, the crop regions, which are the processed regions, are extracted as a plurality of regions reliably separated from one another by a set distance, and the line segment corresponding to the boundary can be obtained accurately by connecting these individually separated crop regions. Unlike the case of an irregularly shaped, continuous processed region, where it is not easy to obtain the boundary line segment accurately from the region information, this configuration provides a preferred means for claim 2, 3 or 4 that is particularly suitable for boundary detection by a crop-planting work vehicle.

[0015] According to the configuration of claim 6, the captured image of the boundary area, the line segment corresponding to the boundary between the unprocessed and processed work sites obtained in claim 1, 2, 3, 4 or 5, and an image showing the displacement of that line segment relative to the longitudinal direction of the machine body are displayed on the same screen of the image display means on the work vehicle.

[0016] Accordingly, compared with judging the deviation between the boundary and the longitudinal direction of the machine body by looking at the actual work site, the operator can judge the deviation between the boundary and the orientation of the machine body easily and accurately from the displayed image, and can therefore steer manually so that the machine body travels properly along the boundary. A traveling-state display device for a work vehicle is thus obtained.

[0017] According to the configuration of claim 7, the traveling device is automatically steered based on the information about the displacement, relative to the longitudinal direction of the machine body, of the line segment corresponding to the boundary obtained in claim 1, 2, 3, 4 or 5, that is, steered so as to cancel that displacement, so that the work vehicle travels along the boundary between the unprocessed and processed work sites.

[0018] Accordingly, when, for example, the operator has been steering manually and the machine body is traveling along the boundary, the machine body can continue to travel automatically along the boundary even if the operator temporarily lets go of the steering wheel to perform other work (such as replenishing seedlings on a planting work vehicle). A travel control device for a work vehicle with excellent workability is thus obtained.

[0019]

BEST MODE FOR CARRYING OUT THE INVENTION

An embodiment in which the boundary detection device, traveling-state display device and travel control device for a work vehicle according to the present invention are applied to a rice transplanter (planting work vehicle), as the work vehicle, traveling in a field while performing planting work will now be described with reference to the drawings.

[0020] As shown in FIGS. 1 and 2, a seedling planting device 2 for planting seedlings T, as crops, in the field is provided on the rear side of the body of a work vehicle V having front wheels 1F and rear wheels 1R so that it can be raised and lowered. The work vehicle V plants a plurality of seedlings T on the unplanted work site, serving as the unprocessed work site, spaced by a set planting interval in the travel direction of the machine body and by a set planting width in its lateral direction, thereby forming a planted work site as the processed work site.

[0021] A television camera S1, serving as imaging means for capturing the boundary between the planted work site, where the plurality of seedlings T are planted in rows, and the unplanted work site, is provided on the front side of the work vehicle V. The television camera S1 is attached to the tip of a support member 4 projecting laterally outward from the machine body so as to image the boundary area adjacent to the lateral outer side of the machine body obliquely from above. It is set so that, when the work vehicle V properly follows the seedling row running in the travel direction of the machine body, i.e. its longitudinal direction, the line segment L corresponding to the row direction of the seedlings Tn adjacent to the unplanted work site, that is, to the boundary, coincides with a travel reference line segment La passing through the center of the imaging screen of the television camera S1 in the front-rear direction.

[0022] A plurality of work strokes running from one end of the field to the other are set parallel to one another in the lateral direction of the machine body. In each work stroke, the operator either steers the work vehicle V manually while watching the display screen of a television monitor 13 described later so that the line segment L corresponding to the planted seedlings Tn coincides with the travel reference line segment La on the screen, or the work vehicle V is steered automatically based on the deviation of the line segment L from the travel reference line segment La. Since the travel direction of the work vehicle V relative to the field reverses each time it completes one stroke, the position of the boundary relative to the work vehicle V alternates between left and right; therefore one television camera S1 is provided on each of the left and right sides of the work vehicle V, and the camera to be used is switched under control as described later. In FIG. 1, the boundary is on the left side of the work vehicle V, and the image information of the left television camera S1 is used.

[0023] The structure of the work vehicle V will now be described. As shown in FIG. 3, the output of an engine E is transmitted to the front wheels 1F and the rear wheels 1R through a transmission 5. A shift-state detection potentiometer R3 for detecting the shift operation state of the transmission 5, an electric shift motor 6 for shifting the transmission 5, and an accelerator pedal 17 for manual shifting are provided. In the figure, S2 is a distance sensor for detecting the travel distance from the output rotation speed of the transmission 5.

[0024] The front wheels 1F and the rear wheels 1R, which serve as a steerable traveling device, are configured to be steered separately by hydraulic cylinders 7F and 7R, respectively. Steering-angle detection potentiometers R1 and R2 for detecting the steering angles of the front and rear wheels 1F and 1R, a steering wheel 16 for manual operation, a steering-wheel operating-angle detection potentiometer R0 for detecting the operating angle of the steering wheel 16, and electromagnetically operated control valves 8F and 8R for actuating the hydraulic cylinders 7F and 7R so that the detected steering angle reaches a target steering angle are provided. Three steering modes can be selected with a changeover switch (not shown): a parallel steering mode in which the front and rear wheels 1F, 1R are steered in the same phase and at the same angle, a four-wheel steering mode in which they are steered in opposite phase and at the same angle, and a two-wheel steering mode in which only the front wheels 1F are steered. When the vehicle travels automatically along each work stroke, however, the two-wheel steering mode is selected.

[0025] A control device 12 using a microcomputer is provided. The control device 12 receives signals from the steering-angle detection potentiometers R1 and R2, the steering-wheel operating-angle detection potentiometer R0, the shift-state detection potentiometer R3, the distance sensor S2, and an automatic travel switch 15 that starts the automatic travel mode, and outputs drive signals to the shift motor 6 and the control valves 8F and 8R. In the automatic travel mode, entered when the automatic travel switch 15 is turned on, the control device 12 drives the shift motor 6 so that the transmission 5 reaches the operating state corresponding to a preset travel speed, and drives the control valves 8F and 8R so that the front and rear wheels 1F and 1R assume the set steering angle. In the manual travel mode, entered when the automatic travel switch 15 is turned off, it drives the control valves 8F and 8R so that the front and rear wheels 1F and 1R assume the operating angle indicated by the steering wheel 16. In the manual travel mode, the operator shifts gears by operating the accelerator pedal 17.

[0026] As shown in FIG. 4, the light reflected from the seedlings T and other objects in the field passes through an optical filter 14, consisting of a polarizing filter for blocking direct reflections from the water surface and the like and a visible-light cut filter for blocking incident visible light, before entering the television camera S1. The television camera S1 therefore produces, as analog image information, an infrared image of the seedlings T and other objects in the field (see FIG. 9(a)). Because the television camera S1 images the area ahead of it in the field obliquely downward, more distant parts of the image appear foreshortened.

[0027] Next, the control configuration (G in FIGS. 3 and 4) for obtaining the line segment L corresponding to the boundary, and the displacement of the machine body relative to it, from the image information of the television camera S1 will be described. There are provided: a video signal processing unit 9 that processes the analog image signal from the television camera S1 and outputs a digital image signal in which the information of the seedling regions Ta corresponding to the seedlings T has been binarized (see FIG. 9(b), or FIG. 9(c), obtained by applying the perspective-distortion removal described above to the image of (b)); an image memory 11 that stores the digital image signal as image data at a preset pixel density (32 x 32 pixels per screen); and an image processing unit 10 that processes the image data in the image memory 11 to identify, among the seedling regions Ta, those adjacent to the unplanted work site, obtains the line segment L by connecting these seedling regions Ta (see FIG. 9(d)), and outputs control signals to the video signal processing unit 9, such as signals for switching between the left and right television cameras S1 and image-capture signals. The image processing unit 10 determines the displacement of the line segment L relative to the travel reference line La and outputs this information to the control device 12. In the figure, 13 is a television monitor, composed of a liquid crystal panel or the like, for displaying the image captured by the television camera S1 and the processing results of the image processing unit 10.
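
As a rough illustration of the data flow just described, binarized camera output reduced to a small working image before line extraction, the following Python sketch downsamples a binary frame to the 32 x 32 resolution of the image memory. It is a minimal sketch under assumed array shapes, not the patent's actual hardware; the function name, the use of NumPy, and the "any pixel set" reduction rule are illustrative assumptions.

    import numpy as np

    def to_image_memory(binary_image, size=32):
        """Downsample a binarized camera frame to the small working
        resolution (e.g. 32 x 32) held in the image memory.

        binary_image: 2-D array of 0/1 values at camera resolution.
        Returns a (size x size) array of 0/1 values; a cell is 1 if any
        source pixel inside it was 1 (seedling present).
        """
        h, w = binary_image.shape
        out = np.zeros((size, size), dtype=np.uint8)
        for row in range(size):
            for col in range(size):
                block = binary_image[row * h // size:(row + 1) * h // size,
                                     col * w // size:(col + 1) * w // size]
                out[row, col] = 1 if block.any() else 0
        return out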

[0028] From the above, the video signal processing unit 9 constitutes crop region extraction means 101 that extracts, as processed regions corresponding to the processed work site, a plurality of seedling regions Ta (corresponding to crop regions), each corresponding to one of the seedlings T, based on the image information captured by the television camera S1. The image memory 11 and the image processing unit 10 constitute computation means 102 that obtains, from the information of the plurality of seedling regions Ta extracted by the crop region extraction means 101, the work-site identification information specifying which side of the image captured by the television camera S1, in the screen direction along the lateral direction of the machine body, is the unplanted work site and which is the planted work site, as well as the line segment L corresponding to the boundary. In other words, the crop region extraction means 101 and the computation means 102 constitute image processing means GS that obtains the line segment L from the image information of the television camera S1 and the work-site identification information, and this image processing means GS is configured to obtain the work-site identification information from the image information of the television camera S1.

[0029] Regarding how the work-site identification information is obtained: as shown in FIG. 9(c), the computation means 102 divides the imaging screen of the television camera S1 into two parts on either side in the screen direction along the lateral direction of the machine body (in the figure, into equal left and right halves), and judges the side whose divided screen GL or GR has the larger ratio of seedling-region Ta area to screen area (in the figure, the left side, because the area ratio in the left divided screen GL is larger) to be the planted-work-site side.

[0030] Then, as shown in FIG. 9(d), at each position in the screen direction along the longitudinal direction of the machine body (the vertical direction in the figure), the computation means 102 finds the seedling region Ta that lies closest to the unplanted-work-site side in the screen direction along the lateral direction of the machine body, and obtains the line segment L by linear approximation, using the Hough transform processing described later, so as to connect the seedling regions Ta arrayed in a row along the longitudinal direction.

[0031] As shown in FIG. 9(e), the television monitor 13 constitutes image display means that displays, on the same screen, the image captured by the television camera S1, the obtained line segment L, and an image showing the displacement of the line segment L relative to the longitudinal direction of the machine body, namely its angular deviation θs from the direction of the travel reference line La (the vertical direction of the screen). The operator therefore steers the work vehicle V manually while watching the display screen of the television monitor 13. Further, the control device 12 constitutes control means 100 that controls the steering of the front and rear wheels 1F and 1R based on the information about the displacement of the line segment L relative to the longitudinal direction of the machine body (the angular deviation θs); that is, steering is controlled so that the angular deviation θs becomes zero.

[0032] Next, the control operations performed by the control device 12 and the image processing means GS will be described with reference to the flowcharts in FIGS. 5 to 8. When control starts, the main flow (FIG. 5) first performs seedling-row detection processing based on the information from the television camera S1, then determines whether the automatic travel mode or the manual travel mode is active; in the manual travel mode it performs traveling-state display processing, and in the automatic travel mode it performs direction control processing based on the displacement (angular deviation θs) information. A sketch of this main loop is given below.
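
The following Python skeleton illustrates the branch structure of FIG. 5: detect the seedling row, then either display the traveling state (manual mode) or perform direction control (automatic mode). It is only a sketch; the four callables it takes are placeholders for the processing of FIGS. 6 to 8 and are assumptions, not part of the patent.

    def main_loop(detect_seedling_row, is_auto_mode,
                  direction_control, display_travel_state, iterations=1):
        """Skeleton of the control flow in FIG. 5 (illustrative only).
        All four arguments are callables supplied by the caller."""
        for _ in range(iterations):
            # Seedling-row detection: yields the boundary line segment L and
            # its angular deviation theta_s from the travel reference line La.
            line_segment, theta_s = detect_seedling_row()
            if is_auto_mode():
                # Automatic travel mode: steer so that theta_s approaches zero.
                direction_control(theta_s)
            else:
                # Manual travel mode: show the line and deviation to the operator.
                display_travel_state(line_segment, theta_s)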

[0033] In the seedling-row detection processing (FIG. 6), an image from the television camera S1 is first captured each time the vehicle travels a set distance or at set time intervals (FIG. 9(a)). This image is then subjected to filtering for noise removal, binarization with a set threshold to extract the crop regions Ta corresponding to the seedlings T, erosion and dilation processing to remove small false crop regions Ta' such as those shown in FIG. 9(b), and finally the perspective correction described above, yielding the crop-region Ta image shown in FIG. 9(c). In the figure, eight crop regions Ta1 to Ta8 have been extracted, arranged in three rows.
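
The preprocessing chain described here (noise filtering, thresholding, erosion and dilation, then perspective correction) can be sketched with standard image-processing calls, for example using OpenCV in Python. This is only an illustrative sketch under assumed parameter values; the patent uses dedicated video-processing hardware, and the threshold, kernel size, and corner points of the perspective warp below are placeholders, not values from the patent.

    import cv2
    import numpy as np

    def extract_crop_regions(infrared_frame, threshold=128):
        """Rough equivalent of the seedling-row preprocessing of FIG. 6.
        infrared_frame: single-channel 8-bit image from the camera."""
        # Noise removal (the patent only says "filter processing";
        # a median filter is used here as an assumption).
        filtered = cv2.medianBlur(infrared_frame, 3)

        # Binarize with a set threshold: seedling pixels become 1.
        _, binary = cv2.threshold(filtered, threshold, 1, cv2.THRESH_BINARY)

        # Erosion followed by dilation removes tiny false crop regions Ta'.
        kernel = np.ones((3, 3), np.uint8)
        cleaned = cv2.dilate(cv2.erode(binary, kernel), kernel)

        # Perspective correction: warp so that distant rows are no longer
        # foreshortened. The source corner points are placeholders.
        h, w = cleaned.shape
        src = np.float32([[w * 0.3, 0], [w * 0.7, 0], [0, h], [w, h]])
        dst = np.float32([[0, 0], [w, 0], [0, h], [w, h]])
        matrix = cv2.getPerspectiveTransform(src, dst)
        return cv2.warpPerspective(cleaned, matrix, (w, h))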

[0034] Next, the screen is divided into left and right halves, and the area ratio of the seedling regions Ta is obtained for each half. Specifically, as shown in FIG. 10, in each half screen the number of pixels corresponding to the seedling regions Ta (pixel value "1") among the 32 vertical x 16 horizontal pixels (512 pixels in total) is counted. The area ratio is therefore (number of seedling-region Ta pixels)/512. Here, the pixel counts SL and SR of the seedling regions themselves are compared: if the pixel count SL in the left half is greater than the pixel count SR in the right half, the left side of the screen is judged to be the planted work site; conversely, if the pixel count SR in the right half is greater than SL in the left half, the right side of the screen is judged to be the planted work site. If the left side of the screen is the planted work site, the pixel of the seedling region Ta lying furthest to the right in the horizontal direction of the screen, i.e. the right end point of the planted work site, is identified; if the right side of the screen is the planted work site, the pixel of the seedling region Ta lying furthest to the left, i.e. the left end point of the planted work site, is identified.
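
Counting seedling pixels in the left and right halves of the 32 x 32 binary image, as described above, reduces to a few lines of code. The sketch below assumes the image is held as 32 rows of 0/1 values (a list of lists or a NumPy array); the function name is an illustrative assumption.

    def planted_side(image, width=32):
        """Return 'left' or 'right' depending on which half of the screen
        contains more seedling (value 1) pixels, i.e. which half has the
        larger area ratio SL/512 or SR/512."""
        sl = sum(row[x] for row in image for x in range(width // 2))
        sr = sum(row[x] for row in image for x in range(width // 2, width))
        return 'left' if sl > sr else 'right'

For the example of FIG. 10, where most seedling pixels lie in the left half, the function would return 'left', meaning the planted work site is on the left of the screen and the unplanted work site on the right.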

[0035] Next, the pixel identification of the above end points will be explained for the case of detecting the right end points shown in FIG. 10, following the flow of FIG. 7. As shown in FIG. 10, there are eight seedling regions Ta1 to Ta8 in the screen, each consisting of pixels with the value "1"; the pixel data outside these seedling regions Ta is "0". In the figure, the upper left corner of the imaging screen is taken as the origin (0, 0) of the coordinate axes, with the x-axis running rightward and the y-axis running downward on the screen. Since, as described above, the unplanted work site is on the right side of the screen, i.e. in the positive x direction, the pixel values (data) on each of the 32 lines parallel to the x-axis, with coordinate values y = 0 to 31, are examined in order rightward from the leftmost pixel (x = 0), and the rightmost pixel with the value "1" is detected. As shown in FIG. 11, the position term[y] of this pixel is then the pixel position of the right end point of the planted work site adjacent to the unplanted work site.
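
The end-point search of FIG. 7, scanning each of the 32 lines from x = 0 rightward and remembering the last pixel whose value is 1, can be written directly as below. This is a hedged sketch; setting term[y] to None on lines that contain no seedling pixel is an assumption the patent does not spell out.

    def right_end_points(image, width=32, height=32):
        """For each line y, return the x position of the rightmost pixel with
        value 1, i.e. the right end point term[y] of the planted work site
        adjacent to the unplanted work site (cf. FIGS. 7 and 11)."""
        term = [None] * height
        for y in range(height):
            for x in range(width):            # scan left to right
                if image[y][x] == 1:
                    term[y] = x               # keep the last (rightmost) hit
        return term

The left end-point search of FIG. 8 is the mirror image: scan each line from x = 31 leftward and keep the leftmost pixel with value 1.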

[0036] FIG. 8 shows the flow for detecting the left end points. In this case, on each of the 32 lines parallel to the x-axis with coordinate values y = 0 to 31, the pixel values (data) are examined in order leftward from the rightmost pixel (x = 31), and the leftmost pixel with the value "1" is detected; the position term[y] of this pixel is then the pixel position of the left end point of the planted work site adjacent to the unplanted work site.

[0037] In practice, determining only the rightmost or leftmost pixel on each line parallel to the x-axis may pick up pixels that lie further toward the planted side than the end point adjacent to the unplanted work site (for example, the pixel at x = 11, y = 2 in FIG. 10), so these pixels must additionally be removed, for example by using the difference in their distance from the left edge of the screen.

[0038] Next, the line segment L connecting the pixels at the end points of the planted work site adjacent to the unplanted work site is obtained by straight-line approximation using Hough transform processing. First, as shown in FIG. 12, with the x-axis passing through the center of the imaging screen of the television camera S1 taken as the reference line of a polar coordinate system, a plurality of straight lines passing through each of the pixels detected as right end points of the crop regions Ta1, Ta2 and Ta3 are obtained, based on equation (i) below, as combinations of an inclination θ, set in a plurality of steps in the range of 0 to 180 degrees relative to the x-axis, and a distance ρ from the origin, i.e. the center of the imaging screen.

[0039]

[Equation 1]  ρ = y · sin θ + x · cos θ   ... (i)

[0040] For each pixel, the process of incrementing a two-dimensional histogram that counts the frequency of each obtained straight line is repeated until the inclination θ, set in the plurality of steps, reaches 180 degrees; the frequencies of the various straight lines passing through the pixels of the crop regions Ta1, Ta2 and Ta3 are counted in this way for all the pixels. When the counting of line frequencies for all the pixels is complete, the combination of inclination θ and distance ρ with the highest frequency is found from the values accumulated in the two-dimensional histogram, thereby determining the single most frequent straight line Lx, and this straight line Lx is taken as the straight-line approximation, on the screen, of the line segment L corresponding to the boundary. Finally, the inclination angle θs that this straight line Lx makes with the travel reference line La passing through the center of the screen in the front-rear direction is obtained as the angular deviation.
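
A compact version of the Hough accumulation described in paragraphs [0038] to [0040], voting over discretized (θ, ρ) combinations using equation (i) and taking the most frequent combination, is sketched below in Python. The discretization of θ into 180 steps and the rounding of ρ to whole pixels are assumptions for illustration, not values from the patent.

    import math
    from collections import Counter

    def hough_line(points, theta_steps=180):
        """points: list of (x, y) end-point pixels, with the origin at the
        screen centre as in FIG. 12. Returns the (theta_deg, rho) pair with
        the highest vote count, i.e. the straight line Lx that approximates
        the boundary line segment L."""
        votes = Counter()
        for x, y in points:
            for step in range(theta_steps):
                theta_deg = step * 180.0 / theta_steps
                theta = math.radians(theta_deg)
                rho = y * math.sin(theta) + x * math.cos(theta)  # equation (i)
                votes[(theta_deg, round(rho))] += 1
        return votes.most_common(1)[0][0]

The angular deviation θs is then the angle between this most-voted line Lx and the travel reference line La running vertically through the centre of the screen.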

[0041] Therefore, in the direction control for making the work vehicle V travel automatically along the boundary (the row of planted seedlings T aligned in the travel direction of the machine body), steering is operated in the two-wheel steering mode so that the inclination angle θs of the straight line Lx relative to the travel reference line La approaches zero. The system is also configured to determine, from the image information of the television camera S1, whether the end of the work stroke has been reached. Specifically, as shown in FIG. 13, a line segment connecting the pixels of the plurality of seedling regions Ta aligned in the lateral direction of the machine body is obtained by straight-line approximation (Hough transform), and the distances P1 and P2 from the top of the screen to this approximated straight line at the left and right edges of the imaging screen are obtained as position information corresponding to the distance to the end of the work stroke. When the average of the distances P1 and P2 becomes equal to or greater than a set value, it is judged that the vehicle has reached the end of the work stroke. In the automatic travel mode, once the vehicle reaches the end of the work stroke it stops automatically after traveling a set distance.
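
The two decisions described in this paragraph, steering so that θs approaches zero and declaring the end of the work stroke when the average of P1 and P2 exceeds a set value, reduce to a few lines. The proportional steering rule, the gain, and the clamp below are placeholders assumed for illustration; the patent only states that θs is driven toward zero.

    def steering_command(theta_s_deg, gain=1.0, max_angle_deg=30.0):
        """Target front-wheel steering angle that drives the angular
        deviation theta_s toward zero (simple proportional rule, assumed)."""
        target = -gain * theta_s_deg
        return max(-max_angle_deg, min(max_angle_deg, target))

    def at_stroke_end(p1, p2, limit):
        """End-of-stroke test of FIG. 13: true when the average distance of
        the lateral seedling line from the top of the screen reaches the
        set value."""
        return (p1 + p2) / 2.0 >= limit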

[0042]

[Other Embodiments] In the above embodiment, the television camera S1 is used as the imaging means to obtain an image by infrared light with visible light cut off, and this infrared image is binarized to extract the crop regions Ta corresponding to the crops (seedlings T). Alternatively, for example, a color television camera that captures a color image based on the three primary color components R, G and B may be used, and the crop regions Ta may be extracted from the color image based on the color difference between the crops and the background (muddy paddy field).

[0043] For the image processing means GS to obtain the work-site identification information specifying the unprocessed or processed work-site side, it may, conversely to the above embodiment, extract an unprocessed region corresponding to the unprocessed work site (unplanted work site), that is, the region of pixels (value 0) that are not crop regions Ta, from the information of the imaging means S1, divide the imaging screen of the imaging means S1 into two parts on either side in the screen direction along the lateral direction of the machine body, and judge the side whose divided screen has the larger ratio of unprocessed-region area to screen area to be the unprocessed-work-site side. The division of the imaging screen into two parts along the lateral direction of the machine body need not be into equal halves. The work-site identification information may also be obtained from the captured image information by methods other than the area ratio described above.

[0044] Further, to obtain the line segment L, the image processing means GS may, conversely to the above embodiment, find at each position in the screen direction along the longitudinal direction of the machine body the unprocessed region that lies closest to the processed-work-site side in the screen direction along the lateral direction of the machine body, and obtain the line segment L by linearly approximating a line connecting the unprocessed regions arrayed in a row along the longitudinal direction.

[0045] Next, when the orientation of the body of the work vehicle V is tilted greatly relative to the direction of the boundary, the entire imaging screen of the imaging means S1 may show only unprocessed work site (unplanted work site) or only processed work site (planted work site), so that the boundary is not captured and the image processing means GS cannot detect it. Means for dealing with such an image state are therefore provided. That is, as shown in FIG. 14, an electric motor 18 is provided as imaging-direction changing means for changing the imaging direction of the television camera S1, serving as the imaging means, along the lateral direction of the machine body within a range that includes the proper imaging position relative to the machine body (the position shown by the solid line in the figure) on its central side. The image processing means GS is configured so that, when the boundary is not captured in the image taken at the proper imaging position of the television camera S1, it operates the electric motor 18 so as to change the imaging direction from the proper imaging position along the lateral direction of the machine body until the boundary is captured in the image of the television camera S1. Whether the boundary is absent from the captured image is judged from whether the amount of seedling-region Ta information in the screen is smaller than a lower-limit set amount (the whole screen is unplanted work site) or, conversely, larger than an upper-limit set amount (the whole screen is planted work site).

[0046] More specifically, the television camera S1 is rotated about a vertical axis that is orthogonal to its imaging direction (obliquely forward and downward) and tilted forward within a plane along the longitudinal direction of the machine body. As shown in FIG. 15, after the image G1 at the proper set position relative to the machine body, the camera is first turned to the left; if the boundary is not captured in this left-side image G2 either, it is turned to the opposite, right side and the right-side image G3 is captured (the amount of rotation is set so that the three images G1, G2 and G3 are continuous in the lateral direction). In the case of FIG. 15(a), the image G1 at the proper set position and the image G2 at the left position are entirely planted work site, but the boundary is captured in the image G3 at the right position. In the case of FIG. 15(b), the image G1 at the proper set position is entirely unplanted work site, but the boundary is captured in the image G2 at the left position, so the image G3 at the right position is not taken. The position of the boundary relative to the machine body is then determined from the information about whether the imaging direction was rotated to the left or to the right of the proper set position before the boundary was captured. Furthermore, the image in which the boundary has been captured is subjected to the processing described above (see FIGS. 9(c) and 9(d)) of dividing the screen into left and right halves and determining which of the two is the planted-work-site side.
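
The search behaviour described here, image at the proper position, then sweep left, then right, until a frame actually contains the boundary, can be sketched as follows. The information-amount test of paragraph [0045] (seedling-pixel count between a lower and an upper limit) is shown as a separate helper; both function names and the callable capture_at are illustrative assumptions.

    def contains_boundary(image, lower, upper):
        """Boundary present when the seedling-pixel count is neither below
        the lower limit (whole screen unplanted) nor above the upper limit
        (whole screen planted)."""
        count = sum(sum(row) for row in image)
        return lower <= count <= upper

    def find_boundary_direction(capture_at, boundary_test,
                                positions=('proper', 'left', 'right')):
        """capture_at(position) returns a binary image taken with the camera
        turned to that position; boundary_test(image) tests whether the
        boundary appears in it. Returns the first position whose image shows
        the boundary, or None if none does (illustrative sketch)."""
        for position in positions:
            image = capture_at(position)
            if boundary_test(image):
                return position   # e.g. 'right' in the case of FIG. 15(a)
        return None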

[0047] In the manual travel mode, when the boundary is not shown on the screen of the television camera S1, information such as which side of the screen the boundary is on and steering guidance for driving the machine body along the boundary is displayed on the television monitor 13; for example, in the situation of FIG. 15(a), an instruction such as "The boundary is on the right. Turn the steering wheel to the right." is displayed. In the automatic travel mode, the front wheels 1F are steered automatically so that the orientation of the machine body changes toward the side where the boundary is located.

[0048] In the above embodiment the work vehicle is a planting work vehicle that plants a plurality of crops T on an unplanted work site, as the unprocessed work site, to form a planted work site as the processed work site. The invention may, however, also be applied to a harvesting work vehicle, such as a harvester or lawn mower, that cuts crops, grass or the like growing on an uncut work site, as the unprocessed work site, to form a cut work site as the processed work site, or to a tilling work vehicle that tills an untilled work site, as the unprocessed work site, to form a tilled work site as the processed work site.

[0049] In the case of a harvesting work vehicle, conversely to the planting work vehicle described above, the extraction processing is performed so that the unprocessed region corresponds to the area of crops or grass and the processed region corresponds to the area where no crops or grass remain. In the case of a tilling work vehicle, the unprocessed region corresponds to the untilled work site and the processed region to the tilled work site, and the two regions are distinguished in the extraction processing by the fact that the untilled work site has a higher light reflectance than the tilled work site.

[0050] In the above embodiment the image processing means GS uses the Hough transform to obtain the line segment L by straight-line approximation, but other methods such as the least-squares method may also be used. The line segment L may also be obtained by curve approximation instead of straight-line approximation.

[0051] In the above embodiment, the displacement θs, which is displayed on the image display means 13 as information for manual steering or is used as steering control information for automatic traveling, is obtained as the angular deviation of the line segment L corresponding to the boundary with respect to the machine body longitudinal direction. However, the positional deviation in the machine body lateral direction may be obtained together with the angular deviation.
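A minimal sketch of how both deviations could be read off the fitted line x = a*y + b (the coordinate conventions, reference row, and names are assumptions for illustration):

```python
import math

def deviations(a: float, b: float, image_width: int, y_ref: int):
    """Angular and lateral deviation of the boundary line relative to the body.

    The machine body longitudinal direction is taken as the vertical image
    axis, so a perfectly aligned boundary has slope a = 0 in x = a*y + b.
    theta_s : angular deviation of line segment L from the longitudinal axis
    offset  : lateral (machine width direction) offset of the line from the
              image centre at a reference row y_ref (e.g. the bottom row).
    """
    theta_s = math.degrees(math.atan(a))        # angle between L and the vertical axis
    offset = (a * y_ref + b) - image_width / 2  # pixels; > 0 means the line lies right of centre
    return theta_s, offset
```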

[0052] The traveling device is not limited to the wheel-type traveling device of the above embodiment and may be, for example, a crawler traveling device.

[0053] Although reference numerals are given in the claims for convenience of comparison with the drawings, the present invention is not limited by these entries to the configuration shown in the attached drawings.

[Brief Description of the Drawings]

FIG. 1 is a schematic plan view of the planting work vehicle.
FIG. 2 is a schematic side view of the same.
FIG. 3 is a block diagram of the control configuration.
FIG. 4 is a block diagram of the control configuration.
FIG. 5 is a flowchart of the control operation.
FIG. 6 is a flowchart of the control operation.
FIG. 7 is a flowchart of the control operation.
FIG. 8 is a flowchart of the control operation.
FIG. 9 is an explanatory diagram of image processing.
FIG. 10 is an explanatory diagram of image processing.
FIG. 11 is an explanatory diagram of image processing.
FIG. 12 is an explanatory diagram of the Hough transform.
FIG. 13 is an explanatory diagram of end detection at the end of a work run.
FIG. 14 is a schematic plan view showing a changed state of the imaging direction in another embodiment.
FIG. 15 is an explanatory diagram of image processing in another embodiment.

[Explanation of Symbols]

S1: imaging means
L: line segment
GS: image processing means
18: imaging direction changing means
T: crop
Ta: crop region
101: crop region extracting means
102: computing means
13: image display means
1F, 1R: traveling device
100: control means

Claims (7)

[Claims]

1. A boundary detection device for a work vehicle, in which a work vehicle traveling along a boundary between an unprocessed work site and a processed work site is provided with imaging means (S1) for imaging the boundary portion, and with image processing means (GS) for obtaining a line segment (L) corresponding to the boundary on the basis of captured image information from the imaging means (S1) and work site identification information specifying which side of the captured image, in the screen direction along the machine body lateral direction, is the unprocessed work site side or the processed work site side, wherein the image processing means (GS) is configured to obtain the work site identification information on the basis of the captured image information from the imaging means (S1).
2. The boundary detection device for a work vehicle according to claim 1, wherein the image processing means (GS) is configured to extract, on the basis of information from the imaging means (S1), a processed region corresponding to the processed work site or an unprocessed region corresponding to the unprocessed work site, and to obtain the work site identification information by executing a determination process in which, of the divided screens obtained by dividing the imaging screen of the imaging means (S1) into two sides in the screen direction along the machine body lateral direction, the side on which the area ratio of the processed region or the unprocessed region to the screen is larger is determined to be the processed work site side or the unprocessed work site side.
3. The boundary detection device for a work vehicle according to claim 1 or 2, wherein imaging direction changing means (18) is provided for changing the imaging direction of the imaging means (S1) along the machine body lateral direction in a state in which a proper imaging position relative to the machine body is included on the center side, and the image processing means (GS) is configured so that, when the boundary portion is not captured in the image taken by the imaging means (S1) at the proper imaging position, it operates the imaging direction changing means (18) so as to change the imaging direction of the imaging means (S1) from the proper imaging position along the machine body lateral direction until the boundary portion is captured in the image taken by the imaging means (S1).
4. The boundary detection device for a work vehicle according to claim 2 or 3, wherein the image processing means (GS) is configured to obtain, at each position in the screen direction along the machine body longitudinal direction of the imaging means (S1), the processed region nearest to the unprocessed work site side, or the unprocessed region nearest to the processed work site side, in the screen direction along the machine body lateral direction, and to obtain the line segment (L) by linear approximation so as to connect the processed regions or unprocessed regions arranged in a row along the machine body longitudinal direction.
5. The boundary detection device for a work vehicle according to claim 2, 3 or 4, wherein the work vehicle is a planting work vehicle that plants a plurality of crops (T) on an unplanted work site serving as the unprocessed work site, with a set planting interval in the machine body advancing direction and a set planting width in the machine body lateral direction, so as to form an already-planted work site as the processed work site, and the image processing means (GS) comprises crop region extracting means (101) for extracting, as processed regions, a plurality of crop regions (Ta) corresponding to the respective crops (T) on the basis of the captured image information from the imaging means (S1), and computing means (102) for obtaining the work site identification information and the line segment (L) on the basis of information on the plurality of crop regions (Ta) extracted by the crop region extracting means (101).
6. A traveling state display device for a work vehicle, provided with image display means (13) for displaying, on a single screen, the image captured by the imaging means (S1) imaging the boundary portion, the line segment (L) obtained by the boundary detection device for a work vehicle according to claim 1, 2, 3, 4 or 5, and an image showing the displacement of the line segment (L) with respect to the machine body longitudinal direction.
7. A traveling control device for a work vehicle, provided with a steerable traveling device (1F, 1R) and with control means (100) for steering-controlling the traveling device (1F, 1R) on the basis of displacement information, with respect to the machine body longitudinal direction, of the line segment (L) obtained by the boundary detection device for a work vehicle according to claim 1, 2, 3, 4 or 5.
JP7333296A 1995-12-21 1995-12-21 Boundary detector, display device for running stage and running control device in working vehicle Pending JPH09168315A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP7333296A JPH09168315A (en) 1995-12-21 1995-12-21 Boundary detector, display device for running stage and running control device in working vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP7333296A JPH09168315A (en) 1995-12-21 1995-12-21 Boundary detector, display device for running stage and running control device in working vehicle

Publications (1)

Publication Number Publication Date
JPH09168315A true JPH09168315A (en) 1997-06-30

Family

ID=18264520

Family Applications (1)

Application Number Title Priority Date Filing Date
JP7333296A Pending JPH09168315A (en) 1995-12-21 1995-12-21 Boundary detector, display device for running stage and running control device in working vehicle

Country Status (1)

Country Link
JP (1) JPH09168315A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012157333A (en) * 2011-02-02 2012-08-23 Yanmar Co Ltd Rice transplanter
WO2023120183A1 (en) * 2021-12-24 2023-06-29 株式会社クボタ Agricultural machine

Similar Documents

Publication Publication Date Title
WO2016093311A1 (en) Work vehicle
JPH09168315A (en) Boundary detector, display device for running stage and running control device in working vehicle
JPH09224417A (en) Auxiliary device for working vehicle
JP3044141B2 (en) Planting condition detector for crop planting machines
JPH09224415A (en) Direction detector, traveling state display device and traveling controller for work wagon
JP2815760B2 (en) Crop row detector
JPH0628032A (en) Traveling control device for automatic traveling working vehicle
JPH09224414A (en) Direction detector, traveling state display device and traveling controller for work wagon
JP2667462B2 (en) Automatic steering control device for agricultural work machine
JPH09201109A (en) Direction detector, running state displaying device and running controller of working vehicle
JPH0312713A (en) Image processor for automatic steering controller of farming machine
JPH09201110A (en) Direction detector, running state displaying device and running controller of working vehicle
JP2710644B2 (en) Automatic steering control device for agricultural work machine
JPS61139304A (en) Steering controller of self-propelling working machine
JPH0257109A (en) Automatic steering control apparatus of farm working machine
JP2583584B2 (en) Automatic steering control device for agricultural work machine
JPH01211410A (en) Crop row detection apparatus of farm working machine
JPH07184437A (en) Planting state detecting device for crop planting machine
JP2593166B2 (en) Seedling row detecting device in agricultural work machine
JP2907613B2 (en) Crop row detector
JP2907641B2 (en) Work area detection device
JPH0661163B2 (en) Boundary detection device for self-driving work vehicles
JP3020734B2 (en) Boundary detection device for autonomous vehicles
JP2907612B2 (en) Crop row detector
JPH0746084Y2 (en) Automatic steering device for work vehicles