WO2021085053A1 - Work assistance system - Google Patents
- Publication number
- WO2021085053A1 (PCT/JP2020/037862)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information
- obstacle
- unit
- work
- measurement
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01B—SOIL WORKING IN AGRICULTURE OR FORESTRY; PARTS, DETAILS, OR ACCESSORIES OF AGRICULTURAL MACHINES OR IMPLEMENTS, IN GENERAL
- A01B69/00—Steering of agricultural machines or implements; Guiding agricultural machines or implements on a desired track
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01C—PLANTING; SOWING; FERTILISING
- A01C21/00—Methods of fertilising, sowing or planting
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01G—HORTICULTURE; CULTIVATION OF VEGETABLES, FLOWERS, RICE, FRUIT, VINES, HOPS OR SEAWEED; FORESTRY; WATERING
- A01G7/00—Botany in general
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/17—Systems in which incident light is modified in accordance with the properties of the material investigated
- G01N21/25—Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands
- G01N21/27—Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands using photo-electric detection ; circuits for computing concentration
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/93—Lidar systems specially adapted for specific applications for anti-collision purposes
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/38—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
- G01S19/39—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/42—Determining position
- G01S19/43—Determining position using carrier phase measurements, e.g. kinematic positioning; using long or short baseline interferometry
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/02—Agriculture; Fishing; Forestry; Mining
Definitions
- The present invention relates to a work support system that supports work performed by a work vehicle such as a tractor or a riding-type management machine.
- Conventionally, a tractor, which is an example of a work vehicle, is equipped with a dedicated plant sensor device that acquires growth information (information on the growth status) of crops (plants), and the tractor travels in the field.
- Some such systems are configured to measure, in this way, the growth status of the crops cultivated in the field (see, for example, Patent Document 1).
- The plant sensor device described in Patent Document 1 irradiates an irradiation area, in which the crop whose growth status is to be measured is cultivated, with first measurement light and second measurement light, and acquires the reflected light of the first measurement light and the second measurement light from the measurement target.
- From the reflected light, the normalized difference vegetation index (NDVI: Normalized Difference Vegetation Index), which is an example of a spectral vegetation index, and a plant height value are acquired as the growth information of the crop whose growth status is measured.
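- As a concrete illustration of the index mentioned above: NDVI is computed from the reflectance of the same target in the near-infrared band and in the red (visible) band. The sketch below shows only the standard formula; the numeric band values are illustrative and are not taken from Patent Document 1.

```python
def ndvi(nir: float, red: float) -> float:
    """Normalized Difference Vegetation Index from NIR and red reflectance."""
    if nir + red == 0.0:
        return 0.0  # guard against division by zero for completely dark pixels
    return (nir - red) / (nir + red)

# Healthy vegetation reflects strongly in NIR and absorbs red light.
print(ndvi(nir=0.55, red=0.08))  # ~0.75, indicating dense, healthy growth
```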
- The main object of the present invention is to enable work support that uses crop growth information while suppressing increases in cost and complexity of the configuration.
- The first characteristic configuration of the present invention resides in a work support system that includes:
- an imaging unit that is mounted on a work vehicle working in a field and captures visible light over an imaging range set around the work vehicle to acquire color image information;
- an obstacle sensor that is mounted on the work vehicle, acquires measurement information including the reflection intensity of near-infrared light by projecting and receiving near-infrared light over a measurement range set around the work vehicle, and detects obstacles based on the acquired measurement information;
- a work support unit that supports work by the work vehicle based on the color image information and the measurement information;
- a positioning unit that measures the position of the work vehicle and acquires position information; and
- a growth information acquisition unit that acquires growth information of the crops cultivated in the field based on the color image information and the measurement information.
- The characteristic point is that the system further includes a storage unit that stores the growth information in association with the position information.
- According to this configuration, the work support unit can, for example, display an image of the surroundings of the work vehicle on the display unit based on the color image information from the imaging unit, making it easier to visually recognize the situation around the work vehicle. The work support unit can also notify the user of the existence of an obstacle around the work vehicle based on the detection of the obstacle by the obstacle sensor, making it easier to avoid a collision with the obstacle.
- Since the measurement information includes the reflection intensity of near-infrared light, the obstacle sensor can avoid erroneously detecting suspended matter such as dust or fog, which occurs in its vicinity and has a very weak reflection intensity, as an obstacle.
- The growth information acquisition unit acquires the growth information of the crops by using the color image information and the measurement information that the work support unit uses to provide the work support described above.
- The storage unit stores the growth information of the crops acquired by the growth information acquisition unit in association with the position information from the positioning unit.
- As a result, the growth status of the crops in each predetermined area of the field can be evaluated based on the growth information stored in the storage unit in association with the position information, and the amount of fertilizer to be applied to the crops in each predetermined area can be calculated from the evaluated growth status.
- When fertilizer spraying work is later performed by a work vehicle for fertilizer application, adjusting the amount of fertilizer applied to the crops in each predetermined area based on the calculated application amounts makes it possible to reduce the variation in crop growth between the predetermined areas of the field, improve the quality of the crops cultivated in the field, and stabilize the yield.
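- The description above does not fix a formula for converting the evaluated growth status into an application amount, so the sketch below only illustrates the idea of assigning more fertilizer to predetermined areas whose growth is weaker; the base rate, target NDVI, and gain are hypothetical values.

```python
def fertilizer_rate(mean_ndvi: float,
                    base_rate_kg_per_ha: float = 40.0,
                    target_ndvi: float = 0.70,
                    gain: float = 50.0) -> float:
    """Per-area application rate (kg/ha); all parameters are assumed values."""
    deficit = max(0.0, target_ndvi - mean_ndvi)   # how far growth lags the target
    return base_rate_kg_per_ha + gain * deficit

# Mean NDVI per predetermined area, as stored with its position information.
areas = {"area-01": 0.72, "area-02": 0.55, "area-03": 0.63}
for area_id, mean_ndvi in areas.items():
    print(area_id, round(fertilizer_rate(mean_ndvi), 1), "kg/ha")
```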
- In this way, the growth information of the crops can be acquired by using the imaging unit and the obstacle sensor, which are provided to enable work support such as making it easier to visually recognize the situation around the work vehicle and to avoid collisions with obstacles, without providing a dedicated sensor for acquiring crop growth information. As a result, work support aimed at improving crop quality and stabilizing yield, which becomes possible by acquiring crop growth information, can be provided while suppressing increases in cost and complexity of the configuration.
- FIG. 1 is a diagram showing a schematic configuration of an automatic traveling system for a work vehicle.
- FIG. 2 is a plan view of a tractor showing an imaging range of each camera.
- FIG. 3 is a side view of the tractor showing the measurement range of each obstacle sensor.
- FIG. 4 is a plan view of the tractor showing the measurement range of each obstacle sensor.
- FIG. 5 is a plan view showing an example of a target route for automatic traveling.
- FIG. 6 is a block diagram showing a schematic configuration of an automatic traveling system for a work vehicle.
- FIG. 7 is a block diagram showing a schematic configuration of an obstacle detection system and the like.
- FIG. 8 is a plan view showing the positional relationship between the mounting position of each camera, the origin of the vehicle body coordinates, and the distance calculation reference point.
- FIG. 9 is a diagram showing an obstacle detection range and a non-detection range in the distance image of the first obstacle sensor.
- FIG. 10 is a diagram showing an obstacle detection range and a non-detection range in a working device descending state in a distance image of the second obstacle sensor.
- FIG. 11 is a diagram showing an obstacle detection range and a non-detection range in a working device ascending state in the distance image of the second obstacle sensor.
- FIG. 12 is a flowchart showing the control operation of the information integration processing unit in the obstacle information acquisition control.
- FIG. 13 is a flowchart showing the control operation of the obstacle control unit in the first collision avoidance control.
- FIG. 14 is a flowchart showing the control operation of the automatic driving control unit in the dirt handling control process.
- FIG. 15 is a plan view showing growth information acquisition areas set in a plurality of sections in the field.
- FIG. 16 is a diagram showing an example of a color image in front of the vehicle body generated from the color image information captured by the front camera of the imaging unit.
- FIG. 17 is a diagram showing an example of a distance image in front of the vehicle body generated from the measurement information of the first obstacle sensor.
- FIG. 18 is a diagram showing an example of a near-infrared image in front of the vehicle body generated from the measurement information of the first obstacle sensor.
- FIG. 19 is a flowchart showing the control operation of the growth information acquisition unit in the growth information acquisition control.
- The work vehicle V exemplified in the present embodiment has a tractor 1, which is an example of a traveling vehicle body, and a rotary tillage device 3 (a rotary tiller), which is an example of a work device detachably connected to the rear portion of the tractor 1 via a link mechanism 2. The work vehicle V is thus configured as a rotary tillage machine capable of tilling work with the rotary tillage device 3.
- The rotary tillage device 3 is connected to the rear portion of the tractor 1 via the link mechanism 2 so that it can be raised, lowered, and rolled.
- The traveling vehicle body may be, other than the tractor 1, a riding-type management machine or the like having a high ground clearance.
- The work device may be, other than the rotary tillage device 3, a plow, a disc harrow, a cultivator, a subsoiler, a sowing device, a spraying device, a mowing device, or the like.
- The tractor 1 can be made to travel automatically in the field A shown in FIG. 5 and the like by using the automatic traveling system for the work vehicle.
- The automatic traveling system for the work vehicle includes the automatic traveling unit 4 mounted on the tractor 1 and a mobile communication terminal 5, which is an example of a wireless communication device set up to communicate wirelessly with the automatic traveling unit 4.
- The mobile communication terminal 5 is provided with a multi-touch display device 50 (for example, a liquid crystal panel) that enables various information displays and input operations related to automatic traveling.
- A tablet-type personal computer, a smartphone, or the like can be adopted as the mobile communication terminal 5. For the wireless communication, a wireless LAN (Local Area Network) such as Wi-Fi (registered trademark) or short-range wireless communication such as Bluetooth (registered trademark) can be adopted.
- The tractor 1 includes drivable and steerable left and right front wheels 10, drivable left and right rear wheels 11, a cabin 13 forming a boarding-type driving unit 12, an electronically controlled diesel engine 14 (hereinafter referred to as the engine) having a common rail system, a bonnet 15 covering the engine 14 and the like, and a speed change unit 16 that shifts the power from the engine 14.
- An electronically controlled gasoline engine having an electronic governor, or the like, may be adopted as the engine 14.
- The tractor 1 further includes a fully hydraulic power steering unit 17 that steers the left and right front wheels 10, a brake unit 18 that brakes the left and right rear wheels 11, an electro-hydraulically controlled work clutch unit 19 that engages and disengages power transmission to the rotary tillage device 3, an electro-hydraulically controlled elevating drive unit 20 that raises and lowers the rotary tillage device 3, an electro-hydraulically controlled rolling unit 21 that drives the rotary tillage device 3 in the roll direction, a vehicle state detection device 22 including various sensors and switches that detect various setting states and operating states of each part of the tractor 1, and a vehicle-mounted control unit 23 having various control units.
- The power steering unit 17 may be an electric type having an electric motor for steering.
- The driving unit 12 is provided with a steering wheel 25 for manual steering, a seat 26 for an occupant, and an operation terminal 27 that enables various information displays and input operations.
- The driving unit 12 is also provided with operating levers such as an accelerator lever and a speed change lever, and operating pedals such as an accelerator pedal and a clutch pedal.
- As the operation terminal 27, a multi-touch liquid crystal monitor, an ISOBUS-compatible virtual terminal, or the like can be adopted.
- The speed change unit 16 includes an electronically controlled continuously variable transmission that shifts the power from the engine 14, and an electro-hydraulically controlled forward/reverse switching device that switches the power output by the continuously variable transmission between forward travel and reverse travel.
- As the continuously variable transmission, an I-HMT (Integrated Hydro-static Mechanical Transmission), which is an example of a hydraulic-mechanical continuously variable transmission, is adopted.
- The forward/reverse switching device includes a hydraulic clutch for engaging and disengaging forward power, a hydraulic clutch for engaging and disengaging reverse power, and electromagnetic valves that control the flow of oil to these clutches.
- Instead of the I-HMT, the continuously variable transmission may be an HMT (Hydro Mechanical Transmission), which is another example of a hydraulic-mechanical continuously variable transmission, a hydrostatic continuously variable transmission (HST: Hydro Static Transmission), or a belt-type continuously variable transmission.
- Instead of the continuously variable transmission, the speed change unit 16 may include an electro-hydraulically controlled stepped transmission having a plurality of hydraulic clutches for shifting and a plurality of solenoid valves that control the flow of oil to these clutches.
- The brake unit 18 includes left and right brakes that individually brake the left and right rear wheels 11, and operates the left and right brakes in conjunction with the depression of the left and right brake pedals provided in the driving unit 12.
- The vehicle state detection device 22 is a general term for various sensors and switches provided in each part of the tractor 1. As shown in FIG. 7, the vehicle state detection device 22 includes a vehicle speed sensor 22A that detects the vehicle speed of the tractor 1, a reverser sensor 22B that detects the operation position of the reverser lever for forward/reverse switching, and a steering angle sensor 22C that detects the steering angle of the front wheels 10. Although not shown, the vehicle state detection device 22 also includes a rotation sensor that detects the output rotation speed of the engine 14, an accelerator sensor that detects the operation position of the accelerator lever, a shift sensor that detects the operation position of the shift lever, and the like.
- The vehicle-mounted control unit 23 includes an engine control unit 23A that controls the engine 14, a speed change unit control unit 23B that controls the speed change unit 16, including vehicle speed changes and forward/reverse switching of the tractor 1, a steering control unit 23C that controls steering, a work device control unit 23D that controls work devices such as the rotary tillage device 3, a display control unit 23E that controls display and notification on the operation terminal 27 and the like, an automatic travel control unit 23F that performs control related to automatic traveling and functions as a work support unit, and a non-volatile vehicle-mounted storage unit 23G that stores a target route P (see FIG. 5) for automatic traveling generated according to the field A.
- Each of the control units 23A to 23F is constructed by an electronic control unit in which a microcontroller and the like are integrated, various control programs, and the like.
- The control units 23A to 23F are connected to one another via CAN (Controller Area Network) so that they can communicate with each other.
- A communication standard other than CAN, or a next-generation communication standard such as in-vehicle Ethernet or CAN-FD (CAN with Flexible Data rate), may be adopted.
- The engine control unit 23A executes engine speed maintenance control that maintains the engine speed at the speed corresponding to the operation position of the accelerator lever, based on the detection information from the accelerator sensor and the detection information from the rotation sensor.
- The speed change unit control unit 23B executes vehicle speed control that controls the operation of the continuously variable transmission so that the vehicle speed of the tractor 1 changes to the speed corresponding to the operation position of the speed change lever, based on the detection information from the shift sensor and the detection information from the vehicle speed sensor 22A, and forward/reverse switching control that switches the transmission state of the forward/reverse switching device based on the detection information from the reverser sensor 22B.
- The vehicle speed control includes a deceleration stop process in which, when the speed change lever is operated to the zero speed position, the continuously variable transmission is decelerated to the zero speed state to stop the traveling of the tractor 1.
- The work device control unit 23D executes work clutch control that controls the operation of the work clutch unit 19 based on the operation of the PTO switch provided in the driving unit 12, elevating control that controls the operation of the elevating drive unit 20 based on the operation of the elevating switch and the set value of the height setting dial provided in the driving unit 12, and rolling control that controls the operation of the rolling unit 21 based on the set value of the roll angle setting dial provided in the driving unit 12.
- The PTO switch, the elevating switch, the height setting dial, and the roll angle setting dial are included in the vehicle state detection device 22.
- The tractor 1 is provided with a positioning unit 30 that measures the position and orientation of the tractor 1.
- The positioning unit 30 includes a satellite navigation device 31 that measures the position and orientation of the tractor 1 using GNSS (Global Navigation Satellite System), which is an example of a satellite positioning system, and an inertial measurement unit (IMU) 32 that has a three-axis gyroscope, a three-direction acceleration sensor, and the like and measures the attitude, orientation, and so on of the tractor 1.
- Positioning methods using GNSS include DGNSS (Differential GNSS: a relative positioning method) and RTK-GNSS (Real Time Kinematic GNSS: an interference positioning method).
- In the present embodiment, RTK-GNSS, which is suitable for positioning a moving body, is adopted. Therefore, as shown in FIG. 1, a base station 6 that enables positioning by RTK-GNSS is installed at a known position around the field.
- The tractor 1 and the base station 6 are respectively provided with GNSS antennas 33 and 60 that receive radio waves transmitted from the positioning satellites 7 (see FIG. 1), and with communication modules 34 and 61 that enable wireless communication of information, including positioning information, between the tractor 1 and the base station 6.
- The satellite navigation device 31 of the positioning unit 30 can measure the position and orientation of the tractor 1 with high accuracy based on the positioning information obtained by the GNSS antenna 33 of the tractor 1 receiving radio waves from the positioning satellites 7 and the positioning information obtained by the GNSS antenna 60 of the base station 6 receiving radio waves from the positioning satellites 7.
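- As a rough illustration of why a base station at a known position improves the accuracy of the tractor's fix, the sketch below applies a simple position-domain correction. Real RTK-GNSS works on carrier-phase measurements and ambiguity resolution, so this only shows the underlying differential idea; the coordinates are made-up local values in metres.

```python
def differential_fix(rover_raw, base_raw, base_known):
    """Apply the base station's measured-vs-known offset to the rover's raw fix."""
    correction = tuple(k - m for k, m in zip(base_known, base_raw))
    return tuple(r + c for r, c in zip(rover_raw, correction))

base_known = (100.000, 200.000)   # surveyed position of the base station
base_raw   = (100.820, 199.310)   # what the base station currently measures
rover_raw  = (153.620, 248.110)   # tractor's uncorrected fix
print(differential_fix(rover_raw, base_raw, base_known))  # ~(152.8, 248.8)
```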
- Since the positioning unit 30 has the satellite navigation device 31 and the inertial measurement unit 32, the position, orientation, and attitude angles (yaw angle, roll angle, pitch angle) of the tractor 1 can be measured with high accuracy.
- The inertial measurement unit 32, the GNSS antenna 33, and the communication module 34 of the positioning unit 30 are included in the antenna unit 35 shown in FIG.
- The antenna unit 35 is arranged at the left-right center of the upper portion on the front side of the cabin 13.
- The vehicle body position used when specifying the position of the tractor 1 is set to the rear wheel axle center position.
- The vehicle body position can be obtained from the positioning information from the positioning unit 42 and the vehicle body information, which includes the positional relationship between the mounting position of the GNSS antenna 45 on the tractor 1 and the rear wheel axle center position.
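- A minimal sketch of deriving the rear wheel axle center position from the antenna fix: the antenna's forward offset along the body axis (1.8 m here) and the heading convention (radians, counter-clockwise from the x axis) are illustrative assumptions, not values from the description.

```python
import math

def rear_axle_position(antenna_x: float, antenna_y: float, heading_rad: float,
                       offset_forward_m: float = 1.8):
    """Shift the antenna fix backwards along the heading by its mounting offset."""
    x = antenna_x - offset_forward_m * math.cos(heading_rad)
    y = antenna_y - offset_forward_m * math.sin(heading_rad)
    return x, y

# Antenna at (10, 5), vehicle heading along +y: the axle centre lies 1.8 m behind.
print(rear_axle_position(10.0, 5.0, math.radians(90)))  # ~(10.0, 3.2)
```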
- The mobile communication terminal 5 is provided with a terminal control unit 51 constructed by an electronic control unit in which a microcontroller and the like are integrated, various control programs, and the like.
- The terminal control unit 51 includes a display control unit 51A that controls display and notification on the display device 50 and the like, a target route generation unit 51B that generates the target route P for automatic traveling, and a non-volatile terminal storage unit 51C that stores the target route P and the like generated by the target route generation unit 51B.
- The terminal storage unit 51C also stores vehicle body information, such as the turning radius of the tractor 1 and the working width or the number of working ridges of the work device such as the rotary tillage device 3, and field information obtained from the above-mentioned positioning information and the like.
- The field information includes, in order to specify the shape and size of the field A, a plurality of shape specifying points (shape specifying coordinates) in the field A acquired by using GNSS when the tractor 1 is driven along the outer peripheral edge of the field A, such as the four corner points Cp1 to Cp4 (see FIG. 5), and a rectangular shape specifying line SL (see FIG. 5) that connects those corner points Cp1 to Cp4 to specify the shape and size of the field A.
- The tractor 1 and the mobile communication terminal 5 are equipped with communication modules 28 and 52, respectively, which enable wireless communication of information, including positioning information, between the vehicle-mounted control unit 23 and the terminal control unit 51.
- The communication module 28 of the tractor 1 functions as a converter that converts communication information bidirectionally between CAN and Wi-Fi.
- The terminal control unit 51 can acquire various information about the tractor 1, including its position and orientation, by wireless communication with the vehicle-mounted control unit 23.
- The display device 50 of the mobile communication terminal 5 can therefore display various information, including the position and orientation of the tractor 1 with respect to the target route P.
- The target route generation unit 51B generates the target route P based on the turning radius of the tractor 1 and the working width or number of working ridges of the work device included in the vehicle body information, the shape and size of the field A included in the field information, and the like.
- Specifically, the target route generation unit 51B first divides the field A into a margin area A1 adjacent to the outer peripheral edge of the field A and a workable area A2 located inside the margin area A1, based on the above-mentioned four corner points Cp1 to Cp4 and the rectangular shape specifying line SL.
- Next, the target route generation unit 51B divides the workable area A2 into a pair of end regions A2a set at the long-side ends of the workable area A2 and a central region A2b set between the pair of end regions A2a, based on the turning radius of the tractor 1, the working width or number of working ridges of the work device, and the like. The target route generation unit 51B then generates, in the central region A2b, a plurality of parallel paths P1 arranged in parallel at predetermined intervals corresponding to the working width or the number of working ridges, in the direction along the long sides of the field A, and generates, in each end region A2a, a plurality of connection paths P2 that connect the parallel paths P1 in the traveling order of the tractor 1.
- In this way, the target route generation unit 51B can generate a target route P along which the tractor 1 can travel automatically from the start position p1 to the end position p2 of automatic traveling set in the field A shown in FIG. 5.
- Each end region A2a serves as a direction change region in which the tractor 1 changes direction from the parallel path P1 it is currently traveling toward the next parallel path P1 along the connection path P2.
- The central region A2b serves as a work region in which the tractor 1 travels automatically in the working state along each parallel path P1.
- Each parallel path P1 is a work path along which the tractor 1 travels automatically while performing work with a work device such as the rotary tillage device 3.
- Each connection path P2 is a non-work path along which the tractor 1 travels automatically without performing work with the work device.
- The start end position p3 of each parallel path P1 is the work start position at which the tractor 1 starts work with the work device.
- The terminal position p4 of each parallel path P1 is the work stop position at which the tractor 1 stops work with the work device.
- The start end position p3 of the parallel path P1 set first in the traveling order of the tractor 1 is the start position p1 of automatic traveling.
- The start end position p3 of each remaining parallel path P1 is the connection position with the end position of a connection path P2. The terminal position p4 of the parallel path P1 set last in the traveling order of the tractor 1 is the end position p2 of automatic traveling, and the terminal position p4 of each remaining parallel path P1 is the connection position with the start position of a connection path P2.
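- A minimal sketch of the parallel-path layout described above, assuming straight paths spaced by the working width and driven in alternating directions; the end regions (headland turns) and the turning radius are ignored here, so only the work paths P1 with their start (p3) and end (p4) points are produced.

```python
def parallel_paths(x_min, x_max, y_min, y_max, work_width):
    """Boustrophedon work paths: one (start, end) pair per pass across the field."""
    paths, x, forward = [], x_min + work_width / 2, True
    while x <= x_max - work_width / 2 + 1e-9:
        start, end = ((x, y_min), (x, y_max)) if forward else ((x, y_max), (x, y_min))
        paths.append((start, end))   # start ~ p3 (work start), end ~ p4 (work stop)
        x += work_width
        forward = not forward        # the next pass is driven the opposite way
    return paths

for p1 in parallel_paths(0.0, 12.0, 0.0, 50.0, work_width=3.0):
    print(p1)
```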
- The target route P shown in FIG. 5 is merely an example; the target route generation unit 51B can generate various target routes P suited to different vehicle body information, which differs depending on the model of the tractor 1 and the type of the work device, and to field information such as the shape and size of the field A.
- The target route P is stored in the terminal storage unit 51C in association with the vehicle body information, the field information, and the like, and can be displayed on the display device 50 of the mobile communication terminal 5.
- The target route P includes the target vehicle speed of the tractor 1 on each parallel path P1, the target vehicle speed of the tractor 1 on each connection path P2, the front wheel steering angle on each parallel path P1, the front wheel steering angle on each connection path P2, and the like.
- The terminal control unit 51 transmits the field information, the target route P, and the like stored in the terminal storage unit 51C to the vehicle-mounted control unit 23 in response to a transmission request command from the vehicle-mounted control unit 23.
- The vehicle-mounted control unit 23 stores the received field information, target route P, and the like in the vehicle-mounted storage unit 23G.
- The terminal control unit 51 may transmit the entire target route P from the terminal storage unit 51C to the vehicle-mounted control unit 23 at once before the tractor 1 starts automatic traveling.
- Alternatively, the terminal control unit 51 may divide the target route P into a plurality of pieces of divided route information for each predetermined distance and, starting from the stage before the tractor 1 begins automatic traveling, sequentially transmit a predetermined number of pieces of divided route information corresponding to the traveling order of the tractor 1 from the terminal storage unit 51C to the vehicle-mounted control unit 23 each time the traveling distance of the tractor 1 reaches the predetermined distance.
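- A minimal sketch of the divided-transmission idea, assuming the route is handled as a list of waypoints split into fixed-size chunks and that a small look-ahead of chunks is kept on the vehicle side; the chunk size and look-ahead count are assumptions.

```python
def split_route(waypoints, chunk_size=20):
    """Divide the target route into pieces of divided route information."""
    return [waypoints[i:i + chunk_size] for i in range(0, len(waypoints), chunk_size)]

def next_chunks(chunks, already_sent, look_ahead=2):
    """Chunks to transmit so the vehicle always holds `look_ahead` pieces."""
    return chunks[already_sent:already_sent + look_ahead]

route = [(i * 1.0, 0.0) for i in range(100)]        # dummy 100-waypoint route
chunks = split_route(route)
print(len(chunks), "pieces;", len(next_chunks(chunks, 0)), "transmitted initially")
```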
- The automatic travel control unit 23F can monitor various setting states in the tractor 1, the operating states of each unit, and the like.
- When various manual setting operations for satisfying the automatic traveling start conditions have been performed by a user such as a passenger or an administrator and the driving mode of the tractor 1 is switched from the manual driving mode to the automatic driving mode, the automatic travel control unit 23F starts automatic travel control that causes the tractor 1 to travel automatically along the target route P while acquiring the position and orientation of the tractor 1 with the positioning unit 30.
- When, for example, the user operates the display device 50 of the mobile communication terminal 5 to instruct the end of automatic traveling, or when a user on board the driving unit 12 instructs the end of automatic traveling, the automatic travel control unit 23F ends the automatic travel control and switches the driving mode from the automatic driving mode to the manual driving mode.
- The automatic travel control by the automatic travel control unit 23F includes an engine automatic control process that transmits control commands for automatic traveling related to the engine 14 to the engine control unit 23A, a vehicle speed automatic control process that transmits control commands for automatic traveling related to vehicle speed changes and forward/reverse switching of the tractor 1 to the speed change unit control unit 23B, a steering automatic control process that transmits control commands for automatic traveling related to steering to the steering control unit 23C, and a work automatic control process that transmits control commands for automatic traveling related to work devices such as the rotary tillage device 3 to the work device control unit 23D.
- In the engine automatic control process, the automatic travel control unit 23F transmits to the engine control unit 23A an engine speed change command instructing a change of the engine speed based on the set speed included in the target route P.
- The engine control unit 23A executes engine speed change control that automatically changes the engine speed in response to the various control commands related to the engine 14 transmitted from the automatic travel control unit 23F.
- In the vehicle speed automatic control process, the automatic travel control unit 23F transmits to the speed change unit control unit 23B a shift operation command instructing a shift operation of the continuously variable transmission based on the target vehicle speed included in the target route P, a forward/reverse switching command instructing a forward/reverse switching operation of the forward/reverse switching device based on the traveling direction of the tractor 1 included in the target route P, and the like.
- The speed change unit control unit 23B executes vehicle speed control that automatically controls the operation of the continuously variable transmission, forward/reverse switching control that automatically controls the operation of the forward/reverse switching device, and the like, in response to the various control commands related to the continuously variable transmission, the forward/reverse switching device, and so on transmitted from the automatic travel control unit 23F.
- The vehicle speed control includes, for example, an automatic deceleration stop process in which, when the target vehicle speed included in the target route P is zero speed, the continuously variable transmission is decelerated to the zero speed state to stop the traveling of the tractor 1.
- In the steering automatic control process, the automatic travel control unit 23F transmits to the steering control unit 23C a steering command instructing the steering of the left and right front wheels 10 based on the front wheel steering angle and the like included in the target route P.
- In response to the steering command transmitted from the automatic travel control unit 23F, the steering control unit 23C executes automatic steering control that controls the operation of the power steering unit 17 to steer the left and right front wheels 10, and automatic brake turning control in which, when the left and right front wheels 10 are steered, the brake unit 18 is operated to apply the brake on the inner side of the turn.
- In the work automatic control process, the automatic travel control unit 23F transmits to the work device control unit 23D a work start command instructing the switching of the work device, such as the rotary tillage device 3, to the working state based on the arrival of the tractor 1 at each work start position (the start end position p3 of each parallel path P1) included in the target route P, and a work stop command instructing the switching of the work device to the non-working state based on the arrival of the tractor 1 at each work stop position (the terminal position p4 of each parallel path P1) included in the target route P.
- In response to the various control commands related to the work device transmitted from the automatic travel control unit 23F, the work device control unit 23D controls the operation of the elevating drive unit 20 and the like, and executes automatic work start control in which the work device is lowered to the working height and operated, and automatic work stop control in which the work device is raised to a non-working height and placed on standby.
- The above-described automatic traveling unit 4 includes the power steering unit 17, the brake unit 18, the work clutch unit 19, the elevating drive unit 20, the rolling unit 21, the vehicle state detection device 22, the vehicle-mounted control unit 23, the positioning unit 30, the communication modules 28 and 34, and the like. When these operate properly, the tractor 1 can be driven automatically with high accuracy along the target route P, and the work by the work device such as the rotary tillage device 3 can be performed properly.
- The tractor 1 is provided with an obstacle detection system 80 that monitors the surroundings of the tractor 1 and detects obstacles existing around the tractor 1.
- Obstacles detected by the obstacle detection system 80 include a person such as a worker working in the field A, another work vehicle, and a utility pole, a tree, or the like existing in the field A.
- The obstacle detection system 80 includes an imaging unit 80A that captures visible light around the tractor 1 and acquires color image information, an obstacle detection unit 80B that detects obstacles existing around the tractor 1, and an information integration processing unit 80C that integrates and processes the color image information from the imaging unit 80A and the detection information from the obstacle detection unit 80B.
- The imaging unit 80A includes a front camera 81 whose imaging range is set to a first imaging range Ri1 in front of the cabin 13, a rear camera 82 whose imaging range is set to a second imaging range Ri2 behind the cabin 13, a right camera 83 whose imaging range is set to a third imaging range Ri3 to the right of the cabin 13, a left camera 84 whose imaging range is set to a fourth imaging range Ri4 to the left of the cabin 13, and an image processing device 85 (see FIG. 7) that processes the color image information from each of the cameras 81 to 84.
- The front camera 81 and the rear camera 82 are arranged on the left-right center line of the tractor 1.
- The front camera 81 is arranged at the left-right center of the upper portion on the front end side of the cabin 13 in a forward-and-downward-facing posture looking down on the front side of the tractor 1 from diagonally above.
- As a result, a predetermined range on the front side of the vehicle body, with the left-right center line of the tractor 1 as the axis of symmetry, is set as the first imaging range Ri1.
- The rear camera 82 is arranged at the left-right center of the upper portion on the rear end side of the cabin 13 in a rearward-and-downward-facing posture looking down on the rear side of the tractor 1 from diagonally above.
- As a result, a predetermined range on the rear side of the vehicle body, with the left-right center line of the tractor 1 as the axis of symmetry, is set as the second imaging range Ri2.
- The right camera 83 is arranged at the center of the upper portion on the right end side of the cabin 13 in a downward-facing posture looking down on the right side of the tractor 1 from diagonally above, so that a predetermined range on the right side of the vehicle body is set as the third imaging range Ri3.
- The left camera 84 is arranged at the center of the upper portion on the left end side of the cabin 13 in a downward-facing posture looking down on the left side of the tractor 1 from diagonally above, so that a predetermined range on the left side of the vehicle body is set as the fourth imaging range Ri4.
- The image processing device 85 is constructed by an electronic control unit in which a microcontroller and the like are integrated, various control programs, and the like.
- The image processing device 85 is connected to the vehicle-mounted control unit 23, the information integration processing unit 80C, and the like via CAN so that they can communicate with one another.
- The image processing device 85 performs image processing on the color image information sequentially transmitted from each of the cameras 81 to 84.
- The image processing device 85 performs image generation processing that generates front, rear, left, and right images corresponding to the imaging ranges of the cameras 81 to 84 from the color image information sequentially transmitted from the cameras 81 to 84, and omnidirectional image generation processing that generates an omnidirectional image (for example, a surround view) of the tractor 1 by synthesizing the color image information from all of the cameras 81 to 84.
- The image processing device 85 then performs image transmission processing that transmits each generated image and the omnidirectional image to the display control unit 23E of the vehicle-mounted control unit 23.
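- A greatly simplified stand-in for the omnidirectional image synthesis, assuming each camera image has already been warped to a top-down strip of fixed size; a real surround view uses per-camera calibration, projection onto the ground plane, and blending of the overlapping regions.

```python
import numpy as np

def surround_view(front, rear, left, right):
    """Paste four already-warped strips around a 300x300 top-view canvas."""
    canvas = np.zeros((300, 300, 3), dtype=np.uint8)
    canvas[0:100, :, :] = front          # front strip, shape (100, 300, 3)
    canvas[200:300, :, :] = rear         # rear strip,  shape (100, 300, 3)
    canvas[100:200, 0:100, :] = left     # left strip,  shape (100, 100, 3)
    canvas[100:200, 200:300, :] = right  # right strip, shape (100, 100, 3)
    return canvas

front = np.full((100, 300, 3), 200, dtype=np.uint8)
rear  = np.full((100, 300, 3), 150, dtype=np.uint8)
left  = np.full((100, 100, 3), 100, dtype=np.uint8)
right = np.full((100, 100, 3),  50, dtype=np.uint8)
print(surround_view(front, rear, left, right).shape)  # (300, 300, 3)
```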
- The display control unit 23E transmits each image and the omnidirectional image received from the image processing device 85 to the operation terminal 27 via CAN, and also to the display control unit 51A of the mobile communication terminal 5 via the communication modules 28 and 52.
- As a result, the omnidirectional image generated by the image processing device 85, the image corresponding to the traveling direction of the tractor 1, and the like can be displayed on the operation terminal 27 of the tractor 1, the display device 50 of the mobile communication terminal 5, and so on. Through this display, the user can visually recognize the situation around the tractor 1 and the situation in its traveling direction.
- In other words, by displaying the omnidirectional image generated by the image processing device 85, the image corresponding to the traveling direction of the tractor 1, and the like on the operation terminal 27 of the tractor 1 and the display device 50 of the mobile communication terminal 5, the display control unit 23E of the vehicle-mounted control unit 23 functions as a work support unit that enables work support such as making it easier for the user to visually recognize the situation around the work vehicle V.
- The image processing device 85 has been subjected to learning processing for recognizing, as obstacles, a person such as a worker working in the field A, another work vehicle, a utility pole or tree existing in the field A, and the like. As a result, the image processing device 85 can perform obstacle determination processing that determines, based on the color image information sequentially transmitted from the cameras 81 to 84, whether or not an obstacle that affects the traveling of the tractor 1 exists in any of the imaging ranges Ri1 to Ri4 of the cameras 81 to 84.
- When the image processing device 85 determines in the obstacle determination processing that an obstacle exists in any of the imaging ranges Ri1 to Ri4, it performs coordinate calculation processing that obtains the coordinates of the obstacle on the image in which the obstacle exists, and coordinate conversion processing that converts the obtained coordinates of the obstacle into coordinates based on the vehicle body coordinate origin, using the mounting position and mounting angle of the corresponding camera 81 to 84. It then performs distance calculation processing that obtains the straight-line distance between the converted coordinates and a preset distance calculation reference point as the distance from the distance calculation reference point to the obstacle, and obstacle information transmission processing that transmits the converted coordinates and the obtained distance to the obstacle to the information integration processing unit 80C as information about the obstacle. On the other hand, when no obstacle exists in any of the imaging ranges Ri1 to Ri4, the image processing device 85 performs undetected transmission processing that notifies the information integration processing unit 80C that no obstacle has been detected.
- When the image processing device 85 transmits information about an obstacle to the information integration processing unit 80C, the information integration processing unit 80C can grasp that the obstacle exists in one of the imaging ranges Ri1 to Ri4 of the cameras 81 to 84, and can obtain the position of the obstacle and the distance to it.
- When the image processing device 85 transmits the undetected notification to the information integration processing unit 80C, the information integration processing unit 80C can grasp, by receiving it, that no obstacle exists in any of the imaging ranges Ri1 to Ri4 of the cameras 81 to 84.
- The vehicle body coordinate origin used in the coordinate conversion processing and the distance calculation reference point used in the distance calculation processing are set for each of the cameras 81 to 84 according to its mounting position.
- A vehicle body coordinate origin O1 and a distance calculation reference point Rp1 are set for the front camera 81 according to its mounting position.
- Likewise, a vehicle body coordinate origin O2 and a distance calculation reference point Rp2, a vehicle body coordinate origin O3 and a distance calculation reference point Rp3, and a vehicle body coordinate origin O4 and a distance calculation reference point Rp4 are set for the other cameras according to their respective mounting positions.
- For example, when an obstacle O exists in the first imaging range Ri1 of the front camera 81, the image processing device 85 obtains the coordinates of the obstacle on the image of the front camera 81 in which the obstacle exists (coordinate calculation processing).
- The obtained coordinates of the obstacle are converted into coordinates (x, y) based on the vehicle body coordinate origin O1 shown in FIG. 8, using the mounting position and mounting angle of the front camera 81 (coordinate conversion processing).
- The straight-line distance between the converted coordinates (x, y) and the distance calculation reference point Rp1 is then obtained as the distance La from the distance calculation reference point Rp1 to the obstacle O (distance calculation processing).
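- A minimal sketch of the coordinate conversion and distance calculation described above, using a 2D rotation and translation; the mounting offset and yaw of the camera and the position of the reference point are illustrative values (the description only states that the camera's mounting position and mounting angle are used).

```python
import math

def to_body_coords(obstacle_cam_xy, mount_xy, mount_yaw_rad):
    """Rotate camera-frame ground coordinates by the mounting yaw, then translate."""
    cx, cy = obstacle_cam_xy
    cos_a, sin_a = math.cos(mount_yaw_rad), math.sin(mount_yaw_rad)
    return (mount_xy[0] + cx * cos_a - cy * sin_a,
            mount_xy[1] + cx * sin_a + cy * cos_a)

def distance_to_reference(body_xy, reference_xy):
    """Straight-line distance La from the distance calculation reference point."""
    return math.hypot(body_xy[0] - reference_xy[0], body_xy[1] - reference_xy[1])

xy = to_body_coords((4.0, 0.5), mount_xy=(1.2, 0.0), mount_yaw_rad=0.0)
print(xy, distance_to_reference(xy, reference_xy=(0.0, 0.0)))  # (5.2, 0.5), ~5.22 m
```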
- The obstacle detection unit 80B includes a first obstacle sensor 86 that detects obstacles existing in front of the tractor 1, a second obstacle sensor 87 that detects obstacles existing behind the tractor 1, and a third obstacle sensor 88 that detects obstacles existing on the left and right sides of the tractor 1.
- The first obstacle sensor 86 and the second obstacle sensor 87 are lidar sensors that use a pulsed near-infrared laser beam (an example of near-infrared light) to detect obstacles.
- The third obstacle sensor 88 is a sonar that uses ultrasonic waves to detect obstacles.
- The first obstacle sensor 86 includes a first measuring unit 86A that measures the distance to a measurement object existing in its measurement range using near-infrared laser light, and a first control unit 86B that generates a distance image and the like based on the measurement information from the first measuring unit 86A.
- The second obstacle sensor 87 includes a second measuring unit 87A that measures the distance to a measurement object existing in its measurement range using near-infrared laser light, and a second control unit 87B that generates a distance image and the like based on the measurement information from the second measuring unit 87A.
- The third obstacle sensor 88 includes a right ultrasonic sensor 88A and a left ultrasonic sensor 88B that transmit and receive ultrasonic waves, and a single third control unit 88C that measures the distance to a measurement object existing in the measurement range based on the ultrasonic waves transmitted and received by the ultrasonic sensors 88A and 88B.
- The control units 86B, 87B, and 88C of the obstacle sensors 86 to 88 are constructed by electronic control units in which microcontrollers and the like are integrated, various control programs, and the like.
- The control units 86B, 87B, and 88C are connected to the vehicle-mounted control unit 23, the information integration processing unit 80C, and the like via CAN so that they can communicate with one another.
- The first measurement range Rm1 in front of the cabin 13 is set as a measurement range.
- The second measurement range Rm2 behind the cabin 13 is set as a measurement range.
- A third measurement range Rm3 to the right of the cabin 13 and a fourth measurement range Rm4 to the left of the cabin 13 are also set as measurement ranges.
- The first obstacle sensor 86 and the second obstacle sensor 87 are arranged on the left-right center line of the tractor 1, like the front camera 81 and the rear camera 82.
- The first obstacle sensor 86 is arranged at the left-right center of the upper portion on the front end side of the cabin 13 in a forward-and-downward-facing posture looking down on the front side of the tractor 1 from diagonally above.
- As a result, a predetermined range on the front side of the vehicle body, with the left-right center line of the tractor 1 as the axis of symmetry, is set as the first measurement range Rm1 of the first measuring unit 86A.
- The second obstacle sensor 87 is arranged at the left-right center of the upper portion on the rear end side of the cabin 13 in a rearward-and-downward-facing posture looking down on the rear side of the tractor 1 from diagonally above.
- As a result, a predetermined range on the rear side of the vehicle body, with the left-right center line of the tractor 1 as the axis of symmetry, is set as the second measurement range Rm2 of the second measuring unit 87A.
- When the forward/reverse switching device of the speed change unit 16 is switched to the forward transmission state and the tractor 1 travels forward, the first obstacle sensor 86 is placed in the operating state and the second obstacle sensor 87 is placed in the operation-stopped state in conjunction with that switching.
- Conversely, when the forward/reverse switching device of the speed change unit 16 is switched to the reverse transmission state and the tractor 1 travels backward, the first obstacle sensor 86 is placed in the operation-stopped state and the second obstacle sensor 87 is placed in the operating state in conjunction with that switching.
- The right ultrasonic sensor 88A is attached to the right boarding/alighting step 24, arranged between the right front wheel 10 and the right rear wheel 11, in a posture facing outward to the right of the vehicle body.
- As a result, a predetermined range on the right outer side of the vehicle body is set as the third measurement range Rm3.
- The left ultrasonic sensor 88B is attached to the left boarding/alighting step 24, arranged between the left front wheel 10 and the left rear wheel 11, in a posture facing outward to the left of the vehicle body.
- As a result, a predetermined range on the left outer side of the vehicle body is set as the fourth measurement range Rm4.
- Each of the measuring units 86A and 87A measures the distance to a distance measuring point by the TOF (Time Of Flight) method, which is based on the round-trip time from when the irradiated near-infrared laser light reaches the distance measuring point until it returns.
- Each of the measuring units 86A and 87A scans the near-infrared laser light vertically and horizontally at high speed over the entire first measurement range Rm1 or second measurement range Rm2 and sequentially measures the distance to the distance measuring point at each scanning angle (coordinate), thereby performing three-dimensional measurement over the first measurement range Rm1 or the second measurement range Rm2.
- Each of the measuring units 86A and 87A also sequentially measures the intensity of the reflected light (hereinafter referred to as the reflection intensity) from each distance measuring point obtained when the near-infrared laser light is scanned vertically and horizontally at high speed over the entire first measurement range Rm1 or second measurement range Rm2.
- Each of the measuring units 86A and 87A repeatedly measures, in real time, the distance to each distance measuring point in the first measurement range Rm1 or the second measurement range Rm2, each reflection intensity, and the like.
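- A minimal sketch of the TOF principle and of turning one scan sample (round-trip time plus horizontal and vertical scan angles) into a 3D point; the numeric values are illustrative.

```python
import math

SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_distance(round_trip_time_s: float) -> float:
    """Distance to the measuring point from the laser pulse's round-trip time."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

def scan_sample_to_point(round_trip_time_s, azimuth_rad, elevation_rad):
    """Convert one (time, horizontal angle, vertical angle) sample to x, y, z."""
    r = tof_distance(round_trip_time_s)
    x = r * math.cos(elevation_rad) * math.cos(azimuth_rad)
    y = r * math.cos(elevation_rad) * math.sin(azimuth_rad)
    z = r * math.sin(elevation_rad)
    return x, y, z

print(round(tof_distance(66.7e-9), 2))                       # ~10.0 m
print(scan_sample_to_point(66.7e-9, math.radians(5), math.radians(-2)))
```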
- Each of the control units 86B and 87B of the first obstacle sensor 86 and the second obstacle sensor 87 generates a distance image from measurement information such as the distance to each distance measurement point measured by the measurement unit 86A or 87A and the scanning angle (coordinates) for each distance measurement point, extracts a group of distance measurement points presumed to be an obstacle, and transmits the measurement information on the extracted point group to the information integration processing unit 80C as measurement information on an obstacle candidate.
- The control units 86B and 87B of the first obstacle sensor 86 and the second obstacle sensor 87 also determine whether the distance value of each distance measurement point measured by the measurement units 86A and 87A meets an invalidity condition, and transmit any distance value that meets the condition to the information integration processing unit 80C as an invalid value.
- Each of the control units 86B and 87B uses the characteristic of dirt on the sensor surface, namely that it exists at an extremely short distance from the first obstacle sensor 86 or the second obstacle sensor 87, and regards the distance value of a distance measurement point having that characteristic as an invalid value. This prevents the distance value of a distance measurement point caused by dirt on the sensor surface from being used by the information integration processing unit 80C as measurement information on an obstacle.
- Each of the control units 86B and 87B likewise uses the characteristic of suspended matter such as dust and fog, namely that it exists at a short distance from the first obstacle sensor 86 or the second obstacle sensor 87 while its reflection intensity is very weak, and regards the distance value of a distance measurement point having that characteristic as an invalid value. This prevents the distance value of a distance measurement point caused by suspended matter from being used by the information integration processing unit 80C as measurement information on an obstacle.
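- These invalidation rules can be pictured as a simple per-point filter; the thresholds in the sketch below are illustrative assumptions, not values stated in this disclosure:

```python
# Sketch of the invalid-value rules: a point that is extremely close to the
# sensor suggests dirt on the sensor surface, while a nearby point with very
# weak reflection intensity suggests suspended dust or fog; both are flagged
# invalid so they are not treated as obstacles downstream.
DIRT_DISTANCE_M = 0.05            # assumed "extremely short distance" threshold
FLOATING_DISTANCE_M = 1.0         # assumed "short distance" threshold
MIN_REFLECTION_INTENSITY = 0.1    # assumed "very weak" intensity threshold

def is_invalid_point(distance_m: float, reflection_intensity: float) -> bool:
    if distance_m < DIRT_DISTANCE_M:
        return True   # likely dirt adhering to the sensor surface
    if distance_m < FLOATING_DISTANCE_M and reflection_intensity < MIN_REFLECTION_INTENSITY:
        return True   # likely dust, fog, or other suspended matter
    return False
```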
- The third control unit 88C of the third obstacle sensor 88 determines the presence or absence of a measurement object in the third measurement range Rm3 or the fourth measurement range Rm4 based on the transmission and reception of ultrasonic waves by the left and right ultrasonic sensors 88A and 88B.
- The third control unit 88C measures the distance from each of the ultrasonic sensors 88A and 88B to the measurement object by a TOF (Time Of Flight) method that measures the distance to the measurement point based on the round-trip time until the transmitted ultrasonic wave reaches the measurement point and returns, and transmits the measured distance to the measurement object and the direction of the measurement object to the information integration processing unit 80C as measurement information on the obstacle.
- Each of the control units 86B and 87B of the first obstacle sensor 86 and the second obstacle sensor 87 limits the obstacle detection target range of the first obstacle sensor 86 and the second obstacle sensor 87 to the first detection range Rd1 and the second detection range Rd2 by applying a cut process and a masking process, described below, to the measurement ranges Rm1 and Rm2 of the measurement units 86A and 87A according to the vehicle body.
- The control units 86B and 87B acquire the maximum left-right width of the vehicle body including the work device (in the present embodiment, the left-right width of the rotary tillage device 3) by communicating with the in-vehicle control unit 23, and set the obstacle detection target width Wd by adding a predetermined safety band to that maximum left-right width. Then, in the first measurement range Rm1 and the second measurement range Rm2, the left and right ranges outside the detection target width Wd are set as the first non-detection range Rnd1 by the cut process and excluded from the respective detection ranges Rd1 and Rd2.
- The control units 86B and 87B further set, as the second non-detection range Rnd2 by the masking process, a range obtained by adding a predetermined safety band to the range where the front end side of the tractor 1 enters the first measurement range Rm1 and to the range where the rear end side of the work device enters the second measurement range Rm2, and exclude it from the respective detection ranges Rd1 and Rd2. A sketch of both processes is given below.
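- The cut and masking processes can be viewed as filtering measured points by lateral position and by proximity to the vehicle or implement. The sketch below is a simplified assumption (points in vehicle coordinates, x forward and y to the left; applying the safety band per side is also an assumption):

```python
# Sketch of the cut process (first non-detection range Rnd1) and the masking
# process (second non-detection range Rnd2). Parameter values are illustrative.

def detection_target_width_m(max_vehicle_width_m: float, safety_band_m: float) -> float:
    # Wd: maximum left-right width of body plus implement, plus a safety band
    return max_vehicle_width_m + 2.0 * safety_band_m

def is_in_detection_range(x_m: float, y_m: float, wd_m: float,
                          masked_near_zone_m: float) -> bool:
    if abs(y_m) > wd_m / 2.0:
        return False   # outside the detection target width (cut process, Rnd1)
    if x_m < masked_near_zone_m:
        return False   # vehicle front end / implement rear end plus margin (masking, Rnd2)
    return True
```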
- This avoids an increased detection load caused by the first obstacle sensor 86 and the second obstacle sensor 87 detecting objects that lie outside the detection target width Wd and would not collide with the tractor 1, and also avoids the risk of erroneously detecting, as an obstacle, the front end side of the tractor 1 or the rear end side of the work device that enters the first measurement range Rm1 or the second measurement range Rm2.
- the second non-detection range Rnd2 shown in FIG. 9 is an example of a non-detection range suitable for the front side of the vehicle body where the left and right front wheels 10 and the bonnet 15 are present.
- the second non-detection range Rnd2 shown in FIG. 10 is an example of a non-detection range suitable for a working state in which the rotary tillage device 3 is lowered to the working height on the rear side of the vehicle body.
- the second non-detection range Rnd2 shown in FIG. 11 is an example of a non-detection range suitable for a non-working state in which the rotary tillage device 3 is raised to the evacuation height on the rear side of the vehicle body.
- the second non-detection range Rnd2 on the rear side of the vehicle body is appropriately switched in conjunction with the raising and lowering of the rotary tillage device 3.
- Information on the first detection range Rd1, the second detection range Rd2, the first non-detection range Rnd1, and the second non-detection range Rnd2 is included in the above-mentioned distance image and transmitted to the information integration processing unit 80C together with the distance image.
- The detection ranges Rd1 and Rd2 of the first obstacle sensor 86 and the second obstacle sensor 87 are each divided into a stop control range Rsc, a deceleration control range Rdc, and a notification control range Rnc based on a collision determination process that determines whether the predicted time to collision has reached a set time (for example, 3 seconds).
- the stop control range Rsc is set in the range from the first obstacle sensor 86 or the second obstacle sensor 87 to the determination reference position of the collision determination process.
- the deceleration control range Rdc is set in the range from the determination reference position to the deceleration start position.
- the notification control range Rnc is set in the range from the deceleration start position to the measurement limit position of the first obstacle sensor 86 or the second obstacle sensor 87.
- Each determination reference position is set at a position separated by a fixed distance L (for example, 2000 mm) in the vehicle body front-rear direction from the front end or the rear end of the vehicle body including the rotary tillage device 3.
- the information integration processing unit 80C is constructed by an electronic control unit in which a microcontroller and the like are integrated, various control programs, and the like.
- The information integration processing unit 80C executes obstacle information output control that acquires and outputs information on an obstacle based on a combination of the information from the imaging unit 80A, which has low distance measurement accuracy but high object discrimination accuracy, and the measurement information from the obstacle detection unit 80B, which has low object discrimination accuracy but high distance measurement accuracy.
- Hereinafter, only the control operation in which the information integration processing unit 80C acquires and outputs information on an obstacle based on the color image information from the front camera 81 of the imaging unit 80A and the measurement information from the first obstacle sensor 86 of the obstacle detection unit 80B will be described.
- The information integration processing unit 80C performs a first determination process of determining whether an obstacle is detected from the color image information of the front camera 81 based on the information from the imaging unit 80A (step #1), and performs a second determination process of determining whether an obstacle candidate is included in the distance image of the first obstacle sensor 86 based on the measurement information from the obstacle detection unit 80B (step #2).
- When an obstacle is detected from the color image information of the front camera 81 in the first determination process and an obstacle candidate is included in the distance image of the first obstacle sensor 86 in the second determination process, the information integration processing unit 80C performs a third determination process of determining whether the position of the obstacle and the position of the obstacle candidate match (step #3).
- When the positions match in the third determination process, the information integration processing unit 80C determines that the detection state of the obstacle (obstacle candidate) by the front camera 81 and the first obstacle sensor 86 is a proper detection state in which the front camera 81 and the first obstacle sensor 86 properly detect the same obstacle (obstacle candidate). Based on this determination result, it performs a proper information acquisition process that applies the distance information of the obstacle candidate measured by the first obstacle sensor 86 to the obstacle detected from the color image information of the front camera 81 and acquires this information as obstacle detection information, and a proper information output process that outputs the acquired obstacle detection information (steps #4 to #5).
- Otherwise, the information integration processing unit 80C performs a fourth determination process of determining whether the measured value, in the distance information from the first obstacle sensor 86, at the position corresponding to the position of the obstacle in the color image information of the front camera 81 is valid (step #6).
- When the measured value is valid in the fourth determination process, the information integration processing unit 80C determines that the detection state of the obstacle (obstacle candidate) by the front camera 81 and the first obstacle sensor 86 is a quasi-proper detection state in which the obstacle is properly detected from the color image information of the front camera 81 and the measurement by the first obstacle sensor 86 is performed properly, but the object cannot be specified as an obstacle candidate from the distance image of the first obstacle sensor 86. Based on this determination result, it performs a quasi-proper information acquisition process that applies, to the obstacle detected from the color image information of the front camera 81, the distance information measured by the first obstacle sensor 86 for the measurement object corresponding to the position of the obstacle indicated by the front camera 81 and acquires this information as obstacle detection information, and a quasi-proper information output process that outputs the acquired obstacle detection information (steps #7 to #8).
- When the measured value is invalid in the fourth determination process, the information integration processing unit 80C determines that the detection state of the obstacle (obstacle candidate) by the front camera 81 and the first obstacle sensor 86 is a camera-only detection state in which the obstacle is properly detected from the color image information of the front camera 81, but the measured value for the measurement object corresponding to the position of the obstacle indicated by the front camera 81 is invalid in the distance image of the first obstacle sensor 86 because of floating matter such as dust or fog at that position or dirt adhering to the sensor surface of the first obstacle sensor 86. Based on this determination result, it performs a camera-only information acquisition process that applies the distance information for the obstacle calculated from the color image information of the front camera 81 to the obstacle detected from that color image information and acquires this information as obstacle detection information, and a camera-only information output process that outputs the acquired obstacle detection information (steps #9 to #10).
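- The branching of steps #1 to #10 can be summarized in the following sketch; the data structures and the position-matching helper are hypothetical stand-ins for illustration only:

```python
# Sketch of the obstacle information output control (steps #1 to #10).
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class Detection:
    label: str
    distance_m: float

def fuse_obstacle_info(camera_obstacle,             # detection from the color image, or None
                       lidar_candidate,             # obstacle candidate from the distance image, or None
                       positions_match: Callable,   # hypothetical position-matching test
                       lidar_value_at: Callable     # distance at a given position, None if invalid
                       ) -> Optional[Detection]:
    if camera_obstacle is None:
        return None
    if lidar_candidate is not None and positions_match(camera_obstacle, lidar_candidate):
        # proper detection state: apply the lidar distance of the matching candidate (steps #4-#5)
        return Detection(camera_obstacle.label, lidar_candidate.distance_m)
    measured = lidar_value_at(camera_obstacle.position)
    if measured is not None:
        # quasi-proper detection state: valid lidar distance at the camera position (steps #7-#8)
        return Detection(camera_obstacle.label, measured)
    # camera-only detection state: fall back to the camera-derived distance (steps #9-#10)
    return Detection(camera_obstacle.label, camera_obstacle.estimated_distance_m)
```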
- the in-vehicle control unit 23 includes an obstacle control unit 23H that executes control related to obstacles based on obstacle detection information from the information integration processing unit 80C.
- the obstacle control unit 23H is constructed by an electronic control unit in which a microcontroller and the like are integrated, various control programs, and the like.
- the obstacle control unit 23H is connected to the information integration processing unit 80C or the like via CAN so that they can communicate with each other.
- The obstacle control unit 23H executes collision avoidance control for avoiding a collision with an obstacle based on the obstacle detection information from the information integration processing unit 80C. Specifically, when the obstacle control unit 23H acquires obstacle detection information produced by the proper information output process or the quasi-proper information output process from the information integration processing unit 80C, it executes a first collision avoidance control as the collision avoidance control. When the obstacle control unit 23H acquires obstacle detection information produced by the camera-only information output process from the information integration processing unit 80C, it executes, as the collision avoidance control, a second collision avoidance control having a higher collision avoidance rate than the first collision avoidance control.
- Based on the distance to the obstacle included in the obstacle detection information, the obstacle control unit 23H performs a fifth determination process of determining whether the obstacle is located in the notification control range Rnc of the first detection range Rd1 shown in FIG. 4, a sixth determination process of determining whether it is located in the deceleration control range Rdc, and a seventh determination process of determining whether it is located in the stop control range Rsc (steps #11 to #13).
- When the obstacle control unit 23H determines that the obstacle is located in the notification control range Rnc, it performs a first notification command process of commanding the display control unit 23E of the in-vehicle control unit 23 and the display control unit 51A of the terminal control unit 51 to notify of the obstacle on the operation terminal 27 of the tractor 1 or the display device 50 of the mobile communication terminal 5 (step #14).
- When the obstacle control unit 23H determines that the obstacle is located in the deceleration control range Rdc, it performs the notification command process (step #15) and also performs a deceleration command process of commanding the speed change unit control unit 23B to reduce the vehicle speed of the tractor 1 as the obstacle located in the deceleration control range Rdc approaches the tractor 1 (step #16).
- When the obstacle control unit 23H determines that the obstacle is located in the stop control range Rsc, it performs the notification command process (step #17) and also performs a deceleration stop command process of commanding the speed change unit control unit 23B to decelerate and stop the tractor 1 while the obstacle is located in the stop control range Rsc (step #18).
- As a result, the tractor 1 can be decelerated and stopped while the obstacle is located in the stop control range Rsc, and the possibility that the work vehicle V collides with the obstacle can be avoided.
- In the second collision avoidance control, compared with the first collision avoidance control described above, the obstacle control unit 23H lengthens, within the first detection range Rd1 described above, the distance from the determination reference position to the boundary between the deceleration control range Rdc and the stop control range Rsc and the distance from the determination reference position to the boundary between the notification control range Rnc and the deceleration control range Rdc. The second collision avoidance control is thus a collision avoidance control that can avoid a collision with an obstacle at an earlier timing than the first collision avoidance control.
- This makes it easier for the obstacle control unit 23H to avoid a collision with an obstacle even when it operates on the basis of distance information derived from the imaging unit 80A, which has lower distance measurement accuracy than the measurement information from the obstacle detection unit 80B.
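- The range-based dispatch of steps #11 to #18, and the widened ranges used by the second collision avoidance control, can be sketched as follows; the deceleration start boundary and the widening factor are assumptions for illustration, while the 2 m reference distance echoes the example value given above:

```python
# Sketch of the collision avoidance dispatch. distance_m is the distance from
# the vehicle (including the implement) to the detected obstacle.
def collision_avoidance(distance_m: float, use_second_control: bool,
                        notify, decelerate, stop):
    scale = 1.5 if use_second_control else 1.0   # assumed widening factor
    stop_boundary_m = 2.0 * scale    # stop boundary (determination reference position; e.g. 2000 mm in the first control)
    decel_boundary_m = 6.0 * scale   # assumed deceleration start position
    if distance_m <= stop_boundary_m:
        notify(); stop()          # stop control range Rsc (steps #17-#18)
    elif distance_m <= decel_boundary_m:
        notify(); decelerate()    # deceleration control range Rdc (steps #15-#16)
    else:
        notify()                  # notification control range Rnc (step #14)
```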
- The automatic traveling control of the automatic traveling control unit 23F includes a dirt handling control process that controls the traveling of the tractor 1 when adhesion of dirt (a measurement obstruction) to the sensor surface of the first obstacle sensor 86 or the second obstacle sensor 87 is detected.
- In the dirt handling control process, the automatic traveling control unit 23F performs an eighth determination process of determining whether the ratio, to the measurement range Rm1 of the first obstacle sensor 86, of invalid values caused by a measurement obstruction such as dirt and included in the measurement information from the first obstacle sensor 86 is equal to or greater than a predetermined value (for example, 50%) (step #21).
- When the ratio of invalid values is equal to or greater than the predetermined value in the eighth determination process, the automatic traveling control unit 23F performs a deceleration command process of commanding the speed change unit control unit 23B to reduce the vehicle speed of the tractor 1 to an ultra-low speed at which the traveling state of the tractor 1 can be maintained in a creep traveling state (step #22).
- When the ratio of invalid values is less than the predetermined value in the eighth determination process, the automatic traveling control unit 23F performs a vehicle speed maintenance command process of commanding the speed change unit control unit 23B to maintain the vehicle speed of the tractor 1 at the current vehicle speed (step #23).
- After performing the deceleration command process, the automatic traveling control unit 23F performs a ninth determination process of determining whether the creep traveling state of the tractor 1 has continued for a predetermined time (step #24). Until the creep traveling state has continued for the predetermined time, the automatic traveling control unit 23F repeats the above-mentioned eighth determination process (step #25). If the ratio of invalid values falls below the predetermined value in this eighth determination process, it determines that the sensor surface of the first obstacle sensor 86 is not dirty but that floating matter such as dust or fog is drifting around the first obstacle sensor 86, and performs a vehicle speed return command process of commanding the speed change unit control unit 23B to return the vehicle speed of the tractor 1 to the original vehicle speed before it was reduced to the ultra-low speed (step #26). Thereafter, the process returns to step #21.
- If the creep traveling state has continued for the predetermined time with the ratio of invalid values still equal to or greater than the predetermined value, the automatic traveling control unit 23F determines that the sensor surface of the first obstacle sensor 86 is dirty and performs a traveling stop command process of commanding the speed change unit control unit 23B to immediately stop the traveling of the tractor 1 (step #27).
- Since the vehicle speed of the tractor 1 is reduced to an ultra-low speed slower than the low speed and the traveling state of the tractor 1 is maintained in the creep traveling state, the time available for determining whether the cause of the invalid values is a deposit such as dirt on the sensor surface of each of the obstacle sensors 86 to 88, or floating matter such as dust or fog drifting around each of the obstacle sensors 86 to 88, can be lengthened compared with the case where the traveling state of the tractor 1 is maintained in a low-speed traveling state.
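- One way to picture the dirt handling control process (steps #21 to #27) is as the loop below; the 50% threshold comes from the example above, while the timer value, polling interval, and helper names are assumptions:

```python
import time

# Sketch of the dirt handling control process (steps #21 to #27).
INVALID_RATIO_THRESHOLD = 0.50   # "predetermined value (for example, 50%)"
CREEP_TIME_LIMIT_S = 10.0        # "predetermined time" -- assumed value

def dirt_handling_control(get_invalid_ratio, set_creep_speed, restore_speed, stop_vehicle):
    while True:
        if get_invalid_ratio() < INVALID_RATIO_THRESHOLD:   # step #21
            time.sleep(0.1)                                  # step #23: keep the current speed
            continue
        set_creep_speed()                                    # step #22: ultra-low (creep) speed
        creep_started = time.monotonic()
        while time.monotonic() - creep_started < CREEP_TIME_LIMIT_S:   # step #24
            if get_invalid_ratio() < INVALID_RATIO_THRESHOLD:          # step #25
                restore_speed()    # step #26: judged to be floating matter, not dirt
                break
            time.sleep(0.1)
        else:
            stop_vehicle()         # step #27: sensor surface judged to be dirty
            return
```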
- In this way, the information integration processing unit 80C and the obstacle control unit 23H function as a work support unit that enables work support such as notifying of the presence of an obstacle around the work vehicle V and avoiding a collision with the obstacle, based on detection of the obstacle by the imaging unit 80A and the obstacle detection unit 80B.
- The information integration processing unit 80C includes a growth information acquisition unit 80Ca that acquires, based on the color image information from the imaging unit 80A and the measurement information from the first obstacle sensor 86, growth information of the crop Z cultivated in the field A, which serves as an index for evaluating the growth of the crop Z.
- The growth information acquisition unit 80Ca divides the central area (work area) A2b of the field A into a plurality of sections and acquires the growth information of the crop Z for each growth information acquisition area Ag (an example of a predetermined area) (see FIGS. 15 to 18).
- As the growth information of the crop Z, the growth information acquisition unit 80Ca acquires, for example, a normalized difference vegetation index (hereinafter referred to as NDVI), which is an example of a vegetation index indicating the activity of the crop Z, a vegetation coverage rate corresponding to the number of stems, and the plant height value of the crop Z.
- FIG. 16 is an example of a color image in front of the vehicle body generated from the color image information captured by the front camera 81 of the image pickup unit 80A.
- FIG. 17 is an example of a distance image in front of the vehicle body generated from the measurement information of the first obstacle sensor 86.
- FIG. 18 is an example of a near-infrared image of the front of the vehicle body generated from the measurement information of the first obstacle sensor 86.
- The growth information acquisition unit 80Ca executes growth information acquisition control that acquires the growth information of the crop Z for each growth information acquisition area Ag in conjunction with the automatic traveling of the tractor 1, for example in an inter-row cultivation work state in which a cultivator (not shown) for inter-row cultivation work, which is an example of the work device, is connected to the rear of the tractor 1.
- In the growth information acquisition control, the growth information acquisition unit 80Ca performs a color image information acquisition process of acquiring, from the imaging unit 80A, the color image information of the first imaging range Ri1 captured by the front camera 81, and a visible red light reflectance acquisition process of acquiring, from the color image information of the front camera 81, the reflectance R of visible red light in the growth information acquisition area Ag (steps #31 to #32).
- The growth information acquisition unit 80Ca also performs a measurement information acquisition process of acquiring the measurement information of the first obstacle sensor 86 from the obstacle detection unit 80B, a reflection intensity acquisition process of acquiring the reflection intensity (see FIG. 18) of the near-infrared laser light from each measurement object existing in the growth information acquisition area Ag included in that measurement information, and a near-infrared light reflectance calculation process of calculating the average of the acquired reflection intensities in the growth information acquisition area Ag as the near-infrared light reflectance IR in that growth information acquisition area Ag (steps #33 to #35).
- The growth information acquisition unit 80Ca performs an NDVI calculation process of calculating the NDVI in the growth information acquisition area Ag by substituting the reflectance R of visible red light and the reflectance IR of near-infrared light in the same growth information acquisition area Ag, acquired by the visible red light reflectance acquisition process and the near-infrared light reflectance calculation process, into the calculation formula NDVI = (IR - R) / (IR + R).
- the growth information acquisition unit 80Ca can acquire the NDVI based on the color image information from the image pickup unit 80A and the measurement information from the first obstacle sensor 86.
- the value of NDVI takes a value of -1 to 1, and the higher the nitrogen absorption of crop Z and the higher the activity, the higher the value.
- the growth information acquisition unit 80Ca can generate an NDVI image by coloring black to white, for example, according to the value of the NDVI, and can visualize the activity of the crop Z and the like.
- The growth information acquisition unit 80Ca performs a vegetation coverage acquisition process of acquiring the vegetation coverage rate in the growth information acquisition area Ag based on the NDVI image described above (step #37), and a nitrogen absorption amount estimation process of estimating the nitrogen absorption amount of the crop Z in the growth information acquisition area Ag by taking the product of the NDVI in the same growth information acquisition area Ag, obtained by the NDVI calculation process, and the vegetation coverage rate (step #38). The index arithmetic is illustrated in the sketch below.
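- The per-area index calculations of steps #31 to #38 reduce to a few arithmetic steps. The sketch below follows the formulas given in the text (NDVI = (IR - R) / (IR + R); nitrogen absorption estimated as NDVI x vegetation coverage rate), while the example input values are only illustrative:

```python
# Sketch of the per-area index calculations. Inputs are assumed to be per-area
# averages already extracted from the camera image and the lidar reflection
# intensities; image preprocessing is omitted.

def ndvi(ir_reflectance: float, red_reflectance: float) -> float:
    # NDVI = (IR - R) / (IR + R), ranging from -1 to 1
    return (ir_reflectance - red_reflectance) / (ir_reflectance + red_reflectance)

def estimate_nitrogen_absorption(ndvi_value: float, vegetation_coverage: float) -> float:
    # Step #38: product of NDVI and the vegetation coverage rate for the area
    return ndvi_value * vegetation_coverage

# Example for one growth information acquisition area Ag:
area_ndvi = ndvi(ir_reflectance=0.55, red_reflectance=0.12)            # ~0.64
nitrogen_index = estimate_nitrogen_absorption(area_ndvi, vegetation_coverage=0.7)
```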
- The growth information acquisition unit 80Ca further performs a plant height value calculation process of calculating the plant height value of the crop Z in the growth information acquisition area Ag based on the color image information of the front camera 81 acquired in the color image information acquisition process, the mounting position and mounting angle of the front camera 81 on the tractor 1, the distance value information included in the measurement information of the first obstacle sensor 86 acquired in the measurement information acquisition process, and the mounting position and mounting angle of the first obstacle sensor 86 on the tractor 1 (steps #39 to #40).
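- As a rough geometric illustration of how a height above the ground could be derived from a measured distance and the sensor mounting geometry (this trigonometric sketch is a generic assumption, not the calculation actually used in this disclosure):

```python
import math

# Generic sketch: with a sensor mounted at a known height and tilted downward,
# the height above the ground of the point hit by the beam can be estimated
# from the measured distance; for a beam hitting the top of the canopy this
# approximates the plant height.
def hit_point_height_m(measured_distance_m: float,
                       sensor_height_m: float,
                       beam_depression_angle_rad: float) -> float:
    drop = measured_distance_m * math.sin(beam_depression_angle_rad)
    return sensor_height_m - drop

# Example: sensor 2.5 m above ground, beam tilted 30 degrees downward,
# measured distance 4.0 m -> hit point about 0.5 m above the ground.
print(hit_point_height_m(4.0, 2.5, math.radians(30)))
```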
- The growth information acquisition unit 80Ca performs a growth information storage process of storing the acquired NDVI, vegetation coverage rate, nitrogen absorption amount, plant height value of the crop Z, and the like for the same growth information acquisition area Ag in the in-vehicle storage unit 23G in association with the position information of the tractor 1 acquired by the positioning unit 30 (step #41).
- In this way, the obstacle detection system 80 provided on the tractor 1 for avoiding collisions with obstacles can also be used as a growth information acquisition means for acquiring growth information, such as the NDVI, vegetation coverage rate, nitrogen absorption amount, and plant height of the crop Z cultivated in the field A. As a result, compared with the case where the tractor 1 is provided with a dedicated growth information acquisition means, the growth information of the crop Z can be acquired with high accuracy while suppressing complication of the configuration and increases in the cost of the tractor 1.
- The information integration processing unit 80C includes a growth evaluation unit 80Cb that evaluates the growth status of the crop in each growth information acquisition area Ag based on the crop growth information acquired by the growth information acquisition unit 80Ca and calculates the fertilizer application amount for each growth information acquisition area Ag.
- the growth evaluation unit 80Cb stores the calculated fertilizer application amount for each growth information acquisition area Ag in the vehicle-mounted storage unit 23G in a state of being associated with the position information of the tractor 1 acquired by the positioning unit 30.
- During fertilizer application work, fertilizer application amount adjustment control is performed that automatically adjusts the fertilizer application amount of the fertilizer spraying device for each growth information acquisition area Ag, based on the fertilizer application amount for each growth information acquisition area Ag stored in the in-vehicle storage unit 23G in association with the position information of the tractor 1, and on the position information of the tractor 1 acquired by the positioning unit 30 during the work. A lookup-style sketch of this control follows.
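- The adjustment control amounts to looking up the stored per-area application amount for the area containing the current position and applying it to the spraying device; the rectangular-grid lookup and the helper names below are hypothetical simplifications:

```python
# Sketch of per-area variable-rate fertilizer control: the stored application
# amount for the growth information acquisition area Ag that contains the
# current position is applied to the spraying device.

def area_id_for_position(x_m: float, y_m: float, area_size_m: float = 10.0):
    # Assumed rectangular grid of growth information acquisition areas Ag.
    return (int(x_m // area_size_m), int(y_m // area_size_m))

def adjust_application_rate(position_xy, rate_by_area: dict, set_spreader_rate,
                            default_rate: float = 0.0):
    area = area_id_for_position(*position_xy)
    set_spreader_rate(rate_by_area.get(area, default_rate))
```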
- the configuration of the work vehicle V can be changed in various ways.
- the work vehicle V may have a semi-crawler specification traveling vehicle body 1 having left and right crawlers instead of the left and right rear wheels 11.
- the work vehicle V may have a traveling vehicle body 1 having a full crawler specification, which is provided with left and right crawlers instead of the left and right front wheels 10 and the left and right rear wheels 11.
- the work vehicle V may be configured to have a traveling vehicle body 1 capable of only manual traveling.
- the work vehicle V may be configured to have an electric specification including an electric motor instead of the engine 14.
- the work vehicle V may be configured in a hybrid specification including an engine 14 and an electric motor.
- The information integration processing unit 80C and the obstacle control unit 23H, which function as the work support unit, may be configured to provide work support, such as notifying of the presence of an obstacle around the work vehicle V and avoiding a collision with the obstacle, based on detection of the obstacle by the obstacle detection unit 80B alone. Further, the obstacle control unit 23H may be configured to provide work support by performing only a notification process of notifying of the presence of an obstacle around the work vehicle V based on detection of the obstacle, thereby making it easier to avoid a collision with the obstacle.
- The growth information acquisition unit 80Ca may be configured to acquire the growth information of the crop Z cultivated in the field A based on the color image information from the rear camera 82 of the imaging unit 80A and the measurement information from the second obstacle sensor 87.
- A right lidar sensor whose measurement range is set to the third measurement range Rm3 on the right side of the cabin 13 and a left lidar sensor whose measurement range is set to the fourth measurement range Rm4 on the left side of the cabin 13 may be provided. In this configuration, the growth information acquisition unit 80Ca may be configured to acquire the growth information of the crop Z cultivated in the field A based on the color image information from the right camera 83 of the imaging unit 80A and the measurement information from the right lidar sensor, or based on the color image information from the left camera 84 of the imaging unit 80A and the measurement information from the left lidar sensor.
- The growth information acquisition unit 80Ca may be configured to acquire, as the growth information of the crop Z, a green normalized difference vegetation index (GNDVI: Green Normalized Difference Vegetation Index), which is less affected by the soil in the furrows than NDVI. The GNDVI can be calculated by substituting the reflectance G of visible green light and the reflectance IR of near-infrared light in the growth information acquisition area Ag into the calculation formula GNDVI = (IR - G) / (IR + G).
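- Following the formula above, GNDVI differs from NDVI only in using the green-band reflectance; a minimal sketch:

```python
def gndvi(ir_reflectance: float, green_reflectance: float) -> float:
    # GNDVI = (IR - G) / (IR + G)
    return (ir_reflectance - green_reflectance) / (ir_reflectance + green_reflectance)
```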
- The growth information acquisition unit 80Ca may also be configured to acquire, as the growth information of the crop Z, a perpendicular vegetation index (PVI: Perpendicular Vegetation Index), a soil-adjusted vegetation index (SAVI: Soil Adjusted Vegetation Index), and the like.
- The growth information acquired by the growth information acquisition unit 80Ca, such as the NDVI, vegetation coverage rate, nitrogen absorption amount, and plant height value of the crop Z, and the evaluation results of the growth evaluation unit 80Cb may be stored in a cloud server in association with the position information of the tractor 1 acquired by the positioning unit 30.
- The information integration processing unit 80C, including the growth information acquisition unit 80Ca and the growth evaluation unit 80Cb, may be provided in the in-vehicle control unit 23 or in a cloud server.
- The second characteristic configuration of the present invention is that the obstacle sensor is a lidar sensor that three-dimensionally measures the distance to a measurement object existing in the measurement range, and that the growth information acquisition unit acquires the plant height value of the crop as the growth information based on the distance information from the lidar sensor.
- According to this configuration, since the lidar sensor has high distance measurement accuracy, the plant height value of the crop can be acquired with high accuracy as the growth information of the crop. Based on the crop growth information including the acquired plant height value, the growth status of the crop in each predetermined area of the field can be evaluated more accurately, and based on the evaluated growth status of each predetermined area, a more appropriate fertilizer application amount for the crop in each predetermined area can be calculated.
- The third characteristic configuration of the present invention is that the growth information acquisition unit acquires the reflectance of visible red light from the color image information, acquires the reflectance of near-infrared light from the measurement information, and acquires the normalized difference vegetation index as the growth information.
- The fourth characteristic configuration of the present invention is that the work support unit includes an automatic traveling control unit that causes the work vehicle to travel automatically, based on the position information, along a target route generated according to the field, and that the growth information acquisition unit acquires the growth information for each predetermined area set by dividing the field into a plurality of areas, in accordance with the automatic traveling of the work vehicle.
- The fifth characteristic configuration of the present invention is that the growth information acquisition unit calculates the fertilizer application amount for each predetermined area based on the growth information for each predetermined area, and that the automatic traveling control unit automatically adjusts the fertilizer application amount for each predetermined area based on the position information and the calculated fertilizer application amount when fertilizer application work is performed by automatic traveling of the work vehicle.
- According to this configuration, fertilizer application work according to the crop growth information for each predetermined area of the field can be performed by automatic traveling of the work vehicle.
Landscapes
- Engineering & Computer Science (AREA)
- Life Sciences & Earth Sciences (AREA)
- Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- General Physics & Mathematics (AREA)
- Environmental Sciences (AREA)
- Soil Sciences (AREA)
- Health & Medical Sciences (AREA)
- Computer Networks & Wireless Communication (AREA)
- Business, Economics & Management (AREA)
- Theoretical Computer Science (AREA)
- General Health & Medical Sciences (AREA)
- Pathology (AREA)
- Mining & Mineral Resources (AREA)
- Analytical Chemistry (AREA)
- Biochemistry (AREA)
- Spectroscopy & Molecular Physics (AREA)
- Mathematical Physics (AREA)
- Immunology (AREA)
- Forests & Forestry (AREA)
- Ecology (AREA)
- Electromagnetism (AREA)
- Botany (AREA)
- Biodiversity & Conservation Biology (AREA)
- Agronomy & Crop Science (AREA)
- Animal Husbandry (AREA)
- Marine Sciences & Fisheries (AREA)
- Chemical & Material Sciences (AREA)
- Mechanical Engineering (AREA)
- Economics (AREA)
- Human Resources & Organizations (AREA)
- Marketing (AREA)
- Primary Health Care (AREA)
- Strategic Management (AREA)
- Tourism & Hospitality (AREA)
- General Business, Economics & Management (AREA)
- Guiding Agricultural Machines (AREA)
- Fertilizing (AREA)
- Investigating Or Analysing Materials By Optical Means (AREA)
Abstract
This work assistance system comprises: an imaging unit (80A) that images visible light within an imaging range set around a work vehicle, and acquires color image information; an obstacle sensor (86) that acquires measurement information including the reflection intensity of near-infrared light by projecting and receiving the near-infrared light in a measurement range set around the work vehicle, and detects an obstacle on the basis of the acquired measurement information; a work assistance unit (23E, 23H, 80C) that assists work performed by the work vehicle on the basis of the color image information and the measurement information; a measurement unit that measures the position of the work vehicle and acquires position information; a growth information acquisition unit (80Ca) that acquires growth information for plants cultivated in a field on the basis of the color image information and the measurement information; and a storage unit (23G) that associates the growth information with the position information and stores the growth information.
Description
The present invention relates to a work support system that supports work performed by a work vehicle such as a tractor or a riding-type crop management machine.
In a tractor, which is an example of a work vehicle, there is a known configuration in which a dedicated plant sensor device that acquires growth information (information on the growth status) of a crop (plant) is installed on the tractor, so that the growth status of the crop cultivated in a field is measured as the tractor travels through the field (see, for example, Patent Document 1).
The plant sensor device described in Patent Document 1 irradiates the same irradiation area, in which the crop subject to growth measurement is cultivated, with first measurement light and second measurement light, and acquires the reflected light of the first measurement light and the second measurement light from the measurement target. Using the measurement light and the reflected light, it acquires, as growth information of the crop to be measured, a normalized difference vegetation index (NDVI: Normalized Difference Vegetation Index), which is an example of a spectral vegetation index, and a plant height value.
With the technique described in Patent Document 1, the amount of fertilizer to be applied according to the growth status of the crop can thus be determined based on the normalized difference vegetation index and the plant height value acquired by the plant sensor device. As a result, work support becomes possible such that, during spraying work with a fertilizer spreader mounted on the tractor, the amount of fertilizer applied to the crop can be adjusted according to the growth status of the crop.
In the technique described in Patent Document 1, however, a dedicated plant sensor device for acquiring growth information such as the normalized difference vegetation index and the plant height value must be installed on the tractor to enable the work support described above, which leads to higher cost and a more complicated configuration.
In view of this situation, the main object of the present invention is to enable work support using crop growth information while suppressing increases in cost and complication of the configuration.
The first characteristic configuration of the present invention resides in a work support system comprising:
an imaging unit that is mounted on a work vehicle working in a field and captures visible light in an imaging range set around the work vehicle to acquire color image information;
an obstacle sensor that is mounted on the work vehicle, acquires measurement information including the reflection intensity of near-infrared light by projecting and receiving near-infrared light over a measurement range set around the work vehicle, and detects an obstacle based on the acquired measurement information;
a work support unit that supports work by the work vehicle based on the color image information and the measurement information;
a positioning unit that measures the position of the work vehicle and acquires position information;
a growth information acquisition unit that acquires growth information of a crop cultivated in the field based on the color image information and the measurement information; and
a storage unit that stores the growth information in association with the position information.
According to this configuration, the work support unit can, for example, display an image of the surroundings of the work vehicle on a display unit based on the color image information from the imaging unit, making it easier to view the situation around the work vehicle. The work support unit can also, for example, notify that an obstacle is present around the work vehicle based on detection of the obstacle by the obstacle sensor, making it easier to avoid a collision with the obstacle. In work support concerning obstacles, because the measurement information includes the reflection intensity of near-infrared light, the obstacle sensor can avoid erroneously detecting, as an obstacle, floating matter such as dust or fog generated in its vicinity, whose reflection intensity is very weak.
The growth information acquisition unit acquires crop growth information by using the color image information and the measurement information that are acquired to enable the work support described above. The storage unit stores the crop growth information acquired by the growth information acquisition unit in association with the position information from the positioning unit.
Thereby, for example, the growth status of the crop in each predetermined area of the field can be evaluated based on the crop growth information stored in the storage unit in association with the position information, and the amount of fertilizer to be applied to the crop in each predetermined area can be calculated based on the evaluated growth status. When fertilizer spraying work is performed by a work vehicle for fertilizer application, adjusting the amount of fertilizer applied to the crop in each predetermined area based on the calculated application amount makes it possible to reduce variation in the growth state of the crop across the field, improve the quality of the crop cultivated in the field, and stabilize the yield.
In other words, by using the imaging unit and the obstacle sensor provided to enable work support such as making it easier to view the situation around the work vehicle and to avoid collisions with obstacles, crop growth information can be acquired without providing a dedicated sensor for that purpose. As a result, the work support made possible by acquiring crop growth information, such as improving crop quality and stabilizing yield, can be provided while suppressing increases in cost and complication of the configuration.
Hereinafter, an example of an embodiment of the present invention will be described with reference to the drawings.
As shown in FIGS. 1 to 5, the work vehicle V exemplified in this embodiment has a tractor 1, which is an example of a traveling vehicle body, and a rotary tillage device 3, which is an example of a work device, detachably connected to the rear of the tractor via a link mechanism 2. The work vehicle V is thereby configured as a rotary tillage specification capable of tilling work with the rotary tillage device 3. The rotary tillage device 3 is connected to the rear of the tractor 1 via the link mechanism 2 so that it can be raised, lowered, and rolled.
The traveling vehicle body may be, instead of the tractor 1, a riding-type crop management machine or the like with a high minimum ground clearance. The work device may be, instead of the rotary tillage device 3, a plow, a disc harrow, a cultivator, a subsoiler, a seeding device, a spraying device, a mowing device, or the like.
The tractor 1 can be made to travel automatically in the field A shown in FIG. 5 or the like by using an automatic traveling system for work vehicles. As shown in FIGS. 1 and 6, the automatic traveling system for work vehicles includes an automatic traveling unit 4 mounted on the tractor 1 and a mobile communication terminal 5, which is an example of a wireless communication device configured for wireless communication with the automatic traveling unit 4. The mobile communication terminal 5 is provided with a multi-touch display device (for example, a liquid crystal panel) 50 that enables various information displays and input operations related to automatic traveling.
A tablet personal computer, a smartphone, or the like can be adopted as the mobile communication terminal 5. For the wireless communication, a wireless LAN (Local Area Network) such as Wi-Fi (registered trademark) or short-range wireless communication such as Bluetooth (registered trademark) can be adopted.
As shown in FIGS. 1 to 3 and 6, the tractor 1 includes drivable and steerable left and right front wheels 10, drivable left and right rear wheels 11, a cabin 13 forming a riding-type driving section 12, an electronically controlled diesel engine (hereinafter referred to as the engine) 14 having a common rail system, a bonnet 15 covering the engine 14 and the like, and a speed change unit 16 that changes the speed of the power from the engine 14. An electronically controlled gasoline engine having an electronic governor may be adopted as the engine 14.
As shown in FIG. 6, the tractor 1 includes a fully hydraulic power steering unit 17 that steers the left and right front wheels 10, a brake unit 18 that brakes the left and right rear wheels 11, an electro-hydraulically controlled work clutch unit 19 that engages and disengages transmission to the rotary tillage device 3, an electro-hydraulically controlled elevating drive unit 20 that raises and lowers the rotary tillage device 3, an electro-hydraulically controlled rolling unit 21 that enables the rotary tillage device 3 to be driven in the roll direction, a vehicle state detection device 22 including various sensors and switches that detect various setting states and operating states of each part of the tractor 1, and an in-vehicle control unit 23 having various control units. The power steering unit 17 may be an electric type having an electric motor for steering.
As shown in FIGS. 1 and 3, the driving section 12 is provided with a steering wheel 25 for manual steering, a seat 26 for an occupant, and an operation terminal 27 that enables various information displays and input operations. Although not shown, the driving section 12 is also provided with operation levers such as an accelerator lever and a shift lever, and operation pedals such as an accelerator pedal and a clutch pedal. A multi-touch liquid crystal monitor, an ISOBUS-compatible virtual terminal, or the like can be adopted as the operation terminal 27.
Although not shown, the speed change unit 16 includes an electronically controlled continuously variable transmission that changes the speed of the power from the engine 14 and an electro-hydraulically controlled forward/reverse switching device that switches the power output by the continuously variable transmission between forward and reverse. The continuously variable transmission adopts an I-HMT (Integrated Hydro-static Mechanical Transmission), which is an example of a hydro-mechanical continuously variable transmission with higher transmission efficiency than a hydrostatic transmission (HST: Hydro Static Transmission). The forward/reverse switching device includes a hydraulic clutch for engaging and disengaging forward power, a hydraulic clutch for engaging and disengaging reverse power, and electromagnetic valves that control the flow of oil to them.
Instead of the I-HMT, the continuously variable transmission may be an HMT (Hydraulic Mechanical Transmission), which is another example of a hydro-mechanical continuously variable transmission, a hydrostatic continuously variable transmission, or a belt-type continuously variable transmission. Further, instead of the continuously variable transmission, the speed change unit 16 may include an electro-hydraulically controlled stepped transmission having a plurality of hydraulic clutches for shifting and a plurality of solenoid valves that control the flow of oil to them.
Although not shown, the brake unit 18 includes left and right brakes that individually brake the left and right rear wheels 11, a foot brake system that operates the left and right brakes in conjunction with depression of the left and right brake pedals provided in the driving section 12, a parking brake system that operates the left and right brakes in conjunction with operation of a parking lever provided in the driving section 12, and a turning brake system that operates the brake on the inside of a turn in conjunction with steering of the left and right front wheels 10 beyond a set angle.
車両状態検出機器22は、トラクタ1の各部に備えられた各種のセンサやスイッチなどの総称である。図7に示すように、車両状態検出機器22には、トラクタ1の車速を検出する車速センサ22A、前後進切り換え用のリバーサレバーの操作位置を検出するリバーサセンサ22B、及び、前輪10の操舵角を検出する舵角センサ22C、が含まれている。又、図示は省略するが、車両状態検出機器22には、エンジン14の出力回転数を検出する回転センサ、アクセルレバーの操作位置を検出するアクセルセンサ、及び、変速レバーの操作位置を検出する変速センサ、などが含まれている。
The vehicle state detection device 22 is a general term for various sensors and switches provided in each part of the tractor 1. As shown in FIG. 7, the vehicle state detection device 22 includes a vehicle speed sensor 22A that detects the vehicle speed of the tractor 1, a reverser sensor 22B that detects the operation position of the reverser lever for forward / backward switching, and a steering angle of the front wheels 10. The steering angle sensor 22C, which detects the above, is included. Although not shown, the vehicle state detection device 22 includes a rotation sensor that detects the output rotation speed of the engine 14, an accelerator sensor that detects the operation position of the accelerator lever, and a shift that detects the operation position of the shift lever. Sensors, etc. are included.
As shown in FIGS. 6 to 7, the vehicle-mounted control unit 23 includes an engine control unit 23A that performs control relating to the engine 14, a transmission unit control unit 23B that performs control relating to the transmission unit 16 such as the vehicle speed and forward/reverse switching of the tractor 1, a steering control unit 23C that performs control relating to steering, a work device control unit 23D that performs control relating to work devices such as the rotary tillage device 3, a display control unit 23E that performs control relating to display and notification on the operation terminal 27 and the like, an automatic travel control unit 23F that functions as a work assistance unit by performing control relating to automatic travel, and a non-volatile vehicle-mounted storage unit 23G that stores, among other things, the target route P for automatic travel (see FIG. 5) generated according to the field A. Each of the control units 23A to 23F is constructed from an electronic control unit in which a microcontroller and the like are integrated, various control programs, and so on. The control units 23A to 23F are connected so as to be able to communicate with one another via a CAN (Controller Area Network).
For mutual communication between the control units 23A to 23F, a communication standard other than CAN or a next-generation communication standard, such as in-vehicle Ethernet or CAN-FD (CAN with Flexible Data rate), may be adopted instead.
The engine control unit 23A executes, among other things, engine speed maintenance control that maintains the engine speed at a speed corresponding to the operation position of the accelerator lever, based on the detection information from the accelerator sensor and the detection information from the rotation sensor.
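Purely as an illustrative aid, the following is a minimal sketch of what such an engine speed maintenance loop could look like. The linear lever-to-speed mapping, the gain value, and all names are assumptions for the example, not values taken from this specification.

```python
def lever_to_target_rpm(lever_position: float, idle_rpm: float = 850.0, max_rpm: float = 2600.0) -> float:
    """Map an accelerator lever position (0.0-1.0) to a target engine speed (assumed linear mapping)."""
    return idle_rpm + lever_position * (max_rpm - idle_rpm)

def engine_speed_maintenance_step(lever_position: float, measured_rpm: float,
                                  fuel_command: float, gain: float = 0.002) -> float:
    """One control step: nudge the fuel command so the measured speed tracks the target (simple proportional law)."""
    target_rpm = lever_to_target_rpm(lever_position)
    error = target_rpm - measured_rpm
    # Clamp the command to its valid range (0.0 = no fuel, 1.0 = full fuel).
    return max(0.0, min(1.0, fuel_command + gain * error))
```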
The transmission unit control unit 23B executes, based on the detection information from the shift sensor, the detection information from the vehicle speed sensor 22A, and so on, vehicle speed control that controls the operation of the continuously variable transmission so that the vehicle speed of the tractor 1 is changed to a speed corresponding to the operation position of the shift lever, and forward/reverse switching control that switches the transmission state of the forward/reverse switching device based on the detection information from the reverser sensor 22B, among other processes. The vehicle speed control includes a deceleration stop process that, when the shift lever is operated to the zero-speed position, decelerates the continuously variable transmission to the zero-speed state to stop the travel of the tractor 1.
The work device control unit 23D executes work clutch control that controls the operation of the work clutch unit 19 based on, for example, operation of a PTO switch provided in the driving unit 12, lifting control that controls the operation of the lifting drive unit 20 based on, for example, operation of a lift switch provided in the driving unit 12 and the set value of a height setting dial, and rolling control that controls the operation of the rolling unit 21 based on, for example, the set value of a roll angle setting dial provided in the driving unit 12, among other processes. The PTO switch, the lift switch, the height setting dial, and the roll angle setting dial are included in the vehicle state detection device 22.
As shown in FIG. 6, the tractor 1 is provided with a positioning unit 30 that measures the position, orientation, and so on of the tractor 1. The positioning unit 30 includes a satellite navigation device 31 that measures the position and orientation of the tractor 1 using GNSS (Global Navigation Satellite System), which is an example of a satellite positioning system, and an inertial measurement unit (IMU) 32 that has a three-axis gyroscope, three-direction acceleration sensors, and the like and measures the attitude, orientation, and so on of the tractor 1. Positioning methods using GNSS include DGNSS (Differential GNSS: relative positioning) and RTK-GNSS (Real Time Kinematic GNSS: interferometric positioning). In this embodiment, RTK-GNSS, which is suitable for positioning a moving body, is adopted. Therefore, as shown in FIG. 1, a base station 6 that enables positioning by RTK-GNSS is installed at a known position around the field.
As shown in FIGS. 1 and 6, the tractor 1 and the base station 6 are each provided with a GNSS antenna 33, 60 that receives radio waves transmitted from the positioning satellites 7 (see FIG. 1) and a communication module 34, 61 that enables wireless communication of information, including positioning information, between the tractor 1 and the base station 6. With this arrangement, the satellite navigation device 31 of the positioning unit 30 can measure the position and orientation of the tractor 1 with high accuracy based on the positioning information obtained by the GNSS antenna 33 of the tractor 1 receiving the radio waves from the positioning satellites 7 and the positioning information obtained by the GNSS antenna 60 of the base station 6 receiving the radio waves from the positioning satellites 7. Moreover, since the positioning unit 30 has both the satellite navigation device 31 and the inertial measurement unit 32, it can measure the position, orientation, and attitude angles (yaw angle, roll angle, pitch angle) of the tractor 1 with high accuracy.
In this tractor 1, the inertial measurement unit 32, the GNSS antenna 33, and the communication module 34 of the positioning unit 30 are included in the antenna unit 35 shown in FIG. 1. The antenna unit 35 is arranged at the upper left-right center of the front side of the cabin 13.
Although not shown, the vehicle body position used when specifying the position of the tractor 1 is set to the center position of the rear wheel axle. The vehicle body position can be obtained from the positioning information from the positioning unit 30 and vehicle body information that includes the positional relationship between the mounting position of the GNSS antenna 33 on the tractor 1 and the center position of the rear wheel axle.
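As a purely illustrative aid, the sketch below shows one way the rear-axle-center body position could be derived from the antenna position and a known antenna offset in the horizontal plane. The offset value, coordinate convention, and function names are assumptions for the example and do not come from this specification (attitude compensation using the IMU is omitted).

```python
import math

def body_position_from_antenna(antenna_east: float, antenna_north: float, yaw_rad: float,
                               offset_forward_m: float = 1.2) -> tuple[float, float]:
    """Project the GNSS antenna position back to the rear wheel axle center.

    The antenna is assumed to sit offset_forward_m metres ahead of the rear axle along
    the vehicle's longitudinal axis; yaw_rad is the heading measured counter-clockwise from east.
    """
    axle_east = antenna_east - offset_forward_m * math.cos(yaw_rad)
    axle_north = antenna_north - offset_forward_m * math.sin(yaw_rad)
    return axle_east, axle_north
```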
As shown in FIG. 6, the mobile communication terminal 5 is provided with a terminal control unit 51 having an electronic control unit in which a microcontroller and the like are integrated, various control programs, and so on. The terminal control unit 51 includes a display control unit 51A that performs control relating to display and notification on the display device 50 and the like, a target route generation unit 51B that generates the target route P for automatic travel, and a non-volatile terminal storage unit 51C that stores the target route P and the like generated by the target route generation unit 51B. The terminal storage unit 51C stores, as various information used to generate the target route P, vehicle body information such as the turning radius of the tractor 1 and the working width or the number of working ridges of a work device such as the rotary tillage device 3, and the field information obtained from the positioning information described above. The field information includes, for specifying the shape, size, and so on of the field A, four corner points Cp1 to Cp4 (see FIG. 5) serving as a plurality of shape-specifying points (shape-specifying coordinates) in the field A acquired using GNSS when the tractor 1 is driven along the outer peripheral edge of the field A, and a rectangular shape-specifying line SL (see FIG. 5) that connects these corner points Cp1 to Cp4 and specifies the shape, size, and so on of the field A.
As shown in FIG. 6, the tractor 1 and the mobile communication terminal 5 are provided with communication modules 28 and 52 that enable wireless communication of information, including positioning information, between the vehicle-mounted control unit 23 and the terminal control unit 51. When Wi-Fi is adopted for wireless communication with the mobile communication terminal 5, the communication module 28 of the tractor 1 functions as a converter that converts communication information bidirectionally between CAN and Wi-Fi. The terminal control unit 51 can acquire various information about the tractor 1, including its position and orientation, through wireless communication with the vehicle-mounted control unit 23. This allows the display device 50 of the mobile communication terminal 5 to display various information, including the position and orientation of the tractor 1 relative to the target route P.
The target route generation unit 51B generates the target route P based on the turning radius of the tractor 1 and the working width or the number of working ridges of the work device included in the vehicle body information, the shape and size of the field A included in the field information, and so on.
For example, as shown in FIG. 5, when the start position p1 and the end position p2 of automatic travel are set in the rectangular field A and the working travel direction of the tractor 1 is set along the short sides of the field A, the target route generation unit 51B first divides the field A, based on the four corner points Cp1 to Cp4 and the rectangular shape-specifying line SL described above, into a margin area A1 adjacent to the outer peripheral edge of the field A and a workable area A2 located inside the margin area A1.
Next, the target route generation unit 51B divides the workable area A2, based on the turning radius of the tractor 1, the working width or the number of working ridges of the work device, and so on, into a pair of end areas A2a set at the ends of the workable area A2 on its long sides and a central area A2b set between the pair of end areas A2a. The target route generation unit 51B then generates, in the central area A2b, a plurality of parallel paths P1 arranged in parallel at a predetermined spacing corresponding to the working width or the number of working ridges, in the direction along the long sides of the field A. Further, the target route generation unit 51B generates, in each end area A2a, a plurality of connection paths P2 that connect the parallel paths P1 in the travel order of the tractor 1 (a sketch of this layout step is given after the following paragraph).
In this way, the target route generation unit 51B can generate a target route P along which the tractor 1 can travel automatically from the start position p1 to the end position p2 of the automatic travel set in the field A shown in FIG. 5.
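For illustration only, the following sketch lays out straight working passes spaced by the working width inside a rectangular workable area, alternating their direction so that adjacent passes can be joined by turns at the ends. The coordinate convention, the centring of the first pass, and all names are assumptions for the example, not the generation procedure of the specification.

```python
def generate_parallel_paths(area_min_x: float, area_max_x: float,
                            area_min_y: float, area_max_y: float,
                            working_width: float) -> list[tuple[tuple[float, float], tuple[float, float]]]:
    """Return straight working passes parallel to the y-axis (the field's long side),
    spaced by the working width across the x-axis (the field's short side)."""
    paths = []
    x = area_min_x + working_width / 2.0   # centre the first pass inside the area
    forward = True
    while x <= area_max_x - working_width / 2.0 + 1e-9:
        start = (x, area_min_y) if forward else (x, area_max_y)
        end = (x, area_max_y) if forward else (x, area_min_y)
        paths.append((start, end))
        forward = not forward              # alternate direction for headland turns
        x += working_width
    return paths
```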
In the field A shown in FIG. 5, the margin area A1 is an area secured between the outer peripheral edge of the field A and the workable area A2 in order to prevent the work device and the like from coming into contact with other objects adjacent to the field A, such as ridges and fences, when the tractor 1 travels automatically along the ends of the workable area A2. Each end area A2a is a turning area in which the tractor 1 changes direction from the parallel path P1 it is currently travelling toward the next parallel path P1 in accordance with a connection path P2. The central area A2b is a work area in which the tractor 1 travels automatically in the working state along each parallel path P1.
In the target route P shown in FIG. 5, each parallel path P1 is a work path along which the tractor 1 travels automatically while performing work with a work device such as the rotary tillage device 3, and each connection path P2 is a non-work path along which the tractor 1 travels automatically without performing work with the work device. The start position p3 of each parallel path P1 is a work start position at which the tractor 1 starts work with the work device, and the end position p4 of each parallel path P1 is a work stop position at which the tractor 1 stops work with the work device. Among the start positions p3 of the parallel paths P1, the start position p3 of the parallel path P1 assigned first in the travel order of the tractor 1 is the start position p1 of automatic travel, and the start positions p3 of the remaining parallel paths P1 are connection positions with the end positions of the connection paths P2. Likewise, the end position p4 of the parallel path P1 assigned last in the travel order of the tractor 1 is the end position p2 of automatic travel, and the end positions p4 of the remaining parallel paths P1 are connection positions with the start positions of the connection paths P2.
Note that the target route P shown in FIG. 5 is merely an example, and the target route generation unit 51B can generate various target routes P suited to different combinations of vehicle body information, which differs depending on the model of the tractor 1 and the type of work device, and field information such as the shape and size of each field A.
The target route P is stored in the terminal storage unit 51C in a state associated with the vehicle body information, the field information, and so on, and can be displayed on the display device 50 of the mobile communication terminal 5. The target route P includes the target vehicle speed of the tractor 1 on each parallel path P1, the target vehicle speed of the tractor 1 on each connection path P2, the front wheel steering angle on each parallel path P1, the front wheel steering angle on each connection path P2, and so on.
The terminal control unit 51 transmits the field information, the target route P, and the like stored in the terminal storage unit 51C to the vehicle-mounted control unit 23 in response to a transmission request command from the vehicle-mounted control unit 23, and the vehicle-mounted control unit 23 stores the received field information, target route P, and the like in the vehicle-mounted storage unit 23G. Regarding the transmission of the target route P, the terminal control unit 51 may, for example, transmit the entire target route P from the terminal storage unit 51C to the vehicle-mounted control unit 23 at once before the tractor 1 starts automatic travel. Alternatively, the terminal control unit 51 may divide the target route P into a plurality of pieces of divided route information for each predetermined distance and, starting before the tractor 1 begins automatic travel and then each time the travel distance of the tractor 1 reaches the predetermined distance, sequentially transmit a predetermined number of pieces of divided route information corresponding to the travel order of the tractor 1 from the terminal storage unit 51C to the vehicle-mounted control unit 23.
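As a minimal illustration of the second option, the sketch below splits an ordered route into fixed-size chunks and computes how many chunks should have been delivered for a given travel distance. The chunking granularity, the preload count, and all names are assumptions for the example, not the transmission policy defined by the specification.

```python
def split_route(route_points: list[tuple[float, float]], points_per_chunk: int) -> list[list[tuple[float, float]]]:
    """Split an ordered list of route points into equally sized pieces of divided route information."""
    return [route_points[i:i + points_per_chunk]
            for i in range(0, len(route_points), points_per_chunk)]

def chunks_due(total_chunks: int, distance_travelled_m: float,
               trigger_distance_m: float, preload: int = 2) -> int:
    """Number of chunks that should have been delivered so far: a few are preloaded before
    travel starts, then one more chunk each time the trigger distance has been covered."""
    due = preload + int(distance_travelled_m // trigger_distance_m)
    return min(due, total_chunks)
```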
In the vehicle-mounted control unit 23, detection information from the various sensors, switches, and the like included in the vehicle state detection device 22 is input to the automatic travel control unit 23F via the transmission unit control unit 23B, the steering control unit 23C, and so on. This allows the automatic travel control unit 23F to monitor the various setting states of the tractor 1, the operating states of its parts, and so on.
When a user such as an on-board occupant or an administrator has performed the various manual setting operations required to satisfy the automatic travel start conditions and the travel mode of the tractor 1 has been switched from the manual travel mode to the automatic travel mode, and the display device 50 of the mobile communication terminal 5 is then operated to command the start of automatic travel, the automatic travel control unit 23F starts automatic travel control that causes the tractor 1 to travel automatically along the target route P while acquiring the position, orientation, and so on of the tractor 1 with the positioning unit 30.
While automatic travel control is being executed, the automatic travel control unit 23F ends the automatic travel control and switches the travel mode from the automatic travel mode to the manual travel mode when, for example, the user operates the display device 50 of the mobile communication terminal 5 to command the end of automatic travel, or a user on board the driving unit 12 operates a manual operating tool such as the steering wheel 25 or the accelerator pedal.
The automatic travel control performed by the automatic travel control unit 23F includes automatic engine control processing that transmits automatic-travel control commands relating to the engine 14 to the engine control unit 23A, automatic vehicle speed control processing that transmits automatic-travel control commands relating to the vehicle speed and forward/reverse switching of the tractor 1 to the transmission unit control unit 23B, automatic steering control processing that transmits automatic-travel control commands relating to steering to the steering control unit 23C, and automatic work control processing that transmits automatic-travel control commands relating to work devices such as the rotary tillage device 3 to the work device control unit 23D, among other processes.
In the automatic engine control processing, the automatic travel control unit 23F transmits to the engine control unit 23A an engine speed change command that instructs a change of the engine speed based on, for example, the set speed included in the target route P. The engine control unit 23A executes engine speed change control that automatically changes the engine speed in response to the various control commands relating to the engine 14 transmitted from the automatic travel control unit 23F, among other processes.
In the automatic vehicle speed control processing, the automatic travel control unit 23F transmits to the transmission unit control unit 23B a shift operation command that instructs a shift operation of the continuously variable transmission based on the target vehicle speed included in the target route P, a forward/reverse switching command that instructs a forward/reverse switching operation of the forward/reverse switching device based on, for example, the travel direction of the tractor 1 included in the target route P, and so on. In response to the various control commands relating to the continuously variable transmission, the forward/reverse switching device, and the like transmitted from the automatic travel control unit 23F, the transmission unit control unit 23B executes automatic vehicle speed control that automatically controls the operation of the continuously variable transmission, automatic forward/reverse switching control that automatically controls the operation of the forward/reverse switching device, and so on. The automatic vehicle speed control includes, for example, an automatic deceleration stop process that, when the target vehicle speed included in the target route P is zero, decelerates the continuously variable transmission to the zero-speed state and stops the travel of the tractor 1.
In the automatic steering control processing, the automatic travel control unit 23F transmits to the steering control unit 23C a steering command that instructs steering of the left and right front wheels 10 based on, for example, the front wheel steering angle included in the target route P. In response to the steering commands transmitted from the automatic travel control unit 23F, the steering control unit 23C executes automatic steering control that controls the operation of the power steering unit 17 to steer the left and right front wheels 10, and automatic brake turning control that, when the left and right front wheels 10 are steered beyond the set angle, operates the brake unit 18 to apply the brake on the inside of the turn, among other processes.
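As an illustration only, the sketch below takes the commanded front wheel steering angle from the target route and decides whether the inside-turn brake should additionally be engaged. The set-angle threshold, the sign convention, and all names are assumptions for the example, not values or logic defined by the specification.

```python
def steering_commands(route_steering_angle_deg: float, set_angle_deg: float = 35.0) -> dict:
    """Build the steering-related commands for one control cycle.

    The front wheels are steered to the angle stored in the target route; if that angle
    exceeds the set angle, the brake on the inside of the turn is also engaged.
    """
    return {
        "front_wheel_angle_deg": route_steering_angle_deg,
        "engage_inner_turn_brake": abs(route_steering_angle_deg) > set_angle_deg,
        "inner_side": "right" if route_steering_angle_deg > 0.0 else "left",  # assumed sign convention
    }
```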
In the automatic work control processing, the automatic travel control unit 23F transmits to the work device control unit 23D a work start command that instructs switching of the work device, such as the rotary tillage device 3, to the working state when the tractor 1 reaches each work start position included in the target route P (the start position p3 of each parallel path P1), a work stop command that instructs switching of the work device to the non-working state when the tractor 1 reaches each work stop position included in the target route P (the end position p4 of each parallel path P1), and so on. In response to the various control commands relating to the work device transmitted from the automatic travel control unit 23F, the work device control unit 23D controls the operation of the lifting drive unit 20 and the like to execute automatic work start control that lowers the work device to the working height and puts it into operation, automatic work stop control that raises the work device to the non-working height and keeps it on standby, and so on.
In other words, the automatic travel unit 4 described above includes the power steering unit 17, the brake unit 18, the work clutch unit 19, the lifting drive unit 20, the rolling unit 21, the vehicle state detection device 22, the vehicle-mounted control unit 23, the positioning unit 30, the communication modules 28 and 34, and so on. When these operate properly, the tractor 1 can be driven automatically along the target route P with high accuracy, and work by a work device such as the rotary tillage device 3 can be performed properly.
As shown in FIGS. 6 to 7, the tractor 1 is provided with an obstacle detection system 80 that monitors the surroundings of the tractor 1 and detects obstacles existing around it. The obstacles detected by the obstacle detection system 80 include people such as workers working in the field A, other work vehicles, and objects already present in the field A such as utility poles and trees.
As shown in FIG. 7, the obstacle detection system 80 includes an imaging unit 80A that captures visible light around the tractor to acquire color image information, an obstacle detection unit 80B that detects obstacles existing around the tractor 1, and an information integration processing unit 80C that integrates and processes the color image information from the imaging unit 80A and the detection information from the obstacle detection unit 80B.
As shown in FIGS. 1 to 3 and 7, the imaging unit 80A includes a front camera 81 whose imaging range is set to a first imaging range Ri1 forward of the cabin 13, a rear camera 82 whose imaging range is set to a second imaging range Ri2 rearward of the cabin 13, a right camera 83 whose imaging range is set to a third imaging range Ri3 to the right of the cabin 13, a left camera 84 whose imaging range is set to a fourth imaging range Ri4 to the left of the cabin 13, and an image processing device 85 (see FIG. 7) that processes the color image information from the cameras 81 to 84.
The front camera 81 and the rear camera 82 are arranged on the left-right center line of the tractor 1. The front camera 81 is arranged at the upper left-right center of the front end side of the cabin 13 in a forward-tilted posture looking down on the front side of the tractor 1 from diagonally above; its first imaging range Ri1 is thus set to a predetermined range on the front side of the vehicle body symmetrical about the left-right center line of the tractor 1. The rear camera 82 is arranged at the upper left-right center of the rear end side of the cabin 13 in a rearward-tilted posture looking down on the rear side of the tractor 1 from diagonally above; its second imaging range Ri2 is thus set to a predetermined range on the rear side of the vehicle body symmetrical about the left-right center line of the tractor 1. The right camera 83 is arranged at the upper front-rear center of the right end side of the cabin 13 in a rightward-tilted posture looking down on the right side of the tractor 1 from diagonally above, so that its third imaging range Ri3 is set to a predetermined range on the right side of the vehicle body. The left camera 84 is arranged at the upper front-rear center of the left end side of the cabin 13 in a leftward-tilted posture looking down on the left side of the tractor 1 from diagonally above, so that its fourth imaging range Ri4 is set to a predetermined range on the left side of the vehicle body.
The image processing device 85 is constructed from an electronic control unit in which a microcontroller and the like are integrated, various control programs, and so on. The image processing device 85 is connected to the vehicle-mounted control unit 23, the information integration processing unit 80C, and so on via the CAN so that they can communicate with one another.
The image processing device 85 performs image processing on the color image information transmitted sequentially from the cameras 81 to 84. For example, the image processing device 85 performs image generation processing that generates front, rear, right, and left images corresponding to the imaging ranges of the cameras 81 to 84 from the sequentially transmitted color image information, and omnidirectional image generation processing that combines the color image information from all the cameras 81 to 84 to generate an omnidirectional image of the tractor 1 (for example, a surround view). It then performs image transmission processing that transmits each generated image and the omnidirectional image to the display control unit 23E of the vehicle-mounted control unit 23. The display control unit 23E transmits each image and the omnidirectional image from the image processing device 85 to the operation terminal 27 via the CAN, and to the display control unit 51A of the mobile communication terminal 5 via the communication modules 28 and 52.
As a result, the omnidirectional image generated by the image processing device 85, the image corresponding to the travel direction of the tractor 1, and the like can be displayed on the operation terminal 27 of the tractor 1, the display device 50 of the mobile communication terminal 5, and so on. This display allows the user to visually check the situation around the tractor 1 and the situation in its travel direction.
In other words, by displaying the omnidirectional image generated by the image processing device 85, the image corresponding to the travel direction of the tractor 1, and the like on the operation terminal 27 of the tractor 1, the display device 50 of the mobile communication terminal 5, and so on, the display control unit 23E of the vehicle-mounted control unit 23 functions as a work assistance unit that enables work assistance such as making it easier for the user to visually check the surroundings of the work vehicle V.
The image processing device 85 has been subjected to learning processing for recognizing, as obstacles, people such as workers working in the field A, other work vehicles, and objects already present in the field A such as utility poles and trees. This allows the image processing device 85 to perform obstacle determination processing that determines, based on the color image information transmitted sequentially from the cameras 81 to 84, whether an obstacle that would affect the travel of the tractor 1 exists in any of the imaging ranges Ri1 to Ri4 of the cameras 81 to 84.
When the image processing device 85 determines in the obstacle determination processing that an obstacle exists in one of the imaging ranges Ri1 to Ri4, it performs coordinate calculation processing that obtains the coordinates of the obstacle on the image in which the obstacle appears, and coordinate conversion processing that converts the obtained obstacle coordinates into coordinates referenced to a vehicle body coordinate origin, based on the mounting position, mounting angle, and so on of the camera 81 to 84 concerned. It then performs distance calculation processing that obtains the straight-line distance between the converted coordinates and a preset distance calculation reference point as the distance from the distance calculation reference point to the obstacle, and obstacle information transmission processing that transmits the converted coordinates and the obtained distance to the obstacle to the information integration processing unit 80C as information about the obstacle. On the other hand, when no obstacle exists in any of the imaging ranges Ri1 to Ri4, it performs non-detection transmission processing that notifies the information integration processing unit 80C that no obstacle has been detected.
In this way, when an obstacle exists in any of the imaging ranges Ri1 to Ri4 of the cameras 81 to 84, the image processing device 85 transmits the information about the obstacle to the information integration processing unit 80C, so that the information integration processing unit 80C, by receiving that information, can recognize that an obstacle exists in one of the imaging ranges Ri1 to Ri4 and can obtain the position of the obstacle and the distance to it. When no obstacle exists in any of the imaging ranges Ri1 to Ri4 of the cameras 81 to 84, the image processing device 85 transmits a non-detection notification to the information integration processing unit 80C, so that the information integration processing unit 80C, by receiving that non-detection information, can recognize that no obstacle exists in any of the imaging ranges Ri1 to Ri4.
The vehicle body coordinate origin used in the coordinate conversion processing and the distance calculation reference point used in the distance calculation processing are set according to the mounting positions of the cameras 81 to 84. Specifically, as shown in FIG. 8, a vehicle body coordinate origin O1 and a distance calculation reference point Rp1 are set for the front camera 81 according to its mounting position, a vehicle body coordinate origin O2 and a distance calculation reference point Rp2 are set for the rear camera 82 according to its mounting position, a vehicle body coordinate origin O3 and a distance calculation reference point Rp3 are set for the right camera 83 according to its mounting position, and a vehicle body coordinate origin O4 and a distance calculation reference point Rp4 are set for the left camera 84 according to its mounting position.
Accordingly, when an obstacle exists in the first imaging range Ri1 of the front camera 81, for example, the image processing device 85 obtains the coordinates of the obstacle on the image of the front camera 81 in which the obstacle appears (coordinate calculation processing), converts the obtained obstacle coordinates into coordinates (x, y) referenced to the vehicle body coordinate origin O1 shown in FIG. 8 based on the mounting position, mounting angle, and so on of the front camera 81 (coordinate conversion processing), and obtains the straight-line distance between the converted coordinates (x, y) and the distance calculation reference point Rp1 as the distance La from the distance calculation reference point Rp1 to the obstacle O (distance calculation processing).
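As a purely illustrative sketch of this chain of processing, the example below converts an obstacle position given in a camera-local planar frame into the vehicle body frame with a rotation and translation, then takes the Euclidean distance to a reference point. The camera pose values and all names are assumptions for the example, not figures from the specification.

```python
import math

def camera_to_body(obstacle_cam_x: float, obstacle_cam_y: float,
                   cam_mount_x: float, cam_mount_y: float, cam_yaw_rad: float) -> tuple[float, float]:
    """Rotate a point from the camera frame by the camera's mounting yaw and translate it
    by the camera's mounting position, giving coordinates in the vehicle body frame."""
    x = cam_mount_x + obstacle_cam_x * math.cos(cam_yaw_rad) - obstacle_cam_y * math.sin(cam_yaw_rad)
    y = cam_mount_y + obstacle_cam_x * math.sin(cam_yaw_rad) + obstacle_cam_y * math.cos(cam_yaw_rad)
    return x, y

def distance_to_reference(body_x: float, body_y: float, ref_x: float, ref_y: float) -> float:
    """Straight-line (Euclidean) distance from the distance calculation reference point to the obstacle."""
    return math.hypot(body_x - ref_x, body_y - ref_y)
```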
Note that various setting changes can be made to the relationship between the vehicle body coordinate origins O1 to O4, the distance calculation reference points Rp1 to Rp4, and the mounting positions of the cameras 81 to 84 described above.
As shown in FIGS. 1, 3 to 4, and 7, the obstacle detection unit 80B includes a first obstacle sensor 86 that detects obstacles existing in front of the tractor 1, a second obstacle sensor 87 that detects obstacles existing behind the tractor 1, and a third obstacle sensor 88 that detects obstacles existing to the left and right of the tractor 1. The first obstacle sensor 86 and the second obstacle sensor 87 are lidar sensors that use pulsed near-infrared laser light (an example of near-infrared light) to detect obstacles. The third obstacle sensor 88 is a sonar that uses ultrasonic waves to detect obstacles.
As shown in FIGS. 1 to 4 and 7, the first obstacle sensor 86 has a first measuring unit 86A that uses near-infrared laser light to measure the distance to objects existing in its measurement range, and a first control unit 86B that generates a distance image and the like based on the measurement information from the first measuring unit 86A. The second obstacle sensor 87 has a second measuring unit 87A that uses near-infrared laser light to measure the distance to objects existing in its measurement range, and a second control unit 87B that generates a distance image and the like based on the measurement information from the second measuring unit 87A. The third obstacle sensor 88 has a right ultrasonic sensor 88A and a left ultrasonic sensor 88B that transmit and receive ultrasonic waves, and a single third control unit 88C that measures the distance to objects existing in the measurement ranges based on the ultrasonic waves transmitted and received by the ultrasonic sensors 88A and 88B.
The control units 86B, 87B, and 88C of the obstacle sensors 86 to 88 are constructed from electronic control units in which microcontrollers and the like are integrated, various control programs, and so on. The control units 86B, 87B, and 88C are connected to the vehicle-mounted control unit 23, the information integration processing unit 80C, and so on via the CAN so that they can communicate with one another.
As shown in FIGS. 3 to 4, the measurement range of the first obstacle sensor 86 is set to a first measurement range Rm1 forward of the cabin 13, and the measurement range of the second obstacle sensor 87 is set to a second measurement range Rm2 rearward of the cabin 13. The measurement ranges of the third obstacle sensor 88 are set to a third measurement range Rm3 to the right of the cabin 13 and a fourth measurement range Rm4 to the left of the cabin 13.
As shown in FIGS. 1 and 3 to 4, the first obstacle sensor 86 and the second obstacle sensor 87 are arranged on the left-right center line of the tractor 1, like the front camera 81 and the rear camera 82. The first obstacle sensor 86 is arranged at the upper left-right center of the front end side of the cabin 13 in a forward-tilted posture looking down on the front side of the tractor 1 from diagonally above; the first measurement range Rm1 of its measuring unit 86A is thus set to a predetermined range on the front side of the vehicle body symmetrical about the left-right center line of the tractor 1. The second obstacle sensor 87 is arranged at the upper left-right center of the rear end side of the cabin 13 in a rearward-tilted posture looking down on the rear side of the tractor 1 from diagonally above; the second measurement range Rm2 of its measuring unit 87A is thus set to a predetermined range on the rear side of the vehicle body symmetrical about the left-right center line of the tractor 1.
When the tractor 1 travels forward with the forward/reverse switching device of the transmission unit 16 switched to the forward transmission state, the first obstacle sensor 86 is activated and the second obstacle sensor 87 is deactivated in conjunction with that switching. When the tractor 1 travels in reverse with the forward/reverse switching device of the transmission unit 16 switched to the reverse transmission state, the first obstacle sensor 86 is deactivated and the second obstacle sensor 87 is activated in conjunction with that switching.
As shown in FIG. 2, the right ultrasonic sensor 88A is attached to the right boarding step 24, which is arranged between the right front wheel 10 and the right rear wheel 11, in a posture facing outward to the right of the vehicle body; its third measurement range Rm3 is thus set to a predetermined range on the right outer side of the vehicle body. As shown in FIGS. 1 to 3, the left ultrasonic sensor 88B is attached to the left boarding step 24, which is arranged between the left front wheel 10 and the left rear wheel 11, in a posture facing outward to the left of the vehicle body; its fourth measurement range Rm4 is thus set to a predetermined range on the left outer side of the vehicle body.
As shown in FIGS. 3 to 4 and 7, the measuring units 86A and 87A of the first obstacle sensor 86 and the second obstacle sensor 87 measure the distance from each measuring unit 86A, 87A to each ranging point (an example of a measurement object) in the first measurement range Rm1 or the second measurement range Rm2 by the TOF (Time Of Flight) method, which measures the distance to a ranging point based on the round-trip time taken for the emitted near-infrared laser light to reach the ranging point and return. Each measuring unit 86A, 87A scans the near-infrared laser light vertically and horizontally at high speed over the entire first measurement range Rm1 or second measurement range Rm2 and sequentially measures the distance to the ranging point at each scanning angle (coordinate), thereby performing three-dimensional measurement over the first measurement range Rm1 or the second measurement range Rm2. Each measuring unit 86A, 87A also sequentially measures the intensity of the light reflected from each ranging point (hereinafter referred to as the reflection intensity) obtained during this high-speed vertical and horizontal scan over the entire first measurement range Rm1 or second measurement range Rm2. The measuring units 86A and 87A repeatedly measure, in real time, the distance to each ranging point in the first measurement range Rm1 or the second measurement range Rm2, each reflection intensity, and so on.
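As an illustration of the time-of-flight principle, the sketch below converts a measured round-trip time into a distance: the pulse travels out and back, so the one-way distance is half the path covered at the propagation speed. The use of the speed of light applies to the near-infrared laser pulse; applying the same relation with the speed of sound for ultrasonic ranging is a general observation, not a claim about this particular sonar's internals.

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0   # propagation speed of the near-infrared laser pulse
SPEED_OF_SOUND_M_S = 343.0           # approximate propagation speed of an ultrasonic pulse in air

def tof_distance(round_trip_time_s: float, propagation_speed_m_s: float = SPEED_OF_LIGHT_M_S) -> float:
    """Distance to the ranging point: halve the out-and-back path length."""
    return propagation_speed_m_s * round_trip_time_s / 2.0

# Example: a laser echo arriving after 200 ns corresponds to roughly 30 m.
# tof_distance(200e-9)  ->  about 29.98
```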
From the measurement information measured by the measuring units 86A and 87A, such as the distance to each ranging point and the scanning angle (coordinates) for each ranging point, the control units 86B and 87B of the first obstacle sensor 86 and the second obstacle sensor 87 generate a distance image, extract a group of ranging points presumed to be an obstacle, and transmit the measurement information on the extracted group of ranging points to the information integration processing unit 80C as measurement information on an obstacle candidate.
The control units 86B and 87B of the first obstacle sensor 86 and the second obstacle sensor 87 also determine whether the distance value of each ranging point measured by the measuring units 86A and 87A meets an invalidity condition, and transmit any distance value that meets an invalidity condition to the information integration processing unit 80C as an invalid value.
Specifically, the control units 86B and 87B make use of a characteristic of dirt on the sensor surface, namely that it appears at extremely close range to the first obstacle sensor 86 or the second obstacle sensor 87, and treat the distance value of any ranging point having this characteristic as an invalid value. This prevents distance values of ranging points caused by dirt on the sensor surface from being used as measurement information on obstacles in the information integration processing unit 80C.
The control units 86B and 87B also make use of a characteristic of airborne matter such as dust and fog, namely that it appears at close range to the first obstacle sensor 86 or the second obstacle sensor 87 while having very weak reflection intensity, and treat the distance value of any ranging point having this characteristic as an invalid value. This prevents distance values of ranging points caused by airborne matter from being used as measurement information on obstacles in the information integration processing unit 80C.
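Purely as an illustration of these two invalidity conditions, the following sketch flags a ranging point as invalid if it is either closer than a surface-contamination threshold, or both nearby and weakly reflective. The threshold values are assumptions chosen for the example, not figures from the specification.

```python
def is_invalid_point(distance_m: float, reflection_intensity: float,
                     dirt_range_m: float = 0.05,
                     airborne_range_m: float = 2.0,
                     min_intensity: float = 0.1) -> bool:
    """Return True if the point should be reported as an invalid value.

    - Dirt on the sensor surface shows up at extremely close range.
    - Airborne matter (dust, fog) shows up at close range with very weak reflections.
    """
    if distance_m <= dirt_range_m:
        return True
    if distance_m <= airborne_range_m and reflection_intensity < min_intensity:
        return True
    return False
```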
As shown in FIGS. 3 to 4 and 7, the third control unit 88C of the third obstacle sensor 88 determines, based on the transmission and reception of ultrasonic waves by the left and right ultrasonic sensors 88A and 88B, whether a measurement object exists in the third measurement range Rm3 or the fourth measurement range Rm4. The third control unit 88C measures the distance from each ultrasonic sensor 88A, 88B to the measurement object by the TOF (Time Of Flight) method, which measures the distance to a ranging point based on the round-trip time taken for the transmitted ultrasonic wave to reach the ranging point and return, and transmits the measured distance to the measurement object and the direction of the measurement object to the information integration processing unit 80C as measurement information on an obstacle.
As shown in FIGS. 4 and 9 to 11, the control units 86B and 87B of the first obstacle sensor 86 and the second obstacle sensor 87 apply a cut process and a masking process, based on vehicle body information and the like, to the measurement ranges Rm1 and Rm2 of the measuring units 86A and 87A, thereby limiting the obstacle detection target ranges of the first obstacle sensor 86 and the second obstacle sensor 87 to a first detection range Rd1 and a second detection range Rd2.
In the cut process, the control units 86B and 87B acquire, through communication with the vehicle-mounted control unit 23, the maximum left-right width of the vehicle body including the work device (in this embodiment, the left-right width of the rotary tillage device 3) and set the obstacle detection target width Wd by adding a predetermined safety band to this maximum left-right width. Then, in the first measurement range Rm1 and the second measurement range Rm2, the left and right ranges falling outside the detection target width Wd are set as first non-detection ranges Rnd1 by the cut process and excluded from the detection ranges Rd1 and Rd2.
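The following is an illustrative sketch of such a width-based cut, keeping only ranging points whose lateral offset from the vehicle's left-right centre line lies within half the detection target width Wd. The safety band value, the assumption that the band is applied on each side, and all names are illustrative choices rather than values defined by the specification.

```python
def detection_target_width(max_body_width_m: float, safety_band_m: float = 0.5) -> float:
    """Obstacle detection target width Wd: the maximum vehicle width including the implement,
    plus an assumed safety band on each side."""
    return max_body_width_m + 2.0 * safety_band_m

def within_cut_range(lateral_offset_m: float, wd_m: float) -> bool:
    """Keep a ranging point only if it lies within Wd centred on the vehicle's left-right centre line;
    points outside fall into the first non-detection range."""
    return abs(lateral_offset_m) <= wd_m / 2.0
```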
In the masking process, the control units 86B and 87B set, as second non-detection ranges Rnd2, the range in which the front end side of the tractor 1 enters the first measurement range Rm1 and the range in which the rear end side of the work device enters the second measurement range Rm2, each with a predetermined safety band added, and exclude these ranges from the detection ranges Rd1 and Rd2.
By limiting the obstacle detection target ranges to the first detection range Rd1 and the second detection range Rd2 in this way, the system avoids both the increased detection load that would result from the first obstacle sensor 86 and the second obstacle sensor 87 detecting obstacles that lie outside the detection target width Wd and pose no risk of collision with the tractor 1, and the risk of erroneously detecting, as an obstacle, the front end side of the tractor 1 or the rear end side of the work device protruding into the first measurement range Rm1 or the second measurement range Rm2.
The second non-detection range Rnd2 shown in FIG. 9 is an example of a non-detection range suited to the front side of the vehicle body, where the left and right front wheels 10 and the bonnet 15 are present. The second non-detection range Rnd2 shown in FIG. 10 is an example of a non-detection range suited, on the rear side of the vehicle body, to a working state in which the rotary tillage device 3 is lowered to the working height. The second non-detection range Rnd2 shown in FIG. 11 is an example of a non-detection range suited, on the rear side of the vehicle body, to a non-working state in which the rotary tillage device 3 is raised to the retracted height. The second non-detection range Rnd2 on the rear side of the vehicle body is switched appropriately in conjunction with the raising and lowering of the rotary tillage device 3.
Information on the first detection range Rd1, the second detection range Rd2, the first non-detection range Rnd1, and the second non-detection range Rnd2 is included in the aforementioned distance image and is transmitted to the information integration processing unit 80C together with that distance image.
As shown in FIG. 4, each of the detection ranges Rd1 and Rd2 of the first obstacle sensor 86 and the second obstacle sensor 87 is divided into a stop control range Rsc, a deceleration control range Rdc, and a notification control range Rnc, based on a collision determination process in which the predicted collision time equals a set time (for example, 3 seconds). The stop control range Rsc is set as the range from the first obstacle sensor 86 or the second obstacle sensor 87 to the determination reference position of the collision determination process. The deceleration control range Rdc is set as the range from the determination reference position to the deceleration start position. The notification control range Rnc is set as the range from the deceleration start position to the measurement limit position of the first obstacle sensor 86 or the second obstacle sensor 87. Each determination reference position is set at a fixed separation distance L (for example, 2000 mm) in the vehicle front-rear direction from the front end or the rear end of the vehicle body including the rotary tillage device 3.
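The partitioning can be pictured as a simple range classifier. The sketch below is an assumption-laden illustration only: it places the determination reference position at the separation distance L and, as a reading of the 3-second collision determination described above, derives the deceleration start position from the predicted collision time at the current speed.

```python
def classify_range(distance_m: float, speed_m_per_s: float,
                   separation_l_m: float = 2.0, set_time_s: float = 3.0,
                   measurement_limit_m: float = 20.0) -> str:
    # Deceleration start position assumed to follow from the set predicted collision time.
    decel_start_m = separation_l_m + speed_m_per_s * set_time_s
    if distance_m <= separation_l_m:
        return "stop control range (Rsc)"
    if distance_m <= decel_start_m:
        return "deceleration control range (Rdc)"
    if distance_m <= measurement_limit_m:
        return "notification control range (Rnc)"
    return "outside detection range"

print(classify_range(distance_m=5.0, speed_m_per_s=1.5))
```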
As shown in FIG. 7, the information integration processing unit 80C is built from an electronic control unit in which a microcontroller and the like are integrated, various control programs, and so on. The information integration processing unit 80C executes obstacle information output control that acquires and outputs information about obstacles based on the information from the imaging unit 80A, whose ranging accuracy is low but whose object discrimination accuracy is high, and the measurement information from the obstacle detection unit 80B, whose object discrimination accuracy is low but whose ranging accuracy is high.
The control operation of the information integration processing unit 80C in the obstacle information acquisition control will now be described with reference to the flowchart shown in FIG. 12.
Here, only the control operation in which the information integration processing unit 80C acquires and outputs information about obstacles based on the color image information from the front camera 81 of the imaging unit 80A and the measurement information from the first obstacle sensor 86 of the obstacle detection unit 80B is described. The control operation of the information integration processing unit 80C in the other cases, namely when acquiring and outputting information about obstacles based on the information from the rear camera 82 of the imaging unit 80A and the measurement information from the second obstacle sensor 87 of the obstacle detection unit 80B, and when acquiring and outputting information about obstacles based on the information from the left and right cameras 83 and 84 of the imaging unit 80A and the measurement information from the third obstacle sensor 88 of the obstacle detection unit 80B, is the same as the case based on the information from the front camera 81 and the measurement information from the first obstacle sensor 86, and a description of those cases is therefore omitted.
The information integration processing unit 80C performs a first determination process of determining whether an obstacle has been detected from the color image information of the front camera 81 based on the information from the imaging unit 80A (step #1), and also performs a second determination process of determining whether an obstacle candidate is included in the distance image of the first obstacle sensor 86 based on the measurement information from the obstacle detection unit 80B (step #2).
When an obstacle is detected from the color image information of the front camera 81 in the first determination process and an obstacle candidate is included in the distance image of the first obstacle sensor 86 in the second determination process, the information integration processing unit 80C performs a third determination process of determining whether the position of the obstacle and the position of the obstacle candidate match (step #3).
When the position of the obstacle and the position of the obstacle candidate match in the third determination process, the information integration processing unit 80C determines that the detection state of the obstacle (obstacle candidate) by the front camera 81 and the first obstacle sensor 86 is a proper detection state in which the front camera 81 and the first obstacle sensor 86 are properly detecting the same obstacle (obstacle candidate). Based on this determination result, it performs a proper information acquisition process that applies the distance information of the obstacle candidate measured by the first obstacle sensor 86 to the obstacle detected from the color image information of the front camera 81 and acquires this information as obstacle detection information, and a proper information output process that outputs the acquired obstacle detection information (steps #4 to #5).
When the position of the obstacle and the position of the obstacle candidate do not match, the information integration processing unit 80C performs a fourth determination process of determining whether, in the distance information from the first obstacle sensor 86, the measured value at the position corresponding to the position of the obstacle in the color image information of the front camera 81 is valid (step #6).
When the above measured value is valid in the fourth determination process, the information integration processing unit 80C determines that the detection state of the obstacle (obstacle candidate) by the front camera 81 and the first obstacle sensor 86 is a quasi-proper detection state in which the obstacle is properly detected from the color image information of the front camera 81 and the measurement by the first obstacle sensor 86 is being performed properly, but the obstacle cannot be identified as an obstacle candidate from the distance image of the first obstacle sensor 86. Based on this determination result, it performs a quasi-proper information acquisition process that applies, to the obstacle detected from the color image information of the front camera 81, the distance information measured by the first obstacle sensor 86 for the measurement object corresponding to the obstacle position indicated by the front camera 81 and acquires this information as obstacle detection information, and a quasi-proper information output process that outputs the acquired obstacle detection information (steps #7 to #8).
When the above measured value is not valid in the fourth determination process, the information integration processing unit 80C determines that the detection state of the obstacle (obstacle candidate) by the front camera 81 and the first obstacle sensor 86 is a measured-value invalid state in which the obstacle is properly detected from the color image information of the front camera 81, but in the distance image of the first obstacle sensor 86 the measured value for the measurement object corresponding to the obstacle position indicated by the front camera 81 has become invalid, owing to suspended matter such as dust or fog arising at the measurement target position corresponding to that obstacle position, or to dirt adhering to the sensor surface of the first obstacle sensor 86. Based on this determination result, it performs a camera-only information acquisition process that applies, to the obstacle detected from the color image information of the front camera 81, the distance information for the obstacle calculated from the color image information of the front camera 81 and acquires this information as obstacle detection information, and a camera-only information output process that outputs the acquired obstacle detection information (steps #9 to #10).
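The branching in steps #1 to #10 can be summarized as a small fusion routine. The following sketch is illustrative only; the data structures and function names are assumptions and do not reflect the controller's actual interfaces.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class CameraDetection:
    position: tuple            # obstacle position in a shared reference frame
    camera_distance_m: float   # distance estimated from the color image (less accurate)

@dataclass
class LidarMeasurement:
    candidate_position: Optional[tuple]    # obstacle candidate position, if identified
    candidate_distance_m: Optional[float]  # its measured distance
    distance_at: dict                      # position -> distance value (None = invalid)

def fuse(cam: Optional[CameraDetection], lidar: LidarMeasurement):
    if cam is None:
        return None  # no obstacle detected in the color image
    if lidar.candidate_position == cam.position and lidar.candidate_distance_m is not None:
        return ("proper", lidar.candidate_distance_m)   # steps #4-#5
    measured = lidar.distance_at.get(cam.position)
    if measured is not None:
        return ("quasi-proper", measured)               # steps #7-#8
    return ("camera-only", cam.camera_distance_m)       # steps #9-#10: value invalid (dust, fog, dirt)

cam = CameraDetection(position=(3, 0), camera_distance_m=6.5)
lidar = LidarMeasurement(candidate_position=None, candidate_distance_m=None, distance_at={(3, 0): 6.1})
print(fuse(cam, lidar))  # -> ('quasi-proper', 6.1)
```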
The in-vehicle control unit 23 includes an obstacle control unit 23H that executes control relating to obstacles based on the obstacle detection information from the information integration processing unit 80C. The obstacle control unit 23H is built from an electronic control unit in which a microcontroller and the like are integrated, various control programs, and so on. The obstacle control unit 23H is connected to the information integration processing unit 80C and the like via CAN so that they can communicate with one another.
The obstacle control unit 23H executes collision avoidance control that avoids a collision with an obstacle based on the obstacle detection information from the information integration processing unit 80C. Specifically, when the obstacle control unit 23H acquires obstacle detection information produced by the proper information output process or the quasi-proper information output process of the information integration processing unit 80C, it executes a first collision avoidance control as the collision avoidance control. When it acquires obstacle detection information produced by the camera-only information output process of the information integration processing unit 80C, it executes, as the collision avoidance control, a second collision avoidance control that provides a higher collision avoidance rate than the first collision avoidance control.
The control operation of the obstacle control unit 23H in the first collision avoidance control will now be described based on the flowchart shown in FIG. 13.
Here, the case where an obstacle is located in the first detection range Rd1 of the first obstacle sensor 86 is described as an example.
Based on the distance to the obstacle included in the obstacle detection information, the obstacle control unit 23H performs a fifth determination process of determining whether the obstacle is located in the notification control range Rnc of the first detection range Rd1 shown in FIG. 4, a sixth determination process of determining whether it is located in the deceleration control range Rdc, and a seventh determination process of determining whether it is located in the stop control range Rsc (steps #11 to #13).
When the obstacle control unit 23H detects in the fifth determination process that an obstacle is located in the notification control range Rnc of the first detection range Rd1, it performs a first notification command process that commands the display control unit 23E of the in-vehicle control unit 23 and the display control unit 51A of the terminal control unit 51 to notify, via the operation terminal 27 of the tractor 1 or the display device 50 of the mobile communication terminal 5, that an obstacle is located in the notification control range Rnc (step #14).
This makes it possible to inform users, such as an occupant of the driving unit 12 or a manager outside the vehicle, that an obstacle is located in the notification control range Rnc of the first detection range Rd1 relative to the tractor 1.
When the obstacle control unit 23H detects in the sixth determination process that an obstacle is located in the deceleration control range Rdc of the first detection range Rd1, it performs a second notification command process that commands the display control unit 23E of the in-vehicle control unit 23 and the display control unit 51A of the terminal control unit 51 to notify, via the operation terminal 27 of the tractor 1 or the display device 50 of the mobile communication terminal 5, that an obstacle is located in the deceleration control range Rdc (step #15). The obstacle control unit 23H also performs a deceleration command process that commands the speed change unit control unit 23B to lower the vehicle speed of the tractor 1 as the obstacle located in the deceleration control range Rdc gets closer to the tractor 1 (step #16).
This makes it possible to inform users, such as an occupant of the driving unit 12 or a manager outside the vehicle, that an obstacle is located in the deceleration control range Rdc of the first detection range Rd1 relative to the tractor 1. In addition, the control operation of the speed change unit control unit 23B allows the vehicle speed of the tractor 1 to be lowered appropriately as the work vehicle V approaches the obstacle.
When the obstacle control unit 23H detects in the seventh determination process that an obstacle is located in the stop control range Rsc of the first detection range Rd1, it performs a third notification command process that commands the display control unit 23E of the in-vehicle control unit 23 and the display control unit 51A of the terminal control unit 51 to notify, via the operation terminal 27 of the tractor 1 or the display device 50 of the mobile communication terminal 5, that an obstacle is located in the stop control range Rsc (step #17). The obstacle control unit 23H also performs a deceleration stop command process that commands the speed change unit control unit 23B to decelerate and stop the tractor 1 while the obstacle is located in the stop control range Rsc (step #18).
This makes it possible to inform users, such as an occupant of the driving unit 12 or a manager outside the vehicle, that an obstacle is located in the stop control range Rsc of the first detection range Rd1 relative to the tractor 1. In addition, the control operation of the speed change unit control unit 23B allows the tractor 1 to be decelerated and stopped while the obstacle is located in the stop control range Rsc, thereby avoiding the risk of the work vehicle V colliding with the obstacle.
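The range-dependent processing of steps #11 to #18 amounts to a simple dispatch. The sketch below is illustrative; the command functions are placeholders standing in for the notification commands and the commands to the speed change unit control unit described above.

```python
def notify(range_name: str):
    print(f"notify: obstacle in {range_name}")

def command_deceleration():
    print("command: lower the vehicle speed as the obstacle gets closer")

def command_deceleration_stop():
    print("command: decelerate to a stop within the stop control range")

def first_collision_avoidance(range_name: str):
    if range_name == "Rnc":        # notification control range (step #14)
        notify("notification control range")
    elif range_name == "Rdc":      # deceleration control range (steps #15-#16)
        notify("deceleration control range")
        command_deceleration()
    elif range_name == "Rsc":      # stop control range (steps #17-#18)
        notify("stop control range")
        command_deceleration_stop()

first_collision_avoidance("Rdc")
```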
Next, the control operation of the obstacle control unit 23H in the second collision avoidance control will be described. Compared with the first collision avoidance control described above, the obstacle control unit 23H lengthens the front-rear extents of the deceleration control range Rdc and the stop control range Rsc so that the distance from the determination reference position of the first detection range Rd1 to the boundary between the deceleration control range Rdc and the stop control range Rsc, and the distance from the determination reference position to the boundary between the notification control range Rnc and the deceleration control range Rdc, become longer, and in that state it performs the same control processes as steps #11 to #18 of the first collision avoidance control. As a result, the second collision avoidance control is a collision avoidance control capable of avoiding a collision with an obstacle at an earlier timing than the first collision avoidance control.
Thus, in the second collision avoidance control, which avoids a collision with an obstacle based on measurement information from the imaging unit 80A whose ranging accuracy is lower than that of the measurement information from the obstacle detection unit 80B, the obstacle control unit 23H can avoid a collision with the obstacle with a higher collision avoidance rate than in the first collision avoidance control. As a result, collision avoidance based on the measurement information from the imaging unit 80A, whose ranging accuracy is low, can be performed favorably.
The automatic travel control of the automatic travel control unit 23F includes a dirt handling control process that controls the travel of the tractor 1 when, for example, adhesion of dirt (a measurement obstruction) to the sensor surfaces of the first obstacle sensor 86 and the second obstacle sensor 87 is detected.
The control operation of the automatic travel control unit 23F in the dirt handling control process will now be described based on the flowchart shown in FIG. 14.
Here, too, the case of the first obstacle sensor 86 is described as an example.
The automatic travel control unit 23F performs an eighth determination process of determining whether the proportion of the measurement range Rm1 of the first obstacle sensor 86 occupied by invalid values caused by measurement obstructions such as dirt, as included in the measurement information from the first obstacle sensor 86, is equal to or greater than a predetermined value (for example, 50%) (step #21).
When the proportion occupied by invalid values is equal to or greater than the predetermined value in the eighth determination process, the automatic travel control unit 23F performs a deceleration command process that commands the speed change unit control unit 23B to lower the vehicle speed of the tractor 1 to an ultra-low speed at which the travel state of the tractor 1 can be maintained in a creep travel state (step #22).
When the proportion occupied by invalid values is not equal to or greater than the predetermined value in the eighth determination process, the automatic travel control unit 23F performs a vehicle speed maintenance command process that commands the speed change unit control unit 23B to maintain the vehicle speed of the tractor 1 at the current vehicle speed (step #23).
After performing the deceleration command process, the automatic travel control unit 23F performs a ninth determination process of determining whether the creep travel state of the tractor 1 has continued for a predetermined time (step #24). Until the creep travel state has continued for the predetermined time, the automatic travel control unit 23F repeats the eighth determination process described above (step #25), and if the proportion occupied by invalid values falls below the predetermined value in this eighth determination process, it determines that dirt is not adhering to the sensor surface of the first obstacle sensor 86 and that suspended matter such as dust or fog was merely drifting around the first obstacle sensor 86; it then performs a vehicle speed return command process that commands the speed change unit control unit 23B to return the vehicle speed of the tractor 1 to the original vehicle speed it had before being lowered to the ultra-low speed (step #26), and thereafter returns to step #21.
When the creep travel state has continued for the predetermined time, the automatic travel control unit 23F determines that dirt is adhering to the sensor surface of the first obstacle sensor 86 and performs a travel stop command process that commands the speed change unit control unit 23B to stop the travel of the tractor 1 immediately (step #27).
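One pass of the dirt handling control can be sketched as follows; the loop decomposition, threshold, and time limit are assumptions used only to illustrate steps #21 to #27.

```python
from typing import Optional, Tuple
import time

def dirt_handling_step(invalid_ratio: float, creep_since: Optional[float],
                       now: float, threshold: float = 0.5,
                       creep_limit_s: float = 10.0) -> Tuple[str, Optional[float]]:
    """One control pass; returns (command, creep_since)."""
    if creep_since is None:
        if invalid_ratio >= threshold:
            return "decelerate_to_creep", now       # step #22
        return "maintain_speed", None               # step #23
    if invalid_ratio < threshold:
        return "restore_speed", None                # step #26: it was only floating matter
    if now - creep_since >= creep_limit_s:
        return "stop_travel", creep_since           # step #27: dirt on the sensor surface
    return "keep_creeping", creep_since

print(dirt_handling_step(invalid_ratio=0.6, creep_since=None, now=time.time()))
```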
In other words, if, during automatic travel of the tractor 1 under the automatic travel control of the automatic travel control unit 23F, the proportion of the measurement range Rm1 of the first obstacle sensor 86 occupied by invalid values becomes equal to or greater than the predetermined value, the vehicle speed of the tractor 1 is lowered to an ultra-low speed slower than low speed and the travel state of the tractor 1 is maintained in the creep travel state. Compared with maintaining the travel state of the tractor 1 in a low-speed travel state, this provides a longer time for determining whether the cause of the invalid values is deposits such as dirt on the sensor surfaces of the obstacle sensors 86 to 88 or suspended matter such as dust stirred up around the obstacle sensors 86 to 88.
Lengthening the determination time in this way makes it easier to determine whether the cause of the invalid values is deposits or suspended matter, and thereby suppresses the loss of work efficiency that would result from the tractor 1 stopping its travel on account of suspended matter when that is the cause of the invalid values. It also suppresses the occurrence of the inconvenience of the work vehicle V colliding with an obstacle while it is being determined whether the cause of the invalid values is deposits or suspended matter.
As is clear from the above description, the information integration processing unit 80C and the obstacle control unit 23H function as a work support unit that enables work support such as notifying that an obstacle is present around the work vehicle V and avoiding a collision with the obstacle, based on the detection of obstacles by the imaging unit 80A and the obstacle detection unit 80B.
As shown in FIGS. 7 and 15 to 18, the information integration processing unit 80C includes a growth information acquisition unit 80Ca that acquires, based on the color image information from the imaging unit 80A and the measurement information from the first obstacle sensor 86, growth information of the crop Z cultivated in the field A, which serves as an index for evaluating the growth of the crop Z. The growth information acquisition unit 80Ca acquires the growth information of the crop Z for each growth information acquisition area (an example of a predetermined area) Ag (see FIGS. 15 to 18) set by dividing the central area (work area) A2b of the field A into a plurality of areas. As the growth information of the crop Z, the growth information acquisition unit 80Ca acquires a normalized difference vegetation index (hereinafter referred to as NDVI), which is an example of a vegetation index indicating the activity and the like of the crop Z, a vegetation coverage rate corresponding to the number of stems, a plant height value of the crop Z, and so on.
The zoning of the growth information acquisition areas Ag is preferably set in consideration of the front-rear length, working width, and the like of the work device that performs work using growth evaluations based on the growth information (for example, a fertilizer spreading device that spreads fertilizer). FIG. 16 is an example of a color image of the area ahead of the vehicle body generated from the color image information captured by the front camera 81 of the imaging unit 80A. FIG. 17 is an example of a distance image of the area ahead of the vehicle body generated from the measurement information of the first obstacle sensor 86. FIG. 18 is an example of a near-infrared image of the area ahead of the vehicle body generated from the measurement information of the first obstacle sensor 86.
The growth information acquisition unit 80Ca executes growth information acquisition control that acquires the growth information of the crop Z for each growth information acquisition area Ag as the tractor 1 travels automatically in, for example, an inter-row cultivation working state in which a cultivator for inter-row cultivation work (not shown), which is an example of a work device, is coupled to the rear of the tractor 1.
The control operation of the growth information acquisition unit 80Ca in the growth information acquisition control will now be described based on the flowchart shown in FIG. 19 and FIGS. 15 to 18.
The growth information acquisition unit 80Ca performs a color image information acquisition process that acquires from the imaging unit 80A the color image information of the first imaging range Ri1 captured by the front camera 81, and a visible red light reflectance acquisition process that acquires, from the color image information of the front camera 81, the reflectance R of visible red light in the growth information acquisition area Ag (see FIG. 16) (steps #31 to #32).
The growth information acquisition unit 80Ca performs a measurement information acquisition process that acquires the measurement information of the first obstacle sensor 86 from the obstacle detection unit 80B, a reflection intensity acquisition process that acquires the reflection intensities of the near-infrared laser light from the measurement objects present in the growth information acquisition area Ag included in the measurement information of the first obstacle sensor 86 (see FIG. 18), and a near-infrared light reflectance calculation process that calculates the average of the acquired reflection intensities in the growth information acquisition area Ag as the near-infrared light reflectance IR in that area (steps #33 to #35).
The growth information acquisition unit 80Ca performs an NDVI calculation process that calculates the NDVI in the growth information acquisition area Ag by substituting the visible red light reflectance R and the near-infrared light reflectance IR obtained for the same growth information acquisition area Ag in the visible red light reflectance acquisition process and the near-infrared light reflectance calculation process into the NDVI formula NDVI = (IR - R) / (IR + R) (step #36).
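The NDVI calculation of steps #31 to #36 follows directly from the formula above. In the minimal sketch below, the input reflectance values are illustrative assumptions; the visible red reflectance R is taken from the color image and the near-infrared reflectance IR is the mean of the reflection intensities within the same area Ag.

```python
def ndvi(red_reflectance: float, nir_reflectance: float) -> float:
    # NDVI = (IR - R) / (IR + R), as stated in the description.
    return (nir_reflectance - red_reflectance) / (nir_reflectance + red_reflectance)

# Mean near-infrared reflectance over the area's measurement points (assumed values).
nir_returns = [0.62, 0.58, 0.65, 0.60]
ir = sum(nir_returns) / len(nir_returns)
r = 0.12  # visible red reflectance from the color image for the same area (assumed)

value = ndvi(r, ir)  # ranges from -1 to 1; higher indicates more active vegetation
print(round(value, 3))
```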
In this way, the growth information acquisition unit 80Ca can acquire the NDVI based on the color image information from the imaging unit 80A and the measurement information from the first obstacle sensor 86. The NDVI takes a value from -1 to 1, and becomes higher as the nitrogen uptake of the crop Z increases and its activity rises. The growth information acquisition unit 80Ca can generate an NDVI image by mapping the NDVI values to shades from black to white, for example, thereby visualizing the activity of the crop Z and the like.
The growth information acquisition unit 80Ca performs a vegetation coverage acquisition process that acquires the vegetation coverage rate in the growth information acquisition area Ag based on the NDVI image described above (step #37). It then performs a nitrogen uptake estimation process that estimates the nitrogen uptake of the crop Z in the growth information acquisition area Ag by taking the product of the NDVI and the vegetation coverage rate obtained for the same growth information acquisition area Ag (step #38).
The growth information acquisition unit 80Ca performs a plant height calculation process that obtains the plant height of each crop Z cultivated in the same growth information acquisition area Ag (see FIGS. 16 to 17) contained in both the color image information of the front camera 81 and the measurement information of the first obstacle sensor 86, based on the color image information of the front camera 81 acquired in the color image information acquisition process, the mounting position and mounting angle of the front camera 81 on the tractor 1, the distance value information included in the measurement information of the first obstacle sensor 86 acquired in the measurement information acquisition process, and the mounting position and mounting angle of the first obstacle sensor 86 on the tractor 1, and a plant height value calculation process that calculates the average of the plant heights of the crops Z as the plant height value of the crop Z in the growth information acquisition area Ag (steps #39 to #40).
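As a heavily simplified illustration of the plant height step, the sketch below assumes level ground, a known sensor mounting height, and a known depression angle for each measurement point; it is not the embodiment's actual geometry, and all names and values are assumptions.

```python
import math

def point_height_m(mount_height_m: float, depression_angle_deg: float,
                   measured_distance_m: float) -> float:
    # Height of the reflecting point = sensor mounting height minus the vertical drop
    # along the line of sight (assumes flat ground beneath the crop).
    return mount_height_m - measured_distance_m * math.sin(math.radians(depression_angle_deg))

crop_points = [(12.0, 4.8), (11.5, 5.1), (13.0, 4.5)]  # (depression angle deg, distance m), assumed
heights = [point_height_m(2.0, angle, dist) for angle, dist in crop_points]
zone_plant_height = sum(heights) / len(heights)        # average over the area's crop points
print(round(zone_plant_height, 2))
```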
The growth information acquisition unit 80Ca performs a growth information storage process that stores the acquired NDVI, vegetation coverage rate, nitrogen uptake, plant height value of the crop Z, and so on for the same growth information acquisition area Ag in the in-vehicle storage unit 23G in association with the position information of the tractor 1 acquired by the positioning unit 30 (step #41).
In other words, in this work vehicle V, the obstacle detection system 80 provided on the tractor 1 for avoiding collisions with detected obstacles can also serve as a growth information acquisition unit that acquires growth information of the crop Z cultivated in the field A, such as the NDVI, the vegetation coverage rate, the nitrogen uptake, and the plant height of the crop Z. This makes it possible to acquire the growth information of the crop Z with high accuracy while suppressing complication of the configuration of the tractor 1 and increases in cost, compared with providing the tractor 1 with a dedicated growth information acquisition unit.
As shown in FIG. 7, the information integration processing unit 80C includes a growth evaluation unit 80Cb that evaluates the growth state of the crop in each growth information acquisition area Ag based on the crop growth information acquired by the growth information acquisition unit 80Ca, and calculates, for example, the fertilizer application amount for each growth information acquisition area Ag. The growth evaluation unit 80Cb stores the calculated fertilizer application amount and the like for each growth information acquisition area Ag in the in-vehicle storage unit 23G in association with the position information of the tractor 1 acquired by the positioning unit 30.
When fertilizing work is performed by automatically driving the tractor 1 to which a fertilizer spreading device, which is an example of a work device, is coupled, the automatic travel control unit 23F executes fertilizer application amount adjustment control that automatically adjusts the fertilizer application amount of the fertilizer spreading device for each growth information acquisition area Ag, based on the fertilizer application amount for each growth information acquisition area Ag stored in the in-vehicle storage unit 23G in association with the position information of the tractor 1, and on the position information of the tractor 1 acquired by the positioning unit 30 during the fertilizing work.
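The fertilizer application amount adjustment can be pictured as a per-area lookup keyed by the current position. The sketch below is illustrative only; the area indexing, rates, and units are assumptions and do not reflect the spreader's actual interface.

```python
from typing import Dict, Tuple

# area id -> fertilizer application rate (e.g. kg per 10 a), stored together with the
# positioning-unit coordinates during the earlier growth-information run (assumed values).
stored_rates: Dict[Tuple[int, int], float] = {(0, 0): 28.0, (0, 1): 34.5, (1, 0): 30.0}

def area_of(position_m: Tuple[float, float], area_size_m: float = 10.0) -> Tuple[int, int]:
    """Map a field-local position to its growth information acquisition area Ag."""
    return (int(position_m[0] // area_size_m), int(position_m[1] // area_size_m))

def rate_for(position_m: Tuple[float, float], default_rate: float = 30.0) -> float:
    return stored_rates.get(area_of(position_m), default_rate)

print(rate_for((3.2, 14.7)))  # -> 34.5, the rate stored for area (0, 1)
```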
This makes it possible to easily and appropriately correct variations in crop growth among the growth information acquisition areas Ag in the field A, and as a result to improve the quality of the crops cultivated in the field A, stabilize the yield, and so on.
[Another Embodiment]
Other embodiments of the present invention will be described.
Note that the configurations of the respective embodiments described below are not limited to being applied individually; they can also be applied in combination with the configurations of the other embodiments.
(1) The configuration of the work vehicle V can be changed in various ways.
For example, the work vehicle V may have a semi-crawler specification traveling vehicle body 1 provided with left and right crawlers in place of the left and right rear wheels 11.
For example, the work vehicle V may have a full-crawler specification traveling vehicle body 1 provided with left and right crawlers in place of the left and right front wheels 10 and the left and right rear wheels 11.
For example, the work vehicle V may have a traveling vehicle body 1 capable of manual travel only.
For example, the work vehicle V may be configured as an electric specification provided with an electric motor in place of the engine 14.
For example, the work vehicle V may be configured as a hybrid specification provided with the engine 14 and an electric motor.
(2) The information integration processing unit 80C and the obstacle control unit 23H, which function as the work support unit, may be configured to provide work support, such as notifying that an obstacle is present around the work vehicle V and avoiding a collision with the obstacle, based on obstacle detection by the obstacle detection unit 80B alone.
Further, the obstacle control unit 23H may be configured to provide work support that makes it easier to avoid a collision with an obstacle by performing, based on the detection of the obstacle, only a notification process that notifies that an obstacle is present around the work vehicle V.
(3) The growth information acquisition unit 80Ca may be configured to acquire the growth information of the crop Z cultivated in the field A based on the color image information from the rear camera 82 of the imaging unit 80A and the measurement information from the second obstacle sensor 87.
(4) As the third obstacle sensor 88, a right lidar sensor whose measurement range is set to the third measurement range Rm3 to the right of the cabin 13 and a left lidar sensor whose measurement range is set to the fourth measurement range Rm4 to the left of the cabin 13 may be provided.
In this configuration, the growth information acquisition unit 80Ca may be configured to acquire the growth information of the crop Z cultivated in the field A based on the color image information from the right camera 83 of the imaging unit 80A and the measurement information from the right lidar sensor. The growth information acquisition unit 80Ca may also be configured to acquire the growth information of the crop Z cultivated in the field A based on the color image information from the left camera 84 of the imaging unit 80A and the measurement information from the left lidar sensor.
(5) The growth information acquisition unit 80Ca may be configured to acquire, as the growth information of the crop Z, a green normalized difference vegetation index (GNDVI), which is less affected by the soil between ridges than the NDVI. The green normalized difference vegetation index can be calculated by substituting the reflectance G of visible green light and the reflectance IR of near-infrared light in the growth information acquisition area Ag into the GNDVI formula GNDVI = (IR - G) / (IR + G).
The growth information acquisition unit 80Ca may also be configured to acquire, as the growth information of the crop Z, a perpendicular vegetation index (PVI), a soil adjusted vegetation index (SAVI), or the like.
(6) Growth information such as the NDVI, vegetation coverage rate, nitrogen uptake, and plant height value of the crop Z acquired by the growth information acquisition unit 80Ca, evaluation results by the growth evaluation unit 80Cb, and the like may be stored on a cloud server in association with the position information of the tractor 1 acquired by the positioning unit 30.
(7) The information integration processing unit 80C, including the growth information acquisition unit 80Ca and the growth evaluation unit 80Cb, may be provided in the in-vehicle control unit 23 or on a cloud server.
A second characteristic configuration of the present invention is that
the obstacle sensor is a lidar sensor that measures, in three dimensions, the distance to a measurement object present in the measurement range, and
the growth information acquisition unit acquires a plant height value of the crop as the growth information based on the distance information from the lidar sensor.
According to this configuration, since the lidar sensor has high ranging accuracy, a highly accurate plant height value of the crop can be acquired as the crop growth information. Based on the crop growth information including the acquired plant height value, the growth state of the crop in each predetermined area of the field can be evaluated more accurately, and based on the evaluated growth state of the crop in each predetermined area, an appropriate fertilizer application amount for the crop in each predetermined area can be calculated more accurately.
As a result, work support aimed at improving crop quality, stabilizing yield, and the like can be provided more appropriately based on crop growth information including a highly accurate plant height value.
A third characteristic configuration of the present invention is that
the growth information acquisition unit acquires the reflectance of visible red light from the color image information and the reflectance of the near-infrared light from the measurement information, and acquires a normalized difference vegetation index as the growth information.
According to this configuration, based on crop growth information including the normalized difference vegetation index, which is a representative form of crop growth information, the growth state of the crop in each predetermined area of the field can be evaluated appropriately, and based on the evaluated growth state of the crop in each predetermined area, an appropriate fertilizer application amount for the crop in each predetermined area can be calculated.
As a result, work support aimed at improving crop quality, stabilizing yield, and the like can be provided appropriately based on crop growth information including the normalized difference vegetation index.
A fourth characteristic configuration of the present invention is that
the work support unit includes an automatic travel control unit that automatically drives the work vehicle according to a target route generated for the field, based on the position information, and
the growth information acquisition unit acquires the growth information for each predetermined area set by dividing the field into a plurality of areas, as the work vehicle travels automatically.
According to this configuration, crop growth information for each predetermined area of the field can be acquired without the labor of manually driving the work vehicle through the field.
As a result, work support aimed at improving crop quality and stabilizing yield can be provided well, while eliminating the labor otherwise required to acquire the crop growth information.
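One possible way to organize this per-area acquisition is sketched below; the grid size, data layout, and function names are assumptions rather than anything defined in this document. Each positioning fix recorded during automatic travel is mapped onto a field grid, and the growth values are aggregated per cell:

```python
from collections import defaultdict
from statistics import mean

CELL_SIZE_M = 10.0  # assumed edge length of one predetermined area

def cell_of(easting_m: float, northing_m: float) -> tuple[int, int]:
    """Map a field-local metric position to the index of its grid cell."""
    return (int(easting_m // CELL_SIZE_M), int(northing_m // CELL_SIZE_M))

def bin_growth_samples(samples):
    """samples: iterable of (easting_m, northing_m, ndvi, plant_height_m).
    Returns {cell: {"ndvi": mean NDVI, "height": mean plant height}}."""
    per_cell = defaultdict(list)
    for easting, northing, ndvi_val, height in samples:
        per_cell[cell_of(easting, northing)].append((ndvi_val, height))
    return {cell: {"ndvi": mean(v[0] for v in values),
                   "height": mean(v[1] for v in values)}
            for cell, values in per_cell.items()}

# Example: two samples fall in one cell, a third in the neighbouring cell
data = [(3.0, 4.0, 0.70, 0.62), (6.0, 8.0, 0.74, 0.65), (14.0, 4.0, 0.50, 0.40)]
print(bin_growth_samples(data))
```

Keeping each aggregated record under its cell index, together with that cell's position, corresponds to the storage unit storing the growth information in association with the position information.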
本発明の第5特徴構成は、
前記生育情報取得部は、前記所定区域ごとの前記生育情報に基づいて前記所定区域ごとの施肥量を算出し、
前記自動走行制御部は、前記作業車両の自動走行にて施肥作業を行う場合には、前記位置情報と前記施肥量とに基づいて前記所定区域ごとの施肥量を自動調節する点にある。 The fifth characteristic configuration of the present invention is
The growth information acquisition unit calculates the fertilizer application amount for each predetermined area based on the growth information for each predetermined area.
The automatic traveling control unit automatically adjusts the fertilizer application amount for each predetermined area based on the position information and the fertilizer application amount when the fertilizer application work is performed by the automatic travel of the work vehicle.
According to this configuration, fertilizing work tailored to the crop growth information of each predetermined area of the field can be carried out through the automatic travel of the work vehicle.
As a result, work support aimed at improving crop quality and stabilizing yield can be provided well, while greatly reducing the labor required of the user.
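The sketch below shows one way such a variable-rate adjustment could be driven by the stored growth information; the linear NDVI-to-rate rule, the cell size, the unit (kg per 10 a), and all names are illustrative assumptions rather than the method defined here:

```python
CELL_SIZE_M = 10.0  # assumed edge length of one predetermined area

def cell_of(easting_m: float, northing_m: float) -> tuple[int, int]:
    """Map a field-local metric position to the index of its grid cell."""
    return (int(easting_m // CELL_SIZE_M), int(northing_m // CELL_SIZE_M))

def fertilizer_rate_kg_per_10a(ndvi_val: float,
                               base_rate: float = 8.0,
                               max_extra: float = 4.0,
                               target_ndvi: float = 0.8) -> float:
    """Assumed rule of thumb: areas with weaker growth (lower NDVI) receive
    more fertilizer, up to base_rate + max_extra; areas at or above the
    target NDVI receive only the base rate."""
    deficit = max(0.0, min(1.0, (target_ndvi - ndvi_val) / target_ndvi))
    return base_rate + max_extra * deficit

def rate_at_position(easting_m: float, northing_m: float,
                     per_cell_growth: dict) -> float:
    """Look up the rate for the area the vehicle is currently in, falling
    back to the base rate where no growth information was recorded."""
    info = per_cell_growth.get(cell_of(easting_m, northing_m))
    return fertilizer_rate_kg_per_10a(info["ndvi"]) if info else 8.0

# A weakly growing area receives a higher rate than one at the target NDVI
growth_map = {(0, 0): {"ndvi": 0.50}, (1, 0): {"ndvi": 0.80}}
print(rate_at_position(3.0, 4.0, growth_map))   # 9.5
print(rate_at_position(14.0, 4.0, growth_map))  # 8.0
```

During automatic travel, the controller would repeatedly look up the rate for the latest positioning fix in this way and pass it to the fertilizer spreading implement.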
Claims (5)
- A work support system comprising: an imaging unit that is mounted on a work vehicle working in a field and that captures visible light in an imaging range set around the work vehicle to acquire color image information; an obstacle sensor that is mounted on the work vehicle, that acquires measurement information including a reflection intensity of near-infrared light by projecting and receiving near-infrared light over a measurement range set around the work vehicle, and that detects an obstacle based on the acquired measurement information; a work support unit that supports work performed by the work vehicle based on the color image information and the measurement information; a positioning unit that measures a position of the work vehicle to acquire position information; a growth information acquisition unit that acquires growth information of a crop cultivated in the field based on the color image information and the measurement information; and a storage unit that stores the growth information in association with the position information.
- The work support system according to claim 1, wherein the obstacle sensor is a lidar sensor that three-dimensionally measures a distance to a measurement object existing in the measurement range, and the growth information acquisition unit acquires a plant height value of the crop as the growth information based on distance information from the lidar sensor.
- The work support system according to claim 1 or 2, wherein the growth information acquisition unit acquires a reflectance of visible red light from the color image information, acquires a reflectance of the near-infrared light from the measurement information, and acquires a normalized difference vegetation index as the growth information.
- The work support system according to any one of claims 1 to 3, wherein the work support unit includes an automatic travel control unit that causes the work vehicle to travel automatically, based on the position information, along a target route generated according to the field, and the growth information acquisition unit acquires, as the work vehicle travels automatically, the growth information for each of predetermined areas set by dividing the field into a plurality of sections.
- The work support system according to claim 4, wherein the growth information acquisition unit calculates a fertilizer application amount for each predetermined area based on the growth information for that area, and the automatic travel control unit, when fertilizing work is performed during automatic travel of the work vehicle, automatically adjusts the fertilizer application amount for each predetermined area based on the position information and the calculated fertilizer application amount.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2019196599A JP7399680B2 (en) | 2019-10-29 | 2019-10-29 | Work support system |
JP2019-196599 | 2019-10-29 | | |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021085053A1 (en) | 2021-05-06 |
Family
ID=75711632
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2020/037862 WO2021085053A1 (en) | 2019-10-29 | 2020-10-06 | Work assistance system |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP7399680B2 (en) |
WO (1) | WO2021085053A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2024190391A1 (en) * | 2023-03-13 | 2024-09-19 | コニカミノルタ株式会社 | Identification device, identification method, and program |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6369527B2 (en) | 2016-12-13 | 2018-08-08 | Tdk株式会社 | Sensor unit |
JP2019003400A (en) | 2017-06-15 | 2019-01-10 | Panasonic Intellectual Property Corporation of America | Parking lot management device, parking lot management method, and parking lot management system |
- 2019-10-29: JP application JP2019196599A (published as JP7399680B2), status: Active
- 2020-10-06: WO application PCT/JP2020/037862 (published as WO2021085053A1), status: Application Filing
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001025306A (en) * | 1999-07-13 | 2001-01-30 | Genichi Hayashi | Weeding machine for turf |
JP2001120042A (en) * | 1999-10-22 | 2001-05-08 | Hokuriku Natl Agricultural Experiment Station | Image mapping system |
JP2016049102A (en) * | 2014-08-29 | 2016-04-11 | 株式会社リコー | Farm field management system, farm field management method, and program |
JP2017216524A (en) * | 2016-05-30 | 2017-12-07 | パナソニックIpマネジメント株式会社 | Imaging apparatus |
WO2018096840A1 (en) * | 2016-11-24 | 2018-05-31 | 富士フイルム株式会社 | Image processing device, image capturing device, and image processing method |
WO2019003400A1 (en) * | 2017-06-29 | 2019-01-03 | 日本電気株式会社 | Coefficient calculation device, coefficient calculation method, and recording medium in which coefficient calculation program is recorded |
JP2019046149A (en) * | 2017-09-01 | 2019-03-22 | コニカミノルタ株式会社 | Crop cultivation support apparatus |
JP2019114138A (en) * | 2017-12-25 | 2019-07-11 | 井関農機株式会社 | Farm work supporting system |
JP2019129178A (en) * | 2018-01-22 | 2019-08-01 | ソニーセミコンダクタソリューションズ株式会社 | Semiconductor device and electronic apparatus |
JP2019174347A (en) * | 2018-03-29 | 2019-10-10 | ヤンマー株式会社 | Obstacle detection system |
Also Published As
Publication number | Publication date |
---|---|
JP2021069294A (en) | 2021-05-06 |
JP7399680B2 (en) | 2023-12-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP7044694B2 (en) | Obstacle detection system for work vehicles | |
WO2021006322A1 (en) | Automatic travel system for work vehicle | |
US12035647B2 (en) | Automatic travel system for work vehicles | |
JP7526333B2 (en) | Automatic driving system for work vehicle and control method for work vehicle | |
US20220381920A1 (en) | Obstacle Detection System | |
US20210018617A1 (en) | Obstacle Detection System for Work Vehicle | |
WO2020137134A1 (en) | Collision avoidance system for work vehicle | |
WO2021085053A1 (en) | Work assistance system | |
JP7544778B2 (en) | Autonomous driving system for work vehicles | |
JP7059221B2 (en) | Control system for work vehicles | |
JP7317165B2 (en) | Obstacle detection system for work vehicles | |
JP2022028333A (en) | Automated travelling unit for work vehicle | |
JP2022087244A (en) | Control system for work vehicle |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 20883046; Country of ref document: EP; Kind code of ref document: A1 |
 | NENP | Non-entry into the national phase | Ref country code: DE |
 | 122 | Ep: pct application non-entry in european phase | Ref document number: 20883046; Country of ref document: EP; Kind code of ref document: A1 |