WO2018235486A1 - Harvester - Google Patents

Harvester

Info

Publication number
WO2018235486A1
Authority
WO
WIPO (PCT)
Prior art keywords
recognition
weed
unit
output data
data
Prior art date
Application number
PCT/JP2018/019444
Other languages
French (fr)
Japanese (ja)
Inventor
高原一浩
宮下隼輔
石見憲一
Original Assignee
株式会社クボタ
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2017123440A (patent JP7068781B2)
Priority claimed from JP2017123441A (patent JP6765349B2)
Priority claimed from JP2017123439A (patent JP6854713B2)
Application filed by 株式会社クボタ
Priority to KR1020197031226A (patent KR102589076B1)
Priority to CN201880029905.8A (patent CN110582198B)
Publication of WO2018235486A1

Classifications

    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01B SOIL WORKING IN AGRICULTURE OR FORESTRY; PARTS, DETAILS, OR ACCESSORIES OF AGRICULTURAL MACHINES OR IMPLEMENTS, IN GENERAL
    • A01B69/00 Steering of agricultural machines or implements; Guiding agricultural machines or implements on a desired track
    • A01B69/007 Steering or guiding of agricultural vehicles, e.g. steering of the tractor to keep the plough in the furrow
    • A01B69/008 Steering or guiding of agricultural vehicles, e.g. steering of the tractor to keep the plough in the furrow, automatic
    • A01D HARVESTING; MOWING
    • A01D41/00 Combines, i.e. harvesters or mowers combined with threshing devices
    • A01D41/12 Details of combines
    • A01D41/127 Control or measuring arrangements specially adapted for combines
    • A01D41/1278 Control or measuring arrangements specially adapted for combines for automatic steering
    • A01D61/00 Elevators or conveyors for binders or combines
    • A01D69/00 Driving mechanisms or parts thereof for harvesters or mowers
    • A01F PROCESSING OF HARVESTED PRODUCE; HAY OR STRAW PRESSES; DEVICES FOR STORING AGRICULTURAL OR HORTICULTURAL PRODUCE
    • A01F12/00 Parts or details of threshing apparatus
    • A01F12/46 Mechanical grain conveyors

Definitions

  • The present invention relates to a harvester that harvests planted grain culms while traveling across a field.
  • In the conventional combine of Patent Document 1, a threshing depth detection sensor is provided that detects the tip position of the harvested grain culms conveyed from the reaping unit to the threshing device, and a threshing depth adjusting means is controlled based on the detection information of this sensor so that the threshing device threshes the conveyed culms at a target threshing depth.
  • In another conventional combine, a television camera that photographs the grain culms in front of the reaping unit and an image processing apparatus are provided.
  • The image processing apparatus detects specific states of the grain culms by comparing the image from the television camera with pre-stored images representing various culm conditions. For example, when the image processing apparatus detects that some of the culms in front of the reaping unit have lodged, the raking reel is tilted with the lodged side down. As a result, harvesting performance for lodged culms is improved.
  • The combine determines that it has reached a headland when it detects that no grain culms are present in front of the reaping unit.
  • At the headland, power transmission to the cutting device is interrupted, the cutting device is raised, and the traveling speed is reduced. As a result, collision of the cutting device with a ridge or the like while turning on the headland is avoided, and the combine turns smoothly on the headland.
  • In a further conventional configuration, a camera for photographing the rear of the machine body is mounted, and the rows of stubble left by the reaping unit are recognized from the photographed image. Based on the deviation between the recognized stubble rows and the machine body, steering control of the vehicle is performed so that each divider passes between the stubble rows.
  • The technology of threshing at an appropriate threshing depth, in which the threshing depth is adjusted using the threshing depth detection sensor while the harvested grain culms reaped by the reaping unit are conveyed to the threshing device, contributes to improved threshing performance.
  • However, when weeds are taken in, the threshing depth adjusting means adjusts the threshing depth in the threshing device to the length of the weeds, which causes the problem that the threshing performance for the harvested grain culms themselves is reduced.
  • In the configurations described above, the traveling device and the working device are controlled based on detection results obtained from the photographed images acquired by the on-board camera.
  • However, various situations can arise in which the photographed image becomes unsuitable.
  • Because the traveling surface of a field is not flat like a road, sudden vibrations are transmitted to the on-board camera.
  • Dust rising during work travel is another example. If the working device or the traveling device is controlled based on an unsuitable photographed image, the work travel becomes inappropriate and satisfactory results cannot be obtained.
  • An object of the present invention is to further improve work performance and work efficiency in light of the circumstances described above.
  • One feature of a harvester according to the present invention is a combine that harvests planted grain culms while traveling across a field, comprising: a reaping unit that reaps the grain culms from the field; a culm conveying device that conveys the harvested culms from the reaping unit to a threshing device; a threshing depth adjusting mechanism provided in the culm conveying device; a threshing depth control unit that performs threshing depth adjustment control using the threshing depth adjusting mechanism based on the length of the harvested culms; a machine position calculation unit that calculates the machine position, i.e., the map coordinates of the machine body, based on positioning data from a satellite positioning module; a photographing unit, provided on the machine body, that photographs the field during harvesting work; an image recognition module that receives the image data of the photographed images sequentially acquired by the photographing unit, estimates the weed growth area in each photographed image, and outputs recognition output data indicating the estimated weed growth area; a weed position information generation unit that generates weed position information indicating the position of the weed growth area on the map from the machine position at the time the photographed image was acquired and the recognition output data; and a work traveling control unit that executes weed entry control while the weeds reaped in the weed growth area pass through the threshing depth adjusting mechanism.
  • That is, the weed growth area is estimated by the image recognition module from the image data of the photographed image. Furthermore, since the machine position, expressed in map coordinates at the time the photographed image was acquired, is calculated by the machine position calculation unit, weed position information indicating the position of the weed growth area on the map is generated from the machine position and the recognition output data indicating the weed growth area.
  • Since the machine position, i.e., the map coordinates of the machine body, is calculated by the machine position calculation unit, the timing at which the weeds pass through the threshing depth adjusting mechanism can be determined by considering the position at which the weeds are reaped on the machine body and the conveyance time of the weeds from the reaping position to the threshing depth adjusting mechanism. By executing special weed entry control while the weeds pass through the threshing depth adjusting mechanism, based on this determined timing, the reduction in threshing performance for the planted culms can be suppressed even in a field where weeds grow locally (see the sketch below).
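For illustration only, the timing logic just described can be sketched in Python. The class name, the fixed conveyance-path length, and all numeric values are assumptions made for this example; the patent does not specify an implementation.

```python
from dataclasses import dataclass

# Hypothetical geometry: the path length is measured along the conveyance
# route from the cutting point to the threshing depth adjusting mechanism.

@dataclass
class WeedPassTiming:
    conveyance_speed: float  # culm conveying device speed [m/s]
    path_length: float       # reaping point -> adjusting mechanism [m]

    def pass_interval(self, t_reap_start: float, t_reap_end: float) -> tuple[float, float]:
        """Interval during which reaped weeds are inside the adjusting mechanism.

        t_reap_start / t_reap_end: times at which the reaping unit enters and
        leaves the weed growth area (determined from the machine position and
        the weed position information).
        """
        delay = self.path_length / self.conveyance_speed
        return (t_reap_start + delay, t_reap_end + delay)

timing = WeedPassTiming(conveyance_speed=1.2, path_length=1.8)
start, end = timing.pass_interval(t_reap_start=10.0, t_reap_end=12.5)
print(f"weed entry control active from t={start:.2f}s to t={end:.2f}s")
```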
  • Another feature of a harvester according to the present invention is a combine that harvests planted grain culms while traveling across a field, comprising: a reaping unit that reaps the grain culms from the field; a culm conveying device that conveys the harvested culms from the reaping unit to a threshing device; a threshing depth adjusting mechanism provided in the culm conveying device; a threshing depth control unit that performs threshing depth adjustment control using the threshing depth adjusting mechanism based on the length of the harvested culms; a machine position calculation unit that calculates the machine position, i.e., the map coordinates of the machine body, based on positioning data from a satellite positioning module; a photographing unit, provided on the machine body, that photographs the field during harvesting work; an image recognition module that receives the image data of the photographed images sequentially acquired by the photographing unit, estimates the weed growth area in each photographed image, and outputs recognition output data indicating the estimated weed growth area; a weed position information generation unit that generates weed position information indicating the position of the weed growth area on the map from the machine position at the time the photographed image was acquired and the recognition output data; and a work traveling control unit that executes weed entry control while the reaping unit passes through the weed growth area.
  • In this configuration, the weed entry control is executed while the reaping unit passes through the weed growth area.
  • The control is simplified because it is not necessary to take the conveyance time of the weeds and the like into account.
  • The timing at which the reaping unit passes through the weed growth area can be accurately determined from the machine position calculated based on the positioning data from the satellite positioning module, the distance between the machine position and the reaping unit, and the position of the weed growth area on the map contained in the weed position information. Accordingly, in one preferred embodiment of the present invention, the timing at which the reaping unit passes through the weed growth area is determined based on the weed position information and the machine position calculated by the machine position calculation unit, as illustrated in the sketch below.
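A minimal geometric sketch of this determination, assuming the weed growth area arrives as four map-coordinate corner points and the reaping unit lies a fixed distance ahead of the positioning antenna. The function names, offsets, and coordinates are illustrative, not from the patent.

```python
import math

def reaper_position(machine_pos, heading_rad, reaper_offset=2.0):
    """Project the machine position forward to the reaping unit."""
    x, y = machine_pos
    return (x + reaper_offset * math.cos(heading_rad),
            y + reaper_offset * math.sin(heading_rad))

def inside_convex_quad(p, quad):
    """True if point p lies inside the convex quadrilateral quad (4 CCW points)."""
    x, y = p
    sign = None
    for i in range(4):
        x1, y1 = quad[i]
        x2, y2 = quad[(i + 1) % 4]
        cross = (x2 - x1) * (y - y1) - (y2 - y1) * (x - x1)
        if cross != 0:
            if sign is None:
                sign = cross > 0
            elif (cross > 0) != sign:
                return False
    return True

weed_quad = [(10.0, 5.0), (14.0, 5.0), (14.0, 8.0), (10.0, 8.0)]  # CCW map corners
pos = reaper_position(machine_pos=(9.5, 6.0), heading_rad=0.0)
print("execute weed entry control:", inside_convex_quad(pos, weed_quad))  # True
```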
  • When weeds taller than the planted grain culms enter the threshing depth adjusting mechanism, the mechanism treats the weeds as crop stems, and the threshing depth control unit therefore adjusts the threshing depth to the length of the weeds. As a result, the adjusted threshing depth is not suitable for the actual harvested culms. Since the length of the harvested culms does not change so rapidly, it is preferable to temporarily interrupt the threshing depth adjustment control, and leave the threshing depth unchanged, while weeds enter the threshing depth adjusting mechanism. Accordingly, in one preferred embodiment of the present invention, execution of the weed entry control interrupts the threshing depth adjustment control.
  • In one preferred embodiment of the present invention, execution of the weed entry control reduces the vehicle speed.
  • A further feature of a harvester according to the present invention is a harvester that harvests crops while traveling across a field, comprising: a machine position calculation unit that calculates the machine position, i.e., the map coordinates of the machine body, based on positioning data from a satellite positioning module; a photographing unit, provided on the machine body, that photographs the field during harvesting work; an image recognition module that receives the image data of the photographed images sequentially acquired by the photographing unit, estimates the existence area in which a recognition object exists in the photographed image, and outputs recognition output data including the existence area and the estimation probability at the time the existence area was estimated; and a recognition object position information generation unit that generates recognition object position information indicating the position of the recognition object on the map from the machine position at the time the photographed image was acquired and the recognition output data.
  • The image recognition module, which estimates the existence area in which a recognition object exists in a photographed image from the image data of that image, is constructed using techniques such as neural networks (including deep learning) or reinforcement learning. Moreover, since it is configured to output the estimation probability computed when the existence area is estimated, subsequent processing can take the reliability of the recognition into account.
  • Due to perspective, the resolution for a recognition object far from the photographing unit is lower than that for a recognition object near the photographing unit. Consequently, a recognition object appearing at a location far from the photographing unit has lower recognition reliability than one appearing near the photographing unit.
  • In view of this, in one preferred embodiment of the present invention, the image recognition module is configured such that the estimation probability of a recognition object is reduced the farther the recognition object is located from the photographing unit in the photographed image.
  • The statistical operation is an arithmetic mean, a weighted mean, a median operation, or the like, that is, an operation by which a more reliable data value is derived from a plurality of data values.
  • In one preferred embodiment of the present invention, a plurality of pieces of recognition object position information are stored, and the stored pieces of recognition object position information are corrected based on the result of a statistical operation on the estimation probabilities included in the corresponding recognition output data.
  • In one preferred embodiment of the present invention, a data storage unit that temporarily stores the recognition output data sequentially output from the image recognition module as a time-series recognition output data sequence, and a data determination unit are provided. The data determination unit determines recognition output data whose estimation probability is lower, by a predetermined level or more, than the estimation probabilities of the temporally preceding and succeeding recognition output data in the recognition output data sequence to be unsuitable recognition output data.
  • The recognition output data determined to be unsuitable may be deleted, or its estimation probability may be replaced with an estimation probability interpolated from the preceding and succeeding recognition output data, as sketched below.
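A toy sketch of this determination and correction step: flag a datum whose estimation probability drops below both temporal neighbours by a threshold, then repair it by interpolation. The sequence values and the threshold are invented for the example.

```python
def flag_unsuitable(probs, level=0.2):
    """Indices whose probability is lower than both neighbours by >= level."""
    bad = []
    for i in range(1, len(probs) - 1):
        if probs[i - 1] - probs[i] >= level and probs[i + 1] - probs[i] >= level:
            bad.append(i)
    return bad

def interpolate(probs, bad):
    """Replace each flagged value with the mean of its neighbours."""
    fixed = list(probs)
    for i in bad:
        fixed[i] = (probs[i - 1] + probs[i + 1]) / 2.0
    return fixed

seq = [0.82, 0.85, 0.31, 0.84, 0.80]  # sudden dip, e.g. vibration or dust
bad = flag_unsuitable(seq)
print(bad, interpolate(seq, bad))      # [2] [0.82, 0.85, 0.845, 0.84, 0.8]
```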
  • Recognition objects that are important for a harvester that harvests crops while traveling across a field are people, lodged grain culms, weeds, ridges, and the like. Recognizing a person, who is an obstacle to work travel, triggers obstacle avoidance control and obstacle alarm notification, and is therefore an important factor for safe travel in the field. Recognizing lodged culms and weeds is important for performing high-quality harvesting work, because it drives countermeasure controls for lodged culms and for weeds in the work control. Recognizing ridges is important for recognizing the boundary of the field, for example for traveling along the field boundary.
  • A further feature of a harvester according to the present invention comprises: a machine position calculation unit that calculates the machine position, i.e., the map coordinates of the machine body, based on positioning data from a satellite positioning module, and calculates the traveling locus of the machine body from the machine positions; a photographing unit, provided on the machine body, that photographs the field during harvesting work; an image recognition module that receives the image data of the photographed images sequentially acquired by the photographing unit, estimates the existence area in which a recognition object exists, and outputs recognition output data including the existence area and the estimation probability at the time the existence area was estimated; and a data determination unit that, based on the traveling locus, predicts the range in which the existence area from the preceding photographed image should be located in the next photographed image, and determines the recognition output data based on the next photographed image to be unsuitable recognition output data when the estimation probability of the existence area overlapping that range in the next photographed image and the estimation probability of the existence area in the preceding photographed image differ by a predetermined allowable amount or more.
  • When a recognition object appears ahead while the harvester is traveling and working, a plurality of photographed images containing that recognition object are usually acquired. The existence area of the recognition object is therefore estimated for each of the plurality of photographed images, and recognition output data including the estimation probability is output for each.
  • Since the recognition object recognized in each of the plurality of photographed images is substantially a stationary object, it appears at a different location in each photographed image according to the machine position at which that image was taken, that is, according to the traveling locus. Therefore, based on the traveling locus, the range in which the existence area of the recognition object should be located in a photographed image can be predicted.
  • If the existence area of the recognition object is not located within the range predicted in this way, or if it is located there but its estimation probability differs significantly from the estimation probabilities based on the other photographed images, the photographed image in question is considered unsuitable, and the recognition output data output based on it is also considered unsuitable. It is therefore useful to predict, based on the traveling locus, the range in which the existence area from the preceding photographed image should be located in the next photographed image, and to determine the recognition output data based on the next photographed image to be unsuitable recognition output data when the estimation probability of the existence area overlapping that range and the estimation probability of the existence area in the preceding photographed image differ by a predetermined allowable amount or more, as sketched below.
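A sketch of this locus-based consistency check under simplifying assumptions: the existence areas have already been projected onto the ground plane in machine-body coordinates as axis-aligned boxes, so a stationary area shifts by the opposite of the machine displacement between frames. All names and numbers are illustrative.

```python
def predict_area(prev_area, displacement):
    """Shift a box (x1, y1, x2, y2) opposite to the machine displacement (dx, dy)."""
    x1, y1, x2, y2 = prev_area
    dx, dy = displacement
    return (x1 - dx, y1 - dy, x2 - dx, y2 - dy)

def overlaps(a, b):
    """Axis-aligned box intersection test."""
    return not (a[2] < b[0] or b[2] < a[0] or a[3] < b[1] or b[3] < a[1])

def is_unsuitable(prev, curr, displacement, allowable=0.25):
    """prev/curr: (area, estimation probability) from consecutive frames."""
    expected = predict_area(prev[0], displacement)
    if not overlaps(expected, curr[0]):
        return True                             # area not where the locus predicts
    return abs(curr[1] - prev[1]) >= allowable  # probability jumped too much

prev = ((4.0, 6.0, 6.0, 8.0), 0.83)
curr = ((3.6, 5.5, 5.6, 7.5), 0.45)             # plausible area, implausible probability
print(is_unsuitable(prev, curr, displacement=(0.4, 0.5)))  # True
```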
  • In one preferred embodiment of the present invention, a data correction unit is provided that replaces the estimation probability of the unsuitable recognition output data with an interpolated estimation probability obtained from the estimation probabilities of the recognition output data temporally preceding and succeeding it.
  • By interpolating and correcting the estimation probability of recognition output data determined to be unsuitable with the estimation probabilities of the preceding and succeeding recognition output data, the unsuitable recognition output data is replaced with proper recognition output data and can still be used. This is all the more effective when the number of recognition output data for the same recognition object is small.
  • FIG. 1 is an overall side view of a combine according to a first embodiment.
  • FIG. 2 is a side view of the reaping unit in the first embodiment.
  • FIG. 3 is an explanatory view of the threshing depth control in the first embodiment.
  • FIG. 4 is a functional block diagram showing functions of the control system of the combine in the first embodiment.
  • FIG. 5 is an explanatory view schematically showing the flow of generation of recognition output data by the image recognition module in the first embodiment.
  • FIG. 6 is a flowchart showing the flow of control that interrupts the threshing depth control by calculating the weed position from a photographed image containing a weed growth area in the first embodiment.
  • FIG. 7 is a schematic diagram showing a weed map obtained by mapping weed position information in the first embodiment.
  • FIG. 8 is an overall side view of a combine according to a second embodiment.
  • FIG. 9 is a functional block diagram showing the control system of the combine in the second embodiment.
  • FIG. 10 is an explanatory view schematically showing the flow of generation of recognition output data by the image recognition module in the second embodiment.
  • FIG. 11 is a data flow diagram showing the flow of data when recognition object position information is generated from a photographed image in the second embodiment.
  • FIG. 12 is a schematic diagram showing a weed map obtained by mapping weed position information, as one kind of recognition object position information, in the second embodiment.
  • FIG. 13 is an overall side view of a combine according to a third embodiment.
  • FIG. 14 is a functional block diagram showing the control system of the combine in the third embodiment.
  • FIG. 15 is an explanatory view schematically showing the flow of generation of recognition output data by the image recognition module in the third embodiment.
  • FIG. 16 is a data flow diagram showing the flow of data when recognition object position information is generated from a photographed image in the third embodiment.
  • FIG. 17 is an explanatory view schematically illustrating the data processing for determining unsuitable recognition output data in the third embodiment.
  • FIG. 18 is a schematic diagram showing a weed map obtained by mapping weed position information, as one kind of recognition object position information, in the third embodiment.
  • Embodiment 1 As shown in FIG. 1, in the combine according to the first embodiment, a reaping unit 2 is connected to the front of a machine body 1, which includes a pair of left and right crawler traveling devices 10, so as to be movable up and down about a horizontal axis X.
  • A threshing device 11 and a grain tank 12 for storing grain are provided at the rear of the machine body 1, aligned in the machine body width direction.
  • A cabin 14 that accommodates the boarding driver is provided at the front right side of the machine body 1, and a driving engine 15 is provided below the cabin 14.
  • The threshing device 11 internally receives the harvested grain culms that are reaped by the reaping unit 2 and conveyed rearward, and threshes the ear-tip side with a threshing cylinder 113 while the stalk-base side of the culms is clamped and conveyed by a threshing feed chain 111 and a clamping rail 112.
  • Grain sorting is performed on the threshed product in a sorting unit provided below the threshing cylinder 113, and the grain sorted there is conveyed to the grain tank 12 and stored.
  • A grain discharging device 13 for discharging the grain stored in the grain tank 12 to the outside is provided.
  • The reaping unit 2 is provided with a plurality of raising devices 21 for raising lodged grain culms, a clipper-type cutting device 22 for cutting the stalk bases of the raised culms, a culm conveying device 23, and the like.
  • The culm conveying device 23 conveys the harvested culms toward the starting end of the threshing feed chain 111 of the threshing device 11, located at the rear of the machine body, while gradually changing them from the upright posture, in which the stalk bases have just been cut, to a lying posture.
  • The culm conveying device 23 has a merging conveyance unit 231 that conveys the plurality of harvested culms cut by the cutting device 22 while gathering them toward the center in the cutting width direction, a stalk-base clamping conveyance device 232 that clamps and conveys the stalk bases of the gathered culms, a supply conveyance device 234 that guides the stalk bases of the harvested culms from the terminal end of the stalk-base clamping conveyance device 232 toward the threshing feed chain 111, and the like.
  • The stalk-base clamping conveyance device 232 is supported on the support frame of the reaping unit 2 so as to be swingable about a horizontal axis.
  • The stalk-base clamping conveyance device 232 is swung up and down by a drive operation mechanism 235; as it swings, the position of its conveyance terminal end relative to the supply conveyance device 234 changes in the culm length direction.
  • The drive operation mechanism 235 has a threshing depth adjusting electric motor 236 (hereinafter referred to as the threshing depth motor) as its drive source.
  • The drive operation mechanism 235 includes an operation rod 237 that is pushed and pulled by the threshing depth motor 236.
  • The lower end of the operation rod 237 is pivotally connected to an intermediate portion of the stalk-base clamping conveyance device 232.
  • When the culms are delivered to the supply conveyance device 234 with the stalk-base clamping position of the supply conveyance device 234 shifted toward the ear-tip side relative to the stalk-base clamping position of the stalk-base clamping conveyance device 232, the entry depth (threshing depth) of the harvested culms into the threshing device 11 is changed to the shallower side (shallow threshing side).
  • Conversely, when the culms are delivered to the supply conveyance device 234 with the stalk-base clamping position of the supply conveyance device 234 shifted toward the stalk-base side relative to the stalk-base clamping position of the stalk-base clamping conveyance device 232, the threshing depth of the harvested culms with respect to the threshing device 11 is changed to the deeper side (deep threshing side).
  • The stalk-base clamping conveyance device 232 and the drive operation mechanism 235 constitute a threshing depth adjusting mechanism 3 capable of changing the threshing depth of the harvested culms with respect to the threshing device 11.
  • The combine according to the first embodiment is equipped with a contact-type culm length detection device 30 for detecting the culm length of the harvested culms conveyed by the culm conveying device 23, and with a threshing depth control unit 620.
  • The threshing depth control unit 620 performs threshing depth adjustment control, which adjusts the threshing depth based on the detection result of the culm length detection device 30.
  • Specifically, the threshing depth control unit 620 controls the threshing depth motor 236 so that the threshing depth of the harvested culms with respect to the threshing device 11 is maintained within a target setting range.
  • The culm length detection device 30 has a pair of swingable sensor arms 32, 33 on a main body case 31, which is formed in a substantially bottomless box shape open downward.
  • The pair of sensor arms 32, 33 contact the ear-tip end of the harvested culms to detect the culm length.
  • The pair of sensor arms 32, 33 are supported by the main body case 31 and hang downward, spaced apart from each other in the culm length direction of the conveyed culms.
  • Each sensor arm 32, 33 is swingable back and forth (in the moving direction of the harvested culms) about a horizontal axis provided inside the main body case 31, and is supported under a bias that returns it to a downward reference posture.
  • Detection switches 34 and 35 are provided, which turn on when the swing amount of the sensor arm 32 or 33 from the reference posture reaches a set amount, and turn off when it is less than the set amount.
  • The outputs of the pair of detection switches 34 and 35 are input to the threshing depth control unit 620.
  • The threshing depth control unit 620 controls the operation of the threshing depth motor 236 so that the detection switch 35 located on the ear-tip side is in the OFF state while the detection switch 34 located on the stalk-base side is in the ON state.
  • When the pair of detection switches 34 and 35 are both in the ON state, the threshing depth control unit 620 operates the threshing depth motor 236 so that the stalk-base clamping conveyance device 232 moves to the shallow threshing side.
  • When the pair of detection switches 34 and 35 are both in the OFF state, the threshing depth control unit 620 operates the threshing depth motor 236 so that the stalk-base clamping conveyance device 232 moves to the deep threshing side.
  • When the detection switch 35 located on the ear-tip side is in the OFF state and the detection switch 34 located on the stalk-base side is in the ON state, the threshing depth control unit 620 stops the operation of the threshing depth motor 236 and maintains that state.
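The three switch states described above amount to a simple bang-bang control law. The following is a sketch of that logic only, with invented function names, not Kubota's ECU implementation.

```python
def depth_motor_command(switch_tip_on: bool, switch_base_on: bool) -> str:
    """Map the two detection switch states to a threshing depth motor action."""
    if switch_tip_on and switch_base_on:
        return "move shallow"   # culms reach past the ear-tip-side arm: too deep
    if not switch_tip_on and not switch_base_on:
        return "move deep"      # culms reach neither arm: too shallow
    if not switch_tip_on and switch_base_on:
        return "hold"           # culm tips lie between the arms: target range
    return "hold"               # tip ON / base OFF should not occur in practice

for tip, base in [(True, True), (False, False), (False, True)]:
    print(tip, base, "->", depth_motor_command(tip, base))
```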
  • A photographing unit 70 provided with a color camera is attached at the front end of the ceiling of the cabin 14.
  • The front-rear extent of the photographing field of view of the photographing unit 70 reaches approximately from the front end region of the reaping unit 2 to the horizon.
  • The width of the photographing field of view reaches about 10 m to several tens of meters.
  • The photographed images acquired by the photographing unit 70 are converted into image data and sent to the control system of the combine.
  • The photographing unit 70 photographs the field during harvesting work.
  • The control system of the combine has a function of recognizing a weed growth area as a recognition object from the image data sent from the photographing unit 70.
  • A group of normally planted grain culms is indicated by symbol Z0, and a weed growth area is indicated by symbol Z1.
  • A satellite positioning module 80 is also provided on the ceiling of the cabin 14.
  • The satellite positioning module 80 includes a satellite antenna for receiving GNSS (global navigation satellite system) signals, including GPS signals.
  • An inertial navigation unit incorporating a gyro/acceleration sensor and a magnetic direction sensor is built into the satellite positioning module 80.
  • Of course, the inertial navigation unit may be located elsewhere.
  • In the drawing, the satellite positioning module 80 is disposed at the rear of the ceiling of the cabin 14 for convenience of illustration; however, it is preferably disposed as close as possible to a position directly above the left-right center of the front end of the cutting device 22, toward the center of the machine body.
  • FIG. 4 shows a functional block diagram of the control system built inside the machine body 1 of the combine.
  • The control system of this embodiment comprises a large number of electronic control units called ECUs, various operation devices, sensors, and switches, and a wiring network, such as an in-vehicle LAN, for data transmission between them.
  • The notification device 91 is a device for notifying the driver and others of the work traveling state and various warnings, and is a buzzer, a lamp, a speaker, a display, or the like.
  • The communication unit 92 is used to exchange data with a cloud computer system 100 installed at a remote location or with a portable communication terminal 200.
  • Here, the portable communication terminal 200 is a tablet computer operated by a supervisor (including the driver) at the work travel site.
  • The control unit 6 is the core element of this control system and is shown as a collection of a plurality of ECUs.
  • The positioning data acquired by the satellite positioning module 80 and the image data acquired by the photographing unit 70 are input to the control unit 6 through the wiring network.
  • The control unit 6 includes an output processing unit 6B and an input processing unit 6A as input/output interfaces.
  • The output processing unit 6B is connected to a vehicle traveling device group 7A and a working device group 7B.
  • The vehicle traveling device group 7A includes control devices related to vehicle traveling, such as an engine control device, a shift control device, a braking control device, and a steering control device.
  • The working device group 7B includes power control devices and the like for the reaping unit 2, the threshing device 11, the grain discharging device 13, the culm conveying device 23, and the threshing depth adjusting mechanism 3.
  • A traveling system detection sensor group 8A, a working system detection sensor group 8B, and the like are connected to the input processing unit 6A.
  • The traveling system detection sensor group 8A includes sensors that detect the states of an engine speed adjuster, an accelerator pedal, a brake pedal, a gearshift operator, and the like.
  • The working system detection sensor group 8B includes sensors that detect the device states of the reaping unit 2, the threshing device 11, the grain discharging device 13, and the culm conveying device 23, as well as the states of the culms and grain. The working system detection sensor group 8B also includes the detection switches 34 and 35 of the above-described threshing depth adjusting mechanism 3.
  • The control unit 6 includes an image recognition module 5, a data processing module 50, a work traveling control module 60 serving as the work traveling control unit, a machine position calculation unit 66, a notification unit 67, and a weed position calculation unit 68.
  • The notification unit 67 generates notification data based on commands and the like received from the functional units of the control unit 6, and gives the notification data to the notification device 91.
  • The machine position calculation unit 66 calculates the machine position, i.e., the map coordinates (or field coordinates) of the machine body 1, based on the positioning data sequentially sent from the satellite positioning module 80.
  • The weed position calculation unit 68 determines the timing at which the weeds reaped by the reaping unit 2 pass through the threshing depth adjusting mechanism 3, based on the position on the map of the weed growth area calculated by the data processing module 50, the machine position calculated by the machine position calculation unit 66, and the conveyance speed of the culm conveying device 23.
  • The combine according to this embodiment can travel both by automatic travel (automatic steering) and by manual travel (manual steering).
  • The work traveling control module 60 includes an automatic work traveling instruction unit 63 and a travel route setting unit 64, in addition to a traveling control unit 61 and a work control unit 62.
  • A traveling mode switch (not shown) for selecting either the automatic traveling mode, in which the vehicle travels by automatic steering, or the manual steering mode, in which it travels by manual steering, is provided in the cabin 14. By operating the traveling mode switch, it is possible to shift from manual steering travel to automatic steering travel, or from automatic steering travel to manual steering travel.
  • The traveling control unit 61 has an engine control function, a steering control function, a vehicle speed control function, and the like, and gives traveling control signals to the vehicle traveling device group 7A.
  • The work control unit 62 gives work control signals to the working device group 7B in order to control the movements of the reaping unit 2, the threshing device 11, the grain discharging device 13, the culm conveying device 23, and the like.
  • The work control unit 62 includes the threshing depth control unit 620 described with reference to FIG. 3.
  • When the manual steering mode is selected, the traveling control unit 61 generates control signals based on the driver's operations and controls the vehicle traveling device group 7A.
  • When the automatic traveling mode is selected, the traveling control unit 61 controls the devices of the vehicle traveling device group 7A related to steering and to vehicle speed, based on the automatic traveling instructions given by the automatic work traveling instruction unit 63.
  • The travel route setting unit 64 loads into memory a travel route for automatic travel created by the control unit 6, the portable communication terminal 200, the cloud computer system 100, or the like.
  • The travel route loaded into memory is used sequentially as the target travel route in automatic travel. Even during manual travel, this travel route can be used as guidance so that the combine travels along it.
  • The automatic work traveling instruction unit 63 generates automatic steering instructions and vehicle speed instructions and gives them to the traveling control unit 61.
  • The automatic steering instruction is generated so as to eliminate the azimuth deviation and the positional deviation between the travel route set by the travel route setting unit 64 and the machine position calculated by the machine position calculation unit 66 (see the sketch below).
  • The vehicle speed instruction is generated based on a previously set vehicle speed value.
  • The automatic work traveling instruction unit 63 gives the work control unit 62 working device operation instructions according to the machine position and the traveling state of the vehicle.
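One common way to derive a steering instruction from the two deviations mentioned above is a simple proportional law. The patent does not specify the control law; the gains, the clamping, and the proportional form below are assumptions for illustration.

```python
import math

def steering_command(pos_dev: float, azimuth_dev: float,
                     k_pos: float = 0.6, k_azi: float = 1.2,
                     max_angle: float = math.radians(35)) -> float:
    """Target steering angle [rad] that drives both deviations toward zero.

    pos_dev: signed lateral offset from the target travel route [m]
    azimuth_dev: signed heading error relative to the route [rad]
    """
    angle = -(k_pos * pos_dev + k_azi * azimuth_dev)
    return max(-max_angle, min(max_angle, angle))  # clamp to steering limits

print(f"{steering_command(pos_dev=0.3, azimuth_dev=math.radians(5)):.3f} rad")
```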
  • Image data of the photographed images sequentially acquired by the photographing unit 70 is sequentially input to the image recognition module 5.
  • The image recognition module 5 estimates the existence area in the photographed image where a recognition object exists, and outputs, as the recognition result, recognition output data including the existence area and the estimation probability at the time the existence area was estimated.
  • The image recognition module 5 is constructed using neural network technology employing deep learning.
  • The flow of generation of recognition output data by the image recognition module 5 is shown in FIG. 5 and FIG. 6.
  • Pixel values of RGB image data are input to the image recognition module 5 as input values.
  • The recognition object to be estimated is weeds. The recognition output data as the recognition result therefore includes a weed growth area indicated by a rectangle and the estimation probability at the time the weed growth area was estimated.
  • The estimation result is shown schematically, and the weed growth area is indicated by the rectangular frame denoted by reference sign F1.
  • Each weed growth area is defined by its four corner points, and the coordinate positions of these four corner points on the photographed image are also included in the estimation result.
  • If no weeds are estimated, no weed growth area is output, and the estimation probability is zero.
  • The image recognition module 5 sets internal parameters such that the estimation probability of the recognition object (weeds) decreases the farther the recognition object is located from the photographing unit 70 in the photographed image. As a result, recognition of objects in photographed regions whose resolution is low because they are far from the photographing unit 70 is made stricter, and erroneous recognition is reduced.
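An illustrative container for one recognition output datum, with the distance-dependent down-weighting applied as a post-processing step. The linear decay constant stands in for the internal parameters mentioned above and is purely an assumption.

```python
from dataclasses import dataclass

@dataclass
class RecognitionOutput:
    corners: list[tuple[float, float]]  # four corner points, image coordinates
    probability: float                  # estimation probability from the network
    distance_m: float                   # distance from the photographing unit

def distance_weighted(rec: RecognitionOutput, decay_per_m: float = 0.02) -> float:
    """Reduce the probability for areas far from the photographing unit."""
    return rec.probability * max(0.0, 1.0 - decay_per_m * rec.distance_m)

far = RecognitionOutput([(600, 80), (700, 80), (700, 130), (600, 130)], 0.90, 25.0)
near = RecognitionOutput([(200, 400), (500, 400), (500, 620), (200, 620)], 0.90, 5.0)
print(distance_weighted(far), distance_weighted(near))  # 0.45 vs 0.81
```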
  • The data processing module 50 processes the recognition output data output from the image recognition module 5. As shown in FIGS. 4 and 6, the data processing module 50 of this embodiment includes a weed position information generation unit 51 and a statistical processing unit 52.
  • The weed position information generation unit 51 generates weed position information indicating the position of the recognition object on the map from the machine position at the time the photographed image was acquired and the recognition output data.
  • The position on the map of the weeds included in the recognition output data is obtained by converting the coordinate positions (camera coordinate positions) on the photographed image of the four corner points of the rectangle indicating the weeds into coordinates on the map.
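A minimal flat-ground sketch of this conversion, assuming a corner point has already been expressed in machine-body coordinates (forward, left) and is then rotated by the machine heading and translated by the machine position. A real system would also need the camera's intrinsic/extrinsic calibration, which is omitted here.

```python
import math

def body_to_map(point_body, machine_pos, heading_rad):
    """Convert (forward, left) [m] in body coordinates to map coordinates."""
    fwd, left = point_body
    mx, my = machine_pos
    return (mx + fwd * math.cos(heading_rad) - left * math.sin(heading_rad),
            my + fwd * math.sin(heading_rad) + left * math.cos(heading_rad))

# Four rectangle corners of a weed growth area, in body coordinates:
corners_body = [(6.0, -1.0), (6.0, 1.0), (9.0, 1.0), (9.0, -1.0)]
corners_map = [body_to_map(c, machine_pos=(120.0, 45.0), heading_rad=math.pi / 2)
               for c in corners_body]
print(corners_map)
```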
  • Since the photographing unit 70 acquires photographed images at predetermined time intervals, for example every 0.5 seconds, and inputs the image data to the image recognition module 5, the image recognition module 5 also outputs recognition output data at the same time intervals. Therefore, while weeds are within the photographing field of view of the photographing unit 70, a plurality of recognition output data include existence areas for the same weeds, and as a result a plurality of pieces of weed position information for the same weeds are obtained.
  • At that time, the estimation probabilities included in the respective source recognition output data, that is, the estimation probabilities of the weed existence area (weed growth area) included in each piece of weed position information, often have different values, because the positional relationship between the photographing unit 70 and the weeds differs from image to image.
  • Therefore, such multiple pieces of weed position information are stored, and the estimation probabilities included in the stored pieces of weed position information are computed statistically.
  • A representative value of the estimation probability group is determined using a statistical operation on the estimation probabilities of the plurality of pieces of recognition object position information.
  • Using the representative value, the plurality of pieces of recognition object position information can be corrected into a single optimal piece of recognition object position information.
  • One example of such a correction is to obtain the arithmetic mean, weighted mean, or median of the estimation probabilities as a reference value (representative value), compute the logical sum (union) of the existence areas (weed growth areas) whose estimation probability is equal to or higher than the reference value, and generate corrected weed position information that takes this union as the optimal existence area.
  • Of course, other statistical operations can also be used to generate a single reliable piece of weed position information.
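A sketch of the arithmetic-mean variant of the correction described above: the mean of the estimation probabilities serves as the reference value, and the map-coordinate boxes at or above it are unioned. Axis-aligned boxes are an assumption made to keep the union simple.

```python
from statistics import mean

def corrected_area(observations):
    """observations: list of ((x1, y1, x2, y2), probability) for the same weeds."""
    reference = mean(p for _, p in observations)
    kept = [box for box, p in observations if p >= reference]
    xs1, ys1, xs2, ys2 = zip(*kept)
    union = (min(xs1), min(ys1), max(xs2), max(ys2))  # logical sum of the areas
    return union, reference

obs = [((10.0, 5.0, 14.0, 8.0), 0.72),
       ((10.2, 5.1, 14.1, 8.3), 0.88),
       ((11.0, 5.4, 13.8, 7.9), 0.55)]
print(corrected_area(obs))  # reference ~0.717; union of the first two boxes
```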
  • The weed position information indicating the position of the weed growth area on the map obtained in this way is given to the weed position calculation unit 68.
  • The weed position calculation unit 68 is also given the machine position, i.e., the map coordinates of the machine body, calculated by the machine position calculation unit 66.
  • The weed position calculation unit 68 determines the passing timing at which the weeds reaped by the reaping unit 2 pass through the threshing depth adjusting mechanism 3, based on the weed position information, the machine position, and the conveyance speed of the culm conveying device 23. While the weeds pass through the threshing depth adjusting mechanism 3, the weed position calculation unit 68 gives a weed entry flag to the threshing depth control unit 620.
  • The threshing depth control unit 620 has a standard control mode and a weed entry control mode. Normally, the standard control mode is selected, and the above-described threshing depth adjustment control is executed during work travel.
  • When the weed entry flag is given, the standard control mode is switched to the weed entry control mode, and the weed entry control is executed.
  • In the weed entry control of this embodiment, the threshing depth adjustment control is interrupted and the vehicle speed is also reduced.
  • A configuration in which only one of the interruption of the threshing depth adjustment control and the reduction of the vehicle speed is performed may also be adopted.
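Putting the two modes together, the mode switching can be pictured as a small state machine. This is a toy sketch; the actual control unit is an ECU, and the class and method names here are invented for clarity.

```python
class ThreshingDepthController:
    def __init__(self):
        self.mode = "standard"

    def set_weed_entry_flag(self, flag: bool):
        """Switch between the standard and weed entry control modes."""
        self.mode = "weed_entry" if flag else "standard"

    def step(self, switch_tip_on: bool, switch_base_on: bool) -> dict:
        if self.mode == "weed_entry":
            # Interrupt depth adjustment and request a lower vehicle speed
            # (either action alone is also a valid configuration).
            return {"depth_motor": "hold", "vehicle_speed": "reduce"}
        if switch_tip_on and switch_base_on:
            return {"depth_motor": "move shallow", "vehicle_speed": "keep"}
        if not switch_tip_on and not switch_base_on:
            return {"depth_motor": "move deep", "vehicle_speed": "keep"}
        return {"depth_motor": "hold", "vehicle_speed": "keep"}

ctrl = ThreshingDepthController()
print(ctrl.step(True, True))   # standard mode: adjust depth
ctrl.set_weed_entry_flag(True)
print(ctrl.step(True, True))   # weed entry: depth control interrupted
```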
  • The weed position information generated by the weed position information generation unit 51 can be mapped for easy visual display, as shown in FIG. 7.
  • FIG. 7 illustrates a weed map in which weed position information is mapped.
  • When weed growth areas having different estimation probabilities are included in the weed position information, the weed growth areas can be displayed divided into patterns by predetermined ranges of estimation probability values, as shown in FIG. 7.
  • Embodiment 2 In the combine according to the second embodiment, a reaping unit 2002 is connected to the front of a machine body 2001 provided with a pair of left and right crawler traveling devices 2010, so as to be movable up and down about a horizontal axis X.
  • A threshing device 2011 and a grain tank 2012 for storing grain are provided at the rear of the machine body 2001, aligned in the machine body width direction.
  • A cabin 2014 that accommodates the boarding driver is provided at the front right side of the machine body 2001, and a driving engine 2015 is provided below the cabin 2014.
  • The threshing device 2011 internally receives the harvested grain culms that are reaped by the reaping unit 2002 and conveyed rearward, and threshes the ear-tip side with a threshing cylinder 2113 while the stalk-base side of the culms is clamped and conveyed by a threshing feed chain 2111 and a clamping rail 2112. Grain sorting is then performed on the threshed product in a sorting unit provided below the threshing cylinder 2113, and the grain sorted there is conveyed to the grain tank 2012 and stored.
  • A grain discharging device 2013 for discharging the grain stored in the grain tank 2012 to the outside is provided.
  • The reaping unit 2002 is equipped with a plurality of raising devices 2021 for raising lodged grain culms, a clipper-type cutting device 2022 for cutting the stalk bases of the raised culms, a culm conveying device 2023, and the like.
  • The culm conveying device 2023 conveys the harvested culms toward the starting end of the threshing feed chain 2111 of the threshing device 2011, located at the rear of the machine body, while gradually changing them from the upright posture, in which the stalk bases have just been cut, to a lying posture.
  • The culm conveying device 2023 has a merging conveyance unit 2231 that conveys the plurality of harvested culms cut by the cutting device 2022 while gathering them toward the center in the cutting width direction, a stalk-base clamping conveyance device 2232 that clamps the stalk bases of the gathered culms and conveys them rearward, an ear-tip engaging conveyance device 2233 that engages and conveys the ear-tip side of the harvested culms, a supply conveyance device 2234 that guides the stalk bases of the harvested culms from the terminal end of the stalk-base clamping conveyance device 2232 toward the threshing feed chain 2111, and the like.
  • A photographing unit 2070 provided with a color camera is provided at the front end of the ceiling of the cabin 2014.
  • The front-rear extent of the photographing field of view of the photographing unit 2070 reaches approximately from the front end region of the reaping unit 2002 to the horizon.
  • The width of the photographing field of view reaches about 10 m to several tens of meters.
  • The photographed images acquired by the photographing unit 2070 are converted into image data and sent to the control system of the combine.
  • The photographing unit 2070 photographs the field during harvesting work, and various objects are present in the field as photographic subjects.
  • The control system of the combine has a function of recognizing specific objects as recognition objects from the image data sent from the photographing unit 2070.
  • As recognition objects in FIG. 8, a group of normally planted grain culms indicated by symbol Z0, a group of weeds grown taller than the planted culms indicated by symbol Z1, a group of lodged culms indicated by symbol Z2, and a person indicated by symbol Z3 are schematically shown.
  • A satellite positioning module 2080 is also provided on the ceiling of the cabin 2014.
  • The satellite positioning module 2080 includes a satellite antenna for receiving GNSS (global navigation satellite system) signals, including GPS signals.
  • An inertial navigation unit incorporating a gyro/acceleration sensor and a magnetic direction sensor is built into the satellite positioning module 2080.
  • In the drawing, the satellite positioning module 2080 is disposed at the rear of the ceiling of the cabin 2014 for convenience of illustration; however, it is preferably disposed as close as possible to a position directly above the left-right center of the front end of the cutting device 2022, toward the center of the machine body.
  • The control system of this embodiment also comprises a large number of electronic control units called ECUs, various operation devices, sensors, and switches, and a wiring network, such as an in-vehicle LAN, for data transmission between them.
  • The notification device 2091 is a device for notifying the driver and others of the work traveling state and various warnings, and is a buzzer, a lamp, a speaker, a display, or the like.
  • The communication unit 2092 is used to exchange data with a cloud computer system 2100 installed at a remote location or with a portable communication terminal 2200.
  • Here, the portable communication terminal 2200 is a tablet computer operated by a supervisor (including the driver) at the work travel site.
  • The control unit 2006 is the core element of this control system and is shown as a collection of a plurality of ECUs.
  • The positioning data from the satellite positioning module 2080 and the image data from the photographing unit 2070 are input to the control unit 2006 through the wiring network.
  • The control unit 2006 includes an output processing unit 2006B and an input processing unit 2006A as input/output interfaces.
  • The output processing unit 2006B is connected to a vehicle traveling device group 2007A and a working device group 2007B.
  • The vehicle traveling device group 2007A includes control devices related to vehicle traveling, such as an engine control device, a transmission control device, a braking control device, and a steering control device.
  • The working device group 2007B includes power control devices and the like for the reaping unit 2002, the threshing device 2011, the grain discharging device 2013, and the culm conveying device 2023.
  • A traveling system detection sensor group 2008A, a working system detection sensor group 2008B, and the like are connected to the input processing unit 2006A.
  • The traveling system detection sensor group 2008A includes sensors that detect the states of an engine speed adjuster, an accelerator pedal, a brake pedal, a gearshift operator, and the like.
  • The working system detection sensor group 2008B includes sensors that detect the device states of the reaping unit 2002, the threshing device 2011, the grain discharging device 2013, and the culm conveying device 2023, as well as the states of the culms and grain.
  • The control unit 2006 includes a work traveling control module 2060, an image recognition module 2005, a data processing module 2050, a machine position calculation unit 2066, and a notification unit 2067.
  • The notification unit 2067 generates notification data based on commands and the like received from the functional units of the control unit 2006, and gives the notification data to the notification device 2091.
  • The machine position calculation unit 2066 calculates the machine position, i.e., the map coordinates (or field coordinates) of the machine body 2001, based on the positioning data sequentially sent from the satellite positioning module 2080.
  • The combine according to this embodiment can also travel both by automatic travel (automatic steering) and by manual travel (manual steering).
  • The work traveling control module 2060 includes an automatic work traveling instruction unit 2063 and a travel route setting unit 2064, in addition to a traveling control unit 2061 and a work control unit 2062.
  • A traveling mode switch (not shown) for selecting either the automatic traveling mode, in which the vehicle travels by automatic steering, or the manual steering mode, in which it travels by manual steering, is provided in the cabin 2014. By operating the traveling mode switch, it is possible to shift from manual steering travel to automatic steering travel, or from automatic steering travel to manual steering travel.
  • The traveling control unit 2061 has an engine control function, a steering control function, a vehicle speed control function, and the like, and gives traveling control signals to the vehicle traveling device group 2007A.
  • The work control unit 2062 gives work control signals to the working device group 2007B in order to control the movements of the reaping unit 2002, the threshing device 2011, the grain discharging device 2013, the culm conveying device 2023, and the like.
  • When the manual steering mode is selected, the traveling control unit 2061 generates control signals based on the driver's operations and controls the vehicle traveling device group 2007A.
  • When the automatic traveling mode is selected, the traveling control unit 2061 controls the devices of the vehicle traveling device group 2007A related to steering and to vehicle speed, based on the automatic traveling instructions given by the automatic work traveling instruction unit 2063.
  • The travel route setting unit 2064 loads into memory a travel route for automatic travel created by the control unit 2006, the portable communication terminal 2200, the cloud computer system 2100, or the like.
  • The travel route loaded into memory is used sequentially as the target travel route in automatic travel. Even during manual travel, this travel route can be used as guidance so that the combine travels along it.
  • The automatic work traveling instruction unit 2063 generates automatic steering instructions and vehicle speed instructions and gives them to the traveling control unit 2061.
  • The automatic steering instruction is generated so as to eliminate the azimuth deviation and the positional deviation between the travel route set by the travel route setting unit 2064 and the machine position calculated by the machine position calculation unit 2066.
  • The vehicle speed instruction is generated based on a previously set vehicle speed value.
  • The automatic work traveling instruction unit 2063 gives the work control unit 2062 working device operation instructions according to the machine position and the traveling state of the vehicle.
  • Image data of the photographed images sequentially acquired by the photographing unit 2070 is sequentially input to the image recognition module 2005.
  • The image recognition module 2005 estimates the existence area in the photographed image where a recognition object exists, and outputs, as the recognition result, recognition output data including the existence area and the estimation probability at the time the existence area was estimated.
  • The image recognition module 2005 is constructed using neural network technology employing deep learning.
  • The flow of generation of recognition output data by the image recognition module 2005 is shown in FIG. 10 and FIG. 11.
  • Pixel values of RGB image data are input to the image recognition module 2005 as input values.
  • the authentication objects to be estimated are a weed and a fallen cereal weir and a person. Therefore, in the recognition output data as the recognition result, the present region of weeds (hereinafter referred to as weed region) and its estimated probability, the present region of the fallen kernel (hereinafter referred to as the fallen cereal region) and its estimated probability, It includes the existence area of a person (hereinafter referred to as a person area) and its estimated probability.
  • In the schematically shown estimation result, the weed area is indicated by a rectangular frame labeled F1, the lodged culm area by a rectangular frame labeled F2, and the person area by a rectangular frame labeled F3.
  • Each region is linked to its estimated probability.
  • The weed area, the lodged culm area, and the person area are each defined by four corner points, and the coordinate positions of these four corner points on the photographed image are also included in the estimation result.
  • When no recognition target is estimated, no existence area is output and the estimated probability is zero.
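  • A minimal sketch of how such recognition output data could be represented; the class and field names are hypothetical and not taken from the disclosure. An empty region list corresponds to the case where no recognition target is estimated and the estimated probability is treated as zero.

    from dataclasses import dataclass, field
    from typing import List, Tuple

    @dataclass
    class RecognitionRegion:
        label: str                          # "weed", "lodged_culm", or "person"
        corners: List[Tuple[float, float]]  # four corner points in image coordinates
        probability: float                  # estimated probability of the region

    @dataclass
    class RecognitionOutput:
        timestamp: float                            # acquisition time of the image
        regions: List[RecognitionRegion] = field(default_factory=list)

    # One frame with a single weed area, analogous to the frame labeled F1
    frame = RecognitionOutput(timestamp=0.0, regions=[
        RecognitionRegion("weed", [(120, 80), (300, 80), (300, 210), (120, 210)], 0.75),
    ])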
  • The image recognition module 2005 sets its internal parameters so that the estimated probability of a recognition target decreases the farther the target is located from the imaging unit 2070 in the captured image.
  • In this way, recognition of targets in image regions whose resolution is low because they are far from the imaging unit 2070 is judged more strictly, and erroneous recognition is reduced.
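  • The sketch below shows one way such a distance-dependent reduction could be applied, assuming that more distant parts of the field appear nearer the top of the frame; the attenuation curve and its parameters are illustrative assumptions, not details from the disclosure.

    def attenuate_probability(raw_probability, region_center_row, image_height,
                              min_factor=0.6):
        """Reduce the estimated probability of a region the farther it lies from
        the imaging unit; assumes the far side of the field is near row 0."""
        # 0.0 at the nearest row (bottom of the frame), 1.0 at the farthest (top)
        farness = 1.0 - region_center_row / float(image_height)
        # Linear attenuation from 1.0 (near) down to min_factor (far)
        factor = 1.0 - (1.0 - min_factor) * farness
        return raw_probability * factor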
  • The data processing module 2050 processes the recognition output data output from the image recognition module 2005. As shown in FIGS. 9 and 11, the data processing module 2050 of this embodiment includes a recognition target position information generation unit 2051, a statistical processing unit 2052, a data storage unit 2053, a data determination unit 2054, and a data correction unit 2055.
  • the recognition target position information generation unit 2051 generates recognition target position information indicating the position on the map of the recognition target from the machine position at the time when the captured image is acquired and the recognition output data.
  • The position on the map of each recognition target (weed, lodged grain culm, person) included in the recognition output data is obtained by converting the coordinate positions on the photographed image (camera coordinate positions) of the four corner points of the rectangle indicating the target's existence area (weed area, lodged culm area, person area) into coordinates on the map, as sketched below.
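  • A minimal sketch of this conversion from camera coordinates to map coordinates, assuming a ground-plane homography obtained from a one-time camera calibration together with a known machine position and heading; the calibration method and all names here are assumptions, not part of the disclosure.

    import numpy as np

    def image_corners_to_map(corners_px, homography, machine_pos, machine_heading):
        """Project the four corner points of an existence area from image
        (camera) coordinates into map coordinates.
        homography: 3x3 matrix mapping image pixels onto the ground plane in
        machine-relative metres; machine_pos: np.ndarray of shape (2,)."""
        c, s = np.cos(machine_heading), np.sin(machine_heading)
        rot = np.array([[c, -s], [s, c]])  # machine frame -> map frame
        map_points = []
        for (u, v) in corners_px:
            gx, gy, gw = homography @ np.array([u, v, 1.0])
            ground = np.array([gx / gw, gy / gw])      # machine-relative (m)
            map_points.append(machine_pos + rot @ ground)
        return map_points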
  • The imaging unit 2070 acquires captured images at predetermined time intervals, for example 0.5 seconds, and inputs the image data to the image recognition module 2005, which therefore also outputs recognition output data at the same intervals. Consequently, while a recognition target remains within the imaging field of view of the imaging unit 2070, multiple pieces of recognition output data include an existence area for the same target, and multiple pieces of recognition target position information are obtained for that target. The estimated probabilities contained in the underlying recognition output data, that is, the estimated probabilities of the existence areas included in the pieces of recognition target position information, often take different values because the positional relationship between the imaging unit 2070 and the target differs from image to image.
  • Such multiple pieces of recognition target position information are stored, and the estimated probabilities included in the stored pieces are processed by statistical operation.
  • The statistical processing unit 2052 obtains a representative value of the group of estimated probabilities by applying a statistical operation to the estimated probabilities of the multiple pieces of recognition target position information.
  • Using the representative value, the multiple pieces of recognition target position information can be corrected into one optimal piece of recognition target position information (corrected recognition target position information).
  • Of course, it is also possible to generate a single piece of highly reliable recognition target position information using statistical operations other than this. In other words, the multiple pieces of recognition target position information are corrected based on the result of a statistical operation on the estimated probabilities included in the corresponding recognition output data, for example as sketched below.
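  • The sketch below illustrates one such statistical correction, collapsing several stored position records for the same target into a single corrected record. The use of a probability-weighted mean for the region centre and all field names are illustrative assumptions; the text itself only requires some statistical operation on the estimated probabilities.

    from statistics import mean

    def correct_position_info(observations):
        """Merge several position-information records for one recognition target.
        Each record is assumed to hold a map-coordinate 'center' and the
        'probability' taken from its originating recognition output data."""
        probs = [o["probability"] for o in observations]
        representative = mean(probs)  # a median or weighted mean would also do
        total = sum(probs)
        # Probability-weighted average of the region centres on the map
        cx = sum(o["probability"] * o["center"][0] for o in observations) / total
        cy = sum(o["probability"] * o["center"][1] for o in observations) / total
        return {"center": (cx, cy), "probability": representative}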
  • Based on the recognition target position information thus obtained (weed position information, lodged culm position information, person position information), which indicates the position on the map of each target's existence area (weed area, lodged culm area, person area), preset travel and work control and warning notifications are carried out when weeds, lodged grain culms, or persons are recognized.
  • The estimated probability output by the image recognition module 2005 is important for generating the final recognition target position information. Therefore, in this embodiment, the data processing module 2050 includes a recognition output data evaluation function that checks the reliability of the estimated probabilities of the recognition output data output by the image recognition module 2005 and judges recognition output data with unreliable estimated probabilities to be inappropriate recognition output data.
  • the recognition output data evaluation function is realized by a data storage unit 2053, a data determination unit 2054, and a data correction unit 2055, as shown in FIGS.
  • The data storage unit 2053 temporarily stores, in time order, the recognition output data sequentially output from the image recognition module 2005 as a recognition output data sequence.
  • The data determination unit 2054 compares the estimated probability of the recognition output data under evaluation with the estimated probabilities of the temporally preceding and following recognition output data in the sequence, and judges recognition output data whose estimated probability is lower by a predetermined level or more to be inappropriate recognition output data.
  • Since the photographed images used to generate the recognition output data are taken while the combine is traveling and working, a photographed image may become unclear due to sudden movement of the combine and the like. Furthermore, because the combine changes direction near the field ridges, the shooting direction may change abruptly, so that shots of the same recognition target taken in front lighting and in backlighting may be mixed. Such fluctuations in imaging conditions can produce recognition output data with unexpectedly low estimated probabilities, and the recognition output data evaluation function described above makes it possible to extract such inappropriate recognition output data.
  • The estimated probability of recognition output data judged to be inappropriate can be corrected by interpolating the estimated probabilities of the preceding and following recognition output data.
  • As a result, the inappropriate recognition output data is suitably corrected and remains usable, and the number of data items is secured.
  • the data correction unit 2055 performs such interpolation correction.
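  • A minimal sketch of this determination-and-interpolation behaviour, assuming each estimated probability is compared against its temporal neighbours and that the arithmetic mean is used for interpolation; the threshold value is an assumption. The example values match the 0.75 / 0.8 / 0.2 / 0.9 sequence used later in this description.

    def evaluate_sequence(probabilities, drop_level=0.5):
        """Flag entries whose estimated probability falls a predetermined level
        below both neighbours in the sequence, then repair them by
        interpolating the neighbouring probabilities."""
        corrected = list(probabilities)
        for i in range(1, len(probabilities) - 1):
            prev_p, cur_p, next_p = probabilities[i - 1], probabilities[i], probabilities[i + 1]
            if prev_p - cur_p >= drop_level and next_p - cur_p >= drop_level:
                # Inappropriate recognition output data: interpolate and correct
                corrected[i] = (prev_p + next_p) / 2.0
        return corrected

    print(evaluate_sequence([0.75, 0.8, 0.2, 0.9]))  # -> [0.75, 0.8, 0.85, 0.9]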
  • If the recognition output data judged to be inappropriate is instead discarded, interpolation correction becomes unnecessary, and the data correction unit 2055 may be omitted.
  • each recognition target object position information generated by the recognition target object position information generation unit 2051 can be mapped as shown in FIG. 12 for easy visual understanding.
  • FIG. 12 illustrates a weed map in which weed position information is mapped.
  • If the weed position information includes weed existence areas with different estimated probabilities, the weed existence areas can be displayed divided into predetermined ranges of estimated probability values, as shown in FIG. 12.
  • a reaper 3002 is connected to a front portion of an airframe 3001 including a pair of left and right crawler traveling devices 3010 so as to be movable up and down around a horizontal axis X.
  • a threshing device 3011 and a grain tank 3012 for storing grains are provided at the rear of the machine body 3001 in a state of being aligned in the machine body width direction.
  • a cabin 3014 is provided at the front right side of the fuselage 3001 to cover the boarding operation unit, and a driving engine 3015 is provided below the cabin 3014.
  • The threshing device 3011 receives the harvested grain culms, which are cut by the reaper 3002 and conveyed rearward, into its interior; while the stalk bases of the culms are clamped and conveyed by the threshing feed chain 3111 and the clamping rail 3112, the tip side is threshed by the threshing cylinder 3113. Grain sorting is then performed on the threshed material in the sorting unit provided below the threshing cylinder 3113, and the sorted grain is conveyed to the grain tank 3012 and stored there. Although not described in detail, a grain discharging device 3013 for discharging the grain stored in the grain tank 3012 to the outside is also provided.
  • The reaper 3002 is provided with a plurality of raising devices 3021 for raising lodged grain culms, a clipper-type cutting device 3022 for cutting the stalk bases of the raised culms, a grain culm conveying device 3023, and the like.
  • The grain culm conveying device 3023 conveys the harvested culms, whose stalk bases have been cut, toward the starting end of the threshing feed chain 3111 of the threshing device 3011 at the rear of the machine body, while gradually changing them from an upright posture to a lying posture.
  • The grain culm conveying device 3023 includes a merging conveying unit 3231 that gathers the plurality of harvested culms cut by the cutting device 3022 toward the center in the cutting width direction, a stalk-base clamping and conveying device 3232 that clamps and conveys the stalk bases of the gathered culms, and a feeding and conveying device 3234 and the like that guides the culms from the terminal end of the clamping and conveying device 3232 toward the threshing feed chain 3111.
  • A photographing unit 3070 equipped with a color camera is provided.
  • The field of view of the photographing unit 3070 extends in the front-rear direction from the front end region of the reaper 3002 substantially to the horizon.
  • The width of the field of view reaches about 10 m to several tens of meters.
  • the photographed images sequentially acquired by the photographing unit 3070 are converted into image data and sent to the control system of the combine.
  • The photographing unit 3070 photographs the field during harvesting work, and various objects exist in the field as photographic subjects.
  • the control system of the combine has a function of recognizing a specific object as a recognition target object from the image data sent from the imaging unit 3070.
  • As recognition targets, FIG. 13 schematically shows a normal planted crop group indicated by symbol Z0, a weed group grown taller than the planted culms indicated by symbol Z1, a lodged culm group indicated by symbol Z2, and a person indicated by symbol Z3.
  • a satellite positioning module 3080 is also provided on the ceiling of the cabin 3014.
  • the satellite positioning module 3080 includes a satellite antenna for receiving global navigation satellite system (GNSS) signals (including GPS signals).
  • An inertial navigation unit including a gyro acceleration sensor and a magnetic direction sensor is also built into the satellite positioning module 3080.
  • In the drawing, the satellite positioning module 3080 is placed at the rear of the ceiling of the cabin 3014 for convenience of illustration; in practice, it is preferably disposed at the front end of the machine body near its lateral center, for example as close as possible to a position directly above the left-right center of the cutting device 3022.
  • FIG. 14 shows a functional block diagram of the control system of the combine.
  • The control system of this embodiment comprises a large number of electronic control units called ECUs, various operating devices, sensor groups, switches, and a wiring network such as an in-vehicle LAN for data transmission among them.
  • the notification device 3091 is a device for notifying a driver or the like of a work traveling state and various warnings, and is a buzzer, a lamp, a speaker, a display or the like.
  • the communication unit 3092 is used to exchange data with the cloud computer system 3100 or the portable communication terminal 3200 installed at a remote location.
  • the mobile communication terminal 3200 is a tablet computer operated by a supervisor (including a driver) at the work travel site.
  • the control unit 3006 is a core element of this control system, and is shown as a collection of a plurality of ECUs.
  • the positioning data from the satellite positioning module 3080 and the image data from the imaging unit 3070 are input to the control unit 3006 through the wired network.
  • the control unit 3006 includes an output processing unit 3006B and an input processing unit 3006A as an input / output interface.
  • The output processing unit 3006B is connected to the vehicle traveling device group 3007A and the work device group 3007B.
  • the vehicle travel device group 3007A includes control devices related to vehicle travel, such as an engine control device, a shift control device, a braking control device, and a steering control device.
  • The work device group 3007B includes control devices, such as power control devices, for the reaper 3002, the threshing device 3011, the grain discharging device 3013, and the grain culm conveying device 3023.
  • a traveling system detection sensor group 3008A, a working system detection sensor group 3008B, and the like are connected to the input processing unit 3006A.
  • the traveling system detection sensor group 3008A includes a sensor that detects the state of an engine speed adjuster, an accelerator pedal, a brake pedal, a gearshift operator, and the like.
  • The work system detection sensor group 3008B includes sensors that detect the device states of the reaper 3002, the threshing device 3011, the grain discharging device 3013, and the grain culm conveying device 3023, as well as the states of the grain culms and the grain.
  • the control unit 3006 includes a work travel control module 3060, an image recognition module 3005, a data processing module 3050, a machine position calculation unit 3066, a notification unit 3067, and a travel locus calculation unit 3068.
  • the notification unit 3067 generates notification data based on an instruction or the like received from each functional unit of the control unit 3006, and gives the notification data to the notification device 3091.
  • The machine position calculation unit 3066 calculates the machine position, which is the map coordinates (or field coordinates) of the machine body 3001, based on the positioning data sequentially sent from the satellite positioning module 3080.
  • The travel locus calculation unit 3068 calculates the travel locus of the machine body 3001 from the machine positions successively calculated by the machine position calculation unit 3066. Furthermore, the travel locus calculation unit 3068 can extend the travel locus with a predicted travel locus immediately ahead of the machine body 3001, based on the latest machine position and the steering angle of the steering control device at that time or the target travel route.
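  • The sketch below illustrates how such a predicted travel locus could be appended from the latest machine positions and the current steering angle, using a bicycle-model approximation; the wheelbase, step length, and prediction horizon are illustrative assumptions, not values from the disclosure.

    import math

    def predicted_trajectory(positions, steering_angle, wheelbase=2.0,
                             step=0.5, n_steps=5):
        """Extend a travel locus with a short predicted segment ahead of the
        machine. positions: list of (x, y) machine positions, newest last;
        steering_angle: current steering angle in radians."""
        (x0, y0), (x1, y1) = positions[-2], positions[-1]
        heading = math.atan2(y1 - y0, x1 - x0)  # heading implied by the locus
        x, y = x1, y1
        future = []
        for _ in range(n_steps):
            # Bicycle model: heading change per metre = tan(delta) / wheelbase
            heading += step * math.tan(steering_angle) / wheelbase
            x += step * math.cos(heading)
            y += step * math.sin(heading)
            future.append((x, y))
        return future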
  • the combine according to this embodiment can travel in both automatic travel (automatic steering) and manual travel (manual steering).
  • the work travel control module 3060 is provided with an automatic work travel instruction unit 3063 and a travel route setting unit 3064 in addition to the travel control unit 3061 and the work control unit 3062.
  • a travel mode switch (not shown) is provided in the cabin 3014 to select one of an automatic travel mode for traveling by automatic steering and a manual steering mode for traveling by manual steering. By operating the travel mode switch, it is possible to shift from manual steering travel to automatic steering travel, or shift from automatic steering travel to manual steering travel.
  • the traveling control unit 3061 has an engine control function, a steering control function, a vehicle speed control function, and the like, and provides a traveling control signal to the vehicle traveling device group 3007A.
  • the work control unit 3062 gives a work control signal to the work device group 3007B to control the movement of the reaper 3002, the threshing device 3011, the grain discharging device 3013, the grain feeding device 3023, and the like.
  • When the manual steering mode is selected, the traveling control unit 3061 generates control signals based on the driver's operations and controls the vehicle traveling device group 3007A. When the automatic steering mode is selected, the traveling control unit 3061 controls the steering-related and vehicle-speed-related devices of the vehicle traveling device group 3007A based on the automatic travel instructions given by the automatic work travel instruction unit 3063.
  • The travel route setting unit 3064 loads into memory a travel route for automatic travel created by the control unit 3006, the mobile communication terminal 3200, the cloud computer system 3100, or the like.
  • The travel route loaded into memory is used sequentially as the target travel route during automatic travel. Even during manual travel, this route can serve as guidance for steering the combine along it.
  • The automatic work travel instruction unit 3063 generates an automatic steering instruction and a vehicle speed instruction and supplies them to the travel control unit 3061.
  • the automatic steering command is generated so as to eliminate the azimuth deviation and the positional deviation between the traveling route set by the traveling route setting unit 3064 and the vehicle position calculated by the machine position calculation unit 3066.
  • the vehicle speed command is generated based on a previously set vehicle speed value.
  • the automatic work travel instruction unit 3063 gives the work control unit 3062 a work device operation instruction according to the vehicle position and the traveling state of the vehicle.
  • The image recognition module 3005 sequentially receives image data of the captured images acquired by the photographing unit 3070.
  • The image recognition module 3005 estimates the existence areas in the photographed image where recognition targets are present, and outputs, as a recognition result, recognition output data including each existence area and the estimated probability attached to its estimation.
  • the image recognition module 3005 is constructed using neural network technology that employs deep learning.
  • the flow of generation of recognition output data by the image recognition module 3005 is shown in FIG. 15 and FIG. Pixel values of RGB image data are input to the image recognition module 3005 as input values.
  • The recognition targets to be estimated are weeds, lodged grain culms, and persons. Accordingly, the recognition output data produced as the recognition result includes the existence area of weeds (hereinafter, the weed area) and its estimated probability, the existence area of lodged grain culms (hereinafter, the lodged culm area) and its estimated probability, and the existence area of a person (hereinafter, the person area) and its estimated probability.
  • In the schematically shown estimation result, the weed area is indicated by a rectangular frame labeled F1, the lodged culm area by a rectangular frame labeled F2, and the person area by a rectangular frame labeled F3.
  • Each region is linked to its estimated probability.
  • The weed area, the lodged culm area, and the person area are each defined by four corner points, and the coordinate positions of these four corner points on the photographed image are also included in the estimation result.
  • When no recognition target is estimated, no existence area is output and the estimated probability is zero.
  • the presence area of the recognition target and the estimation probability output from the image recognition module 3005 as recognition output data are important for the generation of final recognition target position information.
  • However, a photographed image may be inappropriate due to sudden shaking of the photographing unit 3070, momentary backlighting, dust crossing the photographing field of view of the photographing unit 3070, and the like.
  • In such cases, recognition output data with an unexpectedly low estimated probability is output.
  • Alternatively, the recognition target cannot be estimated at all, and recognition output data with an estimated probability of zero is output.
  • For this reason, the data processing module 3050, which processes the recognition output data output from the image recognition module 3005, is provided, as a pre-processing function, with a recognition output data evaluation function that checks the reliability of the recognition output data output from the image recognition module 3005 and judges unreliable recognition output data to be inappropriate recognition output data.
  • This recognition output data evaluation function is realized by the data storage unit 3053, the data determination unit 3054, and the data correction unit 3055 of the data processing module 3050, as shown in FIG.
  • The data storage unit 3053 temporarily stores, in time order, the recognition output data sequentially output from the image recognition module 3005 as a recognition output data sequence.
  • Based on the travel locus calculated by the travel locus calculation unit 3068, the data determination unit 3054 predicts the range (hereinafter, the prediction range) within which an existence area estimated by the image recognition module 3005 in the previous captured image should be located in the next captured image. The data determination unit 3054 then compares the estimated probability of the existence area that overlaps the prediction range in the next captured image with the estimated probability of the existence area in the previous captured image.
  • If the two estimated probabilities differ by a predetermined allowable amount or more, the recognition output data based on the next captured image is judged to be inappropriate recognition output data.
  • Recognition output data whose estimated probability is equal to or less than a predetermined value (for example, 0.5) is likewise regarded as having low reliability.
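  • A minimal sketch of this check: the previous existence area is shifted by the machine displacement derived from the travel locus to form the prediction range, the overlapping area in the next image is found, and its data is flagged when the estimated probabilities differ by the allowed amount or more. Representing areas as axis-aligned boxes on a common plane and all helper names are assumptions; a full implementation would project through the camera model.

    def shift_region(region, displacement):
        """Shift an existence area (xmin, ymin, xmax, ymax) by the machine
        displacement (dx, dy) to obtain the prediction range."""
        xmin, ymin, xmax, ymax = region
        dx, dy = displacement
        return (xmin + dx, ymin + dy, xmax + dx, ymax + dy)

    def overlaps(a, b):
        """Axis-aligned overlap test between an existence area and the range."""
        return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

    def is_inappropriate(prev_prob, next_prob, allowed_change=0.5):
        """Flag the next frame's recognition output data when its estimated
        probability differs from the previous frame's by the allowed rate
        of change or more (50% here, matching the example in the text)."""
        return abs(next_prob - prev_prob) / prev_prob >= allowed_change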
  • In the following example, the recognition target to be estimated is a weed.
  • In step 01, the image recognition module 3005 recognizes the weed area surrounded by the rectangular frame labeled F1 from the captured image acquired by the photographing unit 3070 (#01a), and its estimated probability is 0.75 (#01b). An estimated probability exceeding 0.7 is regarded as high confidence.
  • Next, the prediction range PA in which the weed area should be located in the next captured image is calculated (#01c).
  • In step 02, the image recognition module 3005 recognizes the weed area surrounded by the rectangular frame labeled F1 from the next captured image acquired by the photographing unit 3070 (#02a), and its estimated probability is 0.8 (#02b).
  • The recognized weed area overlaps the prediction range PA.
  • Again, the prediction range PA in which the weed area should be located in the next captured image is calculated (#02c).
  • In step 03, based on the next captured image (#03a), the image recognition module 3005 again estimates an existence area where weeds are present, but the estimated probability is 0.2 (#03b). As noted above, when the estimated probability is below 0.5, the reliability of the position of the rectangular frame indicating the weed area is also low. Therefore, when calculating the prediction range PA here, it is preferable to use the position of the rectangular frame from step 02 together with the travel locus, rather than the rectangular frame of the low-reliability recognition output data (#03c).
  • In step 04, based on the next captured image (#04a), the image recognition module 3005 again estimates an existence area where weeds are present; the weed area surrounded by the rectangular frame labeled F1 is recognized with an estimated probability of 0.9 (#04b). The recognized weed area overlaps the prediction range PA.
  • The data determination unit 3054 detects that the estimated probability dropped in step 03 from the previous value of 0.8 to 0.2. Because this decrease exceeds a predetermined allowable amount (for example, a rate of change of 50% or more), the recognition output data of step 03 is judged to be inappropriate recognition output data.
  • In this embodiment, the data correction unit 3055 calculates an interpolated estimated probability for the inappropriate recognition output data using the estimated probabilities of the proper recognition output data before and after it. Using the arithmetic mean as the simplest interpolation operation, an interpolated estimated probability of 0.85 is obtained here. By replacing the estimated probability of the inappropriate recognition output data with this interpolated value, the inappropriate recognition output data can be used as proper recognition output data in subsequent processing.
  • In this way, the estimated probability of recognition output data judged to be inappropriate can be interpolated and corrected using the estimated probabilities of the preceding and following recognition output data.
  • As a result, the inappropriate recognition output data is suitably corrected and remains usable, and the number of data items is secured.
  • If the recognition output data judged to be inappropriate is instead discarded, interpolation correction becomes unnecessary, and the data correction unit 3055 may be omitted.
  • the data processing module 3050 of this embodiment further has a function of generating recognition target object position information from the recognition output data evaluated by the recognition output data evaluation function described above. This function is realized by the recognition target object position information generation unit 3051 and the statistical processing unit 3052 of the data processing module 3050.
  • the recognition target object position information is information indicating the position of the recognition target on the map.
  • the recognition target object position information generation unit 3051 generates recognition target object position information from the machine position at the time when the captured image is acquired and the recognition output data.
  • The position on the map of each recognition target (weed, lodged grain culm, person) included in the recognition output data is obtained by converting the coordinate positions on the photographed image (camera coordinate positions) of the four corner points of the rectangle indicating the target's existence area (weed area, lodged culm area, person area) into coordinates on the map.
  • The imaging unit 3070 acquires captured images at predetermined time intervals, for example 0.5 seconds, and inputs the image data to the image recognition module 3005, which therefore also outputs recognition output data at the same intervals. Consequently, while a recognition target remains within the imaging field of view of the imaging unit 3070, multiple pieces of recognition output data include an existence area for the same target, and multiple pieces of recognition target position information are obtained for that target. The estimated probabilities contained in the underlying recognition output data, that is, the estimated probabilities of the existence areas included in the pieces of recognition target position information, often take different values because the positional relationship between the imaging unit 3070 and the target differs from image to image.
  • Such multiple pieces of recognition target position information are stored, and the estimated probabilities included in the stored pieces are processed by statistical operation.
  • The statistical processing unit 3052 obtains a representative value of the group of estimated probabilities by applying a statistical operation to the estimated probabilities of the multiple pieces of recognition target position information.
  • Using the representative value, the multiple pieces of recognition target position information can be corrected into one optimal piece of recognition target position information (corrected recognition target position information).
  • Of course, it is also possible to generate a single piece of highly reliable recognition target position information using statistical operations other than this. In other words, the multiple pieces of recognition target position information are corrected based on the result of a statistical operation on the estimated probabilities included in the corresponding recognition output data.
  • Based on the recognition target position information thus obtained (weed position information, lodged culm position information, person position information), which indicates the position on the map of each target's existence area (weed area, lodged culm area, person area), preset travel and work control and warning notifications are carried out when weeds, lodged grain culms, or persons are recognized.
  • each recognition target object position information generated by the recognition target object position information generation unit 3051 can be mapped as shown in FIG. 18 for display that is easy to understand visually.
  • FIG. 18 exemplifies a weed map in which weed position information is mapped. If the weed position information includes weed existence areas with different estimated probabilities, the weed existence areas can be displayed divided into predetermined ranges of estimated probability values, as shown in FIG. 18, for example in bands as sketched below.
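  • The sketch below shows one way weed existence areas could be assigned to display bands by their estimated probability for such a map; the band boundaries are illustrative assumptions, not values from the disclosure.

    def probability_class(probability, boundaries=(0.5, 0.7, 0.9)):
        """Assign a weed existence area to a display class by its estimated
        probability so the weed map can be drawn in probability bands."""
        for i, bound in enumerate(boundaries):
            if probability < bound:
                return i
        return len(boundaries)

    # e.g. 0.75 falls into class 2, the band covering 0.7 <= p < 0.9
    assert probability_class(0.75) == 2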
  • In the embodiments described above, a weed group grown taller than the planted grain culms is set as the recognition target to be recognized by the image recognition module (5, 2005, 3005).
  • Lodged grain culms or persons may also be set as recognition targets.
  • In that case, the work travel control module (60, 2060, 3060) is configured to perform the necessary control in response to recognition of the lodged culm group or the person.
  • the image recognition module (5, 2005, 3005) is constructed using deep learning type neural network technology.
  • an image recognition module (5, 2005, 3005) constructed using other machine learning techniques may be employed.
  • In the embodiments above, the image recognition module (5, 2005, 3005), the data processing module (50, 2050, 3050), and the weed position calculation unit 68 are incorporated into the control unit (6, 2006, 3006) of the combine, but some or all of them can be configured in a control unit independent of the combine, for example in the mobile communication terminal (200, 2200, 3200).
  • the functional units shown in FIG. 4, FIG. 9, FIG. 14 and FIG. 16 are divided mainly for the purpose of explanation. In practice, each functional unit may be integrated with another functional unit or may be further divided into a plurality of functional units.
  • the present invention is applicable not only to combine harvesters such as rice and wheat, but also to combine harvesters that harvest other crops such as corn and harvesters that harvest carrots and the like.
  • Reference signs: Threshing depth adjustment mechanism; 5: Image recognition module; 6: Control unit; 23: Grain culm conveying device; 236: Electric motor for adjustment (motor); 237: Operation rod; 30: Length detection device; 34: Detection switch; 35: Detection switch; 50: Data processing module; 51: Weed position information generation unit; 52: Statistical processing unit; 60: Work travel control module (work travel control unit); 61: Travel control unit; 63: Automatic work travel instruction unit; 620: Threshing depth control unit; 66: Machine position calculation unit; 68: Weed position calculation unit; 70: Photographing unit; 80: Satellite positioning module; 91: Notification device; 92: Communication unit; 2053: Data storage unit; 2054: Data determination unit; 2055: Data correction unit; 2051: Recognition target position information generation unit

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Environmental Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Soil Sciences (AREA)
  • Guiding Agricultural Machines (AREA)
  • Combines (AREA)
  • Harvester Elements (AREA)

Abstract

This combine harvester is provided with: a threshing depth adjustment mechanism; a threshing depth control unit 620 which performs, using the threshing depth adjustment mechanism, threshing depth adjustment control on the basis of the length of the reaped grain culms; a machine body position calculation unit 66 which calculates the machine body position; an image recognition module 5 which deduces a weed-growth region in a photographic image acquired by a photographing unit 70 and outputs recognition output data representing the deduced weed-growth region; a weed position information generation unit 51 which generates, from the recognition output data and the machine body position at the time the photographic image was acquired, weed position information indicating the position of the weed-growth region on a map; and a work travel control unit which calculates the timing at which weeds pass through the threshing depth adjustment mechanism and executes weed ingress control while the weeds are passing through the threshing depth adjustment mechanism.

Description

Harvester
TECHNICAL FIELD: The present invention relates to a harvester that harvests planted grain culms while traveling through a field.
Conventional harvesters are equipped with various functions to improve working performance or working efficiency.
For example, as shown in Patent Document 1, a conventional combine is provided with a threshing depth detection sensor that detects the tip position of the grain culms conveyed from the reaper to the threshing device; based on the detection information of this sensor, threshing depth adjusting means that adjusts the threshing depth in the threshing device is controlled so that the conveyed culms are threshed at a target threshing depth.
In crop harvesting work using a harvester in a field, attention must also be paid to locally varying crop growth conditions and to the presence of obstacles, including people. For this reason, for example, the combine of Patent Document 2 is equipped with a television camera that photographs the grain culms ahead of the reaper and an image processing apparatus. The image processing apparatus detects a specific state of the culms by comparing the image from the television camera with prestored images showing various planted states of culms. For example, when the image processing apparatus detects that some of the culms ahead of the reaper are lodged, the raking reel is tilted with its lodged-culm side lowered, which improves the harvesting performance for lodged culms. In the combine of Patent Document 3, which like that of Patent Document 2 has a television camera and an image processing apparatus, when it is detected that no culms are present ahead of the reaper, the combine is deemed to have reached the headland; power transmission to the cutter is interrupted, the cutter is raised, and the travel speed is reduced. This avoids collisions of the cutter with ridges and the like while turning on the headland and allows the combine to turn smoothly on the headland.
Furthermore, in the combine of Patent Document 4, a camera that photographs the area behind the machine body is mounted, and the rows of culm stubble cut by the reaper are recognized from the photographed image. Based on the deviation between the recognized rows and the machine body, the direction of the machine body is controlled so that each divider passes between adjacent rows.
Patent Document 1: JP H08-172867 A; Patent Document 2: JP H11-155340 A; Patent Document 3: JP H11-137062 A; Patent Document 4: JP 2006-121952 A
The technique of threshing at an appropriate depth by adjusting the threshing depth with a threshing depth detection sensor, while the harvested culms are conveyed from the reaper to the threshing device, contributes to improved threshing performance. However, in some fields, weeds that have grown taller than the planted grain culms are mixed in around them; such weeds are cut by the reaper and detected by the threshing depth detection sensor. As a result, the threshing depth adjusting means adjusts the threshing depth in the threshing device to the length of the weeds, degrading the threshing performance for the harvested culms themselves.
In view of such circumstances, a combine is desired that can avoid, as far as possible, a drop in the threshing performance for planted grain culms even in fields where weeds grow locally.
In the combines of Patent Documents 2 and 3, the state of the culms ahead of the reaper is detected using image processing, and the operation of the working equipment is controlled based on the detection result. However, the actual position on the field map of the detected culms is not taken into account. Therefore, even if culms in a problematic state are detected during work travel and avoidance control is performed to deal with the problem, when the area showing the problematic culm state is far from the combine, the avoidance control may start too early or too late. It is also difficult to determine appropriately when the avoidance control should end.
In view of such circumstances, a harvester is also desired in which harvesting work support using images captured by an imaging unit provided on the vehicle body is performed effectively.
In the combines of Patent Documents 2, 3, and 4, the traveling device and the working devices are controlled based on detection results obtained from images captured by an on-board camera. However, when photographing with a camera mounted on a combine working its way across a field, various situations can arise in which the captured images become inappropriate. For example, because the field surface is not flat like a road, sudden vibrations are transmitted to the on-board camera; because the machine works while changing direction, the direction of the sun relative to the camera changes frequently; and during reaping work, dust is thrown up. If working and traveling devices are controlled based on inappropriate captured images, the work travel becomes inappropriate and satisfactory results cannot be obtained.
To suppress such problems, a technique is desired that can exclude inappropriate captured images as far as possible in harvesting work support that uses captured images.
In light of the above needs, an object of the present invention is to further improve the performance and efficiency of various kinds of work.
A harvester according to one embodiment of the present invention is a combine that harvests planted grain culms while traveling through a field, and comprises: a reaper that reaps the planted grain culms from the field; a grain culm conveying device that conveys the harvested culms from the reaper toward a threshing device; a threshing depth adjustment mechanism provided in the grain culm conveying device; a threshing depth control unit that performs threshing depth adjustment control based on the length of the harvested culms using the threshing depth adjustment mechanism; a machine position calculation unit that calculates the machine position, which is the map coordinates of the machine body, based on positioning data from a satellite positioning module; a photographing unit provided on the machine body that photographs the field during harvesting work; an image recognition module that receives image data of the photographed images sequentially acquired by the photographing unit, estimates weed growth areas in the photographed images, and outputs recognition output data indicating the estimated weed growth areas; a weed position information generation unit that generates weed position information indicating the position of a weed growth area on a map from the machine position at the time the photographed image was acquired and the recognition output data; and a work travel control unit that determines the timing at which weeds reaped in the weed growth area pass through the threshing depth adjustment mechanism and executes weed entry control while the weeds pass through the threshing depth adjustment mechanism.
In the harvester according to this embodiment, when weeds are present in a photographed image, the weed growth area is estimated by the image recognition module from the image data of that image. Since the machine position in map coordinates at the time the image was acquired is calculated by the machine position calculation unit, weed position information indicating the position of the weed growth area on the map is generated from that machine position and the recognition output data indicating the weed growth area. Because the machine position is known, the timing at which the weeds pass through the threshing depth adjustment mechanism can be determined by taking into account the position on the machine where the weeds are cut and the conveying time of the weeds from that cutting position to the threshing depth adjustment mechanism. By executing special weed entry control while the weeds pass through the threshing depth adjustment mechanism based on this timing, a drop in threshing performance for planted culms mixed with weeds can be suppressed even in fields where weeds grow locally.
A harvester according to another embodiment of the present invention is a combine that harvests planted grain culms while traveling through a field, and comprises: a reaper that reaps the planted grain culms from the field; a grain culm conveying device that conveys the harvested culms from the reaper toward a threshing device; a threshing depth adjustment mechanism provided in the grain culm conveying device; a threshing depth control unit that performs threshing depth adjustment control based on the length of the harvested culms using the threshing depth adjustment mechanism; a machine position calculation unit that calculates the machine position, which is the map coordinates of the machine body, based on positioning data from a satellite positioning module; a photographing unit provided on the machine body that photographs the field during harvesting work; an image recognition module that receives image data of the photographed images sequentially acquired by the photographing unit, estimates weed growth areas in the photographed images, and outputs recognition output data indicating the estimated weed growth areas; a weed position information generation unit that generates weed position information indicating the position of a weed growth area on a map from the machine position at the time the photographed image was acquired and the recognition output data; and a work travel control unit that executes weed entry control while the reaper passes through the weed growth area.
In the harvester according to this embodiment, weed entry control is executed while the reaper passes through the weed growth area. This configuration has the advantage of simpler control, since the conveying time of the weeds and the like need not be considered.
The timing at which the reaper passes through the weed growth area can be determined accurately from the machine position on the map calculated from the positioning data of the satellite positioning module, the distance between the machine position and the reaper, and the position of the weed growth area on the map contained in the weed position information. Accordingly, in one preferred embodiment of the present invention, the timing at which the reaper passes through the weed growth area is determined based on the weed position information and the machine position calculated by the machine position calculation unit.
When weeds taller than the planted culms enter the threshing depth adjustment mechanism, the mechanism treats the weeds as harvested culms, so the threshing depth control unit adjusts the threshing depth to the length of the weeds; the adjusted depth is then unsuitable for the actual harvested culms. Since the length of the harvested culms does not change very rapidly, it is preferable, when weeds enter the threshing depth adjustment mechanism, to temporarily suspend threshing depth control based on the mechanism and leave the threshing depth unchanged. Accordingly, in one preferred embodiment of the present invention, execution of the weed entry control interrupts the threshing depth adjustment control.
Because weeds grow mixed in with the planted culms, reaping a weed growth area increases the amount of material to be processed, including both weeds and harvested culms, and thereby increases the load on working devices such as the reaper, the grain culm conveying device, and the threshing device. To avoid such an overload, it is preferable to reduce the vehicle speed. Accordingly, in one preferred embodiment of the present invention, execution of the weed entry control reduces the vehicle speed.
A harvester according to another embodiment of the present invention is a harvester that harvests crops while traveling through a field, and comprises: a machine position calculation unit that calculates the machine position, which is the map coordinates of the machine body, based on positioning data from a satellite positioning module; a photographing unit provided on the machine body that photographs the field during harvesting work; an image recognition module that receives image data of the photographed images sequentially acquired by the photographing unit, estimates the existence area in the photographed image where a recognition target is present, and outputs recognition output data including the existence area and the estimated probability at the time the existence area was estimated; and a recognition target position information generation unit that generates recognition target position information indicating the position of the recognition target on a map from the machine position at the time the photographed image was acquired and the recognition output data.
In the harvester according to this embodiment, the image recognition module that estimates, from the image data of a photographed image, the existence area where a recognition target is present is constructed using, for example, neural network techniques (including deep learning) or reinforcement learning. Since the module is also configured to output the estimated probability obtained when estimating the existence area, optimal control using this probability value becomes possible. Furthermore, the machine position in map coordinates at the time the image was acquired is calculated by the machine position calculation unit, and recognition target position information indicating the position of the recognition target on the map is generated from that machine position and the recognition output data indicating the target's existence area. With this configuration, harvesting work support control becomes possible that takes into account the estimated probability of the existence area and the position of the recognition target on the map, and hence the distance between the recognition target and the harvester.
In a photographed image of the field taken by the photographing unit, the resolution for a recognition target far from the photographing unit is, because of perspective, lower than that for a target near the unit, so the recognition reliability for distant targets is lower. Correspondingly, in one preferred embodiment of the present invention, the farther a recognition target is located from the photographing unit in the photographed image, the more the estimated probability of that target is reduced.
Since recognition target position information based on captured images can be generated at a high rate compared with the vehicle speed of the harvester, multiple pieces of recognition target position information are generated for the same target. The estimated probabilities contained in these pieces of information can therefore be processed statistically, and through this statistical operation more reliable recognition target position information can be generated. Statistical operations here include the arithmetic mean, the weighted mean, and the median, that is, operations that derive a more reliable data value from multiple data values. By applying such operations to the estimated probabilities, more appropriate recognition target position information can be derived from the multiple pieces of information. Accordingly, in one preferred embodiment of the present invention, multiple pieces of recognition target position information are stored and are corrected based on the result of a statistical operation on the estimated probabilities contained in the corresponding recognition output data.
Although the captured image acquired by the imaging unit provided on the machine body serves as the input source of the image recognition module, it may become unsuitable for recognizing the recognition target depending on the imaging conditions. As a result, even if the recognition target is recognized, the estimation probability of the resulting recognition output data will be low, and it is not desirable to use such recognition output data as-is in subsequent processing. For this reason, one preferred embodiment of the present invention includes a data storage unit that temporarily stores, over time, the recognition output data sequentially output from the image recognition module as a recognition output data sequence, and a data judging unit; the data judging unit judges as unsuitable recognition output data any recognition output data whose estimation probability is lower by at least a predetermined level than the estimation probabilities of the temporally preceding and following recognition output data in the sequence. Recognition output data judged to be unsuitable may be deleted, or its estimation probability may be replaced with an estimation probability interpolated from the preceding and following recognition output data.
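As one way to picture this judgment, the following is a minimal sketch in Python, assuming each recognition output carries a single estimation probability per frame and comparing it against the average of its temporal neighbors; the names RecognitionOutput and judge_unsuitable and the threshold DROP_LEVEL are illustrative assumptions, not elements defined by the patent.

    from dataclasses import dataclass
    from typing import List

    DROP_LEVEL = 0.3  # assumed "predetermined level" for the probability drop

    @dataclass
    class RecognitionOutput:
        timestamp: float
        probability: float  # estimation probability of the existence region
        unsuitable: bool = False

    def judge_unsuitable(seq: List[RecognitionOutput]) -> None:
        """Mark outputs whose probability is far below both temporal neighbors."""
        for prev, cur, nxt in zip(seq, seq[1:], seq[2:]):
            neighbor_avg = (prev.probability + nxt.probability) / 2.0
            if neighbor_avg - cur.probability >= DROP_LEVEL:
                cur.unsuitable = True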
Recognition targets that are important for a harvester that harvests crops while traveling through a field include persons, lodged grain culms, weeds, and ridges. Recognizing a person, who is an obstacle to work travel, is important for safe travel in the field because it serves as a trigger for obstacle avoidance control and obstacle warning notification. Recognizing lodged grain culms and weeds is important for performing high-quality harvesting work, because it leads to lodged-culm countermeasure control and weed countermeasure control in work control. Recognizing ridges amounts to detecting the boundary of the field, which is important for headland travel control of the field and the like.
A harvester according to one embodiment of the present invention includes: a machine position calculation unit that calculates a machine position, which is the map coordinates of the machine body, based on positioning data from a satellite positioning module; a travel trajectory calculation unit that calculates the travel trajectory of the machine body from the machine position; an imaging unit that is provided on the machine body and images the field during harvesting work; an image recognition module that receives image data of captured images sequentially acquired over time by the imaging unit, estimates an existence region in which a recognition target is present in the captured image, and outputs recognition output data including the existence region and the estimation probability obtained when the existence region was estimated; and a data judging unit that, based on the travel trajectory, predicts the range in which the existence region in a previously captured image should be located in the next captured image and, when the estimation probability of the existence region overlapping that range in the next captured image and the estimation probability of the existence region in the previously captured image differ by at least a predetermined allowable amount, judges the recognition output data based on the next captured image to be unsuitable recognition output data.
When a recognition target appears ahead while the harvester is traveling and working, a plurality of captured images in which the recognition target appears are usually acquired. Accordingly, for each of these captured images, the existence region of the recognition target is estimated, and recognition output data including its estimation probability is output. When the recognition target recognized in each of the plural captured images is substantially stationary, it appears at a different position in each captured image depending on the machine position at which each image was captured, that is, depending on the travel trajectory. Therefore, based on the travel trajectory, the range in which the existence region of the recognition target should be located in a captured image can be predicted.
If the existence region of the recognition target is not located in the range predicted in this way, or if it is located there but its estimation probability differs greatly from the estimation probabilities based on other captured images, that captured image is regarded as inappropriate. If the captured image is inappropriate, the recognition output data output based on it is naturally also regarded as inappropriate. For this reason, it is useful to predict, based on the travel trajectory, the range in which the existence region in the previously captured image should be located in the next captured image, and to judge the recognition output data based on the next captured image to be unsuitable recognition output data when the estimation probability of the existence region overlapping that range in the next captured image and the estimation probability of the existence region in the previously captured image differ by at least the predetermined allowable amount.
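A minimal sketch of this consistency check is given below, assuming the existence regions have already been projected into map coordinates so that a stationary target should reappear near its previous map position; the names Region, overlaps, and is_next_output_unsuitable and the constant ALLOWED_DIFF are illustrative, and the axis-aligned overlap test is a simplifying assumption.

    from dataclasses import dataclass

    ALLOWED_DIFF = 0.25  # assumed "predetermined allowable amount" for probability

    @dataclass
    class Region:
        x: float            # map coordinates of the region center (m)
        y: float
        half_w: float       # half extents of the region (m)
        half_h: float
        probability: float  # estimation probability

    def overlaps(a: Region, b: Region) -> bool:
        return (abs(a.x - b.x) <= a.half_w + b.half_w and
                abs(a.y - b.y) <= a.half_h + b.half_h)

    def is_next_output_unsuitable(prev: Region, nxt: Region) -> bool:
        """A stationary target should reappear near its previous map position;
        if it does not, or reappears with a very different estimation
        probability, the next recognition output is judged unsuitable."""
        if not overlaps(prev, nxt):
            return True  # target not found in the predicted range
        return abs(prev.probability - nxt.probability) >= ALLOWED_DIFF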
One preferred embodiment of the present invention includes a data correction unit that replaces the estimation probability of the unsuitable recognition output data with an interpolated estimation probability obtained based on the estimation probabilities of the recognition output data temporally preceding and following the unsuitable recognition output data. With this configuration, the estimation probability of recognition output data judged to be unsuitable is corrected by interpolation from the estimation probabilities of the preceding and following recognition output data, so that the unsuitable recognition output data can be replaced by proper recognition output data and put to use. This is all the more effective when the number of pieces of recognition output data for the same recognition target is small.
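The interpolation itself can be as simple as the linear form sketched below; linear interpolation in time is an assumption for illustration, as the text only requires that the replacement be derived from the preceding and following recognition output data.

    def interpolate_probability(t_prev: float, p_prev: float,
                                t_bad: float,
                                t_next: float, p_next: float) -> float:
        """Linearly interpolate the estimation probability at the time of the
        unsuitable recognition output from its two temporal neighbors."""
        span = t_next - t_prev
        if span <= 0.0:
            return (p_prev + p_next) / 2.0  # degenerate case: simple average
        w = (t_bad - t_prev) / span
        return (1.0 - w) * p_prev + w * p_next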
FIG. 1 is an overall side view of a combine according to Embodiment 1.
FIG. 2 is a side view of the reaping unit in Embodiment 1.
FIG. 3 is an explanatory diagram of threshing depth control in Embodiment 1.
FIG. 4 is a functional block diagram showing the functions of the control system of the combine in Embodiment 1.
FIG. 5 is an explanatory diagram schematically showing the flow of generation of recognition output data by the image recognition module in Embodiment 1.
FIG. 6 is a flowchart showing the flow of control in Embodiment 1 in which the weed position is calculated from a captured image including a weed growth region and the machine position, and threshing depth control is suspended.
FIG. 7 is a schematic diagram showing a weed map obtained by mapping the weed position information in Embodiment 1.
FIG. 8 is an overall side view of a combine according to Embodiment 2.
FIG. 9 is a functional block diagram showing the control system of the combine in Embodiment 2.
FIG. 10 is an explanatory diagram schematically showing the flow of generation of recognition output data by the image recognition module in Embodiment 2.
FIG. 11 is a data flow diagram showing the flow of data when recognition target position information is generated from captured images in Embodiment 2.
FIG. 12 is a schematic diagram showing a weed map obtained by mapping weed position information, as one kind of recognition target position information, in Embodiment 2.
FIG. 13 is an overall side view of a combine according to Embodiment 3.
FIG. 14 is a functional block diagram showing the control system of the combine in Embodiment 3.
FIG. 15 is an explanatory diagram schematically showing the flow of generation of recognition output data by the image recognition module in Embodiment 3.
FIG. 16 is a data flow diagram showing the flow of data when recognition target position information is generated from captured images in Embodiment 3.
FIG. 17 is an explanatory diagram schematically illustrating the data processing for judging unsuitable recognition output data in Embodiment 3.
FIG. 18 is a schematic diagram showing a weed map obtained by mapping weed position information, as one kind of recognition target position information, in Embodiment 3.
Hereinafter, embodiments of a combine as an example of a harvester according to the present invention will be described based on the drawings. In each embodiment, when the front-rear direction of the machine body 1 is defined, it is defined along the machine body advancing direction in the working state. The direction indicated by the symbol (F) in FIG. 1 is the machine body front side, and the direction indicated by the symbol (B) in FIG. 1 is the machine body rear side. When the left-right direction of the machine body 1 is defined, left and right are defined as viewed in the advancing direction of the machine body.
[Embodiment 1]
As shown in FIG. 1, in the combine according to Embodiment 1, a reaping unit 2 is connected to the front of a machine body 1 equipped with a pair of left and right crawler traveling devices 10 so as to be freely raised and lowered about a horizontal axis X. At the rear of the machine body 1, a threshing device 11 and a grain tank 12 for storing grain are provided side by side in the machine body width direction. A cabin 14 covering the boarding operation section is provided at the front right portion of the machine body 1, and an engine 15 for driving is provided below the cabin 14.
As shown in FIG. 1, the threshing device 11 receives inside it the harvested grain culms that are cut by the reaping unit 2 and conveyed rearward, and threshes the tip side with a threshing drum 113 while nipping and conveying the stalk bases of the culms between a threshing feed chain 111 and a nipping rail 112. Grain sorting processing of the threshed material is then executed in a sorting section provided below the threshing drum 113, and the grain sorted there is conveyed to the grain tank 12 and stored. Although not described in detail, a grain discharging device 13 that discharges the grain stored in the grain tank 12 to the outside is also provided.
The reaping unit 2 is provided with a plurality of raising devices 21 that raise lodged planted grain culms, a clipper-type cutting device 22 that cuts the stalk bases of the raised planted grain culms, a grain culm conveying device 23, and the like. The grain culm conveying device 23 conveys the harvested grain culms, whose stalk bases have been cut, toward the starting end of the threshing feed chain 111 of the threshing device 11 located on the rear side of the machine body, while gradually changing them from an upright posture to a lying posture.
The grain culm conveying device 23 includes a merging conveyance section 231 that conveys the plural rows of harvested grain culms cut by the cutting device 22 while gathering them to the center in the cutting width direction, a stalk-base nipping conveyance device 232 that nips the stalk bases of the gathered harvested grain culms and conveys them rearward, a tip-locking conveyance device 233 that locks and conveys the tip side of the harvested grain culms, a supply conveyance device 234 that guides the stalk bases of the harvested grain culms from the terminal end of the stalk-base nipping conveyance device 232 toward the threshing feed chain 111, and the like.
As shown in FIG. 2, the stalk-base nipping conveyance device 232 is supported on the support frame of the reaping unit 2 so as to be swingable about a horizontal axis. The stalk-base nipping conveyance device 232 is swung up and down by a drive operation mechanism 235, and with this swinging operation, the position of its conveyance terminal end changes relative to the supply conveyance device 234 in the culm length direction of the grain culms. The drive operation mechanism 235 has an electric motor 236 for threshing depth adjustment (hereinafter referred to as the threshing depth motor) as its drive source, and includes an operation rod 237 that is pushed and pulled by the threshing depth motor 236. The lower end of the operation rod 237 is pivotally connected to an intermediate portion of the stalk-base nipping conveyance device 232.
When the conveyance terminal end of the stalk-base nipping conveyance device 232 moves away from the supply conveyance device 234, the stalk-base nipping position of the harvested grain culms by the supply conveyance device 234 is shifted toward the tip side relative to the stalk-base nipping position by the stalk-base nipping conveyance device 232, and the culms are handed over to the supply conveyance device 234 in that state. As a result, the depth to which the harvested grain culms enter the threshing device 11 (the threshing depth) is changed to be shallower (shallow threshing side).
When the conveyance terminal end of the stalk-base nipping conveyance device 232 approaches the supply conveyance device 234, the harvested grain culms are handed over to the supply conveyance device 234 with the stalk-base nipping position by the supply conveyance device 234 close to the stalk-base nipping position by the stalk-base nipping conveyance device 232. As a result, the threshing depth of the harvested grain culms with respect to the threshing device 11 is changed to be deeper (deep threshing side).
By changing the posture of the stalk-base nipping conveyance device 232 in this way, the threshing depth of the harvested grain culms with respect to the threshing device 11 can be changed. In other words, the stalk-base nipping conveyance device 232 and the drive operation mechanism 235 constitute a threshing depth adjustment mechanism 3 capable of changing the threshing depth of the harvested grain culms with respect to the threshing device 11.
As shown in FIG. 3, the combine according to Embodiment 1 includes a contact-type culm length detection device 30 that detects the culm length of the harvested grain culms conveyed by the grain culm conveying device 23, and a threshing depth control unit 620. The threshing depth control unit 620 performs threshing depth adjustment control for adjusting the threshing depth based on the detection result of the culm length detection device 30. In this embodiment, the threshing depth control unit 620 controls the threshing depth motor 236 so that the threshing depth of the harvested grain culms with respect to the threshing device 11 is maintained within a target setting range.
As shown in FIG. 3, the culm length detection device 30 has a pair of swingable sensor arms 32, 33 provided on a main body case 31, serving as the device main body, formed in a substantially bottomless box shape open downward, and is configured so that the pair of sensor arms 32, 33 contact the tip-side portion of the harvested grain culms to detect the culm length. The pair of sensor arms 32, 33 are spaced apart in the culm length direction of the conveyed harvested grain culms, with their upper portions supported by the main body case 31 and hanging downward. Each sensor arm 32, 33 is supported so as to be swingable in the front-rear direction (corresponding to the movement direction of the harvested grain culms) about a horizontal axis provided inside the main body case 31, and is biased to return to a downward reference posture.
At the base end portion of the upper part of each of the pair of sensor arms 32, 33, detection switches 34, 35 are provided that turn on when the conveyed harvested grain culms contact the sensor arms 32, 33 and swing them from the reference posture by a set amount or more, and turn off when the swing amount of the sensor arms 32, 33 from the reference posture is less than the set amount.
The outputs of the pair of detection switches 34, 35 are input to the threshing depth control unit 620. The threshing depth control unit 620 controls the operation of the threshing depth motor 236 so that the detection switch 35 located on the tip side is in the off state and the detection switch 34 located on the stalk-base side is in the on state.
That is, if the pair of detection switches 34, 35 are both in the on state, the threshing depth control unit 620 operates the threshing depth motor 236 so that the stalk-base nipping conveyance device 232 moves toward the shallow threshing side. If the pair of detection switches 34, 35 are both in the off state, the threshing depth control unit 620 operates the threshing depth motor 236 so that the stalk-base nipping conveyance device 232 moves toward the deep threshing side. Further, if, of the pair of detection switches 34, 35, the detection switch 35 located on the tip side is in the off state and the detection switch 34 located on the stalk-base side is in the on state, the threshing depth control unit 620 stops the operation of the threshing depth motor 236 and maintains that state.
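Expressed as decision logic, the switch combinations map onto motor commands as in the minimal sketch below; the enum names and the function depth_command are illustrative, and the remaining combination (tip switch on, stalk-base switch off) is not spelled out in the text and is treated here as a hold state by assumption.

    from enum import Enum, auto

    class DepthCommand(Enum):
        SHALLOWER = auto()  # move the nipping conveyance device to the shallow side
        DEEPER = auto()     # move it to the deep side
        HOLD = auto()       # stop the threshing depth motor

    def depth_command(base_switch_on: bool, tip_switch_on: bool) -> DepthCommand:
        """Switch 34 is on the stalk-base side, switch 35 on the tip side."""
        if base_switch_on and tip_switch_on:
            return DepthCommand.SHALLOWER   # culms reach too far in
        if not base_switch_on and not tip_switch_on:
            return DepthCommand.DEEPER      # culms do not reach far enough
        return DepthCommand.HOLD            # target state: base on, tip off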
As shown in FIG. 1, an imaging unit 70 equipped with a color camera is provided at the front end of the ceiling of the cabin 14. The front-rear extent of the field of view of the imaging unit 70 reaches from the front end region of the reaping unit 2 almost to the horizon. The width extent of the field of view ranges from about 10 m to several tens of meters. The captured image acquired by the imaging unit 70 is converted into image data and sent to the control system of the combine.
The imaging unit 70 images the field during harvesting work. The control system of the combine has a function of recognizing a weed growth region as a recognition target from the image data sent from the imaging unit 70. In FIG. 1, a group of normal planted grain culms is indicated by the symbol Z0, and a weed growth region is indicated by the symbol Z1.
Furthermore, a satellite positioning module 80 is also provided on the ceiling of the cabin 14. The satellite positioning module 80 includes a satellite antenna for receiving GNSS (global navigation satellite system) signals (including GPS signals). To complement the satellite navigation by the satellite positioning module 80, an inertial navigation unit incorporating a gyro acceleration sensor and a magnetic direction sensor is built into the satellite positioning module 80. Of course, the inertial navigation unit may be arranged at a different location. In FIG. 1, the satellite positioning module 80 is arranged at the rear of the ceiling of the cabin 14 for convenience of drawing; it is preferable, however, that it be arranged, for example, at a position at the front end of the ceiling close to the machine body center side, so as to be as close as possible to a position directly above the left-right center of the cutting device 22.
FIG. 4 shows a functional block diagram of the control system built inside the machine body 1 of the combine. The control system of this embodiment is composed of a large number of electronic control units called ECUs, various operating devices, sensor groups and switch groups, and a wiring network such as an in-vehicle LAN for data transmission between them. The notification device 91 is a device for notifying the driver or others of the work travel state and various warnings, and is a buzzer, lamp, speaker, display, or the like. The communication unit 92 is used by this combine's control system to exchange data with a cloud computer system 100 installed at a remote location or with a portable communication terminal 200. The portable communication terminal 200 here is a tablet computer operated by a supervisor (including the driver) at the work travel site. The control unit 6 is the core element of this control system, and is shown as a collection of a plurality of ECUs. The positioning data acquired by the satellite positioning module 80 and the image data acquired by the imaging unit 70 are input to the control unit 6 through the wiring network.
The control unit 6 includes an output processing unit 6B and an input processing unit 6A as input/output interfaces. The output processing unit 6B is connected to a vehicle travel device group 7A and a work apparatus device group 7B. The vehicle travel device group 7A includes control devices related to vehicle travel, for example an engine control device, a transmission control device, a braking control device, and a steering control device. The work apparatus device group 7B includes power control devices and the like in the reaping unit 2, the threshing device 11, the grain discharging device 13, the grain culm conveying device 23, and the threshing depth adjustment mechanism 3.
A travel system detection sensor group 8A, a work system detection sensor group 8B, and the like are connected to the input processing unit 6A. The travel system detection sensor group 8A includes sensors that detect the states of an engine speed adjuster, an accelerator pedal, a brake pedal, a gearshift operation tool, and the like. The work system detection sensor group 8B includes sensors that detect the apparatus states of the reaping unit 2, the threshing device 11, the grain discharging device 13, and the grain culm conveying device 23, as well as the states of the grain culms and grain. The work system detection sensor group 8B also includes the detection switches 34, 35 of the threshing depth adjustment mechanism 3 described above.
The control unit 6 includes an image recognition module 5, a data processing module 50, a work travel control module 60 serving as a work travel control unit, a machine position calculation unit 66, a notification unit 67, and a weed position calculation unit 68.
The notification unit 67 generates notification data based on commands and the like received from the functional units of the control unit 6 and gives them to the notification device 91. The machine position calculation unit 66 calculates the machine position, which is the map coordinates (or field coordinates) of the machine body 1, based on the positioning data sequentially sent from the satellite positioning module 80. The weed position calculation unit 68 of this embodiment determines the timing at which weeds cut by the reaping unit 2 pass through the threshing depth adjustment mechanism 3, based on the machine position (normally the antenna position) calculated by the machine position calculation unit 66, the position on the map of the weed growth region calculated by the data processing module 50, and the conveyance speed of the grain culm conveying device 23.
The combine of this embodiment can travel both automatically (automatic steering) and manually (manual steering). The work travel control module 60 includes, in addition to a travel control unit 61 and a work control unit 62, an automatic work travel command unit 63 and a travel route setting unit 64. A travel mode switch (not shown) for selecting either an automatic travel mode for traveling with automatic steering or a manual steering mode for traveling with manual steering is provided in the cabin 14. By operating this travel mode switch, it is possible to shift from manual steering travel to automatic steering travel, or from automatic steering travel to manual steering travel.
The travel control unit 61 has an engine control function, a steering control function, a vehicle speed control function, and the like, and gives travel control signals to the vehicle travel device group 7A. The work control unit 62 gives work control signals to the work apparatus device group 7B in order to control the movements of the reaping unit 2, the threshing device 11, the grain discharging device 13, the grain culm conveying device 23, and so on. The work control unit 62 also includes the threshing depth control unit 620 described with reference to FIG. 3.
When the manual steering mode is selected, the travel control unit 61 generates control signals based on the driver's operations and controls the vehicle travel device group 7A. When the automatic steering mode is selected, the travel control unit 61 controls the vehicle travel devices of the group 7A related to steering and those related to vehicle speed, based on the automatic travel commands given by the automatic work travel command unit 63.
The travel route setting unit 64 loads into memory a travel route for automatic travel created by any of the control unit 6, the portable communication terminal 200, the cloud computer system 100, and the like. The travel route loaded into memory is used sequentially as the target travel route in automatic travel. Even in manual travel, this travel route can also be used for guidance so that the combine travels along it.
More specifically, the automatic work travel command unit 63 generates an automatic steering command and a vehicle speed command and gives them to the travel control unit 61. The automatic steering command is generated so as to eliminate the azimuth deviation and positional deviation between the travel route set by the travel route setting unit 64 and the own-vehicle position calculated by the machine position calculation unit 66. The vehicle speed command is generated based on a vehicle speed value set in advance. The automatic work travel command unit 63 further gives the work control unit 62 work apparatus operation commands according to the own-vehicle position and the traveling state of the vehicle.
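As an illustration of how such a steering command could combine the two deviations, the sketch below uses a simple proportional law over the lateral (positional) deviation and the heading (azimuth) deviation; the gains, the saturation limit, and the function steering_command are assumptions for illustration, not the control law specified here.

    import math

    K_LATERAL = 0.5   # assumed gain on positional deviation (rad per m)
    K_HEADING = 1.0   # assumed gain on azimuth deviation
    MAX_STEER = math.radians(35.0)  # assumed steering angle limit

    def steering_command(lateral_error_m: float,
                         heading_error_rad: float) -> float:
        """Return a steering angle that reduces both the positional and the
        azimuth deviation between the set travel route and the own-vehicle
        position."""
        raw = K_LATERAL * lateral_error_m + K_HEADING * heading_error_rad
        return max(-MAX_STEER, min(MAX_STEER, raw))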
The image recognition module 5 receives as input the image data of the captured images sequentially acquired over time by the imaging unit 70. The image recognition module 5 estimates the existence region in the captured image in which a recognition target is present, and outputs, as the recognition result, recognition output data including the existence region and the estimation probability obtained when the existence region was estimated. The image recognition module 5 is constructed using neural network technology employing deep learning.
The flow of generation of recognition output data by the image recognition module 5 is shown in FIG. 5 and FIG. 6. Pixel values of RGB image data are input to the image recognition module 5 as input values. In this embodiment, the recognition target to be estimated is weeds. The recognition output data as the recognition result therefore includes weed growth regions, shown as rectangles, and the estimation probability obtained when each weed growth region was estimated.
In FIG. 5, the estimation result is shown schematically, and the weed growth regions are indicated by rectangular frames labeled F1. Each weed growth region is defined by four corner points, and the coordinate positions of the four corner points of each such rectangle on the captured image are also included in the estimation result. Of course, if no weed as a recognition target is estimated, no weed growth region is output and the estimation probability is zero.
In this embodiment, the image recognition module 5 sets its internal parameters so that the farther a recognition target (weed) is located from the imaging unit 70 in the captured image, the more its estimation probability is reduced. This makes the recognition of recognition targets stricter in imaging regions whose resolution is low because they are far from the imaging unit 70, and reduces false recognition.
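To make the shape of the recognition output data concrete, the following is a minimal sketch of one such record with a distance-dependent attenuation applied to the estimation probability; the record layout, the reciprocal attenuation curve, and the constant FAR_LIMIT_M are illustrative assumptions, since the text only states that internal parameters reduce the probability with distance.

    from dataclasses import dataclass
    from typing import List, Tuple

    FAR_LIMIT_M = 40.0  # assumed distance at which confidence is halved

    @dataclass
    class WeedRegion:
        corners: List[Tuple[float, float]]  # 4 corner points in image coordinates
        probability: float                  # estimation probability in [0, 1]

    def attenuate_by_distance(region: WeedRegion, distance_m: float) -> WeedRegion:
        """Reduce the estimation probability for regions far from the imaging
        unit, where image resolution is lower."""
        factor = 1.0 / (1.0 + distance_m / FAR_LIMIT_M)
        return WeedRegion(region.corners, region.probability * factor)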
The data processing module 50 processes the recognition output data output from the image recognition module 5. As shown in FIG. 4 and FIG. 6, the data processing module 50 of this embodiment includes a weed position information generation unit 51 and a statistical processing unit 52.
The weed position information generation unit 51 generates weed position information indicating the position of the recognition target on the map, from the machine position at the time the captured image was acquired and the recognition output data. The position on the map where the weeds included in the recognition output data exist is obtained by converting the coordinate positions on the captured image (camera coordinate positions) of the four corner points of the rectangle indicating the weeds into coordinates on the map.
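A minimal sketch of this conversion is shown below, assuming the camera-to-ground projection has already been reduced to ground-plane offsets of the corner points relative to the machine body; the function camera_to_map and the planar rotation-plus-translation model are simplifying assumptions, since the actual conversion depends on the camera's mounting geometry and calibration.

    import math
    from typing import List, Tuple

    def camera_to_map(ground_offsets: List[Tuple[float, float]],
                      machine_x: float, machine_y: float,
                      heading_rad: float) -> List[Tuple[float, float]]:
        """Rotate ground-plane offsets (forward, left) of the region's corner
        points by the machine heading and translate by the machine position
        to obtain map coordinates."""
        cos_h, sin_h = math.cos(heading_rad), math.sin(heading_rad)
        points = []
        for fwd, left in ground_offsets:
            points.append((machine_x + fwd * cos_h - left * sin_h,
                           machine_y + fwd * sin_h + left * cos_h))
        return points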
The imaging unit 70 acquires captured images at predetermined time intervals, for example every 0.5 seconds, and inputs the image data to the image recognition module 5, so the image recognition module 5 also outputs recognition output data at the same time intervals. Therefore, when weeds are within the field of view of the imaging unit 70, a plurality of pieces of recognition output data will include existence regions for the same weeds. As a result, a plurality of pieces of weed position information for the same weeds are obtained. In that case, the estimation probability included in each piece of source recognition output data, that is, the estimation probability of the weed existence region (weed growth region) included in each piece of weed position information, often takes a different value, because the positional relationship between the imaging unit 70 and the weeds differs.
Therefore, in this embodiment, such plural pieces of weed position information are stored, and the estimation probabilities included in each of the stored pieces of weed position information are processed statistically. Using a statistical operation on the estimation probabilities of the plural pieces of recognition target position information, a representative value of the group of estimation probabilities is obtained. Using that representative value, the plural pieces of recognition target position information can be corrected into a single piece of optimum recognition target position information. One example of such correction is to obtain the arithmetic mean, weighted mean, or median of the estimation probabilities as a reference value (representative value), take the union of the existence regions (weed growth regions) having estimation probabilities equal to or higher than that reference value, and generate corrected weed position information with that union as the optimum existence region. Of course, it is also possible to generate a single piece of highly reliable weed position information using other statistical operations.
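The correction described above can be pictured with the following sketch, assuming each observation carries an estimation probability and an existence region rasterized to map-grid cells; representing regions as cell sets and using the median as the reference value are illustrative choices among the operations the text names.

    from statistics import median
    from typing import List, Set, Tuple

    Cell = Tuple[int, int]  # map-grid cell index

    def correct_weed_region(probs: List[float],
                            regions: List[Set[Cell]]) -> Set[Cell]:
        """Take the median probability as the reference value and return the
        union of the existence regions whose probability is at or above it,
        as the optimum existence region."""
        reference = median(probs)
        optimum: Set[Cell] = set()
        for p, region in zip(probs, regions):
            if p >= reference:
                optimum |= region
        return optimum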
As shown in FIG. 6, the weed position information indicating the position on the map of the weed growth region obtained in this way is given to the weed position calculation unit 68. The weed position calculation unit 68 is also given the machine position, which is the map coordinates calculated by the machine position calculation unit 66. The weed position calculation unit 68 determines the passage timing at which the weeds cut by the reaping unit 2 pass through the threshing depth adjustment mechanism 3, based on the weed position information, the machine position, and the conveyance speed of the grain culm conveying device 23. While the weeds are passing through the threshing depth adjustment mechanism 3, the weed position calculation unit 68 gives a weed entry flag to the threshing depth control unit 620.
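A minimal sketch of this timing calculation is given below, assuming the weed region can be reduced to a distance ahead of the reaping unit along the travel direction plus an extent, and that the conveyance path length up to the adjustment mechanism is known; all names and constants here are illustrative assumptions.

    from typing import Tuple

    CONVEY_PATH_M = 2.0  # assumed conveyance path length from the cutter
                         # to the threshing depth adjustment mechanism

    def weed_entry_window(distance_to_weed_m: float,
                          weed_extent_m: float,
                          vehicle_speed_mps: float,
                          convey_speed_mps: float,
                          now_s: float) -> Tuple[float, float]:
        """Return the (start, end) times during which the weed entry flag
        should be given to the threshing depth control unit."""
        cut_start = now_s + distance_to_weed_m / vehicle_speed_mps
        cut_end = cut_start + weed_extent_m / vehicle_speed_mps
        delay = CONVEY_PATH_M / convey_speed_mps  # time on the conveying device
        return (cut_start + delay, cut_end + delay)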
The threshing depth control unit 620 has a standard control mode and a weed entry control mode. Normally the standard control mode is selected, and the threshing depth control described above is executed during work travel.
However, when the weed entry flag is given from the weed position calculation unit 68, the standard control mode is switched to the weed entry control mode, and weed entry control is executed. In this embodiment, when weed entry control is executed, the threshing depth control is suspended and the vehicle speed is also reduced. Of course, a configuration may also be adopted in which only one of the suspension of threshing depth control and the reduction of vehicle speed is performed in the weed entry control.
The weed position information generated by the weed position information generation unit 51 can be mapped as shown in FIG. 7 for a visually easy-to-understand display. FIG. 7 illustrates a weed map obtained by mapping the weed position information. When the weed position information includes weed growth regions with different estimation probabilities, it is also possible to display the weed growth regions in a form divided into patterns by predetermined ranges of estimation probability values, as shown in FIG. 7.
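Binning the mapped regions by probability range for such a pattern-coded display could look like the following sketch; the three bin boundaries and the function bin_by_probability are illustrative assumptions.

    from typing import Dict, List, Set, Tuple

    Cell = Tuple[int, int]  # map-grid cell index

    def bin_by_probability(regions: List[Set[Cell]],
                           probs: List[float]) -> Dict[str, Set[Cell]]:
        """Group weed growth regions into display bins by estimation
        probability range, for pattern-coded rendering on a weed map."""
        bins: Dict[str, Set[Cell]] = {"high": set(), "mid": set(), "low": set()}
        for region, p in zip(regions, probs):
            if p >= 0.8:
                bins["high"] |= region
            elif p >= 0.5:
                bins["mid"] |= region
            else:
                bins["low"] |= region
        return bins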
[Embodiment 2]
As shown in FIG. 8, in the combine, a reaping unit 2002 is connected to the front of a machine body 2001 equipped with a pair of left and right crawler traveling devices 2010 so as to be freely raised and lowered about a horizontal axis X. At the rear of the machine body 2001, a threshing device 2011 and a grain tank 2012 for storing grain are provided side by side in the machine body width direction. A cabin 2014 covering the boarding operation section is provided at the front right portion of the machine body 2001, and an engine 2015 for driving is provided below the cabin 2014.
As shown in FIG. 8, the threshing device 2011 receives inside it the harvested grain culms that are cut by the reaping unit 2002 and conveyed rearward, and threshes the tip side with a threshing drum 2113 while nipping and conveying the stalk bases of the culms between a threshing feed chain 2111 and a nipping rail 2112. Grain sorting processing of the threshed material is then executed in a sorting section provided below the threshing drum 2113, and the grain sorted there is conveyed to the grain tank 2012 and stored. Although not described in detail, a grain discharging device 2013 that discharges the grain stored in the grain tank 2012 to the outside is also provided.
The reaping unit 2002 is provided with a plurality of raising devices 2021 that raise lodged planted grain culms, a clipper-type cutting device 2022 that cuts the stalk bases of the raised planted grain culms, a grain culm conveying device 2023, and the like. The grain culm conveying device 2023 conveys the harvested grain culms, whose stalk bases have been cut, toward the starting end of the threshing feed chain 2111 of the threshing device 2011 located on the rear side of the machine body, while gradually changing them from an upright posture to a lying posture.
The grain culm conveying device 2023 includes a merging conveyance section 2231 that conveys the plural rows of harvested grain culms cut by the cutting device 2022 while gathering them to the center in the cutting width direction, a stalk-base nipping conveyance device 2232 that nips the stalk bases of the gathered harvested grain culms and conveys them rearward, a tip-locking conveyance device 2233 that locks and conveys the tip side of the harvested grain culms, a supply conveyance device 2234 that guides the stalk bases of the harvested grain culms from the terminal end of the stalk-base nipping conveyance device 2232 toward the threshing feed chain 2111, and the like.
An imaging unit 2070 equipped with a color camera is provided at the front end of the ceiling of the cabin 2014. In this embodiment, the front-rear extent of the field of view of the imaging unit 2070 reaches from the front end region of the reaping unit 2002 almost to the horizon. The width extent of the field of view ranges from about 10 m to several tens of meters. The captured image acquired by the imaging unit 2070 is converted into image data and sent to the control system of the combine.
The imaging unit 2070 images the field during harvesting work, and various objects are present in the field as imaging subjects. The control system of the combine has a function of recognizing specific objects as recognition targets from the image data sent from the imaging unit 2070. As such recognition targets, FIG. 8 schematically shows a group of normal planted grain culms indicated by the symbol Z0, a group of weeds grown taller than the planted grain culms indicated by the symbol Z1, a group of lodged grain culms indicated by the symbol Z2, and a person indicated by the symbol Z3.
A satellite positioning module 2080 is also provided on the ceiling of the cabin 2014. The satellite positioning module 2080 includes a satellite antenna for receiving GNSS (global navigation satellite system) signals (including GPS signals). To complement the satellite navigation by the satellite positioning module 2080, an inertial navigation unit incorporating a gyro acceleration sensor and a magnetic direction sensor is built into the satellite positioning module 2080. Of course, the inertial navigation unit may be arranged at a different location. In FIG. 8, the satellite positioning module 2080 is arranged at the rear of the ceiling of the cabin 2014 for convenience of drawing; it is preferable, however, that it be arranged, for example, at a position at the front end of the ceiling close to the machine body center side, so as to be as close as possible to a position directly above the left-right center of the cutting device 2022.
FIG. 9 shows a functional block diagram of the control system of the combine. The control system of this embodiment is composed of a large number of electronic control units called ECUs, various operating devices, sensor groups and switch groups, and a wiring network such as an in-vehicle LAN for data transmission between them. The notification device 2091 is a device for notifying the driver or others of the work travel state and various warnings, and is a buzzer, lamp, speaker, display, or the like. The communication unit 2092 is used by this combine's control system to exchange data with a cloud computer system 2100 installed at a remote location or with a portable communication terminal 2200. The portable communication terminal 2200 here is a tablet computer operated by a supervisor (including the driver) at the work travel site.
The control unit 2006 is the core element of this control system, and is shown as a collection of a plurality of ECUs. The positioning data from the satellite positioning module 2080 and the image data from the imaging unit 2070 are input to the control unit 2006 through the wiring network.
The control unit 2006 includes an output processing unit 2006B and an input processing unit 2006A as input/output interfaces. The output processing unit 2006B is connected to a vehicle travel device group 2007A and a work apparatus device group 2007B. The vehicle travel device group 2007A includes control devices related to vehicle travel, for example an engine control device, a transmission control device, a braking control device, and a steering control device. The work apparatus device group 2007B includes power control devices and the like in the reaping unit 2002, the threshing device 2011, the grain discharging device 2013, and the grain culm conveying device 2023.
A travel system detection sensor group 2008A, a work system detection sensor group 2008B, and the like are connected to the input processing unit 2006A. The travel system detection sensor group 2008A includes sensors that detect the states of an engine speed adjuster, an accelerator pedal, a brake pedal, a gearshift operation tool, and the like. The work system detection sensor group 2008B includes sensors that detect the apparatus states of the reaping unit 2002, the threshing device 2011, the grain discharging device 2013, and the grain culm conveying device 2023, as well as the states of the grain culms and grain.
The control unit 2006 includes a work travel control module 2060, an image recognition module 2005, a data processing module 2050, a machine position calculation unit 2066, and a notification unit 2067.
The notification unit 2067 generates notification data based on commands and the like received from the functional units of the control unit 2006 and gives them to the notification device 2091. The machine position calculation unit 2066 calculates the machine position, which is the map coordinates (or field coordinates) of the machine body 2001, based on the positioning data sequentially sent from the satellite positioning module 2080.
The combine of this embodiment can travel both automatically (automatic steering) and manually (manual steering). The work travel control module 2060 includes, in addition to a travel control unit 2061 and a work control unit 2062, an automatic work travel command unit 2063 and a travel route setting unit 2064. A travel mode switch (not shown) for selecting either an automatic travel mode for traveling with automatic steering or a manual steering mode for traveling with manual steering is provided in the cabin 2014. By operating this travel mode switch, it is possible to shift from manual steering travel to automatic steering travel, or from automatic steering travel to manual steering travel.
The travel control unit 2061 has an engine control function, a steering control function, a vehicle speed control function, and the like, and gives travel control signals to the vehicle travel device group 2007A. The work control unit 2062 gives work control signals to the work apparatus device group 2007B in order to control the movements of the reaping unit 2002, the threshing device 2011, the grain discharging device 2013, the grain culm conveying device 2023, and so on.
When the manual steering mode is selected, the travel control unit 2061 generates control signals based on the driver's operations and controls the vehicle travel device group 2007A. When the automatic steering mode is selected, the travel control unit 2061 controls the vehicle travel devices of the group 2007A related to steering and those related to vehicle speed, based on the automatic travel commands given by the automatic work travel command unit 2063.
 The travel route setting unit 2064 deploys, into memory, a travel route for automatic travel created by any of the control unit 2006, the mobile communication terminal 2200, the cloud computer system 2100, and the like. The travel route deployed in memory is used sequentially as the target travel route for automatic travel. Even during manual travel, this travel route can be used as guidance for the combine to travel along it.
 More specifically, the automatic work travel command unit 2063 generates an automatic steering command and a vehicle speed command and supplies them to the travel control unit 2061. The automatic steering command is generated so as to eliminate the azimuth deviation and the positional deviation between the travel route set by the travel route setting unit 2064 and the machine body position calculated by the machine body position calculation unit 2066. The vehicle speed command is generated based on a previously set vehicle speed value. Furthermore, the automatic work travel command unit 2063 gives the work control unit 2062 work device operation commands according to the machine body position and the travel state of the machine.
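 The patent does not disclose the control law used to eliminate these two deviations, but a minimal sketch of one common approach is shown below: a steering angle is formed from the cross-track (positional) deviation and the heading (azimuth) deviation. The gains k_cross and k_heading, the clamp, and all function names are assumptions for illustration only.

```python
import math

def steering_command(pos, heading, route_point, route_heading,
                     k_cross=0.5, k_heading=1.0, max_angle=math.radians(30)):
    """Derive a steering angle that reduces positional and azimuth deviation.

    pos, route_point: (x, y) map coordinates of the machine body and the
    nearest point on the target travel route.
    heading, route_heading: azimuths in radians.
    """
    # Positional deviation: signed cross-track distance from the route.
    dx, dy = pos[0] - route_point[0], pos[1] - route_point[1]
    cross_track = -dx * math.sin(route_heading) + dy * math.cos(route_heading)
    # Azimuth deviation, wrapped into [-pi, pi].
    heading_error = (route_heading - heading + math.pi) % (2 * math.pi) - math.pi
    # Combine both deviations into one steering angle and clamp it.
    angle = k_heading * heading_error - k_cross * cross_track
    return max(-max_angle, min(max_angle, angle))
```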
 Image data of the captured images sequentially acquired over time by the imaging unit 2070 is input into the image recognition module 2005. The image recognition module 2005 estimates the existence regions in the captured image where recognition targets are present, and outputs, as its recognition result, recognition output data that includes the existence regions and the estimation probabilities with which those existence regions were estimated. The image recognition module 2005 is built using neural network technology employing deep learning.
 The flow of generation of recognition output data by the image recognition module 2005 is shown in FIGS. 10 and 11. Pixel values of RGB image data are input into the image recognition module 2005 as input values. In the example of FIG. 10, the recognition targets to be estimated are weeds, lodged grain culms, and persons. Accordingly, the recognition output data as the recognition result includes the existence region of weeds (hereinafter referred to as the weed region) and its estimation probability, the existence region of lodged grain culms (hereinafter referred to as the lodged culm region) and its estimation probability, and the existence region of a person (hereinafter referred to as the person region) and its estimation probability.
 In FIG. 10 the estimation result is shown schematically: the weed region is indicated by a rectangular frame labeled F1, the lodged culm region by a rectangular frame labeled F2, and the person region by a rectangular frame labeled F3. Each region is linked to its estimation probability. The weed region, the lodged culm region, and the person region are each defined by four corner points, and the coordinate positions of those four corner points on the captured image are also included in the estimation result. Of course, if a recognition target is not estimated, no existence region is output for it, and its estimation probability is zero.
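 As a rough illustration, the recognition output data described above could be represented by a structure like the following sketch; the class and field names are hypothetical and not taken from the patent.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

Corner = Tuple[float, float]  # (x, y) position on the captured image

@dataclass
class ExistenceRegion:
    label: str             # "weed", "lodged_culm", or "person"
    corners: List[Corner]  # four corner points of the rectangular frame
    probability: float     # estimation probability in [0.0, 1.0]

@dataclass
class RecognitionOutput:
    timestamp: float       # acquisition time of the captured image
    regions: List[ExistenceRegion] = field(default_factory=list)
    # A target that was not estimated simply has no region here,
    # which corresponds to an estimation probability of zero.
```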
 In this embodiment, the image recognition module 2005 sets its internal parameters so that the farther a recognition target is located from the imaging unit 2070 in the captured image, the lower the estimation probability of that recognition target becomes. This makes the recognition of targets stricter in imaging regions whose resolution is low because they are far from the imaging unit 2070, thereby reducing erroneous recognition.
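 A minimal sketch of such an internal parameter is given below, assuming a simple linear attenuation with distance; the attenuation form and the distance values are assumptions, since the patent does not specify them.

```python
def attenuate_by_distance(probability: float, distance_m: float,
                          full_confidence_m: float = 5.0,
                          cutoff_m: float = 30.0) -> float:
    """Reduce an estimation probability for targets far from the camera."""
    if distance_m <= full_confidence_m:
        return probability          # near targets keep their probability
    if distance_m >= cutoff_m:
        return 0.0                  # beyond the cutoff, treat as unrecognized
    scale = (cutoff_m - distance_m) / (cutoff_m - full_confidence_m)
    return probability * scale      # linear falloff in between
```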
 The data processing module 2050 processes the recognition output data output from the image recognition module 2005. As shown in FIGS. 9 and 11, the data processing module 2050 of this embodiment includes a recognition target position information generation unit 2051, a statistical processing unit 2052, a data storage unit 2053, a data determination unit 2054, and a data correction unit 2055.
 The recognition target position information generation unit 2051 generates, from the machine body position at the time the captured image was acquired and the recognition output data, recognition target position information indicating the positions of the recognition targets on the map. The map positions of the recognition targets (weeds, lodged grain culms, persons) included in the recognition output data are obtained by converting the coordinate positions on the captured image (camera coordinate positions) of the four corner points of the rectangles indicating their existence regions (weed region, lodged culm region, person region) into coordinates on the map.
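 One common way to perform this camera-to-map conversion is sketched below, assuming a planar field and a pre-calibrated homography from image pixels to ground coordinates in the machine body frame; the homography and function names are assumptions, as the patent does not detail the conversion.

```python
import numpy as np

def camera_to_map(corners_px, homography, body_position, body_heading):
    """Convert rectangle corners from camera coordinates to map coordinates.

    corners_px: iterable of four (u, v) pixel positions.
    homography: 3x3 matrix mapping image pixels to ground coordinates
                in the machine body frame (assumed pre-calibrated).
    body_position: (x, y) map coordinates of the machine body.
    body_heading: machine body azimuth in radians.
    """
    cos_h, sin_h = np.cos(body_heading), np.sin(body_heading)
    rotation = np.array([[cos_h, -sin_h], [sin_h, cos_h]])
    map_corners = []
    for u, v in corners_px:
        # Project the pixel onto the ground plane in the body frame.
        gx, gy, gw = homography @ np.array([u, v, 1.0])
        ground = np.array([gx / gw, gy / gw])
        # Rotate into the map frame and translate by the body position.
        map_corners.append(tuple(rotation @ ground + np.asarray(body_position)))
    return map_corners
```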
 The imaging unit 2070 acquires captured images at a predetermined time interval, for example every 0.5 seconds, and inputs the image data into the image recognition module 2005, so the image recognition module 2005 also outputs recognition output data at the same time interval. Consequently, when a recognition target has been within the field of view of the imaging unit 2070, multiple items of recognition output data include an existence region for the same recognition target. As a result, multiple items of recognition target position information are obtained for the same recognition target. The estimation probabilities included in the underlying recognition output data, that is, the estimation probabilities of the existence regions included in the recognition target position information, often take different values because the positional relationship between the imaging unit 2070 and the recognition target differs from image to image.
 Therefore, in this embodiment, such multiple items of recognition target position information are stored, and the estimation probabilities included in each of the stored items are processed statistically. The statistical processing unit 2052 obtains a representative value of the group of estimation probabilities using a statistical operation on the estimation probabilities of the multiple items of recognition target position information. Using this representative value, the multiple items of recognition target position information can be corrected into one item of optimum recognition target position information (corrected recognition target position information). One example of such a correction is to obtain the arithmetic mean, weighted mean, or median of the estimation probabilities as a reference value (representative value), take the logical union of the existence regions whose estimation probability is equal to or higher than that reference value, and generate corrected recognition target position information that uses this union as the optimum existence region. Of course, it is also possible to generate one highly reliable item of recognition target position information using other statistical operations. In short, the multiple items of recognition target position information are corrected based on the result of a statistical operation on the estimation probabilities included in the corresponding recognition output data.
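 The correction example in the preceding paragraph could be sketched as follows, using the median as the representative value; the use of the shapely library for the region union is an assumption made for brevity, not part of the patent.

```python
from statistics import median
from shapely.geometry import Polygon
from shapely.ops import unary_union

def correct_position_info(observations):
    """Fuse several observations of one target into a single region.

    observations: list of (corners, probability) pairs, where corners are
    the four map-coordinate corner points of one existence region.
    """
    probabilities = [p for _, p in observations]
    reference = median(probabilities)  # the median as the representative value
    # Keep only regions estimated at least as confidently as the reference,
    # then take their logical union as the optimum existence region.
    kept = [Polygon(corners) for corners, p in observations if p >= reference]
    return unary_union(kept), reference
```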
 By using the recognition target position information thus obtained (weed position information, lodged culm position information, person position information), which indicates the map positions of the existence regions of the recognition targets (weed region, lodged culm region, person region), preset travel/work control and warning notification are carried out when weeds, lodged grain culms, or persons are recognized.
 As described above, in the present invention the estimation probability output from the image recognition module 2005 is important for the generation of the final recognition target position information. For this reason, the data processing module 2050 of this embodiment is provided with a recognition output data evaluation function that examines the reliability of the estimation probability of the recognition output data output from the image recognition module 2005 and determines recognition output data with an unreliable estimation probability to be unsuitable recognition output data.
 As shown in FIGS. 9 and 11, the recognition output data evaluation function is realized by the data storage unit 2053, the data determination unit 2054, and the data correction unit 2055. The data storage unit 2053 temporarily stores the recognition output data sequentially output from the image recognition module 2005, in time order, as a recognition output data sequence. The data determination unit 2054 compares the estimation probability of the recognition output data under determination with the estimation probabilities of the temporally preceding and following recognition output data in the sequence, and determines recognition output data whose estimation probability is lower by a predetermined level or more to be unsuitable recognition output data.
 Since the captured images used to generate the recognition output data are taken while the combine is traveling and working, sudden movements of the combine and the like can make them unclear. Furthermore, since the combine changes direction near the field ridges, the imaging direction may change abruptly, so that front-lit and back-lit shots of the same recognition target may be mixed. Such fluctuations in imaging conditions can produce recognition output data with unexpectedly low estimation probabilities, and such unsuitable recognition output data can be extracted by the recognition output data evaluation function described above.
 When the number of items of recognition output data is small, the estimation probability of recognition output data determined to be unsuitable may be corrected by interpolation from the estimation probabilities of the preceding and following recognition output data. The unsuitable recognition output data is thereby appropriately corrected and becomes usable, with the advantage that the number of data items is secured. The data correction unit 2055 performs such interpolation correction. Of course, when recognition output data determined to be unsuitable is simply discarded, interpolation correction is unnecessary and the data correction unit 2055 is omitted.
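 A minimal sketch of the determination and interpolation just described is given below, assuming a relative drop threshold against the neighboring probabilities; the threshold value is an assumption here (a concrete 50% change-rate example appears in the third embodiment).

```python
def evaluate_and_correct(probabilities, drop_threshold=0.5):
    """Flag entries whose probability drops sharply below their neighbours,
    then replace each flagged value by the mean of its neighbours.

    probabilities: time-ordered estimation probabilities of one target.
    Returns the corrected list.
    """
    corrected = list(probabilities)
    for i in range(1, len(corrected) - 1):
        before, after = corrected[i - 1], corrected[i + 1]
        neighbour_level = (before + after) / 2.0
        # Unsuitable if lower than its neighbours by the threshold or more.
        if neighbour_level - corrected[i] >= drop_threshold * neighbour_level:
            corrected[i] = neighbour_level  # interpolation correction
    return corrected
```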
 Each item of recognition target position information generated by the recognition target position information generation unit 2051 can be mapped as shown in FIG. 12 for a visually intuitive display. FIG. 12 exemplifies a weed map in which weed position information is mapped. When the weed position information includes weed existence regions with different estimation probabilities, the weed existence regions can also be displayed divided into patterns by predetermined ranges of the estimation probability value, as shown in FIG. 12.
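 For illustration, assigning a display pattern to each existence region by probability range might look like the following sketch; the bin boundaries and pattern names are hypothetical.

```python
def display_pattern(probability: float) -> str:
    """Assign a map display pattern from a predetermined probability range."""
    if probability >= 0.8:
        return "dense_hatch"   # high-confidence weed presence
    if probability >= 0.5:
        return "light_hatch"   # medium confidence
    return "dotted"            # low confidence
```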
[Third Embodiment]
 As shown in FIG. 13, in the combine according to the third embodiment, a reaping unit 3002 is connected to the front of a machine body 3001 equipped with a pair of left and right crawler travel devices 3010 so as to be raised and lowered about a horizontal axis X. At the rear of the machine body 3001, a threshing device 3011 and a grain tank 3012 for storing grain are provided side by side in the machine body width direction. A cabin 3014 covering the boarding operation section is provided at the front right of the machine body 3001, and a drive engine 3015 is provided below the cabin 3014.
 As shown in FIG. 13, the threshing device 3011 internally receives the harvested grain culms cut by the reaping unit 3002 and conveyed rearward, and threshes their tip sides with a threshing drum 3113 while the culm bases are clamped and conveyed by a threshing feed chain 3111 and a clamping rail 3112. Grain sorting of the threshed material is carried out in a sorting section provided below the threshing drum 3113, and the grain sorted there is conveyed to the grain tank 3012 and stored. Although not described in detail, a grain discharging device 3013 for discharging the grain stored in the grain tank 3012 to the outside is also provided.
 The reaping unit 3002 is provided with a plurality of raising devices 3021 that raise lodged planted grain culms, a clipper-type cutting device 3022 that cuts the culm bases of the raised planted grain culms, a grain culm conveying device 3023, and the like. The grain culm conveying device 3023 conveys the harvested grain culms, whose culm bases have been cut, toward the starting end of the threshing feed chain 3111 of the threshing device 3011 located at the rear of the machine body, while gradually changing them from an upright posture to a lying posture.
 The grain culm conveying device 3023 includes a merging conveying section 3231 that conveys the multiple rows of harvested grain culms cut by the cutting device 3022 while gathering them toward the center in the cutting width direction, a culm base clamping and conveying device 3232 that clamps the culm bases of the gathered harvested culms and conveys them rearward, a tip locking and conveying device 3233 that locks and conveys the tip side of the harvested culms, a supply conveying device 3234 that guides the culm bases of the harvested culms from the terminal end of the culm base clamping and conveying device 3232 toward the threshing feed chain 3111, and the like.
 An imaging unit 3070 equipped with a color camera is provided at the front end of the ceiling of the cabin 3014. In this embodiment, the front-rear extent of the field of view of the imaging unit 3070 reaches from the front end region of the reaping unit 3002 almost to the horizon. The extent of the field of view in the width direction ranges from about 10 m to several tens of meters. The captured images sequentially acquired over time by the imaging unit 3070 are converted into image data and sent to the control system of the combine.
 The imaging unit 3070 photographs the field during harvesting work, and various objects are present in the field as imaging subjects. The control system of the combine has a function of recognizing specific objects in the image data sent from the imaging unit 3070 as recognition targets. As such recognition targets, FIG. 13 schematically shows a group of normal planted grain culms indicated by symbol Z0, a group of weeds grown taller than the planted grain culms indicated by symbol Z1, a group of lodged grain culms indicated by symbol Z2, and a person indicated by symbol Z3.
 A satellite positioning module 3080 is also provided on the ceiling of the cabin 3014. The satellite positioning module 3080 includes a satellite antenna for receiving GNSS (global navigation satellite system) signals (including GPS signals). To complement the satellite navigation performed by the satellite positioning module 3080, an inertial navigation unit incorporating a gyro acceleration sensor and a magnetic direction sensor is built into the satellite positioning module 3080. Of course, the inertial navigation unit may be placed elsewhere. In FIG. 13 the satellite positioning module 3080 is drawn at the rear of the ceiling of the cabin 3014 for convenience of illustration, but it is preferably disposed at a position near the machine body center at the front end of the ceiling, for example as close as possible to a position directly above the left-right center of the cutting device 3022.
 FIG. 14 shows a functional block diagram of the control system of the combine. The control system of this embodiment is composed of a large number of electronic control units called ECUs, various operating devices, sensor groups and switch groups, and a wiring network such as an in-vehicle LAN that carries data between them. The notification device 3091 is a device for notifying the driver and others of the work travel state and various warnings, and is a buzzer, lamp, speaker, display, or the like. The communication unit 3092 is used by this combine's control system to exchange data with a cloud computer system 3100 installed at a remote location or with a mobile communication terminal 3200. The mobile communication terminal 3200 here is a tablet computer operated by a supervisor (including the driver) at the work travel site. The control unit 3006 is the core element of this control system and is shown as a collection of multiple ECUs. Positioning data from the satellite positioning module 3080 and image data from the imaging unit 3070 are input into the control unit 3006 through the wiring network.
 The control unit 3006 includes an output processing unit 3006B and an input processing unit 3006A as input/output interfaces. The output processing unit 3006B is connected to a vehicle travel equipment group 3007A and a work device equipment group 3007B. The vehicle travel equipment group 3007A includes control devices related to vehicle travel, such as an engine control device, a transmission control device, a braking control device, and a steering control device. The work device equipment group 3007B includes power control devices and the like for the reaping unit 3002, the threshing device 3011, the grain discharging device 3013, and the grain culm conveying device 3023.
 A traveling system detection sensor group 3008A, a work system detection sensor group 3008B, and the like are connected to the input processing unit 3006A. The traveling system detection sensor group 3008A includes sensors that detect the states of an engine speed adjuster, an accelerator pedal, a brake pedal, a gearshift operator, and the like. The work system detection sensor group 3008B includes sensors that detect the device states of the reaping unit 3002, the threshing device 3011, the grain discharging device 3013, and the grain culm conveying device 3023, as well as the states of the grain culms and grains.
 The control unit 3006 includes a work travel control module 3060, an image recognition module 3005, a data processing module 3050, a machine body position calculation unit 3066, a notification unit 3067, and a travel trajectory calculation unit 3068.
 The notification unit 3067 generates notification data based on commands and the like received from the functional units of the control unit 3006, and supplies the notification data to the notification device 3091. The machine body position calculation unit 3066 calculates the machine body position, which is the map coordinates (or field coordinates) of the machine body 3001, based on the positioning data sequentially sent from the satellite positioning module 3080. The travel trajectory calculation unit 3068 calculates the travel trajectory of the machine body 3001 from the successive machine body positions calculated by the machine body position calculation unit 3066. Furthermore, the travel trajectory calculation unit 3068 can append to the travel trajectory a predicted travel trajectory immediately ahead of the machine body 3001, based on the latest machine body position and on the steering angle of the steering control device or the target travel route at that time.
 The combine of this embodiment can travel both automatically (automatic steering) and manually (manual steering). In addition to a travel control unit 3061 and a work control unit 3062, the work travel control module 3060 includes an automatic work travel command unit 3063 and a travel route setting unit 3064. A travel mode switch (not shown) for selecting either an automatic travel mode with automatic steering or a manual steering mode with manual steering is provided in the cabin 3014. By operating this switch, the operator can shift from manual steering travel to automatic steering travel, or from automatic steering travel to manual steering travel.
 The travel control unit 3061 has an engine control function, a steering control function, a vehicle speed control function, and the like, and supplies travel control signals to the vehicle travel equipment group 3007A. The work control unit 3062 supplies work control signals to the work device equipment group 3007B in order to control the movements of the reaping unit 3002, the threshing device 3011, the grain discharging device 3013, the grain culm conveying device 3023, and the like.
 When the manual steering mode is selected, the travel control unit 3061 generates control signals based on the driver's operations and controls the vehicle travel equipment group 3007A. When the automatic steering mode is selected, the travel control unit 3061 controls the steering-related devices and the vehicle-speed-related devices of the vehicle travel equipment group 3007A based on automatic travel commands given by the automatic work travel command unit 3063.
 The travel route setting unit 3064 deploys, into memory, a travel route for automatic travel created by any of the control unit 3006, the mobile communication terminal 3200, the cloud computer system 3100, and the like. The travel route deployed in memory is used sequentially as the target travel route for automatic travel. Even during manual travel, this travel route can be used as guidance for the combine to travel along it.
 More specifically, the automatic work travel command unit 3063 generates an automatic steering command and a vehicle speed command and supplies them to the travel control unit 3061. The automatic steering command is generated so as to eliminate the azimuth deviation and the positional deviation between the travel route set by the travel route setting unit 3064 and the machine body position calculated by the machine body position calculation unit 3066. The vehicle speed command is generated based on a previously set vehicle speed value. Furthermore, the automatic work travel command unit 3063 gives the work control unit 3062 work device operation commands according to the machine body position and the travel state of the machine.
 Image data of the captured images sequentially acquired over time by the imaging unit 3070 is input into the image recognition module 3005. The image recognition module 3005 estimates the existence regions in the captured image where recognition targets are present, and outputs, as its recognition result, recognition output data that includes the existence regions and the estimation probabilities with which those existence regions were estimated. The image recognition module 3005 is built using neural network technology employing deep learning.
 The flow of generation of recognition output data by the image recognition module 3005 is shown in FIGS. 15 and 16. Pixel values of RGB image data are input into the image recognition module 3005 as input values. In the example of FIG. 15, the recognition targets to be estimated are weeds, lodged grain culms, and persons. Accordingly, the recognition output data as the recognition result includes the existence region of weeds (hereinafter referred to as the weed region) and its estimation probability, the existence region of lodged grain culms (hereinafter referred to as the lodged culm region) and its estimation probability, and the existence region of a person (hereinafter referred to as the person region) and its estimation probability.
 In FIG. 15 the estimation result is shown schematically: the weed region is indicated by a rectangular frame labeled F1, the lodged culm region by a rectangular frame labeled F2, and the person region by a rectangular frame labeled F3. Each region is linked to its estimation probability. The weed region, the lodged culm region, and the person region are each defined by four corner points, and the coordinate positions of those four corner points on the captured image are also included in the estimation result. Of course, if a recognition target is not estimated, no existence region is output for it, and its estimation probability is zero.
 The existence regions and estimation probabilities of the recognition targets output from the image recognition module 3005 as recognition output data are important for the generation of the final recognition target position information. However, when shooting with a camera mounted on a combine working its way across a field, the captured image may become unsuitable due to sudden shaking of the imaging unit 3070, momentary backlighting, dust crossing the field of view of the imaging unit 3070, and the like. In such a case, recognition output data with an unexpectedly low estimation probability is output, or the recognition target cannot be estimated at all and recognition output data with an estimation probability of zero is output. For this reason, the data processing module 3050, which processes the recognition output data output from the image recognition module 3005, is provided, as a preprocessing function, with a recognition output data evaluation function that examines the reliability of the recognition output data output from the image recognition module 3005 and determines unreliable recognition output data to be unsuitable recognition output data.
 As shown in FIG. 16, this recognition output data evaluation function is realized by the data storage unit 3053, the data determination unit 3054, and the data correction unit 3055 of the data processing module 3050. The data storage unit 3053 temporarily stores the recognition output data sequentially output from the image recognition module 3005, in time order, as a recognition output data sequence. Based on the travel trajectory calculated by the travel trajectory calculation unit 3068, the data determination unit 3054 predicts the range (hereinafter also referred to as the prediction range) in which an existence region estimated by the image recognition module 3005 in the previous captured image should be located in the next captured image. The data determination unit 3054 then compares the estimation probability of the existence region that overlaps the prediction range in the next captured image with the estimation probability of the existence region in the previous captured image. If, in this comparison, the two estimation probabilities differ by a predetermined allowable amount or more, the recognition output data based on the next captured image is determined to be unsuitable recognition output data. Alternatively, a configuration may be adopted in which recognition output data whose estimation probability is at or below a predetermined value (for example, 0.5 or less) is determined to be unsuitable recognition output data.
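 One possible shape of this determination, as a sketch only: the prediction range is approximated by shifting the previous frame's rectangle by the machine body displacement between frames, and the probability comparison follows the allowable-amount rule above. The geometry helpers, the image-coordinate displacement, and the allowable amount are all assumptions.

```python
def shift_region(corners, displacement):
    """Predict where a region from the previous image should appear next,
    by shifting its corners by the body displacement between frames
    (assumed already projected into image coordinates)."""
    dx, dy = displacement
    return [(x + dx, y + dy) for x, y in corners]

def overlaps(region_a, region_b):
    """Axis-aligned overlap test between two rectangles given as corners."""
    ax = [p[0] for p in region_a]; ay = [p[1] for p in region_a]
    bx = [p[0] for p in region_b]; by = [p[1] for p in region_b]
    return not (max(ax) < min(bx) or max(bx) < min(ax)
                or max(ay) < min(by) or max(by) < min(ay))

def is_unsuitable(prev_region, prev_prob, next_region, next_prob,
                  displacement, allowable=0.5):
    """Determine whether the next frame's recognition output is unsuitable."""
    prediction_range = shift_region(prev_region, displacement)
    if not overlaps(prediction_range, next_region):
        return True  # region is not where the travel trajectory predicts
    # Unsuitable when the probability changed by the allowable amount or more
    # (expressed here as a change rate relative to the previous probability).
    return abs(prev_prob - next_prob) >= allowable * prev_prob
```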
 The data processing for determining this unsuitable recognition output data is described below with reference to FIG. 17. In this example, the estimated recognition target is a weed.
(Step 01) From a captured image acquired by the imaging unit 3070 (#01a), the image recognition module 3005 recognizes a weed region enclosed in the rectangular frame labeled F1, with an estimation probability of 0.75 (#01b). An estimation probability above 0.7 is regarded as highly reliable. Next, based on the position of the rectangular frame in the captured image and the travel trajectory calculated by the travel trajectory calculation unit 3068, the prediction range PA in which the weed region should be located in the next captured image is calculated (#01c).
(Step 02) Next, from the captured image acquired by the imaging unit 3070 (#02a), the image recognition module 3005 recognizes the weed region enclosed in the rectangular frame labeled F1, with an estimation probability of 0.8 (#02b). The recognized weed region overlaps the prediction range PA. Based on the position of this rectangular frame and the travel trajectory, the prediction range PA in which the weed region should be located in the next captured image is calculated (#02c).
(Step 03) Further, based on the next captured image (#03a), the image recognition module 3005 estimates the existence region of weeds, but the estimation probability is 0.2 (#03b). When the estimation probability is lower than 0.5 like this, the reliability of the position of the rectangular frame indicating the weed region is also low. Therefore, when calculating the prediction range PA here, it is preferable to use the position of the rectangular frame from step 02 together with the travel trajectory, instead of the rectangular frame position of this low-reliability recognition output data (#03c).
(Step 04) Further, based on the next captured image (#04a), the image recognition module 3005 estimates the existence region of weeds; the weed region enclosed in the rectangular frame labeled F1 is recognized, with an estimation probability of 0.9 (#04b). The recognized weed region overlaps the prediction range PA.
 When the image recognition module 3005 outputs estimation output data as described above, the data determination unit 3054 determines the estimation output data of step 03 to be unsuitable recognition output data, because its estimation probability has dropped from the previous estimation probability of 0.8 to 0.2, a difference that is at or above the predetermined allowable amount (for example, a change rate of 50% or more).
 When the data determination unit 3054 determines the estimation output data of step 03 to be unsuitable recognition output data, the data correction unit 3055 computes an interpolated estimation probability for it using the estimation probabilities of the proper recognition output data before and after it. Here the arithmetic mean is used as the simplest interpolation operation, giving an interpolated estimation probability of 0.85. By replacing the estimation probability of the unsuitable recognition output data with the obtained interpolated estimation probability, that data can be used as proper recognition output data in subsequent processing.
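 Running the probabilities from steps 01-04 through the evaluate_and_correct sketch from the second embodiment reproduces the numbers above (evaluate_and_correct is the hypothetical helper defined there, not part of the patent):

```python
# Estimation probabilities from steps 01-04: 0.75, 0.8, 0.2, 0.9.
sequence = [0.75, 0.8, 0.2, 0.9]
print(evaluate_and_correct(sequence, drop_threshold=0.5))
# -> [0.75, 0.8, 0.85, 0.9]; step 03 (0.2) is flagged as unsuitable and
#    replaced by the arithmetic mean of its neighbours, 0.85.
```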
 As described above, in this embodiment, when the number of items of recognition output data is small, the estimation probability of recognition output data determined to be unsuitable can be corrected by interpolation from the estimation probabilities of the preceding and following recognition output data. The unsuitable recognition output data is thereby appropriately corrected and becomes usable, with the advantage that the number of data items is secured. Of course, when recognition output data determined to be unsuitable is simply discarded, interpolation correction is unnecessary and the data correction unit 3055 is omitted.
 The data processing module 3050 of this embodiment further has a function of generating recognition target position information from the recognition output data evaluated by the recognition output data evaluation function described above. This function is realized by the recognition target position information generation unit 3051 and the statistical processing unit 3052 of the data processing module 3050.
 Recognition target position information is information indicating the position of a recognition target on the map. The recognition target position information generation unit 3051 generates recognition target position information from the machine body position at the time the captured image was acquired and the recognition output data. The map positions of the recognition targets (weeds, lodged grain culms, persons) included in the recognition output data are obtained by converting the coordinate positions on the captured image (camera coordinate positions) of the four corner points of the rectangles indicating their existence regions (weed region, lodged culm region, person region) into coordinates on the map.
 The imaging unit 3070 acquires captured images at a predetermined time interval, for example every 0.5 seconds, and inputs the image data into the image recognition module 3005, so the image recognition module 3005 also outputs recognition output data at the same time interval. Consequently, when a recognition target has been within the field of view of the imaging unit 3070, multiple items of recognition output data include an existence region for the same recognition target. As a result, multiple items of recognition target position information are obtained for the same recognition target. The estimation probabilities included in the underlying recognition output data, that is, the estimation probabilities of the existence regions included in the recognition target position information, often take different values because the positional relationship between the imaging unit 3070 and the recognition target differs from image to image.
 Therefore, in this embodiment, as shown in FIG. 16, such multiple items of recognition target position information are stored, and the estimation probabilities included in each of the stored items are processed statistically. The statistical processing unit 3052 obtains a representative value of the group of estimation probabilities using a statistical operation on the estimation probabilities of the multiple items of recognition target position information. Using this representative value, the multiple items of recognition target position information can be corrected into one item of optimum recognition target position information (corrected recognition target position information). One example of such a correction is to obtain the arithmetic mean, weighted mean, or median of the estimation probabilities as a reference value (representative value), take the logical union of the existence regions whose estimation probability is equal to or higher than that reference value, and generate corrected recognition target position information that uses this union as the optimum existence region. Of course, it is also possible to generate one highly reliable item of recognition target position information using other statistical operations. In short, the multiple items of recognition target position information are corrected based on the result of a statistical operation on the estimation probabilities included in the corresponding recognition output data.
 By using the recognition target position information thus obtained (weed position information, lodged culm position information, person position information), which indicates the map positions of the existence regions of the recognition targets (weed region, lodged culm region, person region), preset travel/work control and warning notification are carried out when weeds, lodged grain culms, or persons are recognized.
 Each item of recognition target position information generated by the recognition target position information generation unit 3051 can be mapped as shown in FIG. 18 for a visually intuitive display. FIG. 18 exemplifies a weed map in which weed position information is mapped. When the weed position information includes weed existence regions with different estimation probabilities, the weed existence regions can also be displayed divided into patterns by predetermined ranges of the estimation probability value, as shown in FIG. 18.
 The configurations disclosed in each of the embodiments above (including the other embodiments below; the same applies hereinafter) can be applied in combination with the configurations disclosed in the other embodiments as long as no contradiction arises. The embodiments disclosed in this specification are examples; embodiments of the present invention are not limited to them and can be modified as appropriate without departing from the object of the present invention.
[Other Embodiments]
(1) In the embodiments described above, a group of weeds grown taller than the planted grain culms was set as the recognition target recognized by the image recognition module (5, 2005, 3005), but other recognition targets, for example groups of lodged grain culms or persons, may also be set. In that case, the work travel control module (60, 2060, 3060) is configured to perform the necessary control in response to recognition of the lodged culm group or the person.
(2) In the embodiments described above, the image recognition module (5, 2005, 3005) is built using deep-learning neural network technology. Instead, an image recognition module (5, 2005, 3005) built using other machine learning techniques may be employed.
(3) In the embodiments described above, the image recognition module (5, 2005, 3005), the data processing module (50, 2050, 3050), and the weed position calculation unit 68 were incorporated in the control unit (6, 2006, 3006) of the combine, but some or all of them can be built in a control unit independent of the combine, for example the mobile communication terminal (200, 2200, 3200).
(4) The functional units shown in FIGS. 4, 9, 14, and 16 are divided mainly for purposes of explanation. In practice, each functional unit may be integrated with other functional units or further divided into multiple functional units.
 The present invention is applicable not only to combines that harvest rice, wheat, and the like, but also to combines that harvest other crops such as corn, and to harvesters that harvest carrots and the like.
3: Handling depth adjustment mechanism
5: Image recognition module
6: Control unit
23: Grain culm conveying device
236: Adjustment electric motor (motor)
237: Operation rod
30: Culm length detection device
34: Detection switch
35: Detection switch
50: Data processing module
51: Weed position information generation unit
52: Statistical processing unit
60: Work travel control module (work travel control unit)
61: Travel control unit
63: Automatic work travel command unit
620: Handling depth control unit
66: Machine body position calculation unit
68: Weed position calculation unit
70: Imaging unit
80: Satellite positioning module
91: Notification device
92: Communication unit
2051: Recognition target position information generation unit
2053: Data storage unit
2054: Data determination unit
2055: Data correction unit

Claims (13)

1. A harvester that harvests planted grain culms while traveling through a field, comprising:
    a reaping unit that reaps the planted grain culms from the field;
    a grain culm conveying device that conveys the harvested grain culms from the reaping unit toward a threshing device;
    a handling depth adjustment mechanism provided in the grain culm conveying device;
    a handling depth control unit that performs handling depth adjustment control based on the length of the harvested grain culms using the handling depth adjustment mechanism;
    a machine body position calculation unit that calculates a machine body position, which is the map coordinates of the machine body, based on positioning data from a satellite positioning module;
    an imaging unit that is provided on the machine body and photographs the field during harvesting work;
    an image recognition module into which image data of captured images sequentially acquired over time by the imaging unit is input, and which estimates a weed growth region in the captured image and outputs recognition output data indicating the estimated weed growth region;
    a weed position information generation unit that generates weed position information indicating the position of the weed growth region on a map from the machine body position at the time the captured image was acquired and the recognition output data; and
    a work travel control unit that calculates the timing at which weeds reaped in the weed growth region pass through the handling depth adjustment mechanism and executes weed entry control while the weeds pass through the handling depth adjustment mechanism.
2. A harvester that harvests planted grain culms while traveling through a field, comprising:
    a reaping unit that reaps the planted grain culms from the field;
    a grain culm conveying device that conveys the harvested grain culms from the reaping unit toward a threshing device;
    a handling depth adjustment mechanism provided in the grain culm conveying device;
    a handling depth control unit that performs handling depth adjustment control based on the length of the harvested grain culms using the handling depth adjustment mechanism;
    a machine body position calculation unit that calculates a machine body position, which is the map coordinates of the machine body, based on positioning data from a satellite positioning module;
    an imaging unit that is provided on the machine body and photographs the field during harvesting work;
    an image recognition module into which image data of captured images sequentially acquired over time by the imaging unit is input, and which estimates a weed growth region in the captured image and outputs recognition output data indicating the estimated weed growth region;
    a weed position information generation unit that generates weed position information indicating the position of the weed growth region on a map from the machine body position at the time the captured image was acquired and the recognition output data; and
    a work travel control unit that executes weed entry control while the reaping unit passes through the weed growth region.
3. The harvester according to claim 2, wherein the timing at which the reaping unit passes through the weed growth region is determined based on the weed position information and the machine body position calculated by the machine body position calculation unit.
4. The harvester according to any one of claims 1 to 3, wherein the handling depth adjustment control is suspended by execution of the weed entry control.
5. The harvester according to any one of claims 1 to 3, wherein the vehicle speed is reduced by execution of the weed entry control.
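Claims 4 and 5 name two concrete reactions while weed entry control is active; as a toy sketch (the slow-down factor is an arbitrary illustrative value, not from the patent):

    def apply_weed_entry_control(active, nominal_speed_mps, slow_factor=0.6):
        depth_adjustment_enabled = not active  # claim 4: suspend depth adjustment control
        target_speed = nominal_speed_mps * (slow_factor if active else 1.0)  # claim 5
        return depth_adjustment_enabled, target_speed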
6. A harvester that harvests crops while traveling in a field, comprising:
    A machine body position calculation unit that calculates a machine body position, which is the map coordinates of the machine body, based on positioning data from a satellite positioning module;
    An imaging unit provided on the machine body that images the field during harvesting work;
    An image recognition module to which image data of captured images successively acquired over time by the imaging unit is input, and which estimates an existence region where a recognition object exists in the captured image and outputs recognition output data including the existence region and the estimated probability with which the existence region was estimated; and
    A recognition object position information generation unit that generates recognition object position information indicating the position of the recognition object on a map from the machine body position at the time the captured image was acquired and the recognition output data.
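The claims do not specify how an existence region in the image is converted into a position on the map; the sketch below assumes a flat field and a pre-calibrated homography from image pixels to ground coordinates in the machine-body frame, then rotates and translates the result by the machine body pose (everything here is an assumption, not the patent's method):

    import numpy as np

    def region_to_map(region_px, H_img_to_body, body_xy, heading_rad):
        """region_px: (N, 2) pixel vertices of an existence region;
        H_img_to_body: 3x3 homography, pixels -> ground metres in the body frame."""
        pts = np.hstack([region_px, np.ones((len(region_px), 1))])
        local = (H_img_to_body @ pts.T).T
        local = local[:, :2] / local[:, 2:3]          # perspective divide
        c, s = np.cos(heading_rad), np.sin(heading_rad)
        rot = np.array([[c, -s], [s, c]])             # body frame -> map frame
        return local @ rot.T + np.asarray(body_xy)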
7. The harvester according to claim 6, wherein the estimated probability of the recognition object decreases the farther the recognition object is located from the imaging unit in the captured image.
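Claim 7 only requires that the estimated probability fall off with distance from the imaging unit; the exponential form and the 10 m scale below are one illustrative choice, not the patent's formula:

    import math

    def attenuated_probability(p_raw, distance_m, scale_m=10.0):
        # Monotonically decreasing in distance, never exceeding the raw probability.
        return p_raw * math.exp(-distance_m / scale_m)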
8. The harvester according to claim 6 or 7, wherein a plurality of pieces of the recognition object position information are stored, and the stored pieces of recognition object position information are corrected based on the result of a statistical calculation of the estimated probabilities included in the corresponding recognition output data.
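One plausible reading of the statistical correction in claim 8, assuming observations are accumulated on a map grid and a cell is retained only if the mean of its estimated probabilities clears a threshold (the grid resolution and threshold are assumptions):

    from collections import defaultdict
    from statistics import mean

    def correct_position_info(observations, cell_m=1.0, keep_above=0.5):
        """observations: iterable of ((x, y), estimated_probability) pairs."""
        cells = defaultdict(list)
        for (x, y), p in observations:
            cells[(round(x / cell_m), round(y / cell_m))].append(p)
        # Retain only cells whose mean estimated probability clears the threshold.
        return {c: mean(ps) for c, ps in cells.items() if mean(ps) >= keep_above}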
9. The harvester according to any one of claims 6 to 8, further comprising:
    A data storage unit that temporarily stores, over time, the recognition output data successively output from the image recognition module as a recognition output data sequence; and
    A data judgment unit that judges, as inappropriate recognition output data, recognition output data whose estimated probability is lower by a predetermined level or more than the estimated probabilities of the temporally preceding and following recognition output data in the recognition output data sequence.
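The judgment of claim 9 compares each sample of the recognition output data sequence against its temporal neighbours; a minimal sketch, with the margin as an assumed parameter:

    def flag_inappropriate(probabilities, margin=0.3):
        """Flag samples whose estimated probability sits `margin` or more below
        the mean of the temporally preceding and following samples."""
        flags = [False] * len(probabilities)
        for i in range(1, len(probabilities) - 1):
            neighbours = (probabilities[i - 1] + probabilities[i + 1]) / 2.0
            if neighbours - probabilities[i] >= margin:
                flags[i] = True
        return flags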
10. The harvester according to any one of claims 6 to 9, wherein the recognition object is at least one of a person, lodged grain culms, weeds, and ridges.
11. A harvester that harvests crops while traveling in a field, comprising:
    A machine body position calculation unit that calculates a machine body position, which is the map coordinates of the machine body, based on positioning data from a satellite positioning module;
    A travel trajectory calculation unit that calculates a travel trajectory of the machine body from the machine body position;
    An imaging unit provided on the machine body that images the field during harvesting work;
    An image recognition module to which image data of captured images successively acquired over time by the imaging unit is input, and which estimates an existence region where a recognition object exists in the captured image and outputs recognition output data including the existence region and the estimated probability with which the existence region was estimated; and
    A data judgment unit that, based on the travel trajectory, predicts the range in which the existence region in the previous captured image should be located in the next captured image, and judges the recognition output data based on the next captured image as inappropriate recognition output data when the estimated probability of the existence region overlapping that range in the next captured image differs from the estimated probability of the existence region in the previous captured image by a predetermined allowance or more.
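The claim 11 judgment can be sketched as follows, assuming existence regions are axis-aligned boxes and the machine displacement between frames has already been projected into image-aligned coordinates (the boxes, the overlap test, and the tolerance are all illustrative assumptions):

    def shift_box(box, dx, dy):
        xmin, ymin, xmax, ymax = box
        return (xmin + dx, ymin + dy, xmax + dx, ymax + dy)

    def boxes_overlap(a, b):
        return not (a[2] < b[0] or b[2] < a[0] or a[3] < b[1] or b[3] < a[1])

    def next_frame_is_inappropriate(prev_det, next_dets, displacement, tol=0.25):
        """prev_det: (box, probability) from the previous image;
        next_dets: list of (box, probability) from the next image;
        displacement: (dx, dy) shift predicted from the travel trajectory."""
        predicted = shift_box(prev_det[0], *displacement)
        for box, prob in next_dets:
            if boxes_overlap(predicted, box) and abs(prob - prev_det[1]) >= tol:
                return True   # judged as inappropriate recognition output data
        return False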
12. The harvester according to claim 11, further comprising a data correction unit that replaces the estimated probability of the inappropriate recognition output data with an interpolated estimated probability obtained based on the estimated probabilities of the recognition output data temporally preceding and following the inappropriate recognition output data.
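Claim 12's interpolation can be as simple as replacing a flagged sample with the mean of its temporal neighbours; a sketch:

    def interpolate_flagged(probabilities, flags):
        """Replace each flagged sample with the mean of its temporal neighbours
        (one-step linear interpolation)."""
        repaired = list(probabilities)
        for i, bad in enumerate(flags):
            if bad and 0 < i < len(probabilities) - 1:
                repaired[i] = (probabilities[i - 1] + probabilities[i + 1]) / 2.0
        return repaired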
13. The harvester according to claim 11 or 12, wherein the recognition object is at least one of a person, lodged grain culms, weeds, and ridges.
PCT/JP2018/019444 2017-06-23 2018-05-21 Harvester WO2018235486A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
KR1020197031226A KR102589076B1 (en) 2017-06-23 2018-05-21 harvest
CN201880029905.8A CN110582198B (en) 2017-06-23 2018-05-21 Harvester

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
JP2017-123439 2017-06-23
JP2017123440A JP7068781B2 (en) 2017-06-23 2017-06-23 Harvester
JP2017123441A JP6765349B2 (en) 2017-06-23 2017-06-23 Harvester
JP2017-123441 2017-06-23
JP2017-123440 2017-06-23
JP2017123439A JP6854713B2 (en) 2017-06-23 2017-06-23 combine

Publications (1)

Publication Number Publication Date
WO2018235486A1 2018-12-27

Family

ID=64735628

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/019444 WO2018235486A1 (en) 2017-06-23 2018-05-21 Harvester

Country Status (3)

Country Link
KR (1) KR102589076B1 (en)
CN (1) CN110582198B (en)
WO (1) WO2018235486A1 (en)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210243951A1 (en) * 2020-02-06 2021-08-12 Deere & Company Machine control using a predictive map


Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS5832767A (en) * 1981-08-21 1983-02-25 新島 伸児 Lower leg mounting tool
JP2510564B2 (en) * 1987-04-03 1996-06-26 株式会社日立製作所 Image feature detection method
JPH08172867A (en) 1994-12-26 1996-07-09 Kubota Corp Threshing depth control device of combine harvester
JPH11137062A (en) 1997-11-10 1999-05-25 Yanmar Agricult Equip Co Ltd Control device of conventional combine harvester
JPH11155340A (en) 1997-11-25 1999-06-15 Yanmar Agricult Equip Co Ltd General-purpose combine
JP2006121952A (en) 2004-10-27 2006-05-18 Iseki & Co Ltd Combine harvester
CN103609101A (en) * 2011-06-16 2014-02-26 爱信精机株式会社 Vehicle periphery monitoring device
US9322629B2 (en) * 2011-11-22 2016-04-26 Precision Planting Llc Stalk sensor apparatus, systems, and methods
JP6116173B2 (en) 2012-09-26 2017-04-19 株式会社クボタ Farm management system
US9639998B2 (en) * 2012-09-26 2017-05-02 Kubota Corporation Ground work vehicle, ground work vehicle management system, and ground work information display method
US9074571B1 (en) * 2013-12-17 2015-07-07 Ford Global Technologies, Llc Vehicle and method of controlling an engine auto-stop and restart
US9836059B2 (en) * 2014-03-26 2017-12-05 Yanmar Co., Ltd. Autonomous travel working vehicle
JP6566833B2 (en) * 2015-10-20 2019-08-28 ヤンマー株式会社 Mapping system, mapping apparatus and computer program

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10210846A (en) * 1997-01-28 1998-08-11 Iseki & Co Ltd Threshing depth controller for combine or the like
JP2000354416A (en) * 1999-06-16 2000-12-26 Yanmar Agricult Equip Co Ltd Harvested quantity detector for combine harvester
JP2004133498A (en) * 2002-10-08 2004-04-30 Sakae Shibusawa Precise agricultural method information management system
JP2009284808A (en) * 2008-05-29 2009-12-10 Iseki & Co Ltd Grain culm feed control system for combine harvester
JP2010252722A (en) * 2009-04-27 2010-11-11 Kubota Corp Combine harvester
JP2012198688A (en) * 2011-03-18 2012-10-18 Fujitsu Ltd Crop image processing program, crop image processing method and crop image processing system
JP2016049102A (en) * 2014-08-29 2016-04-11 株式会社リコー Farm field management system, farm field management method, and program
JP2016086668A (en) * 2014-10-30 2016-05-23 井関農機株式会社 combine

Cited By (44)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11672203B2 (en) 2018-10-26 2023-06-13 Deere & Company Predictive map generation and control
US11653588B2 (en) 2018-10-26 2023-05-23 Deere & Company Yield map generation and control system
US11178818B2 (en) 2018-10-26 2021-11-23 Deere & Company Harvesting machine control system with fill level processing based on yield data
US11240961B2 (en) 2018-10-26 2022-02-08 Deere & Company Controlling a harvesting machine based on a geo-spatial representation indicating where the harvesting machine is likely to reach capacity
US11589509B2 (en) 2018-10-26 2023-02-28 Deere & Company Predictive machine characteristic map generation and control system
US10986778B2 (en) 2018-10-31 2021-04-27 Deere & Company Weed seed devitalizer control
US11206763B2 (en) 2018-10-31 2021-12-28 Deere & Company Weed seed based harvester working member control
CN110070565A (en) * 2019-03-12 2019-07-30 杭州电子科技大学 A kind of ship trajectory predictions method based on image superposition
US11829112B2 (en) 2019-04-10 2023-11-28 Deere & Company Machine control using real-time model
US11467605B2 (en) 2019-04-10 2022-10-11 Deere & Company Zonal machine control
US11778945B2 (en) 2019-04-10 2023-10-10 Deere & Company Machine control using real-time model
US11234366B2 (en) 2019-04-10 2022-02-01 Deere & Company Image selection for machine control
US11650553B2 (en) 2019-04-10 2023-05-16 Deere & Company Machine control using real-time model
US11079725B2 (en) 2019-04-10 2021-08-03 Deere & Company Machine control using real-time model
EP3959955A4 (en) * 2019-04-25 2023-05-10 Kubota Corporation Agricultural machine such as harvester
EP3991531A4 (en) * 2019-06-27 2023-07-26 Kubota Corporation Obstacle detection system, agricultural work vehicle, obstacle detection program, recording medium on which obstacle detection program is recorded, and obstacle detection method
US11510364B2 (en) 2019-07-19 2022-11-29 Deere & Company Crop residue based field operation adjustment
US11957072B2 (en) 2020-02-06 2024-04-16 Deere & Company Pre-emergence weed detection and mitigation system
US11641800B2 (en) 2020-02-06 2023-05-09 Deere & Company Agricultural harvesting machine with pre-emergence weed detection and mitigation system
US11477940B2 (en) 2020-03-26 2022-10-25 Deere & Company Mobile work machine control based on zone parameter modification
JP7423440B2 (en) 2020-06-23 2024-01-29 株式会社クボタ harvester
US11675354B2 (en) 2020-10-09 2023-06-13 Deere & Company Machine control using a predictive map
US11849672B2 (en) 2020-10-09 2023-12-26 Deere & Company Machine control using a predictive map
US11635765B2 (en) 2020-10-09 2023-04-25 Deere & Company Crop state map generation and control system
US11711995B2 (en) 2020-10-09 2023-08-01 Deere & Company Machine control using a predictive map
US11727680B2 (en) 2020-10-09 2023-08-15 Deere & Company Predictive map generation based on seeding characteristics and control
US11592822B2 (en) 2020-10-09 2023-02-28 Deere & Company Machine control using a predictive map
US11474523B2 (en) 2020-10-09 2022-10-18 Deere & Company Machine control using a predictive speed map
US11825768B2 (en) 2020-10-09 2023-11-28 Deere & Company Machine control using a predictive map
US11845449B2 (en) 2020-10-09 2023-12-19 Deere & Company Map generation and control system
US11844311B2 (en) 2020-10-09 2023-12-19 Deere & Company Machine control using a predictive map
US11849671B2 (en) 2020-10-09 2023-12-26 Deere & Company Crop state map generation and control system
US11650587B2 (en) 2020-10-09 2023-05-16 Deere & Company Predictive power map generation and control system
US11864483B2 (en) 2020-10-09 2024-01-09 Deere & Company Predictive map generation and control system
US11871697B2 (en) 2020-10-09 2024-01-16 Deere & Company Crop moisture map generation and control system
US11874669B2 (en) 2020-10-09 2024-01-16 Deere & Company Map generation and control system
US11983009B2 (en) 2020-10-09 2024-05-14 Deere & Company Map generation and control system
US11889788B2 (en) 2020-10-09 2024-02-06 Deere & Company Predictive biomass map generation and control
US11889787B2 (en) 2020-10-09 2024-02-06 Deere & Company Predictive speed map generation and control system
US11895948B2 (en) 2020-10-09 2024-02-13 Deere & Company Predictive map generation and control based on soil properties
US11927459B2 (en) 2020-10-09 2024-03-12 Deere & Company Machine control using a predictive map
US11946747B2 (en) 2020-10-09 2024-04-02 Deere & Company Crop constituent map generation and control system
US20220110251A1 (en) 2020-10-09 2022-04-14 Deere & Company Crop moisture map generation and control system
WO2022130038A1 (en) * 2020-12-18 2022-06-23 Agco International Gmbh System and method of assisted or automated grain unload synchronization

Also Published As

Publication number Publication date
CN110582198B (en) 2022-08-23
KR20200014735A (en) 2020-02-11
CN110582198A (en) 2019-12-17
KR102589076B1 (en) 2023-10-16

Similar Documents

Publication Publication Date Title
WO2018235486A1 (en) Harvester
JP6887323B2 (en) Combine and field farming map generation method
JP6854713B2 (en) combine
US11483972B2 (en) System for controlling an operative parameter of a harvesting header
US9532504B2 (en) Control arrangement and method for controlling a position of a transfer device of a harvesting machine
JP2020018255A (en) Harvesting work system
JP7068781B2 (en) Harvester
US20220230444A1 (en) Obstacle Detection System, Agricultural Work Vehicle, Obstacle Detection Program, Recording Medium on Which Obstacle Detection Program is Recorded, and Obstacle Detection Method
JP7381402B2 (en) automatic driving system
JP2019083703A (en) Harvesting machine
CN113766824A (en) Harvester, obstacle determination program, recording medium having obstacle determination program recorded thereon, obstacle determination method, agricultural machine, control program, recording medium having control program recorded thereon, and control method
JP7246641B2 (en) agricultural machine
US20220212602A1 (en) Harvester, System, Program, Recording Medium, and Method
WO2020261824A1 (en) Working vehicle, obstacle detection method, and obstacle detection program
JP2021007385A (en) Farm implement
JP2020178619A (en) Agricultural work machine
WO2020218528A1 (en) Agricultural machine such as harvester
WO2020262416A1 (en) Automatic traveling system, agricultural work machine, program, recording medium with program recorded thereon, and method
JP2022092391A (en) Mobile vehicle
JP6765349B2 (en) Harvester
US20240016087A1 (en) Combine and travel control method
JP7130602B2 (en) harvester
JP7174487B2 (en) agricultural machine
JP2022096321A (en) Arrival information setting method and combine
KR20230074717A (en) harvest

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 18820413; Country of ref document: EP; Kind code of ref document: A1
ENP Entry into the national phase
    Ref document number: 20197031226; Country of ref document: KR; Kind code of ref document: A
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: pct application non-entry in european phase
    Ref document number: 18820413; Country of ref document: EP; Kind code of ref document: A1