WO2018235942A1 - Combine, field farming map generation method, field farming map generation program, and recording medium on which the field farming map generation program is recorded - Google Patents
- Publication number
- WO2018235942A1 (PCT/JP2018/023791)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- map
- fallen
- grain
- crop
- combine
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/60—Editing figures and text; Combining figures or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/06—Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01D—HARVESTING; MOWING
- A01D41/00—Combines, i.e. harvesters or mowers combined with threshing devices
- A01D41/12—Details of combines
- A01D41/127—Control or measuring arrangements specially adapted for combines
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01G—HORTICULTURE; CULTIVATION OF VEGETABLES, FLOWERS, RICE, FRUIT, VINES, HOPS OR SEAWEED; FORESTRY; WATERING
- A01G7/00—Botany in general
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/02—Agriculture; Fishing; Forestry; Mining
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01B—SOIL WORKING IN AGRICULTURE OR FORESTRY; PARTS, DETAILS, OR ACCESSORIES OF AGRICULTURAL MACHINES OR IMPLEMENTS, IN GENERAL
- A01B79/00—Methods for working soil
- A01B79/005—Precision agriculture
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Y—INDEXING SCHEME RELATING TO ASPECTS CROSS-CUTTING VEHICLE TECHNOLOGY
- B60Y2200/00—Type of vehicle
- B60Y2200/20—Off-Road Vehicles
- B60Y2200/22—Agricultural vehicles
- B60Y2200/222—Harvesters
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30181—Earth observation
- G06T2207/30188—Vegetation; Agriculture
Definitions
- The present invention relates to a combine that can harvest agricultural products while traveling in a field and can support farming based on photographed images acquired by a photographing unit, and to a field farming management method using information acquired by such a combine.
- In one prior-art combine, a television camera and an image processing apparatus that photograph the grain culms in front of the reaper are provided.
- The image processing apparatus detects the standing state of the grain culms by comparing the image from the television camera with pre-stored images representing various standing states of grain culms. If part of the grain culms in front of the reaper is detected to be lodged, the take-in reel is tilted with the lodged side facing down, which is intended to improve the harvesting performance for lodged culms.
- In another prior-art combine, the degree of lodging of the standing crop is evaluated before cutting from the power spectrum distribution obtained from images captured by an electronic camera during the cutting operation.
- Based on this evaluation, the threshing load is adjusted and a smooth threshing operation is realized.
- A feature of the present invention is a combine that harvests crops while traveling in a field, comprising: a machine position calculation unit that calculates the machine position, i.e. the map coordinates of the machine body, based on positioning data from a satellite positioning module;
- a photographing unit that photographs the field during harvesting work; an image recognition module that receives image data of the photographed images sequentially acquired by the photographing unit, estimates the area in the photographed image in which lodged grain culms appear, and outputs recognition output data indicating the lodged-culm area;
- an evaluation module that outputs a crop evaluation value per unit travel obtained by evaluating the crops harvested successively; a lodged-culm position information generating unit that generates, from the machine position at the time the photographed image was acquired and the recognition output data, lodged-culm position information indicating the position of the lodged-culm area on the map;
- and a harvest information generating unit that generates harvest information from the machine position at the time the crop was harvested and the crop evaluation value.
- According to this configuration, the image recognition module estimates the lodged-culm area from the image data of the photographed image. Since the machine position in map coordinates at the time the photographed image was acquired is calculated by the machine position calculation unit, lodged-culm position information indicating the position of the lodged-culm area on the map is obtained from that machine position and the recognition output data. At the same time, a crop evaluation value per unit travel is determined for the crops being harvested, and harvest information is generated from the machine position at the time of harvest and that crop evaluation value.
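As a minimal sketch of how lodged-culm position information might be derived from the machine position and the recognition output data, assuming the detected corner points have already been projected onto the ground plane in machine-relative coordinates (all function and variable names here are illustrative, not from the patent):

```python
import math

def to_map_coords(machine_pos, heading_rad, ground_points):
    """Transform detection corner points, given in machine-relative ground
    coordinates (forward, left) in meters, into field (map) coordinates.

    The patent only states that the machine position and the recognition
    output data are combined; the projection step is assumed done upstream.
    """
    mx, my = machine_pos
    cos_h, sin_h = math.cos(heading_rad), math.sin(heading_rad)
    result = []
    for fwd, left in ground_points:
        # rotate the machine-relative offset by the heading, then translate
        result.append((mx + fwd * cos_h - left * sin_h,
                       my + fwd * sin_h + left * cos_h))
    return result

# a lodged-culm rectangle 5-7 m ahead of a machine at (100, 200), heading east (0 rad)
poly = to_map_coords((100.0, 200.0), 0.0, [(5, -2), (5, 2), (7, 2), (7, -2)])
```

Each such polygon, stamped with the acquisition-time machine position, is one entry of the lodged-culm position information.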
- As a result, the distribution of lodged-culm areas on the map can be confirmed from the lodged-culm position information, and the distribution of crop evaluation values on the map can be confirmed from the harvest information.
- By superimposing the distribution of lodged-culm areas in the field on the distribution of crop evaluation values, it becomes possible, for example, to reduce the amount of fertilizer applied to lodged areas or to adjust the planting density in the next crop cultivation.
- With the combine according to the present invention, it thus becomes possible both to support control of the harvesting operation in consideration of the position of the lodged culms on the map (the distance between the lodged culms and the combine) and to obtain information that supports the next farming plan.
- For crops such as wheat and rice, the yield and the taste of the harvest are important evaluation quantities.
- In a combine, the yield of grain fed into the grain tank per traveling distance (or per unit time) and the moisture and protein content of the grain harvested per traveling distance (or per unit time) can be measured sequentially. Therefore, in one preferred embodiment of the present invention, the crop evaluation value includes yield and/or taste. This makes it possible to grasp yield and taste as a function of position in the field, including the positions of the lodged-culm areas.
- In one preferred embodiment, a field farming map generating unit that aligns the lodged-culm position information and the harvest information on a common map basis is built into the control system inside the machine, into a cloud computer system, into a server provided at a remote location, or the like.
- Such a field farming map can be generated by combining, so that their map coordinates or field coordinates match, a lodged-culm map showing the distribution of lodged areas in the field and a harvest map showing the distribution of yield and taste in the field.
- If the field farming map is generated by the combine or by a communication terminal attached to the combine (such as a liquid crystal monitor, tablet computer, or smartphone), uploading it to the cloud computer system makes it usable anywhere, at any time.
- With such a field farming map, crop evaluation values in lodged-culm areas can be analyzed plot by plot. For example, if the crop evaluation value is yield, the difference in yield between lodged and non-lodged areas can be grasped clearly and easily from the map, and can be referred to in farming planning such as the fertilization plan.
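The plot-by-plot comparison described above can be illustrated with a small sketch; the data layout (a list of unit sections flagged as lodged or not, each with a yield) is an assumption for illustration:

```python
def compare_yield(plots):
    """Average yield in lodged vs. non-lodged unit sections of the field.

    plots: list of dicts with 'lodged' (bool) and 'yield_kg' (per section);
    this layout is illustrative, not specified by the patent.
    """
    lodged = [p["yield_kg"] for p in plots if p["lodged"]]
    upright = [p["yield_kg"] for p in plots if not p["lodged"]]
    avg = lambda xs: sum(xs) / len(xs) if xs else 0.0
    return avg(lodged), avg(upright)

sections = [{"lodged": True, "yield_kg": 4.0},
            {"lodged": False, "yield_kg": 6.0},
            {"lodged": False, "yield_kg": 8.0}]
lodged_avg, upright_avg = compare_yield(sections)
```

The yield gap between the two averages is the kind of quantity a fertilization plan could be based on.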
- Here, the term cloud computer system is used generically for systems that provide various information services to various users, collectively or in a distributed manner, over a computer network; conventional server/client systems and personal data exchange systems are also included.
- The present invention is also directed to a method for generating a field farming map as described above.
- The field farming map generation method comprises the steps of: outputting recognition output data indicating a lodged-culm area estimated from a photographed image acquired by a photographing unit provided on the combine; generating, from the machine position at the time the photographed image was acquired and the recognition output data, lodged-culm position information indicating the position of the lodged-culm area on the map; outputting a crop evaluation value per unit travel obtained by evaluating the successively harvested crops; generating harvest information from the machine position at the time of harvest and the crop evaluation value; and generating the field farming map by aligning the lodged-culm position information and the harvest information on a map basis.
- In this method, the lodged-culm position information and the harvest information are generated, and the field farming map is generated simply by aligning the two on a map.
- Since the lodged-culm position information and the harvest information are generated on common base map data, no separate registration step is needed to bring them into alignment.
- Similarly, the field farming map generation program according to the present invention causes a computer to realize: a function of outputting recognition output data indicating a lodged-culm area estimated from a photographed image acquired by the photographing unit provided on the combine; a function of generating lodged-culm position information on the map from the machine position at the time the photographed image was acquired and the recognition output data; a function of outputting a crop evaluation value per unit travel; a function of generating harvest information from the machine position at the time of harvest and the crop evaluation value; and a function of generating the field farming map by aligning the lodged-culm position information and the harvest information on a map basis.
- Likewise, the recording medium according to the present invention records a field farming map generation program that causes a computer to realize: a function of outputting recognition output data indicating a lodged-culm area estimated from a photographed image acquired by the photographing unit provided on the combine; a function of generating lodged-culm position information indicating the position of the lodged-culm area on the map from the machine position at the time the photographed image was acquired and the recognition output data; a function of outputting a crop evaluation value per unit travel obtained by evaluating the crops successively harvested as the combine travels through the field; a function of generating harvest information from the machine position at the time of harvest and the crop evaluation value; and a function of generating the field farming map by aligning the lodged-culm position information and the harvest information on a map basis.
- The reaper 2 is connected to the front of the machine body 1, which includes a pair of left and right crawler traveling devices 10, so as to be movable up and down about a horizontal axis X.
- A threshing device 11 and a grain tank 12 for storing grain are provided at the rear of the machine body 1, aligned in the body width direction.
- A cabin 14 that houses the boarding driver is provided at the front right of the machine body 1, and a driving engine 15 is provided below the cabin 14.
- The threshing device 11 receives the harvested grain culms that are cut by the reaper 2 and conveyed rearward; while the stock ends of the culms are clamped and conveyed by the threshing feed chain 111 and the pinching rail 112, the ear-tip side is threshed by the threshing cylinder 113.
- A grain sorting process is performed on the threshed material in a sorting unit provided below the threshing cylinder 113.
- The grain sorted by the sorting unit is conveyed to the grain tank 12 and stored there.
- A grain discharging device 13 for discharging the grain stored in the grain tank 12 to the outside is provided.
- The reaper 2 is provided with a plurality of raising devices 21, a clipper-type cutting device 22, a grain culm conveying device 23, and the like.
- The raising devices 21 raise lodged grain culms.
- The cutting device 22 cuts the stock ends of the raised grain culms.
- The grain culm conveying device 23 conveys the harvested grain culms toward the starting end of the threshing feed chain 111 of the threshing device 11 at the rear of the machine, while gradually changing them from the upright posture in which their stock ends were cut to a lying posture.
- The grain culm conveying device 23 includes a merging and conveying unit 231, a stock-end clamping and conveying device 232, an ear-tip engaging and conveying device 233, a supply and conveying device 234, and the like.
- The merging and conveying unit 231 gathers the plural rows of harvested grain culms cut by the cutting device 22 toward the center of the cutting width while conveying them.
- The stock-end clamping and conveying device 232 clamps the stock ends of the gathered harvested grain culms and conveys them rearward.
- The ear-tip engaging and conveying device 233 engages and conveys the ear-tip side of the harvested grain culms.
- The supply and conveying device 234 guides the stock ends of the harvested grain culms from the terminal end of the stock-end clamping and conveying device 232 to the threshing feed chain 111.
- A photographing unit 70 equipped with a color camera is provided.
- The front-to-rear extent of the imaging field of view of the photographing unit 70 reaches approximately from the front end region of the reaper 2 to the horizon.
- The lateral extent of the field of view reaches from about 10 m to several tens of meters.
- the photographed image acquired by the photographing unit 70 is converted into image data and sent to the control system of the combine.
- the photographing unit 70 photographs a field at the time of harvesting work.
- The control system of the combine has a function of recognizing lodged grain culms as the recognition object from the image data sent from the photographing unit 70.
- A normally standing group of grain culms is indicated by symbol Z0, and a lodged group of grain culms by symbol Z2.
- a satellite positioning module 80 is also provided on the ceiling of the cabin 14.
- the satellite positioning module 80 includes a satellite antenna for receiving global navigation satellite system (GNSS) signals (including GPS signals).
- An inertial navigation unit incorporating a gyro-acceleration sensor and a magnetic azimuth sensor is also built into the satellite positioning module 80.
- In FIG. 1, the satellite positioning module 80 is drawn at the rear of the ceiling of the cabin 14 for convenience of illustration.
- In practice, the satellite positioning module 80 is preferably disposed, for example, at the front end of the ceiling near the lateral center of the machine, so as to be as close as possible to a position directly above the left-right center of the cutting device 22.
- This combine has a function of calculating and outputting grain yield and taste as crop evaluation values per unit travel, obtained by evaluating the crops harvested successively. Specifically, as shown in FIG. 2, the amount of grain supplied from the threshing device 11 to the grain tank 12 over time (that is, the yield) and the taste (moisture, protein, etc.) of the grain are measured, and based on the measurement results the evaluation module 4A calculates and outputs the yield and the taste as crop evaluation values.
- A yield measuring unit 120 for measuring the yield and a taste measuring unit 125 for measuring the taste are provided in the terminal region of the grain supply pipe 130 that connects the threshing device 11 and the grain tank 12.
- A screw conveyor 131 that rotates about an axis PX is provided in the portion of the supply pipe 130 inside the grain tank. The terminal end of the housing 132 of the screw conveyor 131 serves as the housing of the yield measuring unit 120, and an opening functioning as the discharge port 122 of the yield measuring unit 120 is formed there.
- The yield measuring unit 120 includes a discharge rotor 121 that rotates about the axis PX to release the grain fed by the screw conveyor 131, and a load cell structure 123 that detects the load generated when the grain is released.
- The load that the grain discharged from the discharge port 122 by the discharge rotor 121 exerts on the load cell structure 123 correlates with the amount of grain released per rotation of the discharge rotor 121 (that is, with the yield).
- In the yield calculation process executed in the evaluation module 4A, the yield per unit time is calculated from the rotation-count signal of the discharge rotor 121 and the load cell signal of the load cell structure 123. A unit-travel yield is then calculated from the yield per unit time and the traveling speed, and output as a crop evaluation value.
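The yield calculation described above can be sketched as follows; `kg_per_rotation` stands in for the calibration that the load cell signal provides, which the text does not specify:

```python
def unit_travel_yield(rotations_per_s, kg_per_rotation, speed_m_per_s):
    """Yield per meter traveled, computed from the discharge-rotor rotation
    rate, the per-rotation discharge mass derived from the load cell signal
    (illustrative calibration, not given in the text), and vehicle speed."""
    kg_per_s = rotations_per_s * kg_per_rotation   # yield per unit time
    return kg_per_s / speed_m_per_s                # yield per unit travel

# 2 rotations/s discharging 0.5 kg each at 1 m/s travel speed
y = unit_travel_yield(2.0, 0.5, 1.0)
```

Dividing the time-based yield by the traveling speed converts it into the distance-based crop evaluation value that the harvest information records.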
- The taste measuring unit 125 irradiates the grain with a light beam and spectroscopically analyzes the returned light to obtain measurement values for moisture and protein content.
- The taste measuring unit 125 comprises a cylindrical container 129 having a receiving port 126 for receiving at least part of the grain released by the yield measuring unit 120 and a discharge port 127 for releasing the received grain.
- A shutter 128 is provided in the cylindrical container 129. By opening and closing, the shutter 128 either temporarily stores the grain received through the receiving port 126 or lets it flow on to the discharge port 127.
- In the taste calculation process executed in the evaluation module 4A, when the shutter 128 is switched to the storage (closed) posture and a predetermined amount of grain has accumulated in the cylindrical container 129, spectroscopic taste measurement is started, and a taste value is calculated from the measured values and output as a crop evaluation value. When the measurement is completed, the shutter 128 is switched to the release (open) posture and the stored grain is released. Immediately afterwards, the shutter 128 returns to the storage posture and the next taste calculation begins, so taste values are calculated and output one after another.
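The store-measure-release cycle of the shutter 128 can be sketched as a simple loop; batch sizes, the threshold, and the `measure` callback are illustrative assumptions:

```python
def taste_cycle(grain_batches, threshold, measure):
    """Simulate the shutter cycle: grain accumulates while the shutter is
    closed; once the stored amount reaches `threshold`, a taste value is
    obtained via `measure` and the container is emptied (shutter opens,
    then immediately closes for the next cycle)."""
    stored, taste_values = 0.0, []
    for amount in grain_batches:
        stored += amount
        if stored >= threshold:
            taste_values.append(measure(stored))
            stored = 0.0   # shutter opens, grain released, shutter re-closes
    return taste_values

# five 1-kg batches, measuring whenever 2 kg has accumulated
values = taste_cycle([1, 1, 1, 1, 1], 2, lambda stored: stored)
```

Each appended value corresponds to one completed measurement cycle, matching the "one after another" output described above.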
- In the harvest information generating unit 4B, the unit-travel yield calculated in the yield calculation process is associated with the travel locus of the machine body 1 obtained from the machine positions calculated by the machine position calculation unit 66. The yield is thereby recorded sequentially as the combine travels.
- Likewise, the taste value calculated in the taste calculation process is associated with the travel locus of the machine body 1 obtained from the machine positions calculated by the machine position calculation unit 66, so the taste value is also recorded sequentially as the combine travels.
- In the harvest information, the unit-travel yield and the unit-travel taste value are thus associated with unit travel distances in the field (indicated by P in FIG. 2). Since the harvest position is calculated from the machine position obtained from the positioning data of the satellite positioning module 80, positions on the map can be expressed as absolute positions in latitude and longitude or as coordinate positions in field coordinates. From this harvest information, a yield map and a taste map showing the distribution of yield and taste per unit travel distance (and consequently per small section of the field) can therefore be generated.
- Note that, to associate the measured yield and taste with the position where the grain was actually harvested, the distance between the antenna of the satellite positioning module 80 and the reaper 2, and the time the grain takes to travel from cutting to the yield measurement, are set in advance.
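One plausible way to apply such preset offsets, assuming a log of timestamped antenna positions (the nearest-sample scheme and all names here are assumptions, not from the patent):

```python
import math

def harvest_position(positions, t_measured, antenna_to_reaper_m, transport_delay_s):
    """Estimate where the grain measured at time t_measured was actually cut.

    positions: list of (t, x, y) antenna positions. The grain was cut
    `transport_delay_s` earlier, at a point `antenna_to_reaper_m` ahead of
    the antenna along the travel direction; both values are presets.
    """
    t_cut = t_measured - transport_delay_s
    # nearest recorded position at cutting time (linear search for clarity)
    t, x, y = min(positions, key=lambda p: abs(p[0] - t_cut))
    # shift forward along the travel direction, approximated from the next sample
    later = [p for p in positions if p[0] > t]
    if later:
        _, x2, y2 = min(later, key=lambda p: p[0])
        d = math.hypot(x2 - x, y2 - y) or 1.0
        x += (x2 - x) / d * antenna_to_reaper_m
        y += (y2 - y) / d * antenna_to_reaper_m
    return (x, y)

# machine moving east at 1 m/s; grain measured at t=2 was cut 1 s earlier,
# 0.5 m ahead of where the antenna was at that time
pos = harvest_position([(0, 0, 0), (1, 1, 0), (2, 2, 0)], 2, 0.5, 1)
```

The key point is that both corrections are fixed machine constants, so the mapping from measurement time to harvest position needs no extra sensing.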
- FIG. 3 shows a functional block diagram of the control system built into the machine body 1 of the combine.
- The control system of this embodiment is composed of a large number of electronic control units called ECUs, various operation devices, sensors, and switches, and a wiring network, such as an in-vehicle LAN, for data transmission between them.
- the notification device 91 is a device for notifying a driver or the like of a work traveling state and various warnings, and is a buzzer, a lamp, a speaker, a display, or the like.
- the communication unit 92 is used to exchange data with the cloud computer system 100 or the portable communication terminal 200 installed at a remote place.
- the mobile communication terminal 200 is a tablet computer operated by a supervisor (including a driver) at the work travel site.
- the control unit 6 is a core element of this control system, and is shown as a collection of a plurality of ECUs.
- The positioning data from the satellite positioning module 80 and the image data from the photographing unit 70 are input to the control unit 6 through the wiring network.
- the control unit 6 includes an output processing unit 6B and an input processing unit 6A as an input / output interface.
- The output processing unit 6B is connected to the vehicle travel device group 7A and the working device group 7B.
- the vehicle travel device group 7A includes control devices related to vehicle travel (for example, an engine control device, a shift control device, a braking control device, a steering control device, etc.).
- The working device group 7B includes power control devices and the like for the reaper 2, the threshing device 11, the grain discharging device 13, and the grain culm conveying device 23.
- a traveling system detection sensor group 8A, a working system detection sensor group 8B, and the like are connected to the input processing unit 6A.
- the traveling system detection sensor group 8A includes a sensor that detects the state of an engine speed adjuster, an accelerator pedal, a brake pedal, a gearshift operator, and the like.
- The working system detection sensor group 8B includes sensors that detect the device states of the reaper 2, the threshing device 11, the grain discharging device 13, and the grain culm conveying device 23, as well as the state of the grain culms and grain.
- The control unit 6 includes a work travel control module 60, the image recognition module 5, a data processing module 50, the machine position calculation unit 66, a notification unit 67, and the evaluation module 4A and harvest information generating unit 4B described with reference to FIG. 2.
- the notification unit 67 generates notification data based on an instruction or the like from each functional unit of the control unit 6, and gives the notification data to the notification device 91.
- The machine position calculation unit 66 calculates the machine position, that is, the map coordinates (or field coordinates) of the machine body 1, based on the positioning data sequentially sent from the satellite positioning module 80.
- the combine according to this embodiment can travel in both automatic travel (automatic steering) and manual travel (manual steering).
- the work travel control module 60 includes an automatic work travel instruction unit 63 and a travel route setting unit 64.
- a traveling mode switch (not shown) is provided in the cabin 14 to select one of an automatic traveling mode in which the vehicle travels by automatic steering and a manual steering mode in which the vehicle travels by manual steering. By operating the travel mode switch, it is possible to shift from manual steering travel to automatic steering travel, or shift from automatic steering travel to manual steering travel.
- the traveling control unit 61 has an engine control function, a steering control function, a vehicle speed control function, and the like, and provides a traveling control signal to the vehicle traveling device group 7A.
- the work control unit 62 provides a work control device group 7B with a work control signal in order to control the movement of the reaper 2, the threshing device 11, the grain discharging device 13, the grain feeding device 23, and the like.
- When the manual steering mode is selected, the traveling control unit 61 generates control signals based on the driver's operations and controls the vehicle travel device group 7A.
- When the automatic steering mode is selected, the traveling control unit 61 controls the devices of the vehicle travel device group 7A related to steering and to vehicle speed based on the automatic travel instructions given by the automatic work travel instruction unit 63.
- The travel route setting unit 64 loads into memory a travel route for automatic travel created by the control unit 6, the mobile communication terminal 200, the cloud computer system 100, or the like.
- The travel route loaded in memory is used sequentially as the target travel route in automatic travel. Even in manual travel, it can be used for guidance so that the combine travels along the route.
- the automatic work travel instruction unit 63 generates an automatic steering instruction and a vehicle speed instruction, and gives the travel control unit 61.
- The automatic steering instruction is generated so as to eliminate the azimuth deviation and the positional deviation between the travel route set by the travel route setting unit 64 and the machine position calculated by the machine position calculation unit 66.
- the vehicle speed command is generated based on a previously set vehicle speed value.
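A minimal sketch of a steering instruction that drives both deviations toward zero, assuming a simple proportional law (the patent does not specify a control law; the gains are illustrative):

```python
def steering_command(lateral_dev_m, heading_dev_rad, k_lat=0.5, k_head=1.0):
    """Proportional steering correction opposing the positional (lateral)
    and azimuth (heading) deviations between the target travel route and
    the calculated machine position. Gains k_lat and k_head are assumed
    values for illustration only."""
    return -(k_lat * lateral_dev_m + k_head * heading_dev_rad)

# on the route with correct heading: no correction needed
on_route = steering_command(0.0, 0.0)
# 2 m right of the route: steer back toward it
off_route = steering_command(2.0, 0.0)
```

Feeding such a correction to the traveling control unit each cycle makes the deviations shrink as the machine converges onto the route.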
- the automatic work travel instruction unit 63 gives the work control unit 62 an operation device operation instruction according to the vehicle position and the traveling state of the vehicle.
- The image recognition module 5 receives image data of the photographed images sequentially acquired by the imaging unit 70.
- The image recognition module 5 estimates areas in the photographed image in which a recognition target exists, and outputs, as the recognition result, recognition output data that includes each estimated existence area and its estimation probability.
- the image recognition module 5 is constructed using neural network technology adopting deep learning.
- the flow of generation of recognition output data by the image recognition module 5 is shown in FIG. 4 and FIG.
- the image recognition module 5 receives pixel values of RGB image data as input values.
- The recognition target to be estimated is an area where fallen grain culms are present (hereinafter referred to as a fallen grain culm area). The recognition output data serving as the recognition result therefore includes the fallen grain culm area, indicated by a rectangle, and the estimation probability with which that area was estimated.
- The estimation result is shown schematically, with the fallen grain culm area indicated by a rectangular frame labeled F2.
- Each fallen grain culm area is defined by four corner points, and the coordinate positions of those four corner points on the photographed image are also included in the estimation result.
- If no fallen grain culms are recognized, no fallen grain culm area is output and its estimation probability is zero.
- The image recognition module 5 is configured so that the estimation probability of a recognition target (fallen grain culms) decreases the farther the target is located from the imaging unit 70 in the photographed image. As a result, recognition of targets in imaging regions whose resolution is low because they are far from the imaging unit 70 is made stricter, and erroneous recognition is reduced.
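The recognition output (existence area plus estimation probability) and the distance-dependent reduction of the probability can be illustrated as follows. The data layout and the linear falloff model are assumptions made for this sketch only; the disclosure does not specify the penalty function:

```python
from dataclasses import dataclass

@dataclass
class RecognitionOutput:
    """One estimated existence area (rectangle) for fallen grain culms.

    corners: four (x, y) pixel coordinates on the photographed image.
    probability: estimation probability after the distance penalty.
    """
    corners: list
    probability: float

def penalized_probability(raw_probability, distance_m, falloff_m=20.0):
    """Reduce the estimation probability for targets far from the imaging
    unit, so that low-resolution distant regions are judged more strictly."""
    scale = max(0.0, 1.0 - distance_m / falloff_m)
    return raw_probability * scale
```

With this falloff, a nearby detection keeps its raw probability, while a detection beyond the falloff distance is suppressed entirely.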
- The data processing module 50 processes the recognition output data output from the image recognition module 5. As shown in FIG. 3 and FIG. 5, the data processing module 50 of this embodiment includes a fallen grain culm position information generation unit 51 and a statistical processing unit 52.
- The fallen grain culm position information generation unit 51 generates fallen grain culm position information, which indicates the position of the recognition target on the map, from the machine position at the time the photographed image was acquired and the recognition output data.
- The position on the map where the fallen grain culms are present is obtained by converting the coordinate positions (camera coordinate positions) on the photographed image of the four corner points of the rectangle indicating the fallen grain culm area, which are included in the recognition output data, into coordinates on the map.
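The conversion of the rectangle's corner points into map coordinates can be sketched as a planar rigid transform. The camera calibration that projects pixels onto the ground plane is not specified in the disclosure, so this sketch assumes the corner points have already been projected into machine-frame ground offsets (forward, left):

```python
import math

def camera_to_map(ground_offset, machine_pos, machine_heading):
    """Convert a ground-plane point expressed in the machine frame
    (forward, left) into map coordinates, using the machine position
    recorded when the photographed image was acquired."""
    fwd, left = ground_offset
    x = machine_pos[0] + fwd * math.cos(machine_heading) - left * math.sin(machine_heading)
    y = machine_pos[1] + fwd * math.sin(machine_heading) + left * math.cos(machine_heading)
    return (x, y)

def rectangle_to_map(corner_offsets, machine_pos, machine_heading):
    # Convert all four corner points of an existence-area rectangle
    return [camera_to_map(c, machine_pos, machine_heading) for c in corner_offsets]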
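The conversion of the rectangle's corner points into map coordinates can be sketched as a planar rigid transform. The camera calibration that projects pixels onto the ground plane is not specified in the disclosure, so this sketch assumes the corner points have already been projected into machine-frame ground offsets (forward, left):

```python
import math

def camera_to_map(ground_offset, machine_pos, machine_heading):
    """Convert a ground-plane point expressed in the machine frame
    (forward, left) into map coordinates, using the machine position
    recorded when the photographed image was acquired."""
    fwd, left = ground_offset
    x = machine_pos[0] + fwd * math.cos(machine_heading) - left * math.sin(machine_heading)
    y = machine_pos[1] + fwd * math.sin(machine_heading) + left * math.cos(machine_heading)
    return (x, y)

def rectangle_to_map(corner_offsets, machine_pos, machine_heading):
    # Convert all four corner points of an existence-area rectangle
    return [camera_to_map(c, machine_pos, machine_heading) for c in corner_offsets]
```

At heading zero, a point five meters ahead of the machine maps to a point five meters along the map x-axis from the machine position.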
- Since the imaging unit 70 acquires photographed images at predetermined time intervals (for example, 0.5 seconds) and inputs the image data to the image recognition module 5, the image recognition module 5 also outputs recognition output data at the same time intervals. Therefore, when fallen grain culms enter the field of view of the imaging unit 70, a plurality of recognition output data records usually include an existence area for the same fallen grain culms. As a result, multiple pieces of fallen grain culm position information are obtained for the same fallen grain culms.
- The estimation probability included in each original recognition output data record (that is, the estimation probability of the fallen grain culm existence area included in the fallen grain culm position information) often takes different values, because the positional relationship between the imaging unit 70 and the fallen grain culms differs each time.
- Such multiple pieces of fallen grain culm position information are stored, and the estimation probabilities included in each of the stored pieces are processed statistically.
- A representative value of the group of estimation probabilities is determined by a statistical operation on the estimation probabilities of the multiple pieces of recognition target position information.
- Using this representative value, the multiple pieces of recognition target position information can be corrected into a single, optimal piece of recognition target position information.
- One example of such a correction is to take the arithmetic mean, the weighted mean, or the median of the estimation probabilities as a reference value (representative value), and to generate corrected fallen grain culm position information whose optimal existence area is the logical sum of the existence areas having an estimation probability equal to or higher than that reference value.
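The statistical correction described here can be illustrated as follows, with existence areas represented as sets of minute-section ids (an assumed representation; the disclosure does not prescribe a data structure):

```python
import statistics

def correct_positions(observations, use="median"):
    """Fuse multiple position-information records for the same fallen grain
    culm area into one corrected record.

    observations: list of (existence_area, probability), where existence_area
    is a set of minute-section ids covered by the rectangle.
    """
    probs = [p for _, p in observations]
    if use == "mean":
        reference = statistics.fmean(probs)
    else:
        reference = statistics.median(probs)
    # Logical sum (union) of the existence areas whose estimation
    # probability is at or above the representative value
    merged = set()
    for area, p in observations:
        if p >= reference:
            merged |= area
    return merged, reference
```

Low-probability outlier observations fall below the representative value and are excluded from the merged existence area.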
- When a fallen grain culm area is recognized, preset traveling operation control and warning notification are performed.
- The evaluation module 4A calculates the taste value (crop evaluation value) of the grain through a taste calculation process, and calculates the grain yield (crop evaluation value) through a yield calculation process.
- The taste values and yields sequentially output from the evaluation module 4A as the work travel progresses are given to the harvest information generation unit 4B.
- The harvest information generation unit 4B generates harvest information by associating the sequentially given taste values and yields with the travel track of the machine body 1 and recording the result.
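Associating the sequentially output evaluation values with the travel track might look like the following sketch; the record field names are illustrative, not taken from the disclosure:

```python
def build_harvest_records(track, evaluations):
    """Pair each per-unit-travel crop evaluation (taste value, yield) with
    the machine position on the travel track at which it was produced."""
    records = []
    for pos, (taste, crop_yield) in zip(track, evaluations):
        records.append({"position": pos, "taste": taste, "yield": crop_yield})
    return records
```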
- The harvest information generated by the harvest information generation unit 4B and the fallen grain culm position information generated by the fallen grain culm position information generation unit 51 are uploaded to the cloud computer system 100 through the communication unit 92.
- In the cloud computer system 100, a field farming map generation unit 101 is constructed, which generates a field farming map by geographically aligning the fallen grain culm position information and the harvest information.
- FIG. 6 schematically shows an example of the field farming map.
- The field farming map includes a fallen grain culm map, in which the existence areas of fallen grain culms (indicated by hatching) are assigned to minute sections set in the field, and a yield map, in which yields (denoted q11, ...) are assigned to the same minute sections.
- It also includes a taste map in which taste values (denoted w11, ...) are assigned to the same minute sections.
- This field farming map further includes a fertilization plan map in which the next fertilizer application amount (denoted f11, ...) is recorded for the same minute sections.
- Although minute sections of the same size are used in each map here, sections of different sizes may be used in each map.
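Assigning geo-referenced values such as yield to the minute sections of such a map amounts to grid binning; a minimal sketch, averaging all samples per cell with an assumed cell size, is:

```python
def grid_map(samples, cell_size):
    """Aggregate geo-referenced values (e.g. yield per unit travel) into the
    minute sections of a field map by averaging all samples in each cell."""
    sums, counts = {}, {}
    for (x, y), value in samples:
        cell = (int(x // cell_size), int(y // cell_size))
        sums[cell] = sums.get(cell, 0.0) + value
        counts[cell] = counts.get(cell, 0) + 1
    # Per-cell average forms one layer (yield map, taste map, ...)
    return {cell: sums[cell] / counts[cell] for cell in sums}
```

Running the same binning with a different `cell_size` produces the differently sized minute sections mentioned above.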
- Calculation of the type of fertilizer and the amount of fertilizer application may be performed automatically by computer software, or the farmer may do it himself while viewing the farming map. Also, a semi-automatic method may be adopted in which the farmer corrects the fertilization amount calculated by the computer software.
- In the embodiment described above, fallen grain culms were set as the recognition target of the image recognition module 5, but other recognition targets (for example, groups of weeds grown taller than the planted grain culms, or obstacles such as people) may additionally be set.
- In that case, the work travel control module 60 is configured to perform the necessary control in response to the recognition of weeds or obstacles.
- In the embodiment described above, the image recognition module 5 is constructed using deep-learning neural network technology. Alternatively, an image recognition module 5 constructed using other machine learning techniques may be employed.
- Each functional unit shown in FIG. 3 is divided mainly for explanatory purposes. In practice, each functional unit may be integrated with other functional units or further divided into multiple functional units.
- The present invention is applicable not only to combines that harvest rice, wheat, and the like, but also to harvesters that harvest other crops, such as corn or carrots, as long as the harvester has a function of photographing the field and a function of calculating the machine position.
- 2: reaper; 4A: evaluation module; 4B: harvest information generation unit; 5: image recognition module; 50: data processing module; 51: fallen grain culm position information generation unit; 52: statistical processing unit; 57: machine position calculation unit; 6: control unit; 6A: input processing unit; 6B: output processing unit; 60: work travel control module; 61: traveling control unit; 62: work control unit; 63: automatic work travel instruction unit; 64: travel route setting unit; 66: machine position calculation unit; 70: imaging unit; 80: satellite positioning module; 91: notification device; 100: cloud computer system; 101: field farming map generation unit; 120: yield measurement container; 125: taste measurement container; 200: mobile communication terminal
Description
- In the embodiment described above, the image recognition module 5, the data processing module 50, the evaluation module 4A, and the harvest information generation unit 4B were incorporated in the control unit 6 of the combine, but some or all of them may instead be built in a control unit independent of the combine (for example, the mobile communication terminal 200).
Claims (6)
- A combine that harvests crops while traveling in a field, the combine comprising:
a machine position calculation unit that calculates a machine position, which is the map coordinates of the machine body, based on positioning data from a satellite positioning module;
an imaging unit that is provided on the machine body and photographs the field during harvesting work;
an image recognition module that receives image data of photographed images sequentially acquired over time by the imaging unit, estimates a fallen grain culm area in the photographed image, and outputs recognition output data indicating the estimated fallen grain culm area;
an evaluation module that outputs a crop evaluation value per unit travel obtained by evaluating the crops that are sequentially harvested;
a fallen grain culm position information generation unit that generates fallen grain culm position information indicating the position of the fallen grain culm area on a map from the machine position at the time the photographed image was acquired and the recognition output data; and
a harvest information generation unit that generates harvest information from the machine position at the time the crops were harvested and the crop evaluation value.
- The combine according to claim 1, wherein the crop evaluation value includes yield, taste, or both.
- The combine according to claim 1 or 2, wherein a field farming map generation unit that generates a field farming map by geographically aligning the fallen grain culm position information with the harvest information is constructed in a control system inside the machine body or in a cloud computer system.
- A field farming map generation method comprising the steps of:
outputting recognition output data indicating a fallen grain culm area estimated from a photographed image acquired by an imaging unit provided on a combine;
generating fallen grain culm position information indicating the position of the fallen grain culm area on a map from the machine position at the time the photographed image was acquired and the recognition output data;
outputting a crop evaluation value per unit travel obtained by evaluating the crops sequentially harvested as the combine travels and works through the field;
generating harvest information from the machine position at the time the crops were harvested and the crop evaluation value; and
generating a field farming map by geographically aligning the fallen grain culm position information with the harvest information.
- A field farming map generation program that causes a computer to realize:
a function of outputting recognition output data indicating a fallen grain culm area estimated from a photographed image acquired by an imaging unit provided on a combine;
a function of generating fallen grain culm position information indicating the position of the fallen grain culm area on a map from the machine position at the time the photographed image was acquired and the recognition output data;
a function of outputting a crop evaluation value per unit travel obtained by evaluating the crops sequentially harvested as the combine travels and works through the field;
a function of generating harvest information from the machine position at the time the crops were harvested and the crop evaluation value; and
a function of generating a field farming map by geographically aligning the fallen grain culm position information with the harvest information.
- A recording medium on which is recorded a field farming map generation program that causes a computer to realize:
a function of outputting recognition output data indicating a fallen grain culm area estimated from a photographed image acquired by an imaging unit provided on a combine;
a function of generating fallen grain culm position information indicating the position of the fallen grain culm area on a map from the machine position at the time the photographed image was acquired and the recognition output data;
a function of outputting a crop evaluation value per unit travel obtained by evaluating the crops sequentially harvested as the combine travels and works through the field;
a function of generating harvest information from the machine position at the time the crops were harvested and the crop evaluation value; and
a function of generating a field farming map by geographically aligning the fallen grain culm position information with the harvest information.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/614,960 US11170547B2 (en) | 2017-06-23 | 2018-06-22 | Combine, method of generating field farming map, program for generating the field farming map and storage medium recording the field farming map generating program |
CN201880029855.3A CN110582794A (zh) | 2017-06-23 | 2018-06-22 | 联合收割机、田地农业经营地图生成方法、田地农业经营地图生成程序及记录有田地农业经营地图生成程序的记录介质 |
EP18820302.0A EP3644266A4 (en) | 2017-06-23 | 2018-06-22 | COMBINE, PROCESS FOR CREATING AN AGRICULTURAL CARD OF AGRICULTURAL FIELD, PROGRAM FOR CREATING AN AGRICULTURAL CARD OF AGRICULTURAL FIELD, AND REGISTRATION MEDIA ON WHICH A PROGRAM IS REGISTERED FOR CREATING AN AGRICULTURAL CARD OF AGRICULTURAL FIELD |
KR1020197031592A KR102618797B1 (ko) | 2017-06-23 | 2018-06-22 | 콤바인, 포장 영농 맵 생성 방법, 포장 영농 맵 생성 프로그램 및 포장 영농 맵 생성 프로그램이 기록된 기록 매체 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2017-123438 | 2017-06-23 | ||
JP2017123438A JP6887323B2 (ja) | 2017-06-23 | 2017-06-23 | コンバイン及び圃場営農マップ生成方法 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018235942A1 true WO2018235942A1 (ja) | 2018-12-27 |
Family
ID=64735974
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2018/023791 WO2018235942A1 (ja) | 2017-06-23 | 2018-06-22 | コンバイン、圃場営農マップ生成方法、圃場営農マップ生成プログラム及び圃場営農マップ生成プログラムが記録された記録媒体 |
Country Status (6)
Country | Link |
---|---|
US (1) | US11170547B2 (ja) |
EP (1) | EP3644266A4 (ja) |
JP (1) | JP6887323B2 (ja) |
KR (1) | KR102618797B1 (ja) |
CN (1) | CN110582794A (ja) |
WO (1) | WO2018235942A1 (ja) |
Cited By (41)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11079725B2 (en) | 2019-04-10 | 2021-08-03 | Deere & Company | Machine control using real-time model |
US11178818B2 (en) | 2018-10-26 | 2021-11-23 | Deere & Company | Harvesting machine control system with fill level processing based on yield data |
US11234366B2 (en) | 2019-04-10 | 2022-02-01 | Deere & Company | Image selection for machine control |
US11240961B2 (en) | 2018-10-26 | 2022-02-08 | Deere & Company | Controlling a harvesting machine based on a geo-spatial representation indicating where the harvesting machine is likely to reach capacity |
US20220110251A1 (en) | 2020-10-09 | 2022-04-14 | Deere & Company | Crop moisture map generation and control system |
US11467605B2 (en) | 2019-04-10 | 2022-10-11 | Deere & Company | Zonal machine control |
US11474523B2 (en) | 2020-10-09 | 2022-10-18 | Deere & Company | Machine control using a predictive speed map |
US11477940B2 (en) | 2020-03-26 | 2022-10-25 | Deere & Company | Mobile work machine control based on zone parameter modification |
US11592822B2 (en) | 2020-10-09 | 2023-02-28 | Deere & Company | Machine control using a predictive map |
US11589509B2 (en) | 2018-10-26 | 2023-02-28 | Deere & Company | Predictive machine characteristic map generation and control system |
US11635765B2 (en) | 2020-10-09 | 2023-04-25 | Deere & Company | Crop state map generation and control system |
US11641800B2 (en) | 2020-02-06 | 2023-05-09 | Deere & Company | Agricultural harvesting machine with pre-emergence weed detection and mitigation system |
US11650587B2 (en) | 2020-10-09 | 2023-05-16 | Deere & Company | Predictive power map generation and control system |
US11653588B2 (en) | 2018-10-26 | 2023-05-23 | Deere & Company | Yield map generation and control system |
US11672203B2 (en) | 2018-10-26 | 2023-06-13 | Deere & Company | Predictive map generation and control |
US11675354B2 (en) | 2020-10-09 | 2023-06-13 | Deere & Company | Machine control using a predictive map |
US11711995B2 (en) | 2020-10-09 | 2023-08-01 | Deere & Company | Machine control using a predictive map |
US11727680B2 (en) | 2020-10-09 | 2023-08-15 | Deere & Company | Predictive map generation based on seeding characteristics and control |
US11778945B2 (en) | 2019-04-10 | 2023-10-10 | Deere & Company | Machine control using real-time model |
US11825768B2 (en) | 2020-10-09 | 2023-11-28 | Deere & Company | Machine control using a predictive map |
US11844311B2 (en) | 2020-10-09 | 2023-12-19 | Deere & Company | Machine control using a predictive map |
US11845449B2 (en) | 2020-10-09 | 2023-12-19 | Deere & Company | Map generation and control system |
US11849671B2 (en) | 2020-10-09 | 2023-12-26 | Deere & Company | Crop state map generation and control system |
US11849672B2 (en) | 2020-10-09 | 2023-12-26 | Deere & Company | Machine control using a predictive map |
US11864483B2 (en) | 2020-10-09 | 2024-01-09 | Deere & Company | Predictive map generation and control system |
US11874669B2 (en) | 2020-10-09 | 2024-01-16 | Deere & Company | Map generation and control system |
US11889787B2 (en) | 2020-10-09 | 2024-02-06 | Deere & Company | Predictive speed map generation and control system |
US11889788B2 (en) | 2020-10-09 | 2024-02-06 | Deere & Company | Predictive biomass map generation and control |
US11895948B2 (en) | 2020-10-09 | 2024-02-13 | Deere & Company | Predictive map generation and control based on soil properties |
US11927459B2 (en) | 2020-10-09 | 2024-03-12 | Deere & Company | Machine control using a predictive map |
US11946747B2 (en) | 2020-10-09 | 2024-04-02 | Deere & Company | Crop constituent map generation and control system |
US11957072B2 (en) | 2020-02-06 | 2024-04-16 | Deere & Company | Pre-emergence weed detection and mitigation system |
US11983009B2 (en) | 2020-10-09 | 2024-05-14 | Deere & Company | Map generation and control system |
US12013245B2 (en) | 2020-10-09 | 2024-06-18 | Deere & Company | Predictive map generation and control system |
US12035648B2 (en) | 2020-02-06 | 2024-07-16 | Deere & Company | Predictive weed map generation and control system |
JP7525381B2 (ja) | 2020-11-30 | 2024-07-30 | 三菱マヒンドラ農機株式会社 | コンバイン |
US12058951B2 (en) | 2022-04-08 | 2024-08-13 | Deere & Company | Predictive nutrient map and control |
US12069986B2 (en) | 2020-10-09 | 2024-08-27 | Deere & Company | Map generation and control system |
US12069978B2 (en) | 2018-10-26 | 2024-08-27 | Deere & Company | Predictive environmental characteristic map generation and control system |
US12082531B2 (en) | 2022-01-26 | 2024-09-10 | Deere & Company | Systems and methods for predicting material dynamics |
US12127500B2 (en) | 2021-01-27 | 2024-10-29 | Deere & Company | Machine control using a map with regime zones |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7225002B2 (ja) * | 2019-03-29 | 2023-02-20 | 株式会社クボタ | 散布支援装置及び散布支援システム |
CN113923977B (zh) | 2019-06-28 | 2023-10-24 | 株式会社久保田 | 自动行驶系统、农业机械、记录介质以及方法 |
JP7191785B2 (ja) * | 2019-06-28 | 2022-12-19 | 株式会社クボタ | 農業支援装置 |
JP6923632B2 (ja) * | 2019-12-27 | 2021-08-25 | 株式会社クボタ | 農作業請負システム及び農作業請負サーバ |
EP4099258A4 (en) * | 2020-01-30 | 2023-05-31 | Sagri Co., Ltd. | INFORMATION PROCESSING DEVICE |
US20220011119A1 (en) * | 2020-07-09 | 2022-01-13 | International Business Machines Corporation | Generating and improving upon agricultural maps |
JP7471211B2 (ja) * | 2020-12-10 | 2024-04-19 | 株式会社クボタ | 圃場マップ生成システム |
KR20220168875A (ko) | 2021-06-17 | 2022-12-26 | 대한민국(농촌진흥청장) | 인공지능을 이용하여 벼 도복 피해면적을 산정하는 장치 및 방법 |
US12067718B2 (en) * | 2021-12-27 | 2024-08-20 | Deere & Company | Crop yield component map |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH10304734A (ja) | 1997-05-08 | 1998-11-17 | Iseki & Co Ltd | コンバイン等の倒伏判定装置 |
JPH11155340A (ja) | 1997-11-25 | 1999-06-15 | Yanmar Agricult Equip Co Ltd | 汎用コンバイン |
WO2014050524A1 (ja) * | 2012-09-26 | 2014-04-03 | 株式会社クボタ | 農作管理システム及び農作物収穫機 |
JP2016010371A (ja) * | 2014-06-30 | 2016-01-21 | 井関農機株式会社 | コンバイン |
JP2016086668A (ja) * | 2014-10-30 | 2016-05-23 | 井関農機株式会社 | コンバイン |
WO2016147521A1 (ja) * | 2015-03-18 | 2016-09-22 | 株式会社クボタ | コンバイン及びコンバインのための穀粒評価制御装置 |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
FI942218A0 (fi) * | 1994-05-13 | 1994-05-13 | Modulaire Oy | Automatiskt styrningssystem foer obemannat fordon |
JP3657357B2 (ja) * | 1996-06-19 | 2005-06-08 | ヤンマー農機株式会社 | コンバイン |
JPH11137062A (ja) * | 1997-11-10 | 1999-05-25 | Yanmar Agricult Equip Co Ltd | 汎用コンバインの制御装置 |
JP2005211045A (ja) * | 2004-02-02 | 2005-08-11 | National Agriculture & Bio-Oriented Research Organization | コンバイン |
DE102011086021A1 (de) * | 2011-11-09 | 2013-05-16 | Deere & Company | Anordnung und Verfahren zur automatischen Dokumentation von Situationen bei der Feldarbeit |
DE102012223434B4 (de) * | 2012-12-17 | 2021-03-25 | Deere & Company | Verfahren und Anordnung zur Optimierung eines Betriebsparameters eines Mähdreschers |
JP5980162B2 (ja) * | 2013-04-26 | 2016-08-31 | 株式会社クボタ | コンバイン |
DE102015106302A1 (de) * | 2015-04-24 | 2016-10-27 | Claas Selbstfahrende Erntemaschinen Gmbh | Erntesystem mit einer selbstfahrenden Erntemaschine |
JP6509087B2 (ja) | 2015-09-25 | 2019-05-08 | 株式会社クボタ | コンバイン |
JP6700696B2 (ja) | 2015-09-18 | 2020-05-27 | 株式会社クボタ | コンバイン |
US9807932B2 (en) * | 2015-10-02 | 2017-11-07 | Deere & Company | Probabilistic control of an agricultural machine |
JP6566833B2 (ja) | 2015-10-20 | 2019-08-28 | ヤンマー株式会社 | マッピングシステム、マッピング装置及びコンピュータプログラム |
2017
- 2017-06-23 JP JP2017123438A patent/JP6887323B2/ja active Active
2018
- 2018-06-22 US US16/614,960 patent/US11170547B2/en active Active
- 2018-06-22 CN CN201880029855.3A patent/CN110582794A/zh active Pending
- 2018-06-22 EP EP18820302.0A patent/EP3644266A4/en active Pending
- 2018-06-22 WO PCT/JP2018/023791 patent/WO2018235942A1/ja active Application Filing
- 2018-06-22 KR KR1020197031592A patent/KR102618797B1/ko active IP Right Grant
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH10304734A (ja) | 1997-05-08 | 1998-11-17 | Iseki & Co Ltd | コンバイン等の倒伏判定装置 |
JPH11155340A (ja) | 1997-11-25 | 1999-06-15 | Yanmar Agricult Equip Co Ltd | 汎用コンバイン |
WO2014050524A1 (ja) * | 2012-09-26 | 2014-04-03 | 株式会社クボタ | 農作管理システム及び農作物収穫機 |
JP2016010371A (ja) * | 2014-06-30 | 2016-01-21 | 井関農機株式会社 | コンバイン |
JP2016086668A (ja) * | 2014-10-30 | 2016-05-23 | 井関農機株式会社 | コンバイン |
WO2016147521A1 (ja) * | 2015-03-18 | 2016-09-22 | 株式会社クボタ | コンバイン及びコンバインのための穀粒評価制御装置 |
Non-Patent Citations (1)
Title |
---|
See also references of EP3644266A4 |
Cited By (48)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US12010947B2 (en) | 2018-10-26 | 2024-06-18 | Deere & Company | Predictive machine characteristic map generation and control system |
US12069978B2 (en) | 2018-10-26 | 2024-08-27 | Deere & Company | Predictive environmental characteristic map generation and control system |
US11672203B2 (en) | 2018-10-26 | 2023-06-13 | Deere & Company | Predictive map generation and control |
US11240961B2 (en) | 2018-10-26 | 2022-02-08 | Deere & Company | Controlling a harvesting machine based on a geo-spatial representation indicating where the harvesting machine is likely to reach capacity |
US11589509B2 (en) | 2018-10-26 | 2023-02-28 | Deere & Company | Predictive machine characteristic map generation and control system |
US11178818B2 (en) | 2018-10-26 | 2021-11-23 | Deere & Company | Harvesting machine control system with fill level processing based on yield data |
US11653588B2 (en) | 2018-10-26 | 2023-05-23 | Deere & Company | Yield map generation and control system |
US11467605B2 (en) | 2019-04-10 | 2022-10-11 | Deere & Company | Zonal machine control |
US11234366B2 (en) | 2019-04-10 | 2022-02-01 | Deere & Company | Image selection for machine control |
US11650553B2 (en) | 2019-04-10 | 2023-05-16 | Deere & Company | Machine control using real-time model |
US11079725B2 (en) | 2019-04-10 | 2021-08-03 | Deere & Company | Machine control using real-time model |
US11778945B2 (en) | 2019-04-10 | 2023-10-10 | Deere & Company | Machine control using real-time model |
US11829112B2 (en) | 2019-04-10 | 2023-11-28 | Deere & Company | Machine control using real-time model |
US11957072B2 (en) | 2020-02-06 | 2024-04-16 | Deere & Company | Pre-emergence weed detection and mitigation system |
US11641800B2 (en) | 2020-02-06 | 2023-05-09 | Deere & Company | Agricultural harvesting machine with pre-emergence weed detection and mitigation system |
US12035648B2 (en) | 2020-02-06 | 2024-07-16 | Deere & Company | Predictive weed map generation and control system |
US11477940B2 (en) | 2020-03-26 | 2022-10-25 | Deere & Company | Mobile work machine control based on zone parameter modification |
US11849671B2 (en) | 2020-10-09 | 2023-12-26 | Deere & Company | Crop state map generation and control system |
US11927459B2 (en) | 2020-10-09 | 2024-03-12 | Deere & Company | Machine control using a predictive map |
US11711995B2 (en) | 2020-10-09 | 2023-08-01 | Deere & Company | Machine control using a predictive map |
US11675354B2 (en) | 2020-10-09 | 2023-06-13 | Deere & Company | Machine control using a predictive map |
US11825768B2 (en) | 2020-10-09 | 2023-11-28 | Deere & Company | Machine control using a predictive map |
US11844311B2 (en) | 2020-10-09 | 2023-12-19 | Deere & Company | Machine control using a predictive map |
US11845449B2 (en) | 2020-10-09 | 2023-12-19 | Deere & Company | Map generation and control system |
US11650587B2 (en) | 2020-10-09 | 2023-05-16 | Deere & Company | Predictive power map generation and control system |
US11849672B2 (en) | 2020-10-09 | 2023-12-26 | Deere & Company | Machine control using a predictive map |
US11864483B2 (en) | 2020-10-09 | 2024-01-09 | Deere & Company | Predictive map generation and control system |
US11871697B2 (en) | 2020-10-09 | 2024-01-16 | Deere & Company | Crop moisture map generation and control system |
US11874669B2 (en) | 2020-10-09 | 2024-01-16 | Deere & Company | Map generation and control system |
US11889787B2 (en) | 2020-10-09 | 2024-02-06 | Deere & Company | Predictive speed map generation and control system |
US11889788B2 (en) | 2020-10-09 | 2024-02-06 | Deere & Company | Predictive biomass map generation and control |
US11895948B2 (en) | 2020-10-09 | 2024-02-13 | Deere & Company | Predictive map generation and control based on soil properties |
US11727680B2 (en) | 2020-10-09 | 2023-08-15 | Deere & Company | Predictive map generation based on seeding characteristics and control |
US11946747B2 (en) | 2020-10-09 | 2024-04-02 | Deere & Company | Crop constituent map generation and control system |
US11635765B2 (en) | 2020-10-09 | 2023-04-25 | Deere & Company | Crop state map generation and control system |
US11983009B2 (en) | 2020-10-09 | 2024-05-14 | Deere & Company | Map generation and control system |
US12013698B2 (en) | 2020-10-09 | 2024-06-18 | Deere & Company | Machine control using a predictive map |
US11592822B2 (en) | 2020-10-09 | 2023-02-28 | Deere & Company | Machine control using a predictive map |
US12013245B2 (en) | 2020-10-09 | 2024-06-18 | Deere & Company | Predictive map generation and control system |
US11474523B2 (en) | 2020-10-09 | 2022-10-18 | Deere & Company | Machine control using a predictive speed map |
US12080062B2 (en) | 2020-10-09 | 2024-09-03 | Deere & Company | Predictive map generation based on seeding characteristics and control |
US12048271B2 (en) | 2020-10-09 | 2024-07-30 | Deere &Company | Crop moisture map generation and control system |
US20220110251A1 (en) | 2020-10-09 | 2022-04-14 | Deere & Company | Crop moisture map generation and control system |
US12069986B2 (en) | 2020-10-09 | 2024-08-27 | Deere & Company | Map generation and control system |
JP7525381B2 (ja) | 2020-11-30 | 2024-07-30 | 三菱マヒンドラ農機株式会社 | コンバイン |
US12127500B2 (en) | 2021-01-27 | 2024-10-29 | Deere & Company | Machine control using a map with regime zones |
US12082531B2 (en) | 2022-01-26 | 2024-09-10 | Deere & Company | Systems and methods for predicting material dynamics |
US12058951B2 (en) | 2022-04-08 | 2024-08-13 | Deere & Company | Predictive nutrient map and control |
Also Published As
Publication number | Publication date |
---|---|
KR20200019847A (ko) | 2020-02-25 |
CN110582794A (zh) | 2019-12-17 |
KR102618797B1 (ko) | 2023-12-29 |
US11170547B2 (en) | 2021-11-09 |
EP3644266A4 (en) | 2021-03-10 |
JP6887323B2 (ja) | 2021-06-16 |
US20200202596A1 (en) | 2020-06-25 |
JP2019008536A (ja) | 2019-01-17 |
EP3644266A1 (en) | 2020-04-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2018235942A1 (ja) | コンバイン、圃場営農マップ生成方法、圃場営農マップ生成プログラム及び圃場営農マップ生成プログラムが記録された記録媒体 | |
US9696162B2 (en) | Mission and path planning using images of crop wind damage | |
EP3872722B1 (en) | Network-based work machine software optimization | |
US10278329B2 (en) | Grain management system and combine | |
JP2019004771A (ja) | コンバイン | |
JP2019004772A (ja) | 収穫機 | |
EP4245117A1 (en) | Systems and methods for predictive reel control | |
US20240037806A1 (en) | System and method of assisted or automated unload synchronization | |
US20240032469A1 (en) | System and method of assisted or automated unload synchronization | |
JP2020178619A (ja) | 農作業機 | |
JP7527838B2 (ja) | 農作業機 | |
US20230012175A1 (en) | Threshing Status Management System, Method, and Program, and Recording Medium for Threshing State Management Program, Harvester Management System, Harvester, Harvester Management Method and Program, and Recording Medium for Harvester Management Program, Work Vehicle, Work Vehicle Management Method, System, and Program, and Recording Medium for Work Vehicle Management Program, Management System, Method, and Program, and Recording Medium for Management Program | |
JP2024091691A (ja) | 圃場作業車 | |
WO2022123889A1 (ja) | 作業車、作物状態検出システム、作物状態検出方法、作物状態検出プログラム、及び作物状態検出プログラムが記録されている記録媒体 | |
CN113923977B (zh) | 自动行驶系统、农业机械、记录介质以及方法 | |
WO2020262287A1 (ja) | 農作業機、自動走行システム、プログラム、プログラムを記録した記録媒体、及び方法 | |
WO2020218528A1 (ja) | 収穫機等の農作業機 | |
US20240142980A1 (en) | Agronomy Data Utilization System And Method | |
US20240284826A1 (en) | Agricultural operation evaluation system and method | |
US20230309449A1 (en) | Predictive machine setting map generation and control system | |
JP7321087B2 (ja) | 収穫機管理システム、収穫機、及び収穫機管理方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 18820302 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 20197031592 Country of ref document: KR Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2018820302 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2018820302 Country of ref document: EP Effective date: 20200123 |