WO2014093814A1 - Predictive load estimation by forward-looking vision - Google Patents
Predictive load estimation by forward-looking vision
- Publication number
- WO2014093814A1 (PCT/US2013/074999)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- agricultural machine
- machine
- crop material
- header
- controller
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01D—HARVESTING; MOWING
- A01D41/00—Combines, i.e. harvesters or mowers combined with threshing devices
- A01D41/12—Details of combines
- A01D41/127—Control or measuring arrangements specially adapted for combines
- A01D41/1271—Control or measuring arrangements specially adapted for combines for measuring crop flow
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01D—HARVESTING; MOWING
- A01D75/00—Accessories for harvesters or mowers
- A01D75/18—Safety devices for parts of the machines
- A01D75/182—Avoiding overload
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01D—HARVESTING; MOWING
- A01D41/00—Combines, i.e. harvesters or mowers combined with threshing devices
- A01D41/12—Details of combines
- A01D41/127—Control or measuring arrangements specially adapted for combines
Definitions
- the present disclosure is generally related to agriculture technology, and, more particularly, computer-assisted farming.
- combine harvesters may employ a form of cruise control, with the perceived benefits of preventing operator fatigue and maximizing machine efficiency.
- FIG. 1 is a schematic diagram of an example combine harvester showing an embodiment of a detection system.
- FIG. 2 is a schematic diagram showing an overhead perspective view of a feeder house and a header as viewed from the perspective of an imaging system mounted to the top of an operator cab of the combine harvester.
- FIG. 3A is a block diagram showing an embodiment of a detection system.
- FIG. 3B is a block diagram showing an embodiment of a controller for the detection system of FIG. 3A.
- FIG. 4 is a flow diagram that illustrates an example embodiment of a detection method.
- a method comprising receiving a scan or plural images of crop material located in front of, and proximal to, a header coupled to a front portion of an agricultural machine; determining a crop material parameter based on the scan or the plural images; and adjusting a machine parameter of the agricultural machine based on the crop material parameter.
- the detection system detects the amount of crop material (e.g., crops, weeds, etc.) that is about to be gathered (e.g., harvested) by an agricultural machine, such as a combine harvester (hereinafter, a combine harvester is referred to as a combine), and adjusts one or more machine parameters to enable the agricultural machine to suitably react to the anticipated crop material load.
- the agricultural machine may have a header coupled to the front of the agricultural machine, and the detection system detects the amount of crop material in advance of the header engaging the crop material.
- the detection system comprises an imaging system and a controller.
- the imaging system may comprise plural cameras or a laser scanner that capture plural images or a scan, respectively.
- the imaging system is mounted on the agricultural machine in a manner that enables the crop material located in front of the header to be within detection range.
- the controller which may be embodied as a programmable logic controller (PLC), microcontroller, processor(s), or computer (or other computing device), receives the plural images or scan and determines one or more crop material parameters.
- the crop material parameters may include crop height, crop density, crop moisture, or any combination of these or other parameters that impact the machine load.
- the detection system embodiments disclosed herein enable a determination of the one or more crop material parameters for an area of crop material located in front of the header to facilitate prediction of upcoming changes in machine load, and ultimately as an input to help optimize ground speed and/or other machine settings.
- one or more embodiments of detection systems comprise a plurality of cameras that are located in a position that enables capture of the crop material in front of a header from different perspectives.
- the controller pairs these plural images to provide a stereoscopic image.
- a point cloud comprising three dimensional coordinates that provide an outline of the crop material is generated based on the stereoscopic image, and from the point cloud, a cross sectional area of the crop material is determined. In some embodiments, the generation of a point cloud may be omitted.
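The cross-sectional area step described above can be sketched as follows. This is an illustrative reconstruction, not the patent's implementation: the function name, the binning scheme, and the coordinate conventions (x across the header width, z as height above ground, units in meters) are all assumptions.

```python
# Hypothetical sketch: estimate the cross-sectional area of standing crop
# from point-cloud surface points by binning them across the header width
# and integrating the tallest height found in each bin.

def cross_sectional_area(points, header_width, num_bins=20):
    """points: iterable of (x, y, z) tuples in meters; returns area in m^2."""
    bin_width = header_width / num_bins
    heights = [0.0] * num_bins
    for x, _y, z in points:
        if not 0.0 <= x < header_width:
            continue  # ignore points outside the header span
        i = min(int(x / bin_width), num_bins - 1)
        if z > heights[i]:
            heights[i] = z  # keep the tallest crop point seen in this bin
    # Riemann-sum approximation of the crop outline's cross-section
    return sum(h * bin_width for h in heights)

# Example: a uniform 1.2 m tall crop across a 6 m header
pts = [(x * 0.1, 0.0, 1.2) for x in range(60)]
print(round(cross_sectional_area(pts, 6.0), 2))  # 7.2 m^2
```

The max-per-bin choice treats the crop outline as the tallest surface seen in each strip; an implementation could equally use a percentile to reject outlier points.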
- some embodiments may use other types of imaging systems, such as laser radar topography, among other types of imaging systems using other portions of the electromagnetic spectrum.
- the imaging system may further enable determination of one or more of the crop material parameters based on detection of crop material between the header and the front of the agricultural machine (e.g., at the feeder house).
- although the description identifies or describes specifics of one or more embodiments, such specifics are not necessarily part of every embodiment, nor are all various stated advantages necessarily associated with a single embodiment or all embodiments. On the contrary, the intent is to cover all alternatives, modifications and equivalents included within the spirit and scope of the disclosure as defined by the appended claims. Further, it should be appreciated in the context of the present disclosure that the claims are not necessarily limited to the particular embodiments set out in the description.
- referring to FIG. 1, shown is an example agricultural machine embodied as a combine 10 in which an embodiment of a detection system may be implemented.
- it should be appreciated that the combine 10 of FIG. 1 is merely illustrative, and that other combine configurations may be implemented in some embodiments.
- some embodiments of detection systems may be used in combines having an axial flow, hybrid, or dual rotor configuration, among other combine configurations, and hence are contemplated to be within the scope of the disclosure.
- the example combine 10 is shown in FIG. 1 without a header, and from front to back, comprises a feeder house 12 and an operator cab 14, followed by a processing compartment that includes a processing apparatus 16.
- the combine 10 includes a harvesting header (shown in FIG. 2, as described later) at the front of the machine that cuts crop materials and delivers the cut crop materials to the front end of the feeder house 12.
- Such crop materials are moved upwardly and rearwardly within and beyond the feeder house 12 by a conveyer 18 until reaching a thresher rotor 20 of the processing apparatus 16.
- the thresher rotor 20 comprises a single, transverse rotor, such as that found in a Gleaner® Super Series Combine by AGCO.
- the thresher rotor 20 processes the crop materials in known manner and passes a portion of the crop material (e.g., heavier chaff, corn stalks, etc.) toward the rear of the combine 10 and another portion (e.g., grain and possibly light chaff) through a cleaning process, as described below.
- the conveyor 18 may convey the cut crop material to a beater before reaching a rotor or rotors.
- the crop materials undergo threshing and separating operations.
- the crop materials are threshed and separated by the thresher rotor 20 operating in cooperation with certain elements of a cage 22, for instance, well-known foraminous processing members in the form of threshing concave assemblies and separator grate assemblies, with the grain (and possibly light chaff) escaping through the concave assemblies and the grate assemblies and onto one or more distribution augers 24 located beneath the processing apparatus 16.
- Bulkier stalk and leaf materials are generally retained by the concave assemblies and the grate assemblies and are disbursed out from the processing apparatus 16 and ultimately out of the rear of the combine 10.
- the distribution augers 24 uniformly spread the crop material that falls upon them, with the spread crop material conveyed to accelerator rolls 26.
- the accelerator rolls 26 speed the descent of the crop material toward a cleaning assembly 28.
- the cleaning assembly 28 includes a transverse fan 30 (or equivalently, a blower), which facilitates the cleaning of the heavier crop material directly beneath the accelerator rolls 26 while causing the chaff to be carried out of the rear of the combine 10.
- the cleaning assembly 28 also includes plural stacked sieves 32, through which the fan 30 provides an additional push or influence of the chaff flow to the rear of the combine 10.
- the cleaned grain that drops to the bottom of the cleaning assembly 28 is delivered by an auger 34 that transports the grain to a well-known elevator mechanism (not shown), which conveys the grain to a grain bin 36 located at the top of the combine 10. Any remaining chaff and partially threshed grain is recirculated through the processing apparatus 16 via a tailings return auger 38.
- the example combine 10 also comprises a detection system 40, which in one embodiment comprises an imaging system 42 mounted on the combine 10, and a controller 44 (shown schematically). Though depicted in the operator cab 14, the controller 44 may be located elsewhere on the combine 10 in some embodiments.
- the detection system 40 comprises the imaging system 42, which comprises two cameras 46A and 46B (collectively, cameras 46, schematically depicted in FIG. 1 ) mounted to the top of the operator cab 14, and a controller 44. It should be appreciated that the quantity and/or the location of the cameras 46 may vary in some embodiments.
- the imaging system 42 is generally mounted at a location on the combine 10 (any of a plurality of places) within a range and viewpoint that enables the capture of images (or scans, in embodiments where a scanner replaces one of the cameras and the other camera is omitted) of crop material located ahead of the header that is coupled to the front of the combine 10.
- the cameras 46A, 46B are configured to operate in the visible light spectrum, and are depicted in this example as offset symmetrically across a longitudinal centerline of the combine 10, although it is not necessary for the cameras 46A and 46B to be symmetrically offset or offset with respect to the centerline.
- the cameras 46A, 46B are positioned to capture images of the crop material (e.g., uncut crops or weeds, or in some embodiments, cut crops) located proximal to, and in front of, the header of the combine 10.
- the captured image may reveal one or more crop material parameters, such as a height of the crops along all or a portion of a width of the header, the density, and/or moisture content (e.g., via the color).
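The color-based moisture cue mentioned above can be illustrated with a crude heuristic; this is a hypothetical sketch, not the patent's method, and a real system would need calibration against measured moisture.

```python
# Hypothetical illustration: a relative-moisture index from pixel color,
# assuming greener crop reads as higher moisture content.

def relative_moisture_index(pixels):
    """pixels: iterable of (r, g, b) tuples, 0-255; returns 0.0-1.0."""
    total, green = 0, 0.0
    for r, g, b in pixels:
        s = r + g + b
        if s:
            green += g / s  # fraction of green per pixel
        total += 1
    return green / total if total else 0.0

# Dry (yellowish) vs. lush (green) sample patches
dry = [(200, 180, 60)] * 10
lush = [(60, 180, 60)] * 10
print(relative_moisture_index(dry) < relative_moisture_index(lush))  # True
```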
- the pair of images captured by the cameras 46A, 46B are used to produce stereo images and in some embodiments, a point cloud (or otherwise, three dimensional coordinates), as described below.
- the imaging system 42 may operate in the non-visible spectrum, such as infrared or ultraviolet, or use ultrasonic sensing, among other ranges.
- the imaging system 42 may be embodied as a laser radar topography system (referred to herein as a scanner or laser scanner).
- FIG. 2 shows a view from a location proximal to the top of the operator cab 14, which may be the view seen by one of the cameras 46A (FIG. 1 ) of the imaging system 42 (FIG. 1 ).
- the feeder house 12 has secured to it a header 48, shown partially in FIG. 2, which may be removed and replaced with other types of headers depending on the application.
- certain embodiments of a detection system 40 may include other types of headers, such as pickup headers, corn headers, etc.
- the header 48 comprises a cutting portion 50 for cutting the crops and a transition portion 52 that conveys (e.g., using a conveyor, such as a belt or belts, chain and slat configuration, etc.) the cut crops toward a rear, center portion 54 of the header 48, as is known.
- the center portion 54 may comprise a feeder auger (not shown) to advance the harvested crop material into the feeder house 12, where the conveyor 18 (FIG. 1 ) conveys the crop material toward the processing apparatus 16.
- the imaging system 42 captures plural images (or a scan) of the crop material located in front of, and proximal to, the header 48.
- the detection system 40 (FIG. 1 ) enables a predictive determination of the load of the crop material that is to move through the combine 10, to prevent plugging or overloading of components in the combine 10, such as components in the feeder house 12.
- the plural images of a given area comprising crop material are captured by the cameras 46A and 46B (FIG. 1 ) (or in some embodiments, images or scans are captured by a scanner), and the images or scan(s) may be communicated (e.g., over a wired connection or network, such as via a controller area network (CAN), or wirelessly) to the controller 44 (FIG. 1 ).
- the communication of images or scans may be implemented continuously, regularly (e.g., periodically, every defined quantity of feet of travel of the combine 10, every fixed time interval, etc.) or irregularly or aperiodically (e.g., responsive to a given event, such as operator intervention locally or remotely, or at random intervals).
- the captured images or scan(s) are received at the controller 44 (FIG. 1 ), which pairs the images and in one embodiment determines a point cloud to enable a determination of one or more crop material parameters, such as crop height, crop density, and/or moisture content.
- a determination may be a relative and/or absolute determination.
- the detection system 40 may determine that the imaged crop material meets one or more defined conditions, and enables the combine 10 (FIG. 1 ) to continue operating according to its current state of operations until the one or more defined conditions vary according to some threshold amount of change.
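The "continue until the conditions vary by some threshold amount" behavior above can be sketched as a simple relative-change check; the 15% default is an illustrative assumption, not a value from the patent.

```python
# Minimal sketch of the defined-conditions check: keep the current operating
# state unless a monitored crop parameter drifts by more than a threshold
# fraction of its prior value.

def needs_adjustment(previous, current, threshold=0.15):
    if previous == 0:
        return current != 0
    return abs(current - previous) / abs(previous) > threshold

print(needs_adjustment(2.0, 2.1))  # False: within 15% of the prior value
print(needs_adjustment(2.0, 2.5))  # True: a 25% change exceeds the threshold
```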
- the detection system 40 may perform a calibration procedure (e.g., in the assembly plant or field where determinations of crop material parameters are based on known object parameters imaged during the calibration procedure) or target certain features of known parameters (e.g., known header width, height, etc.) and associate the same with a benchmark to enable the crop material determinations.
- the controller 44 may determine the moisture content from the color of the imaged or scanned crop material. Based on the image or scan, the controller 44 determines an optimum or desirous machine parameter, such as ground speed of the combine 10, direction of the combine 10, or other combine settings pertaining to combine operations (e.g., concave clearance, cleaning fan speed, etc.).
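One way the controller could map a predicted crop load to a ground-speed setting is a throughput-targeting rule, sketched below. The target throughput, speed limits, and function names are illustrative assumptions; the patent does not specify a control law.

```python
# Hedged sketch: choose a ground speed so that estimated crop throughput
# stays near a target value.
# throughput [kg/s] ~ speed [m/s] * cross_section [m^2] * density [kg/m^3]

def target_ground_speed(cross_section_m2, density_kg_m3,
                        target_throughput_kg_s=25.0,
                        min_speed=0.5, max_speed=3.5):
    load_per_meter = cross_section_m2 * density_kg_m3  # kg of crop per meter
    if load_per_meter <= 0:
        return max_speed  # no crop detected ahead: run at top working speed
    speed = target_throughput_kg_s / load_per_meter
    return max(min_speed, min(speed, max_speed))

# Heavy crop ahead -> slow down; light crop -> speed up (clamped)
print(round(target_ground_speed(7.2, 2.5), 2))  # 1.39
print(target_ground_speed(1.0, 2.5))            # 3.5 (clamped)
```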
- the controller 44 adjusts a parameter corresponding to one or more of these machine parameters, and provides a corresponding control signal to an actuator (e.g., hydraulic valve, solenoid, or other known actuating devices) to cause the adjustment in the machine parameter to take effect.
- the control signal from the controller 44 may be provided to a hydraulic control valve that adjusts the flow of fluid to one or more cylinders associated with a steering mechanism or header height adjustment for the combine 10, causing an adjustment in the direction of the combine 10 or the height of the header 48.
- the speed or other machine parameters may be adjusted, as should be appreciated by one having ordinary skill in the art.
- adjustments are made to avoid an excessive amount of crop material entering the center portion 54 and clogging or plugging the mechanisms of the feeder house 12. Because these adjustments are made before the crop material reaches the header 48, such problems are avoided or mitigated.
- the control signal may be delivered to one or more devices upstream of the device responsible for the physical adjustment in the machine parameter, or directly to the actuating device that effects the adjustment in the setting.
- the imaging system 42 may be used to prompt additional actions and/or other actions (e.g., not directly involving the crop material parameter for crop material located ahead of the header 48).
- the imaging system 42 may detect an obstacle located within the detection range ahead of the header 48.
- one or more of the cameras 46A, 46B may capture an image of an obstacle.
- the controller 44 may receive that image (or a stereo image) and determine (e.g., through well-known vision and/or feature detection algorithms) the presence of the obstacle and adjust one or more machine parameters to avoid the obstacle (e.g., such as lifting the header 48, stopping the combine 10, steering evasively, etc.).
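The obstacle responses named above (lifting the header, stopping, steering evasively) could be selected by simple decision logic; the thresholds and names below are hypothetical, for illustration only.

```python
# Hedged sketch of obstacle-response selection: pick a machine reaction
# based on a detected obstacle's distance and height relative to the header.

def obstacle_response(distance_m, height_m, header_height_m):
    if distance_m < 3.0:
        return "stop"          # too close to react any other way
    if height_m <= header_height_m + 0.2:
        return "lift_header"   # low obstacle: raise the header over it
    return "steer_around"      # tall obstacle: evasive steering

print(obstacle_response(10.0, 0.3, 0.5))  # lift_header
print(obstacle_response(2.0, 0.3, 0.5))   # stop
print(obstacle_response(10.0, 2.0, 0.5))  # steer_around
```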
- the controller 44 may alert an operator in the operator cab 14 and/or other personnel located elsewhere (e.g., remotely) of the obstacle or even provide an alert of the adjustments before (e.g., seeking approval, or making a recommendation), during, or after they are implemented. Such alerts may be in the form of an audible, visible, and/or tactile alert on or associated with user interface equipment (e.g., displays, headsets, joysticks, etc.) in the operator cab 14 (or remotely).
- the controller 44 may generate crop density maps based on the imaged or scanned areas of the traversed field, and record the same in a storage device (e.g., memory stick, memory, etc.) and/or communicate the same remotely.
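A crop density map of the kind described above could be structured as a grid keyed by coarse position cells; this is an assumed structure, not the patent's implementation, and the cell size and running-mean scheme are illustrative.

```python
# Illustrative sketch: accumulate per-location density estimates into a
# simple grid map keyed by GPS-derived cell coordinates.

def add_to_density_map(density_map, easting_m, northing_m, density,
                       cell_m=10.0):
    cell = (int(easting_m // cell_m), int(northing_m // cell_m))
    count, mean = density_map.get(cell, (0, 0.0))
    # running mean of density readings falling within the same cell
    density_map[cell] = (count + 1, mean + (density - mean) / (count + 1))

field_map = {}
for e, n, d in [(3.0, 4.0, 2.0), (5.0, 6.0, 4.0), (25.0, 4.0, 1.5)]:
    add_to_density_map(field_map, e, n, d)
print(field_map[(0, 0)])  # (2, 3.0): two readings averaged in cell (0, 0)
```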
- the controller 44 may cause the adjustment of one or more machine parameters autonomously, such as in an automated or semi-automated agricultural system.
- the controller 44 may cause the adjustment of one or more machine parameters with some operator (e.g., located in the operator cab 14 or in a remote facility for remote-controlled operations) involvement, such as to provide an alert or notification of the adjustment or an impending adjustment (e.g., allowing the operator to allow or disallow or override the adjustment).
- the controller 44 may merely cause a visual or audible (e.g., verbal or via a sound, such as a buzzer) notification that involves a recommendation (e.g., shown on a graphical user interface on a console in the operator cab 14) as to the appropriate adjustment that the operator should make in view of an assessment by the controller 44 of the image(s) or scan.
- the controller 44 may include a computer or microcontroller or other computing device embodied in a single package (e.g., enclosure) or with similar functionality distributed among several components.
- the controller 44 may receive the plural images and pair the images to provide a stereoscopic image.
- the stereoscopic image may be decomposed into, or otherwise represented by, a point cloud, which the controller 44 uses to determine one or more of the crop material parameters.
- some or all of the functionality of the controller 44 may be implemented in the cameras 46A and 46B (FIG. 1 ) or scanner.
- the plural images (or scan) are communicated by one or more of the cameras 46A and 46B to the controller 44, which then generates the point cloud and determines the crop material parameter(s).
- the controller 44 may then determine (e.g., approximate) the crop density, and display the same on a computer monitor or other display device (or in some embodiments, store to a storage device such as memory or generally a computer readable medium) located proximally to, or remotely from, the detection system 40.
- the crop density map may be based in part on positioning information input to the controller 44, such as coordinates provided by a global positioning system (GPS) or other positioning systems or mechanisms.
- the images (or scan(s)), the stereoscopic images, the point cloud, and/or the controller determinations may be communicated to a remote processing system (e.g., computer) located remotely from the combine 10 (FIG. 1 ), such as in a farm management office, farmer's home, or elsewhere.
- some or all of the functionality of the controller 44 may be performed (e.g., in addition to or in lieu of local computations) in a remote processing system based on communications of the plural images or the point cloud.
- Such communication may be performed over a wireless network and/or combination of wired (e.g., landline phone or cable system) and wireless (e.g., from a transceiver in the combine 10).
- the combine 10 may be operated and/or at least controlled in part from a remote location, based on the communicated feedback from the detection system 40 (FIG. 1 ).
- FIG. 3A illustrates an embodiment of a detection system 40.
- the detection system 40 comprises the controller 44 coupled in a CAN network 56 (though not limited to a CAN network or a single network) to the imaging system 42, machine controls 58, and a user interface 60.
- the imaging system 42 has been described already, and may include visible and non-visible spectrum devices, such as cameras, laser radar technology, etc.
- the machine controls 58 collectively comprise the various actuators, sensors, and/or controlled devices residing on the combine 10 (FIG. 1 ).
- the user interface 60 may be a keyboard, mouse, microphone, touch-type display device, or other devices (e.g., switches) that enable input by an operator (e.g., such as while in the operator cab 14 (FIG. 1 )).
- the controller 44 receives and processes the information from the imaging system 42 and delivers control signals to the machine controls 58 (e.g., directly, or indirectly through an intermediary device in some embodiments).
- the controller 44 may receive input from the machine controls 58 (e.g., such as to enable feedback as to the position or status of certain devices, such as header height, speed of the combine 10, etc.), and/or receive input from other devices, such as global positioning devices, transceivers, etc.
- the controller 44 may also receive input from the user interface 60, such as during the process of adjustment to provide feedback of a change in machine parameters or an impending change or need or recommendation for change.
- FIG. 3B further illustrates an example embodiment of the controller 44.
- the example controller 44 is merely illustrative, and that some embodiments of controllers may comprise fewer or additional components, and/or some of the functionality associated with the various components depicted in FIG. 3B may be combined, or further distributed among additional modules, in some embodiments.
- the controller 44 is depicted in this example as a computer system, but may be embodied as a programmable logic controller (PLC), FPGA, among other devices. It should be appreciated that certain well-known components of computer systems are omitted here to avoid obfuscating relevant features of the controller 44.
- the controller 44 comprises one or more processing units, such as processing unit 62, input/output (I/O) interface(s) 64, and memory 66, all coupled to one or more data busses, such as data bus 68.
- the memory 66 may include any one or a combination of volatile memory elements (e.g., random-access memory (RAM), such as DRAM and SRAM) and nonvolatile memory elements (e.g., ROM, hard drive, tape, CD-ROM, etc.).
- the memory 66 may store a native operating system, one or more native applications, emulation systems, or emulated applications for any of a variety of operating systems and/or emulated hardware platforms, emulated operating systems, etc. In the embodiment depicted in FIG.
- the memory 66 comprises an operating system 70, and crop material parameter determination software 72 that in one embodiment comprises stereo/point cloud software 74 and obstacle detection software 76. It should be appreciated that in some embodiments, additional or fewer software modules (e.g., combined functionality) may be employed in the memory 66 or additional memory. In some embodiments, a separate storage device may be coupled to the data bus 68, such as a persistent memory (e.g., optical, magnetic, and/or semiconductor memory and associated drives).
- the crop material parameter determination software 72 determines crop material parameters such as the height, density, and/or moisture content from the plural images or scan received from the imaging system 42 (FIG. 3A).
- the stereo/point cloud software 74 enables the pairing of plural images and generation of three-dimensional coordinates of the paired images or scan, enabling the determination of the one or more crop material parameters.
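The core of generating three-dimensional coordinates from a paired (rectified) stereo image is standard triangulation; the sketch below assumes a pinhole camera model with illustrative calibration values, and is not the patent's calibration or code.

```python
# Minimal stereo-triangulation sketch: depth from disparity for one matched
# pixel pair, assuming rectified cameras with known focal length, baseline,
# and principal point (all values here are illustrative assumptions).

def triangulate(x_left_px, x_right_px, y_px,
                focal_px=800.0, baseline_m=0.5, cx=640.0, cy=360.0):
    """Return a 3-D point (X, Y, Z) in meters from one matched pixel pair."""
    disparity = x_left_px - x_right_px
    if disparity <= 0:
        raise ValueError("point at or beyond infinity")
    z = focal_px * baseline_m / disparity  # depth: Z = f * B / d
    x = (x_left_px - cx) * z / focal_px    # back-project to metric X
    y = (y_px - cy) * z / focal_px
    return (x, y, z)

# A 40 px disparity with f = 800 px and B = 0.5 m gives a depth of 10 m
print(triangulate(680.0, 640.0, 360.0))  # (0.5, 0.0, 10.0)
```

Repeating this over every matched pixel yields the point cloud from which the crop material parameters are derived.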
- the obstacle detection software 76 enables the detection of obstacles according to well-known vision and/or feature recognition algorithms.
- the processing unit 62 may be embodied as a custom-made or commercially available processor, a central processing unit (CPU) or an auxiliary processor among several processors, a semiconductor based microprocessor (in the form of a microchip), a macroprocessor, one or more application specific integrated circuits (ASICs), a plurality of suitably configured digital logic gates, and/or other well- known electrical configurations comprising discrete elements both individually and in various combinations to coordinate the overall operation of the controller 44.
- the I/O interfaces 64 provide one or more interfaces to the network 56 (FIG. 3A).
- the I/O interfaces 64 may comprise any number of interfaces for the input and output of signals (e.g., analog or digital data) for conveyance over the network 56.
- the input may comprise input by an operator (local or remote) through the user interface 60 (e.g., a keyboard or mouse or other input device (or audible input in some embodiments)), and input from signals carrying information from one or more of the components of the detection system 40 (FIG. 3A), such as machine controls 58 (FIG. 3A), among other devices.
- when certain embodiments of the controller 44 are implemented at least in part as software (including firmware), as depicted in FIG. 3B, it should be noted that the software can be stored on a variety of non-transitory computer-readable media for use by, or in connection with, a variety of computer-related systems or methods.
- a computer-readable medium may comprise an electronic, magnetic, optical, or other physical device or apparatus that may contain or store a computer program (e.g., executable code or instructions) for use by or in connection with a computer-related system or method.
- the software may be embedded in a variety of computer-readable mediums for use by, or in connection with, an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions.
- when certain embodiments of the controller 44 are implemented at least in part as hardware, such functionality may be implemented with any or a combination of the following technologies, which are all well known in the art: discrete logic circuits having logic gates for implementing logic functions upon data signals, an application specific integrated circuit (ASIC) having appropriate combinational logic gates, a programmable gate array (PGA), a field programmable gate array (FPGA), etc.
- one embodiment of a detection method comprises receiving a scan or plural images of crop material located in front of, and proximal to, a header coupled to a front portion of an agricultural machine (80); determining a crop material parameter based on the scan or the plural images (82); and adjusting a machine parameter of the agricultural machine based on the crop material parameter (84).
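The three steps of the method (80, 82, 84) can be sketched as a pipeline; the helper names below are placeholders for whatever imaging, estimation, and actuation code a real machine would provide, not APIs from the patent.

```python
# Illustrative pipeline for the detection method: (80) receive images,
# (82) determine a crop material parameter, (84) adjust a machine parameter.

def detection_step(capture_images, estimate_parameter, apply_adjustment):
    images = capture_images()               # (80) receive scan/plural images
    parameter = estimate_parameter(images)  # (82) determine crop parameter
    return apply_adjustment(parameter)      # (84) adjust machine parameter

# Stub wiring, for illustration only
result = detection_step(
    capture_images=lambda: ["left.png", "right.png"],
    estimate_parameter=lambda imgs: {"crop_height_m": 1.1},
    apply_adjustment=lambda p: ("set_ground_speed",
                                2.0 if p["crop_height_m"] > 1.0 else 3.0),
)
print(result)  # ('set_ground_speed', 2.0)
```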
Landscapes
- Life Sciences & Earth Sciences (AREA)
- Environmental Sciences (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
Abstract
A method comprising receiving a scan, or a plurality of images, of crop material located in front of and proximal to a header coupled to a front portion of an agricultural machine; determining a crop material parameter from the scan and/or the plurality of images; and adjusting a machine parameter of the agricultural machine based on that crop material parameter.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201261737223P | 2012-12-14 | 2012-12-14 | |
US61/737,223 | 2012-12-14 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014093814A1 (fr) | 2014-06-19 |
Family
ID=50934988
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2013/074999 WO2014093814A1 (fr) | 2012-12-14 | 2013-12-13 | Estimation de charge prédictive par vision frontale |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2014093814A1 (fr) |
2013-12-13: PCT application PCT/US2013/074999 filed (WO, published as WO2014093814A1 (fr)); legal status: active, Application Filing.
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030004630A1 (en) * | 2001-06-28 | 2003-01-02 | Deere & Company | System for measuring the amount of crop to be harvested |
US20050279070A1 (en) * | 2004-06-21 | 2005-12-22 | Peter Pirro | Self-propelled harvesting machine |
US20100063680A1 (en) * | 2008-09-11 | 2010-03-11 | Jonathan Louis Tolstedt | Leader-follower semi-autonomous vehicle with operator on side |
Cited By (53)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3132711A1 (fr) | 2015-08-17 | 2017-02-22 | CLAAS Selbstfahrende Erntemaschinen GmbH | Agricultural harvesting machine |
DE102015113527A1 (de) | 2015-08-17 | 2017-02-23 | Claas Selbstfahrende Erntemaschinen Gmbh | Agricultural harvesting machine |
US9807926B2 (en) | 2015-08-17 | 2017-11-07 | Claas Selbstfahrende Erntemaschinen Gmbh | Agricultural harvesting machine |
DE102015122269A1 (de) * | 2015-12-18 | 2017-06-22 | Claas Selbstfahrende Erntemaschinen Gmbh | Method for operating a combine harvester |
US10448569B2 (en) | 2015-12-18 | 2019-10-22 | Claas Selbstfahrende Erntemaschinen Gmbh | Method and apparatus for operating a combine harvester |
EP3300580A1 (fr) | 2016-09-30 | 2018-04-04 | CLAAS Selbstfahrende Erntemaschinen GmbH | Combine harvester provided with a cutter bar and device for controlling a cutter bar |
DE102016118637A1 (de) | 2016-09-30 | 2018-04-05 | Claas Selbstfahrende Erntemaschinen Gmbh | Combine harvester with a cutting unit and control of a cutting unit |
RU2784488C2 (ru) * | 2018-01-16 | 2022-11-28 | MacDon Industries Ltd. | Apparatus for harvesting an agricultural crop (variants) |
US11812694B2 (en) | 2018-01-29 | 2023-11-14 | Deere & Company | Monitor system for a harvester |
US11744180B2 (en) | 2018-01-29 | 2023-09-05 | Deere & Company | Harvester crop mapping |
US12010947B2 (en) | 2018-10-26 | 2024-06-18 | Deere & Company | Predictive machine characteristic map generation and control system |
US11589509B2 (en) | 2018-10-26 | 2023-02-28 | Deere & Company | Predictive machine characteristic map generation and control system |
US11653588B2 (en) | 2018-10-26 | 2023-05-23 | Deere & Company | Yield map generation and control system |
US11672203B2 (en) | 2018-10-26 | 2023-06-13 | Deere & Company | Predictive map generation and control |
US11178818B2 (en) | 2018-10-26 | 2021-11-23 | Deere & Company | Harvesting machine control system with fill level processing based on yield data |
US11240961B2 (en) | 2018-10-26 | 2022-02-08 | Deere & Company | Controlling a harvesting machine based on a geo-spatial representation indicating where the harvesting machine is likely to reach capacity |
US11079725B2 (en) | 2019-04-10 | 2021-08-03 | Deere & Company | Machine control using real-time model |
US11829112B2 (en) | 2019-04-10 | 2023-11-28 | Deere & Company | Machine control using real-time model |
US11467605B2 (en) | 2019-04-10 | 2022-10-11 | Deere & Company | Zonal machine control |
US11778945B2 (en) | 2019-04-10 | 2023-10-10 | Deere & Company | Machine control using real-time model |
US11650553B2 (en) | 2019-04-10 | 2023-05-16 | Deere & Company | Machine control using real-time model |
US11234366B2 (en) | 2019-04-10 | 2022-02-01 | Deere & Company | Image selection for machine control |
US11632905B2 (en) | 2019-12-09 | 2023-04-25 | Precision Planting Llc | Methods and imaging systems for harvesting |
WO2021116802A1 (fr) * | 2019-12-09 | 2021-06-17 | Precision Planting Llc | Methods and imaging systems for harvesting |
US11758846B2 (en) | 2019-12-23 | 2023-09-19 | Cnh Industrial America Llc | Header control system to adjust a header of a harvester based on sensor information |
US11957072B2 (en) | 2020-02-06 | 2024-04-16 | Deere & Company | Pre-emergence weed detection and mitigation system |
US11641800B2 (en) | 2020-02-06 | 2023-05-09 | Deere & Company | Agricultural harvesting machine with pre-emergence weed detection and mitigation system |
US11477940B2 (en) | 2020-03-26 | 2022-10-25 | Deere & Company | Mobile work machine control based on zone parameter modification |
US11675354B2 (en) | 2020-10-09 | 2023-06-13 | Deere & Company | Machine control using a predictive map |
US11874669B2 (en) | 2020-10-09 | 2024-01-16 | Deere & Company | Map generation and control system |
US11711995B2 (en) | 2020-10-09 | 2023-08-01 | Deere & Company | Machine control using a predictive map |
US11650587B2 (en) | 2020-10-09 | 2023-05-16 | Deere & Company | Predictive power map generation and control system |
US11635765B2 (en) | 2020-10-09 | 2023-04-25 | Deere & Company | Crop state map generation and control system |
US11825768B2 (en) | 2020-10-09 | 2023-11-28 | Deere & Company | Machine control using a predictive map |
US11592822B2 (en) | 2020-10-09 | 2023-02-28 | Deere & Company | Machine control using a predictive map |
US11845449B2 (en) | 2020-10-09 | 2023-12-19 | Deere & Company | Map generation and control system |
US11844311B2 (en) | 2020-10-09 | 2023-12-19 | Deere & Company | Machine control using a predictive map |
US11849671B2 (en) | 2020-10-09 | 2023-12-26 | Deere & Company | Crop state map generation and control system |
US11849672B2 (en) | 2020-10-09 | 2023-12-26 | Deere & Company | Machine control using a predictive map |
US11864483B2 (en) | 2020-10-09 | 2024-01-09 | Deere & Company | Predictive map generation and control system |
US12013245B2 (en) | 2020-10-09 | 2024-06-18 | Deere & Company | Predictive map generation and control system |
US11727680B2 (en) | 2020-10-09 | 2023-08-15 | Deere & Company | Predictive map generation based on seeding characteristics and control |
US11871697B2 (en) | 2020-10-09 | 2024-01-16 | Deere & Company | Crop moisture map generation and control system |
US11889787B2 (en) | 2020-10-09 | 2024-02-06 | Deere & Company | Predictive speed map generation and control system |
US11889788B2 (en) | 2020-10-09 | 2024-02-06 | Deere & Company | Predictive biomass map generation and control |
US11895948B2 (en) | 2020-10-09 | 2024-02-13 | Deere & Company | Predictive map generation and control based on soil properties |
US11927459B2 (en) | 2020-10-09 | 2024-03-12 | Deere & Company | Machine control using a predictive map |
US11946747B2 (en) | 2020-10-09 | 2024-04-02 | Deere & Company | Crop constituent map generation and control system |
US11474523B2 (en) | 2020-10-09 | 2022-10-18 | Deere & Company | Machine control using a predictive speed map |
US11983009B2 (en) | 2020-10-09 | 2024-05-14 | Deere & Company | Map generation and control system |
US20220110251A1 (en) | 2020-10-09 | 2022-04-14 | Deere & Company | Crop moisture map generation and control system |
US12013698B2 (en) | 2020-10-09 | 2024-06-18 | Deere & Company | Machine control using a predictive map |
US11870973B2 (en) | 2021-07-27 | 2024-01-09 | Deere & Company | Camera calibration tool |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2014093814A1 (fr) | Predictive load estimation through forward vision | |
US20160366821A1 (en) | Crop mat measurement through stereo imaging | |
US10806078B2 (en) | Control system for adjusting conditioning rollers of work vehicle | |
EP4002983B1 (fr) | System and method for determining residue coverage within a field following a harvesting operation | |
CA2923037C (fr) | Harvesting mechanism having a self-propelled harvesting apparatus | |
US20190351765A1 (en) | System and method for regulating the operating distance between work vehicles | |
US9521805B2 (en) | Harvester with predictive driving speed specification | |
US9807933B2 (en) | Sensor equipped agricultural harvester | |
US20140215984A1 (en) | Method for Setting the Work Parameters of a Harvester | |
AU2018102227A4 (en) | An agricultural system | |
EP3476199B1 (fr) | Régulateur de glissement pour convoyeurs latéraux d'une tête de coupe de machine de récolte | |
US20150264864A1 (en) | Mog sensing system for a residue spreader | |
US9788486B2 (en) | Grain header with swathing and chopping capability | |
JP2019028688A (ja) | Harvesting system for an autonomously traveling combine | |
US10588259B2 (en) | Location based chop to swath conversion for riparian buffer zone management | |
US11903342B2 (en) | Auto reel height | |
WO2021261343A1 (fr) | Harvester, harvester control system, harvester control method, harvester control program, and storage medium | |
US20230225246A1 (en) | Agricultural residue depositing apparatus and method | |
WO2022123889A1 (fr) | Work vehicle, object state detection system, object state detection method, object state detection program, and recording medium on which the object state detection program is recorded | |
EP3646700B1 (fr) | Système d'estimation de biomasse de moissonneuse agricole | |
CN113727597A (zh) | Agricultural machinery such as a harvester | |
JP7423441B2 (ja) | Harvester | |
WO2022239779A1 (fr) | Combine harvester and method therefor | |
JP7433145B2 (ja) | Harvester | |
WO2022124001A1 (fr) | Agricultural work machine, agricultural work machine control program, recording medium on which the agricultural work machine control program is recorded, and agricultural work machine control method | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | EP: the EPO has been informed by WIPO that EP was designated in this application |
Ref document number: 13862423 | Country of ref document: EP | Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | EP: PCT application non-entry in European phase |
Ref document number: 13862423 | Country of ref document: EP | Kind code of ref document: A1 |