WO2014093814A1 - Predictive load estimation through forward vision - Google Patents

Predictive load estimation through forward vision

Info

Publication number
WO2014093814A1
Authority
WO
WIPO (PCT)
Prior art keywords
agricultural machine
machine
crop material
header
controller
Application number
PCT/US2013/074999
Other languages
French (fr)
Inventor
Grant Good
Robert Matousek
Original Assignee
Agco Corporation
Application filed by Agco Corporation
Publication of WO2014093814A1

Classifications

    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01D HARVESTING; MOWING
    • A01D41/00 Combines, i.e. harvesters or mowers combined with threshing devices
    • A01D41/12 Details of combines
    • A01D41/127 Control or measuring arrangements specially adapted for combines
    • A01D41/1271 Control or measuring arrangements specially adapted for combines for measuring crop flow
    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01D HARVESTING; MOWING
    • A01D75/00 Accessories for harvesters or mowers
    • A01D75/18 Safety devices for parts of the machines
    • A01D75/182 Avoiding overload

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Environmental Sciences (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

A method comprising receiving a scan or plural images of crop material located in front of, and proximal to, a header coupled to a front portion of an agricultural machine; determining a crop material parameter based on the scan or the plural images; and adjusting a machine parameter of the agricultural machine based on the crop material parameter.

Description

PREDICTIVE LOAD ESTIMATION THROUGH FORWARD VISION
CROSS REFERENCE TO RELATED APPLICATION
[0001] This application claims priority to copending U.S. provisional application entitled,
"Predictive Load Estimation Through Forward Vision," having serial number 61/737,223, filed December 14, 2012, which is entirely incorporated herein by reference.
TECHNICAL FIELD
[0002] The present disclosure is generally related to agriculture technology, and, more particularly, computer-assisted farming.
BACKGROUND
[0003] Recent efforts have been made to automate or semi-automate farming
operations. Such efforts serve not only to reduce operating costs but also to improve working conditions for operators and to reduce operator error, enabling gains in operational efficiency and yield. For instance, combine harvesters may employ a form of cruise control, with the perceived benefits of preventing operator fatigue and maximizing machine efficiency.
BRIEF DESCRIPTION OF THE DRAWINGS
[0004] Many aspects of the disclosure can be better understood with reference to the following drawings. The components in the drawings are not necessarily to scale, emphasis instead being placed upon clearly illustrating the principles of the present disclosure. Moreover, in the drawings, like reference numerals designate corresponding parts throughout the several views.
[0005] FIG. 1 is a schematic diagram of an example combine harvester showing an embodiment of a detection system.
[0006] FIG. 2 is a schematic diagram showing an overhead perspective view of a feeder house and a header as viewed from the perspective of an imaging system mounted to the top of an operator cab of the combine harvester.
[0007] FIG. 3A is a block diagram showing an embodiment of a detection system.
[0008] FIG. 3B is a block diagram showing an embodiment of a controller for the
detection system.
[0009] FIG. 4 is a flow diagram that illustrates an example embodiment of a detection method.
DESCRIPTION OF EXAMPLE EMBODIMENTS
Overview
[0010] In one embodiment, a method comprising receiving a scan or plural images of crop material located in front of, and proximal to, a header coupled to a front portion of an agricultural machine; determining a crop material parameter based on the scan or the plural images; and adjusting a machine parameter of the agricultural machine based on the crop material parameter.
Detailed Description
[0011] Certain embodiments of detection systems and methods are disclosed that
enable an agricultural machine to function in a form of cruise control, enabling a prediction of crop material processing load and adjustments to suitably handle the crop material once it enters the machine. In one embodiment, the detection system detects the amount of crop material (e.g., crops, weeds, etc.) that is about to be gathered (e.g., harvested) by an agricultural machine, such as a combine harvester (hereinafter, a combine harvester is referred to as a combine), and adjusts one or more machine parameters to enable the agricultural machine to suitably react to the anticipated crop material load. For instance, the agricultural machine may have a header coupled to the front of the agricultural machine, and the detection system detects the amount of crop material in advance of the header engaging the crop material. The detection system comprises an imaging system and a controller. The imaging system may comprise plural cameras or a laser scanner that capture plural images or a scan, respectively. The imaging system is mounted on the agricultural machine in a manner that enables the crop material located in front of the header to be within detection range. The controller, which may be embodied as a programmable logic controller (PLC), microcontroller, processor(s), or computer (or other computing device), receives the plural images or scan and determines one or more crop material parameters. The crop material parameters may include crop height, crop density, crop moisture, or any combination of these or other parameters that impact the machine load. The detection system embodiments disclosed herein enable a determination of the one or more crop material parameters for an area of crop material located in front of the header to facilitate prediction of upcoming changes in machine load, and ultimately as an input to help optimize ground speed and/or other machine settings.
[0012] Digressing briefly, today's agricultural systems, though operating in a form of crude cruise control, nevertheless tend to react to problems that have already arisen. For instance, if there is a patch of tall, green weeds in the travel path of the machine, a conventional combine (e.g., without operator intervention) would continue at the same speed until the patch of weeds was drawn into the feeder house, likely causing a plug or drive overload to occur. Certain embodiments of detection systems, with their forward-looking features, recognize changes in the crop material parameter(s) to avoid or mitigate these and/or other problems.
[0013] Note that one or more embodiments of detection systems comprise a plurality of cameras that are located in a position that enables capture of the crop material in front of a header from different perspectives. The controller pairs these plural images to provide a stereoscopic image. In one embodiment, a point cloud comprising three-dimensional coordinates that provide an outline of the crop material is generated based on the stereoscopic image, and from the point cloud, a cross-sectional area of the crop material is determined. In some embodiments, the generation of a point cloud may be omitted.
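By way of illustration only, the following Python sketch outlines the stereo pipeline of this paragraph using OpenCV. The patent names no library, so OpenCV, the matcher settings, and the coordinate conventions (point cloud rotated into machine coordinates with z up) are assumptions: the sketch pairs two images into a disparity map, reprojects the disparities into a point cloud, and derives a rough cross-sectional area of the crop material.

```python
# A minimal sketch of the stereo pipeline, assuming OpenCV and a
# calibrated rig. Q is the reprojection matrix from stereo calibration;
# the cloud is assumed to be expressed with x lateral and z vertical.
import cv2
import numpy as np

def crop_cross_section(left_bgr, right_bgr, Q, ground_z=0.0):
    grayL = cv2.cvtColor(left_bgr, cv2.COLOR_BGR2GRAY)
    grayR = cv2.cvtColor(right_bgr, cv2.COLOR_BGR2GRAY)

    # Pair the two images: semi-global matching yields a disparity map.
    sgbm = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128,
                                 blockSize=9)
    disparity = sgbm.compute(grayL, grayR).astype(np.float32) / 16.0

    # Reproject disparities into three-dimensional coordinates -- the
    # point cloud outlining the crop material.
    points = cv2.reprojectImageTo3D(disparity, Q)
    cloud = points[disparity > 0].reshape(-1, 3)

    # Approximate the cross-sectional area presented to the header as
    # mean crop height above ground times the observed swath width.
    heights = cloud[:, 2] - ground_z
    heights = heights[heights > 0]
    if heights.size == 0:
        return 0.0
    width = float(cloud[:, 0].max() - cloud[:, 0].min())
    return float(heights.mean()) * width
```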
[0014] Having summarized certain features of detection systems of the present disclosure, reference will now be made in detail to the description of the disclosure as illustrated in the drawings. While the disclosure will be described in connection with these drawings, there is no intent to limit it to the embodiment or embodiments disclosed herein. For instance, in the description that follows, one focus is on an agricultural machine embodied as a combine, though it should be appreciated that other self-propelled or towed agricultural machines that process crop material are contemplated to be within the scope of the disclosure. As another example, certain embodiments of detection systems are disclosed herein for illustration with a focus on stereoscopic imaging (e.g., plural cameras that capture an image from slightly different locations or perspectives to enable generation of a stereo image from the resulting image pairs and three-dimensional coordinates). However, some embodiments may use other types of imaging systems, such as laser radar topography, among other types of imaging systems using other portions of the electromagnetic spectrum. Note that, although the emphasis herein is on the imaging of crop material located ahead of a header, in some embodiments, the imaging system may further enable determination of one or more of the crop material parameters based on detection between the header and a front of the agricultural machine (e.g., the feeder house). Further, although the description identifies or describes specifics of one or more embodiments, such specifics are not necessarily part of every embodiment, nor are all various stated advantages necessarily associated with a single embodiment or all embodiments. On the contrary, the intent is to cover all alternatives, modifications and equivalents included within the spirit and scope of the disclosure as defined by the appended claims. Further, it should be appreciated in the context of the present disclosure that the claims are not necessarily limited to the particular embodiments set out in the description.
[0015] Note that references hereinafter made to certain directions, such as, for example,
"front", "rear", "left" and "right", are made as viewed from the rear of the combine looking forwardly.
[0016] Referring now to FIG. 1, shown is an example agricultural machine embodied as a combine 10 in which an embodiment of a detection system may be implemented. It should be understood by one having ordinary skill in the art, in the context of the present disclosure, that the example combine 10 shown in FIG. 1 is merely illustrative, and that other combine configurations may be implemented in some embodiments. For instance, though shown as a single, transverse-rotor design, some embodiments of detection systems may be used in combines having an axial flow, hybrid, or dual rotor configuration, among other combine configurations, and hence are contemplated to be within the scope of the disclosure. The example combine 10 is shown in FIG. 1 without a header and, from front to back, comprises a feeder house 12 and an operator cab 14, followed by a processing compartment that includes a processing apparatus 16.
[0017] In operation, the combine 10 includes a harvesting header (shown in FIG. 2, as described later) at the front of the machine that cuts crop materials and delivers the cut crop materials to the front end of the feeder house 12. Such crop materials are moved upwardly and rearwardly within and beyond the feeder house 12 by a conveyor 18 until reaching a thresher rotor 20 of the processing apparatus 16. The thresher rotor 20 comprises a single, transverse rotor, such as that found in a Gleaner® Super Series Combine by AGCO. The thresher rotor 20 processes the crop materials in a known manner and passes a portion of the crop material (e.g., heavier chaff, corn stalks, etc.) toward the rear of the combine 10 and another portion (e.g., grain and possibly light chaff) through a cleaning process, as described below. In some embodiments, such as in axial flow designs, the conveyor 18 may convey the cut crop material to a beater before reaching a rotor or rotors.
[0018] In the processing apparatus 16, the crop materials undergo threshing and separating operations. In other words, the crop materials are threshed and separated by the thresher rotor 20 operating in cooperation with certain elements of a cage 22, for instance, well-known foraminous processing members in the form of threshing concave assemblies and separator grate assemblies, with the grain (and possibly light chaff) escaping through the concave assemblies and the grate assemblies and onto one or more distribution augers 24 located beneath the processing apparatus 16. Bulkier stalk and leaf materials are generally retained by the concave assemblies and the grate assemblies and are dispersed out from the processing apparatus 16 and ultimately out of the rear of the combine 10. The distribution augers 24 uniformly spread the crop material that falls upon them, with the spread crop material conveyed to accelerator rolls 26. The accelerator rolls 26 speed the descent of the crop material toward a cleaning assembly 28. The cleaning assembly 28 includes a transverse fan 30 (or equivalently, a blower), which facilitates the cleaning of the heavier crop material directly beneath the accelerator rolls 26 while causing the chaff to be carried out of the rear of the combine 10. The cleaning assembly 28 also includes plural stacked sieves 32, through which the fan 30 provides an additional push or influence of the chaff flow to the rear of the combine 10. The cleaned grain that drops to the bottom of the cleaning assembly 28 is delivered by an auger 34 that transports the grain to a well-known elevator mechanism (not shown), which conveys the grain to a grain bin 36 located at the top of the combine 10. Any remaining chaff and partially threshed grain is recirculated through the processing apparatus 16 via a tailings return auger 38. As combine processing is known to those having ordinary skill in the art, further discussion of the same is omitted here for brevity.
[0019] The example combine 10 also comprises a detection system 40, which in one embodiment comprises an imaging system 42 mounted on the combine 10, and a controller 44 (shown schematically). Though depicted in the operator cab 14, the controller 44 may be located elsewhere on the combine 10 in some embodiments. In the embodiment depicted in FIG. 1, the detection system 40 comprises the imaging system 42, which comprises two cameras 46A and 46B (collectively, cameras 46, schematically depicted in FIG. 1) mounted to the top of the operator cab 14, and a controller 44. It should be appreciated that the quantity and/or the location of the cameras 46 may vary in some embodiments. The imaging system 42 is generally mounted in a location, and with a viewpoint, that enables the capture of images (or scans in some embodiments, wherein a scanner may replace one of the cameras (the other camera omitted), and/or may be located in any of a plurality of places on the combine 10) of crop material located ahead of the header that is coupled to the front of the combine 10. The cameras 46A, 46B are configured to operate in the visible light spectrum, and are depicted in this example as offset symmetrically across a longitudinal centerline of the combine 10, although it is not necessary for the cameras 46A and 46B to be symmetrically offset or offset with respect to the centerline. The cameras 46A, 46B are positioned to capture images of the crop material (e.g., uncut crops or weeds, or in some embodiments, cut crops) located proximal to, and in front of, the header of the combine 10. The captured image may reveal one or more crop material parameters, such as a height of the crops along all or a portion of a width of the header, the density, and/or moisture content (e.g., via the color).
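The patent gives no mounting dimensions, but a minimal back-of-envelope sketch, with wholly illustrative numbers, shows how a cab-top camera's downward tilt determines how far ahead of the header the crop material remains visible:

```python
# Illustrative mounting geometry only; the camera height, tilt, and
# header offset are assumptions, not values from the patent.
import math

def look_ahead_m(cam_height_m=3.5, tilt_deg=25.0, header_offset_m=4.5):
    # Distance from the camera to where its optical axis meets the
    # ground, minus the forward offset of the header cutting edge.
    ground_hit = cam_height_m / math.tan(math.radians(tilt_deg))
    return ground_hit - header_offset_m

print(f"{look_ahead_m():.2f} m")  # roughly 3.0 m of crop beyond the header
```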
[0020] The pair of images captured by the cameras 46A, 46B is used to produce stereo images and, in some embodiments, a point cloud (or otherwise, three-dimensional coordinates), as described below. Although described in the context of cameras operating in the visible spectrum, some embodiments of the imaging system 42 may operate in the non-visible spectrum, such as the infrared, ultraviolet, or ultrasonic, among other ranges. In some embodiments, the imaging system 42 may be embodied as a laser radar topography system (referred to herein as a scanner or laser scanner).
[0021] FIG. 2 shows a view from a location proximal to the top of the operator cab 14, which may be the view seen by one of the cameras, such as camera 46A (FIG. 1), of the imaging system 42 (FIG. 1). As shown, the feeder house 12 has secured to it a header 48, shown partially in FIG. 2, which may be removed and replaced with other types of headers depending on the application. Although shown as a draper-style header, certain embodiments of a detection system 40 (FIG. 1) may include other types of headers, such as pickup headers, corn headers, etc. In one embodiment, the header 48 comprises a cutting portion 50 for cutting the crops and a transition portion 52 that conveys (e.g., using a conveyor, such as a belt or belts, a chain and slat configuration, etc.) the cut crops toward a rear, center portion 54 of the header 48, as is known. The center portion 54 may comprise a feeder auger (not shown) to advance the harvested crop material into the feeder house 12, where the conveyor 18 (FIG. 1) conveys the crop material toward the processing apparatus 16.
[0022] In operation, the imaging system 42 (FIG. 1) captures plural images (or a scan) of the crop material located in front of, and proximal to, the header 48. As indicated above, the detection system 40 (FIG. 1) enables a predictive determination of the load of the crop material that is to move through the combine 10 to prevent plugging or overloading of components in the combine 10, such as components in the feeder house 12. In one embodiment, the plural images of a given area comprising crop material (e.g., uncut crop material located in front of the header 48) are captured by the cameras 46A and 46B (FIG. 1) (or scans are captured by a scanner), and the images or scan(s) may be communicated (e.g., over a wired connection or network, such as via a controller area network (CAN), or wirelessly) to the controller 44 (FIG. 1). The communication of images or scans may be implemented continuously, regularly (e.g., periodically, every defined quantity of feet of travel of the combine 10, every fixed time interval, etc.), or irregularly or aperiodically (e.g., responsive to a given event, such as operator intervention locally or remotely, or at random intervals).
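The trigger policies named above (continuous, periodic, per distance travelled, or event-driven) could be combined as in the following hedged sketch; the class shape, method names, and interval values are illustrative assumptions, not from the patent.

```python
# Sketch of a capture trigger combining the policies named in [0022].
import time

class CaptureScheduler:
    def __init__(self, interval_s=1.0, interval_m=3.0):
        self.interval_s = interval_s      # fixed time interval (assumed)
        self.interval_m = interval_m      # fixed travel distance, m (assumed)
        self._last_t = 0.0
        self._last_odo = 0.0

    def due(self, odometer_m, event=False):
        now = time.monotonic()
        if event:                                   # operator or other event
            return True
        if now - self._last_t >= self.interval_s:   # periodic trigger
            self._last_t = now
            return True
        if odometer_m - self._last_odo >= self.interval_m:  # per distance
            self._last_odo = odometer_m
            return True
        return False
```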
[0023] The captured images or scan(s) are received at the controller 44 (FIG. 1), which pairs the images and in one embodiment determines a point cloud to enable a determination of one or more crop material parameters, such as crop height, crop density, and/or moisture content. Such a determination may be a relative and/or absolute determination. For instance, for a relative determination, the detection system 40 (FIG. 1) may determine that the imaged crop material meets one or more defined conditions, and enables the combine 10 (FIG. 1) to continue operating according to its current state of operations until the one or more defined conditions vary by some threshold amount of change. In embodiments where absolute determinations are implemented, the detection system 40 may perform a calibration procedure (e.g., in the assembly plant or field, where determinations of crop material parameters are based on known object parameters imaged during the calibration procedure) or target certain features of known parameters (e.g., known header width, height, etc.) and associate the same with a benchmark to enable the crop material determinations. As another example of crop material determinations, the controller 44 may determine the moisture content from the color of the imaged or scanned crop material. Based on the image or scan, the controller 44 determines an optimum or desired machine parameter, such as ground speed of the combine 10, direction of the combine 10, or other combine settings pertaining to combine operations (e.g., concave clearance, cleaning fan speed, etc.). Other machine parameters that the controller 44 determines for optimal or desired operation include header placement (e.g., height, pitch, and/or yaw), and/or other header settings or operations, such as speed of the cutting portion 50 or feeder auger. The controller 44 adjusts a parameter corresponding to one or more of these machine parameters, and provides a corresponding control signal to an actuator (e.g., hydraulic valve, solenoid, or other known actuating devices) to cause the adjustment in the machine parameter to take effect. For instance, the control signal from the controller 44 may be provided to a hydraulic control valve that adjusts the flow of fluid to one or more cylinders associated with a steering mechanism or header height adjustment for the combine 10, causing an adjustment in the direction of the combine 10 or the height of the header 48. The speed or other machine parameters may be adjusted, as should be appreciated by one having ordinary skill in the art. In one embodiment, adjustments are made to avoid an excessive amount of crop material entering the center portion 54 and clogging or plugging the mechanisms of the feeder house 12. These adjustments are made ahead of the problem, which is hence avoided or mitigated.
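As a hedged illustration of the relative determination and ground-speed adjustment described above, the sketch below holds the current state until the load estimate changes by a threshold, then scales speed inversely with the anticipated load so throughput into the feeder house stays roughly constant; the threshold, the inverse model, and the clamp limits are assumptions.

```python
# Sketch: relative threshold logic plus a simple inverse speed model.
def update_ground_speed(current_speed, prev_load, new_load,
                        threshold=0.15, nominal_load=1.0):
    # Relative determination: keep the current state until the defined
    # condition varies by more than the threshold amount of change.
    if prev_load and abs(new_load - prev_load) / prev_load < threshold:
        return current_speed

    # Heavier anticipated load -> slower travel (illustrative model).
    target = current_speed * (prev_load or nominal_load) / max(new_load, 1e-6)
    return max(0.5, min(target, 12.0))   # clamp to an assumed km/h range
```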
[0024] It should be appreciated within the context of the present disclosure that the manner of actuating the devices may vary depending on the application, where the control signal may be delivered to one or more devices upstream of the device directly responsible for the physical adjustment in the machine parameter, or directly to the actuating device responsible for effecting the adjustment in the setting.
[0025] In some embodiments, the imaging system 42 (FIG. 1) may be used to prompt additional actions and/or other actions (e.g., not directly involving the crop material parameter for crop material located ahead of the header 48). In one embodiment, the imaging system 42 may detect an obstacle located within the detection range ahead of the header 48. For instance, one or more of the cameras 46A or 46B (FIG. 1) may capture an image of an obstacle, and the controller 44 (FIG. 1) may receive that image (or a stereo image) and determine (e.g., through well-known vision and/or feature detection algorithms) the presence of the obstacle and adjust one or more machine parameters to avoid the obstacle (e.g., such as lifting the header 48, stopping the combine 10, steering evasively, etc.). In some embodiments, the controller 44 may alert an operator in the operator cab 14 and/or other personnel located elsewhere (e.g., remotely) of the obstacle, or even provide an alert of the adjustments before (e.g., seeking approval, or making a recommendation), during, or after they are implemented. Such alerts may be in the form of an audible, visible, and/or tactile alert on or associated with user interface equipment (e.g., displays, headsets, joysticks, etc.) in the operator cab 14 (or remotely). In some embodiments, the controller 44 may generate crop density maps based on the imaged or scanned areas of the traversed field, and record the same in a storage device (e.g., memory stick, memory, etc.) and/or communicate the same remotely.
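A minimal sketch of the obstacle branch follows; the detector output, the 0.3 m height rule, and the interface names are hypothetical placeholders rather than anything prescribed by the patent.

```python
# Sketch: on detection, pick an evasive adjustment and alert the operator.
def handle_obstacle(obstacle, machine_controls, user_interface):
    if obstacle.height_m < 0.3:           # assumed clearance rule
        machine_controls.lift_header()    # clear a low obstacle
        action = "header raised"
    else:
        machine_controls.stop()           # stop for anything larger
        action = "machine stopped"
    # Audible/visible alert in the cab (or relayed remotely).
    user_interface.alert(f"Obstacle ahead: {action}")
```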
[0026] As indicated above, in some embodiments, the controller 44 may cause the adjustment of one or more machine parameters autonomously, such as in an automated or semi-automated agricultural system. In some embodiments, the controller 44 may cause the adjustment of one or more machine parameters with some operator (e.g., located in the operator cab 14 or in a remote facility for remote-controlled operations) involvement, such as to provide an alert or notification of the adjustment or an impending adjustment (e.g., allowing the operator to allow, disallow, or override the adjustment). In some embodiments, the controller 44 may merely cause a visual or audible (e.g., verbal or via a sound, such as a buzzer) notification that involves a recommendation (e.g., shown on a graphical user interface on a console in the operator cab 14) as to the appropriate adjustment that the operator should make in view of an assessment by the controller 44 of the image(s) or scan.
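The three levels of operator involvement described above might be dispatched as in this sketch; the mode names, timeout, and user-interface methods are illustrative assumptions.

```python
# Sketch of the autonomous / confirm / recommend-only involvement levels.
from enum import Enum

class Mode(Enum):
    AUTONOMOUS = 1    # apply the adjustment immediately
    CONFIRM = 2       # apply unless the operator overrides
    RECOMMEND = 3     # notify only; the operator acts manually

def dispatch(adjustment, mode, machine_controls, ui):
    if mode is Mode.AUTONOMOUS:
        machine_controls.apply(adjustment)
    elif mode is Mode.CONFIRM:
        ui.notify(f"Applying {adjustment}; press override to cancel")
        if not ui.override_requested(timeout_s=3.0):   # assumed API
            machine_controls.apply(adjustment)
    else:
        ui.recommend(f"Suggested adjustment: {adjustment}")
```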
[0027] The controller 44 (FIG. 1) may include a computer or microcontroller or other computing device embodied in a single package (e.g., enclosure) or with similar functionality distributed among several components. The controller 44, as explained below, may receive the plural images and pair the images to provide a stereoscopic image. As is known, the stereoscopic image may be decomposed into, or otherwise represented by, a point cloud, which the controller 44 uses to determine one or more of the crop material parameters. In some embodiments, some or all of the functionality of the controller 44 may be implemented in the cameras 46A and 46B (FIG. 1) or the scanner. In some embodiments, the plural images (or scan) are communicated by one or more of the cameras 46A and 46B to the controller 44, which then generates the point cloud and determines the crop material parameter(s). The controller 44 may then determine (e.g., approximate) the crop density, and display the same on a computer monitor or other display device (or in some embodiments, store the same to a storage device, such as memory or, generally, a computer-readable medium) located proximally to, or remotely from, the detection system 40. The crop density map may be based in part on positioning information input to the controller 44, such as coordinates provided by a global positioning system (GPS) or other positioning systems or mechanisms.
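A crop density map keyed to GPS coordinates, as described above, could be accumulated as in the following sketch; the grid resolution and class shape are assumptions for illustration.

```python
# Sketch: accumulate per-image density estimates into a GPS-keyed grid.
from collections import defaultdict

class DensityMap:
    def __init__(self, cell_deg=0.0001):          # roughly 10 m cells
        self.cell_deg = cell_deg
        self.cells = defaultdict(list)

    def _key(self, lat, lon):
        return (round(lat / self.cell_deg), round(lon / self.cell_deg))

    def record(self, lat, lon, density):
        self.cells[self._key(lat, lon)].append(density)

    def mean_density(self, lat, lon):
        samples = self.cells.get(self._key(lat, lon), [])
        return sum(samples) / len(samples) if samples else None
```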
[0028] In some embodiments, the images (or scan(s)), the stereoscopic images, the point cloud, and/or the controller determinations may be communicated to a remote processing system (e.g., computer) located remotely from the combine 10 (FIG. 1), such as in a farm management office, a farmer's home, or elsewhere. For instance, in some embodiments, some or all of the functionality of the controller 44 may be performed (e.g., in addition to or in lieu of local computations) in a remote processing system based on communications of the plural images or the point cloud. Such communication may be performed over a wireless network and/or a combination of wired (e.g., landline phone or cable system) and wireless (e.g., from a transceiver in the combine 10) networks. For instance, the combine 10 may be operated and/or at least controlled in part from a remote location, based on the communicated feedback from the detection system 40 (FIG. 1).
[0029] Attention is now directed to FIG. 3A, which illustrates an embodiment of a detection system 40. It should be appreciated within the context of the present disclosure that some embodiments may include additional components or fewer or different components, and that the example depicted in FIG. 3A is merely illustrative of one embodiment among others. The detection system 40 comprises the controller 44 coupled in a CAN network 56 (though not limited to a CAN network or a single network) to the imaging system 42, machine controls 58, and a user interface 60. The imaging system 42 has been described already, and may include visible and non-visible spectrum devices, such as cameras, laser radar technology, etc. The machine controls 58 collectively comprise the various actuators, sensors, and/or controlled devices residing on the combine 10 (FIG. 1), including those used to control machine navigation (e.g., speed, direction, etc.), internal machinery operations (e.g., for processing system adjustments, cleaning system adjustments, etc.), and header position and/or control, among others. The user interface 60 may be a keyboard, mouse, microphone, touch-type display device, or other devices (e.g., switches) that enable input by an operator (e.g., such as while in the operator cab 14 (FIG. 1)).
[0030] The controller 44 receives and processes the information from the imaging system 42 and delivers control signals to the machine controls 58 (e.g., directly, or indirectly through an intermediary device in some embodiments). In some embodiments, the controller 44 may receive input from the machine controls 58 (e.g., such as to enable feedback as to the position or status of certain devices, such as header height, speed of the combine 10, etc.), and/or receive input from other devices, such as global positioning devices, transceivers, etc. The controller 44 may also receive input from the user interface 60, such as during the process of adjustment, to provide feedback of a change in machine parameters, an impending change, or a need or recommendation for change.
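As a hedged example of delivering a control signal over the CAN network 56, the sketch below uses the python-can library; the arbitration ID, payload layout, and channel name are wholly illustrative assumptions, and a production machine would follow its own CAN database (e.g., an ISO 11783 profile).

```python
# Sketch: send a ground-speed command onto the CAN bus with python-can.
import can

def send_speed_command(speed_kph, channel="can0"):
    bus = can.Bus(channel=channel, interface="socketcan")
    try:
        raw = int(speed_kph * 100)                    # assumed 0.01 km/h units
        msg = can.Message(arbitration_id=0x18FF1001,  # hypothetical ID
                          data=list(raw.to_bytes(2, "little")),
                          is_extended_id=True)
        bus.send(msg)
    finally:
        bus.shutdown()
```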
[0031] FIG. 3B further illustrates an example embodiment of the controller 44. One having ordinary skill in the art should appreciate, in the context of the present disclosure, that the example controller 44 is merely illustrative, and that some embodiments of controllers may comprise fewer or additional components, and/or some of the functionality associated with the various components depicted in FIG. 3B may be combined, or further distributed among additional modules, in some embodiments. The controller 44 is depicted in this example as a computer system, but may be embodied as a programmable logic controller (PLC), an FPGA, among other devices. It should be appreciated that certain well-known components of computer systems are omitted here to avoid obfuscating relevant features of the controller 44. In one embodiment, the controller 44 comprises one or more processing units, such as processing unit 62, input/output (I/O) interface(s) 64, and memory 66, all coupled to one or more data busses, such as data bus 68. The memory 66 may include any one or a combination of volatile memory elements (e.g., random-access memory (RAM), such as DRAM, SRAM, etc.) and nonvolatile memory elements (e.g., ROM, hard drive, tape, CDROM, etc.). The memory 66 may store a native operating system, one or more native applications, emulation systems, or emulated applications for any of a variety of operating systems and/or emulated hardware platforms, emulated operating systems, etc. In the embodiment depicted in FIG. 3B, the memory 66 comprises an operating system 70 and crop material parameter determination software 72 that, in one embodiment, comprises stereo/point cloud software 74 and obstacle detection software 76. It should be appreciated that, in some embodiments, additional or fewer software modules (e.g., with combined functionality) may be employed in the memory 66 or in additional memory. In some embodiments, a separate storage device may be coupled to the data bus 68, such as a persistent memory (e.g., optical, magnetic, and/or semiconductor memory and associated drives).
[0032] The crop material parameter determination software 72 determines crop material parameters such as the height, density, and/or moisture content from the plural images or scan received from the imaging system 42 (FIG. 3A). The stereo/point cloud software 74 enables the pairing of the plural images and the generation of three-dimensional coordinates from the paired images or scan, enabling the determination of the one or more crop material parameters. The obstacle detection software 76 enables the detection of obstacles according to well-known vision and/or feature recognition techniques.
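As a loose illustration of the kind of computation the crop material parameter determination software 72 might perform, the sketch below bins an (N, 3) point cloud into ground-plane cells and scores each cell by the fraction of points standing above an assumed flat ground; the thresholds, grid size, and flat-ground assumption are all invented for this example:

```python
import numpy as np

def crop_density_grid(points, ground_z=0.0, min_height=0.2, cell=0.25):
    """Estimate per-cell crop density from an (N, 3) point cloud.

    A cell's density is the fraction of its points lying at least
    min_height above an assumed flat ground plane at ground_z.
    All parameters are illustrative assumptions.
    """
    ix = np.floor(points[:, 0] / cell).astype(int)
    iy = np.floor(points[:, 1] / cell).astype(int)
    above = points[:, 2] > ground_z + min_height
    counts = {}
    for cx, cy, hit in zip(ix, iy, above):
        total, hits = counts.get((cx, cy), (0, 0))
        counts[(cx, cy)] = (total + 1, hits + int(hit))
    return {cell_id: hits / total for cell_id, (total, hits) in counts.items()}
```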
[0033] Execution of the software modules 70-76 is implemented by the processing unit 62 under the management and/or control of the operating system 70. In some embodiments, the operating system 70 may be omitted and a more rudimentary manner of control implemented. The processing unit 62 may be embodied as a custom-made or commercially available processor, a central processing unit (CPU) or an auxiliary processor among several processors, a semiconductor-based microprocessor (in the form of a microchip), a macroprocessor, one or more application specific integrated circuits (ASICs), a plurality of suitably configured digital logic gates, and/or other well-known electrical configurations comprising discrete elements, both individually and in various combinations, to coordinate the overall operation of the controller 44.
[0034] The I/O interfaces 64 provide one or more interfaces to the network 56 (FIG. 3A) and other networks. In other words, the I/O interfaces 64 may comprise any number of interfaces for the input and output of signals (e.g., analog or digital data) for conveyance over the network 56. The input may comprise input by an operator (local or remote) through the user interface 60 (e.g., a keyboard, mouse, or other input device (or audible input in some embodiments)), and input from signals carrying information from one or more of the components of the detection system 40 (FIG. 3A), such as the machine controls 58 (FIG. 3A), among other devices.
[0035] When certain embodiments of the controller 44 are implemented at least in part as software (including firmware), as depicted in FIG. 3B, it should be noted that the software can be stored on a variety of non-transitory computer-readable mediums for use by, or in connection with, a variety of computer-related systems or methods. In the context of this document, a computer-readable medium may comprise an electronic, magnetic, optical, or other physical device or apparatus that may contain or store a computer program (e.g., executable code or instructions) for use by or in connection with a computer-related system or method. The software may be embedded in a variety of computer-readable mediums for use by, or in connection with, an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions.
[0036] When certain embodiments of the controller 44 are implemented at least in part as hardware, such functionality may be implemented with any or a combination of the following technologies, which are all well known in the art: discrete logic circuit(s) having logic gates for implementing logic functions upon data signals, an application specific integrated circuit (ASIC) having appropriate combinational logic gates, programmable gate array(s) (PGA), a field programmable gate array (FPGA), etc.
[0037] Having described certain embodiments of a detection system 40, it should be appreciated, within the context of the present disclosure, that one embodiment of a detection method, denoted as method 78 and illustrated in FIG. 4, comprises receiving a scan or plural images of crop material located in front of, and proximal to, a header coupled to a front portion of an agricultural machine (80); determining a crop material parameter based on the scan or the plural images (82); and adjusting a machine parameter of the agricultural machine based on the crop material parameter (84).
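Rendered as skeletal Python, method 78 might be organized as follows; every function body here is a hypothetical stand-in for the corresponding block in FIG. 4, not an implementation from the disclosure:

```python
# Blocks 80, 82, and 84 of FIG. 4 as placeholder functions.

def receive_images():             # block 80: scan or plural images of the crop ahead
    return ["left.png", "right.png"]

def determine_parameter(images):  # block 82: e.g., height, density, moisture content
    return {"density": 0.8}

def adjust_machine(params):       # block 84: e.g., slow down for a heavy load
    if params["density"] > 0.7:
        print("reducing ground speed ahead of heavy crop")

adjust_machine(determine_parameter(receive_images()))
```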
[0038] Any process descriptions or blocks in flow diagrams should be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps in the process, and alternate implementations are included within the scope of the embodiments in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present disclosure.
[0039] It should be emphasized that the above-described embodiments of the present disclosure, particularly, any "preferred" embodiments, are merely possible examples of implementations, merely set forth for a clear understanding of the principles of the disclosure. Many variations and modifications may be made to the above-described embodiment(s) of the disclosure without departing substantially from the spirit and principles of the disclosure. All such modifications and variations are intended to be included herein within the scope of this disclosure and protected by the following claims.

Claims

At least the following is claimed:
1. A method, comprising:
receiving a scan or plural images of crop material located in front of, and proximal to, a header coupled to a front portion of an agricultural machine;
determining a crop material parameter based on the scan or the plural images; and
adjusting a machine parameter of the agricultural machine based on the crop material parameter.
2. The method of claim 1, wherein receiving comprises receiving the scan of the crop material from a laser scanner mounted on the agricultural machine.
3. The method of claim 1, wherein receiving comprises receiving the plural images of the crop material from plural cameras mounted on the agricultural machine.
4. The method of claim 1, wherein determining the crop material parameter or crop material parameters comprises determining one or more of a crop height, a crop density, or a crop moisture content.
5. The method of claim 1, wherein adjusting the machine parameter comprises adjusting a setting corresponding to a speed or direction of the agricultural machine.
6. The method of claim 1, wherein adjusting the machine parameter comprises adjusting one or more header settings.
7. The method of claim 1, further comprising generating a crop density map based on the scan or the plural images.
8. The method of claim 1, further comprising detecting an obstacle based on the scan or plural images and adjusting the machine parameter based on the detection.
9. The method of claim 1, further comprising detecting an obstacle based on the scan or plural images and providing a notification of the presence of the obstacle based on the detection.
10. An agricultural machine, comprising:
a header coupled to the front of the agricultural machine; and
a detection system comprising:
a controller; and
a plurality of cameras mounted to the agricultural machine, wherein the controller is configured to receive, from the plurality of cameras, plural images of an area located in front of, and proximal to, the header, and adjust a machine parameter of the agricultural machine based on the plural images.
11. The agricultural machine of claim 10, wherein the plurality of cameras are configured to capture each of the plural images of the area from different perspectives.
12. The agricultural machine of claim 10, wherein the plural images of the area include an image of crop material.
13. The agricultural machine of claim 12, wherein the controller is configured to determine one or more of a density, a height, or moisture content of the crop material based on the plural images.
14. The agricultural machine of claim 12, wherein the controller is configured to cause generation of a crop density map based on the plural images.
15. The agricultural machine of claim 10, wherein the machine parameter or machine parameters correspond to navigational operations of the agricultural machine, internal processing operations of the agricultural machine, operations of the header, or a combination of one or more of the navigational operations, the internal processing operations, or the header operations.
16. The agricultural machine of claim 10, wherein the plural images of the area include an image of an obstacle, and wherein the machine parameter or machine parameters comprise one or more of a speed or direction of the agricultural machine.
17. The agricultural machine of claim 10, wherein the plural images of the area include an image of an obstacle, and wherein the controller is further configured to provide a notification of the presence of the obstacle.
18. An agricultural machine, comprising:
a header coupled to the front of the agricultural machine; and
a detection system comprising:
a controller; and
a scanner mounted to the agricultural machine, wherein the controller is configured to receive, from the scanner, a scan of an area located in front of, and proximal to, the header and cause adjustment of a machine parameter of the agricultural machine based on the scan.
19. The agricultural machine of claim 18, wherein the controller is configured to determine one or more of a density, a height, or moisture content of crops located within the area based on the scan.
20. The agricultural machine of claim 18, wherein the machine parameter or machine parameters comprise a speed or direction of the agricultural machine, internal operations of the agricultural machine, operations of the header, or a combination of one or more of the speed, the direction, the internal operations, or the header operations.
PCT/US2013/074999 2012-12-14 2013-12-13 Predictive load estimation through forward vision WO2014093814A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201261737223P 2012-12-14 2012-12-14
US61/737,223 2012-12-14

Publications (1)

Publication Number Publication Date
WO2014093814A1 true WO2014093814A1 (en) 2014-06-19

Family

ID=50934988

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2013/074999 WO2014093814A1 (en) 2012-12-14 2013-12-13 Predictive load estimation through forward vision

Country Status (1)

Country Link
WO (1) WO2014093814A1 (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030004630A1 (en) * 2001-06-28 2003-01-02 Deere & Company System for measuring the amount of crop to be harvested
US20050279070A1 (en) * 2004-06-21 2005-12-22 Peter Pirro Self-propelled harvesting machine
US20100063680A1 (en) * 2008-09-11 2010-03-11 Jonathan Louis Tolstedt Leader-follower semi-autonomous vehicle with operator on side

Cited By (50)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9807926B2 (en) 2015-08-17 2017-11-07 Claas Selbstfahrende Erntemaschinen Gmbh Agricultural harvesting machine
DE102015113527A1 (en) 2015-08-17 2017-02-23 Claas Selbstfahrende Erntemaschinen Gmbh Agricultural harvester
EP3132711A1 (en) 2015-08-17 2017-02-22 CLAAS Selbstfahrende Erntemaschinen GmbH Agricultural harvester
US10448569B2 (en) 2015-12-18 2019-10-22 Claas Selbstfahrende Erntemaschinen Gmbh Method and apparatus for operating a combine harvester
DE102015122269A1 (en) * 2015-12-18 2017-06-22 Claas Selbstfahrende Erntemaschinen Gmbh Method for operating a combine harvester
EP3300580A1 (en) 2016-09-30 2018-04-04 CLAAS Selbstfahrende Erntemaschinen GmbH Combine harvester with a cutting unit and control of the cutting unit
DE102016118637A1 (en) 2016-09-30 2018-04-05 Claas Selbstfahrende Erntemaschinen Gmbh Combine harvester with a cutting unit and control of a cutting unit
RU2784488C2 (en) * 2018-01-16 2022-11-28 Макдон Индастриз Лтд. Device for harvesting agricultural crop (options)
US11812694B2 (en) 2018-01-29 2023-11-14 Deere & Company Monitor system for a harvester
US11744180B2 (en) 2018-01-29 2023-09-05 Deere & Company Harvester crop mapping
US11178818B2 (en) 2018-10-26 2021-11-23 Deere & Company Harvesting machine control system with fill level processing based on yield data
US11672203B2 (en) 2018-10-26 2023-06-13 Deere & Company Predictive map generation and control
US11240961B2 (en) 2018-10-26 2022-02-08 Deere & Company Controlling a harvesting machine based on a geo-spatial representation indicating where the harvesting machine is likely to reach capacity
US11653588B2 (en) 2018-10-26 2023-05-23 Deere & Company Yield map generation and control system
US11589509B2 (en) 2018-10-26 2023-02-28 Deere & Company Predictive machine characteristic map generation and control system
US11650553B2 (en) 2019-04-10 2023-05-16 Deere & Company Machine control using real-time model
US11079725B2 (en) 2019-04-10 2021-08-03 Deere & Company Machine control using real-time model
US11829112B2 (en) 2019-04-10 2023-11-28 Deere & Company Machine control using real-time model
US11778945B2 (en) 2019-04-10 2023-10-10 Deere & Company Machine control using real-time model
US11467605B2 (en) 2019-04-10 2022-10-11 Deere & Company Zonal machine control
US11234366B2 (en) 2019-04-10 2022-02-01 Deere & Company Image selection for machine control
WO2021116802A1 (en) * 2019-12-09 2021-06-17 Precision Planting Llc Methods and imaging systems for harvesting
US11632905B2 (en) 2019-12-09 2023-04-25 Precision Planting Llc Methods and imaging systems for harvesting
US11758846B2 (en) 2019-12-23 2023-09-19 Cnh Industrial America Llc Header control system to adjust a header of a harvester based on sensor information
US11957072B2 (en) 2020-02-06 2024-04-16 Deere & Company Pre-emergence weed detection and mitigation system
US11641800B2 (en) 2020-02-06 2023-05-09 Deere & Company Agricultural harvesting machine with pre-emergence weed detection and mitigation system
US11477940B2 (en) 2020-03-26 2022-10-25 Deere & Company Mobile work machine control based on zone parameter modification
US20220110251A1 (en) 2020-10-09 2022-04-14 Deere & Company Crop moisture map generation and control system
US11849671B2 (en) 2020-10-09 2023-12-26 Deere & Company Crop state map generation and control system
US11711995B2 (en) 2020-10-09 2023-08-01 Deere & Company Machine control using a predictive map
US11675354B2 (en) 2020-10-09 2023-06-13 Deere & Company Machine control using a predictive map
US11650587B2 (en) 2020-10-09 2023-05-16 Deere & Company Predictive power map generation and control system
US11635765B2 (en) 2020-10-09 2023-04-25 Deere & Company Crop state map generation and control system
US11592822B2 (en) 2020-10-09 2023-02-28 Deere & Company Machine control using a predictive map
US11825768B2 (en) 2020-10-09 2023-11-28 Deere & Company Machine control using a predictive map
US11845449B2 (en) 2020-10-09 2023-12-19 Deere & Company Map generation and control system
US11844311B2 (en) 2020-10-09 2023-12-19 Deere & Company Machine control using a predictive map
US11849672B2 (en) 2020-10-09 2023-12-26 Deere & Company Machine control using a predictive map
US11727680B2 (en) 2020-10-09 2023-08-15 Deere & Company Predictive map generation based on seeding characteristics and control
US11983009B2 (en) 2020-10-09 2024-05-14 Deere & Company Map generation and control system
US11864483B2 (en) 2020-10-09 2024-01-09 Deere & Company Predictive map generation and control system
US11874669B2 (en) 2020-10-09 2024-01-16 Deere & Company Map generation and control system
US11871697B2 (en) 2020-10-09 2024-01-16 Deere & Company Crop moisture map generation and control system
US11889787B2 (en) 2020-10-09 2024-02-06 Deere & Company Predictive speed map generation and control system
US11889788B2 (en) 2020-10-09 2024-02-06 Deere & Company Predictive biomass map generation and control
US11895948B2 (en) 2020-10-09 2024-02-13 Deere & Company Predictive map generation and control based on soil properties
US11927459B2 (en) 2020-10-09 2024-03-12 Deere & Company Machine control using a predictive map
US11946747B2 (en) 2020-10-09 2024-04-02 Deere & Company Crop constituent map generation and control system
US11474523B2 (en) 2020-10-09 2022-10-18 Deere & Company Machine control using a predictive speed map
US11870973B2 (en) 2021-07-27 2024-01-09 Deere & Company Camera calibration tool

Similar Documents

Publication Publication Date Title
WO2014093814A1 (en) Predictive load estimation through forward vision
US20160366821A1 (en) Crop mat measurement through stereo imaging
US10806078B2 (en) Control system for adjusting conditioning rollers of work vehicle
CA2923037C (en) Harvesting system with a self-propelled harvester
US20190351765A1 (en) System and method for regulating the operating distance between work vehicles
EP4002983B1 (en) System and method for determining residue coverage within a field following a harvesting operation
US9521805B2 (en) Harvester with predictive driving speed specification
US9807933B2 (en) Sensor equipped agricultural harvester
US20140215984A1 (en) Method for Setting the Work Parameters of a Harvester
US20150264864A1 (en) Mog sensing system for a residue spreader
US9788486B2 (en) Grain header with swathing and chopping capability
EP3476199B1 (en) Slip controller for side conveyors of a draper harvesting head
AU2021254671A1 (en) An agricultural system
JP2019028688A (en) Harvesting system for autonomously-traveling combine harvester
US11903342B2 (en) Auto reel height
US10588259B2 (en) Location based chop to swath conversion for riparian buffer zone management
WO2021261343A1 (en) Harvester, system for controlling harvester, method for controlling harvester, program for controlling harvester, and storage medium
US20230225246A1 (en) Agricultural residue depositing apparatus and method
WO2022123889A1 (en) Work vehicle, object state detection system, object state detection method, object state detection program, and recording medium in which object state detection program is recorded
EP3646700B1 (en) Agricultural harvester biomass estimating system
CN113727597A (en) Agricultural machinery such as harvester
JP7423441B2 (en) harvester
WO2022239779A1 (en) Combine and method
JP7433145B2 (en) harvester
JP7423440B2 (en) harvester

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13862423

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13862423

Country of ref document: EP

Kind code of ref document: A1