WO2016086035A1 - System and methods for identifying fields and tasks - Google Patents

System and methods for identifying fields and tasks

Info

Publication number
WO2016086035A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
task
display device
implement
machine
Prior art date
Application number
PCT/US2015/062501
Other languages
French (fr)
Inventor
Doug SAUDER
Ryan ALLGAIER
Original Assignee
Precision Planting Llc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Precision Planting Llc filed Critical Precision Planting Llc
Priority to CA2968365A priority Critical patent/CA2968365C/en
Priority to BR112017010698A priority patent/BR112017010698A2/en
Priority to EP15863682.9A priority patent/EP3224734A4/en
Priority to UAA201706467A priority patent/UA124489C2/en
Priority to AU2015353587A priority patent/AU2015353587B2/en
Priority to US15/532,940 priority patent/US10817812B2/en
Publication of WO2016086035A1 publication Critical patent/WO2016086035A1/en
Priority to ZA201704163A priority patent/ZA201704163B/en

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 - Administration; Management
    • G06Q10/06 - Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063 - Operations research, analysis or management
    • G06Q10/0631 - Resource planning, allocation, distributing or scheduling for enterprises or organisations
    • G06Q10/06311 - Scheduling, planning or task assignment for a person or group
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 - Arrangements for program control, e.g. control units
    • G06F9/06 - Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/30 - Arrangements for executing machine instructions, e.g. instruction decode
    • G06F9/30181 - Instruction operation extension or modification
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 - Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/02 - Agriculture; Fishing; Forestry; Mining
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 - Administration; Management
    • G06Q10/10 - Office automation; Time management

Definitions

  • Planters are used for planting seeds of crops (e.g., corn, soybeans) in a field.
  • Some planters include a display monitor within a cab for displaying a coverage map that shows regions of the field that have been planted.
  • the coverage map of the planter is generated based on planting data collected by the planter. Swath control prevents the planter from planting in a region that has already been planted by the same planter.
  • a combine harvester or combine is a machine that harvests crops.
  • a coverage map of a combine displays regions of the field that have been harvested by that combine.
  • a coverage map allows the operator of the combine to know that a region of the field has already been harvested by the same combine. The operator may have difficulty operating the machine, operating the implement, and analyzing the data and maps provided by the display monitor in a timely manner.
  • a system includes a display device to display a representation of one or more agricultural fields with geo-referenced boundaries and to receive one or more inputs for identifying at least one agricultural field with agricultural field identification information.
  • a processing system is communicatively coupled to the display device. The processing system is configured to automatically transmit raw data including measurement data and location component data to the display device in response to a machine or an implement starting and to automatically identify location component data of the raw data. The processing system is further configured to automatically assign raw data to at least one agricultural field.
  • said at least one agricultural field has been associated with at least one of a business, a farm, and a user.
  • automatically identifying location component data of the raw data comprises searching the raw data for data having a characteristic associated with the location component data including at least one of an identifying portion associated with location data and a data unit size, length or frequency associated with location data.
  • the processing system is further configured to send a query to a machine network of the machine or an implement network of the implement for requesting location information or identification of location information.
  • the measurement data includes at least one of seed sensor data, yield data, and liquid application rate data.
  • the location component data includes at least one of GPS data and real-time kinematics data.
  • the processing system is integrated with the machine or implement and the display device is removable from the machine.
  • a method includes automatically transmitting, with a communication unit, raw data including task information identifying at least one of an agricultural task to be performed and an implement to be used to a display device in response to a machine starting or an implement capable of being attached to the machine starting.
  • the method further includes automatically identifying at least one of an agricultural task and an implement to be used based on the task information.
  • the method further includes generating data and maps from raw data based on the identified agricultural task or implement.
  • the method optionally includes displaying the generated data and maps on a graphical user interface of the display device.
  • the identified agricultural task comprises harvesting and the raw data includes sensor data and location data to generate a yield map.
  • the identified agricultural task comprises planting and the raw data includes sensor data and location data to generate a planting map.
  • the automatically identifying at least one of an agricultural task and an implement to be used optionally comprises searching the raw data for data having a characteristic associated with the task information including at least one of an identifying portion associated with task information and a data unit size, length or frequency associated with task data.
  • the task information optionally includes an implement identifier that is associated with at least one of implement types, makes, or models.
  • the task information optionally includes controller or sensor signals having a frequency and the display device searches a database associating frequency of controller or sensor pulses with a type of agricultural application.
  • the method further includes sending a query to a machine network of the machine or an implement network of the implement for requesting location information or identification of location information.
  • a method includes initiating a software application on a display device, determining, with at least one of a processing system, a communication unit, and the display device, whether at least one of automatic field identification and automatic task identification occurs based on initiation of the software application. The method further includes displaying on a graphical user interface of the display device at least one of the determined automatic field identification and the automatic task identification if at least one of automatic field identification and automatic task identification occurs. The method further includes receiving input for correcting the automatic field identification with at least one alternative field if correction is needed when the automatic field identification occurs.
  • the method optionally includes receiving input for correcting the automatic task identification with at least one alternative task if correction is needed when the automatic task identification occurs.
  • the method optionally includes waiting for a subsequent determination of whether at least one of automatic field identification and automatic task identification occurs when no automatic field or task identification is initially determined to have occurred.
  • the method optionally includes generating alternative fields for the automatic field identification if appropriate and sending the alternative fields to the display device for display on the graphical user interface.
  • the method optionally includes generating alternative tasks for the automatic task identification if appropriate and sending the alternative tasks to the display device for display on the graphical user interface.
  • FIG. 1 shows an example of a system 100 that includes a machine 102 (e.g., tractor, combine harvester, etc.) and an implement 140 (e.g., planter, cultivator, plough, sprayer, spreader, irrigation implement, etc.) in accordance with one embodiment;
  • FIG. 2 illustrates a flow diagram of one embodiment for a method 200 of automatically identifying one or more agricultural fields for field operations;
  • FIG. 3 illustrates a flow diagram of one embodiment for a method 300 of automatically identifying one or more agricultural tasks; and
  • FIG. 4 illustrates a flow diagram of one embodiment for a method 400 of correcting at least one of automatic field identification and automatic task identification.
  • the system includes a machine (e.g., tractor, combine harvester, etc.) that includes a machine network in communication with a plurality of sensors and controllers on the machine.
  • the machine also includes a communication unit that includes a processor. The communication unit is in data communication with the machine network and an implement network of an implement. The processor processes agricultural data received from the machine network and the implement network to generate processed agricultural data.
  • a display device includes a processor and graphical user interface for displaying the processed agricultural data including fields and tasks for agricultural operations.
  • Embodiments of the invention provide an improved system and methods for automatic field identification, automatic task identification, correction of the automatic field identification if necessary, and correction of the automatic task identification if necessary.
  • An operator or user can review the displayed processed agricultural data for identifying fields and tasks. Subsequently, the user can make any corrections if necessary for the automatic field identification and the automatic task identification.
  • the operator can also remove the display device (e.g., a tablet device, a computing device) after finishing in-field operations and review data and images with the display device at a different location (e.g., farm, home) than the location (e.g., field) where the data is acquired.
  • FIG. 1 shows an example of a system 100 that includes a machine 102 (e.g., tractor, combine harvester, etc.) and an implement 140 (e.g., planter, cultivator, plough, sprayer, spreader, irrigation implement, etc.) in accordance with one embodiment.
  • the machine 102 includes a processing system 120, memory 105, machine network 110 (e.g., a controller area network (CAN) serial bus protocol network, an ISOBUS network, etc.), and a network interface 115 for communicating with other systems or devices including the implement 140.
  • the machine network 110 includes sensors 112 (e.g., speed sensors) and controllers 111 (e.g., GPS receiver, radar unit) for controlling and monitoring operations of the machine.
  • the network interface 115 can include at least one of a GPS transceiver, a WLAN transceiver (e.g., WiFi), an infrared transceiver, a Bluetooth transceiver, Ethernet, or other interfaces for communication with other devices and systems including the implement 140.
  • the network interface 115 may be integrated with the machine network 110 or separate from the machine network 110 as illustrated in FIG. 1.
  • the I/O ports 129 (e.g., a diagnostic/on-board diagnostic (OBD) port) enable communication with another data processing system or device (e.g., display devices, sensors, etc.).
  • the processing system 120 may include one or more microprocessors, processors, a system on a chip (integrated circuit), or one or more microcontrollers.
  • the processing system includes processing logic 126 for executing software instructions of one or more programs and a communication unit 128 (e.g., transmitter, transceiver) for transmitting and receiving communications from the machine via machine network 110 or network interface 115 or implement via implement network 150 or network interface 160.
  • the communication unit 128 may be integrated with the processing system or separate from the processing system. In one embodiment, the communication unit 128 is in data communication with the machine network 110 and implement network 150 via a diagnostic/OBD port of the I/O ports 129.
  • Processing logic 126 including one or more processors may process the communications received from the communication unit 128, including agricultural data.
  • the system 100 includes memory 105 for storing data and programs for execution (software 106) by the processing system.
  • the memory 105 can store, for example, software components such as an agricultural implement software application for monitoring and controlling field operations, a field and task identification software application or module for identifying one or more fields, identifying one or more tasks, and user correction of the field and task identification, or any other software application or module.
  • the memory 105 can be any known form of a machine readable non-transitory storage medium, such as semiconductor memory (e.g., flash; SRAM; DRAM; etc.) or non-volatile memory, such as hard disks or solid-state drives.
  • the system can also include an audio input/output subsystem (not shown) which may include a microphone and a speaker for, for example, receiving and sending voice commands or for user authentication or authorization (e.g., biometrics).
  • Display devices 125 and 130 can provide visual user interfaces for a user or operator.
  • the display devices may include display controllers.
  • the display device 125 is a portable tablet device or computing device with a touchscreen that displays images (e.g., high definition field maps of as-planted or as-harvested data, images for identification of fields and tasks) and data generated by the field and task identification software application or agricultural implement software application and receives input from the user or operator for identifying fields and tasks, correcting identified fields and tasks, or monitoring and controlling field operations.
  • the operations may include configuration of the machine or implement, reporting of data, control of the machine or implement including sensors and controllers, and storage of the data generated.
  • the display device 130 may be a display (e.g., display provided by an original equipment manufacturer (OEM)) that displays images and data for identifying fields and tasks, correcting identified fields and tasks, controlling a machine (e.g., planter, tractor, combine, sprayer, etc.), steering the machine, and monitoring the machine or an implement (e.g., planter, combine, sprayer, etc.) that is connected to the machine with sensors and controllers located on the machine or implement.
  • a cab control module 170 may include an additional control module for enabling or disabling certain components or devices of the machine or implement. For example, if the user or operator is not able to control the machine or implement using one or more of the display devices, then the cab control module may include switches to shut down or turn off components or devices of the machine or implement.
  • the implement 140 (e.g., planter, cultivator, plough, sprayer, spreader, irrigation implement, etc.) includes an implement network 150, a processing system 162, a network interface 160, and optional input/output ports 166 for communicating with other systems or devices including the machine 102.
  • the implement network 150 (e.g., a controller area network (CAN) serial bus protocol network, an ISOBUS network, etc.) includes sensors 152 (e.g., speed sensors, seed sensors for detecting passage of seed, downforce sensors, actuator valves, etc.), controllers 154 (e.g., GPS receiver), and the processing system 162 for controlling and monitoring operations of the machine.
  • the sensors may include moisture sensors or flow sensors for a combine, speed sensors for the machine, downforce (e.g., row unit downforce) sensors for a planter, liquid application sensors for a sprayer, or vacuum, lift, or lower sensors for an implement.
  • the sensors may comprise processors in communication with a plurality of seed sensors.
  • the processors are preferably configured to process seed sensor data and transmit processed data to the processing system 162 or 120.
  • the controllers and sensors may be used for monitoring motors and drives on a planter including a variable rate drive system for changing plant populations.
  • the controllers and sensors may also provide swath control to shut off individual rows or sections of the planter.
  • the sensors and controllers may sense changes in an electric motor that controls each row of a planter individually. These sensors and controllers may sense seed delivery speeds in a seed tube for each row of a planter.
  • the network interface 160 can be a WLAN transceiver (e.g., WiFi), an infrared transceiver, a Bluetooth transceiver, Ethernet, or other interfaces for communication with other devices and systems including the machine 102.
  • the network interface 160 may be integrated with the implement network 150 or separate from the implement network 150 as illustrated in FIG. 1.
  • the implement communicates with the machine via wired and/or wireless bidirectional communications 104.
  • the implement network 150 may communicate directly with the machine network 110 or via the network interfaces 115 and 160.
  • the implement may also be physically coupled to the machine for agricultural operations (e.g., planting, harvesting, spraying, etc.).
  • the memory 105 may be a machine-accessible non-transitory medium on which is stored one or more sets of instructions (e.g., software 106) embodying any one or more of the methodologies or functions described herein.
  • the software 106 may also reside, completely or at least partially, within the memory 105 and/or within the processing system 120 during execution thereof by the system 100, the memory and the processing system also constituting machine-accessible storage media.
  • the software 106 may further be transmitted or received over a network via the network interface device 115.
  • a machine-accessible non-transitory medium (e.g., memory 105) contains executable computer program instructions which when executed by a data processing system cause the system to identify agricultural fields and tasks. While the machine-accessible non-transitory medium (e.g., memory 105) is shown in an exemplary embodiment to be a single medium, the term "machine-accessible non-transitory medium" should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions.
  • machine-accessible non-transitory medium shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present invention.
  • machine-accessible non-transitory medium shall accordingly be taken to include, but not be limited to, solid-state memories, optical and magnetic media, and carrier wave signals.
  • FIG. 2 illustrates a flow diagram of one embodiment for a method 200 of automatically identifying one or more agricultural fields for field operations.
  • the method 200 is performed by processing logic that may comprise hardware (circuitry, dedicated logic, etc.), software (such as is run on a general purpose computer system or a dedicated machine or a device), or a combination of both.
  • the method 200 is performed by processing logic of at least one of a machine (e.g., processing system of a tractor, processing system of a combine harvester, processing system of an implement, etc.), at least one communication unit of the machine, and processing logic of a display device.
  • the processing system or communication unit of the machine or processing logic of the display device executes instructions of a software application or program.
  • the software application or program can be initiated by an operator or user of a machine (e.g., tractor, combine harvester).
  • a display device optionally receives a user input for initiating a software application (e.g., field and task identification software application or module, agricultural implement software application, etc.) on the display device.
  • the processing system or communication unit optionally receives a communication from the display device in response to the initiation of the software application.
  • the processing system, communication unit, or display device provides instructions for displaying a user interface that includes one or more fields with geo-referenced boundaries on the display device.
  • the display device may receive one or more user inputs for identifying at least one field.
  • the identifying information associated with the field may be stored in memory after having been entered by the user on desktop software and/or obtained from a computer server containing field identification information.
  • the user associates at least the identified field with at least one of a business, a farm, and a user.
  • the machine (or an implement that can be attached to the machine) starts based on a user input or action.
  • Starting the machine or implement may comprise, e.g., starting the electronic and/or electrical control systems of a tractor, seed planter (e.g., seed meter drive motors thereof), a combine harvester, or a liquid application device (e.g., a metering pump and/or control valve thereof).
  • the communication unit automatically transmits raw data to the display device in response to the machine starting and optionally in response to initiation of the software application.
  • the raw data includes measurement data (e.g., seed sensor data, yield data, liquid application rate data) and location component data (e.g., GPS data, real-time kinematics data).
  • Seed sensor data may be generated by an optical or electromagnetic sensor disposed to detect passage of seeds deposited by a seed meter of a seed planter; it should be appreciated that such seed sensor data may result from planting operations carried out by the implement 140 (e.g., seed planter).
  • Yield data may be generated by a yield monitor (e.g., an impact-type sensor generating a yield signal related to the amount of grain striking an impact plate) mounted to a combine harvester and disposed to contact grain being processed or transferred within the combine harvester; it should be appreciated that such yield data may result from crop harvesting operations carried out by the implement 140 (e.g., combine harvester).
  • Liquid application rate data may be generated by a flow rate sensor disposed on a liquid application device (e.g., sprayer or seed planter) to generate a measured or predicted flow rate associated with a valve or flow path; it should be appreciated that such liquid application rate data may result from liquid application operations carried out by the implement 140 (e.g., liquid application device).
  • the display device automatically identifies location component data of the raw data. For example, the display device may search the raw data for data having a characteristic associated with the location component data; in one such example, the display device may search the raw data for data (e.g., a CAN frame) having at least one of the following: (1) an identifying portion (e.g., an identifier field of a CAN frame) associated with location data such as GPS data; and (2) a data unit size, length or frequency associated with location data such as GPS data. In other examples, the display device may send a query (e.g., a message or signal) to the machine network or implement network requesting location information or identification of location information.
  • the display device automatically assigns raw data to at least one of the identified field, business, farm, and user. For example, the display device may select a field if the location data is associated with a geo-referenced position within the field boundary.
  • the processing system or communication unit of the machine processes the raw data by identifying location component data of the raw data and assigns raw data to at least one of the identified field, business, farm, and user.
  • the processed data is then sent to the display device for display on the graphical user interface of the display device.
  • FIG. 3 illustrates a flow diagram of one embodiment for a method 300 of automatically identifying one or more agricultural tasks.
  • the method 300 is performed by processing logic that may comprise hardware (circuitry, dedicated logic, etc.), software (such as is run on a general purpose computer system or a dedicated machine or a device), or a combination of both.
  • the method 300 is performed by processing logic of at least one of a machine (e.g., processing system of a tractor, processing system of a combine, processing system of an implement, etc.), at least one communication unit of the machine, and processing logic of a display device.
  • the processing system or communication unit of the machine or processing logic of the display device executes instructions of a software application or program.
  • the software application or program can be initiated by an operator or user of a machine (e.g., tractor, planter, combine harvester).
  • a display device optionally receives a user input for initiating a software application (e.g., field and task identification software application or module, agricultural implement software application, etc.) on the display device.
  • the processing system or communication unit optionally receives a communication from the display device in response to the initiation of the software application.
  • the machine or an implement that can be attached to the machine starts based on user input.
  • the communication unit automatically transmits raw data to the display device in response to the machine starting and optionally in response to initiation of the software application.
  • the raw data includes task information identifying at least one of a task to be performed and an implement to be used.
  • the display device automatically identifies at least one of a task and an implement based on the task information.
  • the display device preferably first identifies the task information in the raw data. For example, the display device may search the raw data for data having a characteristic associated with the task information; in one such example, the display device may search the raw data for data (e.g., a CAN frame) having at least one of the following: (1) an identifying portion (e.g., an identifier field of a CAN frame) associated with task information such as application rate data (e.g., seed sensor data), flow rate data, or yield data; and (2) a data unit size, length or frequency associated with task data.
  • the display device may send a query (e.g., a message or signal) to the machine network or implement network requesting task information such as application type information or vehicle or implement information (e.g., a model name or number such as a vehicle identification number).
  • the display device preferably identifies at least one of the task and implement using a characteristic of the task information. For example, if the task information includes an implement identifier, the display device may search a database associating implement identifiers with implement types, makes, or models.
  • the display device may search a database associating frequency of controller or sensor pulses with a type of application (e.g., planting, liquid application, or harvesting).
  • the display device generates data and maps from raw data based on the identified task or implement. For example, if the display device determines that the current task is harvesting, the display device preferably uses the sensor data and the location data to generate a yield map. In another example, if the display device determines that the current task is planting, the display device preferably uses the sensor data and the location data to generate a planting map such as a population map.
  • the display device displays the generated data and maps on a graphical user interface of the display device.
  • the processing system or communication unit of the machine processes the raw data by identifying at least one of a task to be performed and an implement to be used.
  • the processing system or communication unit then generates data and maps from raw data based on the identified task or implement.
  • the processed data is then sent to the display device for display on the graphical user interface of the display device.
  • FIG. 4 illustrates a flow diagram of one embodiment for a method 400 of correcting at least one of automatic field identification and automatic task identification.
  • the method 400 is performed by processing logic that may comprise hardware (circuitry, dedicated logic, etc.), software (such as is run on a general purpose computer system or a dedicated machine or a device), or a combination of both.
  • the method 400 is performed by processing logic of at least one of a machine (e.g., processing system of a tractor, processing system of a combine, processing system of an implement, etc.), at least one communication unit of the machine, and processing logic of a display device.
  • the processing system or communication unit of the machine or processing logic of the display device executes instructions of a software application or program.
  • the software application or program can be initiated by an operator or user of a machine (e.g., tractor, planter, combine harvester).
  • a display device optionally receives a user input for initiating a software application (e.g., field and task identification software application or module, agricultural implement software application, etc.) on the display device.
  • the processing system or communication unit optionally receives a communication from the display device in response to the initiation of the software application.
  • the processing system, communication unit, or display device determines whether at least one of automatic field identification (e.g., method 200) and automatic task identification (e.g., method 300) occurs. If so, then the display device displays on a graphical user interface at least one of the determined automatic field identification and the automatic task identification.
  • the display device receives user input for correcting the automatic field identification with an alternative field (or fields) if correction is needed, or user input for correcting the automatic task identification with an alternative task (or tasks) if correction is needed, at block 410. If no automatic field or task identification is determined to have occurred at block 406, then the method 400 waits for a subsequent determination of whether at least one of automatic field identification and automatic task identification occurs.
  • if the processing system or communication unit of the machine performs the operations of block 406, then optionally at operation 407 the processing system or communication unit generates alternative fields for the automatic field identification if appropriate and alternative tasks for the automatic task identification if appropriate, and sends the alternative fields and alternative tasks to the display device for display on the graphical user interface.
  • the operations of the method(s) disclosed herein can be altered, modified, combined, or deleted.
  • the operations of blocks 302 and 304 can be removed from method 300 and the operations of blocks 402 and 404 can be removed from method 400.
  • the methods in embodiments of the present invention may be performed with a device, an apparatus, or data processing system as described herein.
  • the device, apparatus, or data processing system may be a conventional, general-purpose computer system. Special-purpose computers, which are designed or programmed to perform only one function, may also be used.

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Human Resources & Organizations (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Economics (AREA)
  • Strategic Management (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Tourism & Hospitality (AREA)
  • Marketing (AREA)
  • General Business, Economics & Management (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Development Economics (AREA)
  • Operations Research (AREA)
  • Mining & Mineral Resources (AREA)
  • Animal Husbandry (AREA)
  • Agronomy & Crop Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Educational Administration (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Game Theory and Decision Science (AREA)
  • Marine Sciences & Fisheries (AREA)
  • Quality & Reliability (AREA)
  • Primary Health Care (AREA)
  • Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • Guiding Agricultural Machines (AREA)
  • Fertilizing (AREA)

Abstract

Described herein are a system and methods for identifying fields and tasks (e.g., agricultural fields and tasks). In one embodiment, a system includes a display device to display a representation of one or more agricultural fields with geo-referenced boundaries and to receive one or more inputs for identifying at least one agricultural field with agricultural field identification information. A processing system is communicatively coupled to the display device. The processing system is configured to automatically transmit raw data including measurement data and location component data to the display device in response to a machine or an implement starting and to automatically identify location component data of the raw data. The processing system is further configured to automatically assign raw data to at least one agricultural field.

Description

SYSTEM AND METHODS FOR IDENTIFYING FIELDS AND TASKS
BACKGROUND
[0001 ] This application claims the benefit of U.S. Provisional Application No. 62/083,640, filed on November 24, 2014, the entire contents of which are hereby incorporated by reference.
[0002] Planters are used for planting seeds of crops (e.g., corn, soybeans) in a field. Some planters include a display monitor within a cab for displaying a coverage map that shows regions of the field that have been planted. The coverage map of the planter is generated based on planting data collected by the planter. Swath control prevents the planter from planting in a region that has already been planted by the same planter.
[0003] A combine harvester or combine is a machine that harvests crops. A coverage map of a combine displays regions of the field that have been harvested by that combine. A coverage map allows the operator of the combine to know that a region of the field has already been harvested by the same combine. The operator may have difficulty operating the machine, operating the implement, and analyzing the data and maps provided by the display monitor in a timely manner.
SUMMARY
[0004] Described herein are a system and methods for identifying fields and tasks (e.g., agricultural fields and tasks). In one embodiment, a system includes a display device to display a representation of one or more agricultural fields with geo-referenced boundaries and to receive one or more inputs for identifying at least one agricultural field with agricultural field identification information. A processing system is communicatively coupled to the display device. The processing system is configured to automatically transmit raw data including measurement data and location component data to the display device in response to a machine or an implement starting and to automatically identify location component data of the raw data. The processing system is further configured to automatically assign raw data to at least one agricultural field.
[0005] In one example, said at least one agricultural field has been associated with at least one of a business, a farm, and a user.
[0006] In another example, automatically identifying location component data of the raw data comprises searching the raw data for data having a characteristic associated with the location component data including at least one of an identifying portion associated with location data and a data unit size, length or frequency associated with location data.
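As a minimal sketch of how such a search over raw data could look, the following Python fragment classifies incoming frames as location component data either by a known identifying portion or by a payload length and reporting interval typical of location messages. The frame structure, identifier value, and thresholds are illustrative assumptions, not values taken from this disclosure.

```python
# Illustrative sketch only; the frame structure, identifier values, and
# thresholds below are hypothetical.
from collections import namedtuple

RawFrame = namedtuple("RawFrame", ["identifier", "payload", "timestamp"])

LOCATION_IDENTIFIERS = {0x18FEF31C}   # hypothetical position message ID
LOCATION_PAYLOAD_LENGTH = 8           # bytes, assumed size of a position fix
LOCATION_PERIOD_S = (0.1, 1.0)        # assumed reporting interval range (s)

def is_location_frame(frame, previous_time_for_id):
    """Return True if a frame looks like location component data."""
    # (1) identifying portion associated with location data
    if frame.identifier in LOCATION_IDENTIFIERS:
        return True
    # (2) data unit size/length and frequency associated with location data
    if len(frame.payload) == LOCATION_PAYLOAD_LENGTH and previous_time_for_id is not None:
        period = frame.timestamp - previous_time_for_id
        if LOCATION_PERIOD_S[0] <= period <= LOCATION_PERIOD_S[1]:
            return True
    return False

def find_location_frames(frames):
    """Scan raw data and collect frames identified as location data."""
    last_seen = {}
    located = []
    for frame in frames:
        if is_location_frame(frame, last_seen.get(frame.identifier)):
            located.append(frame)
        last_seen[frame.identifier] = frame.timestamp
    return located
```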
[0007] In another example, the processing system is further configured to send a query to a machine network of the machine or an implement network of the implement for requesting location information or identification of location information.
[0008] In another example, the measurement data includes at least one of seed sensor data, yield data, and liquid application rate data.
[0009] In another example, the location component data includes at least one of GPS data and real-time kinematics data.
[0010] In another example, the processing system is integrated with the machine or implement and the display device is removable from the machine.
[0011] In one embodiment, a method includes automatically transmitting, with a communication unit, raw data including task information identifying at least one of an agricultural task to be performed and an implement to be used to a display device in response to a machine starting or an implement capable of being attached to the machine starting. The method further includes automatically identifying at least one of an agricultural task and an implement to be used based on the task information. The method further includes generating data and maps from raw data based on the identified agricultural task or implement.
[0012] In another example, the method optionally includes displaying the generated data and maps on a graphical user interface of the display device.
[0013] In another example, the identified agricultural task comprises harvesting and the raw data includes sensor data and location data to generate a yield map.
[0014] In another example, the identified agricultural task comprises planting and the raw data includes sensor data and location data to generate a planting map.
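The following sketch illustrates one simple way geo-referenced sensor readings could be binned into a map layer of the kind described above (a yield map for harvesting, a planting or population map for planting). The grid cell size and point format are assumptions made only for illustration.

```python
# Minimal sketch: grid-binning of geo-referenced sensor readings into a map
# layer. Cell size and point format are assumptions.
import math
from collections import defaultdict

def build_map(points, cell_deg=0.0001):
    """points: iterable of (lat, lon, value); returns {(row, col): mean value}."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    for lat, lon, value in points:
        cell = (math.floor(lat / cell_deg), math.floor(lon / cell_deg))
        sums[cell] += value
        counts[cell] += 1
    return {cell: sums[cell] / counts[cell] for cell in sums}

# The same binning can back either map type:
#   yield map    -> value is mass flow from the yield monitor
#   planting map -> value is seeds per second (or population) from seed sensors
```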
[0015] In another example, the automatically identifying at least one of an agricultural task and an implement to be used optionally comprises searching the raw data for data having a characteristic associated with the task information including at least one of an identifying portion associated with task information and a data unit size, length or frequency associated with task data.
[0016] In another example, the task information optionally includes an implement identifier that is associated with at least one of implement types, makes, or models.
[0017] In another example, the task information optionally includes controller or sensor signals having a frequency and the display device searches a database associating frequency of controller or sensor pulses with a type of agricultural application.
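A rough sketch of the two database lookups described above is shown below; the implement identifiers, frequency ranges, and table contents are hypothetical placeholders rather than values from this disclosure.

```python
# Hypothetical database associating implement identifiers with type/make/model.
IMPLEMENT_DB = {
    "PL-1624": {"type": "planter", "make": "ExampleCo", "model": "1624"},
    "SP-900":  {"type": "sprayer", "make": "ExampleCo", "model": "900"},
}

# Hypothetical database associating sensor/controller pulse frequency (Hz)
# with a type of agricultural application.
FREQUENCY_DB = [
    ((20.0, 200.0), "planting"),            # seed sensor pulses
    ((1.0, 20.0),   "liquid application"),  # flow meter pulses
    ((0.2, 1.0),    "harvesting"),          # yield monitor updates
]

def identify_task(task_info):
    """task_info: dict with optional 'implement_id' and 'pulse_hz' keys."""
    implement = IMPLEMENT_DB.get(task_info.get("implement_id"))
    application = None
    pulse_hz = task_info.get("pulse_hz")
    if pulse_hz is not None:
        for (low, high), app in FREQUENCY_DB:
            if low <= pulse_hz < high:
                application = app
                break
    return implement, application
```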
[0018] In another example, the method further includes sending a query to a machine network of the machine or an implement network of the implement for requesting location information or identification of location information.
[0019] In one embodiment, a method includes initiating a software application on a display device, determining, with at least one of a processing system, a communication unit, and the display device, whether at least one of automatic field identification and automatic task identification occurs based on initiation of the software application. The method further includes displaying on a graphical user interface of the display device at least one of the determined automatic field identification and the automatic task identification if at least one of automatic field identification and automatic task identification occurs. The method further includes receiving input for correcting the automatic field identification with at least one alternative field if correction is needed when the automatic field identification occurs.
[0020] In another example, the method optionally includes receiving input for correcting the automatic task identification with at least one alternative task if correction is needed when the automatic task identification occurs.
[0021 ] In another example, the method optionally includes waiting for a subsequent determination of whether at least one of automatic field identification and automatic task identification occurs when no automatic field or task identification is initially determined to have occurred.
[0022] In another example, the method optionally includes generating alternative fields for the automatic field identification if appropriate and sending the alternative fields to the display device for display on the graphical user interface.
[0023] In another example, the method optionally includes generating alternative tasks for the automatic task identification if appropriate and sending the alternative tasks to the display device for display on the graphical user interface.
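The correction flow of this embodiment can be summarized by the following sketch, in which the confirm callback stands in for input received on the display device's graphical user interface; the data shapes are assumptions for illustration only.

```python
def confirm_identification(auto_value, alternatives, confirm):
    """auto_value: automatically identified field or task (None if none occurred).
    alternatives: candidate fields or tasks generated for possible correction.
    confirm: callable(auto_value, alternatives) -> chosen value, or None if the
             user accepts the automatic identification (stands in for GUI input).
    """
    if auto_value is None:
        return None               # wait for a subsequent determination
    chosen = confirm(auto_value, alternatives)
    return chosen if chosen is not None else auto_value

# Example: the operator keeps the automatic field but overrides the task.
field = confirm_identification("North 80", ["North 80", "South 40"],
                               lambda auto, alts: None)        # no correction
task = confirm_identification("planting", ["planting", "liquid application"],
                              lambda auto, alts: alts[1])      # corrected
```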
DESCRIPTION OF THE DRAWINGS
[0024] The present invention is illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings and in which:
[0025] FIG. 1 shows an example of a system 100 that includes a machine 102 (e.g., tractor, combine harvester, etc.) and an implement 140 (e.g., planter, cultivator, plough, sprayer, spreader, irrigation implement, etc.) in accordance with one embodiment;
[0026] FIG. 2 illustrates a flow diagram of one embodiment for a method 200 of automatically identifying one or more agricultural fields for field operations;
[0027] FIG. 3 illustrates a flow diagram of one embodiment for a method 300 of automatically identifying one or more agricultural tasks; and
[0028] FIG. 4 illustrates a flow diagram of one embodiment for a method 400 of correcting at least one of automatic field identification and automatic task identification.
DESCRIPTION
[0029] Described herein are a system and methods for identifying fields and tasks (e.g., agricultural fields and tasks). In one embodiment, the system includes a machine (e.g., tractor, combine harvester, etc.) that includes a machine network in communication with a plurality of sensors and controllers on the machine. The machine also includes a communication unit that includes a processor. The communication unit is in data communication with the machine network and an implement network of an implement. The processor processes agricultural data received from the machine network and the implement network to generate processed agricultural data. A display device includes a processor and graphical user interface for displaying the processed agricultural data including fields and tasks for agricultural operations.
[0030] Embodiments of the invention provide an improved system and methods for automatic field identification, automatic task identification, correction of the automatic field identification if necessary, and correction of the automatic task identification if necessary. An operator or user can review the displayed processed agricultural data for identifying fields and tasks. Subsequently, the user can make any corrections if necessary for the automatic field identification and the automatic task identification. The operator can also remove the display device (e.g., a tablet device, a computing device) after finishing in-field operations and review data and images with the display device at a different location (e.g., farm, home) than the location (e.g., field) where the data is acquired.
[0031] In the following description, numerous details are set forth. It will be apparent, however, to one skilled in the art, that the present invention may be practiced without these specific details. In some instances, well-known structures and devices are shown in block diagram form, rather than in detail, in order to avoid obscuring the present invention.
[0032] FIG. 1 shows an example of a system 100 that includes a machine 102 (e.g., tractor, combine harvester, etc.) and an implement 140 (e.g., planter, cultivator, plough, sprayer, spreader, irrigation implement, etc.) in accordance with one embodiment. The machine 102 includes a processing system 120, memory 105, machine network 110 (e.g., a controller area network (CAN) serial bus protocol network, an ISOBUS network, etc.), and a network interface 115 for communicating with other systems or devices including the implement 140. The machine network 110 includes sensors 112 (e.g., speed sensors) and controllers 111 (e.g., GPS receiver, radar unit) for controlling and monitoring operations of the machine. The network interface 115 can include at least one of a GPS transceiver, a WLAN transceiver (e.g., WiFi), an infrared transceiver, a Bluetooth transceiver, Ethernet, or other interfaces for communication with other devices and systems including the implement 140. The network interface 115 may be integrated with the machine network 110 or separate from the machine network 110 as illustrated in FIG. 1. The I/O ports 129 (e.g., diagnostic/on board diagnostic (OBD) port) enable communication with another data processing system or device (e.g., display devices, sensors, etc.).
[0033] The processing system 120 may include one or more microprocessors, processors, a system on a chip (integrated circuit), or one or more microcontrollers. The processing system includes processing logic 126 for executing software instructions of one or more programs and a communication unit 128 (e.g., transmitter, transceiver) for transmitting and receiving communications from the machine via machine network 110 or network interface 115 or implement via implement network 150 or network interface 160. The communication unit 128 may be integrated with the processing system or separate from the processing system. In one embodiment, the communication unit 128 is in data communication with the machine network 110 and implement network 150 via a diagnostic/OBD port of the I/O ports 129.
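As an illustration of the relaying role of the communication unit 128 (not the actual firmware), the sketch below tags raw frames read from the machine network and the implement network and forwards them to a display device through a placeholder callback; the frame sources and the send_to_display function are assumptions.

```python
def relay_raw_data(machine_frames, implement_frames, send_to_display):
    """machine_frames / implement_frames: iterables of raw frames read from the
    machine network and implement network (e.g., via a diagnostic/OBD port).
    send_to_display: callable that forwards one tagged frame to a display device."""
    for source, frames in (("machine", machine_frames),
                           ("implement", implement_frames)):
        for frame in frames:
            # Tag each frame with its origin so downstream processing can
            # distinguish machine data from implement data.
            send_to_display({"source": source, "frame": frame})
```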
[0034] Processing logic 126 including one or more processors may process the communications received from the communication unit 128 including agricultural data. The system 100 includes memory 105 for storing data and programs for execution (software 106) by the processing system. The memory 105 can store, for example, software components such as an agricultural implement software application for monitoring and controlling field operations, a field and task identification software application or module for identifying one or more fields, identifying one or more tasks, and user correction of the field and task identification, or any other software application or module. The memory 105 can be any known form of a machine readable non-transitory storage medium, such as semiconductor memory (e.g., flash; SRAM; DRAM; etc.) or non-volatile memory, such as hard disks or solid-state drives. The system can also include an audio input/output subsystem (not shown) which may include a microphone and a speaker for, for example, receiving and sending voice commands or for user authentication or authorization (e.g., biometrics).
[0035] Display devices 125 and 130 can provide visual user interfaces for a user or operator. The display devices may include display controllers. In one embodiment, the display device 125 is a portable tablet device or computing device with a touchscreen that displays images (e.g., high definition field maps of as-planted or as-harvested data, images for identification of fields and tasks) and data generated by the field and task identification software application or agricultural implement software application and receives input from the user or operator for identifying fields and tasks, correcting identified fields and tasks, or monitoring and controlling field operations. The operations may include configuration of the machine or implement, reporting of data, control of the machine or implement including sensors and controllers, and storage of the data generated. The display device 130 may be a display (e.g., display provided by an original equipment manufacturer (OEM)) that displays images and data for identifying fields and tasks, correcting identified fields and tasks, controlling a machine (e.g., planter, tractor, combine, sprayer, etc.), steering the machine, and monitoring the machine or an implement (e.g., planter, combine, sprayer, etc.) that is connected to the machine with sensors and controllers located on the machine or implement.
[0036] A cab control module 170 may include an additional control module for enabling or disabling certain components or devices of the machine or implement. For example, if the user or operator is not able to control the machine or implement using one or more of the display devices, then the cab control module may include switches to shut down or turn off components or devices of the machine or implement.
[0037] The implement 140 (e.g., planter, cultivator, plough, sprayer, spreader, irrigation implement, etc.) includes an implement network 150, a processing system 162, a network interface 160, and optional input/output ports 166 for communicating with other systems or devices including the machine 102. The implement network 150 (e.g., a controller area network (CAN) serial bus protocol network, an ISOBUS network, etc.) includes sensors 152 (e.g., speed sensors, seed sensors for detecting passage of seed, downforce sensors, actuator valves, etc.), controllers 154 (e.g., GPS receiver), and the processing system 162 for controlling and monitoring operations of the machine. The sensors may include moisture sensors or flow sensors for a combine, speed sensors for the machine, downforce (e.g., row unit downforce) sensors for a planter, liquid application sensors for a sprayer, or vacuum, lift, or lower sensors for an implement. For example, the sensors may comprise processors in communication with a plurality of seed sensors. The processors are preferably configured to process seed sensor data and transmit processed data to the processing system 162 or 120. The controllers and sensors may be used for monitoring motors and drives on a planter including a variable rate drive system for changing plant populations. The controllers and sensors may also provide swath control to shut off individual rows or sections of the planter. The sensors and controllers may sense changes in an electric motor that controls each row of a planter individually. These sensors and controllers may sense seed delivery speeds in a seed tube for each row of a planter.
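For illustration, the sketch below shows one way a row processor might reduce raw seed sensor pulses to a compact record (seed count, seeds per second, and approximate seed spacing) before transmitting it toward processing system 162 or 120; the units and message fields are assumptions.

```python
def summarize_row(pulse_times_s, ground_speed_m_s):
    """pulse_times_s: timestamps (s) of seed pulses from one row's seed sensor.
    ground_speed_m_s: machine ground speed in m/s.
    Returns a small record suitable for transmission on the implement network."""
    if len(pulse_times_s) < 2:
        return {"seeds": len(pulse_times_s), "seeds_per_s": 0.0, "spacing_m": None}
    duration = pulse_times_s[-1] - pulse_times_s[0]
    if duration <= 0:
        return {"seeds": len(pulse_times_s), "seeds_per_s": 0.0, "spacing_m": None}
    seeds_per_s = (len(pulse_times_s) - 1) / duration
    # Seed spacing is distance traveled per seed at the current ground speed.
    spacing_m = ground_speed_m_s / seeds_per_s if seeds_per_s > 0 else None
    return {"seeds": len(pulse_times_s),
            "seeds_per_s": round(seeds_per_s, 2),
            "spacing_m": round(spacing_m, 3) if spacing_m is not None else None}
```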
[0038] The network interface 160 can be a WLAN transceiver (e.g., WiFi), an infrared transceiver, a Bluetooth transceiver, Ethernet, or other interfaces for communication with other devices and systems including the machine 102. The network interface 160 may be integrated with the implement network 150 or separate from the implement network 150 as illustrated in FIG. 1.
[0039] The implement communicates with the machine via wired and/or wireless bidirectional communications 104. The implement network 150 may communicate directly with the machine network 110 or via the network interfaces 115 and 160. The implement may also be physically coupled to the machine for agricultural operations (e.g., planting, harvesting, spraying, etc.).
[0040] The memory 105 may be a machine-accessible non-transitory medium on which is stored one or more sets of instructions (e.g., software 106) embodying any one or more of the methodologies or functions described herein. The software 106 may also reside, completely or at least partially, within the memory 105 and/or within the processing system 120 during execution thereof by the system 100, the memory and the processing system also constituting machine-accessible storage media. The software 106 may further be transmitted or received over a network via the network interface device 115.
[0041 ] In one embodiment, a machine-accessible non-transitory medium (e.g., memory 105) contains executable computer program instructions which when executed by a data processing system cause the system to identify agricultural fields and tasks. While the machine-accessible non-transitory medium (e.g., memory 105) is shown in an exemplary embodiment to be a single medium, the term "machine-accessible non-transitory medium" should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions. The term "machine-accessible non-transitory medium" shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present invention. The term "machine-accessible non-transitory medium" shall accordingly be taken to include, but not be limited to, solid-state memories, optical and magnetic media, and carrier wave signals.
[0042] FIG. 2 illustrates a flow diagram of one embodiment of a method 200 for automatically identifying one or more agricultural fields for field operations. The method 200 is performed by processing logic that may comprise hardware (circuitry, dedicated logic, etc.), software (such as software run on a general-purpose computer system, a dedicated machine, or a device), or a combination of both. In one embodiment, the method 200 is performed by processing logic of at least one of a machine (e.g., processing system of a tractor, processing system of a combine harvester, processing system of an implement, etc.), at least one communication unit of the machine, and processing logic of a display device. The processing system or communication unit of the machine, or the processing logic of the display device, executes instructions of a software application or program. The software application or program can be initiated by an operator or user of a machine (e.g., tractor, combine harvester).
[0043] At block 202, a display device optionally receives a user input for initiating a software application (e.g., field and task identification software application or module, agricultural implement software application, etc.) on the display device. At block 204, the processing system or communication unit optionally receives a communication from the display device in response to the initiation of the software application. At block 206, the processing system, communication unit, or display device provides instructions for displaying a user interface that includes one or more fields with geo-referenced boundaries on the display device. At block 208, the display device may receive one or more user inputs for identifying at least one field. It should be appreciated that the identifying information associated with the field (e.g., name, boundary) may be stored in memory after having been entered by the user on desktop software and/or obtained from a computer server containing field identification information. Optionally, the user associates the identified field with at least one of a business, a farm, and a user. At block 210, the machine (or an implement that can be attached to the machine) starts based on a user input or action. Starting the machine or implement may comprise, e.g., starting the electronic and/or electrical control systems of a tractor, a seed planter (e.g., seed meter drive motors thereof), a combine harvester, or a liquid application device (e.g., a metering pump and/or control valve thereof). At block 212, the communication unit automatically transmits raw data to the display device in response to the machine starting and optionally in response to initiation of the software application. The raw data includes measurement data (e.g., seed sensor data, yield data, liquid application rate data) and location component data (e.g., GPS data, real-time kinematics data). Seed sensor data may be generated by an optical or electromagnetic sensor disposed to detect passage of seeds deposited by a seed meter of a seed planter; it should be appreciated that such seed sensor data may result from planting operations carried out by the implement 140 (e.g., seed planter). Yield data may be generated by a yield monitor (e.g., an impact-type sensor generating a yield signal related to the amount of grain striking an impact plate) mounted to a combine harvester and disposed to contact grain being processed or transferred within the combine harvester; it should be appreciated that such yield data may result from crop harvesting operations carried out by the implement 140 (e.g., combine harvester). Liquid application rate data may be generated by a flow rate sensor disposed on a liquid application device (e.g., sprayer or seed planter) to generate a measured or predicted flow rate associated with a valve or flow path; it should be appreciated that such liquid application rate data may result from liquid application operations carried out by the implement 140 (e.g., liquid application device).
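As a purely illustrative aside, not part of the original disclosure, the raw data of block 212 can be pictured as a stream of frames in which measurement frames and location frames are interleaved. The Python sketch below models such a stream with hypothetical identifiers, keys, and values that do not correspond to any actual CAN or ISOBUS message definitions.

```python
# Hypothetical model of the raw data stream of block 212: measurement frames
# (seed counts, flow rate) interleaved with location frames (GPS fixes). The
# identifiers (0x18F, 0x2A0, 0x3B1) and keys are illustrative assumptions only.
raw_stream = [
    {"id": 0x18F, "lat": 40.1203, "lon": -88.2434, "t": 0.00},  # location frame (GPS fix)
    {"id": 0x2A0, "row": 1, "seeds": 3, "t": 0.05},             # seed-sensor measurement frame
    {"id": 0x2A0, "row": 2, "seeds": 2, "t": 0.05},
    {"id": 0x3B1, "flow_lpm": 1.8, "t": 0.06},                  # liquid application rate frame
    {"id": 0x18F, "lat": 40.1204, "lon": -88.2434, "t": 1.00},  # next GPS fix
]

# The communication unit simply forwards frames of this kind to the display
# device once the machine starts; separating location data from measurement
# data is the job of the next step (see the sketch following paragraph [0044]).
location_frames = [f for f in raw_stream if f["id"] == 0x18F]
measurement_frames = [f for f in raw_stream if f["id"] != 0x18F]
print(len(location_frames), len(measurement_frames))  # 2 3
```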
[0044] At block 214, the display device automatically identifies location component data of the raw data. For example, the display device may search the raw data for data having a characteristic associated with the location component data; in one such example, the display device may search the raw data for data (e.g., a CAN frame) having at least one of the following: (1) an identifying portion (e.g., an identifier field of a CAN frame) associated with location data such as GPS data; and (2) a data unit size, length or frequency associated with location data such as GPS data. In other examples, the display device may send a query (e.g., a message or signal) to the machine network or implement network requesting location information or identification of location information. At block 216, the display device automatically assigns raw data to at least one of the identified field, business, farm, and user. For example, the display device may select a field if the location data is associated with a geo-referenced position within the field boundary.
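Continuing the illustration above, and again not as part of the original disclosure, the following sketch shows one possible realization of blocks 214 and 216: location frames are recognized by an assumed identifier, and each GPS fix is assigned to the first field whose geo-referenced boundary contains it, here using a standard ray-casting point-in-polygon test. The identifier value, field names, and boundary coordinates are hypothetical.

```python
# Illustrative sketch of blocks 214 and 216 under stated assumptions: location
# frames are recognized by a hypothetical identifier, and a GPS fix is assigned
# to a field when it falls inside that field's geo-referenced boundary polygon.
ASSUMED_GPS_ID = 0x18F  # hypothetical identifier of location (GPS) frames

def point_in_polygon(lon, lat, boundary):
    """Ray-casting containment test; boundary is a list of (lon, lat) vertices."""
    inside = False
    j = len(boundary) - 1
    for i in range(len(boundary)):
        xi, yi = boundary[i]
        xj, yj = boundary[j]
        if (yi > lat) != (yj > lat) and lon < (xj - xi) * (lat - yi) / (yj - yi) + xi:
            inside = not inside
        j = i
    return inside

def assign_frames_to_fields(raw_frames, field_boundaries):
    """Tag each location frame with the first field whose boundary contains its fix."""
    assignments = []
    for frame in raw_frames:
        if frame.get("id") != ASSUMED_GPS_ID:
            continue  # not a location frame (block 214 concerns location data only)
        field = next(
            (name for name, poly in field_boundaries.items()
             if point_in_polygon(frame["lon"], frame["lat"], poly)),
            None,
        )
        assignments.append((frame["lon"], frame["lat"], field))
    return assignments

# Example with a hypothetical rectangular field boundary named "North 40".
fields = {"North 40": [(-88.25, 40.11), (-88.24, 40.11), (-88.24, 40.13), (-88.25, 40.13)]}
frames = [{"id": 0x18F, "lat": 40.1203, "lon": -88.2434},
          {"id": 0x2A0, "row": 1, "seeds": 3}]
print(assign_frames_to_fields(frames, fields))  # [(-88.2434, 40.1203, 'North 40')]
```

Measurement frames recorded between two fixes assigned to the same field could then inherit that assignment, which is one way the raw data as a whole might be assigned to the identified field, business, farm, or user.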
[0045] In another embodiment, the processing system or communication unit of the machine processes the raw data by identifying location component data of the raw data and assigns raw data to at least one of the identified field, business, farm, and user. The processed data is then sent to the display device for display on the graphical user interface of the display device.
[0046] FIG. 3 illustrates a flow diagram of one embodiment of a method 300 for automatically identifying one or more agricultural tasks. The method 300 is performed by processing logic that may comprise hardware (circuitry, dedicated logic, etc.), software (such as software run on a general-purpose computer system, a dedicated machine, or a device), or a combination of both. In one embodiment, the method 300 is performed by processing logic of at least one of a machine (e.g., processing system of a tractor, processing system of a combine, processing system of an implement, etc.), at least one communication unit of the machine, and processing logic of a display device. The processing system or communication unit of the machine, or the processing logic of the display device, executes instructions of a software application or program. The software application or program can be initiated by an operator or user of a machine (e.g., tractor, planter, combine harvester).
[0047] At block 302, a display device optionally receives a user input for initiating a software application (e.g., field and task identification software application or module, agricultural implement software application, etc.) on the display device. At block 304, the processing system or communication unit optionally receives a communication from the display device in response to the initiation of the software application. At block 306, the machine (or an implement that can be attached to the machine) starts based on user input. At block 308, the communication unit automatically transmits raw data to the display device in response to the machine starting and optionally in response to initiation of the software application. The raw data includes task information identifying at least one of a task to be performed and an implement to be used. At block 310, the display device automatically identifies at least one of a task and an implement based on the task information. The display device preferably first identifies the task information in the raw data. For example, the display device may search the raw data for data having a characteristic associated with the task information; in one such example, the display device may search the raw data for data (e.g., a CAN frame) having at least one of the following: (1) an identifying portion (e.g., an identifier field of a CAN frame) associated with task information such as application rate data (e.g., seed sensor data), flow rate data, or yield data; and (2) a data unit size, length or frequency associated with task data. In other examples, the display device may send a query (e.g., a message or signal) to the machine network or implement network requesting task information such as application type information or vehicle or implement information (e.g., a model name or number such as a vehicle identification number). Once the display device has identified the task information, the display device preferably identifies at least one of the task and the implement using a characteristic of the task information. For example, if the task information includes an implement identifier, the display device may search a database associating implement identifiers with implement types, makes, or models. In other examples, if the task information includes controller or sensor signals having a frequency, the display device may search a database associating frequency of controller or sensor pulses with a type of application (e.g., planting, liquid application, or harvesting). At block 312, the display device generates data and maps from raw data based on the identified task or implement. For example, if the display device determines that the current task is harvesting, the display device preferably uses the sensor data and the location data to generate a yield map. In another example, if the display device determines that the current task is planting, the display device preferably uses the sensor data and the location data to generate a planting map such as a population map. At block 314, the display device displays the generated data and maps on a graphical user interface of the display device.
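As a further non-limiting illustration that is not part of the original disclosure, the sketch below combines two of the heuristics described in this paragraph: looking up an implement identifier in a database of implement types, and matching a controller or sensor pulse frequency to an application type, which in turn selects the kind of map to generate. Every identifier, frequency band, and table entry shown is a hypothetical assumption.

```python
# Illustrative sketch of blocks 310 and 312 using hypothetical lookup tables.
IMPLEMENT_DB = {                 # assumed implement identifier -> implement type
    "HYP-PLANTER-16": "seed planter",
    "HYP-COMBINE-01": "combine harvester",
}

APPLICATION_BY_FREQ = [          # assumed pulse-frequency bands (Hz) -> application type
    ((5.0, 60.0), "planting"),              # seed pulses per row
    ((0.5, 5.0), "harvesting"),             # yield-monitor impact readings
    ((60.0, 500.0), "liquid application"),  # flow-meter pulses
]

def identify_task(task_info):
    """Return (task, implement) inferred from whatever task information is present."""
    implement = IMPLEMENT_DB.get(task_info.get("implement_id"))
    task = None
    freq = task_info.get("pulse_hz")
    if freq is not None:
        for (low, high), name in APPLICATION_BY_FREQ:
            if low <= freq < high:
                task = name
                break
    return task, implement

def map_type_for(task):
    """Pick the map to generate from the identified task (block 312)."""
    return {"harvesting": "yield map", "planting": "population map"}.get(task, "coverage map")

task, implement = identify_task({"implement_id": "HYP-PLANTER-16", "pulse_hz": 12.0})
print(task, implement, "->", map_type_for(task))  # planting seed planter -> population map
```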
[0048] In another embodiment, the processing system or communication unit of the machine processes the raw data by identifying at least one of a task to be performed and an implement to be used. The processing system or communication unit then generates data and maps from raw data based on the identified task or implement. The processed data is then sent to the display device for display on the graphical user interface of the display device.
[0049] FIG. 4 illustrates a flow diagram of one embodiment of a method 400 for correcting at least one of automatic field identification and automatic task identification. The method 400 is performed by processing logic that may comprise hardware (circuitry, dedicated logic, etc.), software (such as software run on a general-purpose computer system, a dedicated machine, or a device), or a combination of both. In one embodiment, the method 400 is performed by processing logic of at least one of a machine (e.g., processing system of a tractor, processing system of a combine, processing system of an implement, etc.), at least one communication unit of the machine, and processing logic of a display device. The processing system or communication unit of the machine, or the processing logic of the display device, executes instructions of a software application or program. The software application or program can be initiated by an operator or user of a machine (e.g., tractor, planter, combine harvester).
[0050] At block 402, a display device optionally receives a user input for initiating a software application (e.g., field and task identification software application or module, agricultural implement software application, etc.) on the display device. At block 404, the processing system or communication unit optionally receives a communication from the display device in response to the initiation of the software application. At block 406, the processing system, communication unit, or display device determines whether at least one of automatic field identification (e.g., method 200) and automatic task identification (e.g., method 300) occurs. If so, then at block 408 the display device displays on a graphical user interface at least one of the determined automatic field identification and the determined automatic task identification, and also displays alternative fields for the automatic field identification and alternative tasks for the automatic task identification. At block 410, the display device receives user input for correcting the automatic field identification with an alternative field(s) if correction is needed, or receives user input for correcting the automatic task identification with an alternative task(s) if correction is needed. If no automatic field or task identification is determined to have occurred at block 406, then the method 400 waits for a subsequent determination of whether at least one of automatic field identification and automatic task identification occurs.
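The following sketch, again illustrative only and not part of the original disclosure, shows one way blocks 408 and 410 could be realized: the alternatives offered alongside the automatic field identification are simply the other known fields ranked by distance from the current GPS fix, and the user's selection, if any, replaces the automatic pick. The ranking heuristic, field names, and coordinates are assumptions.

```python
# Illustrative sketch of blocks 408 and 410: rank alternative fields by distance
# from the current fix and apply a user correction if one is given. The simple
# planar distance on lon/lat centroids is a deliberate simplification.
import math

def rank_alternative_fields(current_fix, field_centroids, auto_field, limit=3):
    """Return up to `limit` alternative field names, nearest first, excluding the auto pick."""
    lon0, lat0 = current_fix
    others = [
        (name, math.hypot(lon - lon0, lat - lat0))
        for name, (lon, lat) in field_centroids.items()
        if name != auto_field
    ]
    return [name for name, _ in sorted(others, key=lambda item: item[1])[:limit]]

def apply_correction(auto_field, alternatives, user_choice=None):
    """Keep the automatic identification unless the user picks a listed alternative."""
    return user_choice if user_choice in alternatives else auto_field

centroids = {
    "North 40": (-88.244, 40.120),
    "Creek Bottom": (-88.251, 40.118),
    "Home 80": (-88.230, 40.140),
}
alternatives = rank_alternative_fields((-88.2434, 40.1203), centroids, auto_field="North 40")
print(alternatives)                                      # ['Creek Bottom', 'Home 80']
print(apply_correction("North 40", alternatives))        # no correction -> 'North 40'
print(apply_correction("North 40", alternatives, "Creek Bottom"))  # user-corrected field
```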
[0051] In another embodiment, the processing system or communication unit of the machine performs the operations of block 406; then, optionally, at operation 407, the processing system or communication unit generates alternative fields for the automatic field identification if appropriate and alternative tasks for the automatic task identification if appropriate, and sends the alternative fields and alternative tasks to the display device for display on the graphical user interface.
[0052] In some embodiments, the operations of the method(s) disclosed herein can be altered, modified, combined, or deleted. For example, the operations of blocks 302 and 304 can be removed from method 300 and the operations of blocks 402 and 404 can be removed from method 400. The methods in embodiments of the present invention may be performed with a device, an apparatus, or a data processing system as described herein. The device, apparatus, or data processing system may be a conventional, general-purpose computer system; alternatively, special-purpose computers, which are designed or programmed to perform only one function, may also be used.
[0053] It is to be understood that the above description is intended to be illustrative, and not restrictive. Many other embodiments will be apparent to those of skill in the art upon reading and understanding the above description. The scope of the invention should, therefore, be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled.

Claims

1. A system comprising:
a display device to display a representation of one or more agricultural fields with geo-referenced boundaries and to receive one or more inputs for identifying at least one agricultural field with agricultural field identification information; and
a processing system communicatively coupled to the display device, wherein the processing system is configured to automatically transmit raw data including measurement data and location component data to the display device in response to a machine or an implement starting and to automatically identify location component data of the raw data, and wherein the processing system is further configured to automatically assign raw data to at least one agricultural field.
2. The system of claim 1, wherein said at least one agricultural field has been associated with at least one of a business, a farm, and a user.
3. The system of claim 1, wherein automatically identifying location component data of the raw data comprises searching the raw data for data having a characteristic associated with the location component data including at least one of an identifying portion associated with location data and a data unit size, length or frequency associated with location data.
4. The system of claim 1, wherein the processing system is further configured to:
send a query to a machine network of the machine or an implement network of the implement for requesting location information or identification of location information.
5. The system of claim 1, wherein the measurement data includes at least one of seed sensor data, yield data, and liquid application rate data.
6. The system of claim 1, wherein the location component data includes at least one of GPS data and real-time kinematics data.
7. The system of claim 1, wherein the processing system is integrated with the machine or implement, wherein the display device is removable from the machine.
8. A method, comprising: automatically transmitting, with a communication unit, raw data to a display device in response to a machine starting or an implement capable of being attached to the machine starting, the raw data including task information identifying at least one of an agricultural task to be performed and an implement to be used; automatically identifying at least one of an agricultural task and an implement to be used based on the task information; and generating data and maps from raw data based on the identified agricultural task or implement.
9. The method of claim 8, further comprising: displaying the generated data and maps on a graphical user interface of the display device.
10. The method of claim 8, wherein the identified agricultural task comprises harvesting and the raw data includes sensor data and location data to generate a yield map.
11. The method of claim 8, wherein the identified agricultural task comprises planting and the raw data includes sensor data and location data to generate a planting map.
12. The method of claim 8, wherein automatically identifying at least one of an agricultural task and an implement to be used comprises searching the raw data for data having a characteristic associated with the task information including at least one of an identifying portion associated with task information and a data unit size, length or frequency associated with task data.
13. The method of claim 12, wherein the task information includes an implement identifier that is associated with at least one of implement types, makes, or models.
14. The method of claim 12, wherein the task information includes controller or sensor signals having a frequency and the display device searches a database associating frequency of controller or sensor pulses with a type of agricultural application.
15. The method of claim 8, further comprising:
sending a query to a machine network of the machine or an implement network of the implement for requesting location information or identification of location information.
16. A method comprising: initiating a software application on a display device; determining, with a processing system, a communication unit, or the display device, whether at least one of automatic field identification and automatic task identification occurs based on initiation of the software application; displaying on a graphical user interface of the display device at least one of the determined automatic field identification and the automatic task identification if at least one of automatic field identification and automatic task identification occurs; and receiving input for correcting the automatic field identification with at least one alternative field if correction is needed when the automatic field identification occurs.
17. The method of claim 16, further comprising: receiving input for correcting the automatic task identification with at least one alternative task if correction is needed when the automatic task identification occurs.
18. The method of claim 16, further comprising: waiting for a subsequent determination of whether at least one of automatic field identification and automatic task identification occurs when no automatic field or task identification is initially determined to have occurred.
19. The method of claim 16, further comprising: generating alternative fields for the automatic field identification if appropriate; and sending the alternative fields to the display device for display on the graphical user interface.
20. The method of claim 16, further comprising: generating alternative tasks for the automatic task identification if appropriate; and sending the alternative tasks to the display device for display on the graphical user interface.
PCT/US2015/062501 2014-11-24 2015-11-24 System and methods for identifying fields and tasks WO2016086035A1 (en)

Priority Applications (7)

Application Number Priority Date Filing Date Title
CA2968365A CA2968365C (en) 2014-11-24 2015-11-24 System and methods for identifying fields and tasks
BR112017010698A BR112017010698A2 (en) 2014-11-24 2015-11-24 system and methods for identifying fields and tasks
EP15863682.9A EP3224734A4 (en) 2014-11-24 2015-11-24 System and methods for identifying fields and tasks
UAA201706467A UA124489C2 (en) 2014-11-24 2015-11-24 System and methods for identifying fields and tasks
AU2015353587A AU2015353587B2 (en) 2014-11-24 2015-11-24 System and methods for identifying fields and tasks
US15/532,940 US10817812B2 (en) 2014-11-24 2015-11-24 System and methods for identifying fields and tasks
ZA201704163A ZA201704163B (en) 2014-11-24 2017-06-19 System and methods for identifying fields and tasks

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201462083640P 2014-11-24 2014-11-24
US62/083,640 2014-11-24

Publications (1)

Publication Number Publication Date
WO2016086035A1 true WO2016086035A1 (en) 2016-06-02

Family

ID=56074993

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2015/062501 WO2016086035A1 (en) 2014-11-24 2015-11-24 System and methods for identifying fields and tasks

Country Status (8)

Country Link
US (1) US10817812B2 (en)
EP (1) EP3224734A4 (en)
AU (1) AU2015353587B2 (en)
BR (1) BR112017010698A2 (en)
CA (1) CA2968365C (en)
UA (1) UA124489C2 (en)
WO (1) WO2016086035A1 (en)
ZA (1) ZA201704163B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU2016287397B2 (en) 2015-06-30 2021-05-20 Climate Llc Systems and methods for image capture and analysis of agricultural fields
US10111415B2 (en) * 2016-03-01 2018-10-30 Capstan Ag Systems, Inc. Systems and methods for determining and visually depicting spray band length of an agricultural fluid application system
WO2018070924A1 (en) * 2016-10-10 2018-04-19 Ålö AB An agriculture operation monitoring system and monitoring method
DE102017201040A1 (en) * 2017-01-23 2018-07-26 Deere & Company Method and device for identifying a hitch
US11925949B1 (en) * 2018-10-16 2024-03-12 Optix Technologies, LLC Fluid metering component and spraying apparatus thereof
US20220350307A1 (en) * 2021-04-30 2022-11-03 Farmers Edge Inc. Method and System for Validating Execution of a Planned Agricultural Operation

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5961573A (en) 1996-11-22 1999-10-05 Case Corporation Height control of an agricultural tool in a site-specific farming system
US5995894A (en) * 1997-05-27 1999-11-30 Case Corporation System for analyzing spatially-variable harvest data by pass
US7802724B1 (en) * 2002-12-20 2010-09-28 Steven Paul Nohr Identifications and communications methods
US20060282467A1 (en) * 2005-06-10 2006-12-14 Pioneer Hi-Bred International, Inc. Field and crop information gathering system
US8635011B2 (en) * 2007-07-31 2014-01-21 Deere & Company System and method for controlling a vehicle in response to a particular boundary
US7739015B2 (en) * 2007-07-31 2010-06-15 Deere & Company System and method for controlling a vehicle with a sequence of vehicle events
FR2934739B1 (en) * 2008-08-04 2010-09-17 Samer Jarrah METHOD, SYSTEM AND MODULE FOR SCORING A USER TO A REMOTE WORKPLACE
KR100929811B1 (en) * 2009-08-21 2009-12-07 디,에스,에스 주식회사 Communication control device for fire sensing and emergency state
KR101108771B1 (en) * 2009-08-31 2012-02-24 주식회사 엘지유플러스 Dtmf signal transmission mobile communication terminal by motion sensing and method thereof
US20110270724A1 (en) * 2010-04-30 2011-11-03 Agco Corporation Agricultural inventory and invoice system
JP5760432B2 (en) * 2010-12-24 2015-08-12 富士通株式会社 Planting support method and planting support device
US20150370935A1 (en) * 2014-06-24 2015-12-24 360 Yield Center, Llc Agronomic systems, methods and apparatuses

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5467271A (en) * 1993-12-17 1995-11-14 Trw, Inc. Mapping and analysis system for precision farming applications
US20010016788A1 (en) * 1995-05-30 2001-08-23 Ag-Chem Equipment Company, Inc. System and method for creating agricultural decision and application maps for automated agricultural machines
US5978723A (en) * 1996-11-22 1999-11-02 Case Corporation Automatic identification of field boundaries in a site-specific farming system
US20030187560A1 (en) * 1998-07-15 2003-10-02 Keller Russell J. Methods and apparatus for precision agriculture operations utilizing real time kinematic global positioning system systems
US20030028321A1 (en) * 2001-06-29 2003-02-06 The Regents Of The University Of California Method and apparatus for ultra precise GPS-based mapping of seeds or vegetation during planting
US20100211594A1 (en) * 2007-06-14 2010-08-19 Stichting Imec Nederland Method of and system for sensor signal data analysis
CN101110161A (en) * 2007-08-31 2008-01-23 北京科技大学 System for automatic cab model recognition and automatic vehicle flowrate detection and method thereof
US20110134138A1 (en) * 2008-06-06 2011-06-09 Monsanto Technology Llc Generating Agricultural Information Products Using Remote Sensing
US8412419B1 (en) * 2009-09-17 2013-04-02 Helena Chemical Company System for mapping GIS layers
US20110313779A1 (en) * 2010-06-17 2011-12-22 Microsoft Corporation Augmentation and correction of location based data through user feedback
US20120001876A1 (en) * 2010-07-02 2012-01-05 Chervenka Kirk J Managing a display of a terminal device associated with a vehicle data bus

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3224734A4 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11544296B1 (en) * 2016-09-15 2023-01-03 Winfield Solutions, Llc Systems and methods for spatially-indexing agricultural content
EP3376447A1 (en) * 2017-03-15 2018-09-19 Amazonen-Werke H. Dreyer GmbH & Co. KG Agricultural data processing system and method for storing data of a working process
US20210325868A1 (en) * 2018-08-23 2021-10-21 Precision Planting Llc Expandable network architecture for communications between machines and implements

Also Published As

Publication number Publication date
US10817812B2 (en) 2020-10-27
ZA201704163B (en) 2019-11-27
CA2968365A1 (en) 2016-06-02
US20170344922A1 (en) 2017-11-30
AU2015353587A1 (en) 2017-06-29
UA124489C2 (en) 2021-09-29
AU2015353587B2 (en) 2021-05-20
EP3224734A1 (en) 2017-10-04
CA2968365C (en) 2021-08-17
EP3224734A4 (en) 2018-04-11
BR112017010698A2 (en) 2018-05-08

Similar Documents

Publication Publication Date Title
AU2015353587B2 (en) System and methods for identifying fields and tasks
EP3510851B1 (en) Systems and methods for customizing scale and corresponding views of data displays
US11690310B2 (en) Systems and method for monitoring, controlling, and displaying field operations
US11304363B2 (en) Seed firmer for passive seed orientation within agricultural fields
US11324162B2 (en) Seed firmer for seed orientation adjustment in agricultural fields
US20200103910A1 (en) Methods and systems for generating shared collaborative maps
US20240057508A1 (en) Data transfer
US20240295954A1 (en) Systems and Methods for Providing Field Views Including Enhanced Agricultural Maps Having a Data Layer and Image Data
US20220279705A1 (en) Method and systems for using sensors to determine relative seed or particle speed
US11493395B2 (en) Pressure measurement module for measuring inlet pressure and outlet pressure of a fluid application system
US20240224839A9 (en) Systems and Methods for Determining State Data for Agricultural Parameters and Providing Spatial State Maps
WO2024150057A1 (en) Method and system to provide a viewing and replay functionality for agricultural data layers
WO2024150056A1 (en) Method and system to provide a region explorer function for selecting regions of interest of agricultural data layers and to provide data metrics for the regions of interest
WO2024150055A1 (en) Methods and systems for adjusting a range feature of an editor tool to automatically adjust a range of data values in a range region and automatically adjust a corresponding field view of a data display

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15863682

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2968365

Country of ref document: CA

NENP Non-entry into the national phase

Ref country code: DE

REEP Request for entry into the european phase

Ref document number: 2015863682

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: A201706467

Country of ref document: UA

ENP Entry into the national phase

Ref document number: 2015353587

Country of ref document: AU

Date of ref document: 20151124

Kind code of ref document: A

REG Reference to national code

Ref country code: BR

Ref legal event code: B01A

Ref document number: 112017010698

Country of ref document: BR

ENP Entry into the national phase

Ref document number: 112017010698

Country of ref document: BR

Kind code of ref document: A2

Effective date: 20170522