EP3406124B1 - Vision-based system for acquiring crop residue data and related calibration methods - Google Patents


Info

Publication number
EP3406124B1
Authority
EP
European Patent Office
Prior art keywords
field
residue
crop residue
image data
imaged
Prior art date
Legal status
Active
Application number
EP18172495.6A
Other languages
German (de)
French (fr)
Other versions
EP3406124A1 (en)
Inventor
John H. Posselius
Current Assignee
CNH Industrial Belgium NV
Original Assignee
CNH Industrial Belgium NV
Priority date
Filing date
Publication date
Application filed by CNH Industrial Belgium NV filed Critical CNH Industrial Belgium NV
Publication of EP3406124A1 publication Critical patent/EP3406124A1/en
Application granted granted Critical
Publication of EP3406124B1 publication Critical patent/EP3406124B1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • G06V20/188 Vegetation
    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01B SOIL WORKING IN AGRICULTURE OR FORESTRY; PARTS, DETAILS, OR ACCESSORIES OF AGRICULTURAL MACHINES OR IMPLEMENTS, IN GENERAL
    • A01B49/00 Combined machines
    • A01B49/02 Combined machines with two or more soil-working tools of different kind
    • A01B49/027 Combined machines with two or more soil-working tools of different kind with a rotating, soil working support element, e.g. a roller
    • A01B63/00 Lifting or adjusting devices or arrangements for agricultural machines or implements
    • A01B63/14 Lifting or adjusting devices or arrangements for agricultural machines or implements for implements drawn by animals or tractors
    • A01B63/24 Tools or tool-holders adjustable relatively to the frame
    • A01B63/32 Tools or tool-holders adjustable relatively to the frame operated by hydraulic or pneumatic means without automatic control
    • A01B76/00 Parts, details or accessories of agricultural machines or implements, not provided for in groups A01B51/00 - A01B75/00
    • A01B79/00 Methods for working soil
    • A01B79/005 Precision agriculture
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/60 Type of objects
    • G06V20/68 Food, e.g. fruit or vegetables

Definitions

  • FIGS. 1 and 2 illustrate perspective views of one embodiment of a work vehicle 10 and an associated agricultural implement 12 in accordance with aspects of the present subject matter.
  • FIG. 1 illustrates a perspective view of the work vehicle 10 towing the implement 12 (e.g., across a field).
  • FIG. 2 illustrates a perspective view of the implement 12 shown in FIG. 1.
  • the work vehicle 10 is configured as an agricultural tractor.
  • the work vehicle 10 may be configured as any other suitable agricultural vehicle.
  • the work vehicle 10 includes a pair of front track assemblies 14, a pair of rear track assemblies 16 and a frame or chassis 18 coupled to and supported by the track assemblies 14, 16.
  • An operator's cab 20 may be supported by a portion of the chassis 18 and may house various input devices for permitting an operator to control the operation of one or more components of the work vehicle 10 and/or one or more components of the implement 12.
  • the work vehicle 10 may include an engine 22 (FIG. 3) and a transmission 24 (FIG. 3) mounted on the chassis 18.
  • the transmission 24 may be operably coupled to the engine 22 and may provide variably adjusted gear ratios for transferring engine power to the track assemblies 14, 16 via a drive axle assembly (not shown) (or via axles if multiple drive axles are employed).
  • the implement 12 may generally include a carriage frame assembly 30 configured to be towed by the work vehicle via a pull hitch or tow bar 32 in a travel direction of the vehicle (e.g., as indicated by arrow 34).
  • the carriage frame assembly 30 may be configured to support a plurality of ground-engaging tools, such as a plurality of shanks, disk blades, leveling blades, basket assemblies, and/or the like.
  • the various ground-engaging tools may be configured to perform a tillage operation across the field along which the implement 12 is being towed.
  • the carriage frame assembly 30 may include aft extending carrier frame members 36 coupled to the tow bar 32.
  • reinforcing gusset plates 38 may be used to strengthen the connection between the tow bar 32 and the carrier frame members 36.
  • the carriage frame assembly 30 may generally function to support a central frame 40, a forward frame 42 positioned forward of the central frame 40 in the direction of travel 34 of the work vehicle 10, and an aft frame 44 positioned aft of the central frame 40 in the direction of travel 34 of the work vehicle 10.
  • the central frame 40 may correspond to a shank frame configured to support a plurality of ground-engaging shanks 46.
  • the shanks 46 may be configured to till the soil as the implement 12 is towed across the field.
  • the central frame 40 may be configured to support any other suitable ground-engaging tools.
  • the forward frame 42 may correspond to a disk frame configured to support various gangs or sets 48 of disk blades 50.
  • each disk blade 50 may, for example, include both a concave side (not shown) and a convex side (not shown).
  • the various gangs 48 of disk blades 50 may be oriented at an angle relative to the travel direction 34 of the work vehicle 10 to promote more effective tilling of the soil.
  • the forward frame 42 may be configured to support any other suitable ground-engaging tools.
  • the aft frame 44 may also be configured to support a plurality of ground-engaging tools.
  • the aft frame 44 is configured to support a plurality of leveling blades 52 and rolling (or crumbler) basket assemblies 54.
  • any other suitable ground-engaging tools may be coupled to and supported by the aft frame 44, such as a plurality of closing disks.
  • the implement 12 may also include any number of suitable actuators (e.g., hydraulic cylinders) for adjusting the relative positioning, penetration depth, and/or down force associated with the various ground-engaging tools 46, 50, 52, 54.
  • the implement 12 may include one or more first actuators 56 coupled to the central frame 40 for raising or lowering the central frame 40 relative to the ground, thereby allowing the penetration depth and/or the down pressure of the shanks 46 to be adjusted.
  • the implement 12 may include one or more second actuators 58 coupled to the forward frame 42 to adjust the penetration depth and/or the down pressure of the disk blades 50.
  • the implement 12 may include one or more third actuators 60 coupled to the aft frame 44 to allow the aft frame 44 to be moved relative to the central frame 40, thereby allowing the relevant operating parameters of the ground-engaging tools 52, 54 supported by the aft frame 44 (e.g., the down pressure and/or the penetration depth) to be adjusted.
  • the configuration of the work vehicle 10 described above and shown in FIG. 1 is provided only to place the present subject matter in an exemplary field of use.
  • the present subject matter may be readily adaptable to any manner of work vehicle configuration.
  • a separate frame or chassis may be provided to which the engine, transmission, and drive axle assembly are coupled, a configuration common in smaller tractors.
  • Still other configurations may use an articulated chassis to steer the work vehicle 10, or rely on tires/wheels in lieu of the track assemblies 14, 16.
  • each frame section of the implement 12 may be configured to support any suitable type of ground-engaging tools, such as by installing closing disks on the aft frame 44 of the implement 12.
  • the work vehicle 10 and/or the implement 12 may include one or more imaging devices coupled thereto and/or supported thereon for capturing images or other image data associated with the field as an operation is being performed via the implement 12.
  • the imaging device(s) may be provided in operative association with the work vehicle 10 and/or the implement 12 such that the imaging device(s) has a field of view directed towards a portion(s) of the field disposed in front of, behind, and/or along one or both of the sides of the work vehicle 10 and/or the implement 12 as the implement 12 is being towed across the field.
  • the imaging device(s) may capture images from the tractor 10 and/or implement 12 of one or more portion(s) of the field being passed by the tractor 10 and/or implement 12.
  • the imaging device(s) may correspond to any suitable device(s) configured to capture images or other image data of the field that allow the field's soil to be distinguished from the crop residue remaining on top of the soil.
  • the imaging device(s) may correspond to any suitable camera(s), such as a single-spectrum camera or a multi-spectrum camera configured to capture images in the visible light range and/or infrared spectral range.
  • the camera(s) may correspond to a single lens camera configured to capture two-dimensional images or a stereo camera(s) having two or more lenses with a separate image sensor for each lens to allow the camera(s) to capture stereographic or three-dimensional images.
  • the imaging device(s) may correspond to any other suitable image capture device(s) and/or vision system(s) that is capable of capturing "images" or other image-like data that allow the crop residue existing on the soil to be distinguished from the soil.
  • work vehicle 10 and/or implement 12 may include any number of imaging device(s) 104 provided at any suitable location that allows images of the field to be captured as the vehicle 10 and implement 12 traverse through the field.
  • FIGS. 1 and 2 illustrate examples of various locations for mounting one or more imaging device(s) for capturing images of the field.
  • one or more imaging devices 104A may be coupled to the front of the work vehicle 10 such that the imaging device(s) 104A has a field of view 106 that allows it to capture images of an adjacent area or portion of the field disposed in front of the work vehicle 10.
  • the field of view 106 of the imaging device(s) 104A may be directed outwardly from the front of the work vehicle 10 along a plane or reference line that extends generally parallel to the travel direction 34 of the work vehicle 10.
  • one or more imaging devices 104B may also be coupled to one of the sides of the work vehicle 10 such that the imaging device(s) 104B has a field of view 106 that allows it to capture images of an adjacent area or portion of the field disposed along such side of the work vehicle 10.
  • the field of view 106 of the imaging device(s) 104B may be directed outwardly from the side of the work vehicle 10 along a plane or reference line that extends generally perpendicular to the travel direction 34 of the work vehicle 10.
  • one or more imaging devices 104C may be coupled to the rear of the implement 12 such that the imaging device(s) 104C has a field of view 106 that allows it to capture images of an adjacent area or portion of the field disposed aft of the implement.
  • the field of view 106 of the imaging device(s) 104C may be directed outwardly from the rear of the implement 12 along a plane or reference line that extends generally parallel to the travel direction 34 of the work vehicle 10.
  • one or more imaging devices 104D may also be coupled to one of the sides of the implement 12 such that the imaging device(s) 104D has a field of view 106 that allows it to capture images of an adjacent area or portion of the field disposed along such side of the implement 12.
  • the field of view 106 of the imaging device 104D may be directed outwardly from the side of the implement 12 along a plane or reference line that extends generally perpendicular to the travel direction 34 of the work vehicle 10.
  • the imaging device(s) 104 may be installed at any other suitable location that allows the device(s) to capture images of an adjacent portion of the field, such as by installing an imaging device(s) at or adjacent to the aft end of the work vehicle 10 and/or at or adjacent to the forward end of the implement 12. It should also be appreciated that, in several embodiments, the imaging devices 104 may be specifically installed at locations on the work vehicle 10 and/or the implement 12 to allow images to be captured of the field both before and after the performance of a field operation by the implement 12.
  • the forward imaging device 104A may capture images of the field prior to performance of the field operation while the aft imaging device 104C may capture images of the same portions of the field following the performance of the field operation.
  • Referring now to FIG. 3, a schematic view of one embodiment of a vision-based system 100 for estimating crop residue parameters is illustrated in accordance with aspects of the present subject matter.
  • the system 100 will be described herein with reference to the work vehicle 10 and the implement 12 described above with reference to FIGS. 1 and 2 .
  • the disclosed system 100 may generally be utilized with work vehicles having any suitable vehicle configuration and/or implements having any suitable implement configuration.
  • the system 100 may include a controller 102 and various other components configured to be communicatively coupled to and/or controlled by the controller 102, such as one or more imaging devices 104 and/or various components of the work vehicle 10 and/or the implement 12.
  • the controller 102 may be configured to receive images or other image data from the imaging device(s) 104 that depict portions of the field as an operation (e.g., a tillage operation) is being performed within the field. Based on an analysis of the image data received from the imaging device(s) 104, the controller 102 may be configured to estimate a first value for a crop residue parameter associated with the field (e.g., a percent crop residue coverage) using a first residue-estimating technique.
  • the controller 102 may be configured to analyze the same or similar images or other image data to estimate a second value for the crop residue parameter using a second residue-estimating technique that differs from the first residue-estimating technique. Based on a comparison between the estimated values for the crop residue parameter determined using the two differing techniques, the controller may, if necessary or desired, calibrate the crop residue data being generated using one of the residue-estimating techniques, such as by adjusting the first estimated value for the crop residue parameter determined using the first residue-estimating technique based on the second estimated value determined using the second residue-estimating technique.
  • the controller 102 may correspond to any suitable processor-based device(s), such as a computing device or any combination of computing devices.
  • the controller 102 may generally include one or more processor(s) 110 and associated memory devices 112 configured to perform a variety of computer-implemented functions (e.g., performing the methods, steps, algorithms, calculations and the like disclosed herein).
  • as used herein, the term "processor" refers not only to integrated circuits referred to in the art as being included in a computer, but also to a controller, a microcontroller, a microcomputer, a programmable logic controller (PLC), an application specific integrated circuit, and other programmable circuits.
  • the memory 112 may generally comprise memory element(s) including, but not limited to, computer readable medium (e.g., random access memory (RAM)), computer readable non-volatile medium (e.g., a flash memory), a floppy disk, a compact disc-read only memory (CD-ROM), a magneto-optical disk (MOD), a digital versatile disc (DVD) and/or other suitable memory elements.
  • Such memory 112 may generally be configured to store information accessible to the processor(s) 110, including data 114 that can be retrieved, manipulated, created and/or stored by the processor(s) 110 and instructions 116 that can be executed by the processor(s) 110.
  • the data 114 may be stored in one or more databases.
  • the memory 112 may include an image database 118 for storing image data received from the imaging device(s) 104.
  • the imaging device(s) 104 may be configured to continuously or periodically capture images of adjacent portion(s) of the field as an operation is being performed within the field.
  • the images transmitted to the controller 102 from the imaging device(s) 104 may be stored within the image database 118 for subsequent processing and/or analysis.
  • image data may include any suitable type of data received from the imaging device(s) 104 that allows for the crop residue coverage of a field to be analyzed, including photographs and other image-related data (e.g., scan data and/or the like).
  • the memory 112 may include a crop residue database 120 for storing information related to crop residue parameters for the field being processed.
  • the controller 102 may be configured to estimate or calculate one or more values for one or more crop residue parameters associated with the field, such as a value(s) for the percent crop residue coverage for an imaged portion(s) of the field (and/or a value(s) for the average percent crop residue coverage for the field).
  • the crop residue parameter(s) estimated or calculated by the controller 102 may then be stored within the crop residue database 120 for subsequent processing and/or analysis.
  • the memory 112 may also include a location database 122 storing location information about the work vehicle/implement 10, 12 and/or information about the field being processed (e.g., a field map).
  • the controller 102 may be communicatively coupled to a positioning device(s) 124 installed on or within the work vehicle 10 and/or on or within the implement 12.
  • the positioning device(s) 124 may be configured to determine the exact location of the work vehicle 10 and/or the implement 12 using a satellite navigation positioning system (e.g., a GPS system, a Galileo positioning system, the Global Navigation Satellite System (GLONASS), the BeiDou Satellite Navigation and Positioning system, and/or the like).
  • the location determined by the positioning device(s) 124 may be transmitted to the controller 102 (e.g., in the form of coordinates) and subsequently stored within the location database 122 for subsequent processing and/or analysis.
  • the location data stored within the location database 122 may also be correlated to the image data stored within the image database 118.
  • the location coordinates derived from the positioning device(s) 124 and the image(s) captured by the imaging device(s) 104 may both be time-stamped.
  • the time-stamped data may allow each image captured by the imaging device(s) 104 to be matched or correlated to a corresponding set of location coordinates received from the positioning device(s) 124, thereby allowing the precise location of the portion of the field depicted within a given image to be known (or at least capable of calculation) by the controller 102.
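As an illustration of the time-stamp correlation described above, the sketch below matches each image to the nearest-in-time location fix. It is a minimal example only; the function and data names are hypothetical, not part of the disclosed system.

```python
from bisect import bisect_left

def match_image_to_fix(image_time, fix_times, fixes):
    """Return the location fix whose timestamp is closest to image_time.

    fix_times must be sorted ascending; fixes[i] is the (latitude,
    longitude) pair reported by the positioning device at fix_times[i].
    """
    i = bisect_left(fix_times, image_time)
    if i == 0:
        return fixes[0]
    if i == len(fix_times):
        return fixes[-1]
    # Pick whichever neighboring fix is closer in time to the image.
    before, after = fix_times[i - 1], fix_times[i]
    return fixes[i] if (after - image_time) < (image_time - before) else fixes[i - 1]

# Example: geo-locate an image captured at t = 1.8 s.
fix_times = [0.0, 1.0, 2.0, 3.0]
fixes = [(46.5000, 6.6000), (46.5001, 6.6001), (46.5002, 6.6002), (46.5003, 6.6003)]
print(match_image_to_fix(1.8, fix_times, fixes))  # -> (46.5002, 6.6002)
```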
  • the controller 102 may also be configured to generate or update a corresponding field map associated with the field being processed. For example, in instances in which the controller 102 already includes a field map stored within its memory 112 that includes location coordinates associated with various points across the field, each image captured by the imaging device(s) 104 may be mapped or correlated to a given location within the field map. Alternatively, based on the location data and the associated image data, the controller 102 may be configured to generate a field map for the field that includes the geo-located images associated therewith.
  • the instructions 116 stored within the memory 112 of the controller 102 may be executed by the processor(s) 110 to implement an image analysis module 126.
  • the image analysis module 126 may be configured to analyze the images received by the imaging device(s) 104 using one or more residue-estimating techniques to allow the controller 102 to estimate one or more crop residue parameters associated with the field currently being processed.
  • the image analysis module 126 may be configured to implement two different residue-estimating techniques (e.g., first and second residue-estimating techniques), with each residue-estimating technique being based on a different computer-vision algorithm or any other suitable image-processing technique that allows the controller 102 to identify crop residue remaining on top of the soil.
  • the controller 102 may then determine two values for the crop residue parameter(s) associated with a given imaged portion of the field. Such values may then be stored within the crop residue database 120.
  • the residue-estimating techniques used by the image analysis module 126 to estimate the crop residue parameter(s) may correspond to any suitable computer-vision algorithms or image-processing techniques that allow the controller 102 to identify crop residue remaining on top of the soil.
  • the image analysis module may be configured to utilize both a computer vision-based blob analysis and a computer vision-based line transect method to estimate values for the crop residue parameter(s). The estimated values determined using each of such residue-estimating techniques may then be stored within the crop residue database 120 for subsequent analysis and/or processing.
  • the image analysis module 126 may be configured to implement any other suitable vision-based residue-estimating techniques to estimate the crop residue parameter(s).
  • the instructions 116 stored within the memory 112 of the controller 102 may also be executed by the processor(s) 110 to implement a calibration module 128.
  • the calibration module 128 may be configured to calibrate the crop residue data generated by the image analysis module 126 based on the estimated values determined using the differing residue-estimating techniques.
  • the calibration module 128 may be configured to compare the estimated value(s) of the crop residue parameter(s) determined using the first residue-estimating technique to the corresponding estimated value(s) of the crop residue parameter(s) determined using the second residue-estimating technique.
  • the calibration module 128 may be configured to calibrate or adjust the estimated value(s) determined using one of the residue-estimating techniques based on the estimated value(s) determined using the other residue-estimating technique.
  • the estimated value(s) determined using the second residue-estimating technique may be used to calibrate or adjust the estimated value(s) determined using the first residue-estimating technique.
  • the image analysis module 126 may analyze one or more images of an imaged portion of the field and determine a first estimated value of 45% crop residue coverage using the first residue-estimating technique and a second estimated value of 50% crop residue coverage using the second residue-estimating technique.
  • the calibration module 128 may then compare the first and second estimated values and determine that a +5% differential exists between the estimated values.
  • the calibration module 128 may then, in one embodiment, adjust the first estimated value and/or any future/past estimated values determined using the first residue-estimating technique based on the second estimated value and/or the differential determined between the first and second estimated values. For instance, the calibration module 128 may be configured to increase the first estimated value and/or any future/past estimated values determined using the first residue-estimating technique by 5% to ensure that the crop residue data generated using the first residue-estimating technique is consistent with the crop residue data generated using the second residue-estimating technique.
  • the calibration module 128 may be configured to analyze the estimated values determined for various different imaged portions of the field.
  • the calibration module 128 may be configured to compare the estimated values determined using the first and second residue-estimating techniques for each imaged portion of the field to determine an average differential existing between the first and second estimated values. The calibration module 128 may then adjust the first estimated value and/or any future/past estimated values determined using the first residue-estimating technique based on the average differential determined across the various imaged portions of the field.
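A minimal sketch of the calibration arithmetic described in the preceding items, assuming paired percent-coverage estimates are available from both techniques; all function names are hypothetical.

```python
def calibration_offset(first_estimates, second_estimates):
    """Average differential (second minus first) between paired
    percent-coverage estimates from the two residue-estimating
    techniques, taken over one or more imaged portions of the field."""
    diffs = [b - a for a, b in zip(first_estimates, second_estimates)]
    return sum(diffs) / len(diffs)

def calibrate(first_value, offset):
    """Adjust a technique-one estimate by the calibration offset so the
    two crop residue data sets agree."""
    return first_value + offset

# Single imaged portion, matching the example above: 45% vs. 50%.
offset = calibration_offset([45.0], [50.0])
print(offset, calibrate(45.0, offset))  # -> 5.0 50.0

# Several imaged portions: the average differential is applied instead.
offset = calibration_offset([45.0, 40.0, 42.0], [50.0, 44.0, 46.0])
print(round(offset, 2))  # -> 4.33
```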
  • although the present subject matter is generally described herein as using the second estimated value(s) determined via the second residue-estimating technique to calibrate or adjust the first estimated value(s) determined via the first residue-estimating technique, the configuration may be reversed, with the first estimated value(s) being used to calibrate or adjust the second estimated value(s).
  • the residue-estimating technique used as the calibration source may be selected based on any number of factors, including accuracy considerations, computer processing requirements, standards or regulations set for crop residue data and/or the like.
  • the instructions 116 stored within the memory 112 of the controller 102 may also be executed by the processor(s) 110 to implement a control module 129.
  • the control module 129 may be configured to adjust the operation of the work vehicle 10 and/or the implement 12 by controlling one or more components of the implement/vehicle 12, 10.
  • the control module 129 may be configured to fine-tune the operation of the work vehicle 10 and/or the implement 12 in a manner designed to adjust the amount of crop residue remaining in the field.
  • the control module 129 may be configured to adjust the operation of the work vehicle 10 and/or the implement 12 so as to increase or decrease the amount of crop residue remaining in the field when the estimated percent crop residue coverage for a given imaged portion of the field (or an average estimated percent crop residue coverage across multiple imaged portions of the field) differs from a target percentage set for the field.
  • controller 102 may be configured to implement various different control actions to adjust the operation of the work vehicle 10 and/or the implement 12 in a manner that increases or decreases the amount of crop residue remaining in the field.
  • the controller 102 may be configured to increase or decrease the operational or ground speed of the implement 12 to affect an increase or decrease in the crop residue coverage.
  • the controller 102 may be communicatively coupled to both the engine 22 and the transmission 24 of the work vehicle 10.
  • the controller 102 may be configured to adjust the operation of the engine 22 and/or the transmission 24 in a manner that increases or decreases the ground speed of the work vehicle 10 and, thus, the ground speed of the implement 12, such as by transmitting suitable control signals for controlling an engine or speed governor (not shown) associated with the engine 22 and/or transmitting suitable control signals for controlling the engagement/disengagement of one or more clutches (not shown) provided in operative association with the transmission 24.
  • the controller 102 may also be configured to adjust an operating parameter associated with the ground-engaging tools of the implement 12.
  • the controller 102 may be communicatively coupled to one or more valves 130 configured to regulate the supply of fluid (e.g., hydraulic fluid or air) to one or more corresponding actuators 56, 58, 60 of the implement 12.
  • the controller 102 may automatically adjust the penetration depth, the down force, and/or any other suitable operating parameter associated with the ground-engaging tools of the implement 12.
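The control decision itself can be reduced to a comparison against the target, as in the hedged sketch below. The actuator command format is invented for illustration; the assumption that deeper tool penetration buries more residue (reducing coverage) is a common agronomic rule of thumb, not a statement of the patented control law.

```python
def control_action(estimated_coverage, target_coverage, deadband=2.0):
    """Map a calibrated percent-coverage estimate to a corrective
    adjustment of the ground-engaging tools.

    Returns a hypothetical command dictionary; a real controller would
    instead drive the valves 130 regulating actuators 56, 58, 60.
    """
    error = estimated_coverage - target_coverage
    if abs(error) <= deadband:
        return {"penetration_depth_delta": 0.0}  # within tolerance, no change
    # Too much residue left behind -> till deeper; too little -> till shallower.
    return {"penetration_depth_delta": 0.5 if error > 0 else -0.5}

print(control_action(38.0, 30.0))  # -> {'penetration_depth_delta': 0.5}
```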
  • the controller 102 may also include a communications interface 132 to provide a means for the controller 102 to communicate with any of the various other system components described herein.
  • for example, one or more communicative links or interfaces may be provided between the communications interface 132 and the imaging device(s) 104 to allow images transmitted from the imaging device(s) 104 to be received by the controller 102.
  • one or more communicative links or interfaces 136 may be provided between the communications interface 132 and the positioning device(s) 124 to allow the location information generated by the positioning device(s) 124 to be received by the controller 102.
  • one or more communicative links or interfaces 138 may be provided between the communications interface 132 and the engine 22, the transmission 24, the control valves 130, and/or the like to allow the controller 102 to control the operation of such system components.
  • Referring now to FIG. 4, an example simplified image of a portion of a field that may be provided by one of the imaging device(s) 104 of the disclosed system 100 is illustrated in accordance with aspects of the present subject matter, particularly illustrating the field including crop residue 160 (indicated by cross-hatching) positioned on top of the soil 162.
  • the image analysis module 126 of the controller 102 may generally be configured to utilize any suitable computer-vision algorithms or image-processing techniques that allow the controller 102 to identify crop residue 160 remaining on top of the soil 162.
  • the vision-based technique used by the image analysis module 126 may rely upon the identification of one or more image characteristics captured by the imaging device(s) 104 to distinguish the crop residue 160 from the soil 162 contained within each image.
  • the controller 102 may be configured to implement a computer-vision algorithm that identifies the differences in the reflectivity or spectral absorption between the soil 162 and the crop residue 160 contained within each image being analyzed.
  • the controller 102 may be configured to utilize an edge-finding algorithm to identify or distinguish the crop residue 160 from the soil 162 contained within each image.
  • the controller 102 may be configured to utilize any suitable technique or methodology for calculating the percent crop residue coverage for the portion of the field contained within each image. For instance, as indicated above, the controller 102 may, in one embodiment, utilize a "blob analysis" in which the crop residue identified within each image via the associated computer-vision technique is represented as a "blob" or plurality of "blobs” encompassing a given area within the image. Specifically, as shown in FIG. 4 , the crop residue 160 depicted within the image is represented as cross-hatched blobs overlaying the soil 162.
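As a rough, hedged illustration of a blob-style coverage calculation (not the patented algorithm), the sketch below thresholds a grayscale image so that residue separates from soil by reflectivity, groups the residue pixels into connected "blobs", and reports the fraction of the frame they occupy. It assumes OpenCV and a scene in which residue appears brighter than the soil.

```python
import cv2
import numpy as np

def percent_residue_coverage(gray):
    """Estimate percent crop residue coverage from a grayscale image.

    Otsu's method splits the brighter residue from the darker soil;
    connected components then play the role of the residue "blobs".
    """
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    num_labels, _, stats, _ = cv2.connectedComponentsWithStats(mask, connectivity=8)
    # Label 0 is the background (soil); sum the areas of all residue blobs.
    blob_area = stats[1:, cv2.CC_STAT_AREA].sum() if num_labels > 1 else 0
    return 100.0 * blob_area / gray.size

# Synthetic example: a bright residue band covering 20% of the frame.
img = np.zeros((100, 100), dtype=np.uint8)
img[40:60, :] = 200
print(percent_residue_coverage(img))  # -> 20.0
```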
  • Referring now to FIG. 5, an example simplified view of a continuous section 170 of an imaged portion of a field is illustrated in accordance with aspects of the present subject matter.
  • FIG. 5 illustrates a plurality of images captured by one or more of the imaging device(s) 104 of the disclosed system 100 that collectively depict a continuous section 170 of the field.
  • the field of view 106 of the imaging device(s) 104 may allow the imaging device(s) 104 to capture an image of the field that spans a given field distance.
  • multiple images may be stitched together or otherwise analyzed in combination. For instance, in the example view shown in FIG. 5 , a plurality of images captured by one of the imaging device(s) 104 have been stitched together (e.g., the separate images being indicated by the dashed horizontal lines) to provide a view of a continuous section 170 of the field that spans across a predetermined field length 172.
  • the controller 102 may be configured to identify which images can be used to collectively depict a continuous section of the field using any suitable methodology or technique.
  • the images provided by the imaging device(s) 104 may be time-stamped.
  • the controller 102 may be configured to stitch together or otherwise access the images captured by the imaging device(s) 104 that collectively depict a continuous field section 170 spanning across the predetermined field length 172.
  • the controller 102 may be configured to utilize any suitable image-processing algorithm that allows the controller 102 to identify the images (or portions of images) that collectively depict a continuous section of the field.
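One simple way to pick the frames that span a predetermined field length is to integrate the ground speed over the image time stamps. The sketch below assumes a constant, known ground speed; the names are hypothetical, and the real system could equally use the geo-located images directly.

```python
def frames_spanning_length(frame_times, ground_speed, target_length):
    """Select consecutive time-stamped frames that together cover at
    least target_length of field, assuming constant ground speed.

    frame_times: ascending capture times (s); ground_speed in m/s;
    target_length in m. Returns the indices of the selected frames.
    """
    selected = [0]
    for i in range(1, len(frame_times)):
        selected.append(i)
        covered = (frame_times[i] - frame_times[0]) * ground_speed
        if covered >= target_length:
            return selected
    return selected  # target length not yet reached; keep accumulating

# Frames every 0.5 s at 3 m/s cover 1.5 m of new ground per frame.
times = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5]
print(frames_spanning_length(times, 3.0, 6.0))  # -> [0, 1, 2, 3, 4]
```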
  • the image analysis module 126 of the controller 102 may, in several embodiments, be configured to implement a computer vision-based line transect method to estimate the percent crop residue coverage for the imaged portion of the field. Specifically, in several embodiments, the image analysis module 126 may access the images collectively depicting the continuous imaged section 170 of the field and apply a known scale 174 to such continuous imaged section 170 of the field such that a plurality of reference points 176 are defined along the continuous imaged field section 170 that are spaced apart evenly across the predetermined field length 172.
  • the images may be analyzed to identify the number or percentage of reference points 176 that are aligned with or intersect crop residue within the continuous imaged section 170 of the field. Such identified number or percentage of the reference points 176 may then correspond to or may be used to calculate the percent crop residue coverage within the continuous imaged section 170 of the field.
  • the scale 174 applied to the continuous imaged section 170 of the field may divide the predetermined field length 172 into one hundred distinct field sections such that one hundred reference points 176 are evenly spaced apart along the predetermined field length 172.
  • the imposed scale 174 may divide the predetermined field length 172 into one hundred one-foot sections, with a reference point 176 being defined at each foot mark along the predetermined field length 172.
  • the predetermined field length 172 may correspond to any other suitable field length, such as a fifty-foot field section, a twenty-five-foot field section, or any other suitable field length.
  • any other suitable scale 174 may be applied to the continuous imaged section 170 of the field to allow any suitable number of evenly spaced reference points 176 to be defined across the predetermined field length 172.
  • a fifty-point scale or a twenty-five-point scale may be applied such that fifty or twenty-five evenly spaced reference points 176, respectively, are defined across the predetermined field length 172.
  • the image analysis module 126 may be configured to count only those reference points 176 within the images that are aligned with or intersect crop residue exceeding a given residue size threshold for purposes of calculating the percent crop residue coverage for the continuous imaged section 170 of the field.
  • the size threshold for the crop residue may be selected based on the minimum residue size capable of intercepting rain drops.
  • the residue size threshold may correspond to a residue diameter of one-eighth of an inch (1/8" or 0,3175 cm).
  • if the crop residue aligned with or intersecting one of the reference points 176 is determined (e.g., via a suitable image analysis technique) to have a cross-wise dimension within the image that exceeds 1/8" (or 0,3175 cm), such reference point 176 may be counted for purposes of calculating the percent crop residue coverage for the continuous imaged section 170 of the field.
  • however, if the crop residue aligned with or intersecting one of the reference points 176 is determined to have a cross-wise dimension within the image that is less than 1/8" (or 0,3175 cm), such reference point 176 may not be counted for purposes of calculating the percent crop residue coverage.
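A hedged sketch of the line transect count just described, assuming the stitched section has already been reduced to a binary residue mask whose long axis runs down axis 0; the pixel-width threshold stands in for the 1/8-inch minimum residue size, and all names are illustrative.

```python
import numpy as np

def transect_coverage(residue_mask, n_points=100, min_run_px=4):
    """Estimate percent coverage with a computer vision-based line transect.

    residue_mask: 2-D boolean array for the stitched field strip, True
    where residue was detected. n_points evenly spaced reference points
    are placed along the centerline; a point is counted only when it
    falls on residue whose cross-wise run of True pixels is at least
    min_run_px wide (a stand-in for the minimum residue size).
    """
    length, width = residue_mask.shape
    col = width // 2  # centerline of the strip
    rows = np.linspace(0, length - 1, n_points).astype(int)
    hits = 0
    for r in rows:
        if not residue_mask[r, col]:
            continue
        # Measure the cross-wise extent of the residue piece at this point.
        line = residue_mask[r, :]
        left = col
        while left > 0 and line[left - 1]:
            left -= 1
        right = col
        while right < width - 1 and line[right + 1]:
            right += 1
        if (right - left + 1) >= min_run_px:
            hits += 1
    return 100.0 * hits / n_points

# Example: residue covering the first half of the strip.
mask = np.zeros((1000, 50), dtype=bool)
mask[:500, :] = True
print(transect_coverage(mask))  # -> 50.0
```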
  • the image analysis module 126 may be configured to perform the above-referenced analysis for multiple imaged sections of the field. For example, the image analysis module 126 may access images captured by the imaging device(s) 104 that collectively depict several different continuous imaged sections of the field, with each continuous imaged field section spanning a predetermined field length. Thereafter, for each continuous imaged section of the field, the image analysis module 126 may apply a known scale to such continuous imaged field section such that a plurality of reference points are defined along the continuous imaged field section that are spaced apart evenly across the predetermined field length.
  • the images associated with each continuous imaged section of the field may then be analyzed to identify the number or percentage of reference points that are aligned with or intersect crop residue within such continuous imaged section of the field, thereby allowing a value for the percent crop residue coverage to be determined for each continuous imaged field section.
  • the image analysis module 126 may then calculate an average percent crop residue coverage based on the residue coverage values calculated for the various continuous imaged field sections. In doing so, it may be desirable for the average percent crop residue coverage to be calculated based on the residue coverage values determined for at least five continuous imaged field sections, thereby allowing a desirable confidence level to be obtained for the calculated average.
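Continuing the sketch above (and reusing its transect_coverage function), the averaging step with the five-section confidence guideline might look like this:

```python
def average_coverage(section_masks, min_sections=5):
    """Average the transect estimates over several continuous imaged
    field sections, declining to report a value until at least
    min_sections are available."""
    if len(section_masks) < min_sections:
        return None  # not enough sections for a confident average yet
    return sum(transect_coverage(m) for m in section_masks) / len(section_masks)
```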
  • Referring now to FIG. 6, a flow diagram of one embodiment of a method 200 for calibrating crop residue data acquired using a vision-based system is illustrated in accordance with aspects of the present subject matter.
  • the method 200 will be described herein with reference to the work vehicle 10 and the implement 12 shown in FIGS. 1 and 2 , as well as the various system components shown in FIG. 3 .
  • the disclosed method 200 may be implemented with work vehicles and/or implements having any other suitable configurations and/or within systems having any other suitable system configuration.
  • although FIG. 6 depicts steps performed in a particular order for purposes of illustration and discussion, the methods discussed herein are not limited to any particular order or arrangement.
  • steps of the methods disclosed herein can be omitted, rearranged, combined, and/or adapted in various ways without deviating from the scope of the present disclosure.
  • the method 200 may include controlling the operation of at least one of an implement or a work vehicle as the implement is being towed by the work vehicle across a field.
  • the controller 102 of the disclosed system 100 may be configured to control the operation of the work vehicle 10 and/or the implement 12, such as by controlling one or more components of the work vehicle 10 and/or the implement 12 to allow an operation to be performed within the field (e.g., a tillage operation).
  • the method 200 may include receiving image data associated with an imaged portion of the field.
  • the controller 102 may be coupled to one or more imaging devices 104 configured to capture images of various portions of the field.
  • the method 200 may include analyzing the image data using a first residue-estimating technique to determine a first estimated value of a crop residue parameter for the imaged portion of the field.
  • the image analysis module 126 of the controller 102 may be configured to implement a vision-based residue-estimating technique to estimate a crop residue parameter for the imaged portion of the field, such as by estimating the percent crop residue coverage for the imaged portion of the field using a computer vision-based blob analysis or using a computer vision-based line transect method.
  • the method 200 may include analyzing the image data using a second residue-estimating technique to determine a second estimated value of the crop residue parameter for the imaged portion of the field.
  • the image analysis module 126 of the controller 102 may, in accordance with aspects of the present subject matter, be configured to implement two different vision-based residue-estimating techniques for estimating a given crop residue parameter for the imaged portion of the field.
  • if the first residue-estimating technique corresponds to a computer vision-based blob analysis, the second residue-estimating technique may, for example, correspond to a computer vision-based line transect method, or vice versa.
  • the controller 102 may determine two separate estimated values for the crop residue parameter using the two different residue-estimating techniques.
  • the method 200 may include adjusting at least one of the first estimated value or one or more additional estimated values of the crop residue parameter obtained using the first residue-estimating technique based on at least one of the second estimated value or the differential between first and second estimated values.
  • the controller 102 may be configured to adjust the first estimated value determined using the first residue-estimating technique based on the second estimated value determined using the second residue-estimating technique and/or based on the differential existing between the first and second estimated values.
  • the controller 102 may, in one embodiment, adjust the estimated percent crop residue coverage associated with the first residue-estimating technique to match the percent crop residue coverage associated with the second residue-estimating technique (e.g., by reducing the percent crop residue coverage from 40% to 32%).
  • the controller 102 may also utilize the differential defined between the first and second estimated values to adjust any past or future estimated values determined using the first residue-estimating technique, such as by applying a -8% modifier to each estimated value determined using the first residue-estimating technique.
  • the method 200 may also include any additional steps or method elements consistent with the disclosure provided herein.
  • the method 200 may also include actively adjusting the operation of the implement 12 and/or the work vehicle 10 when the adjusted value for the first estimated value and/or the adjusted value(s) for the one or more additional estimated values determined using the first residue-estimating technique differs from a target value set for the crop residue parameter.
  • the controller 102 may be configured to actively adjust the operation of the work vehicle 10 and/or the implement 12 in a manner that increases or decreases the amount of crop residue remaining within the field following the operation being performed (e.g., a tillage operation), such as by adjusting the ground speed at which the implement 12 is being towed and/or by adjusting one or more operating parameters associated with the ground-engaging elements of the implement 12.

Description

    FIELD OF THE INVENTION
  • The present subject matter relates generally to a vision-based system for automatically acquiring crop residue data while an operation (e.g., a tillage operation) is being performed within a field and, more particularly, to related methods for calibrating a vision-based system used to acquire crop residue data.
    BACKGROUND OF THE INVENTION
  • Crop residue generally refers to the vegetation (e.g., straw, chaff, husks, cobs) remaining on the soil surface following the performance of a given agricultural operation, such as a harvesting operation or a tillage operation. For various reasons, it is important to maintain a given amount of crop residue within a field following an agricultural operation. Specifically, crop residue remaining within the field can help in maintaining the content of organic matter within the soil and can also serve to protect the soil from wind and water erosion. However, in some cases, leaving an excessive amount of crop residue within a field can have a negative effect on the soil's productivity potential, such as by slowing down the warming of the soil at planting time and/or by slowing down seed germination. As such, the ability to monitor and/or adjust the amount of crop residue remaining within a field can be very important to maintaining a healthy, productive field, particularly when it comes to performing tillage operations.
  • In this regard, vision-based systems have been developed that attempt to estimate crop residue coverage from images captured of the field. However, such vision-based systems suffer from various drawbacks or disadvantages, particularly with reference to the accuracy of the crop residue estimates provided through the use of computer-aided image processing techniques.
  • US 2017/112043 describes a system and method for assessing the amount or size of residue in a crop field when conducting tillage or planting operations. The described agricultural or crop care implement can automatically respond, self-adjust or be manually (e.g. by command) adjusted to work with the amount and type of residue detected. US 2016/134844 describes a control system and computer-implemented method for monitoring residue coverage and controlling various operations based on residue coverage.
  • An improved vision-based system for acquiring crop residue data and related methods for calibrating such a system to improve the accuracy of the crop residue estimates provided therewith would be welcomed in the technology.
    BRIEF DESCRIPTION OF THE INVENTION
  • Aspects and advantages of the invention will be set forth in part in the following description, or may be obvious from the description, or may be learned through practice of the invention.
  • In one aspect, the present subject matter is directed to a method for calibrating crop residue data for a field acquired using a vision-based system. The method may include controlling, with a computing device, an operation of at least one of an implement or a work vehicle as the implement is being towed by the work vehicle across the field and receiving, with the computing device, image data associated with an imaged portion of the field. In addition, the method may include analyzing, with the computing device, the image data using a first residue-estimating technique to determine a first estimated value of a crop residue parameter for the imaged portion of the field and analyzing, with the computing device, the image data using a second residue-estimating technique to determine a second estimated value of the crop residue parameter for the imaged portion of the field, wherein the second residue-estimating technique differs from the first residue-estimating technique. Moreover, when a differential exists between the first and second estimated values, the method may also include adjusting at least one of the first estimated value or one or more additional estimated values of the crop residue parameter determined using the first residue-estimating technique based on at least one of the second estimated value or the differential between first and second estimated values.
  • In another aspect, the present subject matter is directed to a vision-based system for estimating and adjusting crop residue parameters as an implement is being towed across a field by a work vehicle. The system may include an imaging device installed relative to one of the work vehicle or the implement such that the imaging device is configured to capture images of the field. The system may also include a controller communicatively coupled to the imaging device, with the controller including a processor and associated memory. The memory may store instructions that, when implemented by the processor, configure the controller to receive, from the imaging device, image data associated with an imaged portion of the field, analyze the image data using a first residue-estimating technique to determine a first estimated value of a crop residue parameter for the imaged portion of the field, and analyze the image data using a second residue-estimating technique to determine a second estimated value of the crop residue parameter for the imaged portion of the field, wherein the second residue-estimating technique differs from the first residue-estimating technique. Moreover, when a differential exists between the first and second estimated values, the controller may be configured to adjust at least one of the first estimated value or one or more additional estimated values of the crop residue parameter determined using the first residue-estimating technique based on at least one of the second estimated value or the differential between first and second estimated values.
  • These and other features, aspects and advantages of the present invention will become better understood with reference to the following description and appended claims. The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and, together with the description, serve to explain the principles of the invention.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • A full and enabling disclosure of the present invention, including the best mode thereof, directed to one of ordinary skill in the art, is set forth in the specification, which makes reference to the appended figures, in which:
    • FIG. 1 illustrates a perspective view of one embodiment of a work vehicle towing an implement in accordance with aspects of the present subject matter;
    • FIG. 2 illustrates a perspective view of the implement shown in FIG. 1;
    • FIG. 3 illustrates a schematic view of one embodiment of a vision-based system for acquiring crop residue data in accordance with aspects of the present subject matter;
    • FIG. 4 illustrates an example, simplified view of an image of a field acquired using an imaging device(s) of the disclosed system, particularly illustrating how the image may be analyzed using one embodiment of a vision-based residue-estimating technique in accordance with aspects of the present subject matter;
    • FIG. 5 illustrates an example, simplified view of a continuous imaged section of a field acquired using an imaging device(s) of the disclosed system, particularly illustrating how the images associated with the continuous imaged section of the field may be analyzed using another embodiment of a vision-based residue-estimating technique in accordance with aspects of the present subject matter;
    • FIG. 6 illustrates a flow diagram of one embodiment of a method for calibrating crop residue data for a field acquired using a vision-based system in accordance with aspects of the present subject matter.
    DETAILED DESCRIPTION OF THE INVENTION
  • Reference now will be made in detail to embodiments of the invention, one or more examples of which are illustrated in the drawings. Each example is provided by way of explanation of the invention, not limitation of the invention. In fact, it will be apparent to those skilled in the art that various modifications and variations can be made in the present invention without departing from the scope or spirit of the invention. For instance, features illustrated or described as part of one embodiment can be used with another embodiment to yield a still further embodiment. Thus, it is intended that the present invention covers such modifications and variations as come within the scope of the appended claims and their equivalents.
  • In general, the present subject matter is directed to a vision-based system for acquiring crop residue data associated with a field. In addition, the present subject matter is directed to methods for calibrating crop residue data acquired using a vision-based system. Specifically, in several embodiments, one or more imaging devices (e.g., a camera(s)) may be provided in operative association with a work vehicle and/or an associated implement to capture images of a field as an operation (e.g., a tillage operation) is being performed within the field. The images may then be automatically analyzed via an associated controller using two different computer vision-based techniques to estimate a crop residue parameter for the imaged portion of the field (e.g., the percent crop residue coverage). For instance, a first vision-based residue-estimating technique (e.g., a computer vision-based blob analysis or other data extraction technique) may be used to determine a first estimated value for the crop residue parameter associated with the imaged portion of the field while a second vision-based residue-estimating technique (e.g., a computer vision-based line transect method) may be used to determine a second estimated value for the crop residue parameter associated with the same imaged portion of the field. The first and second estimated values determined using the differing residue-estimating techniques may then be compared to determine whether a differential exists between the estimated values. In the event that the separate residue-estimating techniques provide differing values for the crop residue parameter, the estimated value determined using one of the residue-estimating techniques may be calibrated or adjusted based on the estimated value determined using the other residue-estimating technique (or based on the differential between the two estimated values) to improve the accuracy of the crop residue data.
  • Referring now to the drawings, FIGS. 1 and 2 illustrate perspective views of one embodiment of a work vehicle 10 and an associated agricultural implement 12 in accordance with aspects of the present subject matter. Specifically, FIG. 1 illustrates a perspective view of the work vehicle 10 towing the implement 12 (e.g., across a field). Additionally, FIG. 2 illustrates a perspective view of the implement 12 shown in FIG. 1. As shown in the illustrated embodiment, the work vehicle 10 is configured as an agricultural tractor. However, in other embodiments, the work vehicle 10 may be configured as any other suitable agricultural vehicle.
  • As particularly shown in FIG. 1, the work vehicle 10 includes a pair of front track assemblies 14, a pair of rear track assemblies 16 and a frame or chassis 18 coupled to and supported by the track assemblies 14, 16. An operator's cab 20 may be supported by a portion of the chassis 18 and may house various input devices for permitting an operator to control the operation of one or more components of the work vehicle 10 and/or one or more components of the implement 12. Additionally, as is generally understood, the work vehicle 10 may include an engine 22 (FIG. 3) and a transmission 24 (FIG. 3) mounted on the chassis 18. The transmission 24 may be operably coupled to the engine 22 and may provide variably adjusted gear ratios for transferring engine power to the track assemblies 14, 16 via a drive axle assembly (not shown) (or via axles if multiple drive axles are employed).
  • Moreover, as shown in FIGS. 1 and 2, the implement 12 may generally include a carriage frame assembly 30 configured to be towed by the work vehicle via a pull hitch or tow bar 32 in a travel direction of the vehicle (e.g., as indicated by arrow 34). As is generally understood, the carriage frame assembly 30 may be configured to support a plurality of ground-engaging tools, such as a plurality of shanks, disk blades, leveling blades, basket assemblies, and/or the like. In several embodiments, the various ground-engaging tools may be configured to perform a tillage operation across the field along which the implement 12 is being towed.
  • As particularly shown in FIG. 2, the carriage frame assembly 30 may include aft extending carrier frame members 36 coupled to the tow bar 32. In addition, reinforcing gusset plates 38 may be used to strengthen the connection between the tow bar 32 and the carrier frame members 36. In several embodiments, the carriage frame assembly 30 may generally function to support a central frame 40, a forward frame 42 positioned forward of the central frame 40 in the direction of travel 34 of the work vehicle 10, and an aft frame 44 positioned aft of the central frame 40 in the direction of travel 34 of the work vehicle 10. As shown in FIG. 2, in one embodiment, the central frame 40 may correspond to a shank frame configured to support a plurality of ground-engaging shanks 46. In such an embodiment, the shanks 46 may be configured to till the soil as the implement 12 is towed across the field. However, in other embodiments, the central frame 40 may be configured to support any other suitable ground-engaging tools.
  • Additionally, as shown in FIG. 2, in one embodiment, the forward frame 42 may correspond to a disk frame configured to support various gangs or sets 48 of disk blades 50. In such an embodiment, each disk blade 50 may, for example, include both a concave side (not shown) and a convex side (not shown). In addition, the various gangs 48 of disk blades 50 may be oriented at an angle relative to the travel direction 34 of the work vehicle 10 to promote more effective tilling of the soil. However, in other embodiments, the forward frame 42 may be configured to support any other suitable ground-engaging tools.
  • Moreover, similar to the central and forward frames 40, 42, the aft frame 44 may also be configured to support a plurality of ground-engaging tools. For instance, in the illustrated embodiment, the aft frame is configured to support a plurality of leveling blades 52 and rolling (or crumbler) basket assemblies 54. However, in other embodiments, any other suitable ground-engaging tools may be coupled to and supported by the aft frame 44, such as a plurality of closing disks.
  • In addition, the implement 12 may also include any number of suitable actuators (e.g., hydraulic cylinders) for adjusting the relative positioning, penetration depth, and/or down force associated with the various ground-engaging tools 46, 50, 52, 54. For instance, the implement 12 may include one or more first actuators 56 coupled to the central frame 40 for raising or lowering the central frame 40 relative to the ground, thereby allowing the penetration depth and/or the down pressure of the shanks 46 to be adjusted. Similarly, the implement 12 may include one or more second actuators 58 coupled to the forward frame 42 to adjust the penetration depth and/or the down pressure of the disk blades 50. Moreover, the implement 12 may include one or more third actuators 60 coupled to the aft frame 44 to allow the aft frame 44 to be moved relative to the central frame 40, thereby allowing the relevant operating parameters of the ground-engaging tools 52, 54 supported by the aft frame 44 (e.g., the down pressure and/or the penetration depth) to be adjusted.
  • It should be appreciated that the configuration of the work vehicle 10 described above and shown in FIG. 1 is provided only to place the present subject matter in an exemplary field of use. Thus, it should be appreciated that the present subject matter may be readily adaptable to any manner of work vehicle configuration. For example, in an alternative embodiment, a separate frame or chassis may be provided to which the engine, transmission, and drive axle assembly are coupled, a configuration common in smaller tractors. Still other configurations may use an articulated chassis to steer the work vehicle 10, or rely on tires/wheels in lieu of the track assemblies 14, 16.
  • It should also be appreciated that the configuration of the implement 12 described above and shown in FIGS. 1 and 2 is only provided for exemplary purposes. Thus, it should be appreciated that the present subject matter may be readily adaptable to any manner of implement configuration. For example, as indicated above, each frame section of the implement 12 may be configured to support any suitable type of ground-engaging tools, such as by installing closing disks on the aft frame 44 of the implement 12.
  • Additionally, in accordance with aspects of the present subject matter, the work vehicle 10 and/or the implement 12 may include one or more imaging devices coupled thereto and/or supported thereon for capturing images or other image data associated with the field as an operation is being performed via the implement 12. Specifically, in several embodiments, the imaging device(s) may be provided in operative association with the work vehicle 10 and/or the implement 12 such that the imaging device(s) has a field of view directed towards a portion(s) of the field disposed in front of, behind, and/or along one or both of the sides of the work vehicle 10 and/or the implement 12 as the implement 12 is being towed across the field. As such, the imaging device(s) may capture images from the tractor 10 and/or implement 12 of one or more portion(s) of the field being passed by the tractor 10 and/or implement 12.
  • In general, the imaging device(s) may correspond to any suitable device(s) configured to capture images or other image data of the field that allow the field's soil to be distinguished from the crop residue remaining on top of the soil. For instance, in several embodiments, the imaging device(s) may correspond to any suitable camera(s), such as a single-spectrum camera or a multi-spectrum camera configured to capture images in the visible light range and/or infrared spectral range. Additionally, in a particular embodiment, the camera(s) may correspond to a single-lens camera configured to capture two-dimensional images or a stereo camera(s) having two or more lenses with a separate image sensor for each lens to allow the camera(s) to capture stereographic or three-dimensional images. Alternatively, the imaging device(s) may correspond to any other suitable image capture device(s) and/or vision system(s) that is capable of capturing "images" or other image-like data that allow the crop residue existing on the soil to be distinguished from the soil.
  • It should be appreciated that the work vehicle 10 and/or the implement 12 may include any number of imaging devices 104 provided at any suitable location that allows images of the field to be captured as the vehicle 10 and implement 12 traverse the field. For instance, FIGS. 1 and 2 illustrate examples of various locations for mounting one or more imaging device(s) for capturing images of the field. Specifically, as shown in FIG. 1, in one embodiment, one or more imaging devices 104A may be coupled to the front of the work vehicle 10 such that the imaging device(s) 104A has a field of view 106 that allows it to capture images of an adjacent area or portion of the field disposed in front of the work vehicle 10. For instance, the field of view 106 of the imaging device(s) 104A may be directed outwardly from the front of the work vehicle 10 along a plane or reference line that extends generally parallel to the travel direction 34 of the work vehicle 10. In addition to such imaging device(s) 104A (or as an alternative thereto), one or more imaging devices 104B may also be coupled to one of the sides of the work vehicle 10 such that the imaging device(s) 104B has a field of view 106 that allows it to capture images of an adjacent area or portion of the field disposed along such side of the work vehicle 10. For instance, the field of view 106 of the imaging device(s) 104B may be directed outwardly from the side of the work vehicle 10 along a plane or reference line that extends generally perpendicular to the travel direction 34 of the work vehicle 10.
  • Similarly, as shown in FIG. 2, in one embodiment, one or more imaging devices 104C may be coupled to the rear of the implement 12 such that the imaging device(s) 104C has a field of view 106 that allows it to capture images of an adjacent area or portion of the field disposed aft of the implement. For instance, the field of view 106 of the imaging device(s) 104C may be directed outwardly from the rear of the implement 12 along a plane or reference line that extends generally parallel to the travel direction 34 of the work vehicle 10. In addition to such imaging device(s) 104C (or as an alternative thereto), one or more imaging devices 104D may also be coupled to one of the sides of the implement 12 such that the imaging device(s) 104D has a field of view 106 that allows it to capture images of an adjacent area or portion of the field disposed along such side of the implement 12. For instance, the field of view 106 of the imaging device 104D may be directed outwardly from the side of the implement 12 along a plane or reference line that extends generally perpendicular to the travel direction 34 of the work vehicle 10.
  • It should be appreciated that, in alternative embodiments, the imaging device(s) 104 may be installed at any other suitable location that allows the device(s) to capture images of an adjacent portion of the field, such as by installing an imaging device(s) at or adjacent to the aft end of the work vehicle 10 and/or at or adjacent to the forward end of the implement 12. It should also be appreciated that, in several embodiments, the imaging devices 104 may be specifically installed at locations on the work vehicle 10 and/or the implement 12 to allow images to be captured of the field both before and after the performance of a field operation by the implement 12. For instance, by installing the imaging device 104A at the forward end of the work vehicle 10 and the imaging device 104C at the aft end of the implement 12, the forward imaging device 104A may capture images of the field prior to performance of the field operation while the aft imaging device 104C may capture images of the same portions of the field following the performance of the field operation.
  • Referring now to FIG. 3, a schematic view of one embodiment of a vision-based system 100 for estimating crop residue parameters is illustrated in accordance with aspects of the present subject matter. In general, the system 100 will be described herein with reference to the work vehicle 10 and the implement 12 described above with reference to FIGS. 1 and 2. However, it should be appreciated that the disclosed system 100 may generally be utilized with work vehicles having any suitable vehicle configuration and/or implements having any suitable implement configuration.
  • In several embodiments, the system 100 may include a controller 102 and various other components configured to be communicatively coupled to and/or controlled by the controller 102, such as one or more imaging devices 104 and/or various components of the work vehicle 10 and/or the implement 12. As will be described in greater detail below, the controller 102 may be configured to receive images or other image data from the imaging device(s) 104 that depict portions of the field as an operation (e.g., a tillage operation) is being performed within the field. Based on an analysis of the image data received from the imaging device(s) 104, the controller 102 may be configured to estimate a first value for a crop residue parameter associated with the field (e.g., a percent crop residue coverage) using a first residue-estimating technique. Thereafter, the controller 102 may be configured to analyze the same or similar images or other image data to estimate a second value for the crop residue parameter using a second residue-estimating technique that differs from the first residue-estimating technique. Based on a comparison between the estimated values for the crop residue parameter determined using the two differing techniques, the controller 102 may, if necessary or desired, calibrate the crop residue data being generated using one of the residue-estimating techniques, such as by adjusting the first estimated value for the crop residue parameter determined using the first residue-estimating technique based on the second estimated value determined using the second residue-estimating technique.
  • In general, the controller 102 may correspond to any suitable processor-based device(s), such as a computing device or any combination of computing devices. Thus, as shown in FIG. 3, the controller 102 may generally include one or more processor(s) 110 and associated memory devices 112 configured to perform a variety of computer-implemented functions (e.g., performing the methods, steps, algorithms, calculations and the like disclosed herein). As used herein, the term "processor" refers not only to integrated circuits referred to in the art as being included in a computer, but also refers to a controller, a microcontroller, a microcomputer, a programmable logic controller (PLC), an application specific integrated circuit, and other programmable circuits. Additionally, the memory 112 may generally comprise memory element(s) including, but not limited to, computer readable medium (e.g., random access memory (RAM)), computer readable non-volatile medium (e.g., a flash memory), a floppy disk, a compact disc-read only memory (CD-ROM), a magneto-optical disk (MOD), a digital versatile disc (DVD) and/or other suitable memory elements. Such memory 112 may generally be configured to store information accessible to the processor(s) 110, including data 114 that can be retrieved, manipulated, created and/or stored by the processor(s) 110 and instructions 116 that can be executed by the processor(s) 110.
  • In several embodiments, the data 114 may be stored in one or more databases. For example, the memory 112 may include an image database 118 for storing image data received from the imaging device(s) 104. For instance, the imaging device(s) 104 may be configured to continuously or periodically capture images of adjacent portion(s) of the field as an operation is being performed within the field. In such an embodiment, the images transmitted to the controller 102 from the imaging device(s) 104 may be stored within the image database 118 for subsequent processing and/or analysis. It should be appreciated that, as used herein, the term image data may include any suitable type of data received from the imaging device(s) 104 that allows for the crop residue coverage of a field to be analyzed, including photographs and other image-related data (e.g., scan data and/or the like).
  • Additionally, as shown in FIG. 3, the memory 112 may include a crop residue database 120 for storing information related to crop residue parameters for the field being processed. For example, as indicated above, based on the image data received from the imaging device(s) 104, the controller 102 may be configured to estimate or calculate one or more values for one or more crop residue parameters associated with the field, such as a value(s) for the percent crop residue coverage for an imaged portion(s) of the field (and/or a value(s) for the average percent crop residue coverage for the field). The crop residue parameter(s) estimated or calculated by the controller 102 may then be stored within the crop residue database 120 for subsequent processing and/or analysis.
  • Moreover, in several embodiments, the memory 112 may also include a location database 122 storing location information about the work vehicle/implement 10, 12 and/or information about the field being processed (e.g., a field map). Specifically, as shown in FIG. 3, the controller 102 may be communicatively coupled to a positioning device(s) 124 installed on or within the work vehicle 10 and/or on or within the implement 12. For example, in one embodiment, the positioning device(s) 124 may be configured to determine the exact location of the work vehicle 10 and/or the implement 12 using a satellite navigation position system (e.g., a GPS system, a Galileo positioning system, the Global Navigation Satellite System (GLONASS), the BeiDou Satellite Navigation and Positioning system, and/or the like). In such an embodiment, the location determined by the positioning device(s) 124 may be transmitted to the controller 102 (e.g., in the form of coordinates) and subsequently stored within the location database 122 for subsequent processing and/or analysis.
  • Additionally, in several embodiments, the location data stored within the location database 122 may also be correlated to the image data stored within the image database 118. For instance, in one embodiment, the location coordinates derived from the positioning device(s) 124 and the image(s) captured by the imaging device(s) 104 may both be time-stamped. In such an embodiment, the time-stamped data may allow each image captured by the imaging device(s) 104 to be matched or correlated to a corresponding set of location coordinates received from the positioning device(s) 124, thereby allowing the precise location of the portion of the field depicted within a given image to be known (or at least capable of calculation) by the controller 102.
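  • Purely by way of illustration, the time-stamp matching described above might be sketched as follows in Python; the record layout and the names Fix and geotag_images are hypothetical and not part of the disclosed system:

      import bisect
      from dataclasses import dataclass

      @dataclass
      class Fix:
          timestamp: float   # seconds since epoch
          lat: float
          lon: float

      def geotag_images(image_stamps, fixes):
          """Pair each time-stamped image with its nearest-in-time position fix.

          image_stamps: capture times for the images (seconds since epoch)
          fixes: Fix records sorted by timestamp (assumed non-empty)
          Returns (image_time, Fix) pairs for storage in the location database.
          """
          fix_times = [f.timestamp for f in fixes]
          tagged = []
          for t in image_stamps:
              i = bisect.bisect_left(fix_times, t)
              # Consider the fixes on either side of the insertion point
              neighbors = [j for j in (i - 1, i) if 0 <= j < len(fixes)]
              best = min(neighbors, key=lambda j: abs(fix_times[j] - t))
              tagged.append((t, fixes[best]))
          return tagged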
  • Moreover, by matching each image to a corresponding set of location coordinates, the controller 102 may also be configured to generate or update a corresponding field map associated with the field being processed. For example, in instances in which the controller 102 already includes a field map stored within its memory 112 that includes location coordinates associated with various points across the field, each image captured by the imaging device(s) 104 may be mapped or correlated to a given location within the field map. Alternatively, based on the location data and the associated image data, the controller 102 may be configured to generate a field map for the field that includes the geo-located images associated therewith.
  • Referring still to FIG. 3, in several embodiments, the instructions 116 stored within the memory 112 of the controller 102 may be executed by the processor(s) 110 to implement an image analysis module 126. In general, the image analysis module 126 may be configured to analyze the images received from the imaging device(s) 104 using one or more residue-estimating techniques to allow the controller 102 to estimate one or more crop residue parameters associated with the field currently being processed. For instance, in several embodiments, the image analysis module 126 may be configured to implement two different residue-estimating techniques (e.g., first and second residue-estimating techniques), with each residue-estimating technique being based on a different computer-vision algorithm or any other suitable image-processing technique that allows the controller 102 to identify crop residue remaining on top of the soil. By identifying all or a portion of the crop residue contained within each image (or within a subset of the images) using the two different residue-estimating techniques, the controller 102 may then determine two values for the crop residue parameter(s) associated with a given imaged portion of the field. Such values may then be stored within the crop residue database 120.
  • It should be appreciated that, in general, the residue-estimating techniques used by the image analysis module 126 to estimate the crop residue parameter(s) may correspond to any suitable computer-vision algorithms or image-processing techniques that allow the controller 102 to identify crop residue remaining on top of the soil. For instance, as will be described below with reference to FIGS. 4 and 5, in one embodiment, the image analysis module 126 may be configured to utilize both a computer vision-based blob analysis and a computer vision-based line transect method to estimate values for the crop residue parameter(s). The estimated values determined using each of such residue-estimating techniques may then be stored within the crop residue database 120 for subsequent analysis and/or processing. However, in other embodiments, the image analysis module 126 may be configured to implement any other suitable vision-based residue-estimating techniques to estimate the crop residue parameter(s).
  • Moreover, as shown in FIG. 3, the instructions 116 stored within the memory 112 of the controller 102 may also be executed by the processor(s) 110 to implement a calibration module 128. In general, the calibration module 128 may be configured to calibrate the crop residue data generated by the image analysis module 126 based on the estimated values determined using the differing residue-estimating techniques. Specifically, in several embodiments, the calibration module 128 may be configured to compare the estimated value(s) of the crop residue parameter(s) determined using the first residue-estimating technique to the corresponding estimated value(s) of the crop residue parameter(s) determined using the second residue-estimating technique. In such embodiments, when a differential exists between the estimated value(s) determined using the first residue-estimating technique and the corresponding estimated value(s) determined using the second residue-estimating technique, the calibration module 128 may be configured to calibrate or adjust the estimated value(s) determined using one of the residue-estimating techniques based on the estimated value(s) determined using the other residue-estimating technique.
  • For instance, in one embodiment, the estimated value(s) determined using the second residue-estimating technique may be used to calibrate or adjust the estimated value(s) determined using the first residue-estimating technique. As an example, assuming that the residue-estimating techniques are being used to determine estimated values of the percent crop residue coverage within the field, the image analysis module 126 may analyze one or more images of an imaged portion of the field and determine a first estimated value of 45% crop residue coverage using the first residue-estimating technique and a second estimated value of 50% crop residue coverage using the second residue-estimating technique. The calibration module 128 may then compare the first and second estimated values and determine that a +5% differential exists between the estimated values. The calibration module 128 may then, in one embodiment, adjust the first estimated value and/or any future/past estimated values determined using the first residue-estimating technique based on the second estimated value and/or the differential determined between the first and second estimated values. For instance, the calibration module 128 may be configured to increase the first estimated value and/or any future/past estimated values determined using the first residue-estimating technique by 5% to ensure that the crop residue data generated using the first residue-estimating technique is consistent with the crop residue data generated using the second residue-estimating technique.
  • It should be appreciated that, in addition to analyzing the estimated values determined for a single imaged portion of the field or as an alternative thereto, the calibration module 128 may be configured to analyze the estimated values determined for various different imaged portions of the field. In such an embodiment, the calibration module 128 may be configured to compare the estimated values determined using the first and second residue-estimating techniques for each imaged portion of the field to determine an average differential existing between the first and second estimated values. The calibration module 128 may then adjust the first estimated value and/or any future/past estimated values determined using the first residue-estimating technique based on the average differential determined across the various imaged portions of the field.
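  • As a minimal sketch of the calibration logic described in the two preceding paragraphs (the function names, and the assumption that the crop residue parameter is a percent coverage expressed as a float, are illustrative rather than part of the disclosure):

      def residue_differential(first_estimates, second_estimates):
          """Average differential (second minus first) across imaged portions."""
          diffs = [s - f for f, s in zip(first_estimates, second_estimates)]
          return sum(diffs) / len(diffs)

      def calibrate(estimates, offset):
          """Apply the calibration offset to estimates from the first technique."""
          return [e + offset for e in estimates]

      # Worked example from the text: 45% vs. 50% yields a +5% offset
      offset = residue_differential([45.0], [50.0])       # +5.0
      adjusted = calibrate([45.0, 43.2, 47.8], offset)    # each value raised by 5%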
  • Additionally, it should be appreciated that, although the present subject matter is generally described herein as using the second estimated value(s) determined via the second residue-estimating technique to calibrate or adjust the first estimated value(s) determined via the first residue-estimating technique, the configuration may be reversed, with the first estimated value(s) being used to calibrate or adjust the second estimated value(s). In general, the residue-estimating technique used as the calibration source may be selected based on any number of factors, including accuracy considerations, computer processing requirements, standards or regulations set for crop residue data and/or the like.
  • Referring still to FIG. 3, the instructions 116 stored within the memory 112 of the controller 102 may also be executed by the processor(s) 110 to implement a control module 129. In general, the control module 129 may be configured to adjust the operation of the work vehicle 10 and/or the implement 12 by controlling one or more components of the implement/vehicle 12, 10. Specifically, in several embodiments, when the estimated crop residue parameter determined by the controller 102 differs from a given target set for such parameter, the control module 129 may be configured to fine-tune the operation of the work vehicle 10 and/or the implement 12 in a manner designed to adjust the amount of crop residue remaining in the field. For instance, when it is desired to have a percent crop residue coverage of 30%, the control module 129 may be configured to adjust the operation of the work vehicle 10 and/or the implement 12 so as to increase or decrease the amount of crop residue remaining in the field when the estimated percent crop residue coverage for a given imaged portion of the field (or an average estimated percent crop residue coverage across multiple imaged portions of the field) differs from the target percentage.
  • It should be appreciated that the controller 102 may be configured to implement various different control actions to adjust the operation of the work vehicle 10 and/or the implement 12 in a manner that increases or decreases the amount of crop residue remaining in the field. In one embodiment, the controller 102 may be configured to increase or decrease the operational or ground speed of the implement 12 to effect an increase or decrease in the crop residue coverage. For instance, as shown in FIG. 3, the controller 102 may be communicatively coupled to both the engine 22 and the transmission 24 of the work vehicle 10. In such an embodiment, the controller 102 may be configured to adjust the operation of the engine 22 and/or the transmission 24 in a manner that increases or decreases the ground speed of the work vehicle 10 and, thus, the ground speed of the implement 12, such as by transmitting suitable control signals for controlling an engine or speed governor (not shown) associated with the engine 22 and/or transmitting suitable control signals for controlling the engagement/disengagement of one or more clutches (not shown) provided in operative association with the transmission 24.
  • In addition to adjusting the ground speed of the vehicle/implement 10, 12 (or as an alternative thereto), the controller 102 may also be configured to adjust an operating parameter associated with the ground-engaging tools of the implement 12. For instance, as shown in FIG. 3, the controller 102 may be communicatively coupled to one or more valves 130 configured to regulate the supply of fluid (e.g., hydraulic fluid or air) to one or more corresponding actuators 56, 58, 60 of the implement 12. In such an embodiment, by regulating the supply of fluid to the actuator(s) 56, 58, 60, the controller 102 may automatically adjust the penetration depth, the down force, and/or any other suitable operating parameter associated with the ground-engaging tools of the implement 12.
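  • Purely as an illustrative sketch, the target-seeking behavior of the control module 129 might reduce to a deadband comparison of the estimated coverage against the target; the deadband value, return labels, and the assumed relationship between tool aggressiveness and residue are hypothetical simplifications, and the actual system would drive the engine 22, transmission 24, and valves 130 discussed above:

      def control_action(estimated_coverage, target_coverage, deadband=2.0):
          """Decide how to nudge residue coverage toward the target.

          Less aggressive tillage (shallower tools, less down force) generally
          leaves more residue on the surface; more aggressive tillage buries it.
          """
          error = estimated_coverage - target_coverage
          if abs(error) <= deadband:
              return "hold"                        # within tolerance; no change
          if error < 0:
              return "reduce_tool_aggressiveness"  # leave more residue
          return "increase_tool_aggressiveness"    # bury more residue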
  • Moreover, as shown in FIG. 3, the controller 102 may also include a communications interface 132 to provide a means for the controller 102 to communicate with any of the various other system components described herein. For instance, one or more communicative links or interfaces 134 (e.g., one or more data buses) may be provided between the communications interface 132 and the imaging device(s) 104 to allow images transmitted from the imaging device(s) 104 to be received by the controller 102. Similarly, one or more communicative links or interfaces 136 (e.g., one or more data buses) may be provided between the communications interface 132 and the positioning device(s) 124 to allow the location information generated by the positioning device(s) 124 to be received by the controller 102. Additionally, as shown in FIG. 3, one or more communicative links or interfaces 138 (e.g., one or more data buses) may be provided between the communications interface 132 and the engine 22, the transmission 24, the control valves 130, and/or the like to allow the controller 102 to control the operation of such system components.
  • Referring now to FIG. 4, an example, simplified image of a portion of a field that may be provided by one of the imaging device(s) 104 of the disclosed system 100 is illustrated in accordance with aspects of the present subject matter, particularly illustrating the field including crop residue 160 (indicated by cross-hatching) positioned on the top of the soil 162. As indicated above, the image analysis module 126 of the controller 102 may generally be configured to utilize any suitable computer-vision algorithms or image-processing techniques that allow the controller 102 to identify crop residue 160 remaining on top of the soil 162. For instance, in one embodiment, the vision-based technique used by the image analysis module 126 may rely upon the identification of one or more image characteristics captured by the imaging device(s) 104 to distinguish the crop residue 160 from the soil 162 contained within each image. For example, when the imaging device(s) 104 corresponds to a camera capable of capturing the distinction between the reflective characteristics of the soil 162 and the crop residue 160, the controller 102 may be configured to implement a computer-vision algorithm that identifies the differences in the reflectivity or spectral absorption between the soil 162 and the crop residue 160 contained within each image being analyzed. Alternatively, the controller 102 may be configured to utilize an edge-finding algorithm to identify or distinguish the crop residue 160 from the soil 162 contained within each image.
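  • As a rough sketch of the reflectance-based distinction described above (assuming a grayscale image held in a NumPy array; the fallback to the image mean as a threshold is an illustrative simplification, not the patented algorithm):

      import numpy as np

      def residue_mask(gray, threshold=None):
          """Label pixels brighter than a reflectance threshold as residue.

          Dry crop residue typically reflects more light than darker, moist
          soil, so a simple global threshold can roughly separate the two.
          """
          if threshold is None:
              threshold = gray.mean()   # crude adaptive fallback
          return gray > threshold       # boolean mask: True where residue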
  • Additionally, upon distinguishing the crop residue 160 from the soil 162, the controller 102 may be configured to utilize any suitable technique or methodology for calculating the percent crop residue coverage for the portion of the field contained within each image. For instance, as indicated above, the controller 102 may, in one embodiment, utilize a "blob analysis" in which the crop residue identified within each image via the associated computer-vision technique is represented as a "blob" or plurality of "blobs" encompassing a given area within the image. Specifically, as shown in FIG. 4, the crop residue 160 depicted within the image is represented as cross-hatched blobs overlaying the soil 162. In such an embodiment, the image analysis module 126 may be configured to calculate the percent crop residue coverage for the imaged portion of the field using the following equation (Equation 1):
    $$\text{Percent Crop Residue} = \left(1 - \frac{\text{total image area} - \text{blob area}}{\text{total image area}}\right) \times 100$$
    wherein the total image area corresponds to the total area defined within the image (e.g., as a function of the total number of pixels of the image) and the blob area corresponds to the total area represented by crop residue 160 within the image (e.g., as a function of the total number of pixels representing the identified crop residue).
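  • In code form, Equation 1 might be sketched as follows, assuming the boolean residue mask from the earlier sketch, with pixel counts standing in for areas:

      import numpy as np

      def percent_residue_blob(mask):
          """Equation 1: percent crop residue coverage from a residue mask."""
          total_area = mask.size
          blob_area = int(np.count_nonzero(mask))
          return (1 - (total_area - blob_area) / total_area) * 100
          # algebraically equivalent to blob_area / total_area * 100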
  • Referring now to FIG. 5, an example, simplified view of a continuous section 170 of an imaged portion of a field is illustrated in accordance with aspects of the present subject matter. Specifically, FIG. 5 illustrates a plurality of images captured by one or more of the imaging device(s) 104 of the disclosed system 100 that collectively depict a continuous section 170 of the field. For instance, the field of view 106 of the imaging device(s) 104 may allow the imaging device(s) 104 to capture an image of the field that spans a given field distance. In such an embodiment, to analyze a continuous section 170 of an imaged portion of the field that extends across a predetermined field length 172 that is greater than the field distance captured within each image, multiple images may be stitched together or otherwise analyzed in combination. For instance, in the example view shown in FIG. 5, a plurality of images captured by one of the imaging device(s) 104 have been stitched together (e.g., the separate images being indicated by the dashed horizontal lines) to provide a view of a continuous section 170 of the field that spans across a predetermined field length 172.
  • It should be appreciated that the controller 102 (e.g., the image analysis module 126) may be configured to identify which images can be used to collectively depict a continuous section of the field using any suitable methodology or technique. For instance, as indicated above, the images provided by the imaging device(s) 104 may be time-stamped. In such an embodiment, by knowing the ground speed of the work vehicle/implement 10, 12 and the field of view 106 of the imaging device(s) 104 relative to the field, the controller 102 may be configured to stitch together or otherwise access the images captured by the imaging device(s) 104 that collectively depict a continuous field section 170 spanning across the predetermined field length 172. Alternatively, the controller 102 may be configured to utilize any suitable image-processing algorithm that allows the controller 102 to identify the images (or portions of images) that collectively depict a continuous section of the field.
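  • One illustrative way to estimate how many consecutive frames span the predetermined field length 172, assuming a known per-frame ground coverage and frame interval (both parameters are hypothetical inputs, not values from the disclosure):

      import math

      def frames_for_section(target_length, fov_length, ground_speed, frame_period):
          """Number of consecutive frames needed to cover target_length.

          fov_length: ground distance captured within a single frame
          ground_speed * frame_period: ground gained between frames
          All lengths in the same unit (e.g., feet); speed in that unit per second.
          """
          advance = ground_speed * frame_period
          if advance <= 0:
              raise ValueError("vehicle must be moving")
          if advance >= fov_length:
              raise ValueError("frames would leave gaps; raise the frame rate")
          extra = max(0.0, target_length - fov_length)
          return 1 + math.ceil(extra / advance)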
  • By capturing images that collectively depict a continuous section 170 of the field, the image analysis module 126 of the controller 102 may, in several embodiments, be configured to implement a computer vision-based line transect method to estimate the percent crop residue coverage for the imaged portion of the field. Specifically, in several embodiments, the image analysis module 126 may access the images collectively depicting the continuous imaged section 170 of the field and apply a known scale 174 to such continuous imaged section 170 of the field such that a plurality of reference points 176 are defined along the continuous imaged field section 170 that are spaced apart evenly across the predetermined field length 172. Thereafter, the images may be analyzed to identify the number or percentage of reference points 176 that are aligned with or intersect crop residue within the continuous imaged section 170 of the field. Such identified number or percentage of the reference points 176 may then correspond to or may be used to calculate the percent crop residue coverage within the continuous imaged section 170 of the field. For example, in one embodiment, the percent crop residue coverage for the continuous imaged section 170 of the field may be calculated using the following equation (Equation 2):
    $$\text{Percent Crop Residue} = \frac{\text{identified reference points}}{\text{total reference points}} \times 100$$
    wherein the "identified reference points" correspond to the total number of reference points 176 identified by the image analysis module 126 that are aligned with or intersect crop residue within the analyzed images and the "total reference points" correspond to the total number of reference points 176 defined across the predetermined field length 172 via the applied scale 174.
  • In several embodiments, the scale 174 applied to the continuous imaged section 170 of the field may divide the predetermined field length 172 into one hundred distinct field sections such that one hundred reference points 176 are evenly spaced apart along the predetermined field length 172. In such an embodiment, it may be desirable for the continuous imaged section 170 of the field to correspond to a continuous field section spanning one hundred feet (i.e., such that the predetermined field length 172 is equal to one hundred feet). As a result, the imposed scale 174 may divide the predetermined field length 172 into one hundred one-foot sections, with a reference point 176 being defined at each foot mark along the predetermined field length 172. However, in other embodiments, the predetermined field length 172 may correspond to any other suitable field length, such as a fifty-foot field section, a twenty-five-foot field section, or any other suitable field length. Similarly, any other suitable scale 174 may be applied to the continuous imaged section 170 of the field to allow any suitable number of evenly spaced reference points 176 to be defined across the predetermined field length 172. For instance, in alternative embodiments, a fifty-point scale or a twenty-five-point scale may be applied such that fifty or twenty-five evenly spaced reference points 176, respectively, are defined across the predetermined field length 172.
  • It should be appreciated that, in several embodiments, the image analysis module 126 may be configured to count, for purposes of calculating the percent crop residue coverage for the continuous imaged section 170 of the field, only those reference points 176 within the images that are aligned with or intersect crop residue exceeding a given residue size threshold. Specifically, in several embodiments, the size threshold for the crop residue may be selected based on the minimum residue size capable of intercepting raindrops. For instance, in one embodiment, the residue size threshold may correspond to a residue diameter of one-eighth of an inch (1/8" or 0,3175 cm). In such an embodiment, if the crop residue aligned with or intersecting one of the reference points 176 is determined to have a cross-wise dimension within the image that exceeds 1/8" (or 0,3175 cm) (e.g., via a suitable image analysis technique), such reference point 176 may be counted for purposes of calculating the percent crop residue coverage for the continuous imaged section 170 of the field. However, if the crop residue aligned with or intersecting one of the reference points 176 is determined to have a cross-wise dimension within the image that is less than 1/8" (or 0,3175 cm), such reference point 176 may not be counted for purposes of calculating the percent crop residue coverage.
  • It should also be appreciated that, in several embodiments, the image analysis module 126 may be configured to perform the above-referenced analysis for multiple imaged sections of the field. For example, the image analysis module 126 may access images captured by the imaging device(s) 104 that collectively depict several different continuous imaged sections of the field, with each continuous imaged field section spanning a predetermined field length. Thereafter, for each continuous imaged section of the field, the image analysis module 126 may apply a known scale to such continuous imaged field section such that a plurality of reference points are defined along the continuous imaged field section that are spaced apart evenly across the predetermined field length. The images associated with each continuous imaged section of the field may then be analyzed to identify the number or percentage of reference points that are aligned with or intersect crop residue within such continuous imaged section of the field, thereby allowing a value for the percent crop residue coverage to be determined for each continuous imaged field section. In such an embodiment, the image analysis module 126 may then calculate an average percent crop residue coverage based on the residue coverage values calculated for the various continuous imaged field sections. In doing so, it may be desirable for the average percent crop residue coverage to be calculated based on the residue coverage values determined for at least five continuous imaged field sections, thereby allowing a desirable confidence level to be obtained for the calculated average.
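  • A trivial sketch of the averaging step just described, with the five-section floor treated as a configurable assumption:

      def average_coverage(section_values, min_sections=5):
          """Average percent coverage across continuous imaged sections."""
          if len(section_values) < min_sections:
              raise ValueError("too few sections for a confident average")
          return sum(section_values) / len(section_values)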
  • Referring now to FIG. 6, a flow diagram of one embodiment of a method 200 for calibrating crop residue data acquired using a vision-based system is illustrated in accordance with aspects of the present subject matter. In general, the method 200 will be described herein with reference to the work vehicle 10 and the implement 12 shown in FIGS. 1 and 2, as well as the various system components shown in FIG. 3. However, it should be appreciated that the disclosed method 200 may be implemented with work vehicles and/or implements having any other suitable configurations and/or within systems having any other suitable system configuration. In addition, although FIG. 6 depicts steps performed in a particular order for purposes of illustration and discussion, the methods discussed herein are not limited to any particular order or arrangement. One skilled in the art, using the disclosures provided herein, will appreciate that various steps of the methods disclosed herein can be omitted, rearranged, combined, and/or adapted in various ways without deviating from the scope of the present disclosure.
  • As shown in FIG. 6, at (202), the method 200 may include controlling the operation of at least one of an implement or a work vehicle as the implement is being towed by the work vehicle across a field. Specifically, as indicated above, the controller 102 of the disclosed system 100 may be configured to control the operation of the work vehicle 10 and/or the implement 12, such as by controlling one or more components of the work vehicle 10 and/or the implement 12 to allow an operation to be performed within the field (e.g., a tillage operation).
  • Additionally, at (204), the method 200 may include receiving image data associated with an imaged portion of the field. Specifically, as indicated above, the controller 102 may be coupled to one or more imaging devices 104 configured to capture images of various portions of the field.
  • Moreover, at (206), the method 200 may include analyzing the image data using a first residue-estimating technique to determine a first estimated value of a crop residue parameter for the imaged portion of the field. For instance, as indicated above, the image analysis module 126 of the controller 102 may be configured to implement a vision-based residue-estimating technique to estimate a crop residue parameter for the imaged portion of the field, such as by estimating the percent crop residue coverage for the imaged portion of the field using a computer vision-based blob analysis or using a computer vision-based line transect method.
  • Referring still to FIG. 6, at (208), the method 200 may include analyzing the image data using a second residue-estimating technique to determine a second estimated value of the crop residue parameter for the imaged portion of the field. Specifically, as indicated above, the image analysis module 126 of the controller 102 may, in accordance with aspects of the present subject matter, be configured to implement two different vision-based residue-estimating techniques for estimating a given crop residue parameter for the imaged portion of the field. For instance, in an embodiment in which the first residue-estimating technique corresponds to a computer vision-based blob analysis, the second residue-estimating technique may, for example, correspond to a computer vision-based line transect method, or vice versa. As such, the controller 102 may determine two separate estimated values for the crop residue parameter using the two different residue-estimating techniques.
  • Additionally, at (210), the method 200 may include adjusting at least one of the first estimated value or one or more additional estimated values of the crop residue parameter obtained using the first residue-estimating technique based on at least one of the second estimated value or the differential between the first and second estimated values. Specifically, as indicated above, when a differential exists between the first and second estimated values, the controller 102 may be configured to adjust the first estimated value determined using the first residue-estimating technique based on the second estimated value determined using the second residue-estimating technique and/or based on the differential existing between the first and second estimated values. For example, assuming that a percent crop residue coverage of 40% is determined using the first residue-estimating technique and a percent crop residue coverage of 32% is determined using the second residue-estimating technique, the controller 102 may, in one embodiment, adjust the estimated percent crop residue coverage associated with the first residue-estimating technique to match the percent crop residue coverage associated with the second residue-estimating technique (e.g., by reducing the percent crop residue coverage from 40% to 32%). In addition, the controller 102 may also utilize the differential defined between the first and second estimated values to adjust any past or future estimated values determined using the first residue-estimating technique, such as by applying a -8% modifier to each estimated value determined using the first residue-estimating technique.
  • It should be appreciated that, although not shown, the method 200 may also include any additional steps or method elements consistent with the disclosure provided herein. For example, the method 200 may also include actively adjusting the operation of the implement 12 and/or the work vehicle 10 when the adjusted value for the first estimated value and/or the adjusted value(s) for the one or more additional estimated values determined using the first residue-estimating technique differs from a target value set for the crop residue parameter. Specifically, as indicated above, when the estimated crop residue parameter differs from a target value set for such parameter, the controller 102 may be configured to actively adjust the operation of the work vehicle 10 and/or the implement 12 in a manner that increases or decreases the amount of crop residue remaining within the field following the operation being performed (e.g., a tillage operation), such as by adjusting the ground speed at which the implement 12 is being towed and/or by adjusting one or more operating parameters associated with the ground-engaging elements of the implement 12.
  • This written description uses examples to disclose the invention, including the best mode, and also to enable any person skilled in the art to practice the invention, including making and using any devices or systems and performing any incorporated methods. The patentable scope of the invention is defined by the claims, and may include other examples that occur to those skilled in the art.

Claims (15)

  1. A method (200) for calibrating crop residue data for a field acquired using a vision-based system (100), the method (200) comprising controlling, with a computing device (102), an operation of at least one of an implement (12) or a work vehicle (10) as the implement (12) is being towed by the work vehicle (10) across the field, the method (200) being characterized by:
    receiving, with the computing device (102), image data (118) associated with an imaged portion of the field;
    analyzing, with the computing device (102), the image data (118) using a first residue-estimating technique to determine a first estimated value of a crop residue parameter for the imaged portion of the field;
    analyzing, with the computing device (102), the image data (118) using a second residue-estimating technique to determine a second estimated value of the crop residue parameter for the imaged portion of the field, the second residue-estimating technique differing from the first residue-estimating technique; and
    when a differential exists between the first and second estimated values, adjusting, with the computing device (102), at least one of the first estimated value or one or more additional estimated values of the crop residue parameter determined using the first residue-estimating technique based on at least one of the second estimated value or the differential between the first and second estimated values.
  2. The method (200) as in claim 1, wherein receiving image data (118) associated with the imaged portion of the field comprises receiving the image data (118) from one or more imaging devices (104) provided in operative association with at least one of the work vehicle (10) or the implement (12).
  3. The method (200) as in any preceding claim, wherein the crop residue parameter corresponds to a percent crop residue coverage associated with the imaged portion of the field.
  4. The method (200) as in any preceding claim, wherein analyzing the image data (118) using the first residue-estimating technique comprises analyzing the image data (118) using a computer vision-based image processing technique.
  5. The method (200) as in claim 4, wherein the computer vision-based image processing technique corresponds to a vision-based blob analysis of the image data (118) to determine the first estimated value of the crop residue parameter for the imaged portion of the field.
  6. The method (200) as in any preceding claim, wherein analyzing the image data (118) using the second residue-estimating technique comprises analyzing the image data (118) using a computer vision-based line transect method.
  7. The method (200) as in claim 6, wherein analyzing the image data (118) using the computer vision-based line transect method comprises:
    accessing a plurality of images of the image data (118) that collectively depict a continuous imaged section (170) of the field, the continuous imaged section (170) extending across a predetermined length (172);
    applying a known scale (174) to the continuous imaged section (170) of the field such that a plurality of reference points (176) are associated with the continuous imaged section (170); and
    determining a percentage of the plurality of reference points (176) that are aligned with or intersect crop residue within the plurality of images.
  8. The method (200) as in claim 7, further comprising determining the second estimated value of the crop residue parameter based at least in part on the percentage of the plurality of reference points (176) that are aligned with or intersect crop residue within the plurality of images.
  9. The method (200) as in claim 7, wherein determining the percentage of the plurality of reference points (176) that are aligned with or intersect crop residue within the plurality of images comprises determining the percentage of the plurality of reference points (176) that are aligned with or intersect crop residue that exceeds a given size threshold.
  10. The method (200) as in claim 7, wherein applying the known scale (174) to the continuous imaged section (170) of the field comprises applying the known scale (174) such that the plurality of reference points (176) are spaced apart evenly across the predetermined length (172).
  11. The method (200) as in claim 7, further comprising:
    accessing a second plurality of images of the image data (118) that collectively depict one or more additional continuous imaged sections of the field, each of the one or more additional continuous imaged sections (170) of the field extending across the predetermined length (172);
    applying the known scale (174) to the one or more additional continuous imaged sections (170) of the field such that a plurality of reference points (176) are associated with each of the one or more additional continuous imaged sections (170) of the field;
    determining a percentage of the plurality of reference points (176) that are aligned with or intersect crop residue within the second plurality of images for each of the one or more additional continuous imaged sections (170) of the field; and
    determining the second estimated value of the crop residue parameter based on an average of the determined percentages associated with the continuous imaged section (170) of the field and the one or more additional continuous imaged sections (170) of the field.
  12. The method (200) as in any preceding claim, further comprising actively adjusting the operation of at least one of the implement (12) or the work vehicle (10) when the adjusted value for the at least one of the first estimated value or the one or more additional estimated values differs from a target value set for the crop residue parameter.
  13. A vision-based system (100) for estimating and adjusting crop residue parameters as an implement (12) is being towed across a field by a work vehicle (10), the system (100) comprising an imaging device (104) installed relative to one of the work vehicle (10) or the implement (12) such that the imaging device (104) is configured to capture images of the field, the system (100) further comprising a controller (102) communicatively coupled to the imaging device (104), the controller (102) including a processor (110) and associated memory (112), the memory (112) storing instructions that, when implemented by the processor (110), configure the controller (102) to receive, from the imaging device (104), image data (118) associated with an imaged portion of the field, the system (100) being characterized by:
    the controller (102) being configured to:
    analyze the image data (118) using a first residue-estimating technique to determine a first estimated value of a crop residue parameter for the imaged portion of the field;
    analyze the image data (118) using a second residue-estimating technique to determine a second estimated value of the crop residue parameter for the imaged portion of the field, the second residue-estimating technique differing from the first residue-estimating technique; and
    when a differential exists between the first and second estimated values, adjust at least one of the first estimated value or one or more additional estimated values of the crop residue parameter determined using the first residue-estimating technique based on at least one of the second estimated value or the differential between the first and second estimated values.
  14. The system (100) as in claim 13, wherein the imaging device (104) comprises a camera installed relative to one of the work vehicle (10) or the implement (12) such that a field of view (106) of the imaging device (104) is directed either parallel or perpendicular to a travel direction (34) of the work vehicle (10).
  15. The system (100) as in claim 13, wherein the crop residue parameter corresponds to a percent crop residue coverage associated with the imaged portion of the field.
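By way of a non-authoritative illustration only (no source code forms part of the patent), the following Python sketch shows one way the two-technique calibration of claims 1 and 13 could be realized. The thresholding stand-in for the first, blob-analysis-style technique (claims 4 and 5), the boolean mask convention, and the additive correction rule are all assumptions; the claims leave the precise adjustment open.

```python
import numpy as np

def estimate_coverage_blob(image: np.ndarray, threshold: int = 128) -> float:
    """Stand-in for the first, vision-based technique (claims 4-5): the
    fraction of pixels classified as residue by a simple intensity
    threshold. A real blob analysis would segment and filter connected
    regions; this is illustrative only."""
    return float((image > threshold).mean())

def calibrate(first_estimate: float,
              second_estimate: float,
              additional_first_estimates: list[float]) -> list[float]:
    """Claim 1: when a differential exists between the two estimates,
    adjust the first-technique value(s) based on that differential.
    An additive shift is one plausible correction (assumed here)."""
    differential = second_estimate - first_estimate
    if differential == 0.0:
        return [first_estimate, *additional_first_estimates]
    return [estimate + differential
            for estimate in [first_estimate, *additional_first_estimates]]
```

In this reading, the slower second technique is run occasionally to recalibrate the faster first technique, whose subsequent estimates are then shifted by the observed differential.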
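Claims 6 through 11 recite the computer vision-based line transect technique in enough procedural detail for a second hedged sketch. In the hypothetical Python below, each boolean mask stands for one continuous imaged section of the predetermined length; the known scale (claim 10) becomes evenly spaced reference points sampled along the section's midline (the midline placement is an assumption), an optional connected-blob size filter approximates the size threshold of claim 9, and the per-section percentages are averaged per claim 11.

```python
import numpy as np
from scipy import ndimage

def line_transect_coverage(residue_masks: list[np.ndarray],
                           points_per_section: int = 100,
                           min_blob_px: int = 1) -> float:
    """Average percent residue coverage over several continuous imaged
    sections, each given as a 2-D boolean residue mask."""
    percentages = []
    for mask in residue_masks:
        height, width = mask.shape
        # Known scale: evenly spaced reference points across the section.
        xs = np.linspace(0, width - 1, points_per_section).astype(int)
        row = height // 2                    # midline placement (assumed)
        # Size threshold (claim 9): keep only pixels belonging to
        # connected residue blobs of at least min_blob_px pixels.
        labels, _ = ndimage.label(mask)
        blob_sizes = np.bincount(labels.ravel())
        keep = (labels > 0) & (blob_sizes[labels] >= min_blob_px)
        hits = keep[row, xs]                 # points intersecting residue
        percentages.append(100.0 * hits.mean())
    # Claim 11: the second estimated value averages across sections.
    return float(np.mean(percentages))
```

This mirrors the manual line transect practice of stretching a knotted line across a field and counting the knots that touch residue, with image pixels standing in for the physical line.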
EP18172495.6A 2017-05-16 2018-05-15 Vision-based system for acquiring crop residue data and related calibration methods Active EP3406124B1 (en)

Applications Claiming Priority (1)

Application Number: US15/596,145 (US10262206B2 (en)); Priority date: 2017-05-16; Filing date: 2017-05-16; Title: Vision-based system for acquiring crop residue data and related calibration methods

Publications (2)

Publication Number / Publication Date
EP3406124A1 (en): 2018-11-28
EP3406124B1 (en): 2019-12-25

Family

ID=62186291

Family Applications (1)

Application Number: EP18172495.6A (Active, EP3406124B1 (en)); Priority date: 2017-05-16; Filing date: 2018-05-15; Title: Vision-based system for acquiring crop residue data and related calibration methods

Country Status (2)

Country / Link
US (2): US10262206B2 (en)
EP (1): EP3406124B1 (en)

Families Citing this family (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU2018322047A1 (en) * 2017-08-21 2020-04-02 Climate Llc Digital modeling and tracking of agricultural fields for implementing agricultural field trials
US10582655B2 (en) 2017-08-23 2020-03-10 Cnh Industrial Canada, Ltd. System and method for spraying fluid onto seeds dispensed from a planter
US11144775B2 (en) * 2018-06-25 2021-10-12 Cnh Industrial Canada, Ltd. System and method for illuminating the field of view of a vision-based sensor mounted on an agricultural machine
US20220183214A1 (en) * 2019-04-26 2022-06-16 Agco Corporation Methods of operating a tillage implement
WO2020231934A1 (en) 2019-05-10 2020-11-19 Great Plains Manufacturing, Inc. Vision sensors for agricultural implements and processes
US11510364B2 (en) 2019-07-19 2022-11-29 Deere & Company Crop residue based field operation adjustment
US20210027449A1 (en) * 2019-07-23 2021-01-28 Cnh Industrial America Llc System and method for determining field characteristics based on data from multiple types of sensors
US10820478B1 (en) 2019-07-30 2020-11-03 Cnh Industrial America Llc System and method for providing a visual indication of field surface conditions
US11937527B2 (en) 2019-07-31 2024-03-26 Cnh Industrial America Llc System and method for determining residue coverage within a field following a harvesting operation
US11793098B2 (en) 2019-08-27 2023-10-24 Cnh Industrial America Llc System and method for detecting levelness of tools of a tillage implement based on material flow
US11665991B2 (en) 2019-09-24 2023-06-06 Cnh Industrial America Llc System and method for monitoring the levelness of a multi-wing agricultural implement
US11503756B2 (en) 2019-09-25 2022-11-22 Cnh Industrial America Llc System and method for determining soil levelness using spectral analysis
US20210105928A1 (en) * 2019-10-14 2021-04-15 Cnh Industrial Canada, Ltd. System and method for managing material accumulation relative to ground engaging tools of an agricultural implement
US11877527B2 (en) 2019-10-17 2024-01-23 Cnh Industrial America Llc System and method for controlling agricultural implements based on field material cloud characteristics
US11761757B2 (en) 2019-10-28 2023-09-19 Cnh Industrial America Llc System and method for detecting tool plugging of an agricultural implement based on residue differential
US11528836B2 (en) 2019-11-22 2022-12-20 Cnh Industrial America Llc System and method for sequentially controlling agricultural implement ground-engaging tools
US11445656B2 (en) 2019-11-26 2022-09-20 Cnh Industrial America Llc System and method for preventing material accumulation relative to ground engaging tools of an agricultural implement
US11357153B2 (en) 2019-12-11 2022-06-14 Cnh Industrial Canada, Ltd. System and method for determining soil clod size using captured images of a field
US11410301B2 (en) 2020-02-28 2022-08-09 Cnh Industrial America Llc System and method for determining residue coverage within a field based on pre-harvest image data
EP3967118A1 (en) * 2020-09-15 2022-03-16 Kubota Corporation A method for operating an agricultural machine having working tools configured for mechanical weeding and agricultural machine
US20220256764A1 (en) * 2021-02-12 2022-08-18 Cnh Industrial Canada, Ltd. Systems and methods for determining residue coverage of a field
US11669592B2 (en) * 2021-02-12 2023-06-06 Cnh Industrial America Llc Systems and methods for residue bunch detection
US11538180B2 (en) 2021-02-12 2022-12-27 Cnh Industrial Canada, Ltd. Systems and methods for determining residue length within a field
US11810285B2 (en) 2021-03-16 2023-11-07 Cnh Industrial Canada, Ltd. System and method for determining soil clod parameters of a field using three-dimensional image data

Family Cites Families (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4935700A (en) 1985-08-09 1990-06-19 Washington Research Foundation Fringe field capacitive sensor for measuring the size of an opening
US5044756A (en) 1989-03-13 1991-09-03 Purdue Research Foundation Real-time soil organic matter sensor
US5278423A (en) 1992-12-30 1994-01-11 Schwartz Electro-Optics, Inc. Object sensor and method for use in controlling an agricultural sprayer
US5412219A (en) * 1993-11-22 1995-05-02 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Method for determining surface coverage by materials exhibiting different fluorescent properties
IT1302609B1 (en) 1998-10-06 2000-09-29 Techint Spa PROCEDURE AND RELATED EQUIPMENT FOR MEASURING THE DEVIATION OF SHAPE OF WORKED SURFACES.
US6608672B1 (en) 1999-03-15 2003-08-19 Omron Corporation Soil survey device and system for precision agriculture
US6919959B2 (en) 1999-06-30 2005-07-19 Masten Opto-Diagnostics Co. Digital spectral identifier-controller and related methods
US6853937B2 (en) 2001-07-06 2005-02-08 Tokyo University Of Agriculture And Technology Tlo Co., Ltd. Soil characteristics survey device and soil characteristics survey method
US7092106B2 (en) 2002-12-13 2006-08-15 The United States Of America As Represented By The Secretary Of The Army System for determining the configuration of obscured structure by employing phase profilometry and method of use therefor
CN100504383C (en) 2003-04-08 2009-06-24 广东省生态环境与土壤研究所 Method for measuring degree of roughness of soils
US20070039745A1 (en) 2005-08-18 2007-02-22 Deere & Company, A Delaware Corporation Wireless subsoil sensor network
WO2007039815A1 (en) 2005-10-05 2007-04-12 Mechanical System Dynamics Pty Ltd Measurement of pavement unevenness
US8451449B2 (en) 2009-10-30 2013-05-28 Kyle H. Holland Optical real-time soil sensor
US9050725B2 (en) 2007-10-24 2015-06-09 Caterpillar Inc. Tool control system based on anticipated terrain
WO2009153304A1 (en) 2008-06-20 2009-12-23 Faculté Universitaire des Sciences Agronomiques de Gembloux Weed detection and/or destruction
US9148995B2 (en) 2010-04-29 2015-10-06 Hagie Manufacturing Company Spray boom height control system
ITTO20100720A1 (en) 2010-08-30 2012-03-01 Bridgestone Corp SYSTEM AND METHOD OF MEASURING THE ROUGHNESS OF A ROAD SURFACE
DE102012101085A1 (en) 2012-02-10 2013-08-14 Conti Temic Microelectronic Gmbh Determining a condition of a road surface by means of a 3D camera
US8862339B2 (en) 2012-08-09 2014-10-14 Cnh Industrial Canada, Ltd. System and method for controlling soil finish from an agricultural implement
BE1021123B1 (en) 2013-01-14 2015-12-14 Cnh Industrial Belgium Nv CALIBRATE A DISTANCE SENSOR ON AN AGRICULTURAL VEHICLE
EP2972252B1 (en) 2013-03-14 2023-10-18 Robert Ernest Troxler Systems and methods for pavement bulk properties and moisture content measurements using ground penetrating radar
US9516802B2 (en) 2014-04-25 2016-12-13 Cnh Industrial America Llc System and method for controlling an agricultural system based on soil analysis
US9554098B2 (en) 2014-04-25 2017-01-24 Deere & Company Residue monitoring and residue-based control
US9282688B2 (en) 2014-04-25 2016-03-15 Deere & Company Residue monitoring and residue-based control
US20160029547A1 (en) 2014-07-30 2016-02-04 Deere & Company Sensing the soil profile behind a soil-engaging implement
US9428885B2 (en) 2014-09-15 2016-08-30 Trimble Navigation Limited Guidance system for earthmoving machinery
IL236606B (en) 2015-01-11 2020-09-30 Gornik Amihay Systems and methods for agricultural monitoring
BR112018005312B1 (en) 2015-09-18 2022-02-15 Precision Planting Llc AGRICULTURAL EQUIPMENT AND PLANTER CONTROL METHOD
US11266056B2 (en) * 2015-10-23 2022-03-08 Deere & Company System and method for residue detection and implement control

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
None *

Also Published As

Publication number Publication date
US10262206B2 (en) 2019-04-16
US20190236359A1 (en) 2019-08-01
US20180336410A1 (en) 2018-11-22
EP3406124A1 (en) 2018-11-28

Similar Documents

Publication Title
EP3406124B1 (en) Vision-based system for acquiring crop residue data and related calibration methods
US11730071B2 (en) System and method for automatically estimating and adjusting crop residue parameters as a tillage operation is being performed
US10681856B2 (en) System and method for automatically monitoring soil surface roughness
US9554098B2 (en) Residue monitoring and residue-based control
US11761757B2 (en) System and method for detecting tool plugging of an agricultural implement based on residue differential
CN105043247B (en) Residue monitoring and residue-based control
EP3189719A1 (en) Control system for residue management and method
US10769771B2 (en) Measuring crop residue from imagery using a machine-learned semantic segmentation model
US11357153B2 (en) System and method for determining soil clod size using captured images of a field
EP4033875B1 (en) System and method for providing a visual indicator of field surface profile
US10817755B2 (en) Measuring crop residue from imagery using a machine-learned classification model in combination with principal components analysis
US11503756B2 (en) System and method for determining soil levelness using spectral analysis
US10748042B2 (en) Measuring crop residue from imagery using a machine-learned convolutional neural network
US20200170174A1 (en) System and method for generating a prescription map for an agricultural implement based on soil compaction
EP4027767B1 (en) System and method for determining soil clod size distribution using spectral analysis
US20210176912A1 (en) System and method for assessing agricultural operation performance based on image data of processed and unprocessed portions of the field
US11877527B2 (en) System and method for controlling agricultural implements based on field material cloud characteristics
US20240130263A1 (en) Row detection system, agricultural machine having a row detection system, and method of row detection

Legal Events

Code Title / Description

PUAI Public reference made under article 153(3) EPC to a published international application that has entered the European phase (ORIGINAL CODE: 0009012)
STAA Status of the EP application/patent: THE APPLICATION HAS BEEN PUBLISHED
AK Designated contracting states; kind code of ref document: A1; designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR
AX Request for extension of the European patent; extension state: BA ME
STAA Status: REQUEST FOR EXAMINATION WAS MADE
17P Request for examination filed; effective date: 20190528
RBV Designated contracting states (corrected); designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR
GRAP Despatch of communication of intention to grant a patent (ORIGINAL CODE: EPIDOSNIGR1)
STAA Status: GRANT OF PATENT IS INTENDED
INTG Intention to grant announced; effective date: 20190716
GRAS Grant fee paid (ORIGINAL CODE: EPIDOSNIGR3)
GRAA (Expected) grant (ORIGINAL CODE: 0009210)
STAA Status: THE PATENT HAS BEEN GRANTED
AK Designated contracting states; kind code of ref document: B1; designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR
REG Reference to a national code: GB, legal event code FG4D
REG Reference to a national code: CH, legal event code EP
REG Reference to a national code: AT, legal event code REF; ref document number: 1216073; kind code: T; effective date: 20200115
REG Reference to a national code: DE, legal event code R096; ref document number: 602018001762
REG Reference to a national code: IE, legal event code FG4D
REG Reference to a national code: NL, legal event code MP; effective date: 20191225
PG25 Lapsed in a contracting state [announced via postgrant information from national office to EPO] because of failure to submit a translation of the description or to pay the fee within the prescribed time-limit: BG (20200325), NO (20200325), GR (20200326), LT (20191225), FI (20191225), SE (20191225), LV (20191225)
REG Reference to a national code: LT, legal event code MG4D
PG25 Lapsed (translation/fee): HR (20191225), RS (20191225)
PG25 Lapsed (translation/fee): AL (20191225)
PG25 Lapsed (translation/fee): PT (20200520), NL (20191225), RO (20191225), CZ (20191225), EE (20191225)
PG25 Lapsed (translation/fee): IS (20200425), SK (20191225), SM (20191225)
REG Reference to a national code: DE, legal event code R097; ref document number: 602018001762
PG25 Lapsed (translation/fee): ES (20191225), DK (20191225)
PLBE No opposition filed within time limit (ORIGINAL CODE: 0009261)
STAA Status: NO OPPOSITION FILED WITHIN TIME LIMIT
REG Reference to a national code: AT, legal event code MK05; ref document number: 1216073; kind code: T; effective date: 20191225
PG25 Lapsed (translation/fee): SI (20191225)
26N No opposition filed; effective date: 20200928
PG25 Lapsed (translation/fee): MC (20191225), AT (20191225)
PG25 Lapsed (translation/fee): PL (20191225)
REG Reference to a national code: BE, legal event code MM; effective date: 20200531
PG25 Lapsed in a contracting state because of non-payment of due fees: LU (20200515)
PG25 Lapsed (non-payment): IE (20200515)
PG25 Lapsed (non-payment): BE (20200531)
REG Reference to a national code: DE, legal event code R082; ref document number: 602018001762; representative's name: KROHER STROBEL RECHTS- UND PATENTANWAELTE PART, DE
REG Reference to a national code: CH, legal event code PL
PG25 Lapsed (non-payment): LI (20210531), CH (20210531)
PG25 Lapsed (translation/fee): TR (20191225), MT (20191225), CY (20191225)
PG25 Lapsed (translation/fee): MK (20191225)
PGFP Annual fee paid to national office [announced via postgrant information from national office to EPO]: IT (payment date 20230512, 6th year), FR (payment date 20230523, 6th year), DE (payment date 20230525, 6th year)
PGFP Annual fee paid to national office: GB (payment date 20230526, 6th year)