WO2015188251A1 - Method and system for processing image data from unmanned aerial vehicles - Google Patents
- Publication number
- WO2015188251A1 (PCT/CA2015/000352)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- data
- image data
- sensor
- user
- interest
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C11/00—Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
- G01C11/04—Interpretation of pictures
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01B—SOIL WORKING IN AGRICULTURE OR FORESTRY; PARTS, DETAILS, OR ACCESSORIES OF AGRICULTURAL MACHINES OR IMPLEMENTS, IN GENERAL
- A01B79/00—Methods for working soil
- A01B79/005—Precision agriculture
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01C—PLANTING; SOWING; FERTILISING
- A01C21/00—Methods of fertilising, sowing or planting
- A01C21/007—Determining fertilization requirements
Definitions
- This invention relates to the field of data analysis, and more specifically, to a method and system for processing image data from unmanned aerial vehicles.
- One current method for producing crop related information that allows farmers to manage crop producing land is to use remote sensing platforms (e.g., aircraft, satellites, unmanned aerial vehicles (“UAVs”), land vehicles, etc.) that are equipped with sensors that allow raw image and telemetry data to be collected for a region-of-interest of the land as the platform passes overhead.
- For example, the raw image data may be from one or more discrete bands in the electromagnetic spectrum that are capable of yielding agricultural information.
- In particular, an indication of the amount of chlorophyll in a plant may manifest itself in the "light" reflected from the region-of-interest in one or a combination of bands.
- This raw image data for the region-of-interest may be subsequently processed to produce maps that provide the farmer with information that may be used to manage crop production. For example, a map that shows chlorophyll concentrations over the region-of-interest may be used by the farmer to identify portions of the land that are under producing relative to other portions of the land and take appropriate action.
- Aerial inspection, using aircraft, satellites, UAVs, etc., allows the farmer to inspect large tracts of land from an advantaged overhead position in a short period of time.
- However, one problem with this approach is that it often requires the farmer to interpret the sensor data gathered.
- Such sensor data is often complex in nature and actionable interpretation of the data can be difficult for those not skilled in the required techniques.
- a method for processing image data for a region-of-interest comprising: receiving the image data and telemetry data pertaining to the image data at a server; receiving user data pertaining to the region-of-interest at the server; using a processor, selecting an algorithm for processing the image data from a library of algorithms using the telemetry data and the user data; and, applying the algorithm to the image data to generate output data.
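The claimed four-step method can be pictured as a small server-side pipeline: receive image and telemetry data, receive user data, select a matching algorithm from the library, and apply it. The sketch below is illustrative only; the function names, field names, and library schema (`process_survey`, `requires`, `run`, etc.) are assumptions, not part of the disclosure.

```python
# Illustrative sketch of the claimed method. All names are hypothetical;
# the patent does not specify an API or data schema.

def select_algorithm(library, telemetry, user_data):
    """Pick the first algorithm whose declared requirements match the survey."""
    criteria = {**telemetry, **user_data}
    for algo in library:
        if all(criteria.get(k) == v for k, v in algo["requires"].items()):
            return algo
    raise LookupError("no matching algorithm in library")

def process_survey(library, image_data, telemetry, user_data):
    algo = select_algorithm(library, telemetry, user_data)
    return algo["run"](image_data)   # output data, e.g., a stand count

# Usage with a toy one-entry library and a tiny binary "image":
library = [{
    "name": "corn_stand_count_v2",
    "requires": {"sensor_type": "RGB", "crop": "corn", "goal": "stand count"},
    "run": lambda img: {"stand_count": sum(sum(row) for row in img)},
}]
telemetry = {"sensor_type": "RGB", "altitude_m": 50}
user_data = {"crop": "corn", "goal": "stand count"}
result = process_survey(library, [[1, 0], [1, 1]], telemetry, user_data)
print(result)  # {'stand_count': 3}
```

A real library entry would reference an image-processing workflow rather than a lambda, but the control flow — criteria in, selected algorithm, output data out — follows the claim.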
- an apparatus such as a data processing system, a method for adapting same, as well as articles of manufacture such as a computer readable medium or product and computer program product or software product (e.g., comprising a non-transitory medium) having program instructions recorded thereon for practising the method of the invention.
- FIG. 1 is a block diagram illustrating a data processing system in accordance with an embodiment of the invention
- FIG. 2 is a block diagram illustrating a data analysis system in accordance with an embodiment of the invention.
- FIG. 3 is a circle diagram illustrating the data analysis system of FIG. 2 in accordance with an embodiment of the invention.
- FIG. 4 is a flow chart illustrating operations of modules within a data processing system for processing image data for a region-of-interest, in accordance with an embodiment of the invention.
- the term "data processing system" is used herein to refer to any machine for processing data, including the computer systems, wireless devices, and network arrangements described herein.
- the present invention may be implemented in any computer programming language provided that the operating system of the data processing system provides the facilities that may support the requirements of the present invention. Any limitations presented would be a result of a particular type of operating system or computer programming language and would not be a limitation of the present invention.
- the present invention may also be implemented in hardware or in a combination of hardware and software.
- the present invention provides a method and system for selecting and executing remote sensing and geographic information system processes (e.g., relating to image processing, etc.) from a library of algorithms using user inputs and sensor telemetry.
- the present invention provides a user-centered system and cloud-based method for automating analysis of multi-sensor data from UAVs.
- FIG. 1 is a block diagram illustrating a data processing system 300 in accordance with an embodiment of the invention.
- the data processing system 300 is suitable for data processing, management, storage, and for generating, displaying, and adjusting data presentations in conjunction with a user interface or a graphical user interface ("GUI"), as described below.
- the data processing system 300 may be a client and/or server in a client/server system (e.g., 110, 120, 130).
- the data processing system 300 may be a server system or a personal computer (“PC”) system.
- the data processing system 300 may also be a wireless device or other mobile, portable, or handheld device.
- the data processing system 300 may also be a distributed system which is deployed across multiple processors.
- the data processing system 300 may also be a virtual machine.
- the data processing system 300 includes an input device 310, at least one central processing unit (“CPU") 320, memory 330, a display 340, and an interface device 350.
- the input device 310 may include a keyboard, a mouse, a trackball, a touch sensitive surface or screen, a position tracking device, an eye tracking device, or a similar device.
- the display 340 may include a computer screen, television screen, display screen, terminal device, a touch sensitive display surface or screen, or a hardcopy producing output device such as a printer or plotter.
- the memory 330 may include a variety of storage devices including internal memory and external mass storage typically arranged in a hierarchy of storage as understood by those skilled in the art.
- the memory 330 may include databases, random access memory (“RAM”), read-only memory (“ROM”), flash memory, and/or disk devices.
- the interface device 350 may include one or more network connections.
- the data processing system 300 may be adapted for communicating with other data processing systems (e.g., similar to data processing system 300) over a network 351 via the interface device 350.
- the interface device 350 may include an interface to a network 351 such as the Internet and/or another wired or wireless network (e.g., a wireless local area network (“WLAN”), a cellular telephone network, etc.).
- the interface 350 may include suitable transmitters, receivers, antennae, etc.
- the data processing system 300 may include a Global Positioning System ("GPS”) receiver.
- the data processing system 300 may be linked to other data processing systems by the network 351.
- the CPU 320 may include or be operatively coupled to dedicated coprocessors, memory devices, or other hardware modules 321.
- the CPU 320 is operatively coupled to the memory 330 which stores an operating system (e.g., 331) for general management of the system 300.
- the CPU 320 is operatively coupled to the input device 310 for receiving user commands or queries and for displaying the results of these commands or queries to the user on the display 340. Commands and queries may also be received via the interface device 350 and results may be transmitted via the interface device 350.
- the data processing system 300 may include a data store or database system 332 for storing data and programming information.
- the database system 332 may include a database management system (e.g., 332) and a database (e.g., 332) and may be stored in the memory 330 of the data processing system 300.
- the data processing system 300 has stored therein data representing sequences of instructions which when executed cause the method described herein to be performed.
- the data processing system 300 may contain additional software and hardware, a description of which is not necessary for understanding the invention.
- the data processing system 300 includes computer executable programmed instructions for directing the system 300 to implement the embodiments of the present invention.
- the programmed instructions may be embodied in one or more hardware modules 321 or software modules 331 resident in the memory 330 of the data processing system 300 or elsewhere (e.g., 320).
- the programmed instructions may be embodied on a computer readable medium or product (e.g., one or more digital video disks ("DVDs”), compact disks ("CDs”), memory sticks, etc.) which may be used for transporting the programmed instructions to the memory 330 of the data processing system 300.
- the programmed instructions may be embedded in a computer- readable signal or signal-bearing medium or product that is uploaded to a network 351 by a vendor or supplier of the programmed instructions, and this signal or signal-bearing medium or product may be downloaded through an interface (e.g., 350) to the data processing system 300 from the network 351 by end users or potential buyers.
- a user may interact with the data processing system 300 and its hardware and software modules 321, 331 using a user interface such as a graphical user interface (“GUI") 380 (and related modules 321, 331).
- the GUI 380 may be used for monitoring, managing, and accessing the data processing system 300.
- GUIs are supported by common operating systems and provide a display format which enables a user to choose commands, execute application programs, manage computer files, and perform other functions by selecting pictorial representations known as icons, or items from a menu through use of an input device 310 such as a mouse.
- a GUI is used to convey information to and receive commands from users and generally includes a variety of GUI objects or controls, including icons, toolbars, drop-down menus, text, dialog boxes, buttons, and the like.
- a user typically interacts with a GUI 380 presented on a display 340 by using an input device (e.g., a mouse) 310 to position a pointer or cursor 390 over an object (e.g., an icon) 391 and by selecting or "clicking" on the object 391.
- a GUI based system presents application, system status, and other information to the user in one or more "windows" appearing on the display 340.
- a window 392 is a more or less rectangular area within the display 340 in which a user may view an application or a document. Such a window 392 may be open, closed, displayed full screen, reduced to an icon, increased or reduced in size, or moved to different areas of the display 340. Multiple windows may be displayed simultaneously, such as: windows included within other windows, windows overlapping other windows, or windows tiled within the display area.
- FIG. 2 is a block diagram illustrating a data analysis system 100 in accordance with an embodiment of the invention.
- FIG. 3 is a circle diagram illustrating the data analysis system 100 of FIG. 2 in accordance with an embodiment of the invention.
- the system 100 includes an unmanned aerial vehicle ("UAV") 110 which may be communicatively coupled to a server 120 over a network 351.
- the server 120 in turn may be communicatively coupled to a user device 130 over a network 351.
- Each of the data analysis system 100, UAV 110, server 120, and user device 130 may be or include a data processing system 300 or elements of such a system 300.
- the UAV 110 may include at least one sensor (e.g., a camera, etc.) 111 for capturing raw data (e.g., image data 210, sensor telemetry data 211, etc.) of a region-of-interest 201 in a farm field 200, for example.
- the server 120 may include an application or module 331 for generating output data 220 (e.g., stand count, geo-referenced data, etc.) using the raw data 210, 211.
- the user device 130 may include an application or module 331 for generating a presentation for display on a display 340 of the user device 130 using the output data 220 from the server 120.
- the UAV 110 collects large amounts of raw data 210, 211 when surveying agricultural sites (e.g., a field 200 or region-of-interest 201 therein) and this data is processed into formats that are useful to the user.
- the exact format of the output data 220 varies depending on the type of sensor 111 that is used by the UAV 110 and on the needs of the user.
- the raw data 210, 211 may include image data 210 and sensor telemetry data 211.
- the sensor telemetry data 211 may include sensor type, location of image, time image was taken, etc.
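A telemetry record of this kind might be modeled as a small structure. The field names below are assumptions chosen for illustration; the patent lists only the kinds of values carried (sensor type, location of the image, time the image was taken).

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

# Hypothetical shape for one telemetry record accompanying a captured image.
@dataclass
class TelemetryRecord:
    sensor_type: str            # e.g., "RGB", "multispectral", "thermal"
    latitude: float             # location where the image was captured
    longitude: float
    captured_at: datetime       # time the image was taken
    altitude_m: Optional[float] = None

# Values loosely following the example survey in the text (RGB sensor,
# 50 m altitude, western Iowa, June); coordinates are made up.
rec = TelemetryRecord(
    sensor_type="RGB",
    latitude=42.03, longitude=-95.85,
    captured_at=datetime(2014, 6, 12, tzinfo=timezone.utc),
    altitude_m=50.0,
)
print(rec.sensor_type, rec.altitude_m)  # RGB 50.0
```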
- the present invention provides for extracting intelligence from remotely sensed imagery and/or telemetry data automatically by non-technical users using cloud-based technology.
- the present invention provides a user-centered data analysis system 100 that makes use of sensor image 210, telemetry data 211, and/or user input data 212 to recommend or select an applicable algorithm (or algorithms) 215 stored in an algorithm library 214, to automatically process the image data 210 using the selected algorithm 215, and to return the analytical result or output data 220 to the user.
- the user device 130 may be used by an agriculture user or farmer who is interested in obtaining a stand count for a corn field 200 or region- of-interest 201 therein.
- the user conducts a UAV survey of the field 200 at an altitude of 50 meters using an RGB sensor 111 mounted in the UAV 110.
- The corn field 200 may be in the V2 growth stage, the month may be June, and the field 200 may be located in western Iowa, for example.
- the raw image data 210 and telemetry data 211 which may include data relating to the sensor used (e.g., RGB sensor 111) are uploaded from the UAV 110 to an online software application or module 331 (e.g., PrecisionMapperTM) maintained by the server 120.
- the server 120 may be a cloud-based server having analysis engine applications or modules 331.
- the server 120 requests information or user data 212 pertaining to the survey from the user via the user device 130.
- the user may be asked for information pertaining to the region-of-interest 201, the field 200, the goal of the survey, the user's industry, the crop type, the growth stage, etc.
- the user data 212 is uploaded to the server 120 from the user device 130.
- the server 120 selects an appropriate algorithm 215 to perform the stand count from an online algorithm library 214 which may store thousands of image processing and analysis workflows.
- the algorithm library 214 may be stored in the server 120.
- the appropriate algorithm 215 may be selected using the following sensor-specific 211 and user-provided data 212: industry (“Agriculture”), goal (“Stand Count”), crop type ("Corn”), growth stage (“V2”), sensor type ("RGB”), season (“Summer”), location (“western Iowa”), and altitude (“50 m”).
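The "recommend or select" behavior over these attributes could plausibly be realized as a ranking: score each library entry by how many of its declared tags the survey's combined sensor-specific and user-provided attributes satisfy, and surface the best matches. The tag schema below is hypothetical; the patent names the attribute kinds, not a storage format.

```python
# Sketch: rank algorithm-library entries against the survey's attributes.
# "tags" is an assumed schema for the metadata each entry declares.

def recommend(library, attributes, top=3):
    """Return the library entries best matched by the survey attributes."""
    def score(entry):
        return sum(attributes.get(k) == v for k, v in entry["tags"].items())
    return sorted(library, key=score, reverse=True)[:top]

library = [
    {"name": "corn_stand_count",
     "tags": {"industry": "Agriculture", "goal": "Stand Count",
              "crop type": "Corn", "growth stage": "V2", "sensor type": "RGB"}},
    {"name": "soybean_weed_map",
     "tags": {"industry": "Agriculture", "goal": "Weed Map",
              "crop type": "Soybean"}},
]
# The example survey's attributes, as listed in the text:
survey = {"industry": "Agriculture", "goal": "Stand Count", "crop type": "Corn",
          "growth stage": "V2", "sensor type": "RGB", "season": "Summer",
          "location": "western Iowa", "altitude": "50 m"}
print(recommend(library, survey, top=1)[0]["name"])  # corn_stand_count
```

An exact-match filter (all tags required) is the stricter variant; ranking degrades gracefully when no entry matches every attribute.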
- the server 120 processes the image data 210 using the selected algorithm 215 to generate information products, output data, or a stand count 220.
- the stand count 220 is then returned to the user device 130 for presentation to the user via the online software application or module 331 (e.g., PrecisionMapperTM).
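A stand count algorithm of the kind described might, in its simplest form, segment vegetation pixels and count connected components, treating each blob as one emerging plant. The sketch below is a toy version operating on an already-segmented binary mask with a pure-Python flood fill; a production workflow would segment georeferenced orthomosaic imagery first.

```python
# Toy stand count: count 4-connected groups of "plant" pixels (1s) in a
# binary vegetation mask. Assumes segmentation has already produced the mask.

def stand_count(mask):
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                count += 1                      # new plant found
                stack = [(r, c)]                # flood-fill its pixels
                while stack:
                    y, x = stack.pop()
                    if 0 <= y < rows and 0 <= x < cols and mask[y][x] and not seen[y][x]:
                        seen[y][x] = True
                        stack.extend([(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)])
    return count

# Two separate blobs -> a stand count of 2.
mask = [
    [1, 1, 0, 0],
    [0, 0, 0, 1],
    [0, 0, 1, 1],
]
print(stand_count(mask))  # 2
```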
- the algorithms in the algorithm library 214 may be selected and/or purchased by a user from an online algorithm marketplace application or module 218.
- the online algorithm marketplace 218 may be stored in or accessible from the server 120 via the online software application or module 331 (e.g., PrecisionMapperTM).
- the algorithm marketplace 218 may function as a central database or online community for algorithms 214 for processing UAV-captured data 210. This online or collective community may be used by users, farmers, growers or independent crop consultants to exchange algorithms 214 to be run on agronomic data 210.
- the algorithm marketplace 218 may function as an application or "app" store for algorithms 214.
- An administrator and administrator application or module 140 may be associated with the algorithm marketplace 218 for administrating the algorithm marketplace 218.
- the administrator application or module 140 may be used to conduct offline transactions with algorithm providers to populate the algorithm marketplace 218 with algorithms 214.
- the image data 210 and sensor telemetry data 211 may be received from a UAV 110, a manned aerial vehicle (e.g., a plane, helicopter, jet, etc.), a satellite, and/or a ground vehicle (e.g., a combine, tractor, truck, etc.).
- a user-centered method for generating actionable geospatial intelligence 220 of an area or region-of-interest 201, 200 through the selection of analysis techniques or algorithms 215 from an e-commerce website 218, comprising: software-based filtering of algorithms 214 by sensor telemetry data (e.g., location, time, date, sensor type, resolution, etc.) 211 and user input data (e.g., industry, desired result, existing data, etc.) 212; generating analytical results 220 in the form of image data and location-aware structured data from the selected algorithm 215 using raw image data 210 at the server 120; and overlaying the resultant data 220 on live image data to provide business decision support intelligence 220 to the user on a wide variety of Internet-capable devices 130.
- the above embodiments may contribute to an improved method and system 100 for processing image data 210 from UAVs 110 and may provide one or more advantages.
- the invention allows for the creation of actionable intelligence 220 by a user who is not an expert in the field of image data analysis. This helps avoid the need to have offline manual processing performed on image data 210 and/or sensor telemetry data 211 by experts in the field of image and telemetry data analysis.
- the invention provides a method and system for selecting and executing remote sensing and geographic information system processes (e.g., relating to image processing, etc.) 215 from a library of algorithms 214 using user inputs 212 and sensor telemetry 210, 211.
- the invention provides a user-centered system 100 and cloud-based method for automating analysis of multi-sensor data 210, 211 from UAVs 110.
- FIG. 4 is a flow chart illustrating operations 400 of modules (e.g., software or hardware modules 331, 321) within a data processing system (e.g., 120, 300) for processing image data 210 for a region-of-interest 201, in accordance with an embodiment of the invention.
- At step 401, the operations 400 start.
- At step 402, the image data 210 and telemetry data 211 pertaining to the image data 210 are received at the system 120.
- At step 403, user data 212 pertaining to the region-of-interest 201 is received at the system 120.
- At step 404, an algorithm 215 for processing (or analyzing, etc.) the image data 210 is selected from a library of algorithms 214 using the telemetry data 211 and the user data 212.
- At step 405, the algorithm 215 is applied to the image data 210 to generate output data 220.
- At step 406, the operations 400 end.
- the above method may further include transmitting the output data 220 to a user device 130 for presenting on a display 340 of the user device 130.
- the image data 210 may be captured by a sensor 111 mounted in an unmanned aerial vehicle 110 when the unmanned aerial vehicle 110 is flown over the region-of-interest 201.
- the image data 210 and the telemetry data 211 may be received from the unmanned aerial vehicle 110.
- the user data 212 may be received at the system 120 from a user device 130.
- the user device 130 may be a wireless device.
- the telemetry data 211 may pertain to a sensor 111 used to capture the image data 210.
- the sensor 111 may be one or more of a visual sensor, a multispectral sensor, a hyperspectral sensor, a LiDAR sensor, and a thermal sensor.
- the region-of-interest 201 may be a farm field 200 or may be located in a farm field 200.
- the output data 220 may be a stand count for the region-of-interest 201.
- the telemetry data 211 may be a type of sensor 111 used to capture the image data 210.
- the library of algorithms 214 may be populated from an online algorithm marketplace 218.
- the online algorithm marketplace 218 may be an online application store.
- the user data 212 may further pertain to an objective for gathering the image data 210.
- each of the above steps 401-406 may be implemented by a respective software module 331. According to another embodiment, each of the above steps 401-406 may be implemented by a respective hardware module 321. According to another embodiment, each of the above steps 401-406 may be implemented by a combination of software 331 and hardware modules 321.
- FIG. 4 may represent a block diagram illustrating the interconnection of specific hardware modules 401-406 (collectively 321) within a data processing system 300, each hardware module 401-406 adapted or configured to implement a respective step of the method of the invention.
- sequences of instructions which when executed cause the method described herein to be performed by the data processing system 300 may be contained in a data carrier product according to one embodiment of the invention.
- This data carrier product may be loaded into and run by the data processing system 300.
- sequences of instructions which when executed cause the method described herein to be performed by the data processing system 300 may be contained in a computer software product or computer program product (e.g., comprising a non- transitory medium) according to one embodiment of the invention.
- This computer software product or computer program product may be loaded into and run by the data processing system 300.
- sequences of instructions which when executed cause the method described herein to be performed by the data processing system 300 may be contained in an integrated circuit product (e.g., a hardware module or modules 321) which may include a coprocessor or memory according to one embodiment of the invention.
- This integrated circuit product may be installed in the data processing system 300.
Abstract
A method for processing image data for a region-of-interest, comprising: receiving the image data and telemetry data pertaining to the image data at a server; receiving user data pertaining to the region-of-interest at the server; using a processor, selecting an algorithm for processing the image data from a library of algorithms using the telemetry data and the user data; and, applying the algorithm to the image data to generate output data.
Description
METHOD AND SYSTEM FOR PROCESSING IMAGE DATA FROM UNMANNED
AERIAL VEHICLES
[0001] This application claims priority from and the benefit of the filing date of United States Provisional Patent Application No. 62/011,123, filed June 12, 2014, and the entire content of such application is incorporated herein by reference.
FIELD OF THE INVENTION
[0002] This invention relates to the field of data analysis, and more specifically, to a method and system for processing image data from unmanned aerial vehicles.
BACKGROUND OF THE INVENTION
[0003] One current method for producing crop related information that allows farmers to manage crop producing land is to use remote sensing platforms (e.g., aircraft, satellites, unmanned aerial vehicles ("UAVs"), land vehicles, etc.) that are equipped with sensors that allow raw image and telemetry data to be collected for a region-of-interest of the land as the platform passes overhead. For example, the raw image data may be from one or more discrete bands in the electromagnetic spectrum that are capable of yielding agricultural information. In particular, an indication of the amount of chlorophyll in a plant may manifest itself in the "light" reflected from the region-of-interest in one or a combination of bands. This raw image data for the region-of-interest may be subsequently processed to produce maps that provide the farmer with information that may be used to manage crop production. For example, a map that shows chlorophyll concentrations over the region-of-interest may be used by the farmer to identify portions of the land that are under producing relative to other portions of the land and take appropriate action.
[0004] Aerial inspection, using aircraft, satellites, UAVs, etc., allows the farmer to inspect large tracts of land from an advantaged overhead position in a short period of time. However, one problem with this approach is that it often requires the farmer to interpret the sensor data gathered. Such sensor data is often complex in nature and actionable interpretation of the data can be difficult for those not skilled in the required techniques.
[0005] A need therefore exists for an improved method and system for processing image data from unmanned aerial vehicles. Accordingly, a solution that addresses, at least in part, the above and other shortcomings is desired.
SUMMARY OF THE INVENTION
[0006] According to one aspect of the invention, there is provided a method for processing image data for a region-of-interest, comprising: receiving the image data and telemetry data pertaining to the image data at a server; receiving user data pertaining to the region-of-interest at the server; using a processor, selecting an algorithm for processing the image data from a library of algorithms using the telemetry data and the user data; and, applying the algorithm to the image data to generate output data.
[0007] In accordance with further aspects of the invention, there is provided an apparatus such as a data processing system, a method for adapting same, as well as articles of manufacture such as a computer readable medium or product and computer program product or software product (e.g., comprising a non-transitory medium) having program instructions recorded thereon for practising the method of the invention.
BRIEF DESCRIPTION OF THE DRAWINGS
[0008] Further features and advantages of the embodiments of the present invention will become apparent from the following detailed description, taken in combination with the appended drawings, in which: [0009] FIG. 1 is a block diagram illustrating a data processing system in accordance with an embodiment of the invention;
[0010] FIG. 2 is a block diagram illustrating a data analysis system in accordance with an embodiment of the invention;
[0011] FIG. 3 is a circle diagram illustrating the data analysis system of FIG. 2 in accordance with an embodiment of the invention; and,
[0012] FIG. 4 is a flow chart illustrating operations of modules within a data processing system for processing image data for a region-of-interest, in accordance with an embodiment of the invention.
[0013] It will be noted that throughout the appended drawings, like features are identified by like reference numerals.
DETAILED DESCRIPTION OF THE EXEMPLARY EMBODIMENTS
[0014] In the following description, details are set forth to provide an understanding of the invention. In some instances, certain software, circuits, structures and methods have not been described or shown in detail in order not to obscure the invention. The term "data processing system" is used herein to refer to any machine for processing data, including the computer systems, wireless devices, and network arrangements described herein. The present invention may be implemented in any computer programming language provided that the operating system of the data processing system provides the facilities that may support the requirements of the present invention. Any limitations presented would be a result of a particular type of operating system or computer programming language and would not be a limitation of the present invention. The present invention may also be implemented in hardware or in a combination of hardware and software.
[0015] The present invention provides a method and system for selecting and executing remote sensing and geographic information system processes (e.g., relating to image processing, etc.) from a library of algorithms using user inputs and sensor telemetry. In addition, the present invention provides a user-centered system and cloud-based method for automating analysis of multi-sensor data from UAVs.
[0016] FIG. 1 is a block diagram illustrating a data processing system 300 in accordance with an embodiment of the invention. The data processing system 300 is suitable for data processing, management, storage, and for generating, displaying, and adjusting data presentations in conjunction with a user interface or a graphical user interface ("GUI"), as described below. The data processing system 300 may be a client and/or server in a client/server system (e.g., 110, 120, 130). For example, the data processing system 300 may be a server system or a personal computer ("PC") system. The data processing system 300 may also be a wireless device or other mobile, portable, or handheld device. The data processing system 300 may also be a distributed system which is deployed across
multiple processors. The data processing system 300 may also be a virtual machine. The data processing system 300 includes an input device 310, at least one central processing unit ("CPU") 320, memory 330, a display 340, and an interface device 350. The input device 310 may include a keyboard, a mouse, a trackball, a touch sensitive surface or screen, a position tracking device, an eye tracking device, or a similar device. The display 340 may include a computer screen, television screen, display screen, terminal device, a touch sensitive display surface or screen, or a hardcopy producing output device such as a printer or plotter. The memory 330 may include a variety of storage devices including internal memory and external mass storage typically arranged in a hierarchy of storage as understood by those skilled in the art. For example, the memory 330 may include databases, random access memory ("RAM"), read-only memory ("ROM"), flash memory, and/or disk devices. The interface device 350 may include one or more network connections. The data processing system 300 may be adapted for communicating with other data processing systems (e.g., similar to data processing system 300) over a network 351 via the interface device 350. For example, the interface device 350 may include an interface to a network 351 such as the Internet and/or another wired or wireless network (e.g., a wireless local area network ("WLAN"), a cellular telephone network, etc.). As such, the interface 350 may include suitable transmitters, receivers, antennae, etc. In addition, the data processing system 300 may include a Global Positioning System ("GPS") receiver. Thus, the data processing system 300 may be linked to other data processing systems by the network 351. The CPU 320 may include or be operatively coupled to dedicated coprocessors, memory devices, or other hardware modules 321. 
The CPU 320 is operatively coupled to the memory 330 which stores an operating system (e.g., 331) for general management of the system 300. The CPU 320 is operatively coupled to the input device 310 for receiving user commands or queries and for displaying the results of these commands or queries to the user on the display 340. Commands and queries may also be received via the interface device 350 and results may be transmitted via the interface device 350. The data processing system 300 may include a data store or database system 332 for storing data and programming information. The database system 332 may include a database management system (e.g., 332) and a database (e.g., 332) and may be stored in the memory 330 of the data processing system 300. In general, the data processing system 300 has stored therein data representing sequences of instructions which when executed cause the method described herein to be performed. Of course, the data processing system 300 may contain
additional software and hardware, a description of which is not necessary for understanding the invention.
[0017] Thus, the data processing system 300 includes computer executable programmed instructions for directing the system 300 to implement the embodiments of the present invention. The programmed instructions may be embodied in one or more hardware modules 321 or software modules 331 resident in the memory 330 of the data processing system 300 or elsewhere (e.g., 320). Alternatively, the programmed instructions may be embodied on a computer readable medium or product (e.g., one or more digital video disks ("DVDs"), compact disks ("CDs"), memory sticks, etc.) which may be used for transporting the programmed instructions to the memory 330 of the data processing system 300. Alternatively, the programmed instructions may be embedded in a computer-readable signal or signal-bearing medium or product that is uploaded to a network 351 by a vendor or supplier of the programmed instructions, and this signal or signal-bearing medium or product may be downloaded through an interface (e.g., 350) to the data processing system 300 from the network 351 by end users or potential buyers.

[0018] A user may interact with the data processing system 300 and its hardware and software modules 321, 331 using a user interface such as a graphical user interface ("GUI") 380 (and related modules 321, 331). The GUI 380 may be used for monitoring, managing, and accessing the data processing system 300. GUIs are supported by common operating systems and provide a display format which enables a user to choose commands, execute application programs, manage computer files, and perform other functions by selecting pictorial representations known as icons, or items from a menu through use of an input device 310 such as a mouse. In general, a GUI is used to convey information to and receive commands from users and generally includes a variety of GUI objects or controls, including icons, toolbars, drop-down menus, text, dialog boxes, buttons, and the like.
A user typically interacts with a GUI 380 presented on a display 340 by using an input device (e.g., a mouse) 310 to position a pointer or cursor 390 over an object (e.g., an icon) 391 and by selecting or "clicking" on the object 391. Typically, a GUI based system presents application, system status, and other information to the user in one or more "windows" appearing on the display 340. A window 392 is a more or less rectangular area within the display 340 in which a user may view an application or a document. Such a window 392 may be open, closed, displayed full screen, reduced to an icon, increased or reduced in size, or moved to different areas of the display 340. Multiple
windows may be displayed simultaneously, such as: windows included within other windows, windows overlapping other windows, or windows tiled within the display area.
[0019] FIG. 2 is a block diagram illustrating a data analysis system 100 in accordance with an embodiment of the invention. And, FIG. 3 is a circle diagram illustrating the data analysis system 100 of FIG. 2 in accordance with an embodiment of the invention. The system 100 includes an unmanned aerial vehicle ("UAV") 110 which may be communicatively coupled to a server 120 over a network 351. The server 120 in turn may be communicatively coupled to a user device 130 over a network 351. Each of the data analysis system 100, UAV 110, server 120, and user device 130 may be or include a data processing system 300 or elements of such a system 300. The UAV 110 may include at least one sensor (e.g., a camera, etc.) 111 for capturing raw data (e.g., image data 210, sensor telemetry data 211, etc.) of a region-of-interest 201 in a farm field 200, for example. The server 120 may include an application or module 331 for generating output data 220 (e.g., stand count, geo-referenced data, etc.) using the raw data 210, 211. The user device 130 may include an application or module 331 for generating a presentation for display on a display 340 of the user device 130 using the output data 220 from the server 120. The UAV 110 collects large amounts of raw data 210, 211 when surveying agricultural sites (e.g., a field 200 or region-of-interest 201 therein) and this data is processed into formats that are useful to the user. The exact format of the output data 220 varies depending on the type of sensor 111 that is used by the UAV 110 and on the needs of the user. The raw data 210, 211 may include image data 210 and sensor telemetry data 211. The sensor telemetry data 211 may include sensor type, location of image, time image was taken, etc.
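The pairing of raw image data 210 with its accompanying sensor telemetry 211 described above can be sketched as a simple data structure. This is an illustrative sketch only; the field names, types, and sample values below are assumptions for the example and are not part of the disclosed system.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class SensorTelemetry:
    """Telemetry accompanying a capture (211): sensor type, location, time."""
    sensor_type: str      # e.g. "RGB", "multispectral", "LiDAR"
    latitude: float       # location where the image was captured
    longitude: float
    altitude_m: float     # survey altitude in meters
    captured_at: datetime # time the image was taken

@dataclass
class RawCapture:
    """One raw capture: image data (210) plus its telemetry (211)."""
    image_path: str
    telemetry: SensorTelemetry

# Hypothetical example capture from a 50 m RGB survey.
capture = RawCapture(
    image_path="survey_0001.tif",
    telemetry=SensorTelemetry("RGB", 41.6, -95.9, 50.0,
                              datetime(2014, 6, 12, 10, 30)),
)
```

A server-side analysis engine could then inspect `capture.telemetry` to drive downstream processing decisions without opening the image itself.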
[0020] Advantageously, the present invention provides for extracting intelligence from remotely sensed imagery and/or telemetry data automatically by non-technical users using cloud-based technology. The present invention provides a user-centered data analysis system 100 that makes use of sensor image 210, telemetry data 211, and/or user input data 212 to recommend or select an applicable algorithm (or algorithms) 215 stored in an algorithm library 214, to automatically process the image data 210 using the selected algorithm 215, and to return the analytical result or output data 220 to the user.
[0021] In operation, according to one example embodiment, the user device 130 may be used by an agriculture user or farmer who is interested in obtaining a stand count for a corn field 200 or region- of-interest 201 therein. The user conducts an UAV survey of the field 200 at an altitude of 50 meters using an RGB sensor 111 mounted in the UAV 110. The corn field 200 may be in the V2 growth stage, the month may be June, and the field 200 may be located in western Iowa, for example. The raw image data 210 and telemetry data 211 which may include data relating to the sensor used (e.g., RGB sensor 111) are uploaded from the UAV 110 to an online software application or module 331 (e.g., PrecisionMapper™) maintained by the server 120. The server 120 may be a cloud-based server having analysis engine applications or modules 331. The server 120 requests information or user data 212 pertaining to the survey from the user via the user device 130. For example, the user may be asked for information pertaining to the region-of-interest 201, the field 200, the goal of the survey, the user's industry, the crop type, the growth stage, etc. The user data 212 is uploaded to the server 120 from the user device 130. Using one or more of the image data 210, telemetry data 211, and user data 212, the server 120 selects an appropriate algorithm 215 to perform the stand count from an online algorithm library 214 which may store thousands of image processing and analysis workflows. The algorithm library 214 may be stored in the server 120. For example, the appropriate algorithm 215 may be selected using the following sensor-specific 211 and user-provided data 212: industry ("Agriculture"), goal ("Stand Count"), crop type ("Corn"), growth stage ("V2"), sensor type ("RGB"), season ("Summer"), location ("western Iowa"), and altitude ("50 m"). The server 120 processes the image data 210 using the selected algorithm 215 to generate information products, output data, or a stand count 220.
The stand count 220 is then returned to the user device 130 for presentation to the user via the online software application or module 331 (e.g., PrecisionMapper™).
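The metadata-driven selection in the example above — matching industry, goal, crop type, growth stage, sensor type, and altitude against entries in the algorithm library 214 — can be sketched as a filter over tagged library entries. The entries, field names, and matching rules below are hypothetical illustrations, not the actual PrecisionMapper™ implementation.

```python
# Hypothetical algorithm library (214): each entry declares the survey
# metadata it is applicable to.
ALGORITHM_LIBRARY = [
    {"name": "corn_stand_count_v2_rgb",
     "industry": "Agriculture", "goal": "Stand Count", "crop": "Corn",
     "growth_stages": {"V1", "V2", "V3"}, "sensor": "RGB",
     "max_altitude_m": 80},
    {"name": "wheat_ndvi_vigor",
     "industry": "Agriculture", "goal": "Vigor Map", "crop": "Wheat",
     "growth_stages": {"tillering"}, "sensor": "multispectral",
     "max_altitude_m": 120},
]

def select_algorithm(telemetry, user_data):
    """Filter the library (214) by telemetry (211) and user data (212)."""
    return [
        a for a in ALGORITHM_LIBRARY
        if a["industry"] == user_data["industry"]
        and a["goal"] == user_data["goal"]
        and a["crop"] == user_data["crop"]
        and user_data["growth_stage"] in a["growth_stages"]
        and a["sensor"] == telemetry["sensor_type"]
        and telemetry["altitude_m"] <= a["max_altitude_m"]
    ]

# The worked example: a 50 m RGB survey of V2 corn for a stand count.
survey_telemetry = {"sensor_type": "RGB", "altitude_m": 50}
survey_user_data = {"industry": "Agriculture", "goal": "Stand Count",
                    "crop": "Corn", "growth_stage": "V2"}
print(select_algorithm(survey_telemetry, survey_user_data)[0]["name"])
# prints corn_stand_count_v2_rgb
```

A production library would likely score candidates rather than require exact matches, but the principle — narrowing thousands of workflows down to one using sensor and user metadata — is the same.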
[0022] According to one embodiment, the algorithms in the algorithm library 214 may be selected and/or purchased by a user from an online algorithm marketplace application or module 218. The online algorithm marketplace 218 may be stored in or accessible from the server 120 via the online software application or module 331 (e.g., PrecisionMapper™). The algorithm marketplace 218 may function as a central database or online community for algorithms 214 for processing UAV-captured data 210. This online or collective community may be used by users, farmers, growers or independent crop consultants to exchange algorithms 214 to be run on agronomic data 210. The algorithm marketplace 218 may function as an application or "app" store for algorithms 214. An administrator and administrator application or module 140 may be associated with the algorithm
marketplace 218 for administrating the algorithm marketplace 218. The administrator application or module 140 may be used to conduct offline transactions with algorithm providers to populate the algorithm marketplace 218 with algorithms 214.
[0023] According to one embodiment, the image data 210 and sensor telemetry data 211 may be received from an UAV 110, a manned aerial vehicle (e.g., a plane, helicopter, jet, etc.), a satellite, and/or a ground vehicle (e.g., a combine, tractor, truck, etc.).
[0024] According to one embodiment, there is provided a user-centered method for generating actionable geospatial intelligence 220 of an area or region-of-interest 201, 200 through the selection of analysis techniques or algorithms 215 from an e-commerce website 218, comprising: software-based filtering of algorithms 214 by sensor telemetry data (e.g., location, time, date, sensor type, resolution, etc.) 211 and user input data (e.g., industry, desired result, existing data, etc.) 212; generating analytical results 220 in the form of image data and location-aware structured data from the selected algorithm 215 using raw image data 210 at the server 120; and overlaying the resultant data 220 on live image data to provide business decision support intelligence 220 to the user on a wide variety of Internet-capable devices 130.
[0025] The above embodiments may contribute to an improved method and system 100 for processing image data 210 from UAVs 110 and may provide one or more advantages. First, the invention allows for the creation of actionable intelligence 220 by a user who is not an expert in the field of image data analysis. This helps avoid the need to have offline manual processing performed on image data 210 and/or sensor telemetry data 211 by experts in the field of image and telemetry data analysis. Second, the invention provides a method and system for selecting and executing remote sensing and geographic information system processes (e.g., relating to image processing, etc.) 215 from a library of algorithms 214 using user inputs 212 and sensor telemetry 210, 211. Third, the invention provides a user-centered system 100 and cloud-based method for automating analysis of multi-sensor data 210, 211 from UAVs 110.
[0026] Aspects of the above described method may be summarized with the aid of a flowchart.
[0027] FIG. 4 is a flow chart illustrating operations 400 of modules (e.g., software or hardware modules 331, 321) within a data processing system (e.g., 120, 300) for processing image data 210 for a region-of-interest 201, in accordance with an embodiment of the invention.
[0028] At step 401, the operations 400 start.

[0029] At step 402, the image data 210 and telemetry data 211 pertaining to the image data 210 are received at the system 120.
[0030] At step 403, user data 212 pertaining to the region-of-interest 201 is received at the system 120.
[0031] At step 404, using a processor 320, an algorithm 215 for processing (or analyzing, etc.) the image data 210 is selected from a library of algorithms 214 using the telemetry data 211 and the user data 212.
[0032] At step 405, the algorithm 215 is applied to the image data 210 to generate output data 220.

[0033] At step 406, the operations 400 end.
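Steps 402 through 405 of the flowchart can be sketched end to end as a single function. The stub "algorithm" and its matching keys below are hypothetical illustrations; a real library entry 215 would wrap an image-processing workflow rather than a one-line counting stub.

```python
def process_region_of_interest(image_data, telemetry_data, user_data, library):
    """Steps 402-405: receive data, select an algorithm, apply it."""
    # Step 404: select the first library entry whose declared inputs
    # match the telemetry (211) and user data (212).
    for entry in library:
        if (entry["sensor"] == telemetry_data["sensor_type"]
                and entry["goal"] == user_data["goal"]):
            # Step 405: apply the selected algorithm to the image data.
            return entry["run"](image_data)
    raise LookupError("no matching algorithm in library")

# Toy library: one "algorithm" that counts already-detected plants.
library = [{"sensor": "RGB", "goal": "Stand Count",
            "run": lambda img: {"stand_count": len(img["detected_plants"])}}]

output = process_region_of_interest(
    image_data={"detected_plants": [(10, 12), (40, 35), (71, 18)]},
    telemetry_data={"sensor_type": "RGB"},
    user_data={"goal": "Stand Count"},
    library=library,
)
print(output)  # {'stand_count': 3}
```

The returned output data 220 would then be transmitted to the user device 130 for presentation, as described in the following paragraph.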
[0034] The above method may further include transmitting the output data 220 to a user device 130 for presenting on a display 340 of the user device 130. The image data 210 may be captured by a sensor 111 mounted in an unmanned aerial vehicle 110 when the unmanned aerial vehicle 110 is flown over the region-of-interest 201. The image data 210 and the telemetry data 211 may be received from the unmanned aerial vehicle 110. The user data 212 may be received at the system 120 from a user device 130. The user device 130 may be a wireless device. The telemetry data 211 may pertain to a sensor 111 used to capture the image data 210. The sensor 111 may be one or more of a visual sensor, a multispectral sensor, a hyperspectral sensor, a LiDAR sensor, and a thermal sensor. The region-of-interest 201 may be one of a farm field 200 and located in a farm field 200. The output data 220 may be a stand count for the region-of-interest 201. The telemetry data 211 may be a type of sensor 111 used to capture the image data 210. The library of algorithms 214 may be populated from an online algorithm marketplace 218. The online algorithm marketplace 218 may be an online application store. And, the user data 212 may further pertain to an objective for gathering the image data 210.
[0035] According to one embodiment, each of the above steps 401-406 may be implemented by a respective software module 331. According to another embodiment, each of the above steps 401-406 may be implemented by a respective hardware module 321. According to another embodiment, each of the above steps 401-406 may be implemented by a combination of software 331 and hardware modules 321. For example, FIG. 4 may represent a block diagram illustrating the interconnection of specific hardware modules 401-406 (collectively 321) within a data processing system 300, each hardware module 401-406 adapted or configured to implement a respective step of the method of the invention.
[0036] While this invention is primarily discussed as a method, a person of ordinary skill in the art will understand that the apparatus discussed above with reference to a data processing system 300 may be programmed to enable the practice of the method of the invention. Moreover, an article of manufacture for use with a data processing system 300, such as a pre-recorded storage device or other similar computer readable medium or computer program product including program instructions recorded thereon, may direct the data processing system 300 to facilitate the practice of the method of the invention. It is understood that such apparatus, products, and articles of manufacture also come within the scope of the invention.
[0037] In particular, the sequences of instructions which when executed cause the method described herein to be performed by the data processing system 300 may be contained in a data carrier product according to one embodiment of the invention. This data carrier product may be loaded into and run by the data processing system 300. In addition, the sequences of instructions which when executed cause the method described herein to be performed by the data processing system 300 may be contained in a computer software product or computer program product (e.g., comprising a non-transitory medium) according to one embodiment of the invention. This computer software product or computer program product may be loaded into and run by the data processing system 300. Moreover, the sequences of instructions which when executed cause the method described herein to be performed by the data processing system 300 may be contained in an integrated circuit product (e.g., a hardware module or modules 321) which may include a coprocessor or memory according to one embodiment of the invention. This integrated circuit product may be installed in the data processing system 300.
[0038] The embodiments of the invention described above are intended to be exemplary only. Those skilled in the art will understand that various modifications of detail may be made to these embodiments, all of which come within the scope of the invention.
Claims
1. A method for processing image data for a region-of-interest, comprising:
receiving the image data and telemetry data pertaining to the image data at a server;
receiving user data pertaining to the region-of-interest at the server;
using a processor, selecting an algorithm for processing the image data from a library of algorithms using the telemetry data and the user data; and,
applying the algorithm to the image data to generate output data.
2. The method of claim 1, further comprising transmitting the output data to a user device for presenting on a display of the user device.
3. The method of claim 1 wherein the image data is captured by a sensor mounted in an unmanned aerial vehicle when the unmanned aerial vehicle is flown over the region-of-interest.
4. The method of claim 3 wherein the image data and the telemetry data are received from the unmanned aerial vehicle.
5. The method of claim 1 wherein the user data is received at the server from a user device.
6. The method of claim 5 wherein the user device is a wireless device.
7. The method of claim 1 wherein the telemetry data pertains to a sensor used to capture the image data.
8. The method of claim 7 wherein the sensor is one or more of a visual sensor, a multispectral sensor, a hyperspectral sensor, a LiDAR sensor, and a thermal sensor.
9. The method of claim 1 wherein the region-of-interest is one of a farm field and located in a farm field.
10. The method of claim 9 wherein the output data is a stand count for the region-of-interest.
11. The method of claim 1 wherein the telemetry data is a type of sensor used to capture the image data.
12. The method of claim 1 wherein the library of algorithms is populated from an online algorithm marketplace.
13. The method of claim 12 wherein the online algorithm marketplace is an online application store.
14. The method of claim 1 wherein the user data further pertains to an objective for gathering the image data.
15. A system for processing sensor telemetry data for a region-of-interest, comprising:
a processor coupled to memory; and,
at least one of hardware and software modules within the memory and controlled or executed by the processor, the modules including:
a module for receiving the image data and telemetry data pertaining to the image data;
a module for receiving user data pertaining to the region-of-interest;
a module for selecting an algorithm for processing the image data from a library of algorithms using the telemetry data and the user data; and,
a module for applying the algorithm to the image data to generate output data.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201462011123P | 2014-06-12 | 2014-06-12 | |
US62/011,123 | 2014-06-12 | |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015188251A1 (en) | 2015-12-17 |
Family
ID=54832644
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CA2015/000352 WO2015188251A1 (en) | 2014-06-12 | 2015-06-01 | Method and system for processing image data from unmanned aerial vehicles |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2015188251A1 (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5467271A (en) * | 1993-12-17 | 1995-11-14 | Trw, Inc. | Mapping and analysis system for precision farming applications |
EP1450559A1 (en) * | 1996-10-31 | 2004-08-25 | Sensormatic Electronics Corporation | Intelligent video information management system |
US20090297013A1 (en) * | 2008-06-03 | 2009-12-03 | Siemens Medical Solutions Usa, Inc. | System and Method for Intelligent CAD Processing |
US8134571B2 (en) * | 2005-10-05 | 2012-03-13 | Siemens Medical Solutions Usa, Inc. | Automatic CAD algorithm selection |
US20140036054A1 (en) * | 2012-03-28 | 2014-02-06 | George Zouridakis | Methods and Software for Screening and Diagnosing Skin Lesions and Plant Diseases |
Legal Events
Date | Code | Title | Description
---|---|---|---
 | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 15806335; Country of ref document: EP; Kind code of ref document: A1
 | NENP | Non-entry into the national phase | Ref country code: DE
 | 122 | Ep: pct application non-entry in european phase | Ref document number: 15806335; Country of ref document: EP; Kind code of ref document: A1