WO2018089268A1 - Systems and methods for autonomous imaging and structural analysis - Google Patents

Systems and methods for autonomous imaging and structural analysis

Info

Publication number
WO2018089268A1
WO2018089268A1 (PCT application PCT/US2017/059990)
Authority
WO
WIPO (PCT)
Prior art keywords
uav
roof
scan
images
camera
Prior art date
Application number
PCT/US2017/059990
Other languages
English (en)
Inventor
Jim Loveland
Leif Larson
Dan Christiansen
Tad Christiansen
Cam Christiansen
Original Assignee
Loveland Innovations, LLC
Priority date
Filing date
Publication date
Priority claimed from US15/360,630 external-priority patent/US9734397B1/en
Priority claimed from US15/388,754 external-priority patent/US9823658B1/en
Priority claimed from US15/710,221 external-priority patent/US9886632B1/en
Application filed by Loveland Innovations, LLC filed Critical Loveland Innovations, LLC
Publication of WO2018089268A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00: Television systems
    • H04N 7/18: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N 7/183: Closed-circuit television [CCTV] systems for receiving images from a single remote source
    • H04N 7/185: Closed-circuit television [CCTV] systems for receiving images from a single remote source from a mobile camera, e.g. for remote control
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 11/00: Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C 11/02: Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/0094: Control of position, course, altitude or attitude involving pointing a payload, e.g. camera, weapon, sensor, towards a fixed or moving target
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10032: Satellite or aerial image; Remote sensing
    • G06T 2207/30: Subject of image; Context of image processing
    • G06T 2207/30181: Earth observation
    • G06T 2207/30184: Infrastructure

Definitions

  • This disclosure generally relates to systems and methods for autonomous property evaluations, inspections, examinations, reporting, and the like.
  • FIG. 1A illustrates a site selection interface to receive an electronic input identifying a location of a structure, according to one embodiment.
  • FIG. 1B illustrates parcel boundaries associated with the location identified in FIG. 1A, according to one embodiment.
  • FIG. 2 illustrates a boundary identification interface to receive electronic input identifying geographic boundaries of an area that includes the structure, according to one embodiment.
  • FIG. 3A illustrates a structure identification interface, according to one embodiment.
  • FIG. 3B illustrates a close-up view of the parcel boundaries and the structure identified in FIG. 3A, according to one embodiment.
  • FIG. 4 illustrates a boustrophedonic scan of a site, according to one embodiment.
  • FIG. 5 illustrates an elevation map, according to one embodiment.
  • FIG. 6A illustrates an unmanned aerial vehicle (UAV) performing a micro scan of a site, according to one embodiment.
  • FIG. 6B illustrates an elevation map of a structure to allow for micro scans or detailed scans to be performed from a consistent distance to each portion of the structure, according to one embodiment.
  • FIGS. 7A-C illustrate a loop scan and a model of a structure, according to one embodiment.
  • FIG. 8 illustrates a UAV determining a pitch of a roof, according to one embodiment.
  • FIG. 9 illustrates a UAV assessment and reporting system using the date and time to identify and/or eliminate shadows in image captures, according to one embodiment.
  • FIG. 10 illustrates a UAV assessment and reporting system for analyzing a structure, according to one embodiment.
  • FIG. 11A illustrates a navigational risk zone within which at least one navigational risk tag is located, according to one embodiment.
  • FIG. 11B illustrates example navigational risk tags associated with obstacles and elements within the navigational risk zone, according to one embodiment.
  • FIG. 11C illustrates an example of the type of information that may be associated with tags, according to one embodiment.
  • FIG. 12 illustrates a system for property analysis including a property analysis library for computer vision matching, according to one embodiment.
  • FIG. 13 illustrates examples of possible library images of a property analysis library, according to one embodiment.
  • FIG. 14A illustrates an example of a planview map created or imported for use by an autonomous vehicle.
  • FIG. 14B illustrates the detection of pathways for generating perimeters and the detection of standard symbols, according to one embodiment.
  • FIG. 14C illustrates an example of regions being zoned for a particular analysis and/or sensor utilization, according to one embodiment.
  • FIG. 14D illustrates an internal boustrophedonic navigation path generated through an analysis of an existing floor plan, according to one embodiment.
  • FIG. 14E illustrates an embodiment in which multiple UAVs are used to analyze various rooms of a facility and/or communicate with each other and/or relay communication with a host or control device, according to various embodiments.
  • FIG. 14F illustrates an embodiment in which a UAV swarm is used to analyze a room of a facility, according to one embodiment.
  • FIG. 15 illustrates a UAV scanning a wall to identify deviations and/or confirm that the wall is plumb, according to one embodiment.
  • FIG. 16A illustrates a UAV scanning in a vertical direction from multiple locations to identify deviations in the ceiling, such as bowing or sagging, according to one embodiment.
  • FIG. 16B illustrates a UAV scanning a ceiling to identify paint chipping, according to one embodiment.
  • FIG. 16C illustrates a UAV scanning a wall to identify deteriorating sheetrock, according to one embodiment.
  • FIG. 16D illustrates a UAV scanning a wall using at least one of infrared and ultrasonic sensors to detect structural straps on studs within a wall, according to one embodiment.
  • FIG. 17 illustrates a comparative geometry analysis to identify structural defects in a door frame, according to one embodiment.
  • FIG. 18 illustrates the use of a single sensor or an array of sensors in a UAV to identify boundaries and objects of an interior region of a structure, according to one embodiment.
  • FIG. 19A illustrates the use of an alternative unmanned ground vehicle (UGV) conducting an inspection of ducting, according to one embodiment.
  • FIG. 19B illustrates the inspection and sampling of ducting using a UGV, according to one embodiment.
  • FIG. 20A illustrates a UAV using thermal imaging to identify defects around a window, according to one embodiment.
  • FIG. 20B illustrates a UAV scanning within a wall to detect a poorly insulated portion of the wall near a corner, according to one embodiment.
  • FIG. 20C illustrates a UAV analyzing a subsurface of a wall to detect moisture or a water-damaged portion, according to one embodiment.
  • FIG. 20D illustrates an infrared scan near an outlet to identify an anomalously high temperature, according to one embodiment.
  • FIG. 20E illustrates an infrared scan near an outlet to identify wire paths within a wall, according to one embodiment.
  • FIG. 20F illustrates an internal scan of a portion of a wall to identify insect damage and/or insects within the wall, according to one embodiment.
  • FIG. 20G illustrates a UAV swarm of micro-UAVs for scanning a wall, according to one embodiment.
  • FIG. 20H illustrates a UAV swarm of micro-UAVs for scanning a wall in a scanning formation, according to one embodiment.
  • FIG. 21 illustrates a measurement of width, length, rise, etc. of stairs for a report and for navigation, according to various embodiments.
  • FIG. 22A illustrates a UGV conducting a structural analysis using ground penetrating radar (GPR), according to one embodiment.
  • FIG. 22B illustrates flaws and/or damage detected using the GPR, according to one embodiment.
  • FIG. 23 illustrates examples of defects, irregularities, and/or areas flagged for additional and/or manual inspection, according to various embodiments.
  • This disclosure provides methods and systems for assessing structures and/or other personal property using an unmanned aerial vehicle (UAV).
  • similar analysis, systems, and methods may be employed by, incorporated within, or adapted for other autonomous vehicles, such as unmanned ground vehicles (UGVs) and other autonomous terrestrial vehicles.
  • a single UAV, multiple UAVs, or swarms of UAVs may carry one or more imaging systems to capture a sequence of images of a target object, such as a structure.
  • a swarm of UAVs may carry a single imaging system.
  • the UAV may initially position itself above the location of interest to allow the imaging system to capture a nadir image of an area of interest that includes a target structure.
  • the UAV may subsequently follow a boustrophedonic flight path while the imaging system captures a series of closer images and/or collects non-image scan information.
  • the UAV may subsequently position itself at various altitudes and angles relative to the structure or point of interest to collect oblique images at one or more heights on each critical side of the structure and/or the vertex of the structure.
  • the UAV may perform a loop scan while the imaging system captures a set of oblique images.
  • the UAV and imaging system may perform a series of micro scans, sometimes referred to as detailed micro scans or microscans.
  • a rendering system may generate interactive models of the target structure or another object.
  • UAV hardware, firmware, and/or software may be modified, upgraded, and/or programmed to perform the functions, methods, and behaviors described herein.
  • software, hardware, and/or firmware may be created to interface with pre-existing UAV interfaces.
  • modifications to one or more portions of a UAV may be made to accomplish the described systems and methods.
  • Hardware, firmware, and/or software may also be used in conjunction with a UAV to extend or replace its capabilities to implement any of the embodiments described herein.
  • a technician may manually operate a UAV to perform one or more assessment tasks. For example, a technician may manually operate a drone to capture photographs that would have required the technician to scale a building or fit into cramped spaces. However, this approach may still require a technician to manually operate the UAV and fails to solve the uniformity problem.
  • Some UAVs have semi-autonomous capabilities. These UAVs may be directed to capture photographs of an operator-identified location. However, semi-autonomous UAVs may not capture a comprehensive image collection of the entire site and may not provide adequate information to replace an on-site technician.
  • the use of a UAV or UGV system may augment and/or modify the manner in which human agents and technicians are employed, not necessarily replace them.
  • the UAV assessment and reporting system described herein obviates the need for an industry-specific trained technician to be physically present, or at least greatly reduces the technician's workload.
  • the approaches also allow highly skilled claims adjusters to perform adjusting tasks from a desk (e.g., at a remote location) while reviewing the data captured by the UAV assessment and reporting system.
  • the UAV assessment and reporting system may comprise a site selection interface to receive an electronic input identifying a location of a structure, a boundary identification interface to receive electronic input identifying geographic boundaries of an area that includes the structure, and a UAV to receive the geographic boundaries and the location of the structure from the site selection interface and conduct a structural assessment.
  • the UAV assessment and reporting system may also include a hazard selection interface to receive electronic input identifying geographic hazards such as aboveground power lines, tall trees, neighboring structures, etc.
  • the UAV assessment and reporting system may be preloaded with geographic hazard models.
  • the UAV assessment and reporting system may allow for these hazards to be eliminated from the flight plan to produce a safe path for automated imagery and data capture.
  • Onboard sensors for obstacle avoidance may additionally or alternatively be used for the detection of hazardous obstacles, especially in situations in which incomplete geographic information is available and periodic changes are expected.
  • the UAV may include a camera to capture images of the structure, sonar sensors, LIDAR sensors, infrared sensors, optical sensors, radar sensors and the like.
  • the UAV may include an onboard processor and/or a communication interface to communicate with the controller and/or a cloud-based processing system.
  • the UAV may include a non- transitory computer-readable medium for receiving and storing instructions that, when executed by the processor, cause the UAV to conduct a structural assessment.
  • the structural assessment may include a boustrophedonic scan of the area defined by the geographic boundaries that includes the structure, or of an area derived from scanned floor plan data.
  • the boustrophedonic scan may include capturing images during a boustrophedonic flight pattern within a first altitude range.
  • the boustrophedonic scan may also or alternatively include determining distances to a surface for each of a plurality of potential vertical approaches within the area defined by the geographic boundaries.
  • the UAV assessment and reporting system may include identifying a structure on the site based on the identified geographic boundaries and/or the location of the structure.
  • the UAV assessment and reporting system may additionally or alternatively include a loop scan of the structure in a second flight pattern.
  • the UAV assessment and reporting system may additionally or alternatively include a micro scan of the structure in a third flight pattern that includes vertical approaches proximate the structure to capture detailed images of the structure.
  • a site may be identified and the UAV may fly to the site and capture a collection of high-resolution images following a comprehensive and methodical autonomous flight pattern.
  • an unskilled operator may take the UAV to the site, and capture a collection of high-resolution images with little to no training.
  • the UAV system may automatically conduct the assessment via an autonomous flight pattern. Based on the assessment or report selected, a UAV assessment and reporting system may determine the appropriate flight pattern, types of images to be captured, a number of images to be captured, detail level to be captured, attributes to be identified, measurements to be made, and other assessment elements to be determined.
  • the UAV assessment and reporting system may use a satellite and/or aerial image to initially identify a site to analyze.
  • a site selection interface on the operator client may present a satellite image.
  • the site selection interface may receive, from the operator, an electronic input identifying a location of a structure.
  • the operator client may be a controller, computer, phone, tablet, or another electronic device.
  • the operator may mark, via an electronic input on a boundary identification interface, one or more geographic boundaries associated with the structure and/or site.
  • the operator may also identify, on the operator client, obstacles, boundaries, structures, and particular points of interest.
  • an operator who is attempting to scan a residential lot may be presented with a satellite image on his phone.
  • the operator may select each corner of the lot to identify the boundaries of the lot.
  • the operator may then drag his finger along the border of a house on the lot to mark the perimeter of the house.
  • the operator may press and hold to identify their location and enter an estimated height.
  • the operator may also circle certain areas on the satellite image to identify particular points of interest. For instance, if the operator is collecting images for an insurance claim on a house that has had its fence blown over by a recent microburst, the operator may circle the fence for a closer inspection and data capture.
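The operator-marked boundaries above amount to a polygon of tapped corners. A minimal sketch (not the patented implementation) of how a flight planner might check that a candidate waypoint stays inside the marked lot is a ray-casting point-in-polygon test; the coordinates and lot shape here are hypothetical local metres, not latitude/longitude.

```python
# Hypothetical sketch: does a candidate waypoint lie inside the
# operator-marked lot boundary (corners tapped on the satellite image)?
# Ray casting: count edge crossings of a horizontal ray from the point.

def point_in_polygon(point, polygon):
    """Return True if the (x, y) point falls inside the polygon vertex list."""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Does this edge straddle the horizontal ray through the point?
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

lot = [(0, 0), (40, 0), (40, 30), (0, 30)]   # corners tapped by the operator
print(point_in_polygon((20, 15), lot))       # waypoint inside the lot
print(point_in_polygon((50, 15), lot))       # waypoint outside the lot
```

An odd number of crossings means the point is inside; this is the standard test regardless of how the operator traces the boundary.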
  • the UAV assessment and reporting system may automatically identify obstacles, boundaries, structures, and particular points of interest using prefilled data objects and/or data layers, satellite images, county records, topographical maps, and/or customer statements.
  • the UAV assessment and reporting system may receive an address of a commercial property to be assessed for damage caused by a tornado.
  • the UAV assessment and reporting system may use available county records to determine the boundary of the property, and topographical maps of the area to identify objects and structures.
  • a customer submits a claim stating that the entry of a warehouse on the site has collapsed
  • the UAV assessment and reporting system may receive and parse the submitted claim to identify the entrance as a particular point of interest.
  • a technician or other user may electronically identify the entrance on a map or satellite image.
  • the UAV may receive the location of the structure and the identified geographic boundaries.
  • the UAV may first take a nadir image (i.e., top-down) of the entire site.
  • the UAV assessment and reporting system may use the nadir image to align the UAV with landmarks established in the initial identification of the site and structure.
  • the UAV assessment and reporting system may also use the nadir image to generate a flight pattern or adjust a predefined flight pattern to ensure accuracy and uniformity.
  • the flight pattern may include three flight stages: (1) a boustrophedonic scan, (2) a loop scan, and (3) a micro scan.
  • a structural assessment may require only one or two of the three types of scans.
  • one or more stages may be omitted. For instance, in some situations an autonomous or semi-autonomous micro scan may be sufficient.
  • the UAV may perform a boustrophedonic scan.
  • the UAV may follow a flight pattern where the UAV travels from edge to edge of the site in alternating offset zones.
  • the camera on the UAV may capture images of the site as the UAV travels in its boustrophedon pattern.
  • the UAV assessment and reporting system may merge the images to form a detailed aerial view of the site.
  • the level of detail in the detailed aerial view may be improved by lowering the altitude of the UAV and using minimal offsets.
  • the altitude used for a boustrophedonic scan may be limited due to the height of structures and obstacles on the site.
  • the boustrophedonic scan alone may be used to develop a top-down or aerial view of the site, structure, property, etc.
  • the images and scan information obtained during the boustrophedonic scan may be combined with other available data or used to refine other available data.
  • the scan information may, as previously described, include information from optical imaging systems, ultrasonic systems, radar, LIDAR, infrared imaging, moisture sensors, and/or other sensor systems.
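The boustrophedonic pattern described above can be sketched as a waypoint generator: the UAV sweeps edge to edge in alternating directions, with each pass offset by the image footprint width minus a desired overlap. This is an illustrative sketch under assumed inputs (rectangular site, fixed footprint), not the patented flight planner.

```python
# Minimal sketch of a boustrophedonic ("as the ox plows") waypoint list
# over a rectangular site. Lateral spacing between passes is the image
# footprint width reduced by the requested overlap fraction.

def boustrophedon_waypoints(width_m, height_m, footprint_m, overlap=0.2):
    """Return (x, y) waypoints covering a width_m x height_m site."""
    offset = footprint_m * (1.0 - overlap)   # spacing between passes
    waypoints = []
    y = 0.0
    left_to_right = True
    while y <= height_m:
        xs = (0.0, width_m) if left_to_right else (width_m, 0.0)
        waypoints.append((xs[0], y))   # start of this pass
        waypoints.append((xs[1], y))   # end of this pass
        left_to_right = not left_to_right
        y += offset
    return waypoints

# 100 m x 60 m site, 20 m image footprint, 20% overlap -> 16 m pass spacing
path = boustrophedon_waypoints(100, 60, footprint_m=20, overlap=0.2)
```

Lowering the altitude shrinks the footprint, which (as the text notes) raises detail at the cost of more, more closely spaced passes.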
  • the UAV may perform a loop scan to analyze the angles of a structure.
  • the loop scan may include a flight pattern that positions the UAV at the perimeter of the structure and/or the site.
  • the loop scan may include the UAV traveling around the perimeter.
  • the UAV may lower its altitude while the camera captures images of the structure at one or more angles.
  • the angles may be oblique or perpendicular to the walls of the structure.
  • the UAV assessment and reporting system may use these images to create a three-dimensional model of the structure.
  • the UAV may make multiple passes around the perimeter of the structure at different altitudes.
  • the UAV may fly around the perimeter at a first altitude to capture images of the structure at a first angle and then fly around the perimeter at a second altitude to capture additional images of the structure at a second angle.
  • the number of passes around the perimeter and the lowering of UAV altitude after each pass may vary based on a desired assessment or report.
  • Each additional pass may provide more accurate structural images for a three-dimensional model, construction assessment, solar panel installation assessment, and/or damage assessment.
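The multi-altitude loop scan above can be sketched as perimeter waypoints repeated at each pass altitude. The circular standoff path and the specific numbers here are assumptions for illustration; the patent describes following the structure's perimeter generally.

```python
import math

# Hedged sketch: (x, y, z) waypoints circling a structure once per pass
# altitude, so oblique images are captured at more than one angle.
# Assumed inputs: structure centre, standoff radius, altitudes highest first.

def loop_scan_waypoints(center, radius_m, altitudes_m, points_per_loop=8):
    """Return (x, y, z) waypoints, one full loop per altitude."""
    cx, cy = center
    waypoints = []
    for alt in altitudes_m:
        for i in range(points_per_loop):
            theta = 2 * math.pi * i / points_per_loop
            waypoints.append((cx + radius_m * math.cos(theta),
                              cy + radius_m * math.sin(theta),
                              alt))
    return waypoints

# Three passes at 30 m, 20 m, and 10 m around a 15 m standoff radius
passes = loop_scan_waypoints((0, 0), radius_m=15, altitudes_m=[30, 20, 10])
```

Each successive (lower) loop views the walls at a steeper oblique angle, which is what feeds the three-dimensional model described above.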
  • the UAV may perform a micro scan for close up photos of a structure or other areas of interest.
  • the micro scan over the surface of the structure may provide detailed images for assessing the structure and/or other personal property.
  • the granularity from the micro scan may assist in detailed measurements, damage identification, and material identification.
  • the micro scan may allow an insurance adjuster to zoom in on a three dimensional model of the structure to view and assess a small patch of roof that has been damaged, identify a stucco color or a material of a structure, etc.
  • the flight pattern may include a boustrophedonic scan. Information gained during the boustrophedonic scan may be used to perform a loop scan, and information gained during the loop scan may be used to perform a more accurate boustrophedonic scan. That process may be repeated as many times as is desired or necessary to obtain sufficient information about a property or structure to perform a suitably detailed or accurate micro scan.
  • the UAV may perform a series of vertical approaches near the structure.
  • the UAV may utilize a base altitude that is higher than at least a portion of the structure or other personal property of interest.
  • the UAV may begin in a starting position at the base altitude and lower its altitude until it is at a target distance from the structure.
  • the camera on the UAV may capture an image when the target distance is reached.
  • the camera may take a set of images as the UAV lowers in altitude. After the image at the target distance is captured, the UAV may return to the base altitude and travel a target lateral distance and once again lower its altitude until it is at a target distance from the structure.
  • the target lateral distance may be determined based on the area of the structure captured by each image. In some embodiments, the images may slightly overlap to ensure coverage of the entire structure.
  • the UAV may continue to perform vertical approaches separated by the target lateral distance until the entire structure has been covered or a specified portion of the structure has been assessed.
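The vertical-approach procedure above reduces to two small calculations: the lateral station spacing (image footprint minus overlap) and the altitude at which the camera fires during each descent (surface height plus target standoff). The sketch below assumes a surface-height value from the earlier scan stages; the specific numbers are illustrative only.

```python
# Illustrative sketch of the vertical-approach micro scan geometry.

def micro_scan_stations(span_m, image_footprint_m, overlap=0.1):
    """Lateral station positions; images overlap slightly for coverage."""
    step = image_footprint_m * (1.0 - overlap)
    stations, x = [], 0.0
    while x <= span_m:
        stations.append(round(x, 3))
        x += step
    return stations

def capture_altitude(surface_height_m, target_distance_m):
    """Altitude at which the camera fires during the descent."""
    return surface_height_m + target_distance_m

# 12 m span, 3 m footprint, 10% overlap -> stations every 2.7 m
stations = micro_scan_stations(span_m=12, image_footprint_m=3, overlap=0.1)
# 6 m tall surface, 1.5 m target standoff -> capture at 7.5 m altitude
alt = capture_altitude(surface_height_m=6.0, target_distance_m=1.5)
```

Between stations the UAV returns to the base altitude, which is why the base altitude must exceed the tallest portion of the structure along the path.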
  • the UAV may traverse the surface of a structure or other personal property at a target lateral distance and the camera may capture images as the UAV travels in a boustrophedonic or circular pattern. To avoid a collision, the UAV may use the angled images from the loop scan to determine any slope or obstacle on the surface.
  • the UAV may include proximity sensors.
  • the proximity sensors may be used to avoid obstacles on and surrounding the structure and thereby identify safe flight areas above and proximate the structure and surrounding objects.
  • the safe flight areas are locations where the UAV may fly very close to the structure and capture images.
  • the proximity sensors may also be used to determine how close the UAV is to the structure.
  • a UAV may be programmed to capture images at a distance of five feet from the structure.
  • the proximity sensors may send a signal indicating to the UAV that it has reached the target distance, five feet, and the camera may take a photograph in response to the signal.
  • the target distance may be adjusted based on desired detail, weather conditions, surface obstacles, camera resolution, camera field of view, and/or other sensor qualities or attributes.
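The proximity-triggered capture described above (descend until the sensor reports the five-foot target, then fire the camera) can be sketched with a simulated stream of sensor readings; the tolerance value is an assumption, not from the source.

```python
# Sketch: fire the camera at the first proximity reading within
# tolerance of the target standoff distance.

TARGET_FT = 5.0
TOLERANCE_FT = 0.25          # assumed sensor tolerance, not from the source

def capture_on_proximity(readings_ft, target=TARGET_FT, tol=TOLERANCE_FT):
    """Return the index of the first reading within tolerance of target."""
    for i, distance in enumerate(readings_ft):
        if abs(distance - target) <= tol:
            return i         # signal the camera to photograph here
    return None              # target distance never reached

# Simulated downward-sensor readings during a descent (feet)
readings = [20.0, 14.2, 9.8, 6.1, 5.1, 4.9]
shot_index = capture_on_proximity(readings)
```

In a real system the same comparison would run in the control loop against live sensor data rather than a recorded list.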
  • infrared and other non-optical sensors may be used to provide additional assessment data. For example, materials may be identified based on a spectral analysis and/or damage may be identified based on infrared leaks in a structure.
  • a target distance may be adjusted based on, among other things, a camera field of view.
  • a target distance may be selected for a given camera field of view to attain captured images corresponding to specific dimensions on a surface of a roof.
  • a camera may have fixed field of view or an adjustable (zoomable) field of view.
  • the square footage and dimensions of the portion of the roof captured in an image depend on the field of view of the camera and the distance of the camera from the roof.
  • a target distance may, therefore, be determined for a given field of view to ensure that each image capture corresponds to a portion of the roof having specific dimensions.
  • a target distance and field of view combination can be determined to capture images of sample regions having 100 square feet.
  • a target distance may be fixed and the field of view of the camera adjusted to ensure image captures correspond to a 100 square-foot area.
  • a distance may be adjusted based on a fixed field of view of the camera to ensure that a captured image corresponds to a 100 square-foot area.
  • the distance of the UAV from the surface of the roof and the field of view of the camera may be adjusted to ensure image captures correspond to a 100 square-foot area.
  • a specific pixel density may be within a predetermined range of pixel densities.
  • an image may be captured with a specific pixel density that corresponds to specific dimensions on the surface of the roof.
  • a target distance may be selected based on desired detail, weather conditions, surface obstacles, camera resolution, camera field of view, and/or other sensor qualities. Thus, for a given field of view, a target distance may be selected to ensure that a captured image corresponds to specific dimensions on the surface of the roof.
  • a fixed field of view camera may be used and one or more predetermined distances between the surface of the roof and the UAV may be defined to ensure that captured images correspond to one or more target dimensions on the surface of the roof.
  • the field of view may be square and a target combination of distance and field of view may be used to ensure that captured images correspond to a 10-foot by 10-foot test square.
  • the 10-foot by 10-foot test square may be captured using a camera with a rectangular (e.g., 16:9 or 4:3) aspect ratio, in which case the image may be cropped during capture and/or as part of post-processing. Even if post-processing cropping is used, knowing the exact dimensions of the original, potentially rectangular, image ensures that cropped images correspond almost exactly (or even exactly) to the target dimensions. Such a UAV system may be said to provide a "virtual test square" whether by original image capture at specific dimensions or by cropping of an image to have specific dimensions.
  • a camera having a 16:9 or 4:3 aspect ratio may be used to capture an image that corresponds to 100 square feet without cropping.
  • a camera with a 16:9 aspect ratio may be positioned at a target distance for a given field of view to capture an image that corresponds to a portion of the roof that is 13.3333 feet wide and 7.5 feet tall.
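The distance and field-of-view trade-off above can be made concrete with basic trigonometry: for a frame of a given aspect ratio and area, the frame width follows from the area, and the standoff distance follows from half the width over the tangent of half the horizontal field of view (assuming the camera is held perpendicular to the roof). The 60-degree field of view below is an assumed example value.

```python
import math

# Worked sketch: standoff distance at which one 16:9 frame covers a
# 100-square-foot patch of roof (13.3333 ft wide by 7.5 ft tall).

def footprint_dimensions(area_sqft, aspect_w, aspect_h):
    """Width and height of a frame with the given area and aspect ratio."""
    width = math.sqrt(area_sqft * aspect_w / aspect_h)
    return width, width * aspect_h / aspect_w

def target_distance(frame_width_ft, hfov_deg):
    """Distance at which the horizontal field of view spans frame_width."""
    return (frame_width_ft / 2.0) / math.tan(math.radians(hfov_deg) / 2.0)

w, h = footprint_dimensions(100.0, 16, 9)   # ~13.3333 ft x 7.5 ft
d = target_distance(w, hfov_deg=60.0)       # ~11.55 ft standoff (assumed FOV)
```

Holding either quantity fixed determines the other: a fixed field of view dictates the distance, and a fixed distance dictates the (zoomable) field of view, exactly as the alternatives above describe.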
  • a sample region may have any desired or target square footage, pixel density, length, width, and/or another size or image quality
  • As described below, to ensure captured images correspond to rectangular and/or square "test areas" or "sample regions," perpendicular image capture may be performed based on a calculated or known pitch of the surface of the roof. Test squares, sample regions, patch scans, test rectangles, sample patches, and various other terms may be used synonymously in the industry and may be used interchangeably herein to the extent context does not dictate otherwise.
  • the UAV may use additional and/or alternative methods to detect proximity to obstacles and the structure.
  • the UAV may use topographical data.
  • the UAV may have a sonar system that it uses to detect proximity.
  • the UAV may determine the proximity to the structure based on the angled images from the loop scan. For instance, the UAV assessment and reporting system may calculate the height of walls based on the angled images and determine an altitude that is a target distance above the height of the walls to descend for each image capture.
  • the location of the micro scan may be determined in a variety of ways.
  • the micro scan may include an assessment of the entire structure as identified by the operator.
  • the micro scan may include an assessment of only a portion of interest identified by the operator. For example, for a solar panel installation or construction assessment on or near a structure, a micro scan and/or loop scan may be needed for only a portion of the structure.
  • the UAV assessment and reporting system may intelligently identify portions of interest during one or both of the first two scanning stages and only micro scan those areas.
  • the UAV assessment and reporting system may perform multiple micro scans with different levels of resolution and/or perspective. For example, a first micro scan may provide detailed images at 10 or 20 feet above a roof. Then a second micro scan may image a portion of the roof at five feet for additional detail of that section. This may allow a faster capture of the roof overall while providing a more detailed image set of a portion of interest. In one embodiment, the UAV assessment and reporting system may use the first micro scan to determine the portion to be imaged in the second micro scan.
  • the UAV assessment and reporting system may use each scan stage to improve the next scan stage.
  • the first scan stage may identify the location of objects. Sonar or optical sensors may be used in the first scan stage to identify the height of the objects and/or physical damage. The location and height of the objects identified in the first scan stage may determine where the loop scan occurs and the altitude at which the angled photographs are taken.
  • the first and second stages may identify particular points of interest.
  • the third stage may use the particular points of interest to determine the location of the micro scans. For example, during a loop scan, the autonomous flying system may identify wind damage on the east surface of a structure. The micro scan may then focus on the east surface of the structure. The identification of particular points of interest may be done using UAV onboard image processing, server image processing, or client image processing.
  • the UAV assessment and reporting system may automatically calculate a pitch of a roof.
  • the UAV assessment and reporting system may use the UAV's sonar or object detection sensors to calculate the pitch of the roof. For example, the UAV may begin at an edge of the roof and then travel toward the peak. The pitch may then be calculated based on the perceived Doppler effect as the roof becomes increasingly close to the UAV while the UAV travels at a constant vertical height.
  • the UAV may land on the roof and use a positioning sensor, such as a gyroscope, to determine the UAV's orientation.
  • the UAV assessment and reporting system may use the orientation of the UAV to determine the slope.
  • a UAV may hover above the roof but below a peak of the roof.
  • Sensors may determine a vertical distance to the roof below and a horizontal distance to the roof, such that the roof represents the hypotenuse of a right triangle with the UAV positioned at the 90-degree corner of the right triangle.
  • a pitch of the roof may be determined based on the rise (vertical distance down to the roof) divided by the run (horizontal forward distance to the roof).
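The rise-over-run geometry above can be expressed in a few lines (a hedged sketch; function and variable names are illustrative, not from the disclosure):

```python
import math

def roof_pitch_degrees(rise_ft: float, run_ft: float) -> float:
    """Pitch angle of the roof plane, treating the roof as the hypotenuse
    of a right triangle with the UAV positioned at the 90-degree corner.
    rise_ft: vertical distance from the UAV down to the roof.
    run_ft:  horizontal distance from the UAV forward to the roof."""
    return math.degrees(math.atan2(rise_ft, run_ft))

# The roof drops 6 feet over 12 feet of run between the two measured points:
print(round(roof_pitch_degrees(6.0, 12.0), 2))  # 26.57
```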
  • a UAV may hover above the roof at a first location and measure a vertical distance from the UAV to the roof (e.g., downward).
  • a downward sensor may be used.
  • the UAV may then move horizontally to a second location above the roof and measure the vertical distance from the UAV to the roof.
  • the roof becomes the hypotenuse of a right triangle, with one side of the triangle corresponding to the horizontal difference between the first location and the second location, and the second side of the triangle corresponding to the vertical difference between the distance from the UAV to the roof in the first location and the distance from the UAV to the roof in the second location.
  • a UAV may hover above the roof at a first location and measure a horizontal distance from the UAV to the roof.
  • a forward, lateral, and/or reverse sensor may be used. The UAV may then move vertically to a second location above the roof and measure the horizontal distance from the UAV to the roof. Again, the roof becomes the hypotenuse of a right triangle, with one side of the triangle corresponding to the vertical difference between the first location and the second location, and the second side of the triangle corresponding to the difference between the horizontal distance from the UAV to the roof in the first location and the horizontal distance from the UAV to the roof in the second location.
  • the UAV assessment and reporting system may use three or more images and metadata associated with those images to calculate the pitch of the roof. For example, the UAV may capture a first image near the roof. The UAV may then increase its altitude and capture a second image above the first image. The UAV may then fly laterally towards the peak of the roof until the proximity of the UAV to the roof is the same as the proximity of the first image. The UAV may then capture a third image. Each image may have metadata associated with it including GPS coordinates, altitude, and proximity to the house. The UAV assessment and reporting system may calculate the distance of the roof traveled based on the GPS coordinates and altitude associated with the three images using the Pythagorean theorem.
  • the UAV assessment and reporting system may then calculate the pitch by taking the ratio of the altitude and the distance of the roof traveled.
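Under the stated assumptions (horizontal travel derived from GPS coordinates, altitude gain from image metadata), the three-image calculation might be sketched as:

```python
import math

def pitch_from_image_metadata(horiz_dist_ft: float, alt_gain_ft: float) -> float:
    """Sketch of the three-image method: the UAV rises by alt_gain_ft between
    the first and third images while moving horiz_dist_ft toward the peak.
    The roof distance traveled is the hypotenuse (Pythagorean theorem), and
    the pitch follows from the ratio of altitude gain to roof distance."""
    roof_dist = math.hypot(horiz_dist_ft, alt_gain_ft)
    return math.degrees(math.asin(alt_gain_ft / roof_dist))

print(round(pitch_from_image_metadata(12.0, 6.0), 2))  # 26.57
```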
  • a calculated pitch of the roof may be matched with the closest standardized pitch.
  • the calculated pitch may be slightly inaccurate due to measurement inaccuracies and/or other limiting characteristics of the environment, construction, and/or sensor capabilities.
  • a calculated pitch of a roof may indicate that the pitch of the roof is 37.8 degrees.
  • the closest standardized pitch may be determined to be a 10:12 pitch with a slope of 39.81 degrees.
  • a calculated pitch may be 46.45 degrees and the closest standardized pitch may be determined to be a 12:12 pitch with an angle of 45.0 degrees.
  • the standardized pitch may be used instead of the calculated pitch.
  • the calculated pitch may be used because it is sufficiently accurate and/or more accurate than the closest standardized pitch.
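A minimal sketch of snapping a calculated pitch to the nearest standardized N:12 pitch, assuming a standard set from 1:12 through 12:12 (real pitch tables vary by region and roofing standard):

```python
import math

# Assumed standardized rise:12 pitches and their slope angles in degrees.
STANDARD_PITCHES = [(rise, math.degrees(math.atan2(rise, 12))) for rise in range(1, 13)]

def nearest_standard_pitch(calculated_deg: float):
    """Return the (rise, angle) of the standardized N:12 pitch whose slope
    angle is closest to the calculated pitch."""
    return min(STANDARD_PITCHES, key=lambda p: abs(p[1] - calculated_deg))

rise, angle = nearest_standard_pitch(46.45)
print(f"{rise}:12 pitch, {angle:.2f} degrees")  # 12:12 pitch, 45.00 degrees
```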
  • a UAV may have to tilt the body and/or one or more propellers to compensate for wind or other environmental factors.
  • the images, measurements, and/or other captured data may be annotated to identify the tilt or angle caused by the UAV tilt.
  • the sensors, cameras, and other data capture tools may be mechanically or digitally adjusted, such as gyroscopically for example.
  • measurements, such as distances when calculating skew and/or roof pitch may be adjusted during calculations based on identified UAV tilt due to environmental factors.
  • the UAV may use the calculated pitch to adjust the angle of the camera to reduce image skew during a micro scan and/or loop scan. For example, once the pitch is calculated the UAV may perform a micro scan with the camera at a perpendicular angle to the roof and/or de-skew the image using software on the UAV, during post-imaging processing, and/or through cloud-based processing. In various embodiments, the calculated pitch is used to angle the camera so it is perpendicular to the roof to eliminate or substantially reduce skew.
  • a pitch determination system may determine a pitch of the roof based on at least two distance measurements, as described above, that allow for a calculation of the pitch.
  • An imaging system of the UAV may capture an image of the roof of the structure with the optical axis of the camera aligned perpendicular to a plane of the roof of the structure by adjusting a location of the UAV relative to a planar surface of the roof and/or a tilt angle of the camera of the UAV.
  • the UAV may capture sample regions having a defined size (e.g., 10-foot x 10-foot, or 100 square feet, or 3-meter x 3-meter, etc.).
  • This maintains compatibility with industry-standard "test squares" (e.g., sample regions, patches, etc.).
  • Traditional "test square" analysis requires a human adjuster to inspect a roof and draw a 10-foot x 10-foot region using chalk, largely to reduce the workload of the adjuster or evaluator. The thought is that the test square is large enough to be representative of the rest of the roof (or at least a face of the roof), so there is no need to do a complete analysis.
  • the presently described systems and methods allow for "virtual test squares" to be created by capturing images orthogonal to the surface of the roof at target distances and fields of view. That is, an orthogonal image can be captured to obtain a virtual test square (or rectangle) having defined dimensions and/or a defined area by positioning the UAV at a specific distance for a given field of view, by adjusting the field of view for a given distance, and/or a combination thereof. The number and/or severity of each of the damage points or other defect points may be identified within the virtual test square.
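Assuming simple pinhole-camera geometry (lens distortion and aspect ratio ignored), the standoff distance that frames a virtual test square of a given size for a given field of view can be sketched as:

```python
import math

def standoff_distance(square_side_ft: float, fov_deg: float) -> float:
    """Distance at which a camera with the given field of view frames a
    virtual test square of the given side length when aimed orthogonal to
    the roof surface."""
    return (square_side_ft / 2.0) / math.tan(math.radians(fov_deg) / 2.0)

# Framing a 10-foot test square with an assumed 60-degree field of view:
print(round(standoff_distance(10.0, 60.0), 2))  # 8.66
```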
  • the UAV assessment and reporting system may also reduce and/or identify shadows in the images by calculating the current angle of the sun.
  • the UAV assessment and reporting system may calculate the angle of the sun based on the time of day, the day of the year, and the GPS location. To prevent the UAV's shadow from appearing in captured images, the UAV assessment and reporting system may apply the angle of the sun to the current UAV position in flight.
  • the UAV position, the angle/position of the sun, and the relative location of surfaces and structures (e.g., roof) may determine precisely where the shadow of the UAV will appear.
  • the UAV may adjust its position and camera based on the calculated location of its shadow on the roof to ensure that each photograph is captured in such a way as to completely eliminate the UAV's shadow.
  • the UAV assessment and reporting system may also use the angle of the sun to determine the best time of day to photograph a site or portion of a site. For example, the shadow of an object on a site may obscure a structure during the morning. Based on the angle of the sun, the UAV assessment and reporting system may determine what time of day the shadow would no longer obscure the structure.
  • the UAV may autonomously collect images during different times of day to ensure that shadow-free images of all, most, or specific portions of the structure are captured during boustrophedonic, loop, and/or micro scans.
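A rough illustration of the sun-angle calculation from latitude, day of year, and time of day, plus the resulting horizontal offset of the UAV's shadow (a simplified declination/hour-angle model, not a production solar-position algorithm; names and the model itself are assumptions):

```python
import math

def solar_elevation_deg(lat_deg: float, day_of_year: int, solar_hour: float) -> float:
    """Approximate solar elevation using a simple declination and
    hour-angle model."""
    decl = -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10)))
    hour_angle = 15.0 * (solar_hour - 12.0)
    lat, decl_r, h = map(math.radians, (lat_deg, decl, hour_angle))
    sin_el = math.sin(lat) * math.sin(decl_r) + math.cos(lat) * math.cos(decl_r) * math.cos(h)
    return math.degrees(math.asin(sin_el))

def shadow_offset_ft(uav_height_ft: float, elevation_deg: float) -> float:
    """Horizontal distance from the UAV to its shadow on a flat surface below."""
    return uav_height_ft / math.tan(math.radians(elevation_deg))

el = solar_elevation_deg(40.0, 172, 12.0)   # near June 21, solar noon, 40 N
print(round(el, 1))                          # 73.4
print(shadow_offset_ft(20.0, el))            # shadow roughly 6 ft to one side
```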
  • the systems and methods described herein are repeatable on a consistent basis for various properties and structures and are therefore aptly characterized as systematic.
  • a UAV assessment system for imaging a structure may utilize a site selection user interface to receive an electronic input from a user identifying a geographic location of a structure, as previously described.
  • the selection may, for example, be based on one or more of a user input of a street address, a coordinate, and/or a satellite image selection.
  • the UAV may utilize one or more cameras to image the structure (multiple cameras may be used to capture three-dimensional images if desired).
  • a shadow determination system (onboard or cloud-based) may calculate a location of a shadow of the UAV on the structure based on the relative position of the UAV and the sun.
  • a shadow avoidance system may adjust a location of the UAV as it captures images of the structure to ensure that the shadow of the UAV is not in any of the images.
  • the UAV may include a proximate object determination system to identify at least one object proximate the structure, such as a tree, telephone pole, telephone wires, other structures, etc., that are proximate the structure to be imaged.
  • a shadow determination system (local or remote) may calculate (as opposed to directly observe) a location of a shadow cast by the proximate object onto the structure based on a current location of the sun, which can be accurately determined based on a current time and a GPS location of the structure.
  • the imaging system may account for the shadow by (1) annotating images of the structure that include the calculated shadow, (2) adjusting an exposure of images of the structure that include the calculated shadow, and/or (3) identifying a subsequent time to return to the structure to capture non-shadowed images of the portions of the structure that are currently shadowed.
  • the UAV, server, and operator client may be connected via one or more networks.
  • the UAV may transmit images to the server via a cellular network.
  • the UAV may connect to the client via a second network such as a local wireless network.
  • the UAV, server, and operator client may each be directly connected to each other, or one of the elements may act as a gateway and pass information received from a first element to a second element.
  • a standard flight plan may be saved on the server.
  • the standard flight plan may be loaded on the UAV and altered based on information entered by the operator into the operator client interface.
  • a navigational risk zone may be associated with a property, such as a structure, vehicle, land, livestock, equipment, farm, mine, etc.
  • the navigational risk zone may include some or all of the area within which an autonomous vehicle, such as a UAV, may navigate to perform micro scans of the property.
  • a rectangular office building may be associated with a navigational risk zone represented by an envelope surrounding the office building, where the envelope represents a region within which the UAV may need to navigate during a loop or micro scan stage of an analysis.
  • the navigational risk zone may include one or more navigational risk tags associated with specific locations relative to the property. For example, if a tree is identified as having branches overhanging some portions of the navigational risk zone, the portions below the overhanging branches may be tagged with a navigational risk tag indicating that an obstruction is overhead. A navigational risk tag may simply indicate the existence of the overhead obstruction. Alternatively, the navigational risk tag may provide additional detail, such as the distance from the current location to the obstruction, the type of obstruction, or even a flight pattern modification to avoid the obstruction.
  • a navigational risk tag may include a wide variety of warnings, notices, or other relevant information for the location.
  • Examples of a navigational risk tag include, but are not limited to: identification of standing water, ice, or snow that may make sensor readings inaccurate, an obstruction that is more easily seen or detected from some vantage points than others (e.g., a net or wire), a feature or characteristic of the property that may be subsequently misidentified (e.g., a skylight might be mistaken as standing water on a roof and erroneously scanned), a feature or characteristic of the property that may necessitate additional or more careful scanning, high value items that should be avoided by a set distance (e.g., a car in a driveway), and/or other tags.
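One possible shape for such a tag, sketched as a Python dataclass (field names and values are hypothetical, not from the disclosure):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class NavigationalRiskTag:
    """Illustrative record attached to a location in a navigational risk zone."""
    location: tuple                  # (lat, lon, altitude) of the tagged point
    risk_type: str                   # e.g. "overhead_obstruction", "standing_water"
    detail: Optional[str] = None     # distance to or type of obstruction, etc.
    avoidance: Optional[str] = None  # optional flight-pattern modification

tag = NavigationalRiskTag(
    location=(40.7608, -111.8910, 25.0),
    risk_type="overhead_obstruction",
    detail="tree branches roughly 8 ft above tagged point",
    avoidance="cap altitude at 20 ft within 10 ft radius",
)
print(tag.risk_type)  # overhead_obstruction
```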
  • a UAV system may include onboard processing, onboard storage, and communications systems, as well as access to cloud-based processing and storage.
  • the system may utilize one or more of these resources to analyze, image, and/or otherwise scan the property.
  • the system may utilize computer vision in combination with a library of images for identifying properties, characteristics of properties, problems, defects, damage, unexpected issues, and the like.
  • Computer vision intelligence may be adapted based on the use of computer vision in other fields and in its general form for use in UAV structural and property analysis.
  • Computer vision analysis may include various systems and methods for acquiring, processing, analyzing, storing, and understanding captured images.
  • the system may include digital and analog components, many of which may be interchangeable between analog and digital components.
  • Computer vision tasks may be performed in the cloud or through onboard processing and storage.
  • the computer vision system of the UAV may execute the extraction of high-dimensional data from captured images (optical, infrared, and/or ultraviolet) and other sensor data to produce numerical or symbolic information.
  • the computer vision systems may extract high-dimensional data to make decisions based on rule sets.
  • the computer vision systems may utilize images, video sequences, multi-dimensional data, time-stamped data, and/or other types of data captured by any of a wide variety of electromagnetic radiation sensors, ultrasonic sensors, moisture sensors, radioactive decay sensors, and/or the like.
  • Part of the analysis may include profile matching by comparing captured sensor data with data sets from a library of identifiable sensor profiles.
  • An evaluator module or system may be responsible or partially responsible for this analysis. Such an analysis may be performed locally and/or in the cloud. For example, images of different types of shingles (e.g., asphalt, cedar, and clay) may be used to determine which type of shingle is on a structure being analyzed. Upon a determination that the shingles are asphalt, the system may compare captured images of the asphalt shingles on the structure with a library of defects in asphalt shingles to identify such defects.
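Profile matching of the kind described could be sketched as a nearest-neighbor lookup against a library of sensor profiles (the feature vectors below are made-up illustration values; a real library would hold processed image or thermal features):

```python
import math

# Toy library of sensor profiles keyed by material/condition.
PROFILE_LIBRARY = {
    "asphalt_undamaged": [0.82, 0.10, 0.05],
    "asphalt_hail_damage": [0.55, 0.30, 0.15],
    "cedar": [0.40, 0.20, 0.60],
}

def match_profile(captured: list) -> str:
    """Return the library profile nearest (Euclidean distance) to the
    captured sensor data."""
    return min(PROFILE_LIBRARY, key=lambda name: math.dist(captured, PROFILE_LIBRARY[name]))

print(match_profile([0.57, 0.28, 0.14]))  # asphalt_hail_damage
```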
  • a thermal scan of asphalt shingles in a region of a structure may reveal a thermal profile data set that can be compared with a library of thermal profiles.
  • a matched profile may be used to determine that the roof is undamaged, damaged, aging, poorly constructed, etc.
  • a first sensor system may be used and, if a matched profile is found, the system may follow a rule set to take a subsequent action that is different from the action that would have been taken if no matched profile had been found.
  • An evaluator system or module may evaluate various inputs to make a decision and/or determine that human operator input is required.
  • an optical scan may be used to match profiles within the library that indicate that a portion of the structure may have a particular characteristic (e.g., damage, manufacturing material, construction material, construction methods, modification from prior specification, etc.).
  • a rule set may dictate that based on the matched profile within the library that another type of sensor system be used for a subsequent scan and/or if a scan with increased resolution or detail is warranted.
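Such a rule set might be sketched as a simple lookup from matched profile to follow-up scan (the mapping, sensor names, and resolutions are hypothetical):

```python
# Hypothetical rule set: matched profile -> (supplemental sensor, resolution).
FOLLOW_UP_RULES = {
    "asphalt_hail_damage": ("thermal", "high"),  # rescan suspect area in detail
    "possible_moisture": ("moisture", "high"),
    "asphalt_undamaged": (None, None),           # no supplemental scan needed
}

def next_scan(matched_profile: str):
    """Return (sensor, resolution) for the supplemental scan, defaulting to
    a standard optical pass when no profile was matched."""
    return FOLLOW_UP_RULES.get(matched_profile, ("optical", "standard"))

print(next_scan("asphalt_hail_damage"))  # ('thermal', 'high')
```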
  • UAVs and/or UGVs may be used to conduct an internal analysis in addition to, or as an alternative to, the external analysis described above.
  • many of the embodiments described herein are described with respect to only external analysis or only internal analysis. It is appreciated that many of the embodiments and features are applicable to both internal and external analysis. As such, it should be understood that all feasible permutations and combinations of embodiments described herein are contemplated as potentially being implemented in a single embodiment or combination of embodiments.
  • an internal structural analysis may begin with the reception or creation of technical drawings, construction drawings, top-down drawings, floorplans, etc. (referred to herein collectively as a "planview map").
  • the planview map may be used to develop an internal flight plan and to identify the types of UAVs and/or UGVs that should be used as well as the types of sensors that could be utilized.
  • architectural blueprints of a structure may include an embedded visual representation of the structure.
  • Such documents may be embedded with text that is more complicated and/or technical in nature.
  • Various features of the drawings may be used to build a model around which the UAV will base its protocols.
  • the system may characterize regions of the structure as no-fly zones, define strict route guidelines, and/or specify the types of sensors that should be used in each location to provide a desired structural analysis.
  • drafting standards may be used to interpret blueprints and other construction documents. For example, differences in line weights and symbols can be processed by current image processing techniques.
  • the system may utilize the interpreted data from construction documents to identify the relative location of vents, windows, stairs, doorways, electrical wires, and other structural features.
  • the system may determine a suitable navigation path for a UAV or UGV to navigate confined areas. For example, the system may instruct a UAV to execute a predetermined flight maneuver at specific door locations or near stairs to reduce collision risks.
  • the system may annotate (e.g., digitally) an existing planview map to provide navigational and scanning instructions to a UAV and/or UGV to conduct the internal structural analysis.
  • zones may be assigned to the various rooms of the structure. Each zone may be defined by a perimeter that may or may not correspond to physical barriers or features within the structure.
  • Each zone may be defined by specific instructions or information for use by a UAV or UGV.
  • Types of instruction and information include, but are not limited to, instructions regarding the type of sensors to be used, types of scans to be conducted, navigational risk identifiers, navigation instructions, and the like.
  • a first zone may correspond to open rooms expected to be free from obstructions that can be easily navigated.
  • a second zone may be defined for rooms or regions of the structure that have or are expected to have irregularities or non-conforming obstacles.
  • a third zone may be used to define hallways or stairs and provide specific navigation instructions for traversing the zone, specific instructions for the types of scans to be completed within the zone, and possibly which types of warnings or automatic controls of the UAV should be disabled.
  • a UAV may have sensors to prevent it from rising in elevation if a ceiling is detected within three feet.
  • the UAV may also have sensors to prevent it from advancing forward if a wall is within two feet.
  • Specific instructions may be required to disable these auto-collision avoidance systems to navigate the UAV up a stairwell at a diagonal. Absent these instructions, the UAV may determine that it cannot advance forward (because the stairs directly in front of the UAV appear to be a wall) or upward at the bottom of the stairs.
  • a fourth zone may identify regions or portions of the structure that require specific sensor scanning and/or increased navigational caution.
  • Zone assignment and associated navigational, scanning, and other parameters may be assigned automatically, modified in real-time by the UAV, and/or manually assigned by a human operator before micro scanning begins or in real-time as analysis is performed.
  • a UAV may detect a previously unidentified ceiling fan and modify the zoning parameters and/or planview map to include the obstruction.
  • the UAV or a control system thereof may utilize the planview map and/or the assigned zones and associated parameters to determine an efficient or optimal navigation path using algorithmic calculations to achieve a weighted balance of battery consumption, scan time, and exhaustive scanning.
  • regions of a structure may be zoned as "excluded," in which case the UAV may not conduct a micro scan of the region.
  • Other regions may be zoned as prohibited, in which case the UAV may be prohibited from even navigating within the prohibited region.
  • zoning may include parameters specifying the type of UAV (or UGV) that should be utilized for the scanning.
  • a default scanning pattern to ensure thorough micro scans includes an interior boustrophedonic flight plan.
  • the specific location along the flight path may be annotated on the planview map for additional or supplemental scanning.
  • an annotation on the planview map, a zoning parameter, or a real-time UAV decision may instruct the UAV to perform specific navigational or scanning functions.
  • the UAV may be instructed to navigate to a lower altitude to take the first measurement and then navigate to a second altitude to take a second measurement. For accurate comparison of the two measurements, it may be necessary that the UAV either not move laterally or identify the lateral movement with some level of precision.
  • the UAV may utilize visual, infrared, ultrasonic, and/or ultraviolet sensors to measure and/or limit lateral movement.
  • a plurality of sensors may be used to accurately position a UAV.
  • GPS positioning may or may not be utilized or available in interior portions of a structure.
  • relayed GPS information and/or triangulation with exterior sensors may be utilized.
  • a UAV may utilize dead reckoning and/or location identification on a planview map based on object detection (e.g., wall, doorway, etc.). Obstacle avoidance sensors may also be utilized.
  • UAVs may, in some instances, utilize ultrasound, radar, and/or LIDAR for enhanced or alternative position tracking.
  • UAVs and/or UGVs within a structure may lose network connections and/or control signals.
  • the UAVs and/or UGVs may be configured to return to a location of last connectivity and cache necessary information to finish the micro scans of regions lacking sufficient network availability.
  • the system may include a network of UAVs and/or UGVs.
  • the network of UAVs may act as a mesh or relay network in which each UAV may utilize one or more other UAVs as an intermediary network node to maintain connectivity with a host or control device.
  • Such networks may utilize self-healing algorithms to maintain communication between nodes (UAVs).
  • the system utilizes a mesh network topology for one UAV to communicate with another. If communication with one "node” (i.e., UAV) is weak or lost, self-healing algorithms are used to acquire a new (shortest) path bridging connection through other available nodes.
  • UAVs utilized may simply act as relay UAVs for communication signals or as nodes in a mesh network.
  • battery powered network relays may be deployed (i.e., dropped off) by the UAV as it navigates through a structure.
  • a network relay device is dropped or otherwise deployed by the UAV (e.g., controllable electromagnets may be used to deploy and retrieve network relay devices from the UAV or UGV).
  • the one or more deployed network relay devices can act as relays or as nodes in a mesh network to facilitate uninterrupted communication through multiple walls, floors, and other barriers within a structure.
  • a UAV may be deployed alone to perform a scanning operation. If the UAV determines that additional network support is needed, it may automatically request that a network support UAV be dispatched.
  • the network support UAV may have the same features as the originally deployed UAV or may be specifically designed to provide network support.
  • a UAV may determine if a wall is plumb by sampling a distance to a wall from multiple vertical positions. If the wall is generally found to be plumb within an acceptable deviation tolerance, the UAV may move on to a next scanning type. If, however, the wall is not sufficiently plumb within the acceptable deviation tolerance, then the UAV may conduct additional plumb-related tests. For example, it may verify that the initial measurements were accurate through additional distance tests at various verticals and at various locations along the wall.
  • a wall not being plumb may be a red flag for other issues. If the wall is found to not be sufficiently plumb, the UAV may conduct measurements to see if the wall is at a right angle to the floor and/or ceiling. The UAV may also perform additional scanning to identify cracks in paint, caulk, or drywall. As per the example above regarding wall plumb testing, the UAV may conduct a series of initial tests that, if passed, result in the UAV moving on to other tests. However, if the initial tests fail to meet expected criteria or threshold deviations, a subset of additional scanning may be conducted. Thus, according to various embodiments described herein, interior micro scanning is more than simply executing a scanning script: it involves real-time analysis to expedite scanning when specific conditions are met and to conduct enhanced scanning when conditions are not met or are outside of a standard deviation.
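The initial plumb test described above could be sketched as a spread check over distance readings taken at several heights (the tolerance is an assumed value, not from the disclosure):

```python
def is_plumb(wall_distances_ft, tolerance_ft=0.05):
    """Distances to the wall sampled at several vertical positions; a plumb
    wall yields nearly equal readings, so the spread stays within tolerance."""
    return max(wall_distances_ft) - min(wall_distances_ft) <= tolerance_ft

# Readings taken 2, 4, and 6 feet up the wall:
print(is_plumb([3.02, 3.03, 3.01]))  # True  -> move on to the next scan type
print(is_plumb([3.02, 3.10, 3.21]))  # False -> conduct additional plumb tests
```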
  • the UAV system may utilize learning algorithms to conduct more efficient and accurate interior micro scans. For instance, if a wall is found to be not plumb, contain moisture, or have cracks, the UAV may decide that enhanced or more exhaustive scanning of the identified issues is warranted in other locations within the structure. If multiple UAVs are used for interior scanning, then the UAVs may share the information with a central control system and/or directly with one another via peer-to-peer networks.
  • a floor and/or ceiling can be sampled for irregularities. Ceilings bow with stress and may sag with age; taking several measurements throughout a room allows for comparisons to be made.
  • a library of common structural problems, defined in terms of UAV sensor data, can be compared with gathered data to quickly identify issues. Identification of code violations may depend on the year the structure was built and/or which version of a code is applicable to the structure.
  • UAVs may utilize ultrasound to determine the densities of objects as part of the interior micro scan.
  • the density of drywall should be relatively uniform. Density analysis can be used to find unseen problems.
  • ultrasound and/or infrared may be used to identify stud locations behind drywall or other wall coverings (e.g., plaster, paint, wood paneling, etc.). Density analysis may be used to confirm the walls were built to code and/or identify moisture problems (saturated or decaying structural elements may have a different density profile than expected). For example, a density analysis may be used to confirm the existence of hurricane tie-down straps or other support or structural members.
  • While most of the examples described herein refer to UAVs, it is appreciated that many of the same embodiments and descriptions can be applied to UGVs.
  • variants of UAVs or UGVs may be used to conduct a comprehensive micro scan. Certain confined spaces may not be accessible to UAVs.
  • micro-UAVs or micro-UGVs, which may have less-capable sensors, may be used to conduct micro scans of confined spaces, such as vents, attics, crawl spaces, and/or other spaces that may not be navigable by a full-sized and fully capable UAV.
  • UAVs and/or UGVs may be configured to sample portions of the structure and/or the air around their navigation path.
  • a UAV or UGV may be deployed through ductwork and collect samples therein to determine if a structure has mold or fungus problems.
  • a UAV or UGV may collect samples of structural or decorative material for real-time and/or subsequent testing.
  • UAVs or UGVs may utilize various thermography techniques. For example, thermal imaging may be used to identify heat loss of a structure through windows, doors, and other openings. Temperature variants along a wall may indicate insufficient insulation or other problems in that location. Thermography techniques may also be used to detect the location of power lines and/or plumbing within walls.
  • a UAV may interface with an operator and/or automatic controls to utilize fixtures (e.g., lighting, plugs, appliances, sinks, toilets, showers, etc.) that will cause the wiring and/or plumbing to change temperature for better analysis (i.e., by drawing electric current through wires and/or causing water to run through pipes).
  • the system may compare some structural elements against applicable building codes. For example, stairs are regulated by many local jurisdictions.
  • a UAV may take multiple measurements of the risers and treads so that it can compare them to applicable building codes. The UAV may also check how level each tread is and/or how much wear is shown on the treads.
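A hedged sketch of comparing riser and tread measurements against code limits (the default limits mirror common residential-code values but vary by jurisdiction and code year, so they are assumptions here):

```python
def stair_violations(riser_heights_in, tread_depths_in,
                     max_riser=7.75, min_tread=10.0, max_variation=0.375):
    """Compare measured risers/treads (in inches) against example limits and
    return a list of human-readable issues found."""
    issues = []
    if any(r > max_riser for r in riser_heights_in):
        issues.append("riser exceeds maximum height")
    if any(t < min_tread for t in tread_depths_in):
        issues.append("tread shallower than minimum depth")
    if max(riser_heights_in) - min(riser_heights_in) > max_variation:
        issues.append("riser heights vary beyond allowance")
    return issues

print(stair_violations([7.5, 7.6, 8.0], [10.5, 10.25, 9.5]))
# ['riser exceeds maximum height', 'tread shallower than minimum depth', 'riser heights vary beyond allowance']
```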
  • UAVs and UGVs may be fitted or retrofitted with any of a wide variety of sensors and combinations of sensors, including but not limited to air quality analyzers, humidity sensors, temperature sensors, ground penetrating radar (GPR), ultrasonic sensors, infrared sensors, ultraviolet sensors, radon sensors, carbon monoxide sensors, etc. Any combination of these and the other sensors described herein may be utilized by one or more UAVs or UGVs to conduct an interior micro scan of a structure.
  • Data gathered during the micro scans may be associated with specific zones and/or rooms on a planview map.
  • images and other sensor outputs may be compiled and arranged for presentation in a two or three-dimensional rendering of the structure for data visualization. Some data may be presented via graphs or as percentages of deviation from standards.
  • a micro or detailed scan, loop scan, or another type of scan is more than a manual scan that is susceptible to user error and variation from scan to scan.
  • the micro or detailed scans described herein are more than a mere automatic or programmed scan performed according to a defined pattern.
  • the utilization of computer vision and/or a library of sensor data profiles allows for a dynamic and adaptive system that can respond in real time according to a rule set.
  • the UAV system described herein allows for an autonomous and adaptive system that can conduct an analysis in a repeatable, uniform, consistent, and detailed manner.
  • the system's ability to adapt based on a rule set in response to matched data profiles allows for increased scan speeds without undue sacrifice of accuracy or consistency. This is equally true for interior scanning and exterior scanning.
  • scan data may be determined to have a particular characteristic (e.g., construction material), and a rule set may dictate that for the particular characteristic a supplemental sensor system should be used to enhance the scan data.
  • a three-dimensional representation of the property may be presented to a user.
  • the user may click on a location on the three-dimensional representation to view micro scans from one or more sensor types and/or information relevant to a particular user.
  • an engineer or inspector may value specific types of information that differ from the information valued by other entities, such as underwriters, real estate agents, appraisers, claimants, etc.
  • the system may present different data sets and conclusions to each type of entity based on expected utility.
  • some information may be intentionally withheld and/or unavailable to certain types of entities based on access privileges.
  • a computer may include a processor, such as a microprocessor, microcontroller, logic circuitry, or the like.
  • the processor may include a special-purpose processing device, such as an ASIC, a PAL, a PLA, a PLD, a CPLD, a Field Programmable Gate Array (FPGA), or other customized or programmable device.
  • the computer may also include a computer-readable storage device, such as non-volatile memory, static RAM, dynamic RAM, ROM, CD-ROM, disk, tape, magnetic memory, optical memory, flash memory, or another computer-readable storage medium.
  • Suitable networks for configuration and/or use, as described herein, include any of a wide variety of network infrastructures.
  • a network may incorporate landlines, wireless communication, optical connections, various modulators, demodulators, small form-factor pluggable (SFP) transceivers, routers, hubs, switches, and/or other networking equipment.
  • the network may include communications or networking software, such as software available from Novell, Microsoft, Artisoft, and other vendors, and may operate using TCP/IP, SPX, IPX, SONET, and other protocols over twisted pair, coaxial, or optical fiber cables, telephone lines, satellites, microwave relays, modulated AC power lines, physical media transfer, wireless radio links, and/or other data transmission "wires."
  • the network may encompass smaller networks and/or be connectable to other networks through a gateway or similar mechanism. Geographical information from any of a wide variety of sources may be utilized. For example, publicly available (or contractually provided) mapping data, three-dimensional models, topographical data, and/or three-dimensional renderings from two-dimensional data may be utilized.
  • a software module or component may include any type of computer instruction or computer-executable code located within or on a computer- readable storage medium, such as a non-transitory computer-readable medium.
  • a software module may, for instance, comprise one or more physical or logical blocks of computer instructions, which may be organized as a routine, program, object, component, data structure, etc., that perform one or more tasks or implement particular data types, algorithms, and/or methods.
  • a particular software module may comprise disparate instructions stored in different locations of a computer-readable storage medium, which together implement the described functionality of the module.
  • a module may comprise a single instruction or many instructions and may be distributed over several different code segments, among different programs, and across several computer-readable storage media.
  • Some embodiments may be practiced in a distributed computing environment where tasks are performed by a remote processing device linked through a communications network.
  • software modules may be located in local and/or remote computer- readable storage media.
  • data being tied or rendered together in a database record may be resident in the same computer-readable storage medium, or across several computer- readable storage media, and may be linked together in fields of a record in a database across a network.
  • FIG. 1A illustrates a site selection interface 100 to receive an electronic input 110 identifying a location 115 of a structure 120.
  • a client device may present the site selection interface 100 to an operator, and the operator may identify the location 115 by entering an address and selecting 130 the search function.
  • the electronic input 110 may be an address entered by an operator.
  • the operator may enter GPS coordinates.
  • the operator may select the location 115 with a gesture or based on a selection within the map view.
  • the site selection interface 100 may also receive an electronic input 110 identifying any obstacles 122. For example, an operator may identify a tree, a shed, telephone poles, or other obstacle using a gesture within the site selection interface 100.
  • the site selection interface 100 may request an estimated height of the obstacle 122. In other embodiments, the site selection interface 100 may request the object type and then estimate the height of the obstacle 122 based on that type. For instance, a standard telephone pole is 40 feet tall; if an operator identifies an obstacle 122 on the site as a telephone pole, the site selection interface 100 may estimate its height to be 40 feet.
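The type-based height estimation described above can be sketched as a lookup with an operator override. The default heights other than the 40-foot telephone pole (which the text supplies) are illustrative assumptions:

```python
# Sketch: estimating obstacle height from an operator-supplied object type.
# Only the telephone-pole figure comes from the text; the rest are
# illustrative defaults, not authoritative values.

DEFAULT_HEIGHTS_FT = {
    "telephone_pole": 40.0,  # standard pole height noted in the text
    "mature_tree": 50.0,     # assumed default
    "shed": 12.0,            # assumed default
}

def estimate_obstacle_height(object_type, operator_estimate_ft=None):
    """Prefer the operator's explicit estimate; fall back to a type default."""
    if operator_estimate_ft is not None:
        return operator_estimate_ft
    return DEFAULT_HEIGHTS_FT.get(object_type, 25.0)  # generic fallback
```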
  • FIG. 1B illustrates parcel boundaries via a dashed line that is associated with the location 115 identified in FIG. 1A.
  • parcel information may be determined using aerial photos, satellite images, government records, plot maps, and/or the like.
  • FIG. 2 illustrates a boundary identification interface 200 to receive electronic input 230 identifying geographic boundaries 217 of an area that includes a structure 220.
  • the geographic boundaries 217 provide an area for the UAV assessment and reporting system to analyze.
  • an operator may provide electronic input 230 identifying a location on the boundary identification interface 200.
  • the electronic input 230 may be a mouse click.
  • the electronic input 230 may also be a gesture entered via a touchscreen.
  • the operator may enter an address or GPS coordinate in an address bar 210.
  • the electronic inputs 230 provided by the operator may be marked with a pin 216.
  • the pins 216 may be associated with GPS coordinates, and may be placed in corners of the site.
  • the boundary identification interface 200 may automatically form a boundary line between each pin 216. The placement of the pins 216 may be adjusted through the electronic input 230. For example, the operator may select and drag a pin 216 to a new location if the old location was inaccurate.
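The pin-based boundary formation and adjustment described above can be sketched as follows. The coordinate values and function names are illustrative; the text only specifies that pins carry GPS coordinates, that boundary lines form between consecutive pins, and that a pin can be dragged to a corrected location:

```python
# Sketch: forming a closed boundary from operator-placed pins and
# adjusting an inaccurately placed pin. Coordinates are (lat, lon)
# tuples; the specific values are illustrative.

def boundary_segments(pins):
    """Return the boundary as consecutive pin pairs, closing the loop."""
    return [(pins[i], pins[(i + 1) % len(pins)]) for i in range(len(pins))]

def move_pin(pins, index, new_coord):
    """Return a copy of the pin list with one pin dragged to a new spot."""
    adjusted = list(pins)
    adjusted[index] = new_coord
    return adjusted

pins = [(40.0, -111.9), (40.0, -111.898), (39.999, -111.898), (39.999, -111.9)]
segments = boundary_segments(pins)  # four sides; the last closes the loop
```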
  • the boundary identification interface 200 may also display the placement of the current pin 216 in a preview window 211.
  • FIG. 3A illustrates a structure identification interface 300 to receive electronic input 330 identifying structural boundaries 318 of a structure 320.
  • the structural boundaries 318 identify the corners of the structure 320 for the UAV assessment and reporting system to analyze.
  • an operator may provide electronic input 330 identifying a location on the structure identification interface 300.
  • the electronic input 330 may be a mouse click.
  • the electronic input 330 may also be a gesture entered via a touch screen.
  • the operator may enter an address or GPS coordinate in an address bar 310.
  • Boundary lines 350 formed by the boundary identification interface 200 of FIG. 2 may be displayed on the structure identification interface 300.
  • electronic input entered via the structure identification interface 300 may be limited to the area within the boundary lines 350.
  • the structure identification interface 300 may present an alert if a structural boundary 318 is located outside of the boundary lines 350.
  • the structure identification interface 300 may adjust the boundary lines 350 if a structural boundary 318 is located outside of the boundary lines 350.
  • the structure identification interface 300 may also display a current property boundary 311.
  • the electronic inputs 330 provided by the operator may be marked with pins.
  • the pins may be associated with GPS coordinates and may be placed in corners of the site.
  • the structure identification interface 300 may automatically form a boundary structure line between each pin. The placement of the pins may be adjusted through the electronic input 330. For example, the operator may select and drag a pin to a new location if the old location was inaccurate.
  • the structure identification interface 300 may also display the current pin placement in a preview window 312.
  • FIG. 3B illustrates a close-up view of the parcel boundaries 350 and the structure 320 identified in FIG. 3A by GPS markers.
  • the structure 320, which may be partially or fully defined by the operator, is illustrated in bold lines.
  • the system may utilize the markers in combination with an image (e.g., aerial or satellite) to intelligently identify the structure 320.
  • an operator of the system may fully identify the outline of the structure.
  • FIG. 4 illustrates a boustrophedonic scan of a site 450 defined by the identified geographic boundaries that include a structure 420.
  • the UAV 475 may capture images while following a boustrophedonic flight pattern 480.
  • the number of passes shown is eight; however, the actual number of passes may vary based on the size of the structure and/or property, a desired resolution, camera field of view, camera resolution, height of the UAV 475 relative to the surface, other characteristics of the desired scan, capabilities of the UAV 475, and attributes of the surface.
  • the UAV 475 may fly to a start location.
  • the start location may be at a first corner of the site 450.
  • the UAV 475 may then follow a straight path until a boundary line of the site 450 is reached.
  • the UAV 475 may then turn and follow an offset path in the opposite direction.
  • the UAV 475 may continue to travel back and forth until an endpoint 485 is reached and the entire site 450 has been traveled.
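The back-and-forth traversal described above can be sketched as waypoint generation over a rectangular site. In practice the pass spacing would derive from camera field of view and altitude; here it is a plain parameter, and the function name and coordinate frame are assumptions:

```python
# Sketch: generating a boustrophedonic (back-and-forth) flight pattern
# over a rectangular site. Units and coordinate frame are illustrative.

def boustrophedonic_waypoints(x_min, x_max, y_min, y_max, pass_spacing):
    """Yield (x, y) waypoints sweeping the site in alternating directions."""
    waypoints, y, leftward = [], y_min, False
    while y <= y_max:
        xs = (x_max, x_min) if leftward else (x_min, x_max)
        waypoints += [(xs[0], y), (xs[1], y)]  # one full pass at this offset
        leftward = not leftward                # reverse for the next pass
        y += pass_spacing
    return waypoints

wps = boustrophedonic_waypoints(0, 100, 0, 70, pass_spacing=10)  # 8 passes
```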
  • the UAV 475 may travel at a high altitude such that it will not collide with any obstacle or structure and/or avoid obstacles in the path by going around or above them.
  • the UAV 475 may capture images.
  • onboard processing or cloud-based processing may be used to identify structures and obstacles.
  • an analysis may be conducted after scanning is complete and the UAV has returned home.
  • FIG. 5 illustrates an elevation map of a site 550 with a structure 520.
  • a UAV 575 may map out the site 550 in a plurality of sub-locals 560.
  • the UAV 575 may record the distances to a surface for each of the plurality of sub-locals 560 within the site 550.
  • Each of the sub-locals 560 may correspond to potential vertical approaches for vertical descents during subsequent scans.
  • the distances may be used to detect the location of a structure or any obstacles (e.g., tree 522) on the site 550.
  • the UAV 575 may determine the boundaries and relative location of a roof of the structure 520.
  • FIG. 6A illustrates a UAV 675 performing a micro scan of a site 650.
  • the UAV 675 may make a series of vertical approaches for each sub-local 660.
  • the UAV 675 may descend within each vertical approach to a target distance 695 and then capture a detailed image of a portion 690 of a structure 620. Some of the descents may culminate proximate a surface of the roof. Other descents may culminate proximate the ground and allow for imaging of a wall of the structure 620 as the UAV 675 descends proximate a wall of the structure 620.
  • the entire site 650 may be micro scanned.
  • the elevation map 560 from FIG. 5 may provide the height to obstacles 622 and the structure 620.
  • the UAV 675 may determine the altitude change necessary to reach the target distance 695 for each sub-local 660 based on the elevation map 560.
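The altitude determination described above can be sketched as a lookup into the elevation map plus the target stand-off distance. The 15-foot figure echoes the example given later for detailed imaging; the data layout (a dict keyed by sub-local) is an assumption:

```python
# Sketch: using an elevation map of surface heights per sub-local to set
# the descent altitude that keeps the UAV at a fixed target distance from
# the surface. Heights are feet above a ground reference; values are
# illustrative.

TARGET_DISTANCE_FT = 15.0  # example stand-off distance from the text

def descent_altitude(elevation_map, sub_local, target=TARGET_DISTANCE_FT):
    """Altitude the UAV should descend to over the given sub-local."""
    surface_height = elevation_map[sub_local]
    return surface_height + target

elevation_map = {(3, 4): 22.0, (3, 5): 22.5, (7, 1): 0.0}  # roof vs. ground
alt = descent_altitude(elevation_map, (3, 4))
```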
  • certain portions of the site 650 may be micro scanned while other portions are not.
  • the UAV 675 may not micro scan the obstacle 622.
  • the UAV 675 may only micro scan the structure 620, or a certain portion 690 of the structure 620.
  • FIG. 6B illustrates an elevation map of the structure 620 to allow for micro scans or detailed scans to be performed from a consistent distance to each portion of the structure 620.
  • the UAV 675 may descend within each vertical approach to within, for example, 15 feet of the structure 620 for detailed images and/or other analysis to be performed.
  • the UAV, or associated cloud-based control systems, may identify a pitch of the roof before performing micro scans.
  • each descent within each vertical approach may be used to scan (or otherwise analyze or collect data) of a portion of the structure 620 that is not directly beneath the UAV 675. Such an approach may allow for skew-free data collection.
  • In other embodiments, micro scans may be performed directly beneath, to the side, behind, and/or in front of the UAV 675 as it descends within each vertical approach.
  • FIGS. 7A-7C illustrate a loop scan 701 and a three-dimensional model 700 of a structure 720 on a site 750.
  • the loop scan 701 may take a series of angled images 745 of the walls 748 of the structure 720.
  • a UAV 775 may perform the loop scan 701 by following a second flight pattern 740 that causes the UAV 775 to travel around the perimeter of the structure 720 at a second altitude range lower than the altitude of the boustrophedonic scan. By flying at this lower elevation, the UAV 775 captures images of the sides of the structure 720, which may be used to create a higher resolution three-dimensional model 700.
  • FIG. 8 illustrates a UAV determining a pitch 821 of a roof of a structure 820.
  • the UAV may capture three or more images of the roof: a first image at a first elevation 875, a second image at a second elevation 876, and a third image at a third elevation 877.
  • the first and the second elevations 875, 876 may be below the roof peak.
  • the third elevation 877 may be slightly above the rain gutters.
  • the UAV may use these images along with associated metadata, including proximity data, to determine the pitch 821 of the roof.
  • the UAV may also detect inconsistencies 830 to the shingles on the roof.
  • the inconsistencies 830 may be a sign of damage to the roof.
  • the UAV may mark the inconsistency 830 as a portion of interest to micro scan.
  • the UAV includes a propulsion system to move the UAV from a first aerial location to a second aerial location relative to a structure, as illustrated in FIG. 8. Movements may be horizontal, vertical, and/or a combination thereof. Lateral movements and rotation may also be possible.
  • the UAV may include one or more sensors that can be used, or possibly are specifically configured to determine distances to objects, such as a roof. The UAV may determine a distance to a roof at a first aerial location.
  • the UAV may then move to a second aerial location along a movement vector that includes one or more directional components (e.g., up, down, left, right, back, or forward, which could be more generally described as vertical, horizontal, or lateral, or even described using an X, Y, and Z coordinate system).
  • a distance to the roof may be calculated at the second aerial location.
  • a pitch of the roof may be calculated (e.g., geometrically) based on the distance measurements at the first and second locations and at least one of the components of the movement vector.
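The geometric pitch calculation described above can be sketched for the simple case of a vertical movement vector with a known horizontal component over a uniform roof plane. The function name and the rise-per-12 output convention are assumptions; the source only specifies that pitch is derived from the two distance measurements and the movement vector:

```python
import math

# Sketch: estimating roof pitch from two downward distance measurements
# taken at two aerial locations separated by a known horizontal run.
# Assumes the measurements are straight down and the roof plane is
# uniform between the two measurement points.

def roof_pitch(d1, d2, horizontal_run):
    """Return (angle_degrees, rise_per_12) for the roof between two points."""
    rise = abs(d1 - d2)  # change in distance to the roof equals the roof rise
    angle = math.degrees(math.atan2(rise, horizontal_run))
    rise_per_12 = 12.0 * rise / horizontal_run  # conventional pitch notation
    return angle, rise_per_12

# 5 ft of rise over 10 ft of run: a "6/12" roof at about 26.6 degrees
angle, pitch = roof_pitch(d1=20.0, d2=15.0, horizontal_run=10.0)
```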
  • FIG. 9 illustrates a UAV assessment and reporting system using the date and time 910 to identify and/or optionally eliminate shadows in image captures.
  • a UAV 975 may receive the current date and time 910.
  • the UAV 975 may determine a shadow 945 of obstacles 922 on a site 950.
  • the UAV 975 may refrain from taking images of the portion of a structure 920 covered by the shadow 945 of the obstacle 922, annotate or otherwise identify shadow 945, and/or take additional images at a subsequent time when the shadow 945 has moved. Further, the UAV 975 may determine a time when the shadow 945 will move away from the roof.
  • the UAV assessment and reporting system using the date may also adjust the camera angle on the UAV 975 to avoid shadows 946 from the UAV 975.
  • FIG. 10 illustrates a UAV assessment and reporting system for analyzing a structure, according to one embodiment.
  • a user interface 1010 may include a site selection interface 1015 to receive an electronic input from an operator, user, or technician that identifies a location of a structure or other object to be assessed.
  • the user interface 1010 may further include a boundary identification interface 1020 to receive user input identifying geographic boundaries of a site or lot containing a structure and/or of the structure itself.
  • the user interface 1010 may additionally or optionally include a hazard identification interface 1025 allowing a user to identify one or more hazards proximate a structure or site identified using the site selection interface 1015.
  • a control system 1030 may be onboard a UAV 1055 or may be remote (e.g., cloud-based).
  • the control system 1030 may provide instructions to the UAV 1055 to cause it to conduct an assessment.
  • the control system 1030 may include a camera control module 1035, other sensor control modules 1040, image and/or sensor processing modules 1045, and/or scanning modules 1050 to implement boustrophedonic, loop, and/or micro scans.
  • the UAV 1055 itself may include a camera 1060, one or more optical sensors 1065, ultrasonic sensors 1070, other sensors 1075, and one or more network communication systems 1080.
  • FIG. 10 is merely representative of one example embodiment, and numerous variations and combinations are possible to implement the systems and methods described herein.
  • FIG. 11 A illustrates a structure 1101 that has been identified for a micro scan by a UAV.
  • Manual input, top-down scans, loop scans, boustrophedonic scans, or a combination or permutation thereof may be used to identify a navigational risk zone 1110.
  • the navigational risk zone 1110 may represent an area within which the UAV may be required to navigate while performing micro scans. Obstructions and other elements relevant to the micro scan process located within the navigational risk zone 1110 or affecting the UAV as it navigates within the navigational risk zone 1110 may be identified and tagged. For example:
  • overhanging branches 1180 may present a navigational risk to the UAV performing the micro scans
  • a satellite dish 1170 may provide a navigational obstacle to the UAV as well as present post-construction penetrations in the building envelope that may require enhanced analysis or scanning
  • a skylight 1160 may present glare or other sensor disruptions as well as potential seals that require inspection or analysis
  • a power line 1195 entering the structure 1101 from power pole 1190 may be identified as a navigational obstruction to be avoided.
  • FIG. 11B illustrates virtual tags assigned at geographic locations relative to the obstacles and scan-relevant elements.
  • the illustrated tags may provide information to the UAV as it comes within proximity of each tag.
  • the tag may provide relevant information to aid the navigation and/or scanning of the structure 1101.
  • FIG. 11C illustrates an example of the type of information that may be associated with each of the tags.
  • the tag information may be an indication of the type of risk associated with the obstacle or scan-relevant element.
  • the tag information may provide scan instructions and/or movement instructions, and/or trigger a reaction according to a rule set of the UAV.
  • the tag may be configured to provide a trigger for the UAV to query and follow instructions from a rule set.
  • the instructions from the rule set may be related to scanning or navigation.
  • the rule set and/or reaction to the rule set may vary based on the type of UAV being used for the micro scan and/or the type of scan being performed.
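The tag-triggered rule lookup described above can be sketched as a table keyed by tag type and UAV type. All tag names, UAV types, and actions below are illustrative placeholders, not values from the source:

```python
# Sketch: a UAV querying a rule set when it comes within proximity of a
# virtual tag. The rule, varying by UAV type, dictates the reaction.

RULE_SET = {
    ("navigation_hazard", "quadcopter"): "increase_standoff_distance",
    ("scan_relevant", "quadcopter"): "perform_enhanced_seal_scan",
    ("navigation_hazard", "fixed_wing"): "reroute_around_obstacle",
}

def react_to_tag(tag_type, uav_type, rules=RULE_SET):
    """Look up the action for this tag, falling back to the default scan."""
    return rules.get((tag_type, uav_type), "continue_default_scan")

action = react_to_tag("navigation_hazard", "quadcopter")
```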
  • FIG. 12 illustrates a system 1200 for property analysis, according to one embodiment.
  • the UAV computer vision system 1200 may be onboard the aerial vehicle, cloud-based, or a combination thereof.
  • the system 1200 may include a processor 1230, memory 1240, and a network interface 1250 connected to a computer-readable storage medium 1270 via a bus 1220.
  • a scanning module 1280 may incorporate or control any of the systems described herein and implement any of the methods described herein.
  • a navigation module 1282 may utilize navigation sensors of the UAV and include various control mechanisms for navigating the UAV to perform scans, including boustrophedonic, loop, and/or micro scans.
  • the risk zone generator 1284 may generate a risk zone associated with the property (e.g., vehicle, structure, tower, bridge, road, residence, commercial building, etc.) within which the UAV may navigate while performing one or more types of scanning operations.
  • the risk zone generator 1284 may tag portions of the risk zone with scan-relevant tags and obstacle tags to aid the scanning of the property and/or avoid obstacles during navigation.
  • a tag reading module 1286 may receive information from tags based on the location of the UAV within the risk zone and relative to the property.
  • the tag reading module 1286 may receive scan-relevant or navigation-relevant information.
  • the information therein may be used to query a rule set 1288.
  • the rule set 1288 may modify a navigation pattern, flight direction, scan type, scan details, or other action taken or being taken by the UAV in response to a rule set's interpretation of information provided by a tag read by the tag reading module 1286.
  • the UAV computer vision system 1200 may also access a library of data profiles 1289. Scan data captured by the UAV of any type of sensor may be compared and matched with data profiles within the library of data profiles 1289. In response to the UAV computer vision system 1200 identifying a match within the library of data profiles 1289, the rule set 1288 may dictate a modification to the scanning or navigation pattern.
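The profile-matching step described above can be sketched by reducing each data profile to a feature vector compared by distance. A real system would use computer-vision models or richer sensor signatures; the profile names, feature values, and threshold here are illustrative assumptions:

```python
# Sketch: matching captured sensor data against a library of data
# profiles and consulting a rule set when a match is found.

def match_profile(features, library, threshold=1.0):
    """Return the name of the closest library profile within threshold."""
    best_name, best_dist = None, threshold
    for name, profile in library.items():
        dist = sum((a - b) ** 2 for a, b in zip(features, profile)) ** 0.5
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name

LIBRARY = {"hail_pop": [0.9, 0.1, 0.3], "wind_damage": [0.2, 0.8, 0.5]}
RULES = {"hail_pop": "rescan_with_higher_resolution"}

match = match_profile([0.85, 0.15, 0.32], LIBRARY)
next_action = RULES.get(match, "continue")
```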
  • FIG. 13 illustrates examples of possible library images 1305-1340 of a property analysis library or library of data profiles, according to various embodiments.
  • Many examples of data profiles may not be optical and thus are not well illustrated in the drawings.
  • infrared data profiles and ultrasound profiles may be used instead of or in addition to optical data profiles.
  • the UAV system may capture sensor data and identify a material by comparing the captured images with data profiles within a library of data profiles.
  • computer vision may be used to identify a roof as cedar shakes 1305, asphalt shingles 1310, wood 1315, or glass 1320.
  • windblown cedar shakes 1325 may be identified through computer vision techniques.
  • Hail pops in asphalt shingles 1330 may be identified by matching captured image data with stored data profiles.
  • defects in wood 1335 and broken glass 1340 may be identified by matching captured sensor data with library data profiles.
  • FIG. 14A illustrates an example of a planview map 1410 (e.g., a blueprint or a schematic) of a facility that may be created, scanned, or imported by a UAV.
  • autonomous vehicles such as UAVs or UGVs may utilize the planview map 1410 to plan and execute a micro scan.
  • the planview map 1410 may be scanned or imported from existing technical drawings of the structure with accompanying details and notes describing variations and details. Construction details and callouts (sometimes referred to as anchors) may be utilized by the UAV for interior micro scanning.
  • the planview map may include interior and/or exterior data.
  • the UAV may utilize the plan view map to identify interior and exterior walls, such as exterior wall 1420.
  • a UAV system may interpret standard symbols, line thicknesses, digital annotations, and other markings to develop a scanning strategy.
  • an existing planview map may not be available for a structure.
  • a UAV may perform the scan in real-time while conducting a micro scan and may optionally create a planview map for subsequent use and/or as part of a deliverable micro scan report.
  • FIG. 14B illustrates the detection of pathways for generating perimeters and the detection of standard symbols, according to one embodiment.
  • thick black walls 1420 have been identified as exterior walls.
  • Unfilled interior walls 1430 next to a kitchen or bathroom have been identified as wet walls.
  • Other, thinner black walls 1425 have been identified as interior walls.
  • Each doorway 1435 has been identified as such.
  • Windows 1431 may also be identified for enhanced scanning as appropriate for windows and surrounding walls.
  • specific wall types, and/or regions or zones bounded by such wall types may be associated with specific scanning or navigational instructions.
  • wet walls 1430 may automatically be treated with a higher level of scrutiny for moisture analysis.
  • Doorways 1435 may be marked as areas requiring special navigational considerations.
  • each doorway may be marked as a location where a UAV should traverse at an elevation of no more than four feet, to provide plenty of clearance for a standard 6' 8" door clearance.
  • Load-bearing walls such as exterior walls 1420 may be identified for a different level of scanning scrutiny than interior walls. For instance, exterior walls may be subjected to thermal imaging to confirm a uniform distribution of insulating materials. Such analysis may not be needed for interior walls. Differentiating between wall types decreases the time and volume of data required to finish a comprehensive micro scan without compromising the quality of the complete micro scan. Similar differentiations and efficiencies can be made for wet versus dry locations, load-bearing versus non-load-bearing walls, basement versus above-ground rooms, walls with windows, etc.
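The wall-type differentiation described above can be sketched as a mapping from wall type to scan plan. The sensor names and dwell times are illustrative assumptions; only the pairing of exterior walls with thermal imaging and wet walls with moisture analysis comes from the text:

```python
# Sketch: assigning different scanning scrutiny to different wall types.
# Sensor lists and dwell durations are illustrative placeholders.

WALL_SCAN_PLANS = {
    "exterior": {"sensors": ["optical", "thermal"], "dwell_s": 8},    # insulation check
    "wet_wall": {"sensors": ["optical", "moisture"], "dwell_s": 10},  # moisture analysis
    "interior": {"sensors": ["optical"], "dwell_s": 3},               # lighter pass
}

def plan_for_wall(wall_type):
    """Return the scan plan for a wall, defaulting to the interior plan."""
    return WALL_SCAN_PLANS.get(wall_type, WALL_SCAN_PLANS["interior"])

plan = plan_for_wall("wet_wall")
```

Skipping the thermal pass on interior walls is what yields the time and data savings the text describes.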
  • FIG. 14C illustrates an initial assessment of a planview map 1410 in which regions have been marked with specific zones associated with particular analysis and/or sensor utilization requirements, according to one embodiment.
  • a UAV system may utilize variables such as room size, irregularities, types of walls, the existence of windows, known hazards, expected or identified plumbing or electrical fixtures, and the like to assign zone types to each or at least some of the regions in the planview map 1410.
  • the UAV system may annotate (e.g., digitally) an existing planview map to provide navigational and scanning instructions to a UAV and/or UGV to conduct the internal structural analysis.
  • zones (such as zones 1440, 1445, 1450, and 1455 noted in FIG. 14C) may be assigned to the various rooms of the structure.
  • Each zone may be defined by a perimeter that may or may not correspond to physical barriers or features within the structure.
  • the UAV system may assign scanning and/or navigational instructions to each zone type including, but not limited to, instructions regarding the type of sensors to be used, types of scans to be conducted, navigational risk identifiers, navigation instructions, and/or the like.
  • a first zone 1440 may correspond to open rooms expected to be free from obstructions that can be easily navigated.
  • a second zone 1445 may be defined for rooms or regions of the structure that have or are expected to have irregularities or nonconforming obstacles. For example, a bathroom may be reasonably expected to have a shower curtain that may not be shown on a planview map.
  • Such objects may be identified and/or avoided in real time during a micro scan by a UAV, but annotating a planview map may aid in such identification and/or instruct the UAV to proceed with extra caution.
  • a third zone 1455 may be used to define hallways or stairs and provide specific navigation instructions for traversing the zone, specific instructions for the types of scans to be completed within the zone, and possibly which types of warnings or automatic controls of the UAV should be disabled.
  • a UAV may have sensors to prevent it from rising in elevation if a ceiling is detected within three feet.
  • the UAV may also have sensors to prevent it from advancing forward if a wall is within two feet.
  • Specific instructions may be required to disable these auto-collision avoidance systems to navigate the UAV up a stairwell at a diagonal.
  • a fourth zone 1450 may identify regions or portions of the structure that require specific sensor scanning and/or increased navigational caution.
  • fourth zone 1450 may include wet walls 1430 that may benefit from enhanced scanning and/or the utilization of specific sensor types.
  • Zone assignment and associated navigational, scanning, and other parameters may be assigned automatically, modified in real-time by the UAV, and/or manually assigned by a human operator before micro scanning begins or in real-time as analysis is performed.
  • a UAV may detect a previously unidentified ceiling fan and modify the zoning parameters and/or planview map 1410 to include the obstruction.
  • FIG. 14D illustrates auto-generated boustrophedonic flight path 1460 through the structure shown on the planview map 1410.
  • UAV 1465 may navigate the structure using the pre-determined flight path.
  • the boustrophedonic flight path 1460 may be used to confirm or create the planview map 1410.
  • the UAV 1465 may conduct one or more aspects of the micro scan using one or more sensor types at one or more altitudes such that the micro scan is completed when the boustrophedonic flight path 1460 has been fully traversed.
  • the UAV 1465 may traverse a boustrophedonic flight path 1460 as an initial step of the micro scan to identify potential hazards and identify the types of scanning and sensors that will be used for subsequent scanning paths.
  • the boustrophedonic flight path 1460 may be used to determine a range of elevations that may be used for vertical descents along each of the walls 1420.
  • the UAV 1465 may then perform a boustrophedonic flight path at a fixed (or variable) distance from the wall at various elevations. That is, the UAV 1465 may travel back and forth along each wall in a boustrophedonic pattern at decreasing or increasing elevations as it scans the entire wall.
  • FIG. 14E illustrates an embodiment in which multiple UAVs 1475, 1476, 1477, and 1478 are used to analyze various rooms of a facility (structure) and/or communicate with each other and/or relay communication with a host or control device (not shown), according to various embodiments. Due to complications of communicating through walls and other obstructions, the UAV system may utilize relayed communication to maintain continual communication.
  • a single UAV within the structure shown in planview map 1410 may lose network connection and/or access to control signals due to the walls 1420. In such situations, multiple UAVs 1475-1478 may be deployed.
  • the network of UAVs 1475-1478 may act as a mesh or relay communication network 1480 in which each UAV 1475-1478 may utilize one or more other UAVs 1475-1478 as an intermediary network node to maintain connectivity with a host or control device.
  • Such networks may utilize self-healing algorithms to maintain communication between nodes (UAVs 1475-1478).
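As a minimal sketch of such relay connectivity, a UAV remains controllable if some chain of UAV-to-UAV links reaches the host. The link table below is hypothetical; real mesh protocols additionally handle routing metrics, link quality, and self-healing:

```python
from collections import deque

def can_reach_host(links, start, host):
    """Breadth-first search over UAV-to-UAV links to test host reachability."""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        if node == host:
            return True
        for neighbor in links.get(node, ()):
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append(neighbor)
    return False

# Hypothetical topology: UAV-1475 deep in the structure relays through peers.
links = {
    "UAV-1478": ["host"],
    "UAV-1477": ["UAV-1478"],
    "UAV-1476": ["UAV-1477"],
    "UAV-1475": ["UAV-1476"],
}
```

A self-healing implementation would rerun this reachability check whenever a link drops and reroute through any surviving chain.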
  • FIG. 14F illustrates an embodiment in which a UAV swarm 1494 is used to analyze a room of a facility bounded by various walls 1420, according to one embodiment.
  • a UAV swarm, which may be referred to as a "drone swarm," may include any number of UAVs.
  • a UAV swarm 1494 may react to real-time data analysis by re-focusing scan efforts based on communication from one UAV to other members of the swarm (directly or indirectly through a control system).
  • the UAV swarm 1494 may not be pre-programmed synchronized individuals. Rather, the UAV swarm 1494 may function as a collective organism, sharing one distributed brain for decision-making and adapting to each other like swarms in nature. Because every UAV in the UAV swarm 1494 communicates and collaborates with every other UAV, the swarm has no leader and can adapt to UAVs entering or exiting the swarm. Each UAV in the swarm may be aware of where it is relative to other UAVs in the swarm and/or relative to its nearest neighbors.
  • FIG. 15 illustrates a UAV 1505 scanning a wall 1507 to identify deviations and/or confirm that the wall is plumb along a line 1553 between a first measurement location 1550 and a second measurement location 1551, according to one embodiment.
  • the UAV 1505 may make a first distance measurement 1520 at a first altitude 1510 and then make a second distance measurement 1540 at a second altitude 1530. If the distances D1 1520 and D2 1540 are the same, then the wall can be confirmed plumb. If the distances differ by more than an acceptable standard deviation, the wall may be noted as not being plumb. A wall 1507 that is not plumb may be a red flag for other issues. Thus, a decision that a wall is not plumb may result in the UAV 1505 conducting a plurality of additional measurements that might not have been conducted absent a finding of a non-plumb wall.
  • the real-time decision logic allows the UAV 1505 to conduct a comprehensive scan in the least amount of time possible. Absent real-time decisions, the UAV 1505 would either have to over-scan the entire structure or fail to conduct scanning that should be performed based on initial findings (e.g., the lack of a plumb wall).
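The plumb test and its follow-on decision may be expressed as a small rule. The 1 cm tolerance and the count of extra measurements are illustrative assumptions, not values from the disclosure:

```python
def plumb_check(d1_m, d2_m, tolerance_m=0.01):
    """Compare horizontal wall distances measured at two altitudes.

    Equal distances (within tolerance) indicate a plumb wall; a larger
    difference triggers additional scanning, mirroring the real-time
    decision logic described above.
    """
    plumb = abs(d1_m - d2_m) <= tolerance_m
    extra_measurements = 0 if plumb else 8  # illustrative follow-up count
    return plumb, extra_measurements
```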
  • FIG. 16A illustrates a UAV 1605 scanning in a vertical direction 1620 and 1640 from multiple locations 1610 and 1630 to identify deviations in the ceiling between scan locations 1625 and 1627, such as bowing or sagging, according to one embodiment.
  • the UAV 1605 uses internal sensors to confirm that it has not moved vertically as it travels from location 1610 to location 1630.
  • UAV 1605 scans upward and downward at the same time and compares the total distance from ceiling to floor at location 1610 with the total distance from ceiling to floor at location 1630.
  • a planar evaluation system associated with a UAV may be used to make two or more measurements on a wall or ceiling to determine if the wall is plumb or, in the case of a ceiling, if the ceiling is bowed. More generically, the planar evaluation system can utilize distance measurements from a UAV to an interior boundary from at least two locations within a structure to determine if an interior boundary is planar or not (i.e., plumb, bowed, curved, etc.).
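A minimal sketch of such a planar evaluation, assuming distance samples taken at a constant UAV altitude and an illustrative 2 cm tolerance:

```python
def boundary_is_planar(distances_m, tolerance_m=0.02):
    """True if distance-to-boundary samples vary by no more than the tolerance.

    Samples are assumed to be taken from several locations at a constant UAV
    altitude; a larger spread suggests a bowed, sagging, or non-plumb boundary.
    """
    return max(distances_m) - min(distances_m) <= tolerance_m
```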
  • FIG. 16B illustrates a UAV 1610 scanning a ceiling 1670 to identify paint chipping 1660, according to one embodiment.
  • the UAV 1610 may include sensors on a top portion to perform ceiling scans 1650, be capable of inverting, and/or be capable of tilting sufficiently for an angled view of the ceiling 1670 to be scanned.
  • the UAV 1610 may include one or more sensors of one or more types, such as optical sensors, infrared sensors, ultrasonic sensors, ultraviolet sensors, moisture sensors, and the like.
  • the UAV, a real-time connected processing controller, and/or subsequent processing units may be used to analyze a ceiling, or other surface or subsurface, for discontinuities and/or identify matches with a library of defects or damage as described herein.
  • FIG. 16C illustrates the UAV 1610 scanning a wall 1671 to identify deteriorating sheetrock 1661 within a scan region 1651, according to one embodiment.
  • FIG. 16D illustrates a UAV 1610 scanning, at 1652, a wall 1671 using at least one of infrared and ultrasonic sensors to detect structural straps 1633 on studs 1622 within the wall 1671, according to one embodiment.
  • the UAV may be able to detect various material types, heat signatures, and densities (e.g., via ultrasound) to determine that proper structural components are in place. For example, it may be desirable in some regions to confirm the installation of earthquake-proofing and/or hurricane-proofing components, such as hurricane straps.
  • FIG. 17 illustrates a comparison of a squared door frame 1705 from a stored data library with out-of-alignment acquired photo data 1720.
  • when elements such as door frames are irregular or skewed, they may be specifically tagged 1730 to indicate the need for future inspection.
  • the library of data profiles 1289 of the UAV computer vision system 1200 in FIG. 12 may include one or more profiles associated with interior features, such as door frames.
  • the UAV may use the library of data profiles 1289 in real-time to identify characteristics of a portion of a structure being scanned for identification purposes and/or to detect defects in workmanship.
  • a data profile within the library may include a rectangle shape 1710 aligned around a door frame 1705 and a pre-hung door 1707.
  • An acquired photo of a door 1717 within a structure can be compared to the library element to determine if the overall rectangular shape 1720 is askew, the door frame 1715 is askew, and/or the door 1717 itself is askew.
  • a determination that the door 1717 is askew by more than an acceptable skew deviation may result in a flag 1730 being recorded.
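A minimal version of this skew comparison can be computed from two detected corner points of a frame's top edge. The corner coordinates and the 1-degree threshold below are hypothetical:

```python
import math

def skew_degrees(top_left, top_right):
    """Angle of the top edge relative to horizontal, from two corner
    pixel coordinates detected in an acquired photo."""
    dx = top_right[0] - top_left[0]
    dy = top_right[1] - top_left[1]
    return abs(math.degrees(math.atan2(dy, dx)))

def askew_flag(top_left, top_right, max_skew_deg=1.0):
    """Record a flag when measured skew exceeds the acceptable deviation."""
    return skew_degrees(top_left, top_right) > max_skew_deg
```

The same check could be applied separately to the overall rectangle, the frame, and the door itself, as the comparison above describes.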
  • FIG. 18 illustrates the use of a single sensor or an array of sensors in a UAV 1810 to identify boundaries 1818, 1825, 1840, and 1850 and objects 1815, 1817, and 1819 of an interior region of a structure, according to one embodiment.
  • the UAV 1810 may also identify window 1827 in wall 1825.
  • the sensor usage depicted in FIG. 18A is merely representative of the data collected by a combination of sensors and/or obtained through multiple flight passes within a room, such as via the boustrophedonic flight path shown in FIG. 14D, with the UAV continually bombarding its surroundings with an array of various sensors.
  • the UAV 1810 may be configured to identify a state of deterioration of furniture within the room and/or other fixtures within the home. For example, mounted television, artwork, lighting fixtures, banisters, railings, and the like may be scanned and analyzed for discontinuities, deviations from expected values, and/or matches within a library of characterizations.
  • FIG. 19A illustrates the use of a UGV 1930 conducting an inspection of ducting 1927, according to one embodiment.
  • an entrance location 1910 may be identified by a UAV during a boustrophedonic flight path.
  • Vent 1920 may be removed by the UGV 1930 or manually by an operator to reveal entrance 1925 in wall 1905.
  • FIG. 19B illustrates the inspection and sampling of ducting 1927 using the UGV 1930, according to one embodiment.
  • UGV 1930 may be specifically designed to navigate ducting and/or other small passages.
  • UGV 1930 may have special navigation capabilities and/or sensor types for ducting and/or other narrow passages.
  • the UGV 1930 may conduct thermal, visual, ultrasonic, and/or other analyses.
  • UGV 1930 may also collect physical samples for later analysis and/or perform real-time physical sample analysis.
  • an anemometer may be used to measure the airflow within the ducting.
  • a comparison of measurements with specified values from equipment manufacturers or a comparison of measurements at various times over days, weeks, months, years may be used to identify dirty filters and/or failing equipment.
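Both comparisons, against a manufacturer's rated value and against the unit's own measurement history, can be sketched as threshold rules. The 80% fraction and flag wording are illustrative assumptions:

```python
def airflow_flags(measured_cfm, rated_cfm, history_cfm, low_fraction=0.8):
    """Flag low airflow against the rated spec and against past readings."""
    flags = []
    if measured_cfm < low_fraction * rated_cfm:
        flags.append("below manufacturer spec: possible failing equipment")
    if history_cfm and measured_cfm < low_fraction * min(history_cfm):
        flags.append("declining vs. history: possible dirty filter")
    return flags
```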
  • the UGV 1930 may identify and/or tag gaps, gashes, or holes 1950.
  • UGV 1930 may be equipped with the capability to fix minor problems and holes using applied adhesives or spray foams.
  • UGV 1930 may also be configured to identify buildup of material 1960 that may constrict airflow and/or be associated with mold or fungi. As previously noted, physical samples may be acquired and/or analyzed in real time.
  • FIG. 20A illustrates a UAV 2000 using thermal imaging 2020 to identify defects in wall 2005 around a window 2010, according to one embodiment. Black dots are used to show the uniformity of thermal temperature around the window 2010, indicating that the seal around the window is adequate.
  • the library of data profiles 1289 of the UAV computer vision system 1200 in FIG. 12 may include one or more profiles associated with interior construction materials and expected infrared profiles around specific objects (such as windows, plumbing, studs, wiring, etc.).
  • FIG. 20B illustrates a UAV 2000 scanning within a wall 2005 to detect a poorly insulated portion 2011 of the wall near a corner, according to one embodiment.
  • FIG. 20C illustrates a UAV 2000 analyzing a subsurface of a wall 2005 to detect moisture or a water-damaged portion 2012, according to one embodiment.
  • the poorly insulated portion 2011 may be detected through a different scan with a different sensor, or using the same sensor(s) at the same time.
  • FIG. 20D illustrates an infrared scan near an outlet 2032 below a window 2031 to identify an anomalously high temperature of a wire 2033, according to one embodiment.
  • FIG. 20E illustrates an infrared scan near the outlet 2032 to identify wire paths 2034 within a wall, according to one embodiment.
  • the "hot" wire 2033 may be identified at the same time and/or using a different sensor.
  • the UAV 2000 may be deployed within a room to map out wire paths, plumbing paths, and/or other subsurface components using infrared and/or ultrasonic sensors.
  • outlets and lights may be turned on and/or connected to high-current sources to heat up the wires as much as possible prior to testing.
  • FIG. 20F illustrates an internal scan of a portion of a wall to identify insect damage and/or insects 2047 within the wall, according to one embodiment.
  • an ultrasonic or thermal (infrared) scan of a wall should be more or less constant over time. The detection of movement within the wall may be indicative of an infestation of rodents or insects. Moreover, insects and/or rodents may damage the subsurfaces, and the UAV 2000 may scan for matched profiles from a library, detect anomalies or discontinuities, and/or rely on human-assisted analysis.
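Movement inside a wall can be detected by differencing successive scan frames. The grid values and threshold below stand in for real ultrasonic or thermal readings and are illustrative only:

```python
def movement_detected(frame_a, frame_b, threshold=5.0):
    """True if any cell of two successive scan frames (2-D grids of
    ultrasonic or thermal readings) changed by more than the threshold."""
    return any(
        abs(a - b) > threshold
        for row_a, row_b in zip(frame_a, frame_b)
        for a, b in zip(row_a, row_b)
    )
```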
  • FIG. 20G illustrates a UAV swarm 2091 of micro-UAVs for scanning a wall, according to one embodiment. As illustrated, each micro-UAV may scan a portion of the wall and in concert with the other UAVs in the swarm, may scan the entire wall.
  • FIG. 20H illustrates a UAV swarm of micro-UAVs for scanning a wall in a scanning formation, according to one embodiment. As illustrated, a formation of micro-drones may move from a first location 2098 to a second, lower location 2099 as they collectively scan the wall.
  • FIG. 21 illustrates a UAV 2120 being used to navigate and/or perform a micro scan 2110 of stairs 2105.
  • the UAV 2120 may navigate at an angle 2130 up the stairs 2105. As the UAV 2120 navigates up the stairs 2105, it may collect measurements of the width, length, rise, etc. of the stairs 2105, which may be compared against current structural code requirements, according to various embodiments.
  • the UAV 2120 may measure the stairs 2105 in both horizontal and vertical directions to collect data regarding each riser and tread of the stairs 2105. Again, actual measurements and images may be compared with a library of stored profiles. Deviations from expected values may be noted, annotated on a planview map, and/or reported.
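The riser and tread comparison can be sketched as a per-step threshold check. The numeric limits below are illustrative placeholders, not a statement of any jurisdiction's actual building code:

```python
def stair_violations(risers_in, treads_in, max_rise_in=7.75, min_tread_in=10.0):
    """Return indices of steps whose measured rise or tread falls outside
    the (illustrative) code limits, for annotation on a planview map."""
    return [
        i
        for i, (rise, tread) in enumerate(zip(risers_in, treads_in))
        if rise > max_rise_in or tread < min_tread_in
    ]
```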
  • FIG. 22 A illustrates a UGV 2210 conducting a structural analysis using ground penetrating radar (GPR) 2230, according to one embodiment.
  • UGV 2210 may include a special sensor 2220 capable of emitting and/or receiving GPR 2230 to scan a solid body, such as a concrete foundation 2200, to identify hidden fissures, voids or defects.
  • the UGV 2210 may follow a boustrophedonic path to scan an entire slab and/or footings using GPR.
  • FIG. 22B illustrates the detection of hidden subsurface fissures, voids or defects 2240.
  • returned GPR data may be compared with a library of GPR data profiles to identify the types of fissures, voids, or defects.
  • the UAV may match data profiles to quickly determine if additional scanning is necessary or if sufficient information has been obtained to make a determination.
  • FIG. 23 illustrates examples of the types of data profiles that may be stored within a library of data profiles associated with interior micro scans. While only six examples are illustrated, it is appreciated that hundreds or even thousands of data profiles for each sensor type, construction material, and/or location type may be stored within a library.
  • the data profiles may be stored hierarchically to enable a UAV system to quickly navigate to the correct data profile for profile matching analysis.
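One way to realize such hierarchical storage is nested lookup tables keyed by sensor type, construction material, and location type. The categories and profile names below are invented for illustration:

```python
# Hypothetical hierarchy: sensor type -> construction material -> location type.
LIBRARY = {
    "infrared": {
        "drywall": {"interior_wall": ["water_damage", "missing_insulation"]},
    },
    "ultrasonic": {
        "concrete": {"foundation": ["void", "fissure"]},
    },
}

def profiles_for(sensor, material, location, library=LIBRARY):
    """Navigate the hierarchy in a few hops; an empty list means no
    stored profiles are available for matching."""
    return library.get(sensor, {}).get(material, {}).get(location, [])
```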
  • Data profile 2310 is marked as a defect 2312 related to water damage 2311.
  • Data profile 2320 is marked as a defect 2322 related to surface material cracking 2321.
  • Data profile 2330 is marked as a defect 2332 related to an incorrectly installed electrical outlet 2331.
  • the library may include a correctly installed electrical outlet that is square relative to the floor and/or the aperture in the wall. Images of each electrical outlet in a structure may be compared with the square image - matches may be marked as acceptable while any outlets that are not square beyond an acceptable level of deviation may be marked as problem areas.
  • Data profile 2340 is marked as a defect 2342 because of waves 2341 in a wall surface beneath a window.
  • Data profile 2350 is marked as a defect 2352 because of a mismatch in material 2351 noted in a floor of a structure.
  • the data profile stored in the library may simply indicate that acceptable standards require that the entire floor have a uniform density, material constructions, or elevation. Captured data may indicate a problem based on an ultrasonic analysis of density, a visual analysis of construction materials, and/or a laser distance measurement of elevations at various points along the floor.
  • data profile 2360 is marked as a defect 2362 because of moisture readings taken along a top edge of a wall that revealed moisture 2361.
  • the shape of the moisture 2361 may be matched with data profiles within the library of data profiles that state that this moisture shape indicates it is dripping from above.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Signal Processing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Image Processing (AREA)

Abstract

An unmanned aerial vehicle (UAV) assessment and reporting system may use one or more scanning techniques to provide useful assessments and/or reports for structures and other objects. The scanning techniques may be performed in sequence and optionally used to refine each subsequent scan. The system may include shadow elimination, annotation, and/or reduction of the UAV itself and/or other objects. A UAV may be used to determine a roof pitch of a structure. The roof pitch may be used to refine a subsequent scan and data capture to capture perpendicular images at target fields of view and/or target distances to capture images of sample regions having defined lengths, widths, areas, and/or other dimensions.
PCT/US2017/059990 2016-11-04 2017-11-03 Systems and methods for autonomous imaging and structural analysis WO2018089268A1 (fr)

Applications Claiming Priority (18)

Application Number Priority Date Filing Date Title
US201662417779P 2016-11-04 2016-11-04
US62/417,779 2016-11-04
US15/360,641 2016-11-23
US15/360,630 2016-11-23
US15/360,630 US9734397B1 (en) 2016-11-04 2016-11-23 Systems and methods for autonomous imaging and structural analysis
US15/360,641 US9639960B1 (en) 2016-11-04 2016-11-23 Systems and methods for UAV property assessment, data capture and reporting
US15/388,754 US9823658B1 (en) 2016-11-04 2016-12-22 Systems and methods for adaptive property analysis via autonomous vehicles
US15/388,754 2016-12-22
US15/446,202 US10055831B2 (en) 2016-11-04 2017-03-01 Systems and methods for adaptive property analysis via autonomous vehicles
US15/446,202 2017-03-01
US15/480,310 2017-04-05
US15/480,310 US9965965B1 (en) 2016-11-04 2017-04-05 Systems and methods for adaptive property analysis via autonomous vehicles
US15/675,616 2017-08-11
US15/675,616 US10089529B2 (en) 2016-11-04 2017-08-11 Systems and methods for adaptive scanning based on calculated shadows
US15/708,471 2017-09-19
US15/708,471 US9996746B1 (en) 2016-11-04 2017-09-19 Systems and methods for autonomous perpendicular imaging with a target field of view
US15/710,221 2017-09-20
US15/710,221 US9886632B1 (en) 2016-11-04 2017-09-20 Systems and methods for autonomous perpendicular imaging of test squares

Publications (1)

Publication Number Publication Date
WO2018089268A1 true WO2018089268A1 (fr) 2018-05-17

Family

ID=62109637

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2017/059990 WO2018089268A1 (fr) 2017-11-03 Systems and methods for autonomous imaging and structural analysis

Country Status (1)

Country Link
WO (1) WO2018089268A1 (fr)

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020013644A1 (en) * 2000-03-06 2002-01-31 Mekemson James R. Method and apparatus for pavement cross-slope measurement
US8346578B1 (en) * 2007-06-13 2013-01-01 United Services Automobile Association Systems and methods for using unmanned aerial vehicles
US20100110074A1 (en) * 2008-10-31 2010-05-06 Eagle View Technologies, Inc. Pitch determination systems and methods for aerial roof estimation
US20150016689A1 (en) * 2008-10-31 2015-01-15 Eagle View Technologies, Inc. Pitch determination systems and methods for aerial roof estimation
US20130170694A1 (en) * 2009-05-22 2013-07-04 Pictometry International Corp. System and Process for Roof Measurement Using Aerial Imagery
WO2016053438A2 (fr) * 2014-07-21 2016-04-07 King Abdullah University Of Science And Technology Traitement de structure a partir de mouvement (sfm) pour un vehicule aerien sans pilote
US20160272308A1 (en) * 2015-03-18 2016-09-22 Amazon Technologies, Inc. Adjustable landing gear assembly for unmanned aerial vehicles
US9618940B1 (en) * 2015-12-31 2017-04-11 Unmanned Innovation, Inc. Unmanned aerial vehicle rooftop inspection system
US9734397B1 (en) * 2016-11-04 2017-08-15 Loveland Innovations, LLC Systems and methods for autonomous imaging and structural analysis

Cited By (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10825346B2 (en) 2016-11-04 2020-11-03 Loveland Innovations, LLC Systems and methods for adaptive property analysis via autonomous vehicles
US10089529B2 (en) 2016-11-04 2018-10-02 Loveland Innovations, LLC Systems and methods for adaptive scanning based on calculated shadows
US10521664B2 (en) 2016-11-04 2019-12-31 Loveland Innovations, LLC Systems and methods for autonomous perpendicular imaging of test squares
US11720104B2 (en) 2016-11-04 2023-08-08 Loveland Innovations, Inc. Systems and methods for adaptive property analysis via autonomous vehicles
US10089530B2 (en) 2016-11-04 2018-10-02 Loveland Innovations, LLC Systems and methods for autonomous perpendicular imaging of test squares
US10810426B2 (en) 2016-11-04 2020-10-20 Loveland Innovations, LLC Systems and methods for autonomous perpendicular imaging of test squares
US10102428B2 (en) 2017-02-27 2018-10-16 Loveland Innovations, LLC Systems and methods for surface and subsurface damage assessments, patch scans, and visualization
US10984182B2 (en) 2017-05-12 2021-04-20 Loveland Innovations, LLC Systems and methods for context-rich annotation and report generation for UAV microscan data
US11097841B2 (en) 2017-10-24 2021-08-24 Loveland Innovations, LLC Crisscross boustrophedonic flight patterns for UAV scanning and imaging
US11731762B2 (en) 2017-10-24 2023-08-22 Loveland Innovations, Inc. Crisscross boustrophedonic flight patterns for UAV scanning and imaging
US11783544B2 (en) 2018-08-24 2023-10-10 Loveland Innovations, Inc. Solar ray mapping via divergent beam modeling
US10733443B2 (en) 2018-08-24 2020-08-04 Loveland Innovations, LLC Image analysis and estimation of rooftop solar exposure
US11188751B2 (en) 2018-08-24 2021-11-30 Loveland Innovations, LLC Image analysis and estimation of rooftop solar exposure
US11205072B2 (en) 2018-08-24 2021-12-21 Loveland Innovations, LLC Solar ray mapping via divergent beam modeling
US11210514B2 (en) 2018-08-24 2021-12-28 Loveland Innovations, LLC Image analysis and estimation of rooftop solar exposure via solar ray mapping
US11878797B2 (en) 2018-08-24 2024-01-23 Loveland Innovations, Inc. Image analysis and estimation of rooftop solar exposure
US20200327696A1 (en) * 2019-02-17 2020-10-15 Purdue Research Foundation Calibration of cameras and scanners on uav and mobile platforms
US11610337B2 (en) * 2019-02-17 2023-03-21 Purdue Research Foundation Calibration of cameras and scanners on UAV and mobile platforms
US11275376B2 (en) 2019-06-20 2022-03-15 Florida Power & Light Company Large scale unmanned monitoring device assessment of utility system components
CN113287077A (zh) * 2019-07-08 2021-08-20 Panasonic Intellectual Property Management Co., Ltd. Information processing device, information processing method, and unmanned aerial vehicle
CN110750106A (zh) * 2019-10-16 2020-02-04 Shenzhen Autel Intelligent Aviation Technology Co., Ltd. UAV safe route generation method and device, control terminal, and UAV
US20230042965A1 (en) * 2020-08-05 2023-02-09 Lineage Logistics, LLC Point cloud annotation for a warehouse environment
US11508078B2 (en) * 2020-08-05 2022-11-22 Lineage Logistics, LLC Point cloud annotation for a warehouse environment
US11790546B2 (en) 2020-08-05 2023-10-17 Lineage Logistics, LLC Point cloud annotation for a warehouse environment
US11995852B2 (en) 2020-08-05 2024-05-28 Lineage Logistics, LLC Point cloud annotation for a warehouse environment
US20220044430A1 (en) * 2020-08-05 2022-02-10 Lineage Logistics, LLC Point cloud annotation for a warehouse environment
US11532116B2 (en) 2020-10-30 2022-12-20 Loveland Innovations, Inc. Graphical user interface for controlling a solar ray mapping
US11699261B2 (en) 2020-10-30 2023-07-11 Loveland Innovations, Inc. Graphical user interface for controlling a solar ray mapping
CN112835380A (zh) * 2020-12-30 2021-05-25 DeepBlue Technology (Shanghai) Co., Ltd. Aircraft return method and device, aircraft, and computer-readable storage medium
CN112835380B (zh) * 2020-12-30 2024-06-07 DeepBlue Technology (Shanghai) Co., Ltd. Aircraft return method and device, aircraft, and computer-readable storage medium
WO2023021043A1 (fr) * 2021-08-19 2023-02-23 Bozkurt Hueseyin Method for creating at least one image of at least one interior space
CN114115339B (zh) * 2021-11-11 2023-05-05 Xiamen Jingtu Information Technology Co., Ltd. GIS-platform-based multi-plan integration business collaboration system, method, and device
CN114115339A (zh) * 2021-11-11 2022-03-01 Xiamen Jingtu Information Technology Co., Ltd. KingMap GIS-platform-based multi-plan integration business collaboration system, method, and device
CN115761533A (zh) * 2022-11-03 2023-03-07 Sichuan Earthquake Administration Seismic fault detection method based on unmanned aerial vehicle technology
CN115761533B (zh) * 2022-11-03 2023-11-21 Sichuan Earthquake Administration Seismic fault detection method based on unmanned aerial vehicle technology
CN116203554B (zh) * 2023-05-06 2023-07-07 Wuhan Yuwei Optical Technology Co., Ltd. Environmental point cloud data scanning method and system
CN116203554A (zh) * 2023-05-06 2023-06-02 Wuhan Yuwei Optical Technology Co., Ltd. Environmental point cloud data scanning method and system

Similar Documents

Publication Publication Date Title
US10055831B2 (en) Systems and methods for adaptive property analysis via autonomous vehicles
WO2018089268A1 (fr) Systems and methods for autonomous imaging and structural analysis
US11720104B2 (en) Systems and methods for adaptive property analysis via autonomous vehicles
US10012735B1 (en) GPS offset calibrations for UAVs
US9886632B1 (en) Systems and methods for autonomous perpendicular imaging of test squares
US11731762B2 (en) Crisscross boustrophedonic flight patterns for UAV scanning and imaging
US10102428B2 (en) Systems and methods for surface and subsurface damage assessments, patch scans, and visualization
US10382975B2 (en) Subterranean 3D modeling at cell sites
EP3028464B1 (fr) Système et procédé pour détecter des éléments dans des images aériennes à l'aide de techniques de cartographie d'écart et de segmentation
US10810426B2 (en) Systems and methods for autonomous perpendicular imaging of test squares
US10984182B2 (en) Systems and methods for context-rich annotation and report generation for UAV microscan data
US20190394448A1 (en) Automated feature analysis of a structure
US10397802B2 (en) Detecting changes at cell sites and surrounding areas using unmanned aerial vehicles
US10368249B2 (en) Modeling fiber cabling associated with cell sites
Becker et al. Reality Capture Methods for Remote Inspection of Building Work
KR20230138105A (ko) Method for unit conversion of drone photographic images using LiDAR data

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17870419

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 14/08/2019)

122 Ep: pct application non-entry in european phase

Ref document number: 17870419

Country of ref document: EP

Kind code of ref document: A1