US20170201738A1 - Sensing on UAVs for mapping and obstacle avoidance - Google Patents


Info

Publication number
US20170201738A1
US20170201738A1 US15/176,229 US201615176229A US2017201738A1 US 20170201738 A1 US20170201738 A1 US 20170201738A1 US 201615176229 A US201615176229 A US 201615176229A US 2017201738 A1 US2017201738 A1 US 2017201738A1
Authority
US
United States
Prior art keywords
uavs
laser
sensing device
line
cameras
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US15/176,229
Inventor
Alberto Daniel Lacaze
Karl Nicholas Murphy
Raymond Paul Wilhelm, III
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Robotic Research Opco LLC
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to US15/176,229 priority Critical patent/US20170201738A1/en
Publication of US20170201738A1 publication Critical patent/US20170201738A1/en
Priority to US16/045,795 priority patent/US11323687B2/en
Assigned to ROBOTIC RESEARCH, LLC reassignment ROBOTIC RESEARCH, LLC ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: LACAZE, ALBERTO DANIEL, MURPHY, KARL NICHOLAS, Wilhelm, III, Raymond Paul
Assigned to ROBOTIC RESEARCH OPCO, LLC reassignment ROBOTIC RESEARCH OPCO, LLC MERGER (SEE DOCUMENT FOR DETAILS). Assignors: ROBOTIC RESEARCH, LLC
Abandoned legal-status Critical Current

Classifications

    • H04N13/0242
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/243Image signal generators using stereoscopic image cameras using three or more 2D image sensors
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64CAEROPLANES; HELICOPTERS
    • B64C27/00Rotorcraft; Rotors peculiar thereto
    • B64C27/04Helicopters
    • B64C27/08Helicopters with two or more rotors
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64CAEROPLANES; HELICOPTERS
    • B64C39/00Aircraft not otherwise provided for
    • B64C39/02Aircraft not otherwise provided for characterised by special use
    • B64C39/024Aircraft not otherwise provided for characterised by special use of the remote controlled vehicle type, i.e. RPV
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64DEQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D47/00Equipment not otherwise provided for
    • B64D47/08Arrangements of cameras
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U10/00Type of UAV
    • B64U10/10Rotorcrafts
    • B64U10/13Flying platforms
    • B64U10/14Flying platforms with four distinct rotor axes, e.g. quadcopters
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/245Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures using a plurality of fixed, simultaneously operating transducers
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/2518Projection by scanning of the object
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/2545Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object with one projection direction and several detection directions, e.g. stereo
    • G01S17/023
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • G01S17/08Systems determining position data of a target for measuring distance only
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • G01S17/42Simultaneous measurement of distance and other co-ordinates
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • G01S17/46Indirect determination of position data
    • G01S17/48Active triangulation systems, i.e. using the transmission and reflection of electromagnetic waves other than radio waves
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/89Lidar systems specially adapted for specific applications for mapping or imaging
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/93Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/933Lidar systems specially adapted for specific applications for anti-collision purposes of aircraft or spacecraft
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481Constructional features, e.g. arrangements of optical elements
    • G01S7/4811Constructional features, e.g. arrangements of optical elements common to transmitter and receiver
    • G01S7/4813Housing arrangements
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481Constructional features, e.g. arrangements of optical elements
    • G01S7/4814Constructional features, e.g. arrangements of optical elements of transmitters alone
    • G01S7/4815Constructional features, e.g. arrangements of optical elements of transmitters alone using multiple transmitters
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481Constructional features, e.g. arrangements of optical elements
    • G01S7/4816Constructional features, e.g. arrangements of optical elements of receivers alone
    • H04N13/0271
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/271Image signal generators wherein the generated image signals comprise depth maps or disparity maps
    • H04N5/23238
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00UAVs specially adapted for particular uses or applications
    • B64U2101/30UAVs specially adapted for particular uses or applications for imaging, photography or videography
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2201/00UAVs characterised by their flight controls
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U50/00Propulsion; Power supply
    • B64U50/10Propulsion
    • B64U50/13Propulsion using external fans or propellers
    • B64U50/14Propulsion using external fans or propellers ducted or shrouded
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/128Adjusting depth or disparity
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/254Image signal generators using stereoscopic image cameras in combination with electromagnetic radiation sources for illuminating objects
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N2013/0074Stereoscopic image analysis
    • H04N2013/0081Depth or disparity estimation from stereoscopic image signals

Definitions

  • FIG. 1: MAPHAC, a structured light sensor designed for SUGVs.
  • FIG. 2: Point cloud generated by MAPHAC, color-coded for range. The point cloud shows a leaning ladder and a variety of office clutter.
  • FIG. 3a: Quadcopter with four imagers and laser projection system.
  • FIG. 3b: Approximate field-of-view of a single imager.
  • FIG. 3c: Overhead view of the combined field-of-view of all imagers.
  • FIG. 3d: Side view of the combined field-of-view of all imagers.
  • FIG. 4: Two laser line projectors are used to create a line that can then be sensed with the omnidirectional cameras.
  • FIG. 5: Complete field of view showing laser and cameras.
  • FIG. 6: Expected range error of the structured light sensor.
  • FIGS. 7a and 7b: Prototype sensing plane configuration.
  • Structured light approaches utilize a laser to project features, which are then captured with a camera. By knowing the disparity between the laser emitter and the camera, the system can triangulate to find the range. In sharp contrast to conventional stereo and structure from motion, poor lighting actually improves the range and accuracy of this sensor. There is also no need for rich features in the environment, since the laser “projects its own features.” Therefore, it will work even on featureless walls and floors.
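The triangulation step described above can be sketched in a few lines. The relation is the standard Z = f·B/d for a laser/camera pair; the focal length, baseline, and disparity values below are illustrative assumptions, not figures from the patent.

```python
def range_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Triangulated range Z = f * B / d for a laser emitter and camera
    separated by a known baseline (f in pixels, B in metres, d in pixels)."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive (target at finite range)")
    return focal_px * baseline_m / disparity_px

# Assumed example values: 600 px focal length, 2 cm baseline, 12 px disparity.
z = range_from_disparity(focal_px=600.0, baseline_m=0.02, disparity_px=12.0)
# about 1 metre
```

Because the laser provides its own features, this computation needs only the pixel position of the projected line in each image, not natural texture in the scene.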
  • FIG. 1 illustrates the MAPHAC 100, a structured light sensor designed for SUGVs.
  • FIG. 2 shows the scan of a typical cluttered room as a point cloud 200, including a ladder 201, a camera with a tripod 202, chairs 203, lamps 204, etc.
  • the point cloud 200 generated by the MAPHAC is color-coded for range.
  • the point cloud 200 shows a leaning ladder 201 and a variety of office clutter.
  • the current incarnation of MAPHAC 100 is designed to become a substitute for a SUGV antenna, where it can serve as both an autonomous mobility sensor and radio antenna.
  • the sensor is designed to meet the unique needs of an autonomous multicopter for indoor and outdoor environments, including: Large-field of view for obstacle avoidance and mapping; Light-weight system with minimal moving parts; Accurate ranges at short distances, with decreasing accuracy at longer ranges; Use of eye-safe lasers, while providing resilience to ambient light; and a Predicted weight under 150 grams.
  • the proposed configuration makes use of multiple fisheye cameras and laser line scanners.
  • Four 185-degree field-of-view cameras provide overlapping views over nearly the whole unit sphere.
  • the cameras are separated from each other to provide parallax.
  • a near-infrared laser projection unit sends light out into the environment, which is reflected and viewed by the cameras.
  • the laser projection system will create vertical lines, while the cameras will be displaced from each other horizontally. This relative shift (stereo disparity) of the lines, as viewed by different cameras, enables the lines to be triangulated in 3D space.
  • At each point in time, a vertical stripe of the world will be triangulated. Over time, the laser line will be rotated over all yaw angles to provide full 360-degree range sensing capabilities, as illustrated by FIGS. 3a, 3b, 3c, and 3d.
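Accumulating the yaw-rotated vertical stripes into a full 360-degree point cloud can be sketched as follows. The 2-degree yaw step, pitch samples, and ranges are illustrative assumptions; the patent does not prescribe them.

```python
import math

def stripe_to_points(yaw_deg, pitch_degs, ranges_m):
    """Convert one triangulated vertical stripe (fixed yaw, many pitch angles)
    into 3D points in the sensor frame (x forward, y left, z up)."""
    yaw = math.radians(yaw_deg)
    pts = []
    for pitch_deg, r in zip(pitch_degs, ranges_m):
        p = math.radians(pitch_deg)
        pts.append((r * math.cos(p) * math.cos(yaw),
                    r * math.cos(p) * math.sin(yaw),
                    r * math.sin(p)))
    return pts

# Sweep the laser line over all yaw angles to build a 360-degree cloud
# (toy scene: three pitch samples per stripe, all at 2 m range).
cloud = []
for yaw in range(0, 360, 2):  # assumed 2-degree yaw steps
    cloud.extend(stripe_to_points(yaw, [-80.0, 0.0, 80.0], [2.0, 2.0, 2.0]))
```

One stripe per camera frame, rotated in yaw over time, is what turns a single vertical line of measurements into full spherical coverage.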
  • FIG. 3a illustrates a quadcopter 300 with four imagers 301, 302, 303, and 304 and a laser projection system 305.
  • FIG. 3b illustrates the approximate field-of-view 306 of a single imager 301.
  • FIG. 3c illustrates an overhead view of the combined field-of-view 307 of all imagers 301, 302, 303, and 304.
  • FIG. 3d illustrates a side view of the combined field-of-view 308 of all imagers.
  • FIG. 4 illustrates two laser line projectors 401 and 402 used to create a line 403 that can then be sensed with the omnidirectional cameras.
  • Each imager is composed of a camera module, a spectral filter, and a wide-angle compound lens.
  • the camera must be small in size and weight, while providing high sensitivity and a wide dynamic range.
  • an optical bandpass filter can be installed to attenuate incoming ambient light. If no filter is installed, the imaging system can be used as a visible light imager to provide full 360 degree RGB imagery in addition to point clouds.
  • a laser projection unit consists of a solid-state laser diode, laser pulsing circuitry, aspheric collimation lens, beam splitter, small rotating mirror, and laser line lens.
  • the laser circuitry pulses the laser while also providing a frame trigger to each imager.
  • the laser light is collimated into a beam using a small aspheric lens directly in front of the laser.
  • the laser beam is then split into an upward beam 403 and a downward beam 404 .
  • Each beam 403 and 404 is reflected off a small rotating mirror coupled to a laser line lens.
  • the upward beam 403 creates a laser line that extends from horizontal to positive 80 degrees pitch.
  • the downward beam 404 creates a laser line that extends from horizontal to negative 80 degrees pitch.
  • together, the projected lines 403 and 404 span the sensor's proposed vertical field-of-view.
  • FIG. 5 shows the combined field-of-view of the cameras 405 and 406 and laser projectors 308 .
  • the structured light sensor will be able to measure 360 degrees horizontally and 160 degrees vertically. At each point in time, the sensor will generate approximately 2080 vertical range measurements. With each imager capturing approximately 180 images/second, the sensor will be able to generate over 370 k points per second.
  • the yaw scan rate can be varied, depending upon the current mission needs.
  • the sensor can be operated with a fine yaw resolution and slow scan rate, providing detailed scans of the environment; or, the sensor can be operated with a faster yaw rate, providing faster updates at a coarser rate.
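The quoted throughput and the yaw-rate trade-off above follow from simple arithmetic, which can be checked directly. The 90 and 360 deg/s scan rates below are illustrative assumptions; the stripe size and frame rate are the patent's own figures.

```python
points_per_stripe = 2080  # vertical range measurements per laser line (from the text)
frame_rate_hz = 180       # images per second per imager (from the text)

# 2080 points/stripe * 180 stripes/s = 374,400 points/s, i.e. "over 370k".
points_per_second = points_per_stripe * frame_rate_hz

def yaw_resolution_deg(yaw_rate_dps: float, frame_rate_hz: float = 180.0) -> float:
    """Angular spacing between consecutive stripes for a given yaw scan rate."""
    return yaw_rate_dps / frame_rate_hz

# Slow scan: finer stripe spacing, slower full revolution (360/90 = 4 s).
fine = yaw_resolution_deg(90.0)    # 0.5 degrees between stripes
# Fast scan: coarser spacing, one full revolution per second.
coarse = yaw_resolution_deg(360.0) # 2.0 degrees between stripes
```

The mission-dependent choice is simply which end of this spacing-versus-update-rate trade-off to operate at.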
  • the expected range error 600 is shown in FIG. 6 in graph format.
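The shape of the expected range error curve in FIG. 6 follows from the first-order error model of any triangulation sensor: a fixed disparity error maps to a range error that grows with the square of range. A sketch, with illustrative focal length, baseline, and sub-pixel disparity error (none of these values come from the patent):

```python
def expected_range_error(z_m: float,
                         focal_px: float = 600.0,
                         baseline_m: float = 0.02,
                         disparity_err_px: float = 0.25) -> float:
    """First-order range error of a triangulation sensor: dZ = Z^2 * ed / (f * B).
    Error grows quadratically with range, so the sensor is accurate at short
    distances with decreasing accuracy at longer ranges."""
    return (z_m ** 2) * disparity_err_px / (focal_px * baseline_m)

# Doubling the range quadruples the expected error.
errs = [expected_range_error(z) for z in (0.5, 1.0, 2.0, 4.0)]
```

This quadratic growth is exactly the behavior called for in the design goals: accurate ranges at short distances, with decreasing accuracy at longer ranges.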
  • a second approach is to use a time-of-flight line sensor to perform the same task as shown with the structured light sensor.
  • the line sensors can be organized as seen in FIGS. 7 a and 7 b.
  • One more possible configuration is the same as shown in FIGS. 7a and 7b, but with the vertical sensing plane 700 aligned with the direction of travel 701 .
  • the system is composed of a quadrotor, or other UAV, and one or more range sensors that are used to sense the surrounding environment.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Mechanical Engineering (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

Structured light approaches utilize a laser to project features, which are then captured with a camera. By knowing the disparity between the laser emitter and the camera, the system can triangulate to find the range. Four 185-degree field-of-view cameras provide overlapping views over nearly the whole unit sphere. The cameras are separated from each other to provide parallax. A near-infrared laser projection unit sends light out into the environment, which is reflected and viewed by the cameras. The laser projection system will create vertical lines, while the cameras will be displaced from each other horizontally. This relative shift of the lines, as viewed by different cameras, enables the lines to be triangulated in 3D space. At each point in time, a vertical stripe of the world will be triangulated. Over time, the laser line will be rotated over all yaw angles to provide a full 360-degree range.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • This application claims priority from U.S. Patent Application Ser. No. 62/175,231, entitled “SENSING ON UAVS FOR MAPPING AND OBSTACLE AVOIDANCE”, filed on 13 Jun. 2015. The benefit under 35 USC §119(e) of the United States provisional application is hereby claimed, and the aforementioned application is hereby incorporated herein by reference.
  • FEDERALLY SPONSORED RESEARCH
  • Not Applicable
  • SEQUENCE LISTING OR PROGRAM
  • Not Applicable
  • TECHNICAL FIELD OF THE INVENTION
  • The present invention relates to UAVs. More specifically, the present invention is related to providing structured light and time of flight sensors on UAVs for obstacle avoidance and creating mapping capabilities.
  • BACKGROUND OF THE INVENTION
  • There are few sensors well suited for autonomous mobility and mapping functions on small aerial platforms. LADAR choices that fit the SWAP requirements are severely limited. One option, the single line sensor, needs to be configured into an up-down tilt configuration (the so-called “yes-yes” ladar) or into a side-to-side pan configuration (the so-called “no-no” ladar) in order to get the coverage needed to traverse a complex environment.
  • Some other sensors provide a relatively small vertical field-of-view. Quadrotors of small size and weight pitch significantly when traveling at high speeds; this pitch can be as high as 45 degrees at high speed, or when quadrotors are used in windy areas.
  • Therefore, if a sensor with relatively small vertical field of view is installed horizontally, the vehicle will be blind in the direction of travel at high speeds. Once again, there is a need of a tilt mechanism.
  • The other approach, which better fits the SWAP constraints of a quadrotor, is stereo vision, or structure from motion. However, in both cases the poor lighting of an indoor environment, together with the lower-quality optics and camera combinations that can be carried by the quads, makes it a poor choice. Many such attempts have been made in the past few years, with very poor results.
  • Definitions
  • LADAR (also known as LIDAR) is an optical remote sensing technology that can measure the distance to, or other properties of a target by illuminating the target with light, often using pulses from a laser. LIDAR technology has application in geomatics, archaeology, geography, geology, geomorphology, seismology, forestry, remote sensing and atmospheric physics, as well as in airborne laser swath mapping (ALSM), laser altimetry and LIDAR contour mapping. The acronym LADAR (Laser Detection and Ranging) is often used in military contexts. The term “laser radar” is sometimes used, even though LIDAR does not employ microwaves or radio waves and therefore is not radar in the strict sense of the word.
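A pulsed LIDAR as defined above derives range from the round-trip time of a laser pulse, d = c·t/2. A minimal sketch (the 20 ns example pulse timing is an illustrative assumption):

```python
C_M_PER_S = 299_792_458.0  # speed of light in vacuum

def tof_range_m(round_trip_s: float) -> float:
    """LIDAR range from the measured pulse round-trip time: d = c * t / 2.
    The factor of 2 accounts for the pulse traveling out and back."""
    return C_M_PER_S * round_trip_s / 2.0

# A 20 ns round trip corresponds to roughly 3 metres.
d = tof_range_m(20e-9)
```

This is the measurement principle behind the time-of-flight line sensor mentioned later as an alternative to the structured light approach.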
  • In computing, a graphical user interface (GUI, commonly pronounced gooey) is a type of user interface that allows users to interact with electronic devices using images rather than text commands. GUIs can be used in computers, hand-held devices such as MP3 players, portable media players or gaming devices, household appliances and office equipment. A GUI represents the information and actions available to a user through graphical icons and visual indicators such as secondary notation, as opposed to text-based interfaces, typed command labels or text navigation. The actions are usually performed through direct manipulation of the graphical elements.
  • MAPHAC is a 3D scanning device for measuring the three-dimensional shape of an object using projected light patterns and a camera system.
  • A quadcopter, also called a quadrotor helicopter or quadrotor, is a multirotor helicopter that is lifted and propelled by four rotors. Quadcopters are classified as rotorcraft, as opposed to fixed-wing aircraft, because their lift is generated by a set of rotors (vertically oriented propellers). Unlike most helicopters, quadcopters use two sets of identical fixed pitched propellers; two clockwise (CW) and two counter-clockwise (CCW). These use variation of RPM to control lift and torque. Control of vehicle motion is achieved by altering the rotation rate of one or more rotor discs, thereby changing its torque load and thrust/lift characteristics.
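The RPM-based control described above can be illustrated with a toy mixer for a “+”-configuration quadcopter. The axis assignments and sign conventions below are assumptions for illustration only, not taken from the patent or any particular flight controller.

```python
def quad_mix(thrust: float, roll: float, pitch: float, yaw: float):
    """Toy '+'-configuration mixer: map collective thrust and roll/pitch/yaw
    commands to four rotor commands. Front/back rotors spin one direction,
    left/right the other, so a yaw command trades torque between the pairs
    while roll and pitch come from differential thrust across an axis."""
    front = thrust + pitch - yaw
    back  = thrust - pitch - yaw
    left  = thrust + roll + yaw
    right = thrust - roll + yaw
    return front, back, left, right

# Pure collective thrust drives all four rotors equally.
hover = quad_mix(1.0, 0.0, 0.0, 0.0)
```

Note that the four commands always sum to four times the collective thrust, reflecting that attitude commands redistribute lift rather than add to it.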
  • A Small Unmanned Ground Vehicle (SUGV) is a lightweight, man portable Unmanned Ground Vehicle (UGV) capable of conducting military operations in urban terrain, tunnels, sewers, and caves. The SUGV aids in the performance of manpower-intensive or high-risk functions (i.e. urban Intelligence, Surveillance, and Reconnaissance (ISR) missions, chemical/Toxic Industrial Chemicals (TIC), Toxic Industrial Materials (TIM), reconnaissance, etc.). Working to minimize Soldiers' exposure directly to hazards, the SUGV's modular design allows multiple payloads to be integrated in a plug and play fashion.
  • An Unmanned Ground Vehicle (UGV) is a vehicle that operates while in contact with the ground and without an onboard human presence. UGVs can be used for many applications where it may be inconvenient, dangerous, or impossible to have a human operator present. Generally, the vehicle will have a set of sensors to observe the environment, and will either autonomously make decisions about its behavior or pass the information to a human operator at a different location who will control the vehicle through teleoperation. The UGV is the land-based counterpart to unmanned aerial vehicles and remotely operated underwater vehicles. Unmanned robotics are being actively developed for both civilian and military use to perform a variety of dull, dirty, and dangerous activities.
  • SWAP constraints refer to the size, weight, and power of a military platform, as defined by the military for a given platform, and provide a basis on which that platform can utilize components from various manufacturers.
  • SUMMARY OF THE INVENTION
  • Structured light approaches utilize a laser to project features, which are then captured with a camera. By knowing the disparity between the laser emitter and the camera, the system can triangulate to find the range. In order to accommodate these sensors on a quadrotor, modifications will be done to the location of the camera and the laser emitters as taught by the present invention.
  • The proposed configuration makes use of multiple fisheye cameras and laser line scanners. Four wide field-of-view cameras provide overlapping views over nearly the whole unit sphere. The cameras are separated from each other to provide parallax. A near-infrared laser projection unit sends light out into the environment. When the light hits objects in the environment, it is reflected back and viewed by the cameras.
  • The laser projection system will create vertical lines, while the cameras will be displaced from each other horizontally. This relative shift (stereo disparity) of the lines, as viewed by different cameras, enables the lines to be triangulated in 3D space. At each point in time, a vertical stripe of the world will be triangulated. Over time, the laser line will be rotated over all yaw angles to provide full 360 degree range sensing capabilities.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The accompanying drawings, which are incorporated herein and form a part of the specification, illustrate the present invention and, together with the description, further serve to explain the principles of the invention and to enable a person skilled in the pertinent art to make and use the invention.
  • FIG. 1. MAPHAC is a structured light sensor that is designed for SUGVs.
  • FIG. 2 Point cloud generated by MAPHAC, color-coded for range. The point cloud shows a leaning ladder, and a variety of office clutter.
  • FIG. 3a. Quadcopter with four imagers and laser projection system.
  • FIG. 3b. Approximate field-of-view of a single imager.
  • FIG. 3c. Overhead view of combined field-of-view of all imagers.
  • FIG. 3d. Side-view of combined field-of-view of all imagers.
  • FIG. 4. Two laser line projectors are used to create a line that can then be sensed with the omnidirectional cameras.
  • FIG. 5. Complete field of view showing laser and cameras.
  • FIG. 6. Expected range error of structured light sensor.
  • FIGS. 7a and 7b. Prototype sensing plane configuration.
  • DETAILED DESCRIPTION OF THE INVENTION
  • In the following detailed description of exemplary embodiments of the invention, reference is made to the accompanying drawings (where like numbers represent like elements), which form a part hereof, and in which is shown by way of illustration specific exemplary embodiments in which the invention may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the invention, but other embodiments may be utilized and logical, mechanical, electrical, and other changes may be made without departing from the scope of the present invention. The following detailed description is, therefore, not to be taken in a limiting sense, and the scope of the present invention is defined only by the appended claims.
  • In the following description, numerous specific details are set forth to provide a thorough understanding of the invention. However, it is understood that the invention may be practiced without these specific details. In other instances, well-known structures and techniques known to one of ordinary skill in the art have not been shown in detail in order not to obscure the invention. Referring to the figures, it is possible to see the various major elements constituting the apparatus of the present invention.
  • Structured light approaches utilize a laser to project features, which are then captured with a camera. By knowing the disparity between the laser emitter and the camera, the system can triangulate to find the range. In sharp contrast, with conventional stereo and structure from motion, poor lighting actually improves the range and accuracy of this sensor. There is also no need to have rich features in the environment, since the laser “projects its own features.” Therefore, it will even work on featureless walls and floors.
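The triangulation principle described above can be sketched numerically. This is an illustrative sketch of the standard pinhole disparity-to-range relation, not the patented implementation; the function name and the baseline, focal-length, and disparity values are hypothetical.

```python
# Minimal sketch of structured-light triangulation: with a known baseline
# between the laser emitter and the camera, the range follows from the
# pixel disparity of the projected feature (pinhole model: Z = f * B / d).

def triangulate_range(baseline_m, focal_px, disparity_px):
    """Range in meters from an observed disparity in pixels."""
    if disparity_px <= 0:
        raise ValueError("projected feature not observed")
    return focal_px * baseline_m / disparity_px

# Illustrative numbers: 10 cm baseline, 600 px focal length, 20 px disparity.
print(triangulate_range(0.10, 600.0, 20.0))  # → 3.0 (meters)
```

Because the laser "projects its own features," this relation holds even on featureless walls, where conventional stereo matching would fail.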
  • One such approach is presented in FIG. 1; the sensor is currently installed on a SUGV (small unmanned ground vehicle) and is designed to create very high density point clouds for mapping applications at two megapixels per second. FIG. 1 illustrates the MAPHAC 100, a structured light sensor designed for SUGVs.
  • FIG. 2 shows the scan of a typical cluttered room as a point cloud 200, including a ladder 201, a camera with a tripod 202, chairs 203, lamps 204, etc. In FIG. 2 the point cloud 200 generated by the MAPHAC is color-coded for range. The point cloud 200 shows a leaning ladder 201 and a variety of office clutter. The current incarnation of the MAPHAC 100 is designed to become a substitute for a SUGV antenna, where it can serve as both an autonomous mobility sensor and a radio antenna.
  • In order to accommodate these sensors on a quadrotor, modifications are made to the location of the camera and the laser emitters. The core electronics and software have already been designed, but have never been used in this combination. The sensor is designed to meet the unique needs of an autonomous multicopter in indoor and outdoor environments, including: a large field of view for obstacle avoidance and mapping; a light-weight system with minimal moving parts; accurate ranges at short distances, with decreasing accuracy at longer ranges; use of eye-safe lasers, while providing resilience to ambient light; and a predicted weight under 150 grams.
  • The proposed configuration makes use of multiple fisheye cameras and laser line scanners. Four 185-degree field-of-view cameras provide overlapping views over nearly the whole unit sphere. The cameras are separated from each other to provide parallax. A near-infrared laser projection unit sends light out into the environment, which is reflected and viewed by the cameras. The laser projection system will create vertical lines, while the cameras will be displaced from each other horizontally. This relative shift (stereo disparity) of the lines, as viewed by different cameras, enables the lines to be triangulated in 3D space.
  • At each point in time, a vertical stripe of the world will be triangulated. Over time, the laser line will be rotated over all yaw angles to provide full 360 degree range sensing capabilities, as illustrated by FIGS. 3a, 3b, 3c, and 3d.
  • FIG. 3a illustrates a quadcopter 300 with four imagers 301, 302, 303, and 304 and a laser projection system 305. FIG. 3b illustrates the approximate field-of-view 306 of a single imager 301. FIG. 3c illustrates an overhead view of the combined field-of-view 307 of all imagers 301, 302, 303, and 304. FIG. 3d illustrates a side view of the combined field-of-view 308 of all imagers.
  • FIG. 4 illustrates how two laser line projectors 401 and 402 are used to create a line 403 that can then be sensed with the omnidirectional cameras.
  • Each imager is composed of a camera module, a spectral filter, and a wide-angle compound lens. The camera must be small in size and weight, while providing high sensitivity and a wide dynamic range. Depending on mission requirements, an optical bandpass filter can be installed to attenuate incoming ambient light. If no filter is installed, the imaging system can be used as a visible light imager to provide full 360 degree RGB imagery in addition to point clouds.
  • A laser projection unit consists of a solid-state laser diode, laser pulsing circuitry, an aspheric collimation lens, a beam splitter, a small rotating mirror, and a laser line lens. The laser circuitry pulses the laser while also providing a frame trigger to each imager. The laser light is collimated using a small aspheric lens directly in front of the laser. The beam is then split into an upward beam 403 and a downward beam 404. Each beam 403 and 404 is reflected off a small rotating mirror coupled to a laser line lens. The upward beam 403 creates a laser line that extends from horizontal to positive 80 degrees pitch, while the downward beam 404 creates a laser line that extends from horizontal to negative 80 degrees pitch.
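The combined pitch coverage of the upward and downward beams can be sketched as a set of sample rays along one projected vertical stripe. This is an illustrative geometry sketch only; the function name, sample count, and axis convention are assumptions, not details from the patent.

```python
import math

def stripe_directions(yaw_deg, n=9, pitch_min=-80.0, pitch_max=80.0):
    """Unit rays sampled along one vertical laser stripe at a fixed yaw,
    spanning the -80 to +80 degree pitch coverage of the two beams."""
    yaw = math.radians(yaw_deg)
    rays = []
    for i in range(n):
        pitch = math.radians(pitch_min + i * (pitch_max - pitch_min) / (n - 1))
        # Axis convention (assumed): x forward, y left, z up.
        rays.append((math.cos(pitch) * math.cos(yaw),
                     math.cos(pitch) * math.sin(yaw),
                     math.sin(pitch)))
    return rays

rays = stripe_directions(0.0)
print(round(rays[0][2], 3), round(rays[-1][2], 3))  # → -0.985 0.985
```

Sweeping `yaw_deg` over all angles reproduces the full 360-degree horizontal by 160-degree vertical coverage described below.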
  • FIG. 4 shows the field-of-view of the projected lines 403 and 404. FIG. 5 shows the combined field-of-view of the cameras 405 and 406 and the laser projectors 308.
  • The structured light sensor will be able to measure 360 degrees horizontally and 160 degrees vertically. At each point in time, the sensor will generate approximately 2080 vertical range measurements. With each imager capturing approximately 180 images/second, the sensor will be able to generate over 370k points per second.
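The quoted point rate follows directly from the stripe density and frame rate; a quick arithmetic check (variable names are illustrative):

```python
# Arithmetic behind the throughput figure: one vertical stripe of ~2080
# range measurements per frame, captured at ~180 frames per second.
points_per_stripe = 2080
frames_per_second = 180
points_per_second = points_per_stripe * frames_per_second
print(points_per_second)  # → 374400, i.e. over 370k points per second
```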
  • The yaw scan rate can be varied, depending upon the current mission needs. The sensor can be operated with a fine yaw resolution and slow scan rate, providing detailed scans of the environment; or, the sensor can be operated with a faster yaw rate, providing faster updates at a coarser rate.
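The resolution-versus-update-rate trade can be made concrete. This is a hedged sketch: the function name and the example yaw rates are assumptions; only the ~180 stripes-per-second figure comes from the text.

```python
frame_rate_hz = 180  # vertical stripes captured per second (from the text)

def scan_tradeoff(yaw_rate_dps):
    """Yaw resolution between stripes and time for one full 360-degree sweep."""
    resolution_deg = yaw_rate_dps / frame_rate_hz  # degrees between stripes
    revolution_s = 360.0 / yaw_rate_dps            # seconds per full sweep
    return resolution_deg, revolution_s

print(scan_tradeoff(90.0))   # → (0.5, 4.0): fine resolution, slow updates
print(scan_tradeoff(360.0))  # → (2.0, 1.0): coarser resolution, fast updates
```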
  • Since this device relies on triangulation, the range accuracy will be dependent on range. The expected range error 600 is shown in FIG. 6 in graph format.
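Because the device triangulates, the expected error grows roughly quadratically with range. A first-order sketch of this standard stereo error model follows; the function name, constants, and disparity-noise figure are illustrative assumptions, not values from FIG. 6.

```python
def range_error(range_m, baseline_m, focal_px, disparity_noise_px=0.5):
    """First-order triangulation range error: dZ ≈ Z**2 / (f * B) * dd."""
    return (range_m ** 2) / (focal_px * baseline_m) * disparity_noise_px

# Doubling the range quadruples the expected error.
near = range_error(1.0, 0.10, 600.0)
far = range_error(2.0, 0.10, 600.0)
print(far / near)  # → 4.0
```

This quadratic growth is consistent with the design goal of accurate ranges at short distances, with decreasing accuracy at longer ranges.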
  • A second approach is to use a time-of-flight line sensor to perform the same task as the structured light sensor. The line sensors can be organized as seen in FIGS. 7a and 7b.
  • One more possible configuration is the same as shown in FIGS. 7a and 7b, but with the vertical sensing plane 700 aligned with the direction of travel 701.
  • The system is composed of a quadrotor, or other UAV, and one or more range sensors that are used to sense the surrounding environment.
  • Thus, it is appreciated that the optimum dimensional relationships for the parts of the invention, to include variation in size, materials, shape, form, function, and manner of operation, assembly and use, are deemed readily apparent and obvious to one of ordinary skill in the art, and all equivalent relationships to those illustrated in the drawings and described in the above description are intended to be encompassed by the present invention.
  • Furthermore, other areas of art may benefit from this method and adjustments to the design are anticipated. Thus, the scope of the invention should be determined by the appended claims and their legal equivalents, rather than by the examples given.

Claims (20)

The embodiments of the invention in which an exclusive property or privilege is claimed are defined as follows:
1. A sensing device for UAVs, comprising:
a UAV;
a structured light sensor;
the structured light sensor configured to use the size of the quadrotor, in order to provide a disparity requirement; and
a computer or microprocessor to process the structured light sensor information; and
the computer or microprocessor sending the structured light sensor information to one or more recipients.
2. The sensing device for UAVs of claim 1, wherein the processing is used for obstacle avoidance.
3. The sensing device for UAVs of claim 1, wherein the processing is used for mapping the surroundings.
4. The sensing device for UAVs of claim 1, wherein the UAV is a quadrotor.
5. The sensing device for UAVs of claim 1, wherein
the structured light sensor is rotated; and
the rotation is accomplished by a mechanism on the vehicle.
6. The sensing device for UAVs of claim 1, wherein
the structured light sensor is rotated; and
the rotation is accomplished by moving the body of the vehicle.
7. The sensing device for UAVs of claim 1, wherein
the structured light sensor is rotated; and
the rotation is accomplished by at least one of a mechanism on the vehicle and moving the body of the vehicle, or a combination of the two.
8. The sensing device for UAVs of claim 1, wherein
multiple lines are used, one horizontal line and one vertical line, to increase the coverage.
9. The sensing device for UAVs of claim 1, further comprising
a time-of-flight sensor.
10. A sensing device for UAVs, comprising
a quadrotor;
one or more line time-of-flight sensors;
a computer or microprocessor to process range information; and
the computer or microprocessor sending the range information to one or more recipients.
11. The sensing device for UAVs of claim 10, wherein the processing is used for obstacle avoidance.
12. The sensing device for UAVs of claim 10, wherein the processing is used for mapping the surroundings.
13. The sensing device for UAVs of claim 10, wherein
the line time-of-flight sensor is rotated; and
the rotation is accomplished by a mechanism on the vehicle.
14. The sensing device for UAVs of claim 10, wherein
the line time-of-flight sensor is rotated; and
the rotation is accomplished by moving the body of the vehicle.
15. The sensing device for UAVs of claim 10, wherein
the line time-of-flight sensor is rotated; and
the rotation is accomplished by at least one of a mechanism on the vehicle and moving the body of the vehicle, or a combination of the two.
16. The sensing device for UAVs of claim 10, further comprising
a structured light sensor.
17. The sensing device for UAVs of claim 16, wherein
multiple lines are used, one horizontal line and one vertical line, to increase the coverage.
18. The sensing device for UAVs of claim 10, wherein the UAV is a quadrotor.
19. A sensing device for UAVs, comprising:
a plurality of fisheye cameras;
the cameras are separated from each other to provide parallax;
four, 185 degree field-of-view cameras provide overlapping views over nearly the whole unit sphere;
a plurality of laser line scanners;
the near-infrared laser projection unit sends light out into the environment, which is reflected and viewed by the cameras;
the laser projection system creates vertical lines, while the cameras will be displaced from each other horizontally;
this relative shift (stereo disparity) of the lines, as viewed by different cameras, enables the lines to be triangulated in 3D space;
at each point in time, a vertical stripe of the world will be triangulated;
over time, the laser line will be rotated over all yaw angles to provide full 360 degree range sensing capabilities;
the two laser line projectors are used to create a line that can then be sensed with the omnidirectional cameras;
each imager is composed of a camera module, a spectral filter, and a wide-angle compound lens;
an optical bandpass filter can be installed to attenuate incoming ambient light;
if no filter is installed, the imaging system can be used as a visible light imager to provide full 360 degree RGB imagery in addition to point clouds;
a laser projection unit consists of
a solid-state laser diode,
laser pulsing circuitry,
aspheric collimation lens,
beam splitter,
small rotating mirror, and
laser line lens;
the laser circuitry pulses the laser while also providing a frame trigger to each imager;
the laser light is collimated into a beam using a small aspheric lens directly in front of the laser;
the laser beam is then split into an upward and downward beam;
each beam is reflected off a small rotating mirror coupled to a laser line lens;
the upward beam creates a laser line that extends from horizontal to positive 80 degrees pitch;
the downward beam creates a laser line that extends from horizontal to negative 80 degrees pitch;
the structured light sensor will be able to measure 360 degrees horizontally and 160 degrees vertically;
at each point in time, the sensor will generate approximately 2080 vertical range measurements;
each imager capturing approximately 180 images/second, the sensor will be able to generate over 370 k points per second;
the yaw scan rate can be varied, depending upon the current mission needs;
the sensor can be operated with a fine yaw resolution and slow scan rate, providing detailed scans of the environment; or, the sensor can be operated with a faster yaw rate, providing faster updates at a coarser rate; and
since this device relies on triangulation, the range accuracy will be dependent on range.
20. The sensing device for UAVs of claim 19, comprising:
a UAV;
one or more range sensors that are used to sense the surrounding environment;
a time-of-flight line sensor to perform the same task as shown with the structured light sensor; and
a vertical sensing plane aligned with the direction of travel.
US15/176,229 2015-06-13 2016-06-08 Senising on uavs for mapping and obstacle avoidance Abandoned US20170201738A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US15/176,229 US20170201738A1 (en) 2015-06-13 2016-06-08 Senising on uavs for mapping and obstacle avoidance
US16/045,795 US11323687B2 (en) 2015-06-13 2018-07-26 Sensing on UAVs for mapping and obstacle avoidance

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201562175231P 2015-06-13 2015-06-13
US15/176,229 US20170201738A1 (en) 2015-06-13 2016-06-08 Senising on uavs for mapping and obstacle avoidance

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/045,795 Continuation US11323687B2 (en) 2015-06-13 2018-07-26 Sensing on UAVs for mapping and obstacle avoidance

Publications (1)

Publication Number Publication Date
US20170201738A1 true US20170201738A1 (en) 2017-07-13

Family

ID=59275062

Family Applications (2)

Application Number Title Priority Date Filing Date
US15/176,229 Abandoned US20170201738A1 (en) 2015-06-13 2016-06-08 Senising on uavs for mapping and obstacle avoidance
US16/045,795 Active 2037-11-17 US11323687B2 (en) 2015-06-13 2018-07-26 Sensing on UAVs for mapping and obstacle avoidance

Family Applications After (1)

Application Number Title Priority Date Filing Date
US16/045,795 Active 2037-11-17 US11323687B2 (en) 2015-06-13 2018-07-26 Sensing on UAVs for mapping and obstacle avoidance

Country Status (1)

Country Link
US (2) US20170201738A1 (en)

Cited By (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107806857A (en) * 2017-11-08 2018-03-16 沈阳上博智像科技有限公司 Unpiloted movable equipment
CN107817811A (en) * 2017-10-26 2018-03-20 哈尔滨市舍科技有限公司 The unmanned plane collision prevention device and method of view-based access control model
DE102018205134A1 (en) 2018-04-05 2018-06-21 Emqopter GmbH Distance sensor system for the efficient and automatic detection of landing sites for autonomous hoverable aircraft
CN108639343A (en) * 2018-05-03 2018-10-12 日照职业技术学院 A kind of mapping unmanned plane
CN108735001A (en) * 2018-05-31 2018-11-02 智飞智能装备科技东台有限公司 A kind of unmanned plane based on Beidou navigation is anti-to hit prior-warning device
CN109050889A (en) * 2018-08-17 2018-12-21 陈霞 A kind of unmanned plane of new-type freely adjustable protective device
CN109240326A (en) * 2018-08-27 2019-01-18 广东容祺智能科技有限公司 A kind of barrier-avoiding method of the mixing obstacle avoidance apparatus of unmanned plane
DE102017121119A1 (en) * 2017-09-12 2019-03-14 Elmos Semiconductor Aktiengesellschaft Automotive TOF camera with a LED headlight as the light source
US20190132573A1 (en) * 2017-10-31 2019-05-02 Sony Corporation Generating 3d depth map using parallax
EP3546884A1 (en) * 2018-03-26 2019-10-02 Simmonds Precision Products, Inc. Ranging objects external to an aircraft using multi-camera triangulation
US10442553B2 (en) * 2015-11-06 2019-10-15 Spherie Ug (Haftungsbeschränkt) Wingless aircraft
US10495735B2 (en) 2017-02-14 2019-12-03 Sony Corporation Using micro mirrors to improve the field of view of a 3D depth map
US10536684B2 (en) 2016-12-07 2020-01-14 Sony Corporation Color noise reduction in 3D depth map
US10549186B2 (en) 2018-06-26 2020-02-04 Sony Interactive Entertainment Inc. Multipoint SLAM capture
US10589860B2 (en) * 2017-05-23 2020-03-17 Gopro, Inc. Spherical infrared emitter
US20200096747A1 (en) * 2018-09-20 2020-03-26 Hexagon Technology Center Gmbh Retroreflector comprising fisheye lens
US10672281B2 (en) * 2018-04-10 2020-06-02 Verizon Patent and Licensing Inc. Flight planning using obstacle data
US10795022B2 (en) 2017-03-02 2020-10-06 Sony Corporation 3D depth map
CN112146627A (en) * 2019-06-26 2020-12-29 极光飞行科学公司 Aircraft imaging system using projected patterns on featureless surfaces
US10979687B2 (en) 2017-04-03 2021-04-13 Sony Corporation Using super imposition to render a 3D depth map
ES2824873A1 (en) * 2019-11-13 2021-05-13 Fund Tekniker METHOD AND SYSTEM FOR SPACE TRACKING OF OBJECTS
US11126181B2 (en) * 2015-12-21 2021-09-21 Gopro, Inc. Systems and methods for providing flight control for an unmanned aerial vehicle based on opposing fields of view with overlap
CN115848355A (en) * 2022-12-01 2023-03-28 纵目科技(上海)股份有限公司 Parking system using surface structured light and method of operating the same
DE102022203653A1 (en) 2022-04-12 2023-10-12 Emqopter GmbH DISTANCE SENSOR SYSTEMS FOR EFFICIENT AND AUTOMATIC ENVIRONMENT DETECTION FOR AUTONOMOUS HOVER-CAPABILITY AIRCRAFT

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20200264279A1 (en) * 2019-02-20 2020-08-20 Flir Surveillance, Inc. Rotatable light sources and associated pulse detection and imaging systems and methods
CN110880185B (en) 2019-11-08 2022-08-12 南京理工大学 High-precision dynamic real-time 360-degree omnidirectional point cloud acquisition method based on fringe projection
US12480763B2 (en) 2020-10-02 2025-11-25 Teledyne Flir Defense, Inc. Lightweight laser designator systems and methods

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050179888A1 (en) * 2002-02-28 2005-08-18 Vaisala Oyj Lidar
US20160070265A1 (en) * 2014-09-05 2016-03-10 SZ DJI Technology Co., Ltd Multi-sensor environmental mapping
US20160076892A1 (en) * 2014-03-24 2016-03-17 SZ DJI Technology Co., Ltd Methods and systems for determining a state of an unmanned aerial vehicle
US20180095459A1 (en) * 2014-06-19 2018-04-05 Skydio, Inc. User interaction paradigms for a flying digital assistant

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3167430A4 (en) * 2014-11-04 2017-08-16 SZ DJI Technology Co., Ltd. Camera calibration
US9720413B1 (en) * 2015-12-21 2017-08-01 Gopro, Inc. Systems and methods for providing flight control for an unmanned aerial vehicle based on opposing fields of view with overlap
US10536684B2 (en) * 2016-12-07 2020-01-14 Sony Corporation Color noise reduction in 3D depth map
US10495735B2 (en) * 2017-02-14 2019-12-03 Sony Corporation Using micro mirrors to improve the field of view of a 3D depth map
US10795022B2 (en) * 2017-03-02 2020-10-06 Sony Corporation 3D depth map
US10589860B2 (en) * 2017-05-23 2020-03-17 Gopro, Inc. Spherical infrared emitter
US10672281B2 (en) * 2018-04-10 2020-06-02 Verizon Patent and Licensing Inc. Flight planning using obstacle data

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050179888A1 (en) * 2002-02-28 2005-08-18 Vaisala Oyj Lidar
US20160076892A1 (en) * 2014-03-24 2016-03-17 SZ DJI Technology Co., Ltd Methods and systems for determining a state of an unmanned aerial vehicle
US20180095459A1 (en) * 2014-06-19 2018-04-05 Skydio, Inc. User interaction paradigms for a flying digital assistant
US20160070265A1 (en) * 2014-09-05 2016-03-10 SZ DJI Technology Co., Ltd Multi-sensor environmental mapping
WO2016033797A1 (en) * 2014-09-05 2016-03-10 SZ DJI Technology Co., Ltd. Multi-sensor environmental mapping

Cited By (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10442553B2 (en) * 2015-11-06 2019-10-15 Spherie Ug (Haftungsbeschränkt) Wingless aircraft
US12007768B2 (en) 2015-12-21 2024-06-11 Gopro, Inc. Systems and methods for providing flight control for an unmanned aerial vehicle based on opposing fields of view with overlap
US11126181B2 (en) * 2015-12-21 2021-09-21 Gopro, Inc. Systems and methods for providing flight control for an unmanned aerial vehicle based on opposing fields of view with overlap
US10536684B2 (en) 2016-12-07 2020-01-14 Sony Corporation Color noise reduction in 3D depth map
US10495735B2 (en) 2017-02-14 2019-12-03 Sony Corporation Using micro mirrors to improve the field of view of a 3D depth map
US10795022B2 (en) 2017-03-02 2020-10-06 Sony Corporation 3D depth map
US10979687B2 (en) 2017-04-03 2021-04-13 Sony Corporation Using super imposition to render a 3D depth map
US10589860B2 (en) * 2017-05-23 2020-03-17 Gopro, Inc. Spherical infrared emitter
DE102017121119A1 (en) * 2017-09-12 2019-03-14 Elmos Semiconductor Aktiengesellschaft Automotive TOF camera with a LED headlight as the light source
DE102017121119B4 (en) * 2017-09-12 2020-08-20 Elmos Semiconductor Aktiengesellschaft Automotive TOF camera with an LED headlight as the light source
CN107817811A (en) * 2017-10-26 2018-03-20 哈尔滨市舍科技有限公司 The unmanned plane collision prevention device and method of view-based access control model
US20190132573A1 (en) * 2017-10-31 2019-05-02 Sony Corporation Generating 3d depth map using parallax
US10979695B2 (en) 2017-10-31 2021-04-13 Sony Corporation Generating 3D depth map using parallax
US10484667B2 (en) * 2017-10-31 2019-11-19 Sony Corporation Generating 3D depth map using parallax
CN107806857A (en) * 2017-11-08 2018-03-16 沈阳上博智像科技有限公司 Unpiloted movable equipment
EP3546884A1 (en) * 2018-03-26 2019-10-02 Simmonds Precision Products, Inc. Ranging objects external to an aircraft using multi-camera triangulation
US10818024B2 (en) 2018-03-26 2020-10-27 Simmonds Precision Products, Inc. Ranging objects external to an aircraft using multi-camera triangulation
DE102018205134A1 (en) 2018-04-05 2018-06-21 Emqopter GmbH Distance sensor system for the efficient and automatic detection of landing sites for autonomous hoverable aircraft
DE102018205134B4 (en) * 2018-04-05 2020-10-15 Emqopter GmbH Distance sensor system for the efficient and automatic detection of landing sites for autonomous hovering aircraft
US10672281B2 (en) * 2018-04-10 2020-06-02 Verizan Patent and Licensing Inc. Flight planning using obstacle data
US11631335B2 (en) 2018-04-10 2023-04-18 Verizon Patent And Licensing Inc. Flight planning using obstacle data
CN108639343A (en) * 2018-05-03 2018-10-12 日照职业技术学院 A kind of mapping unmanned plane
CN108735001A (en) * 2018-05-31 2018-11-02 智飞智能装备科技东台有限公司 A kind of unmanned plane based on Beidou navigation is anti-to hit prior-warning device
US10549186B2 (en) 2018-06-26 2020-02-04 Sony Interactive Entertainment Inc. Multipoint SLAM capture
US11590416B2 (en) 2018-06-26 2023-02-28 Sony Interactive Entertainment Inc. Multipoint SLAM capture
CN109050889A (en) * 2018-08-17 2018-12-21 陈霞 A kind of unmanned plane of new-type freely adjustable protective device
CN109240326A (en) * 2018-08-27 2019-01-18 广东容祺智能科技有限公司 A kind of barrier-avoiding method of the mixing obstacle avoidance apparatus of unmanned plane
US11543244B2 (en) * 2018-09-20 2023-01-03 Hexagon Technology Center Gmbh Retroreflector comprising fisheye lens
US20200096747A1 (en) * 2018-09-20 2020-03-26 Hexagon Technology Center Gmbh Retroreflector comprising fisheye lens
CN112146627A (en) * 2019-06-26 2020-12-29 极光飞行科学公司 Aircraft imaging system using projected patterns on featureless surfaces
US20200408518A1 (en) * 2019-06-26 2020-12-31 Aurora Flight Sciences Corporation, a subsidiary of The Boeing Company Aircraft Imaging System Using Projected Patterns on Featureless Surfaces
US10955241B2 (en) * 2019-06-26 2021-03-23 Aurora Flight Sciences Corporation Aircraft imaging system using projected patterns on featureless surfaces
WO2021094636A1 (en) * 2019-11-13 2021-05-20 Fundación Tekniker Method and system for the spatial tracking of objects
ES2824873A1 (en) * 2019-11-13 2021-05-13 Fund Tekniker METHOD AND SYSTEM FOR SPACE TRACKING OF OBJECTS
DE102022203653A1 (en) 2022-04-12 2023-10-12 Emqopter GmbH DISTANCE SENSOR SYSTEMS FOR EFFICIENT AND AUTOMATIC ENVIRONMENT DETECTION FOR AUTONOMOUS HOVER-CAPABILITY AIRCRAFT
DE102022203653B4 (en) 2022-04-12 2024-02-08 Emqopter GmbH DISTANCE SENSOR SYSTEMS FOR EFFICIENT AND AUTOMATIC ENVIRONMENT DETECTION FOR AUTONOMOUS HOVER-CAPABILITY AIRCRAFT
CN115848355A (en) * 2022-12-01 2023-03-28 纵目科技(上海)股份有限公司 Parking system using surface structured light and method of operating the same

Also Published As

Publication number Publication date
US11323687B2 (en) 2022-05-03
US20190068954A1 (en) 2019-02-28

Similar Documents

Publication Publication Date Title
US11323687B2 (en) Sensing on UAVs for mapping and obstacle avoidance
US11932392B2 (en) Systems and methods for adjusting UAV trajectory
US11423792B2 (en) System and method for obstacle avoidance in aerial systems
US20230343087A1 (en) Automatic terrain evaluation of landing surfaces, and associated systems and methods
US10401872B2 (en) Method and system for collision avoidance
US10409293B1 (en) Gimbal stabilized components for remotely operated aerial vehicles
CN110192122B (en) Systems and methods for radar control on unmanned mobile platforms
Li et al. A novel distributed architecture for UAV indoor navigation
US10474152B2 (en) Path-based flight maneuvering system
EP3236213A2 (en) Cockpit display for degraded visual environment (dve) using millimeter wave radar (mmwr)
CN108351653A (en) Systems and methods for UAV flight control
WO2021199449A1 (en) Position calculation method and information processing system
WO2021238743A1 (en) Flight control method and apparatus for unmanned aerial vehicle, and unmanned aerial vehicle
CN111157970A (en) A Miniaturized Single Photon Detection Sensitivity Area Array Gm-APD Lidar Device
AU2020288125B2 (en) 3D localization and mapping system and method
Aksenov et al. An application of computer vision systems to solve the problem of unmanned aerial vehicle control
Lee et al. Attitude control of quadrotor with on-board visual feature projection system
Nyasulu et al. Comparison study of low-cost obstacle sensing solutions for Unmanned Aerial Vehicles in wildlife scenery
Goerzen et al. Optimal Landing Site Selection Using Kinematic Weight Function During High Speed Approaches
Krause Multi-purpose environment awareness approach for single line laser scanner in a small rotorcraft UA
Garcia et al. Lightweight UAV Borne 3D Perception via Pan-Tilt 2D LiDAR
WO2026041217A1 (en) Multipurpose unmanned aerial vehicle, uav
JP2021036452A (en) System and method for adjusting uav locus
Pratama et al. Tracking and control of a small unmanned aerial vehicle using a ground-based 3D laser scanner

Legal Events

Code STPP reports status in the patent application and granting procedure; code AS records an assignment; code STCB reports application discontinuation. Events in order:

STPP: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER
STPP: FINAL REJECTION MAILED
STPP: DOCKETED NEW CASE - READY FOR EXAMINATION
STPP: NON FINAL ACTION MAILED
STPP: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER
STPP: FINAL REJECTION MAILED
STPP: DOCKETED NEW CASE - READY FOR EXAMINATION
STPP: FINAL REJECTION MAILED
STPP: DOCKETED NEW CASE - READY FOR EXAMINATION
STPP: FINAL REJECTION MAILED
STPP: DOCKETED NEW CASE - READY FOR EXAMINATION
STPP: NON FINAL ACTION MAILED
AS: Assignment to ROBOTIC RESEARCH, LLC (Maryland). Assignors: LACAZE, ALBERTO DANIEL; MURPHY, KARL NICHOLAS; WILHELM, III, RAYMOND PAUL. Reel/frame: 057776/0549. Effective date: 2021-10-12.
STPP: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER
STPP: FINAL REJECTION MAILED
AS: Merger into ROBOTIC RESEARCH OPCO, LLC (Maryland). Assignor: ROBOTIC RESEARCH, LLC. Reel/frame: 060877/0929. Effective date: 2021-10-15.
STCB: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION