WO2017168423A1 - System and method for autonomous guidance of vehicles - Google Patents

System and method for autonomous guidance of vehicles

Info

Publication number
WO2017168423A1
Authority
WO
WIPO (PCT)
Prior art keywords
processor
obstacles
guidance
solution
vehicle
Prior art date
Application number
PCT/IL2017/050390
Other languages
English (en)
Inventor
Itai ORR
Original Assignee
Aerialguard Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Aerialguard Ltd filed Critical Aerialguard Ltd
Publication of WO2017168423A1

Links

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/005Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • G01C21/3407Route searching; Route guidance specially adapted for specific applications
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • G01C21/3453Special cost functions, i.e. other than distance or default speed limit of road segments
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0005Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots with arrangements to save energy
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10Simultaneous control of position or course in three dimensions
    • G05D1/101Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • G05D1/106Change initiated in response to external conditions, e.g. avoidance of elevated terrain or of no-fly zones
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G5/00Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/0004Transmission of traffic-related information to or from an aircraft
    • G08G5/0013Transmission of traffic-related information to or from an aircraft with a ground station
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G5/00Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/0017Arrangements for implementing traffic-related aircraft activities, e.g. arrangements for generating, displaying, acquiring or managing traffic information
    • G08G5/0026Arrangements for implementing traffic-related aircraft activities, e.g. arrangements for generating, displaying, acquiring or managing traffic information located on the ground
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G5/00Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/0043Traffic management of multiple aircrafts from the ground
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G5/00Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/04Anti-collision systems
    • G08G5/045Navigation or guidance aids, e.g. determination of anti-collision manoeuvers

Definitions

  • the present invention relates to autonomous guidance of vehicles. More particularly, the present invention relates to systems and methods for autonomous guidance of vehicles.
  • Guidance entails the description of a path or trajectory that a system is desired to follow; it includes knowledge of the system's current location, orientation, or state, and of other requirements needed to maintain that desired path.
  • Some commercially available vehicles may have a navigation system that receives as input the coordinates of the current position and the destination, for instance using positioning sensors (e.g. Global Positioning System), such that a trajectory is created for movement from the origin point and towards the destination.
  • Such navigation may be based on existing maps of the environment through which the trajectory passes.
  • Autonomous vehicles, such as unmanned airborne drones, usually require that an operator creating a new trajectory for a particular assignment be aware of any possible obstacle along the path of the vehicle, in order to allow that vehicle to reach the destination point.
  • For example, a tall tree in the middle of a field may obstruct the path of a flying autonomous vehicle; the vehicle cannot simply fly in a straight line between the origin and destination points, since various obstacles may be in the way.
  • The operator must therefore initially map all possible obstacles along the flight path (prior to any movement) and modify the flight trajectory accordingly, in order to prevent undesired interaction between the vehicle and the obstacles.
  • In some cases, trajectory limitations are imposed onto autonomous vehicles, such that they cannot easily bypass an obstacle in their way. Such limitations may be physical and/or geographical (e.g. a mountain) or regulatory (e.g. no-fly zones near airports), and the obstacles themselves may be dynamic (e.g. a bird) or static (e.g. a building).
  • commercially available systems necessitate the presence of a human operator for each vehicle in order to plan and follow a trajectory that is capable of avoiding obstacles, wherein the operator controls the vehicle in real-time.
  • a system for autonomous guidance of one or more vehicles that are at least partially controlled by a computer, the system comprising an onboard unit, the onboard unit being onboard a vehicle; and a command center, configured to allow communication between the onboard unit and the command center.
  • the onboard unit may include a processor and at least one sensor.
  • the system may be configured to receive information about a planned path and about obstacles from at least one of: the sensors and the command center, calculate a set of guidance solutions according to at least one of an obstacle avoidance algorithm and a predefined guidance rule, choose a guidance solution having maximal efficiency according to the predefined guidance rule, and apply the chosen guidance solution.
  • a system for autonomous guidance of one or more vehicles that are at least partially controlled by a computer may comprise an onboard unit onboard a vehicle, the onboard unit may comprise: a memory, comprising at least one map of a planned path, an obstacle avoidance algorithm and a predefined guidance rule, at least one sensor, configured to detect obstacles, and a processor.
  • the processor may be configured to receive information about a planned path and about obstacles (e.g. along the planned path) from the memory and sensors, calculate a set of guidance solutions, according to at least one of an obstacle avoidance algorithm and a predefined guidance rule, choose a guidance solution having maximal efficiency according to the predefined guidance rule, and apply the chosen guidance solution.
  • the command center may comprise a second processor that may be configured to allow processing of at least one of: received information about the planned path and obstacles, calculated set of guidance solutions, and chosen guidance solution having maximal efficiency.
  • the onboard unit may further comprise a first communication module
  • the command center may comprise a second communication module to allow communication therebetween.
  • the onboard unit may further comprise a memory having at least one map of a planned path, and at least one obstacle avoidance algorithm stored therein.
  • the command center may be configured to receive from the processor information about obstacles and solutions and to override a solution chosen by the processor.
  • the communication may be carried out via at least one wireless network.
  • the system may further comprise a server having a third communication module and configured to allow communication between the onboard unit and the server.
  • the server may be configured to receive information from the processor, to provide a collective mapping of obstacles and to provide classification of obstacles.
  • the server may further comprise a database having at least one map of a planned path, and at least one obstacle avoidance algorithm stored therein.
  • the communication may be carried out via at least one wireless network.
  • the system may further comprise a user interface configured to allow at least partial manual control of the vehicle.
  • the system may allow the operator to receive at least one notification from the onboard unit.
  • the processor may be further configured to modify the calculation of solutions based on data stored during previous guidance solutions.
  • a method for autonomous guidance of one or more vehicles that are at least partially controlled by a computer may comprise receiving by a processor information about a planned path and about obstacles from at least one of: one or more sensors of an onboard unit on a vehicle, a command center and a server, calculating by the processor a set of guidance solutions according to at least one of an obstacle avoidance algorithm and a predefined guidance rule, choosing by the processor a guidance solution having maximal value of a solution function according to the predefined guidance rule, and applying by the processor the chosen solution, wherein the solution function is a weighted average function of one or more efficiency parameters.
  • efficiency parameters may be selected from one or more of: time required to complete the assignment, fuel consumption, path length, computational power required for the processing, distance from previous calculated solution, and change in movement direction.
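  • As an illustration of the solution function and efficiency parameters above, the following is a minimal sketch in which each candidate solution carries one cost per parameter and the chosen solution minimizes the weighted average. The parameter names and weights are illustrative assumptions, not values from this disclosure.

```python
# Sketch of a weighted solution function; names and weights are assumed.
EFFICIENCY_WEIGHTS = {
    "time_to_complete": 0.3,
    "fuel_consumption": 0.2,
    "path_length": 0.2,
    "compute_cost": 0.1,
    "deviation_from_previous": 0.1,
    "heading_change": 0.1,
}

def solution_cost(costs):
    """Weighted average of the efficiency parameters of one candidate."""
    total = sum(EFFICIENCY_WEIGHTS.values())
    return sum(EFFICIENCY_WEIGHTS[k] * costs[k] for k in EFFICIENCY_WEIGHTS) / total

def choose_solution(candidates):
    """Pick the candidate with the optimal (here: minimal) weighted cost."""
    return min(candidates, key=lambda c: solution_cost(c["costs"]))

# Example: two avoidance paths around the same obstacle.
paths = [
    {"name": "clockwise", "costs": {"time_to_complete": 40, "fuel_consumption": 5,
                                    "path_length": 120, "compute_cost": 1,
                                    "deviation_from_previous": 3, "heading_change": 30}},
    {"name": "over_the_top", "costs": {"time_to_complete": 55, "fuel_consumption": 9,
                                       "path_length": 150, "compute_cost": 1,
                                       "deviation_from_previous": 8, "heading_change": 10}},
]
print(choose_solution(paths)["name"])  # -> clockwise, with these weights
```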
  • the method may further comprise receiving by the command center from the processor information about obstacles and solutions and overriding a solution chosen by the processor.
  • the method may further comprise receiving by the server information from the processor, providing a collective mapping of obstacles and providing classification of obstacles.
  • the method may further comprise receiving by the processor external data, performing data fusion to create a unified threat map, track dynamic obstacles and classify behavior of obstacles, and planning a solution path.
  • the method may further comprise uploading the unified map to the server.
  • the method may further comprise determining a projection of the vehicle in the planned path on an obstacle's plane in order to find solutions.
  • the method may further comprise determining a polygon inscribing an obstacle in order to find a solution.
  • the method may further comprise determining a cylinder inscribing an obstacle in order to find a solution.
  • the method may further comprise determining a sphere inscribing an obstacle in order to find a solution.
  • the method may further comprise locating a window within the polygon that may be capable of accommodating the vehicle so as to allow guidance of the vehicle through said window.
  • the method may further comprise eliminating at least one solution from the set of calculated guidance solutions, according to at least one of the obstacle avoidance algorithm and the predefined guidance rule.
  • the method may further comprise determining threatening obstacles of the received obstacles. According to some embodiments, the method may further comprise displaying a graphics overlay on images received from the onboard unit. According to some embodiments, the method may further comprise highlighting at least one obstacle.
  • FIG. 1 schematically illustrates a system for autonomous guidance, according to some embodiments of the invention
  • FIG. 2 schematically illustrates the architecture of the system for autonomous guidance, according to some embodiments of the invention
  • FIG. 3 schematically illustrates an additional initial sensors data process, according to some embodiments of the invention.
  • FIG. 4 shows a flowchart illustrating a method for path planning, according to some embodiments of the invention.
  • FIG. 5A schematically illustrates a side view of a method for detecting obstacles from a depth map, according to some embodiments of the invention
  • FIG. 5B schematically illustrates a frontal view of obstacles, on a plane perpendicular to the plane of FIG. 5A, of the method for detecting obstacles from a depth map, according to some embodiments of the invention
  • FIG. 6 shows a flowchart illustrating a method for detecting obstacles based on a depth map, according to some embodiments of the invention
  • FIG. 7 schematically illustrates a solution planning process, according to some embodiments of the invention.
  • FIG. 8 schematically illustrates a solution planning process with a cylinder, according to some embodiments of the invention.
  • FIG. 9A shows a flowchart illustrating a method for planning a path to avoid unknown obstacles, according to some embodiments of the invention.
  • FIG. 9B shows a continuation of the flowchart illustrating the method for planning a path to avoid unknown obstacles, according to some embodiments of the invention.
  • FIG. 10 schematically illustrates a path planning display on the user interface, according to some embodiments of the invention.
  • Some embodiments of the invention may provide systems and methods for autonomous guidance for computer-controlled systems such as autonomous vehicles.
  • the autonomous guidance system may provide, according to some embodiments of the invention, autonomous assignment- or mission-based navigation and path planning, detection of obstacles from onboard sensors and other external sources, and high-level decision making in real time, based on, inter alia, risk-based path calculation, obstacle avoidance planning, and/or real-time path planning.
  • the terms mission and assignment as used hereinafter may refer to any function that is assigned to a computer-controlled system (such as an autonomous vehicle) to carry out in a predetermined time, for example a mission to deliver a parcel to a particular location.
  • System 100 may include an onboard unit 10, which is onboard a vehicle 50 which may be at least partially controlled by a computer, for example, an unmanned vehicle, autonomous vehicle and the like.
  • System 100 may further include a command center 12, which may be stationary or mobile.
  • command center 12 may be configured to transfer and receive data with at least one vehicle 50 (i.e. with onboard unit 10 onboard vehicle 50).
  • system 100 may include a server 14 that may be configured to transfer and receive data with at least one of command center 12 and one or more vehicles 50.
  • system 100 may include one or more computing devices, such as cloud-based server computers, in active communication with one or more command centers 12 and/or one or more onboard units 10 via a network 80, such as the internet.
  • Onboard unit 10 may communicate with command center 12 and/or with server 14, for example, via one or more networks 80.
  • onboard unit 10 may include a first communication module 91 (e.g. an antenna and/or a transceiver), command center 12 may include a second communication module 92, and server 14 may include a third communication module 93, wherein these communication modules are configured to allow communication therebetween, for example, via one or more networks 80.
  • network 80 may be a wireless network, such as cellular, radio, Wi-Fi, Bluetooth, and the like.
  • system 100 may include multiple onboard units 10 onboard respective vehicles 50. Accordingly, multiple onboard units 10 may communicate with command center 12 and/or with server 14 and/or other onboard units 10 in real time.
  • autonomous guidance system 100 may guide any suitable computer controlled system.
  • the computer controlled system is a vehicle 50.
  • embodiments of the invention are not limited to vehicles.
  • System 100 may provide the functionalities described hereinafter by any combination of onboard unit 10, command center 12 and server 14. It will be appreciated that not all functionalities need to be provided at the same time.
  • onboard unit 10 may include one or more sensors 20 capable of detecting obstacle(s) 60 from a distance that leaves sufficient space for obstacle avoidance maneuvering, and a processor 22. Sensors 20 may acquire data and transfer at least a portion of the data to processor 22 for at least one of: recognition of obstacle(s) 60, analysis of the shape of obstacle(s) 60, and detection of, for example, the position, location, movement direction, velocity and/or acceleration of vehicle 50 and/or of obstacle(s) 60 and/or relative such movement vectors between vehicle 50 and obstacle(s) 60.
  • sensors 20 may include, for example, image sensor(s), movement sensor(s), depth sensor(s), distance sensor(s), acceleration sensor(s), gyroscope(s), compass(es), Inertial Navigation System (INS) sensors, Global Positioning System (GPS) sensors, active and/or passive stereovision and/or computer vision, acoustic sensors, LiDAR, RADAR, beacon transceivers (e.g. radio beacon transceivers) and/or any other suitable sensor(s).
  • processor 22 may allow tracking and/or maintaining a predetermined distance and/or angle to an obstacle based on data from sensors 20.
  • processor 22 may provide integrated capabilities, utilizing the same hardware for additional tasks, for instance simultaneously processing image data from image sensors and controlling a payload (using the same processor 22).
  • command center 12 and/or server 14 may also acquire and transmit (for instance via network 80) to processor 22 data about vehicle 50, its environment and/or obstacles 60 in that environment.
  • server 14 may include database 30 of maps of known environments and of obstacles 60.
  • command center 12 may include a second processor that is configured to allow processing of data received from at least one onboard unit 10 and/or server 14, such that calculation of possible solutions for obstacle avoidance may be carried out at command center 12 instead of being carried out at onboard unit 10. In some embodiments, calculation of possible solutions may similarly be carried out at server 14 instead of at onboard unit 10.
  • a functionality of processor 22 may include, for example, provision of autonomous situational awareness of vehicle 50. Accordingly, processor 22 may, for example, receive data from various sources such as, for example, sensors 20 and/or command center 12 and/or server 14, and then fuse the received data and create, based on the received data, a unified threat map of the environment of vehicle 50 and the potential obstacles 60 in that environment.
  • the unified threat map may include, on a single or unified coordinate system, location (e.g. coordinates) information and time stamps for each obstacle. The information may be received from one or more sources.
  • the unified threat map may further include additional information regarding one or more of the obstacles, such as its velocity, direction, planned path of the obstacle, and the like.
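  • One plausible way to represent such an entry is sketched below; the field names and layout are assumptions for illustration, since the disclosure does not fix a data structure.

```python
# Sketch of one unified-threat-map entry; field names are assumed.
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class ThreatMapEntry:
    obstacle_id: str
    position: Vec3                   # location on the unified coordinate system
    timestamp: float                 # time stamp on a shared time base (seconds)
    source: str                      # e.g. "onboard_sensor", "server", "ATC"
    velocity: Optional[Vec3] = None  # populated for dynamic obstacles
    planned_path: List[Vec3] = field(default_factory=list)

# The unified threat map is then a keyed collection of such entries:
threat_map = {
    "bird-7": ThreatMapEntry("bird-7", (120.0, 40.0, 35.0), 1490000000.0,
                             "onboard_sensor", velocity=(3.0, -1.0, 0.0)),
}
```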
  • processor 22 may track moving obstacles 60, i.e. dynamic obstacles 60. Additionally, based on the data received from the various sources, processor 22 may classify the behavioral aspects of dynamic obstacles 60. For example, processor 22 may identify whether a dynamic obstacle 60 is a cooperative obstacle, for example if the obstacle conforms its behavior and/or path to the behavior and/or movement of vehicle 50, for example in order to avoid colliding with vehicle 50. Conversely, an identified obstacle may be a non-cooperative obstacle, for example if the obstacle acts independently of the behavior and/or movement of vehicle 50, or if the obstacle, e.g. an interceptor, threatens vehicle 50.
  • dynamic obstacle 60 may be identified as a cooperative obstacle if that obstacle was previously registered in a network of vehicles 50 (e.g. Air Traffic Control (ATC)) or alternatively if the behavior of dynamic obstacle conforms with traffic rules for conflict resolution, for example aviation standards that dictate for each airborne vehicle to which direction to turn in order to avoid collision.
  • the current position may be estimated from previous data corresponding to previously recorded time stamps (for example, as required when time synchronization is malfunctioning).
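  • A minimal sketch of such an estimation, under an assumed constant-velocity model: the current position of a dynamic obstacle is dead-reckoned forward from its last recorded position, velocity and time stamp.

```python
# Constant-velocity extrapolation from a stale time stamp (assumed model).
def estimate_current_position(last_pos, velocity, last_timestamp, now):
    """Dead-reckon the position forward from the last recorded time stamp."""
    dt = now - last_timestamp
    return tuple(p + v * dt for p, v in zip(last_pos, velocity))

# e.g. obstacle last seen at (100, 50, 30) m moving (2, 0, -1) m/s, 3 s ago:
print(estimate_current_position((100, 50, 30), (2, 0, -1), 0.0, 3.0))
# -> (106.0, 50.0, 27.0)
```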
  • system 100 may further include a user interface 40.
  • a user may set a behavior mode of guidance system 100 by user interface 40.
  • a user may set, for example, in advance, a behavior mode according to the type of assignment, risk tolerance, and/or other parameters.
  • such behavior modes may include at least one of: shortest or fastest path (e.g. for delivery assignments), minimum deviation from original path (e.g. for surveying, or area coverage missions), minimum fuel/battery consumption path, safest path, constant distance from certain object (e.g. for inspection missions), or the like.
  • user interface 40 for instance coupled to command center 12 and/or to a ground station, may include an information display (e.g. embedded in dedicated goggles or headset) with graphics overlay on images received from vehicles 50.
  • information on the environment received from the sensors 20 on-board the vehicle 50 and/or maps and/or other vehicles 50 may be displayed to operator 70.
  • the shape and/or position of obstacles 60 (or hazards) to vehicles 50 may be highlighted on the display.
  • such information display of user interface 40 may allow displaying the advance of vehicle 50 in real time with presentation of obstacles 60 on screen. In some embodiments, such information display of user interface 40 may allow display of distances to each detected object or obstacle 60.
  • processor 22 may classify the event, for example based on the behavior mode settings, by assessing with regard to obstacle(s) 60 the threat type(s), environment, reaction time (e.g. required vs. enabled) and position(s) of obstacle(s) 60, relative to vehicle 50 and/or to each other.
  • an algorithm executed by processor 22 may consider, for example, the mission type and/or the survivability of vehicle 50 and/or of system 100.
  • Processor 22 may determine the order of the path calculation according to the expected risk from the various obstacles identified in the event.
  • interceptors may be taken into account first, and then dynamic objects (dynamic un-cooperative obstacles, then dynamic cooperative obstacles), followed by stationary sensed obstacles and stationary known obstacles, for example in this order.
  • other suitable orders of obstacle(s) 60 may be determined.
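  • As a sketch of such risk-based ordering, obstacles can be sorted by a priority table before path calculation; the ranks below are illustrative assumptions, not values from the disclosure.

```python
# Assumed priority ranks: lower rank = processed earlier.
OBSTACLE_PRIORITY = {
    "interceptor": 0,
    "dynamic_uncooperative": 1,
    "dynamic_cooperative": 2,
    "stationary_sensed": 3,
    "stationary_known": 4,
}

def order_by_risk(obstacles):
    """Return obstacles ordered from most to least urgent for path planning."""
    return sorted(obstacles, key=lambda o: OBSTACLE_PRIORITY[o["type"]])

events = [{"id": "tree", "type": "stationary_known"},
          {"id": "bird", "type": "dynamic_uncooperative"},
          {"id": "drone", "type": "dynamic_cooperative"}]
print([o["id"] for o in order_by_risk(events)])  # -> ['bird', 'drone', 'tree']
```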
  • processor 22 of vehicle 50 may reduce the speed of vehicle 50, and/or if necessary change the coverage area of one or more sensors 20 (e.g. by yawing and pitching vehicle 50, and/or by moving vehicle 50 backwards to increase the distance from obstacle 60) to find another possible path.
  • processor 22 of vehicle 50 may guide vehicle 50 back to a previous check point, and take a different navigation path from the previous check point.
  • a "no solution mode" may be carried out.
  • vehicle 50 may be ordered to go to a previous check point if unable to find a safe path.
  • Check points may be used by processor 22 for navigation, as discussed in detail herein below.
  • in case that vehicle 50 cannot access a waypoint, the vehicle may navigate to the closest point outside of the non-accessible area. It should be noted that this may be a stationary waypoint or a dynamic waypoint (in case of a 'follow me' mode, for example, where vehicle 50 follows a dynamic target).
  • the original assignment plan, with one or more waypoints, may be loaded to the check point list.
  • a tree graph may be created and the location may be registered as a check point. If vehicle 50 encounters a 'dead end' and/or no solution case, vehicle 50 may then search back in the tree graph and select a different solution from the tree graph that leads to a different location from the 'dead end' location.
  • vehicle 50 may then actively search for a new path (e.g. in cases where one solution leads to a 'dead end'), using a combination of check points as well as real time mapping of the environment, such that autonomous navigation may be possible.
  • Processor 22 may generate multiple solutions, for example concurrently.
  • processor 22 may generate, for example for each obstacle that may be encountered, a tree graph of solutions or another representation of solutions that may facilitate decisions.
  • the tree may be generated, for example, based on user preferences and/or allowed calculation time. In case a solution leads to "No Solution" mode, for example, the relevant branch of the solutions tree may be eliminated.
  • in case the allowed calculation time is too short for generating a decision tree, the optimal solution may be calculated per obstacle, by checking the path, generating solutions, eliminating solutions and choosing the optimal path, for example as discussed in detail herein. For example, solutions for flying an airborne vehicle in a "no fly" zone may be eliminated due to a predefined rule forbidding entry to that area.
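  • The following is a sketch of such a solutions tree with branch elimination: each node is a candidate maneuver, a branch that reaches "No Solution" is marked dead, and the dead-end status propagates upward when all siblings are dead too. The structure and naming are assumptions, not taken from the disclosure.

```python
# Sketch of a solutions tree with pruning of "No Solution" branches.
class SolutionNode:
    def __init__(self, maneuver, parent=None):
        self.maneuver = maneuver
        self.parent = parent
        self.children = []
        self.dead_end = False

    def add_child(self, maneuver):
        child = SolutionNode(maneuver, parent=self)
        self.children.append(child)
        return child

    def prune(self):
        """Eliminate this branch; bubble up if no live sibling remains."""
        self.dead_end = True
        if self.parent and all(c.dead_end for c in self.parent.children):
            self.parent.prune()

def live_leaves(node):
    """Yield still-viable leaf maneuvers after elimination."""
    if node.dead_end:
        return
    if not node.children:
        yield node.maneuver
    for child in node.children:
        yield from live_leaves(child)

root = SolutionNode("start")
cw, ccw = root.add_child("clockwise"), root.add_child("counter_clockwise")
cw.prune()                              # clockwise branch hit "No Solution"
print(list(live_leaves(root)))          # -> ['counter_clockwise']
```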
  • processor 22 may calculate, during a certain event, statistics and/or parameters about the behavior of obstacles 60 and vehicle 50, and/or may upload the statistics and/or parameters to server 14, for example, for future learning.
  • processor 22 may upload telemetry, and/or known obstacles 60, and/or the original flight plan, and/or a calculated (i.e. new) path plan, and/or other simulation and/or analysis data.
  • processor 22 may perform mapping and/or state estimation of the environment of vehicle 50.
  • a further functionality of processor 22 may include, for example, planning the path of vehicle 50 in real time. In order to plan a path in a certain event, processor 22 may carry out a planning method for autonomous guidance, according to some embodiments of the invention, for example as described herein with reference to any one of Figs. 2-8. Processor 22 may perform the methods described herein in response to instructions that may be stored in a non-transitory storage device, for example in memory 23, and carried out by processor 22.
  • Command center 12 may receive from processor 22 information about obstacles 60 that vehicle 50 has encountered, and/or respective path (navigation) solutions chosen by processor 22, e.g. decisions and/or path determinations made by processor 22. In some embodiments, command center 12 may inform an operator 70 of obstacles 60 that vehicle 50 has encountered and/or respective solutions chosen by processor 22. It should be appreciated that operator 70 may be a human or computer operator. Additionally, command center 12 may display the obstacles and/or respective solutions.
  • command center 12 may carry out instructions to override a solution chosen by processor 22.
  • command center 12 may receive and/or store instructions to send commands to processor 22 to override a solution chosen by processor 22 and/or to change or replace it with another solution, for example in certain situations.
  • operator 70 may decide to override a solution chosen by processor 22, for example according to predefined rules and/or other considerations, and may send corresponding instructions to processor 22 via command center 12.
  • operator 70 may receive at least one notification from vehicle 50, for example when an obstacle has been identified or a change made to the original navigation plan.
  • command center 12 and/or operator 70 may control multiple vehicles 50 and coordinate the multiple vehicles 50.
  • command center 12 and/or operator 70 may control multiple vehicles 50 in compatibility with traffic management systems, e.g. display and relay appropriate data to/from vehicles 50.
  • server 14 may receive and collect data from processor(s) 22, for example from one or more vehicles 50. Based on the collected data, server 14 may provide collective mapping, e.g. maps provided by the various processors 22 may be shared via server 14 and/or may be combined to form a combined map including information from the various maps. Additionally, server 14 may improve scenario classifications based on classification information received from the various processors 22, for example by machine learning, and share the improved classifications with the various processors via server 14. For example, based on information received from processor(s) 22, server 14 may change risk classification (e.g. regarding path safety) and/or the path calculation method with regard to an event that includes a certain kind of obstacle, environment, threat type, obstacle locations and/or any other suitable property.
  • learning may include behavior classification, wherein at least one vehicle classifies the behavior of a dynamic obstacle, then transmits this to other vehicles in the system.
  • This may allow for multi-vehicle tracking within a network of vehicles, such as vehicles 50, such that not every vehicle needs to reclassify the obstacle by itself.
  • Another aspect of learning may be based on the deviation between planned trajectory and actual trajectory and adjustments made during future maneuvers in order to minimize the error.
  • server 14 may change obstacle type classification, for example for a certain obstacle behavior, based on information received from processor(s) 22. For example, server 14 may learn, based on information received from processor(s) 22, that a certain behavior of an obstacle 60 indicates that the obstacle 60 is of a certain class of obstacles, and/or requires a certain kind of solution. For example, a dynamic obstacle and a static obstacle may require different solutions.
  • Command center 12 may receive from server 14 the collected data and/or the improved classifications. In some embodiments, the calculations and/or learning of improved classifications may be processed by command center 12 and the improved classifications may be shared on server 14.
  • server 14 may share to processor(s) 22 and/or command center 12 the locations of various vehicles 50 in real time.
  • server 14 may share locations of vehicles 50 that include processors 22 that communicate with server 14. The shared locations may be displayed by command center 12, for example in real time.
  • a processor 22 may be included in a simulation device that receives information from server 14 and/or from command center 12.
  • Server 14 and/or command center 12 may provide score points systems, for example based on information collected from various processors 22, to evaluate performance of a user of the simulation device and/or for training.
  • at least one sensor 20 of vehicle 50 may be a receiver that is configured to receive signals (e.g. radio signals) that are indicative of additional dynamic obstacles and/or vehicles in the area.
  • FIG. 2 is a schematic illustration of the architecture of system 100, according to some embodiments of the invention.
  • Processor 22 may receive external collision avoidance data 90 from various external sources, for example external to onboard unit 10.
  • external collision avoidance data 90 may include, for example, a clouds map 210 (including weather data) from server 14, traffic alert data 230, for example from collision avoidance systems such as, for example, Air Traffic Control (ATC), Traffic Alert and Collision Avoidance System (TCAS), Automatic Dependent Surveillance Broadcast (ADS-B) and/or any other suitable system, and operator/control input 250, for example from command center 12 and/or operator 70.
  • External data 90 may be received by onboard unit 10 via networks 80.
  • processor 22 may receive sensors data 260, for example from onboard sensors 20.
  • Processor 22 may perform data fusion 270, in which external data 90 and sensors data 260 may be fused to create a unified threat map of the environment of vehicle 50 and the potential obstacles 60, to track the dynamic obstacles 60, and to classify the behavior of dynamic obstacles 60. Then, processor 22 may perform upload 272 of the created unified map to server 14, for example in order to keep database 30 updated. Additionally, based on the fused data, processor 22 may perform motion planning 274, for example as described in detail herein. In some embodiments, planning the motion of vehicle 50 may also include planning the path along which vehicle 50 may move. Once the motion is planned, processor 22 may perform an additional safety step 276, in which processor 22 verifies that no collisions are expected by taking the planned path. In case a collision is expected, processor 22 may recalculate a new path or perform the no solution mode 278, for example as described herein. In case no collisions are expected, processor 22 may perform execution 280 of the planned motion and/or path.
  • process 265 may perform stationary obstacle detection 264 and dynamic obstacle detection 266.
  • Process 265 may perform tracking 268 of the detected dynamic obstacles.
  • the processed sensors data including the depth map, the detected stationary and dynamic obstacles, and/or the tracking, may be used in data fusion 270.
  • the fusion process may create a unified map representing information from different sources on a single coordinate system.
  • threat analysis 308 may also include obstacle planning.
  • path planning may perform layer planning, wherein a layer may refer to any processed data layer such as paths, obstacles, etc. It should be appreciated that such planning and/or identification may carry out a classification based on the scenario in real time, for instance taking into account the mission, the vehicle state vector, detected obstacles and their type, relative geometry of the scenario and/or of the obstacle, etc.
  • Figs. 5A and 5B are schematic illustrations of a method for detecting obstacles from a depth map, according to some embodiments of the invention.
  • Fig. 5A is a schematic side view showing vehicle 50 moving in direction 'A' towards obstacles 60.
  • the depths 'Z' in direction 'A' may be divided into a certain determined number of ranges 'R'.
  • the number and/or size of ranges 'R' may be determined based on at least one of: the momentary distance of vehicle 50 from obstacles 60, the velocity of vehicle 50 relative to obstacles 60, the maneuverability of vehicle 50, the current maneuver of vehicle 50 and/or any other suitable parameter.
  • Fig. 6 is a schematic flowchart illustrating a method for detecting obstacles based on a depth map, the method may be carried out by processor 22, according to some embodiments of the invention.
  • a unified depth map may be obtained, for example as described in detail herein.
  • a safety distance may be calculated. It should be noted that the calculated safety distance may also be zero, or substantially zero according to a predefined distance parameter. The safety distance may be determined by a defined operator, depending, for example, on the velocity of vehicle 50 relative to obstacles 60, the momentary distance of vehicle 50 from obstacles 60, known sensor margin of error, and/or any other suitable parameter.
  • an XY plane may be divided into blocks 'b', for example as described and shown with reference to Fig. 5B.
  • depth 'Z' may be divided into ranges 'R', for example as described and shown with reference to Fig. 5A.
  • binary masks may be executed for each block per each range, to identify obstacles and windows in obstacles. For example, for each range, blocks 'b' that include sensed objects (e.g. sensed by sensor 20) are represented by binary '1', whereas blocks 'b' that do not include sensed objects are represented by binary '0', or vice versa.
  • the binary mask may operate to show only blocks that include sensed objects, i.e. object blocks.
  • processor 22 may re-cluster adjacent object blocks to obstacles.
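  • A sketch of this block/range occupancy test is given below: the depth map is cut into XY blocks and depth ranges, a boolean mask marks blocks containing sensed returns in each range, and adjacent occupied blocks are re-clustered into obstacles. The block size and range boundaries are illustrative assumptions, and `scipy.ndimage.label` stands in for the re-clustering step.

```python
import numpy as np
from scipy import ndimage

def obstacles_per_range(depth_map, block=16, ranges=(0, 5, 10, 20, 40)):
    """For each depth range, build a binary block mask and cluster it.
    Assumes the map's height and width are divisible by the block size."""
    h, w = depth_map.shape
    results = []
    for near, far in zip(ranges[:-1], ranges[1:]):
        in_range = (depth_map >= near) & (depth_map < far)
        # Binary mask: a block is '1' if any pixel in it falls in this range.
        blocks = in_range.reshape(h // block, block, w // block, block).any(axis=(1, 3))
        labels, n = ndimage.label(blocks)   # cluster adjacent occupied blocks
        results.append((near, far, labels, n))
    return results

# Example: a synthetic 64x64 depth map with one close object.
depth = np.full((64, 64), 100.0)
depth[10:20, 30:40] = 7.0                   # an obstacle 7 m away
for near, far, labels, n in obstacles_per_range(depth):
    if n:
        print(f"range {near}-{far} m: {n} obstacle cluster(s)")
```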
  • dynamic obstacles may be detected and tracked to estimate future positions of dynamic obstacles based on known parameters of the dynamic obstacle, such as, for example, the planned path, the velocity and acceleration values of the dynamic obstacle and the like.
  • a Kalman filter may be applied on the known parameters to determine the expected or future position of dynamic obstacles. Other methods may be used.
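  • For illustration, a minimal constant-velocity Kalman filter for tracking one dynamic obstacle in 2D is sketched below; the noise values and time step are assumptions, not from the disclosure.

```python
import numpy as np

dt = 0.1
F = np.array([[1, 0, dt, 0],      # state transition for [x, y, vx, vy]
              [0, 1, 0, dt],
              [0, 0, 1,  0],
              [0, 0, 0,  1]])
H = np.array([[1, 0, 0, 0],       # only position is measured
              [0, 1, 0, 0]])
Q = np.eye(4) * 0.01              # process noise (assumed)
R = np.eye(2) * 0.5               # measurement noise (assumed)

x = np.zeros((4, 1))              # initial state estimate
P = np.eye(4)

def kf_step(x, P, z):
    """One predict/update cycle; z is a 2x1 position measurement."""
    x = F @ x                     # predict
    P = F @ P @ F.T + Q
    y = z - H @ x                 # update
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ y
    P = (np.eye(4) - K @ H) @ P
    return x, P

for t in range(50):               # obstacle drifting at 1 m/s along x
    z = np.array([[t * dt * 1.0], [0.0]])
    x, P = kf_step(x, P, z)
print("estimated velocity:", x[2, 0], x[3, 0])   # converges to ~1.0 and ~0.0
```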
  • Fig. 7 is a schematic illustration of solution planning by processor 22, according to some embodiments of the invention.
  • a polygon 62 may be determined to define the limits of obstacle 60, for example based on the clustered object blocks with a safety distance 'd'.
  • vehicle 50 in the planned path may have a virtual projection 50a on a frontal plane of an obstacle 60.
  • processor 22 may use the momentary distance of obstacle 60 from vehicle 50, the progress direction of vehicle 50, planned way points, planned maneuver, and/or any other suitable parameter.
  • the progress direction may be a three-dimensional vector for aerial/submarine/space assignments, and a two-dimensional vector for ground and marine assignments.
  • a projection 50a may have a center point 52a.
  • in case projection 50a overlaps polygon 62, obstacle 60 may be identified as a threat.
  • Polygon 62 may have, for example, segments s1-s8 and vertices v1-v8.
  • potential solutions, e.g. obstacle-avoiding paths, may be generated, for example, by calculating path lines (e.g. perpendicular lines) from center point 52a to the segments of polygon 62, such as, for example, p1 shown in Fig. 7. In case the calculated path line falls outside the segment, the solution may be a path line through the nearest edge of the segment.
  • potential solutions may include lines that go through vertices vl-v8 or through a middle of each segment sl-s8.
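  • The geometry of these candidate lines amounts to projecting the center point onto each polygon segment and clamping to the nearest segment edge; a minimal sketch follows, with vertices added as further candidates.

```python
import numpy as np

def candidate_point(center, seg_a, seg_b):
    """Foot of the perpendicular from center onto segment (a, b), clamped to its ends."""
    a, b, c = map(np.asarray, (seg_a, seg_b, center))
    ab = b - a
    t = np.dot(c - a, ab) / np.dot(ab, ab)
    t = np.clip(t, 0.0, 1.0)      # outside the segment -> nearest edge
    return a + t * ab

def candidate_solutions(center, polygon):
    """One candidate exit point per segment, plus the polygon vertices."""
    pts = [candidate_point(center, polygon[i], polygon[(i + 1) % len(polygon)])
           for i in range(len(polygon))]
    return pts + [np.asarray(v, dtype=float) for v in polygon]

square = [(0, 0), (4, 0), (4, 4), (0, 4)]
print(candidate_point((2, 1), (0, 0), (4, 0)))   # -> [2. 0.]
```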
  • a method for planning a path to avoid unknown obstacles according to some embodiments of the invention is described in detail with reference to Figs. 9A-9B.
  • a path solution may be created within polygon 62 in case that a window is detected through which vehicle 50 may pass.
  • a clockwise solution may include, for example, a line CW from a center point 52 of vehicle 50 and tangent to cylinder 64, which is calculated by deviating from the original path in a clockwise direction, for example while staying on the same plane.
  • a top solution may include, for example, a line segment (not shown) from center point 52 to an edge of top 'T' and a line segment along the top 'T'.
  • a bottom solution may include, for example, a line segment (not shown) from center point 52 to an edge of bottom 'B' and a line segment along the bottom 'B'.
  • an imaginary line may be drawn between the current position of the vehicle and a way point, with the obstacle therebetween, such that cross points with cylinder or sphere (e.g. with a predefined safety distance) may be determined above and/or below the cylinder or sphere.
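  • The clockwise and counter-clockwise solutions reduce, in the cylinder's cross-section plane, to the two lines from the vehicle's center point tangent to the (safety-inflated) circle; a geometry-only sketch is given below.

```python
import math

def tangent_points(px, py, cx, cy, r):
    """Both tangency points on circle (cx, cy, r) seen from point (px, py).
    Which of the two corresponds to the clockwise deviation depends on the
    chosen orientation convention."""
    dx, dy = px - cx, py - cy          # vector from center to vehicle
    d = math.hypot(dx, dy)
    if d <= r:
        return []                      # vehicle inside the safety circle: no tangent
    base = math.atan2(dy, dx)          # direction center -> vehicle
    alpha = math.acos(r / d)           # right triangle: cos(alpha) = r / d
    return [(cx + r * math.cos(base + s * alpha),
             cy + r * math.sin(base + s * alpha)) for s in (+1, -1)]

# e.g. vehicle at (0, 0), obstacle cylinder centered at (10, 0), radius 3:
print(tangent_points(0, 0, 10, 0, 3))  # two symmetric tangency points near x = 9.1
```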
  • processor 22 may identify known obstacles that may be encountered in the originally planned path of vehicle 50.
  • An obstacle may be represented by processor 22 by a set of shapes such as, for example, a cylinder, sphere and/or any kind of polygon (e.g. three-dimensional polygon).
  • processor 22 may first check if a way point of the originally planned path of vehicle 50 is located outside a cylinder (or other three-dimensional polygon) inscribing the obstacle, as described herein with reference to Fig. 8. In case the way point is located inside the cylinder, processor 22 proceeds to represent the obstacle by a set of shapes, for example as described with reference to Fig. 7.
  • processor 22 may create the cylinder instead of a straight polygon or a three-dimensional polygon, e.g. a shape having a certain polygonal cross section, or a projection of a polygon on a ground plane. If a straight polygon is found and the waypoints are outside of the circumscribing cylinder, then it may be more efficient to use cylinder planning.
  • Figs. 9A-9B show a schematic flowchart illustrating a method for planning a path to avoid unknown obstacles (e.g. obstacles identified by sensors onboard the vehicle); the method may be carried out by processor 22, according to some embodiments of the invention.
  • the obstacles may be detected by the at least one sensor onboard vehicle 50.
  • information about obstacles, the threat map, the state of the vehicle, predefined restrictions and the original path plan may be obtained, for example by the methods described herein.
  • a solution may be set as the original path plan (i.e. without modifications due to threat map for instance).
  • the projection 50a of vehicle 50 movement in the planned path on an obstacle plane may be determined.
  • processor 22 may check if projection 50a overlaps polygon 62. In case projection 50a overlaps polygon 62, processor 22 may check if projection 50a is inside a window in polygon 62. In case projection 50a does not overlap polygon 62 and/or is inside a window, the originally planned path is considered clear at the moment. In case projection 50a overlaps polygon 62 and is not inside a window, as indicated in block 540, solutions may be calculated, for example according to the methods described herein.
  • solutions may be eliminated, for example according to the methods described herein, and an optimal solution may be found if at least one new solution has been found, as indicated in block 550.
  • "no solution mode" may be applied, for example as described in detail herein.
  • a solution may be eliminated due to limitations (or restrictions) on physical maneuvers of the vehicle that are required in a specific solution, due to restricted areas (e.g. a "no fly” zone) or due to general restrictions on positioning of the vehicle (e.g. maximal altitude limitations).
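  • A sketch of such elimination rules follows: candidates are dropped when they demand a maneuver beyond the vehicle's limits, enter a restricted area, or violate a positioning restriction such as a maximum altitude. All limits and zones below are illustrative assumptions.

```python
MAX_TURN_RATE_DEG_S = 45            # assumed vehicle maneuver limitation
MAX_ALTITUDE_M = 120                # assumed regulatory altitude ceiling
NO_FLY_ZONES = [((200, 300), 50)]   # assumed: (center_xy, radius_m)

def is_admissible(solution):
    """Return False if the candidate violates any restriction."""
    if solution["turn_rate"] > MAX_TURN_RATE_DEG_S:
        return False
    for (x, y, z) in solution["path"]:
        if z > MAX_ALTITUDE_M:
            return False
        for (cx, cy), r in NO_FLY_ZONES:
            if (x - cx) ** 2 + (y - cy) ** 2 < r ** 2:
                return False          # path crosses a restricted area
    return True

def eliminate(solutions):
    return [s for s in solutions if is_admissible(s)]
```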
  • an optimal solution may be the solution that maximizes or minimizes a value of a solution function, according to a predefined rule.
  • the solution function may be a weighted average function of one or more parameters such as: time required to complete the assignment, fuel consumption, path length, computational power required for the processing, distance from previous calculated solution and change in movement direction, as well as other parameters.
  • the weight given to each parameter may be changed according to, for example, assignment or mission type, vehicle limitations and the like. It should be appreciated that the highest efficiency value may be attributed to solutions that maximize or minimize the value of the solution function, whereby the highest efficiency value provides the most efficient solution.
  • processor 22 may repeat the steps indicated in blocks 510-550.
  • the method may include an additional safety step, including verifying that the solution is not likely to cause a collision.
  • the solution may be applied, as indicated in block 555, or in case the originally planned path is clear, applying the originally planned path.
  • processor 22 may generate commands aimed at stopping vehicle 50 or turning vehicle 50 back, since no solution was found for avoiding obstacle(s) 60.
  • the exact kind of action taken by processor 22 may depend on the type of the vehicle, for example, whether the vehicle is a ground, aerial or water/underwater vehicle, the exact kind of vehicle, and the vehicle's features.
  • vehicle 50 may move back in order to increase the field of view.
  • processor 22 may plan a path with a predetermined region of interest (ROI) that may be smaller than the field of view, for instance corresponding to the dimensions of vehicle 50 and/or to characteristics of sensor 20 (e.g., maximal field of view angle), such that only obstacles 60 within the predetermined ROI may be considered for path planning.
  • dimensions of the ROI may change in accordance with distance from sensor 20, for example being largest near sensor 20 and decreasing with distance.
  • the ROI may be dynamic and move while vehicle 50 moves, for instance the ROI may be dynamic according to at least one of the current maneuver, the detected obstacle 60, the velocity of vehicle 50, the type of sensor 20, etc.
  • the ROI may dynamically move to include at least one obstacle 60 to determine if that obstacle 60 continues to be a threat and/or gets closer to vehicle 50. It should be noted that planning a path based on obstacles within the predetermined ROI instead of the entire field of view may allow reduction of computing resources and/or reduction of computing time.
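  • A minimal sketch of such an ROI filter is given below: only obstacles inside a region ahead of the sensor are passed to the path planner, with a cross-section that shrinks with distance as described above. All dimensions are illustrative assumptions.

```python
def in_roi(obstacle_xyz, max_range=50.0, near_half_width=10.0, far_half_width=2.0):
    """Obstacle coordinates are in the sensor frame, x pointing forward."""
    x, y, z = obstacle_xyz
    if not (0.0 < x <= max_range):
        return False
    # Half-width tapers linearly from the near value to the far value.
    half_w = near_half_width + (far_half_width - near_half_width) * (x / max_range)
    return abs(y) <= half_w and abs(z) <= half_w

obstacles = [(5, 1, 0), (45, 6, 0), (80, 0, 0)]
print([o for o in obstacles if in_roi(o)])   # -> [(5, 1, 0)]
```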
  • Processor 22 may use check points in order to return to previous locations and start looking for a solution again.
  • way points provided in the original mission may be used as initial check points.
  • processor 22 may save a new check point to which the vehicle may be ordered to return, for example when a 'No Solution' mode is activated. Once vehicle 50 returns to a check point, processor 22 may try again to find a solution.
  • Memory 23 may store a map of check points that may be created and/or updated by processor 22. In some embodiments, memory 23 may also store the navigation map and/or obstacles.
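  • A sketch of such check-point handling: the original way points seed the list, new check points are pushed as the vehicle advances, and "No Solution" pops the vehicle back to the most recent check point to retry. The structure is an illustrative assumption.

```python
class CheckPointLog:
    def __init__(self, waypoints):
        self.points = list(waypoints)       # original way points seed the list

    def register(self, position):
        self.points.append(position)        # new check point along the way

    def fall_back(self):
        """Return the check point to retreat to in 'No Solution' mode."""
        if len(self.points) > 1:
            self.points.pop()               # abandon the dead-end location
        return self.points[-1]

log = CheckPointLog([(0, 0, 10)])
log.register((50, 0, 10))
print(log.fall_back())   # -> (0, 0, 10)
```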
  • At least two autonomous vehicles may operate in cooperation wherein a first vehicle maps the environment for the planned path for the assignment, and the second (or first and second) vehicle moves through the mapped environment.
  • data regarding obstacles that were recorded during previous assignments may also be taken into consideration during future path planning.
  • a solution for a path may be determined without determination of windows at the obstacle.
  • Such obstacles 60 may include static obstacles such as a building 610 at a distance of 10 meters and a tree 620 at a distance of 20 meters from vehicle 50, and dynamic obstacles such as a cooperative vehicle 630 at a distance of 15 meters and a bird 640 at a distance of 200 meters from vehicle 50, where the distances may be displayed to the operator.
  • Processor 22 may calculate expected movement of cooperative vehicle 630 and determine future trajectory 635 of dynamic vehicle 630 (e.g., a cooperative vehicle) towards the first waypoint.
  • calculated navigation solution 650 may be calculated so as not to overlap with future trajectory 635 of dynamic vehicle 630 (e.g., a cooperative vehicle).
  • paths of dynamic elements such as calculated navigation solution 650 or future trajectory 635 of dynamic vehicle 630 may also be indicated on the display with distinctive markings.
  • at least one obstacle 60 may be indicated on the display with distinctive markings, for example an obstacle may be indicated accordingly with at least one contour line (e.g., such as contour of obstacle 620 in Fig. 10).

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Traffic Control Systems (AREA)

Abstract

The present invention relates to systems and methods for autonomous guidance of one or more vehicles that are at least partially controlled by a computer, the system comprising an onboard unit, the onboard unit being onboard a vehicle, and a command center configured to allow communication between the onboard unit and the command center, wherein the onboard unit comprises a processor and at least one sensor, and the system is configured to receive information about a planned path and about obstacles from at least one of the sensors and the command center, to calculate a set of guidance solutions according to at least one of an obstacle avoidance algorithm and a predefined guidance rule, to choose a guidance solution having maximal efficiency according to the predefined guidance rule, and to apply the chosen guidance solution.
PCT/IL2017/050390 2016-03-30 2017-03-29 System and method for autonomous guidance of vehicles WO2017168423A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IL244838 2016-03-30
IL244838A IL244838A0 (en) 2016-03-30 2016-03-30 System and method for autonomous driving of vehicles

Publications (1)

Publication Number Publication Date
WO2017168423A1 true WO2017168423A1 (fr) 2017-10-05

Family

ID=57300880

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2017/050390 WO2017168423A1 (fr) System and method for autonomous guidance of vehicles

Country Status (2)

Country Link
IL (1) IL244838A0 (fr)
WO (1) WO2017168423A1 (fr)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130124020A1 (en) * 2004-06-18 2013-05-16 L-3 Unmanned Systems, Inc. Autonomous collision avoidance system for unmanned aerial vehicles
US8712679B1 (en) * 2010-10-29 2014-04-29 Stc.Unm System and methods for obstacle mapping and navigation
US20150260526A1 (en) * 2014-03-15 2015-09-17 Aurora Flight Sciences Corporation Autonomous vehicle navigation system and method

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109933092A (zh) * 2017-12-18 2019-06-25 北京京东尚科信息技术有限公司 Aircraft obstacle avoidance method and device, and aircraft
CN109933092B (zh) * 2017-12-18 2022-07-05 北京京东乾石科技有限公司 Aircraft obstacle avoidance method and device, readable storage medium, and aircraft
WO2019125489A1 (fr) * 2017-12-22 2019-06-27 Nissan North America, Inc. Solution path overlay interfaces for autonomous vehicles
US11377123B2 (en) 2017-12-22 2022-07-05 Nissan North America, Inc. Solution path overlay interfaces for autonomous vehicles
US10803759B2 (en) 2018-01-03 2020-10-13 Qualcomm Incorporated Adjustable object avoidance proximity threshold based on presence of propeller guard(s)
US10636314B2 (en) 2018-01-03 2020-04-28 Qualcomm Incorporated Adjusting flight parameters of an aerial robotic vehicle based on presence of propeller guard(s)
US10719705B2 (en) 2018-01-03 2020-07-21 Qualcomm Incorporated Adjustable object avoidance proximity threshold based on predictability of the environment
US10717435B2 (en) 2018-01-03 2020-07-21 Qualcomm Incorporated Adjustable object avoidance proximity threshold based on classification of detected objects
US10720070B2 (en) 2018-01-03 2020-07-21 Qualcomm Incorporated Adjustable object avoidance proximity threshold of a robotic vehicle based on presence of detected payload(s)
TWI804538B (zh) * 2018-01-03 2023-06-11 美商高通公司 Adjustable object avoidance proximity threshold of a robotic vehicle based on presence of detected payload(s)
WO2019135849A1 (fr) * 2018-01-03 2019-07-11 Qualcomm Incorporated Adjustable object avoidance proximity threshold of a robotic vehicle based on presence of detected payload(s)
WO2019148188A1 (fr) * 2018-01-29 2019-08-01 Interdigital Patent Holdings, Inc. Methods for mobile edge computing (MEC) deployment for unmanned aerial system traffic management (UTM) applications
EP3757941B1 (fr) * 2018-02-20 2023-06-07 SoftBank Corp. Image processing device, flying object, and program
CN110297487A (zh) * 2018-03-23 2019-10-01 日本电产新宝株式会社 Moving body, management device, and moving body system
CN113574524A (zh) * 2018-10-18 2021-10-29 自动智能科技有限公司 Method and system for obstacle detection
FR3105545A1 (fr) * 2019-12-19 2021-06-25 Thales Device and method for automatically proposing resolutions of air conflicts
WO2021122324A1 (fr) * 2019-12-19 2021-06-24 Thales Device and method for automatically proposing resolutions of air conflicts
WO2024098438A1 (fr) * 2022-04-28 2024-05-16 西安交通大学 Multi-agent beyond-visual-range networked collaborative perception and dynamic decision-making method and related device

Also Published As

Publication number Publication date
IL244838A0 (en) 2016-07-31

Similar Documents

Publication Publication Date Title
WO2017168423A1 (fr) System and method for autonomous guidance of vehicles
EP3619591B1 (fr) Lead drone
US20220124303A1 (en) Methods and systems for selective sensor fusion
US11854413B2 (en) Unmanned aerial vehicle visual line of sight control
US20210358315A1 (en) Unmanned aerial vehicle visual point cloud navigation
EP3901728B1 (fr) Methods and system for autonomous landing
CN109923492B (zh) Flight path determination
CN111295627B (zh) Underwater piloting drone system
EP3039381B1 (fr) Unmanned vehicle searches
CN110226143B (zh) Method of leading drone
US20200117220A1 (en) Swarm path planner system for vehicles
CN109478068A (zh) System and method for dynamically controlling parameters for processing sensor output data for collision avoidance and path planning
US11726501B2 (en) System and method for perceptive navigation of automated vehicles
JP2015006874A (ja) Systems and methods for autonomous landing using a three-dimensional evidence grid
US20190066522A1 (en) Controlling Landings of an Aerial Robotic Vehicle Using Three-Dimensional Terrain Maps Generated Using Visual-Inertial Odometry
Clark Collision avoidance and navigation of UAS using vision-based proportional navigation
Upadhyay et al. Multiple Drone Navigation and Formation Using Selective Target Tracking-Based Computer Vision. Electronics 2021, 10, 2125
Naveenkumar et al. Autonomous Drone Using Time-of-Flight Sensor for Collision Avoidance
Adamski et al. Analysis of methods and control systems of unmanned platforms
Zhou et al. On-board sensors-based indoor navigation techniques of micro aerial vehicle
de Oliveira et al. Flexible Trajectory Planning Framework using Optimal Control for Rotary and Fixed-Wing Aircrafts Mission Planning and Target-Pursuit

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17773441

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 07-02-2019)

122 Ep: pct application non-entry in european phase

Ref document number: 17773441

Country of ref document: EP

Kind code of ref document: A1