US10019005B2 - Autonomous vehicle control system - Google Patents

Autonomous vehicle control system

Info

Publication number
US10019005B2
US10019005B2 (application US15/266,708; publication US20170097640A1)
Authority
US
United States
Prior art keywords
autonomous vehicle
operational
decision
mission
plans
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
US15/266,708
Other versions
US20170097640A1 (en)
Inventor
Walter Wang
Jerome H. Wei
Bradley Feest
Benjamin Najar-Robles
Michael Z. Lim
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Northrop Grumman Systems Corp
Original Assignee
Northrop Grumman Systems Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Northrop Grumman Systems Corp filed Critical Northrop Grumman Systems Corp
Priority to US15/266,708 priority Critical patent/US10019005B2/en
Assigned to NORTHROP GRUMMAN SYSTEMS CORPORATION (assignment of assignors interest; see document for details). Assignors: FEEST, BRADLEY; LIM, MICHAEL Z.; NAJAR-ROBLES, BENJAMIN; WANG, WALTER; WEI, JEROME H.
Publication of US20170097640A1 publication Critical patent/US20170097640A1/en
Application granted granted Critical
Publication of US10019005B2 publication Critical patent/US10019005B2/en
Legal status: Active

Classifications

    • G — PHYSICS
    • G05 — CONTROLLING; REGULATING
    • G05D — SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 — Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/0088 — Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots, characterized by the autonomous decision making process, e.g. artificial intelligence, predefined behaviours
    • G06N 7/005
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06N — COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 7/00 — Computing arrangements based on specific mathematical models
    • G06N 7/01 — Probabilistic graphical models, e.g. probabilistic networks

Definitions

  • the present disclosure relates generally to artificial intelligence systems, and specifically to an autonomous vehicle control system.
  • Unmanned vehicles are becoming increasingly more common in a number of tactical missions, such as in surveillance and/or combat missions.
  • as computer processing and sensor technology have advanced significantly, unmanned vehicles can be operated in an autonomous manner.
  • a given unmanned vehicle can be operated based on sensors configured to monitor external stimuli, and can be programmed to respond to the external stimuli and to execute mission objectives that are either programmed or provided as input commands, as opposed to being operated by a remote pilot.
  • One example includes an autonomous vehicle control system.
  • the system includes an operational plan controller to maintain operational plans that each correspond to a predetermined set of behavioral characteristics of an associated autonomous vehicle based on situational awareness data provided from on-board sensors of the autonomous vehicle and mission control data provided from a user interface.
  • the system also includes a decision-making algorithm to select one of the operational plans for operational behavior of the autonomous vehicle based on the situational awareness data and the mission control data at a given time and to provide an intent decision based on the situational awareness data and the selected one of the operational plans.
  • the system further includes an execution engine to provide control outputs to operational components of the autonomous vehicle for navigation and control based on the selected one of the operational plans and in response to the intent decision.
  • Another example includes a method for controlling an autonomous vehicle.
  • the method includes providing mission control data to an autonomous vehicle control system associated with the autonomous vehicle via a user interface.
  • the method also includes generating situational awareness data associated with the autonomous vehicle in response to receiving sensor data provided from on-board sensors.
  • the method also includes selecting one of a plurality of operational plans that each correspond to a predetermined set of behavioral characteristics of the autonomous vehicle based on the situational awareness data and the mission control data.
  • the method further includes providing control outputs to operational components associated with the autonomous vehicle for navigation and control of the autonomous vehicle in response to the sensor data and based on the selected one of the plurality of operational plans.
  • Another example includes an autonomous vehicle.
  • the vehicle includes on-board sensors configured to generate situational awareness data associated with situational awareness conditions of the autonomous vehicle.
  • the vehicle also includes operational components configured to provide navigation and control of the autonomous vehicle in response to control outputs.
  • the vehicle also includes an autonomous vehicle control system operating on a computer readable medium.
  • the autonomous vehicle control system includes an operational plan controller configured to maintain a plurality of operational plans that each correspond to a predetermined set of behavioral characteristics of an associated autonomous vehicle based on the situational awareness data and mission control data.
  • the system also includes a decision-making algorithm configured to select one of the plurality of operational plans for operational behavior of the autonomous vehicle based on the sensor data and the mission control data at a given time and to provide an intent decision based on the situational awareness data and based on the selected one of the plurality of operational plans.
  • the system further includes an execution engine configured to provide the control outputs to the operational components based on the selected one of the plurality of operational plans and in response to the intent decision.
  • FIG. 1 illustrates an example of an autonomous vehicle system.
  • FIG. 2 illustrates an example of an operational plan controller.
  • FIG. 3 illustrates an example of a utility calculation system for a decision-making algorithm.
  • FIG. 4 illustrates an example of an intent generation system.
  • FIG. 5 illustrates an example of a method for controlling an autonomous vehicle.
  • the present disclosure relates generally to artificial intelligence systems, and specifically to an autonomous vehicle control system.
  • An autonomous vehicle control system is implemented, for example, at least partially on a computer readable medium, such as a processor that is resident on an associated autonomous vehicle.
  • the autonomous vehicle can be configured as an unmanned aerial vehicle (UAV).
  • the autonomous vehicle thus includes on-board sensors that are configured to generate sensor data that is associated with situational awareness of the autonomous vehicle, and further includes operational components that are associated with navigation and control of the autonomous vehicle (e.g., flaps, an engine, ordnance, or other operational components).
  • the autonomous vehicle control system can thus provide autonomous control of the autonomous vehicle based on receiving the sensor data and mission control data (e.g., defining parameters of a given mission) and by providing output signals to the operational components.
  • the autonomous vehicle control system includes an operational plan controller, a decision-making algorithm, a utility calculation system, and an execution engine.
  • the operational plan controller is configured to maintain predetermined operational plans that each correspond to a predetermined set of behavioral characteristics of the autonomous vehicle based on the sensor data and the mission control data, such as provided from a user interface.
  • the utility calculation system is configured to calculate a total utility factor based on a plurality of behavioral characteristics.
  • the decision-making algorithm is configured to select one of the plurality of operational plans for operational behavior of the autonomous vehicle based on the sensor data and the mission control data at a given time and to provide an intent decision based on situational awareness characteristics provided via the sensor data and the total utility factor for a given decision instance.
  • the execution engine is configured to provide the outputs to the operational components for navigation and control of the autonomous vehicle based on the selected one of the operational plans and in response to the intent decision at the given decision instance.
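  • the pipeline described above (operational plan controller, decision-making algorithm, execution engine) could be sketched as follows; all function names and the toy selection logic are illustrative assumptions, not the patent's implementation:

```python
# Hypothetical sketch of one control cycle: sensor input data (SENS_IN) and
# mission control data (CTRL) in, control outputs (OP_OUT) out.

def select_plan(sens_in, ctrl):
    """Operational plan controller: pick a plan from sensor + mission data."""
    if sens_in.get("threat_detected"):
        return "caution"
    if ctrl.get("behind_schedule"):
        return "expedite"
    return "nominal"

def decide_intent(sens_in, plan):
    """Decision-making algorithm: produce an intent decision under the plan."""
    if plan == "caution" and sens_in.get("obstacle_close"):
        return "slow_down"
    return "proceed"

def control_outputs(plan, intent):
    """Execution engine: map plan + intent to operational-component commands."""
    speed = {"nominal": 1.0, "expedite": 1.3, "caution": 0.6}[plan]
    if intent == "slow_down":
        speed *= 0.5
    return {"throttle": speed}

def control_cycle(sens_in, ctrl):
    plan = select_plan(sens_in, ctrl)
    intent = decide_intent(sens_in, plan)
    return plan, intent, control_outputs(plan, intent)
```

  • the point of the sketch is the separation of concerns: plan selection, intent generation, and actuation are distinct stages, each consuming the outputs of the previous one.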
  • FIG. 1 illustrates an example of an autonomous vehicle system 10 .
  • the autonomous vehicle system 10 includes an autonomous vehicle 12 .
  • the term “autonomous vehicle” describes an unmanned vehicle that operates in an autonomous manner, such that the autonomous vehicle 12 is not piloted or operated in any continuous manner, but instead operates continuously based on a programmed set of instructions that dictate motion, maneuverability, and the execution of actions directed toward completing mission objectives in response to sensor data associated with external stimuli.
  • the autonomous vehicle 12 can be configured as an unmanned aerial vehicle (UAV) that operates in an autonomous programmable manner for any of a variety of different purposes.
  • the autonomous vehicle 12 includes an autonomous vehicle control system 14 that can be programmed such that the autonomous vehicle 12 can operate autonomously to complete predetermined mission objectives in response to inputs, such as provided via sensor data and mission control data.
  • the autonomous vehicle 12 includes a set of on-board sensors 16 that can provide sensor input data SENS_IN to the autonomous vehicle control system 14 .
  • the on-board sensors 16 can include optical sensors, one or more cameras, and/or other types of electro-optical imaging sensors (e.g., radar, lidar, or a combination thereof).
  • the on-board sensors 16 can also include location and/or situational awareness sensors (e.g., a global navigation satellite system (GNSS) receiver). Therefore, the on-board sensors 16 can be configured to obtain situational awareness data that is provided as the sensor input data SENS_IN to the autonomous vehicle control system 14 .
  • the autonomous vehicle 12 can include operational components 18 that can correspond to navigation and control devices for operating the autonomous vehicle 12 and for completing mission objectives.
  • the operational components 18 can include navigation components (e.g., wing, body, and/or tail flaps), an engine, ordnance, and/or other operational components.
  • the autonomous vehicle control system 14 can provide control outputs OP_OUT to the operational components 18 to control the operational components 18 . Therefore, the autonomous vehicle 12 can operate based on providing the control outputs OP_OUT to the operational components 18 in response to receiving the sensor input data SENS_IN via the on-board sensors 16 .
  • the autonomous vehicle control system 14 can be configured as one or more processors 20 that are programmed to generate the control outputs OP_OUT in response to the sensor input data SENS_IN to control the autonomous vehicle 12 .
  • the processor(s) 20 can thus execute programmable instructions, such as stored in memory (not shown).
  • the processor(s) 20 constituting the autonomous vehicle control system 14 can be programmed via a user interface 22 that is associated with the autonomous vehicle system 10 .
  • the user interface 22 can be configured as a computer system or graphical user interface (GUI) that is accessible via a computer (e.g., via a network).
  • the user interface 22 can be configured, for example, to program the autonomous vehicle control system 14 , to define and provide mission objectives, and/or to provide limited or temporary control of the autonomous vehicle 12 , such as in response to an override request by the autonomous vehicle control system 14 , as described in greater detail herein.
  • the user interface 22 is demonstrated as providing mission control data CTRL (e.g., wirelessly) that can correspond to predetermined mission parameters 24 that describe the mission definitions and objectives, such as including a predetermined navigation course, parameters for navigating the predetermined navigation course, at least one mission objective, and behaviors for accomplishing the mission objective(s).
  • the mission control data CTRL can also provide program data for programming behavioral characteristics and/or vehicle piloting signals for providing user override control, such as described in greater detail herein.
  • while the user interface 22 is described previously as a computer system or GUI, as another example, the user interface 22 can be configured as one or more chips or circuit boards (e.g., printed circuit boards (PCBs)) that can be installed in the autonomous vehicle control system 14 , such that the user interface 22 can be pre-programmed with the mission control data CTRL and can be accessed by the autonomous vehicle control system 14 .
  • the processor(s) 20 can be programmed via the mission control data CTRL to implement an operational plan controller 26 , a decision-making algorithm 28 , and an execution engine 30 .
  • the operational plan controller 26 can control an operating plan associated with the autonomous vehicle 12 , such as corresponding to a current behavioral mode in which the autonomous vehicle 12 operates.
  • the operational plan controller 26 can maintain a plurality of selectable operational plans that each correspond to a predetermined set of behavioral characteristics of the autonomous vehicle 12 .
  • the operational plan controller 26 can set the autonomous vehicle control system 14 to operate in a given operational plan at a given duration of time based on the sensor input data SENS_IN and/or the mission control data CTRL provided from the user interface 22 .
  • the operational plan controller 26 can be configured to set a given operational plan based on the decision-making algorithm 28 in response to a given intent decision, such as at a given decision instance.
  • the term “intent decision” refers to a decision that is required to be made by the decision-making algorithm 28 that is consistent with predetermined parameters associated with control of the autonomous vehicle 12 and programmable behavioral characteristics of the autonomous vehicle control system 14 to control the autonomous vehicle 12 in response to unexpected circumstances.
  • the term “decision instance” refers to a given time and/or set of circumstances that are dependent on unexpected and/or unplanned external stimuli (e.g., provided via the sensor input data SENS_IN) that require a decision via the decision-making algorithm 28 to dictate behavior of the autonomous vehicle 12 .
  • FIG. 2 illustrates an example of an operational plan controller 50 .
  • the operational plan controller 50 can be implemented as hardware, software, firmware, or a combination thereof that is executable by the processor(s) 20 .
  • the operational plan controller 50 can correspond to the operational plan controller 26 in the example of FIG. 1 . Therefore, reference is to be made to the example of FIG. 1 in the following description of the example of FIG. 2 .
  • the operational plan controller 50 can select an operating plan associated with the autonomous vehicle 12 , such as corresponding to a current behavioral mode in which the autonomous vehicle 12 operates.
  • the selected operating plan is provided as a command CURR_PLN, which can be configured to trigger one or more routines corresponding to the selected operating plan (e.g., in the autonomous vehicle control system 14 or in the operational plan controller 50 itself).
  • the operational plan controller 50 includes a nominal plan 52 , an expedite plan 54 , a caution plan 56 , a stop plan 58 , and a user request plan 60 .
  • the nominal plan 52 can be associated with a nominal operational behavior of the autonomous vehicle 12 and can be based on the mission control data CTRL.
  • the nominal plan 52 can be a default operational plan that the operational plan controller 50 sets as the operational plan for the autonomous vehicle control system 14 when all other systems are stable, such as during initialization (e.g., takeoff), completion (e.g., landing), and/or during the mission defined by the mission parameters 24 , such as absent perturbations by unexpected and/or unplanned external factors.
  • the mission control data CTRL can dictate the external conditions as to when the autonomous vehicle control system 14 should be set to the nominal plan 52 .
  • the expedite plan 54 can be associated with an expedited operational behavior of the autonomous vehicle 12 relative to the nominal operational behavior of the nominal plan 52 and can be based on the mission control data CTRL.
  • the mission control data CTRL can dictate when the autonomous vehicle control system 14 should switch from the nominal plan 52 to the expedite plan 54 based on external conditions or based on the mission parameters 24 .
  • for example, delays in the mission defined by the mission parameters 24 based on previous circumstances (e.g., operating in the caution plan 56 , as described in greater detail herein) can prompt the switch to the expedite plan 54 .
  • the expedite plan 54 can be implemented by the operational plan controller 50 for the autonomous vehicle control system 14 when all other systems are stable during the mission defined by the mission parameters 24 absent perturbations by unexpected and/or unplanned external factors to attempt to recapture time.
  • the expedite plan 54 can be implemented in situations when the decision-making algorithm 28 calculates that the utility of an expedited mission operation outweighs the utility of increased risk to the autonomous vehicle 12 or to completion of the mission objective(s).
  • the caution plan 56 can be associated with a reduced-risk operational behavior of the autonomous vehicle 12 relative to the nominal operational behavior of the nominal plan 52 and can be based on the mission control data CTRL.
  • the mission control data CTRL can dictate when the autonomous vehicle control system 14 should switch from the nominal plan 52 to the caution plan 56 based on external conditions, such as perceived hazards and/or threats based on the sensor input data SENS_IN. For example, upon a determination of hazardous environment conditions, an external obstacle, or an imminent or detected threat that may require evasive maneuvering, the operational plan controller 50 can set or can be instructed to set the autonomous vehicle control system 14 to the caution plan 56 .
  • the autonomous vehicle control system 14 can dictate a slower speed for the autonomous vehicle 12 , such as to provide capability for reducing risks by providing more time for reaction and/or maneuvering.
  • the caution plan 56 may force deviation from the predetermined navigation course associated with completion of the mission objectives, as defined by the mission parameters 24 , while still maintaining a rapid speed for the autonomous vehicle 12 .
  • the autonomous vehicle control system 14 can decide that operation of the autonomous vehicle 12 in a predetermined navigation course defined by the mission parameters 24 in the nominal plan 52 is too risky, such as described in greater detail herein, and can thus command the operational plan controller 50 to switch to the caution plan 56 as the current operational plan CURR_PLN.
  • the stop plan 58 can be associated with ceased operational behavior of the autonomous vehicle 12 , such as in response to detecting an imminent collision with an obstacle or another moving vehicle.
  • the stop plan 58 can be associated with an autonomous land vehicle, or an autonomous aerial vehicle that is preparing to take off or has landed.
  • the user request plan 60 can correspond to a situation in which the autonomous vehicle control system 14 transmits a request for instructions from the user interface 22 .
  • the autonomous vehicle control system 14 can be switched to the user request plan 60 in response to the decision-making algorithm 28 determining an approximately equal utility or probability in determining a given intent decision at a respective decision instance.
  • the user request plan 60 can accompany another operational plan of the operational plan controller 50 , such as one of the nominal plan 52 , the caution plan 56 , or the stop plan 58 , such that the autonomous vehicle 12 can continue to operate in a predetermined manner according to the selected operational plan CURR_PLN while awaiting additional instructions as dictated by the user request plan 60 .
  • the operational plan controller 50 can also include at least one additional plan 62 that can dictate a respective at least one additional behavioral mode in which the autonomous vehicle 12 can operate.
  • the operational plan controller 50 is not limited to providing the current plan CURR_PLN as one of the nominal plan 52 , the expedite plan 54 , the caution plan 56 , the stop plan 58 , and the user request plan 60 .
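  • the plan-maintenance behavior described above can be sketched minimally as follows; the class and method names are illustrative assumptions, with only the plan names taken from the text:

```python
# Hypothetical sketch of an operational plan controller: it maintains the
# selectable plans, defaults to the nominal plan, and lets the user request
# plan accompany whichever plan is currently selected as CURR_PLN.
from enum import Enum

class Plan(Enum):
    NOMINAL = "nominal"
    EXPEDITE = "expedite"
    CAUTION = "caution"
    STOP = "stop"

class OperationalPlanController:
    def __init__(self):
        self.curr_pln = Plan.NOMINAL      # nominal is the default plan
        self.user_request_active = False  # user request accompanies CURR_PLN

    def set_plan(self, plan):
        """Commanded (e.g., by the decision-making algorithm) at a decision
        instance to switch the current operational plan."""
        self.curr_pln = plan
        return self.curr_pln

    def request_user_input(self):
        """Enter the user request plan: the vehicle keeps operating under the
        selected CURR_PLN while awaiting instructions from the user interface."""
        self.user_request_active = True
```

  • note that the user request state is modeled as a flag rather than a plan of its own, mirroring the text's point that it accompanies another operational plan rather than replacing it.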
  • the decision-making algorithm 28 includes a utility calculation system 32 and an intent generation system 34 .
  • the utility calculation system 32 is configured to calculate a total utility factor (TUF) for each of the operational plans (e.g., the nominal plan 52 , the expedite plan 54 , the caution plan 56 , the stop plan 58 , the user request plan 60 , and/or the additional plan(s) 62 ) that are maintained by the operational plan controller 26 .
  • the calculation of the TUF can be based on behavioral characteristics that can correspond to characteristics of the autonomous vehicle 12 , user inputs provided via the user interface 22 , avoidance of potential obstacles (e.g., external objects, such as other aircraft, terrain features, buildings, etc.), integrity of the sensors 16 , and/or predetermined performance characteristics of the operational components 18 of the autonomous vehicle 12 .
  • the utility calculation system 32 can be configured to command the operational plan controller 26 to select the operational plan based on the TUF calculated for each of the operational plans (e.g., based on the highest TUF).
  • FIG. 3 illustrates an example of a utility calculation system 100 for a decision-making algorithm (e.g., the decision-making algorithm 28 ).
  • the utility calculation system 100 can be implemented as hardware, software, firmware, or a combination thereof that is executable by the processor(s) 20 .
  • the utility calculation system 100 can correspond to the utility calculation system 32 in the example of FIG. 1 . Therefore, reference is to be made to the example of FIG. 1 in the following description of the example of FIG. 3 .
  • the utility calculation system 100 can implement a variety of predetermined behavioral factors to calculate the TUF that can dictate the operational behavior of the autonomous vehicle control system 14 .
  • the behavioral factors include performance utility factors 102 associated with performance characteristics of the autonomous vehicle 12 and/or characteristics of the mission defined by the mission parameters 24 .
  • the performance utility factors 102 can include timing associated with the mission, such as defined by the mission parameters 24 , can include capabilities of the autonomous vehicle 12 , such as velocity, handling, maneuverability, response speed, turning radii, and/or other navigation characteristics (e.g., including motion in six degrees of freedom), changes to performance based on ordnance loading, and/or a variety of other performance characteristics of the autonomous vehicle 12 .
  • the behavioral factors also include operator utility factors 104 associated with the mission control data CTRL and availability of operator inputs for control of the autonomous vehicle 12 .
  • the operator utility factors 104 can include timing associated with response time for communications with a user via the user interface 22 , such as relating to a current velocity of the autonomous vehicle 12 , as well as a level of detail required to provide user input (e.g., in response to requests that can be provided in the user request plan 60 ).
  • the behavioral factors can also include avoidance safety utility factors 106 associated with consequences of collision of the autonomous vehicle 12 .
  • the avoidance safety utility factors 106 can account for velocity of the autonomous vehicle 12 relative to a type of potential obstacle with which the autonomous vehicle 12 can have an imminent collision, such as based on an evaluation of static objects (e.g., terrain) relative to dynamic objects (e.g., other vehicles, threats, etc.).
  • the behavioral factors can further include integrity safety utility factors 108 that are associated with an impact of environmental conditions on the on-board sensors 16 and operational components 18 associated with the autonomous vehicle 12 .
  • the integrity safety utility factors 108 can be associated with the effects of weather on the on-board sensors 16 and operational components 18 , such as the effects of rain occluding optical components of the on-board sensors 16 , the effects of rain on the grip of tires to a concrete airport tarmac, the effect of turbulence on the operational components 18 , the effect of clouds on the sensors 16 , etc.
  • the utility calculation system 100 also includes respective programmable weights that are selectively assigned to the plurality of behavioral characteristics.
  • each of the programmable weights can be provided as a portion of the mission control data CTRL provided via the user interface 22 .
  • the programmable weights include performance utility weight(s) 110 (“PU WEIGHT(S)”) that can be associated with the performance utility factors 102 , operator utility weight(s) 112 (“OU WEIGHT(S)”) that can be associated with the operator utility factors 104 , avoidance safety utility weight(s) 114 (“ASU WEIGHT(S)”) that can be associated with the avoidance safety utility factors 106 , and integrity safety utility weight(s) 116 (“ISU WEIGHT(S)”) that can be associated with the integrity safety utility factors 108 .
  • Each of the performance utility weight(s) 110 , operator utility weight(s) 112 , avoidance safety utility weight(s) 114 , and integrity safety utility weight(s) 116 can include one or more weighted multiplicative factors that can emphasize or de-emphasize certain ones of the behavioral factors (e.g., in each of the performance utility factors 102 , operator utility factors 104 , avoidance safety utility factors 106 , and integrity safety utility factors 108 ) at a given time.
  • the selection of the performance utility weight(s) 110 , operator utility weight(s) 112 , avoidance safety utility weight(s) 114 , and/or integrity safety utility weight(s) 116 can be based, for example, on the mission control data CTRL at various stages of a given mission defined by the mission parameters 24 . Therefore, a user can implement the user interface 22 to selectively and programmably set the weights of the respective performance utility weight(s) 110 , operator utility weight(s) 112 , avoidance safety utility weight(s) 114 , and integrity safety utility weight(s) 116 at various stages of the mission defined by the mission parameters 24 to dictate the operational plan of the autonomous vehicle control system 14 for operating the autonomous vehicle 12 .
  • the weighted performance utility factors 102 (demonstrated as WPU), the weighted operator utility factors 104 (demonstrated as WOU), the weighted avoidance safety utility factors 106 (demonstrated as WASU), and the weighted integrity safety utility factors 108 (demonstrated as WISU) are provided to a TUF calculation component 118 , which is configured to calculate the TUF for each given one of the operational plans (e.g., the nominal plan 52 , the expedite plan 54 , the caution plan 56 , the stop plan 58 , and/or the user request plan 60 ).
  • the TUF calculation component 118 can receive situational awareness data via the sensor input data SENS_IN, such that the TUF can be modified based on external considerations (e.g., weather, threats, potential obstacles, etc.). Therefore, the TUF calculation component 118 can calculate the TUF for each of the operational plans, and can provide the calculated TUF for each of the operational plans to the operational plan controller 26 for selection of a given one of the operational plans at a given time.
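  • the weight-and-sum structure described above can be illustrated with a minimal sketch; the additive form of the TUF, the factor names, and the numeric values are assumptions for illustration, not the patent's actual formula:

```python
# Hypothetical TUF sketch: each behavioral factor is scaled by its
# programmable weight and the weighted factors are summed per plan,
# and plans are then ranked so the controller can pick the highest TUF.

def total_utility_factor(factors, weights):
    """TUF as a weighted sum of behavioral factors (WPU + WOU + WASU + WISU)."""
    return sum(weights[name] * value for name, value in factors.items())

def rank_plans(factors_by_plan, weights):
    """Compute the TUF for each operational plan, sorted highest first."""
    tufs = {plan: total_utility_factor(f, weights)
            for plan, f in factors_by_plan.items()}
    return sorted(tufs.items(), key=lambda kv: kv[1], reverse=True)
```

  • because the weights are programmable per mission stage, the same raw factors can rank the plans differently as the mission progresses, which matches the text's point that the user interface can re-weight the factors to steer plan selection.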
  • FIG. 4 illustrates an example of an intent generation system 150 .
  • the intent generation system 150 can be implemented as hardware, software, firmware, or a combination thereof that is executable by the processor(s) 20 .
  • the intent generation system 150 is demonstrated in the example of FIG. 4 as a motion intent generation system to provide decision-making capability in the context of motion of the autonomous vehicle 12 , such as for the autonomous vehicle 12 moving on an airfield tarmac.
  • the intent generation system 150 is demonstrated as a collision avoidance intent generator to provide an intent decision for operation of the autonomous vehicle 12 to avoid a collision of the autonomous vehicle 12 with a potential obstacle (e.g., another aircraft on the tarmac).
  • the intent generation system 150 can correspond to the intent generation system 34 in the example of FIG. 1 . Therefore, reference is to be made to the example of FIG. 1 in the following description of the example of FIG. 4 .
  • the intent generation system 150 includes an intent generator 152 that is configured to provide the intent decision for a given decision instance.
  • the intent generator 152 is demonstrated as including a probability calculator 154 that is configured to calculate a set of probabilities associated with predetermined possible outcomes for a given decision instance.
  • the set of probabilities can include a probability of collision with another aircraft that approaches the same intersection of the tarmac as the autonomous vehicle 12 .
  • the possible courses of action for the autonomous vehicle 12 could include: proceed at the same speed, slow down, speed up, stop, turn left, turn right, go straight, etc. Therefore, the intent generator 152 is configured to provide an intent decision based on the set of probabilities, such as to provide the intent decision based on a most acceptable relative probability of the set of probabilities.
  • in the example of FIG. 4 , the probability calculator 154 calculates the set of probabilities based on the situational awareness characteristics provided via the sensor input data SENS_IN and the selected operational plan, demonstrated as “CURR_PLN”.
  • the probabilities can be calculated by the probability calculator 154 based on any of a variety of algorithms, such as a Bayesian network, influence diagrams, and/or a variety of other decision theory calculations.
  • the situational awareness characteristics can be provided via the sensor input data SENS_IN are demonstrated as including a relative distance 156 , a relative velocity 158 , a relative trajectory 160 , and environmental considerations 162 .
  • the relative distance 156 , the relative velocity 158 , and the relative trajectory 160 can correspond to respective motion features of autonomous vehicle 12 relative to one or more potential obstacles, such as another aircraft on the tarmac (e.g., at an intersection of the tarmac).
  • the relative distance 156 can thus correspond to a relative distance between the autonomous vehicle 12 and the potential obstacle, such as with respect to the intersection or with respect to each other.
  • the relative velocity 158 can thus correspond to a relative velocity between the autonomous vehicle 12 and the potential obstacle with respect to each other or with respect to the intersection.
  • the relative trajectory 160 can thus correspond to a relative direction of motion between the autonomous vehicle 12 and the potential obstacle, such as could indicate intersection of motion and thus a potential collision.
  • the environmental considerations 162 can include characteristics of the environment in which the autonomous vehicle 12 operates. For example, rain, snow, or ice on the tarmac could affect the performance of the autonomous vehicle 12 on the tarmac, and thus the probability of collision of the autonomous vehicle 12 and the potential obstacle could increase at a given relative distance 156 , relative velocity 158 , and/or relative trajectory 160 .
  • the intent generator 152 can provide the intent decision corresponding to a most favorable probable outcome for a given course of action.
  • the decision-making algorithm 28 can communicate the intent decision to the execution engine 30 .
  • the execution engine 30 can be configured to execute the physical results of the intent decision by generating an appropriate set of outputs that can collectively correspond to the control outputs OP_OUT.
  • the control outputs OP_OUT can be provided to the operational components 18 of the autonomous vehicle 12 for execution of the intent decision.
  • the probability calculator 154 could calculate the probability of collision with the other aircraft approaching the tarmac, as described previously, for each of the courses of action (e.g., proceed at the same speed, slow down, speed up, stop, turn left, turn right, go straight, etc.).
  • the intent generator 152 could determine that the most favorable course of action based on the calculated probabilities is for the autonomous vehicle 12 to turn left at the tarmac intersection.
  • the intent generator 152 can provide the corresponding intent decision to the execution engine 30 to generate the corresponding control outputs OP_OUT to turn the wheel(s) of the autonomous vehicle 12 (with the wheel(s) corresponding to the appropriate operational components 18 ) to enact a left turn of the autonomous vehicle 12 at the appropriate time (e.g., as provided by the sensor input data SENS_IN). Accordingly, the autonomous vehicle 12 can operate in a manner that substantially reduces the probability of collision with the potential obstacle based on the determined intent decision.
  • the intent generation system 150 can be configured to generate intent decisions for any of variety of other situations and scenarios that require intent decisions based on external stimuli and/or situational awareness.
  • the intent generation system 150 can be implemented to provide intent decisions during the mission defined by the mission parameters 24 , such as to decide to deviate from a predetermined navigation course in response to unexpected circumstances (e.g., threats, weather conditions, etc.).
  • the intent generation system 150 can provide navigation intent decisions in response to deviation from the predetermined navigation course, such as to avoid obstacles, threats, mid-air collisions, to attempt returning to the predetermined course, to attempt an alternative course to completion of the mission, and/or to decide to abort the mission. Accordingly, the intent generation system 150 can be implemented by the decision-making algorithm 28 in a variety of ways to provide autonomous control of the autonomous vehicle 12 .
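The probability-based selection described above can be sketched in code. The following is a minimal illustration, not the patent's implementation: the collision-probability model, the action names, and all numeric factors are invented for the example, and a real system would substitute a Bayesian network or influence-diagram calculation of the kind noted above.

```python
# Hypothetical sketch of probability-based intent selection. The collision
# model and numeric factors below are illustrative assumptions only.

CANDIDATE_ACTIONS = ["proceed", "slow_down", "speed_up", "stop", "turn_left", "turn_right"]

def collision_probability(action, rel_distance_m, rel_speed_mps, converging, env_risk):
    """Toy estimate of collision probability for one candidate action.

    rel_distance_m / rel_speed_mps: relative distance and closing speed to the
    potential obstacle; converging: whether the trajectories intersect;
    env_risk: multiplier > 1 for rain, snow, or ice on the tarmac.
    """
    if not converging:
        return 0.0
    # Shorter time-to-conflict -> higher baseline risk.
    time_to_conflict = rel_distance_m / max(rel_speed_mps, 0.1)
    base = min(1.0, 1.0 / max(time_to_conflict, 0.1))
    # Illustrative per-action effect on closing speed / conflict geometry.
    action_factor = {
        "proceed": 1.0, "slow_down": 0.5, "speed_up": 1.4,
        "stop": 0.2, "turn_left": 0.3, "turn_right": 0.3,
    }[action]
    return min(1.0, base * action_factor * env_risk)

def intent_decision(rel_distance_m, rel_speed_mps, converging, env_risk=1.0):
    """Return the course of action with the most acceptable (lowest) probability."""
    probs = {a: collision_probability(a, rel_distance_m, rel_speed_mps, converging, env_risk)
             for a in CANDIDATE_ACTIONS}
    return min(probs, key=probs.get), probs
```

Under this toy model, the candidate action with the lowest collision probability is returned as the intent decision, mirroring the selection of a most acceptable relative probability from the calculated set.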
A method in accordance with various aspects of the present disclosure will be better appreciated with reference to FIG. 5. While, for purposes of simplicity of explanation, the method of FIG. 5 is shown and described as executing serially, it is to be understood and appreciated that the present disclosure is not limited by the illustrated order, as some aspects could, in accordance with the present disclosure, occur in different orders and/or concurrently with other aspects from that shown and described herein. Moreover, not all illustrated features may be required to implement a method in accordance with an aspect of the present disclosure.

FIG. 5 illustrates a method 200 for controlling an autonomous vehicle (e.g., the autonomous vehicle 12). Mission control data (e.g., the mission control data CTRL) is provided to an autonomous vehicle control system (e.g., the autonomous vehicle control system 14) associated with the autonomous vehicle via a user interface (e.g., the user interface 22). Situational awareness data associated with the autonomous vehicle is generated in response to receiving sensor data (e.g., the sensor data SENS_IN) provided from on-board sensors (e.g., the on-board sensors 16). One of a plurality of operational plans (e.g., the operational plans 52, 54, 56, 58, 60, 62) that each correspond to a predetermined set of behavioral characteristics of the autonomous vehicle is selected based on the situational awareness data and the mission control data. Control outputs (e.g., the control outputs OP_OUT) are then provided to operational components (e.g., the operational components 18) associated with the autonomous vehicle for navigation and control of the autonomous vehicle in response to the sensor data and based on the selected one of the plurality of operational plans.
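The steps of the method above can be sketched as a single control pass. The data shapes and the decision rules below are assumptions made for illustration only; the patent does not prescribe them.

```python
# Hypothetical sketch of one pass of the method of FIG. 5:
# sensors -> situational awareness -> plan selection -> control outputs.
# All names, types, and thresholds here are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class SituationalAwareness:
    obstacle_detected: bool
    behind_schedule: bool

def generate_situational_awareness(sensor_data):
    """Step: derive situational awareness data from on-board sensor data."""
    return SituationalAwareness(
        obstacle_detected=sensor_data.get("obstacle", False),
        behind_schedule=sensor_data.get("behind_schedule", False),
    )

def select_operational_plan(awareness, mission_control):
    """Step: select one of the operational plans (nominal/expedite/caution/...)."""
    if awareness.obstacle_detected:
        return "caution"
    if awareness.behind_schedule and mission_control.get("allow_expedite"):
        return "expedite"
    return "nominal"

def control_step(sensor_data, mission_control):
    """One pass of the method, returning the selected plan and control outputs."""
    awareness = generate_situational_awareness(sensor_data)
    plan = select_operational_plan(awareness, mission_control)
    # Control outputs (OP_OUT) would be computed from the plan and sensor data;
    # a single throttle value stands in for them here.
    return {"plan": plan, "throttle": 0.3 if plan == "caution" else 0.6}
```

In a real system this pass would run continuously, with the mission control data provided once via the user interface and the sensor data refreshed on every iteration.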


Abstract

One example includes an autonomous vehicle control system. The system includes an operational plan controller to maintain operational plans that each correspond to a predetermined set of behavioral characteristics of an associated autonomous vehicle based on situational awareness data provided from on-board sensors of the autonomous vehicle and mission control data provided from a user interface. The system also includes a decision-making algorithm to select one of the operational plans for operational behavior of the autonomous vehicle based on the situational awareness data and the mission control data at a given time and to provide an intent decision based on the situational awareness data and the selected one of the operational plans. The system further includes an execution engine to provide control outputs to operational components of the autonomous vehicle for navigation and control based on the selected one of the operational plans and in response to the intent decision.

Description

RELATED APPLICATIONS
This application claims priority from U.S. Provisional Patent Application Ser. No. 62/237917, filed 6 Oct. 2015, which is incorporated herein by reference in its entirety.
This disclosure was made with Government support under United States Air Force Contract No. FA8650-11-C-3104. The Government has certain rights in this disclosure.
TECHNICAL FIELD
The present disclosure relates generally to artificial intelligence systems, and specifically to an autonomous vehicle control system.
BACKGROUND
Unmanned vehicles are becoming increasingly common in a number of tactical missions, such as in surveillance and/or combat missions. As an example, in the case of aircraft, as some flight operations have become increasingly dangerous or tedious, unmanned aerial vehicles (UAVs) have been developed to replace on-board pilots in controlling the aircraft. Furthermore, as computer processing and sensor technology has advanced significantly, unmanned vehicles can be operated in an autonomous manner. For example, a given unmanned vehicle can be operated based on sensors configured to monitor external stimuli, and can be programmed to respond to the external stimuli and to execute mission objectives that are either programmed or provided as input commands, as opposed to being operated by a remote pilot.
SUMMARY
One example includes an autonomous vehicle control system. The system includes an operational plan controller to maintain operational plans that each correspond to a predetermined set of behavioral characteristics of an associated autonomous vehicle based on situational awareness data provided from on-board sensors of the autonomous vehicle and mission control data provided from a user interface. The system also includes a decision-making algorithm to select one of the operational plans for operational behavior of the autonomous vehicle based on the situational awareness data and the mission control data at a given time and to provide an intent decision based on the situational awareness data and the selected one of the operational plans. The system further includes an execution engine to provide control outputs to operational components of the autonomous vehicle for navigation and control based on the selected one of the operational plans and in response to the intent decision.
Another example includes a method for controlling an autonomous vehicle. The method includes providing mission control data to an autonomous vehicle control system associated with the autonomous vehicle via a user interface. The method also includes generating situational awareness data associated with the autonomous vehicle in response to receiving sensor data provided from on-board sensors. The method also includes selecting one of a plurality of operational plans that each correspond to a predetermined set of behavioral characteristics of the autonomous vehicle based on the situational awareness data and the mission control data. The method further includes providing control outputs to operational components associated with the autonomous vehicle for navigation and control of the autonomous vehicle in response to the sensor data and based on the selected one of the plurality of operational plans.
Another example includes an autonomous vehicle. The vehicle includes on-board sensors configured to generate situational awareness data associated with situational awareness conditions of the autonomous vehicle. The vehicle also includes operational components configured to provide navigation and control of the autonomous vehicle in response to control outputs. The vehicle also includes an autonomous vehicle control system operating on a computer readable medium. The autonomous vehicle control system includes an operational plan controller configured to maintain a plurality of operational plans that each correspond to a predetermined set of behavioral characteristics of an associated autonomous vehicle based on the situational awareness data and mission control data. The system also includes a decision-making algorithm configured to select one of the plurality of operational plans for operational behavior of the autonomous vehicle based on the sensor data and the mission control data at a given time and to provide an intent decision based on the situational awareness data and based on the selected one of the plurality of operational plans. The system further includes an execution engine configured to provide the control outputs to the operational components based on the selected one of the plurality of operational plans and in response to the intent decision.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 illustrates an example of an autonomous vehicle system.
FIG. 2 illustrates an example of an operational plan controller.
FIG. 3 illustrates an example of a utility calculation system for a decision-making algorithm.
FIG. 4 illustrates an example of an intent generation system.
FIG. 5 illustrates an example of a method for controlling an autonomous vehicle.
DETAILED DESCRIPTION
The present disclosure relates generally to artificial intelligence systems, and specifically to an autonomous vehicle control system. An autonomous vehicle control system is implemented, for example, at least partially on a computer readable medium, such as a processor that is resident on an associated autonomous vehicle. For example, the autonomous vehicle can be configured as an unmanned aerial vehicle (UAV). The autonomous vehicle thus includes on-board sensors that are configured to generate sensor data that is associated with situational awareness of the autonomous vehicle, and further includes operational components that are associated with navigation and control of the autonomous vehicle (e.g., flaps, an engine, ordnance, or other operational components). The autonomous vehicle control system can thus provide autonomous control of the autonomous vehicle based on receiving the sensor data and mission control data (e.g., defining parameters of a given mission) and by providing output signals to the operational components.
The autonomous vehicle control system includes an operational plan controller, a decision-making algorithm, a utility calculation system, and an execution engine. The operational plan controller is configured to maintain predetermined operational plans that each correspond to a predetermined set of behavioral characteristics of the autonomous vehicle based on the sensor data and the mission control data, such as provided from a user interface. The utility calculation system is configured to calculate a total utility factor based on a plurality of behavioral characteristics. The decision-making algorithm is configured to select one of the plurality of operational plans for operational behavior of the autonomous vehicle based on the sensor data and the mission control data at a given time and to provide an intent decision based on situational awareness characteristics provided via the sensor data and the total utility factor for a given decision instance. The execution engine is configured to provide the outputs to the operational components for navigation and control of the autonomous vehicle based on the selected one of the operational plans and in response to the intent decision at the given decision instance.
FIG. 1 illustrates an example of an autonomous vehicle system 10. The autonomous vehicle system 10 includes an autonomous vehicle 12. As described herein, the term “autonomous vehicle” describes an unmanned vehicle that operates in an autonomous manner, such that the autonomous vehicle 12 is not piloted or operated in any continuous manner, but instead operates continuously based on a programmed set of instructions that dictate motion, maneuverability, and the execution of actions directed toward completing mission objectives in response to sensor data associated with external stimuli. As an example, the autonomous vehicle 12 can be configured as an unmanned aerial vehicle (UAV) that operates in an autonomous programmable manner for any of a variety of different purposes. The autonomous vehicle 12 includes an autonomous vehicle control system 14 that can be programmed such that the autonomous vehicle 12 can operate autonomously to complete predetermined mission objectives in response to inputs, such as provided via sensor data and mission control data.
In the example of FIG. 1, the autonomous vehicle 12 includes a set of on-board sensors 16 that can provide sensor input data SENS_IN to the autonomous vehicle control system 14. As an example, the on-board sensors 16 can include optical sensors, one or more cameras, and/or other types of electro-optical imaging sensors (e.g., radar, lidar, or a combination thereof). The on-board sensors 16 can also include location and/or situational awareness sensors (e.g., a global navigation satellite system (GNSS) receiver). Therefore, the on-board sensors 16 can be configured to obtain situational awareness data that is provided as the sensor input data SENS_IN to the autonomous vehicle control system 14. Additionally, the autonomous vehicle 12 can include operational components 18 that can correspond to navigation and control devices for operating the autonomous vehicle 12 and for completing mission objectives. As an example, the operational components 18 can include navigation components (e.g., wing, body, and/or tail flaps), an engine, ordnance, and/or other operational components. The autonomous vehicle control system 14 can provide control outputs OP_OUT to the operational components 18 to control the operational components 18. Therefore, the autonomous vehicle 12 can operate based on providing the control outputs OP_OUT to the operational components 18 in response to receiving the sensor input data SENS_IN via the on-board sensors 16.
The autonomous vehicle control system 14 can be configured as one or more processors 20 that are programmed to generate the control outputs OP_OUT in response to the sensor input data SENS_IN to control the autonomous vehicle 12. The processor(s) 20 can thus execute programmable instructions, such as stored in memory (not shown). As an example, the processor(s) 20 constituting the autonomous vehicle control system 14 can be programmed via a user interface 22 that is associated with the autonomous vehicle system 10. For example, the user interface 22 can be configured as a computer system or graphical user interface (GUI) that is accessible via a computer (e.g., via a network). The user interface 22 can be configured, for example, to program the autonomous vehicle control system 14, to define and provide mission objectives, and/or to provide limited or temporary control of the autonomous vehicle 12, such as in response to an override request by the autonomous vehicle control system 14, as described in greater detail herein. In the example of FIG. 1, the user interface 22 is demonstrated as providing mission control data CTRL (e.g., wirelessly) that can correspond to predetermined mission parameters 24 that describe the mission definitions and objectives, such as including a predetermined navigation course, parameters for navigating the predetermined navigation course, at least one mission objective, and behaviors for accomplishing the mission objective(s). The mission control data CTRL can also provide program data for programming behavioral characteristics and/or vehicle piloting signals for providing user override control, such as described in greater detail herein. 
While the user interface 22 is described previously as a computer system or GUI, as another example, the user interface 22 can be configured as one or more chips or circuit boards (e.g., printed circuit boards (PCBs)) that can be installed in the autonomous vehicle control system 14, such that the user interface 22 can be pre-programmed with the mission control data CTRL and can be accessed by the autonomous vehicle control system 14.
In the example of FIG. 1, the processor(s) 20 can be programmed via the mission control data CTRL to implement an operational plan controller 26, a decision-making algorithm 28, and an execution engine 30. The operational plan controller 26 can control an operating plan associated with the autonomous vehicle 12, such as corresponding to a current behavioral mode in which the autonomous vehicle 12 operates. For example, the operational plan controller 26 can maintain a plurality of selectable operational plans that each correspond to a predetermined set of behavioral characteristics of the autonomous vehicle 12. As an example, the operational plan controller 26 can set the autonomous vehicle control system 14 to operate in a given operational plan at a given duration of time based on the sensor input data SENS_IN and/or the mission control data CTRL provided from the user interface 22. As described in greater detail herein, the operational plan controller 26 can be configured to set a given operational plan based on the decision-making algorithm 28 in response to a given intent decision, such as at a given decision instance.
As described herein, the term “intent decision” refers to a decision that is required to be made by the decision-making algorithm 28 that is consistent with predetermined parameters associated with control of the autonomous vehicle 12 and programmable behavioral characteristics of the autonomous vehicle control system 14 to control the autonomous vehicle 12 in response to unexpected circumstances. As also described herein, the term “decision instance” refers to a given time and/or set of circumstances that are dependent on unexpected and/or unplanned external stimuli (e.g., provided via the sensor input data SENS_IN) that require a decision via the decision-making algorithm 28 to dictate behavior of the autonomous vehicle 12.
FIG. 2 illustrates an example of an operational plan controller 50. As an example, the operational plan controller 50 can be implemented as hardware, software, firmware, or a combination thereof that is executable by the processor(s) 20. The operational plan controller 50 can correspond to the operational plan controller 26 in the example of FIG. 1. Therefore, reference is to be made to the example of FIG. 1 in the following description of the example of FIG. 2.
The operational plan controller 50 can select an operating plan associated with the autonomous vehicle 12, such as corresponding to a current behavioral mode in which the autonomous vehicle 12 operates. In the example of FIG. 2, the selected operating plan is provided as a command CURR_PLN, which can be configured to trigger one or more routines corresponding to the selected operating plan (e.g., in the autonomous vehicle control system 14 or in the operational plan controller 50 itself). The operational plan controller 50 includes a nominal plan 52, an expedite plan 54, a caution plan 56, a stop plan 58, and a user request plan 60. The nominal plan 52 can be associated with a nominal operational behavior of the autonomous vehicle 12 and can be based on the mission control data CTRL. As an example, the nominal plan 52 can be a default operational plan that the operational plan controller 50 sets as the operational plan for the autonomous vehicle control system 14 when all other systems are stable, such as during initialization (e.g., takeoff), completion (e.g., landing), and/or during the mission defined by the mission parameters 24, such as absent perturbations by unexpected and/or unplanned external factors. As an example, the mission control data CTRL can dictate the external conditions as to when the autonomous vehicle control system 14 should be set to the nominal plan 52.
The expedite plan 54 can be associated with an expedited operational behavior of the autonomous vehicle 12 relative to the nominal operational behavior of the nominal plan 52 and can be based on the mission control data CTRL. As an example, the mission control data CTRL can dictate when the autonomous vehicle control system 14 should switch from the nominal plan 52 to the expedite plan 54 based on external conditions or based on the mission parameters 24. For example, delays in the mission defined by the mission parameters 24 based on previous circumstances (e.g., operating in the caution plan 56, as described in greater detail herein) can result in the autonomous vehicle 12 operating behind schedule for one or more specific mission criteria defined by the mission parameters 24. Therefore, the expedite plan 54 can be implemented by the operational plan controller 50 for the autonomous vehicle control system 14 when all other systems are stable during the mission defined by the mission parameters 24 absent perturbations by unexpected and/or unplanned external factors to attempt to recapture time. As described herein, the expedite plan 54 can be implemented in situations when the decision-making algorithm 28 calculates that the utility of an expedited mission operation outweighs the utility of increased risk to the autonomous vehicle 12 or to completion of the mission objective(s).
The caution plan 56 can be associated with a reduced-risk operational behavior of the autonomous vehicle 12 relative to the nominal operational behavior of the nominal plan 52 and can be based on the mission control data CTRL. As an example, the mission control data CTRL can dictate when the autonomous vehicle control system 14 should switch from the nominal plan 52 to the caution plan 56 based on external conditions, such as perceived hazards and/or threats based on the sensor input data SENS_IN. For example, upon a determination of hazardous environment conditions, an external obstacle, or an imminent or detected threat that may require evasive maneuvering, the operational plan controller 50 can set or can be instructed to set the autonomous vehicle control system 14 to the caution plan 56. Therefore, the autonomous vehicle control system 14 can dictate a slower speed for the autonomous vehicle 12, such as to provide capability for reducing risks by providing more time for reaction and/or maneuvering. Alternatively, the caution plan 56 may force deviation from the predetermined navigation course associated with completion of the mission objectives, as defined by the mission parameters 24, while still maintaining a rapid speed for the autonomous vehicle 12. For example, the autonomous vehicle control system 14 can decide that operation of the autonomous vehicle 12 in a predetermined navigation course defined by the mission parameters 24 in the nominal plan 52 is too risky, such as described in greater detail herein, and can thus command the operational plan controller 50 to switch to the caution plan 56 as the current operational plan CURR_PLN.
Similarly, the stop plan 58 can be associated with ceased operational behavior of the autonomous vehicle 12, such as in response to detecting an imminent collision with an obstacle or another moving vehicle. As an example, the stop plan 58 can be associated with an autonomous land vehicle, or an autonomous aerial vehicle that is preparing to take off or has landed. Lastly, the user request plan 60 can correspond to a situation in which the autonomous vehicle control system 14 transmits a request for instructions from the user interface 22. For example, in response to the decision-making algorithm 28 determining an approximately equal utility or probability in determining a given intent decision at a respective decision instance, the autonomous vehicle control system 14 can be switched to the user request plan 60. As an example, the user request plan 60 can accompany another operational plan of the operational plan controller 50, such as one of the nominal plan 52, the caution plan 56, or the stop plan 58, such that the autonomous vehicle 12 can continue to operate in a predetermined manner according to the selected operational plan CURR_PLN while awaiting additional instructions as dictated by the user request plan 60. Furthermore, the operational plan controller 50 can also include at least one additional plan 62 that can dictate a respective at least one additional behavioral mode in which the autonomous vehicle 12 can operate. Thus, the operational plan controller 50 is not limited to providing the current plan CURR_PLN as one of the nominal plan 52, the expedite plan 54, the caution plan 56, the stop plan 58, and the user request plan 60.
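As a hypothetical sketch of the plan set of FIG. 2, the plans might be modeled as an enumeration, with the user request plan 60 tracked as a flag that accompanies the current base plan (as described above) rather than replacing it. The class and switching logic are assumptions for illustration, not the patented design.

```python
# Illustrative model of the operational plans of FIG. 2. The plan names come
# from the description; the data structure and switching logic are assumptions.
from enum import Enum

class Plan(Enum):
    NOMINAL = "nominal"            # default behavior when systems are stable
    EXPEDITE = "expedite"          # recapture schedule when utility permits
    CAUTION = "caution"            # reduced-risk behavior near hazards/threats
    STOP = "stop"                  # cease motion, e.g., imminent ground collision
    USER_REQUEST = "user_request"  # request operator instructions

class OperationalPlanController:
    def __init__(self):
        self.current = Plan.NOMINAL
        self.awaiting_user = False  # USER_REQUEST accompanies a base plan

    def set_plan(self, plan):
        if plan is Plan.USER_REQUEST:
            # Keep operating under the current base plan while awaiting input.
            self.awaiting_user = True
        else:
            self.current = plan
            self.awaiting_user = False
```

In this sketch, switching to the user request plan leaves the vehicle operating under its existing plan (e.g., caution) until instructions arrive, matching the accompanying-plan behavior described above.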
Referring back to the example of FIG. 1, the decision-making algorithm 28 includes a utility calculation system 32 and an intent generation system 34. The utility calculation system 32 is configured to calculate a total utility factor (TUF) for each of the operational plans (e.g., the nominal plan 52, the expedite plan 54, the caution plan 56, the stop plan 58, the user request plan 60, and/or the additional plan(s) 62) that are maintained by the operational plan controller 26. The calculation of the TUF can be based on behavioral characteristics that can correspond to characteristics of the autonomous vehicle 12, user inputs provided via the user interface 22, avoidance of potential obstacles (e.g., external objects, such as other aircraft, terrain features, buildings, etc.), integrity of the sensors 16, and/or predetermined performance characteristics of the operational components 18 of the autonomous vehicle 12. Thus, the utility calculation system 32 can be configured to command the operational plan controller 26 to select the operational plan based on the TUF calculated for each of the operational plans (e.g., based on the highest TUF).
FIG. 3 illustrates an example of a utility calculation system 100 for a decision-making algorithm (e.g., the decision-making algorithm 28). As an example, the utility calculation system 100 can be implemented as hardware, software, firmware, or a combination thereof that is executable by the processor(s) 20. The utility calculation system 100 can correspond to the utility calculation system 32 in the example of FIG. 1. Therefore, reference is to be made to the example of FIG. 1 in the following description of the example of FIG. 3.
The utility calculation system 100 can implement a variety of predetermined behavioral factors to calculate the TUF that can dictate the operational behavior of the autonomous vehicle control system 14. In the example of FIG. 3, the behavioral factors include performance utility factors 102 associated with performance characteristics of the autonomous vehicle 12 and/or characteristics of the mission defined by the mission parameters 24. As an example, the performance utility factors 102 can include timing associated with the mission, such as defined by the mission parameters 24, as well as capabilities of the autonomous vehicle 12, such as velocity, handling, maneuverability, response speed, turning radii, and/or other navigation characteristics (e.g., including motion in six degrees of freedom), changes to performance based on ordnance loading, and/or a variety of other performance characteristics of the autonomous vehicle 12. The behavioral factors also include operator utility factors 104 associated with the mission control data CTRL and availability of operator inputs for control of the autonomous vehicle 12. As an example, the operator utility factors 104 can include timing associated with response time for communications with a user via the user interface 22, such as relating to a current velocity of the autonomous vehicle 12, as well as a level of detail required to provide user input (e.g., in response to requests that can be provided in the user request plan 60).
The behavioral factors can also include avoidance safety utility factors 106 associated with consequences of collision of the autonomous vehicle 12. The avoidance safety utility factors 106 can account for velocity of the autonomous vehicle 12 relative to a type of potential obstacle with which the autonomous vehicle 12 can have an imminent collision, such as based on an evaluation of static objects (e.g., terrain) relative to dynamic objects (e.g., other vehicles, threats, etc.). The behavioral factors can further include integrity safety utility factors 108 that are associated with an impact of environmental conditions on the on-board sensors 16 and operational components 18 associated with the autonomous vehicle 12. For example, the integrity safety utility factors 108 can be associated with the effects of weather on the on-board sensors 16 and operational components 18, such as the effects of rain occluding optical components of the on-board sensors 16, the effects of rain on the grip of tires to a concrete airport tarmac, the effect of turbulence on the operational components 18, the effect of clouds on the sensors 16, etc.
In the example of FIG. 3, the utility calculation system 100 also includes respective programmable weights that are selectively assigned to the plurality of behavioral characteristics. As an example, each of the programmable weights can be provided as a portion of the mission control data CTRL provided via the user interface 22. The programmable weights include performance utility weight(s) 110 (“PU WEIGHT(S)”) that can be associated with the performance utility factors 102, operator utility weight(s) 112 (“OU WEIGHT(S)”) that can be associated with the operator utility factors 104, avoidance safety utility weight(s) 114 (“ASU WEIGHT(S)”) that can be associated with the avoidance safety utility factors 106, and integrity safety utility weight(s) 116 (“ISU WEIGHT(S)”) that can be associated with the integrity safety utility factors 108. Each of the performance utility weight(s) 110, operator utility weight(s) 112, avoidance safety utility weight(s) 114, and integrity safety utility weight(s) 116 can include one or more weighted multiplicative factors that can emphasize or de-emphasize certain ones of the behavioral factors (e.g., in each of the performance utility factors 102, operator utility factors 104, avoidance safety utility factors 106, and integrity safety utility factors 108) at a given time. The selection of the performance utility weight(s) 110, operator utility weight(s) 112, avoidance safety utility weight(s) 114, and/or integrity safety utility weight(s) 116 can be based, for example, on the mission control data CTRL at various stages of a given mission defined by the mission parameters 24. 
Therefore, a user can implement the user interface 22 to selectively and programmably set the weights of the respective performance utility weight(s) 110, operator utility weight(s) 112, avoidance safety utility weight(s) 114, and integrity safety utility weight(s) 116 at various stages of the mission defined by the mission parameters 24 to dictate the operational plan of the autonomous vehicle control system 14 for operating the autonomous vehicle 12.
In the example of FIG. 3, the weighted performance utility factors 102, demonstrated as WPU, the weighted operator utility factors 104, demonstrated as WOU, the weighted avoidance safety utility factors 106, demonstrated as WASU, and the weighted integrity safety utility factors 108, demonstrated as WISU, are provided to a TUF calculation component 118. The TUF calculation component 118 is configured to calculate the TUF for each given one of the operational plans (e.g., the nominal plan 52, the expedite plan 54, the caution plan 56, the stop plan 58, and/or the user request plan 60). In addition, the TUF calculation component 118 can receive situational awareness data via the sensor input data SENS_IN, such that the TUF can be modified based on external considerations (e.g., weather, threats, potential obstacles, etc.). Therefore, the TUF calculation component 118 can calculate the TUF for each of the operational plans, and can provide the calculated TUF for each of the operational plans to the operational plan controller 26 for selection of a given one of the operational plans at a given time.
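The weighted-sum selection described above can be illustrated with a short Python sketch that computes a TUF for each operational plan and selects the highest-scoring plan. All function names, factor values, and weights below are hypothetical; the patent specifies only that weighted behavioral factor groups are combined into a TUF, not a particular formula.

```python
def total_utility(factors, weights):
    """Combine the behavioral factor groups (PU, OU, ASU, ISU) into a TUF
    by scaling each group's utility by its programmable weight and summing.
    The linear combination is an illustrative assumption."""
    return sum(weights[name] * value for name, value in factors.items())

def select_plan(plans):
    """Return the operational plan with the highest total utility factor."""
    return max(plans, key=lambda p: total_utility(p["factors"], p["weights"]))

# Hypothetical factor utilities and weights for two of the operational plans.
plans = [
    {"name": "nominal", "factors": {"PU": 0.8, "OU": 0.6, "ASU": 0.7, "ISU": 0.9},
     "weights": {"PU": 1.0, "OU": 0.5, "ASU": 2.0, "ISU": 1.0}},
    {"name": "caution", "factors": {"PU": 0.4, "OU": 0.7, "ASU": 0.95, "ISU": 0.9},
     "weights": {"PU": 1.0, "OU": 0.5, "ASU": 2.0, "ISU": 1.0}},
]
best = select_plan(plans)  # heavier ASU weight favors the caution plan here
```

With the avoidance safety utility weighted most heavily (as an operator might configure for a crowded tarmac), the caution plan's higher ASU score outweighs the nominal plan's performance advantage.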
FIG. 4 illustrates an example of an intent generation system 150. As an example, the intent generation system 150 can be implemented as hardware, software, firmware, or a combination thereof that is executable by the processor(s) 20. The intent generation system 150 is demonstrated in the example of FIG. 4 as a motion intent generation system to provide decision-making capability in the context of motion of the autonomous vehicle 12, such as for the autonomous vehicle 12 moving on an airfield tarmac. For example, the intent generation system 150 is demonstrated as a collision avoidance intent generator to provide an intent decision for operation of the autonomous vehicle 12 to avoid a collision of the autonomous vehicle 12 with a potential obstacle (e.g., another aircraft on the tarmac). The intent generation system 150 can correspond to the intent generation system 34 in the example of FIG. 1. Therefore, reference is to be made to the example of FIG. 1 in the following description of the example of FIG. 4.
The intent generation system 150 includes an intent generator 152 that is configured to provide the intent decision for a given decision instance. The intent generator 152 is demonstrated as including a probability calculator 154 that is configured to calculate a set of probabilities associated with predetermined possible outcomes for a given decision instance. For example, the set of probabilities can include a probability of collision with another aircraft that approaches the same intersection of the tarmac as the autonomous vehicle 12. Thus, the possible courses of action for the autonomous vehicle 12 could include: proceed at the same speed, slow down, speed up, stop, turn left, turn right, go straight, etc. Therefore, the intent generator 152 is configured to provide an intent decision based on the set of probabilities, such as to provide the intent decision based on a most acceptable relative probability of the set of probabilities. In the example of FIG. 4, the probability calculator 154 calculates the set of probabilities based on the situational awareness characteristics provided via the sensor input data SENS_IN and the selected operational plan, demonstrated in the example of FIG. 4 as "CURR_PLN". The probabilities can be calculated by the probability calculator 154 based on any of a variety of algorithms, such as a Bayesian network, influence diagrams, and/or a variety of other decision theory calculations.
The situational awareness characteristics provided via the sensor input data SENS_IN are demonstrated as including a relative distance 156, a relative velocity 158, a relative trajectory 160, and environmental considerations 162. The relative distance 156, the relative velocity 158, and the relative trajectory 160 can correspond to respective motion features of the autonomous vehicle 12 relative to one or more potential obstacles, such as another aircraft on the tarmac (e.g., at an intersection of the tarmac). The relative distance 156 can thus correspond to a relative distance between the autonomous vehicle 12 and the potential obstacle, such as with respect to the intersection or with respect to each other. The relative velocity 158 can thus correspond to a relative velocity between the autonomous vehicle 12 and the potential obstacle with respect to each other or with respect to the intersection. The relative trajectory 160 can thus correspond to a relative direction of motion between the autonomous vehicle 12 and the potential obstacle, such as could indicate intersection of motion and thus a potential collision. The environmental considerations 162 can include characteristics of the environment in which the autonomous vehicle 12 operates. For example, rain, snow, or ice on the tarmac could affect the performance of the autonomous vehicle 12 on the tarmac, and thus the probability of collision of the autonomous vehicle 12 and the potential obstacle could increase at a given relative distance 156, relative velocity 158, and/or relative trajectory 160.
As described previously, upon calculating the set of probabilities of the possible outcomes of the decision instance via the probability calculator 154, the intent generator 152 can provide the intent decision corresponding to a most favorable probable outcome for a given course of action. Referring back to the example of FIG. 1, the decision-making algorithm 28 can communicate the intent decision to the execution engine 30. The execution engine 30 can be configured to execute the physical results of the intent decision by generating an appropriate set of outputs that can collectively correspond to the control outputs OP_OUT. Thus, the control outputs OP_OUT can be provided to the operational components 18 of the autonomous vehicle 12 for execution of the intent decision. For example, the probability calculator 154 could calculate the probability of collision with the other aircraft approaching the tarmac, as described previously, for each of the courses of action (e.g., proceed at the same speed, slow down, speed up, stop, turn left, turn right, go straight, etc.). Thus, as an example, the intent generator 152 could determine that the most favorable course of action based on the calculated probabilities is for the autonomous vehicle 12 to turn left at the tarmac intersection. Therefore, the intent generator 152 can provide the corresponding intent decision to the execution engine 30 to generate the corresponding control outputs OP_OUT to turn the wheel(s) of the autonomous vehicle 12 (with the wheel(s) corresponding to the appropriate operational components 18) to enact a left turn of the autonomous vehicle 12 at the appropriate time (e.g., as provided by the sensor input data SENS_IN). Accordingly, the autonomous vehicle 12 can operate in a manner that substantially reduces the probability of collision with the potential obstacle based on the determined intent decision.
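The tarmac-intersection example above reduces to choosing, among candidate courses of action, the one with the most acceptable estimated outcome. The sketch below is a minimal stand-in: the course names and probability values are invented, and a real probability calculator would derive them from a Bayesian network or influence diagram as the text suggests, rather than from a fixed table.

```python
def choose_course(collision_prob):
    """Return the intent decision: the candidate course of action with the
    lowest estimated probability of collision (an illustrative proxy for
    the 'most favorable probable outcome' selection)."""
    return min(collision_prob, key=collision_prob.get)

# Hypothetical collision probabilities for the tarmac-intersection scenario.
probs = {"proceed": 0.30, "slow_down": 0.12, "stop": 0.05, "turn_left": 0.02}
intent = choose_course(probs)  # the lowest-probability course is selected
```

The selected intent would then be handed to the execution engine, which maps it to control outputs for the appropriate operational components (e.g., wheel actuation for a left turn).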
The description herein of the intent generation system 150 providing intent decision making for the autonomous vehicle 12 is provided by example. Therefore, the intent generation system 150 can be configured to generate intent decisions for any of a variety of other situations and scenarios that require intent decisions based on external stimuli and/or situational awareness. For example, the intent generation system 150 can be implemented to provide intent decisions during the mission defined by the mission parameters 24, such as to decide to deviate from a predetermined navigation course in response to unexpected circumstances (e.g., threats, weather conditions, etc.). Additionally, the intent generation system 150 can provide navigation intent decisions in response to deviation from the predetermined navigation course, such as to avoid obstacles, threats, or mid-air collisions, to attempt returning to the predetermined course, to attempt an alternative course to completion of the mission, and/or to decide to abort the mission. Accordingly, the intent generation system 150 can be implemented by the decision-making algorithm 28 in a variety of ways to provide autonomous control of the autonomous vehicle 12.
In view of the foregoing structural and functional features described above, a method in accordance with various aspects of the present disclosure will be better appreciated with reference to FIG. 5. While, for purposes of simplicity of explanation, the method of FIG. 5 is shown and described as executing serially, it is to be understood and appreciated that the present disclosure is not limited by the illustrated order, as some aspects could, in accordance with the present disclosure, occur in different orders and/or concurrently with other aspects from that shown and described herein. Moreover, not all illustrated features may be required to implement a method in accordance with an aspect of the present disclosure.
FIG. 5 illustrates a method 200 for controlling an autonomous vehicle (e.g., the autonomous vehicle 12). At 202, mission control data (e.g., the mission control data CTRL) is provided to an autonomous vehicle control system (e.g., the autonomous vehicle control system 14) associated with the autonomous vehicle via a user interface (e.g., the user interface 22). At 204, situational awareness data associated with the autonomous vehicle is generated in response to receiving sensor data (e.g., the sensor data SENS_IN) provided from on-board sensors (e.g., the on-board sensors 16). At 206, one of a plurality of operational plans (e.g., the operational plans 52, 54, 56, 58, 60, 62) that each correspond to a predetermined set of behavioral characteristics of the autonomous vehicle is selected based on the situational awareness data and the mission control data. At 208, control outputs (e.g., the control outputs OP_OUT) are provided to operational components (e.g., the operational components 18) associated with the autonomous vehicle for navigation and control of the autonomous vehicle in response to the sensor data and based on the selected one of the plurality of operational plans.
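Steps 204 through 208 of method 200 can be condensed into a single control-loop iteration, sketched below. The awareness derivation, plan-scoring rule, and throttle mapping are entirely hypothetical helpers introduced for illustration; only the step sequence mirrors FIG. 5.

```python
def control_step(mission_ctrl, sensor_data, plans):
    """One iteration of the FIG. 5 method (all helper logic is illustrative)."""
    # 204: generate situational awareness data from the raw sensor data.
    awareness = {"obstacle_near": sensor_data.get("range_m", float("inf")) < 50.0}

    # 206: select an operational plan from situational awareness + mission data.
    def score(plan):
        if awareness["obstacle_near"]:
            return plan["risk_reduction"]   # favor reduced-risk behavior
        return plan["speed"] * mission_ctrl.get("urgency", 1.0)
    plan = max(plans, key=score)

    # 208: provide control outputs for the operational components.
    return {"plan": plan["name"], "throttle": plan["speed"]}

# Hypothetical behavioral characteristics for three operational plans.
plans = [
    {"name": "nominal", "speed": 0.5, "risk_reduction": 0.5},
    {"name": "expedite", "speed": 0.9, "risk_reduction": 0.1},
    {"name": "caution", "speed": 0.2, "risk_reduction": 0.9},
]
out = control_step({"urgency": 1.0}, {"range_m": 30.0}, plans)
```

With an obstacle detected at 30 m, the loop selects the caution plan; with clear sensor range, the same scoring would favor the expedite plan under high mission urgency.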
What have been described above are examples of the disclosure. It is, of course, not possible to describe every conceivable combination of components or methodologies for purposes of describing the disclosure, but one of ordinary skill in the art will recognize that many further combinations and permutations of the disclosure are possible. Accordingly, the disclosure is intended to embrace all such alterations, modifications, and variations that fall within the scope of this application, including the appended claims.

Claims (20)

What is claimed is:
1. An autonomous vehicle control system operating on a computer readable medium, the autonomous vehicle control system comprising:
an operational plan controller configured to maintain a plurality of operational plans that each correspond to a predetermined set of behavioral characteristics of an associated autonomous vehicle based on situational awareness data provided via on-board sensors of the autonomous vehicle and mission control data provided from a user interface;
a decision-making algorithm configured to select one of the plurality of operational plans for operational behavior of the autonomous vehicle based on the situational awareness data and the mission control data at a given time and to provide an intent decision based on the situational awareness data and the selected one of the plurality of operational plans; and
an execution engine configured to provide control outputs to operational components associated with the autonomous vehicle for navigation and control of the autonomous vehicle based on the selected one of the plurality of operational plans and in response to the intent decision.
2. The system of claim 1, wherein the decision-making algorithm comprises a utility calculation system configured to calculate a total utility factor for each of the plurality of operational plans based on a plurality of behavioral characteristics, the utility calculation system being configured to select the one of the plurality of operational plans based on the total utility factor calculated for each of the plurality of operational plans.
3. The system of claim 2, wherein the decision-making algorithm further comprises an intent generation system configured to provide the intent decision, wherein the intent generation system comprises a probability calculator configured to calculate a set of probabilities associated with predetermined possible outcomes for a given decision instance based on the situational awareness data and the selected one of the plurality of operational plans.
4. The system of claim 2, wherein the plurality of behavioral characteristics comprises at least one of:
performance utility factors associated with performance characteristics of the autonomous vehicle;
operator utility factors associated with the mission control data and availability of operator inputs for control of the autonomous vehicle;
avoidance safety utility factors associated with consequences of collision of the autonomous vehicle; and
integrity safety utility factors associated with an impact of environmental conditions on the on-board sensors and operational components associated with the autonomous vehicle.
5. The system of claim 2, wherein the total utility factor is calculated based on the plurality of behavioral characteristics and based on a respective plurality of programmable weights that are selectively assigned to the plurality of behavioral characteristics and are provided as a portion of the mission control data provided via the user interface.
6. The system of claim 2, wherein the mission control data comprises mission parameters that describe mission definitions and objectives, wherein the utility calculation system is configured to select the one of the plurality of operational plans based on the mission parameters and based on the total utility factor calculated for each of the plurality of operational plans.
7. The system of claim 1, wherein the plurality of operational plans comprises at least one of:
a nominal plan associated with a nominal operational behavior of the autonomous vehicle and the mission control data;
an expedite plan associated with an expedited operational behavior of the autonomous vehicle relative to the nominal operational behavior and based on the mission control data;
a caution plan associated with a reduced-risk operational behavior of the autonomous vehicle relative to the nominal operational behavior and based on the mission control data;
a stop plan associated with ceased operational behavior of the autonomous vehicle; and
a user request plan in which the autonomous vehicle transmits a request for instructions from the user interface.
8. The system of claim 1, wherein the decision-making algorithm comprises a collision-avoidance algorithm configured to provide the intent decision to avoid a collision of the autonomous vehicle with a potential obstacle.
9. The system of claim 8, wherein the collision-avoidance algorithm is configured to provide the intent decision to avoid the collision of the autonomous vehicle with the potential obstacle based on at least one of:
a relative distance of the autonomous vehicle and the potential obstacle;
a relative velocity of the autonomous vehicle and the potential obstacle;
a relative trajectory of the autonomous vehicle and the potential obstacle; and
environment conditions associated with an environment of the autonomous vehicle.
10. An autonomous vehicle comprising the autonomous vehicle control system of claim 1, wherein the autonomous vehicle further comprises:
the on-board sensors configured to generate the situational awareness data; and
the operational components configured to provide navigation and control of the autonomous vehicle.
11. A method for controlling an autonomous vehicle, the method comprising:
providing mission control data to an autonomous vehicle control system associated with the autonomous vehicle via a user interface;
generating situational awareness data associated with the autonomous vehicle in response to receiving sensor data provided from on-board sensors;
selecting one of a plurality of operational plans that each correspond to a predetermined set of behavioral characteristics of the autonomous vehicle based on the situational awareness data and the mission control data; and
providing control outputs to operational components associated with the autonomous vehicle for navigation and control of the autonomous vehicle in response to the sensor data and based on the selected one of the plurality of operational plans.
12. The method of claim 11, further comprising calculating a total utility factor for each of the plurality of operational plans based on a plurality of behavioral characteristics, wherein selecting the one of the plurality of operational plans comprises selecting the one of the plurality of operational plans based on the total utility factor calculated for each of the plurality of operational plans.
13. The method of claim 12, wherein calculating the total utility factor comprises providing a respective plurality of programmable weights that are selectively assigned to the plurality of behavioral characteristics via the mission control data.
14. The method of claim 11, wherein providing the mission control data comprises providing mission parameters that describe mission definitions and objectives, wherein selecting the one of the plurality of operational plans comprises selecting the one of the plurality of operational plans based on the mission parameters and based on the total utility factor calculated for each of the plurality of operational plans.
15. The method of claim 11, further comprising providing an intent decision at a given decision instance based on the sensor data and based on the selected one of the plurality of operational plans, wherein providing the control outputs further comprises providing the control outputs to the operational components associated with the autonomous vehicle in response to the intent decision and based on the selected one of the plurality of operational plans.
16. The method of claim 15, further comprising calculating a set of probabilities associated with predetermined possible outcomes for the given decision instance based on the situational awareness data and the selected one of the plurality of operational plans, wherein providing the intent decision comprises providing the intent decision based on a most acceptable relative probability of the set of probabilities.
17. The method of claim 15, wherein generating the situational awareness data comprises:
obtaining a relative distance between the autonomous vehicle and a potential obstacle;
obtaining a relative velocity between the autonomous vehicle and the potential obstacle;
obtaining a relative trajectory between the autonomous vehicle and the potential obstacle; and
obtaining environmental considerations associated with an environment of the autonomous vehicle;
wherein calculating the set of probabilities comprises calculating the set of probabilities associated with predetermined possible outcomes for the given decision instance based on the relative distance, the relative velocity, the relative trajectory, and the environmental considerations, and based on the selected one of the plurality of operational plans.
18. An autonomous vehicle comprising:
on-board sensors configured to generate situational awareness data associated with situational awareness conditions of the autonomous vehicle;
operational components configured to provide navigation and control of the autonomous vehicle in response to control outputs; and
an autonomous vehicle control system operating on a computer readable medium, the autonomous vehicle control system comprising:
an operational plan controller configured to maintain a plurality of operational plans that each correspond to a predetermined set of behavioral characteristics of an associated autonomous vehicle based on the situational awareness data and mission control data;
a decision-making algorithm configured to select one of the plurality of operational plans for operational behavior of the autonomous vehicle based on the situational awareness data and the mission control data at a given time and to provide an intent decision based on the situational awareness data and based on the selected one of the plurality of operational plans; and
an execution engine configured to provide the control outputs to the operational components based on the selected one of the plurality of operational plans and in response to the intent decision.
19. The autonomous vehicle of claim 18, wherein the decision-making algorithm comprises:
a utility calculation system configured to calculate a total utility factor for each of the plurality of operational plans based on a plurality of behavioral characteristics, the utility calculation system being configured to select the one of the plurality of operational plans based on the total utility factor calculated for each of the plurality of operational plans; and
an intent generation system configured to provide the intent decision, wherein the intent generation system comprises a probability calculator configured to calculate a set of probabilities associated with predetermined possible outcomes for a given decision instance based on the situational awareness data and the selected one of the plurality of operational plans.
20. The autonomous vehicle of claim 19, wherein the total utility factor is calculated based on the plurality of behavioral characteristics and based on a respective plurality of programmable weights that are selectively assigned to the plurality of behavioral characteristics and are provided as a portion of the mission control data provided via the user interface.
US15/266,708 2015-10-06 2016-09-15 Autonomous vehicle control system Active US10019005B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/266,708 US10019005B2 (en) 2015-10-06 2016-09-15 Autonomous vehicle control system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201562237917P 2015-10-06 2015-10-06
US15/266,708 US10019005B2 (en) 2015-10-06 2016-09-15 Autonomous vehicle control system

Publications (2)

Publication Number Publication Date
US20170097640A1 US20170097640A1 (en) 2017-04-06
US10019005B2 true US10019005B2 (en) 2018-07-10

Family

ID=57003598

Family Applications (1)

Application Number Title Priority Date Filing Date
US15/266,708 Active US10019005B2 (en) 2015-10-06 2016-09-15 Autonomous vehicle control system

Country Status (4)

Country Link
US (1) US10019005B2 (en)
DE (1) DE112016004563T5 (en)
GB (1) GB2559894B (en)
WO (1) WO2017062151A1 (en)



US9527394B1 (en) * 2013-05-02 2016-12-27 Dershuen Allen Tang Transportation system of combined vehicles multi-coupled at highway speeds for electrical energy transfer and sharing

Patent Citations (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3486937A (en) 1967-03-24 1969-12-30 Perkin Elmer Corp Method of growing a single crystal film of a ferrimagnetic material
US3573207A (en) 1968-03-13 1971-03-30 Lignes Telegraph Telephon Microwave magnetic materials with a hexagonal structure
US3748605A (en) 1970-11-05 1973-07-24 Nat Res Dev Tunable microwave filters
US4186357A (en) 1977-03-18 1980-01-29 Societe Lignes Telegraphiques Et Telephoniques Non-reciprocal microwave phase shifters operating in a wide band on edge mode
US4189521A (en) 1977-07-05 1980-02-19 Rockwell International Corporation Epitaxial growth of M-type hexagonal ferrite films on spinel substrates and composite
US4188594A (en) 1978-01-17 1980-02-12 Rockwell International Corporation Fixed frequency filters using epitaxial ferrite films
US4459567A (en) 1982-06-14 1984-07-10 The United States Of America As Represented By The Secretary Of The Army Dielectric waveguide ferrite resonance isolator
US4689585A (en) 1984-12-19 1987-08-25 Martin Marietta Corporation Dielectric slab signal isolators
US4716389A (en) 1986-10-20 1987-12-29 Honeywell Inc. Millimeter wave microstrip surface mounted attenuator
US4806886A (en) 1988-03-01 1989-02-21 The United States Of America As Represented By The Secretary Of The Army Microstrip resonance isolator
US5327148A (en) 1993-02-17 1994-07-05 Northeastern University Ferrite microstrip antenna
US5642467A (en) * 1995-01-31 1997-06-24 The Penn State Research Foundation Controller for autonomous device
US6122572A (en) * 1995-05-08 2000-09-19 State Of Israel Autonomous command and control unit for mobile platform
US7069124B1 (en) * 2002-10-28 2006-06-27 Workhorse Technologies, Llc Robotic modeling of voids
US7482977B2 (en) 2004-03-26 2009-01-27 Sony Corporation Antenna apparatus
US20070112700A1 (en) * 2004-04-22 2007-05-17 Frontline Robotics Inc. Open control system architecture for mobile autonomous systems
US7844396B2 (en) * 2005-09-13 2010-11-30 Deere & Company Method and system for modular data processing for a vehicle control system
US7388550B2 (en) 2005-10-11 2008-06-17 Tdk Corporation PxM antenna with improved radiation characteristics over a broad frequency range
US20110015816A1 (en) * 2007-06-15 2011-01-20 Mountaintop Technologies, Inc. Aviation ground navigation system
US20140195095A1 (en) 2008-12-30 2014-07-10 Elbit Systems Ltd. Autonomous navigation system and method for a maneuverable platform
US8612085B2 (en) * 2008-12-30 2013-12-17 Elbit Systems Ltd. Autonomous navigation system and method for a maneuverable platform
US8878741B2 (en) 2009-01-16 2014-11-04 Northeastern University Tunable negative permeability based devices
US8380367B2 (en) * 2009-03-26 2013-02-19 The University Of North Dakota Adaptive surveillance and guidance system for vehicle collision avoidance and interception
CN101807746B (en) 2010-03-26 2013-06-12 西南交通大学 Radio-frequency identification antenna based on Z-type hexaferrite
CN101800107B (en) 2010-03-26 2012-05-09 西南交通大学 Anisotropic Z-type hexagonal ferrite and antenna using same
US20130140076A1 (en) 2010-05-10 2013-06-06 Korea Institute Of Machinery & Materials Waveband electromagnetic wave absorber and method for manufacturing same
US20130342414A1 (en) 2010-11-15 2013-12-26 Yang-Ki Hong Magnetic exchange coupled core-shell nanomagnets
US20150051783A1 (en) 2012-03-22 2015-02-19 Israel Aerospace Industries Ltd. Planning and monitoring of autonomous-mission
US20150255846A1 (en) 2012-09-27 2015-09-10 Northeastern University Magnetostatic Surface Wave Nonreciprocal Tunable Bandpass Filters
US20140176380A1 (en) 2012-12-21 2014-06-26 Samsung Electro-Mechanics Co., Ltd. Multilayer ferrite sheet, antenna device using the same, and manufacturing method thereof
US9527394B1 (en) * 2013-05-02 2016-12-27 Dershuen Allen Tang Transportation system of combined vehicles multi-coupled at highway speeds for electrical energy transfer and sharing
CN103647511A (en) 2013-12-03 2014-03-19 中国电子科技集团公司第四十一研究所 Broadband preselection mixer design method
US20150234387A1 (en) 2014-02-14 2015-08-20 Accenture Global Services Limited Unmanned vehicle (uv) control system
CA2881744A1 (en) 2014-02-14 2015-08-14 Accenture Global Services Limited Unmanned vehicle (uv) control system and uv movement and data control system
US20160314224A1 (en) * 2015-04-24 2016-10-27 Northrop Grumman Systems Corporation Autonomous vehicle simulation system

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
Blidberg et al., Guidance and Control Architecture for the EAVE Vehicle, 1986, IEEE, p. 449-461. *
Cunningham et al., MPDM: Multipolicy decision-making in dynamic, uncertain environments for autonomous driving, 2015, IEEE, p. 1670-1677. *
Dudziak et al., AI Programming Vs. Conventional Programming for Autonomous Vehicles Trade-Off Issue, 1985, IEEE, p. 284-296. *
Ferguson et al., A Reasoning Framework for Autonomous Urban Driving, 2008, IEEE, p. 775-780. *
International Search Report for corresponding PCT/US2016/051935, dated Nov. 30, 2016.

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10909866B2 (en) 2018-07-20 2021-02-02 Cybernet Systems Corp. Autonomous transportation system and methods
US10460208B1 (en) * 2019-01-02 2019-10-29 Cognata Ltd. System and method for generating large simulation data sets for testing an autonomous driver
US11100371B2 (en) 2019-01-02 2021-08-24 Cognata Ltd. System and method for generating large simulation data sets for testing an autonomous driver
US11694388B2 (en) 2019-01-02 2023-07-04 Cognata Ltd. System and method for generating large simulation data sets for testing an autonomous driver
US11345462B2 (en) * 2019-03-22 2022-05-31 Northrop Grumman Systems Corporation System control architecture monitoring system

Also Published As

Publication number Publication date
GB201804267D0 (en) 2018-05-02
DE112016004563T5 (en) 2018-07-12
US20170097640A1 (en) 2017-04-06
WO2017062151A1 (en) 2017-04-13
GB2559894B (en) 2021-08-04
GB2559894A (en) 2018-08-22

Similar Documents

Publication Publication Date Title
US10019005B2 (en) Autonomous vehicle control system
EP1936584B1 (en) A device at an airborne vehicle and a method for collision avoidance
US20170269594A1 (en) Controlling an Unmanned Aerial System
JP2021509096A (en) Autonomous unmanned aerial vehicle and its control method
JP6411012B2 (en) Unpredictable vehicle navigation
EP2555179B1 (en) Aircraft traffic separation system
EP2715471B1 (en) Method and system for steering an unmanned aerial vehicle
US10242578B2 (en) Flight path management system
JP2020520501A (en) System and method for detecting and avoiding objects external to an aircraft
US11014650B2 (en) Moving body, moving body control system, moving body control method, interface device, and recording medium having program recorded thereon
US9372053B2 (en) Autonomous weapon effects planning
US20230028792A1 (en) Machine learning architectures for camera-based detection and avoidance on aircrafts
Richardson et al. Automated vision‐based recovery of a rotary wing unmanned aerial vehicle onto a moving platform
US20160282864A1 (en) Unmanned Ground/Aerial Vehicle System Having Autonomous Ground Vehicle That Remotely Controls One or More Aerial Vehicles
JP2020529583A (en) Systems and methods for adjusting the range of lidar sensors on an aircraft
US20240062661A1 (en) Unmanned aerial vehicle contingency landing system
US20180165974A1 (en) Vehicle collision prevention
WO2017021955A1 (en) Constraints driven autonomous aircraft navigation
WO2024023835A1 (en) Self-learning command & control module for navigation (genisys) and system thereof
Wyatt The DARPA/air force unmanned combat air vehicle (UCAV) program
KR101842217B1 (en) Flight control device and method, and flight control system comprising the same
US12087171B2 (en) Assurance module
Haridas et al. Longitudinal guidance of unmanned aerial vehicle using integral sliding mode control
US10864986B2 (en) Aerial vehicle including autonomous rotor speed control
Bateman et al. Application of run-time assurance architecture to robust geofencing of suas

Legal Events

Date Code Title Description
AS Assignment

Owner name: NORTHROP GRUMMAN SYSTEMS CORPORATION, VIRGINIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:WANG, WALTER;WEI, JEROME H.;FEEST, BRADLEY;AND OTHERS;REEL/FRAME:039759/0229

Effective date: 20160914

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4