US20160075014A1 - Asynchronous Data Stream Framework - Google Patents

Asynchronous Data Stream Framework

Info

Publication number
US20160075014A1
US20160075014A1 (U.S. Application Ser. No. 14/941,199)
Authority
US
United States
Prior art keywords
behavior
behaviors
data
architecture
robotic
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/941,199
Inventor
David Bruemmer
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Humatics Corp
Original Assignee
5D Robotics Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 5D Robotics Inc filed Critical 5D Robotics Inc
Priority to US14/941,199 priority Critical patent/US20160075014A1/en
Publication of US20160075014A1 publication Critical patent/US20160075014A1/en
Assigned to HUMATICS CORPORATION reassignment HUMATICS CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: 5D ROBOTICS, INC.
Assigned to 5D ROBOTICS, INC. reassignment 5D ROBOTICS, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: BRUEMMER, DAVID J., HARDIN, BENJAMIN C., NIELSEN, CURTIS W.
Status: Abandoned

Classifications

    • B25J9/08 Programme-controlled manipulators characterised by modular constructions
    • B25J9/1602 Programme controls characterised by the control system, structure, architecture
    • B25J9/161 Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
    • B25J9/1633 Programme controls characterised by the control loop; compliant, force, torque control, e.g. combined with position control
    • B25J9/1666 Avoiding collision or forbidden zones
    • B25J9/1684 Tracking a line or surface by means of sensors
    • B25J9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697 Vision controlled systems
    • G01C21/34 Route searching; Route guidance
    • G06F18/256 Fusion techniques of classification results relating to different input data, e.g. multimodal recognition
    • G06F3/016 Input arrangements with force or tactile feedback as computer generated output to the user
    • G06T11/206 Drawing of charts or graphs
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G08G1/22 Platooning, i.e. convoy of communicating vehicles
    • Y10S901/09 Closed loop, sensor feedback controls arm movement
    • Y10S901/10 Sensor physically contacts and follows work contour

Definitions

  • Embodiments of the present invention relate, in general, to computer and software architecture and more particularly to systems and methods to facilitate robotic behaviors through an underlying framework of asynchronously updated data streams.
  • Robots have the potential to solve many problems in society by working in dangerous places or performing jobs that no one wants.
  • One barrier to their widespread deployment is that they are mainly limited to tasks where it is possible to hand-program behaviors for every situation they may encounter.
  • existing robots do not make behavioral decisions.
  • a child as it grows is subjected to numerous learning environments in which the behavioral outcome is supplied.
  • a parent or teacher informs the child how to respond based on their experience or societal norms.
  • Behavior based robotics attempts to provide a robotic system with similar cognitive abilities.
  • Subsumption architecture is a reactive robot architecture that decomposes complicated intelligent behavior into many “simple” behavior modules. Each of these behavior modules are in turn organized into layers wherein each layer implements a particular goal and wherein higher layers are increasingly abstract. Each layer's goal subsumes that of the underlying layers, e.g. the decision to move forward by the “eat-food layer” takes into account the decision of the lowest “obstacle-avoidance layer.” Thus subsumption architecture uses a bottom-up design. The general idea is that each behavior should function simultaneously but asynchronously with no dependence on the others. This independence theoretically reduces interference between behaviors and prevents over complexity.
  • each layer accesses all of the sensor data and generates actions for the actuators of the robot with the understanding that separate tasks can suppress inputs or inhibit outputs.
  • the lowest layers can act as fast-adapting mechanisms (e.g. reflexes), while the higher layers work to achieve the overall goal.
  • AuRA: Autonomous Robot Architecture
  • This architecture is a hybrid deliberative/reactive robot architecture. Actions that must be mediated by some symbolic representation of the environment are often called deliberative. In contrast, reactive strategies do not exhibit a steadfast reliance on internal models, but displace some of the role of representation onto the environment itself. Thus, reactive systems are characterized by a direct connection between sensors and effectors. Control is not mediated by this type of model but rather occurs as a low level pairing between stimulus and response.
  • Reactive strategies do not exhibit a steadfast reliance on internal models, but displace some of the role of representation onto the environment itself. And instead of responding to entities within a model, as with a deliberative model, the robot can respond directly to its perception of the real world. Reactive systems therefore exhibit a direct connection between sensors and effectors and are best applied in complex, real-world domains where uncertainty cannot be effectively modeled.
  • An architecture and associated methodology for asynchronous robotic behavior is described hereafter by way of example.
  • An architecture comprising a hardware layer, a data collection layer and an execution layer lays the foundation for a behavioral layer that can asynchronously access abstracted data.
  • a plurality of data sensors asynchronously collect data which is thereafter abstracted so as to be usable by one or more behavioral modules.
  • Each of the behaviors can be asynchronously executed as well as dynamically modified based on the collected abstracted data.
  • behaviors are associated with one of two or more hierarchical behavioral levels. Should a conflict arise between the execution of two or more behaviors, the behavioral outcome is determined by behaviors associated with a higher level arbitrating over those associated with lower levels.
  • the present invention rests in the architecture's ability to support modular behavioral modules.
  • the present invention provides an architecture in which data collected by a variety of sources is simultaneously available to a plurality of behaviors.
  • a behavior is a process by which a specific output is achieved.
  • a behavior may be to identify a target based on a thermal image or it may be to move the robot without hitting an obstacle.
  • the sensors that collect the data may be common.
  • data collected by, for example, a thermal sensor can be used by both behaviors.
  • the data is collected and processed to a form so that both behaviors can have equal asynchronous access.
  • the data remains available to other behavior modules that have yet to be connected to the robotic platform. In such a manner the sensors, processors and behaviors can all operate independently and efficiently.
  • the behavior engine of the present invention provides a safe, reliable platform with onboard intelligence that enables reactive, dynamic and deliberate behaviors.
  • recursive sensor maps can be customized to allow access to data abstractions at multiple levels.
  • Such abstracted data can be used by multiple behavior modules on multiple levels while still providing the ability to dynamically modify the behavioral outcomes based on collected data.
  • FIG. 1 shows an abstract block diagram depicting component layers of an asynchronous data streaming framework according to one embodiment of the present invention
  • FIG. 2 presents a high level block diagram for a framework for asynchronous data streaming according to one embodiment of the present invention
  • FIG. 3 is a high level block diagram showing general behavior abstraction according to one embodiment of the present invention.
  • FIG. 4 is a more detailed rendition of the behavior abstraction diagram of FIG. 3 with example behaviors and abstractions according to one embodiment of the present invention.
  • FIG. 5 is a high level block diagram of an example of behavior orchestration using one embodiment of asynchronous data streaming according to the present invention
  • FIG. 6 is a flowchart showing one method embodiment of the present invention for asynchronous data streaming.
  • the asynchronous architecture of the present invention provides an underlying framework for the application and utilization of asynchronously collected and updated data streams.
  • Asynchronously collected data is abstracted and thereafter provided to a plurality of behavior modules. These modules are combined to form behavior suites which can thereafter be asynchronously executed.
  • behavior suites operate in what is referred to as a behavior layer that operates on top of a core architecture.
  • the core architecture combines hardware, data collection and execution layers to form a foundation on which a plurality of distinct behaviors can operate.
  • the computer program instructions may also be loaded onto a computer or other programmable apparatus to cause a series of operational steps to be performed in the computer or on the other programmable apparatus to produce a computer implemented process such that the instructions that execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart block or blocks.
  • blocks of the flowchart illustrations support combinations of means for performing the specified functions and combinations of steps for performing the specified functions. It will also be understood that each block of the flowchart illustrations, and combinations of blocks in the flowchart illustrations, can be implemented by special purpose hardware-based computer systems that perform the specified functions or steps, or combinations of special purpose hardware and computer instructions.
  • any reference to “one embodiment” or “an embodiment” means that a particular element, feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment.
  • the appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment.
  • the terms “comprises,” “comprising,” “includes,” “including,” “has,” “having” or any other variation thereof, are intended to cover a non-exclusive inclusion.
  • a process, method, article, or apparatus that comprises a list of elements is not necessarily limited to only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus.
  • “or” refers to an inclusive or and not to an exclusive or. For example, a condition A or B is satisfied by any one of the following: A is true (or present) and B is false (or not present), A is false (or not present) and B is true (or present), and both A and B are true (or present).
  • the behavior-based approach of the present invention is applicable for use in a wide variety of areas, including military applications, mining, space exploration, agriculture, factory automation, service industries, waste management, health care, disaster intervention and the home. To better understand the novelty of the present invention and what behavior-based robotics is, it may be helpful to explain what it is not.
  • the behavior-based approach does not necessarily seek to produce cognition or a human-like thinking process. While these aims are admirable, they can be misleading. While it is natural for humans to model their own intelligence, humans are not aware of the myriad of internal processes that actually produce our intelligence.
  • a fundamental premise of the behavior-based approach is that sophisticated, high-level behavior can emerge from layered combinations of simple stimulus-response mappings. Instead of careful planning based on modeling, high-level behavior such as flocking or foraging can be built by blending low-level behaviors such as dispersion, aggregation, homing and wandering. Strategies can be built directly from behaviors, whereas plans must be based on an accurate model.
  • in the subsumption architecture of the prior art, behaviors are arranged into simple, asynchronous layers, each of which corresponds to a task. Higher layers equate to higher-level tasks, and each has the ability to override and interact with lower (child) layers. According to one embodiment of the present invention, by contrast, behaviors are synchronous and arranged in a tree structure, allowing multiple tasks to occupy the same level of priority. Higher behaviors arbitrate the actions requested by each sub-behavior rather than subsume the lower-level behaviors as has been practiced in the prior art.
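
To make the tree-versus-subsumption distinction concrete, the following minimal Python sketch shows a synchronous behavior tree in which a parent node arbitrates among the actions its same-level children request rather than subsuming them. All class names, fields, and the priority-based policy are illustrative assumptions, not the patent's implementation.

```python
from dataclasses import dataclass

@dataclass
class Action:
    heading: float   # desired heading in degrees
    speed: float     # desired speed in m/s
    priority: float  # how strongly this behavior wants its action taken

class Behavior:
    def propose(self) -> Action:
        raise NotImplementedError

class ArbitratingBehavior(Behavior):
    """A higher-level behavior that collects proposals from its same-level
    children and selects one, instead of overriding (subsuming) them."""
    def __init__(self, children):
        self.children = children

    def propose(self) -> Action:
        proposals = [c.propose() for c in self.children]  # synchronous tick
        # Simple arbitration policy: take the highest-priority proposal.
        return max(proposals, key=lambda a: a.priority)

class Wander(Behavior):
    def propose(self): return Action(heading=10.0, speed=0.3, priority=0.1)

class Avoid(Behavior):
    def propose(self): return Action(heading=90.0, speed=0.2, priority=0.9)

root = ArbitratingBehavior([Wander(), Avoid()])
print(root.propose())  # the Avoid proposal wins arbitration
```
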
  • the subsumption architecture of the prior art gives each behavior direct, synchronous access to sensors, which the behaviors use to make reactive decisions.
  • the architecture of the present invention also provides access to data collected from the sensors, but it is gathered asynchronously and transformed by a process to abstract the data into a more useful format.
  • the data from the sensors can be passed and handled by any of a plurality of behaviors and can be used with different levels of abstraction in different places.
  • Behaviors in the present architecture are not limited to be reactive; they may indeed be dynamic or deliberate.
  • the output of one or more behaviors can be the input to another behavior.
  • One or more embodiments of the present invention provide a synchronous behavior hierarchy (tree structure) wherein higher-level behaviors arbitrate outputs and requested actions from lower-level behaviors.
  • Behavior modules make decisions from abstracted sensor information that is collected asynchronously and abstracted based on the distinct needs of individual behaviors. The same sensor data can be abstracted differently for other distinct behaviors.
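
The sketch below illustrates this point: one asynchronously filled range buffer is abstracted two different ways for two different behaviors. The abstraction functions and threshold values are invented stand-ins for tiers such as the Filtered Range Data discussed with FIG. 4.

```python
raw_ranges = [2.1, 2.0, 9.9, 2.2, 2.1]   # latest laser sweep (metres)

def filtered_ranges(ranges, max_valid=8.0):
    # Abstraction for an obstacle-avoidance behavior: drop implausible returns.
    return [r for r in ranges if r <= max_valid]

def nearest_obstacle(ranges):
    # Abstraction for a guarded-motion behavior: a single scalar is enough.
    return min(filtered_ranges(ranges))

print(filtered_ranges(raw_ranges))   # used by one behavior
print(nearest_obstacle(raw_ranges))  # the same data, abstracted differently
```
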
  • FIG. 1 presents a high level block diagram for a framework for asynchronous data streaming according to one embodiment of the present invention.
  • Four (4) layers comprising a framework 100 for asynchronous data streaming include a hardware layer 110 , a data collection layer 130 , an execution layer 150 , and a behavioral layer 170 .
  • the hardware layer 110 is also associated with a plurality of sensors 115 .
  • the depiction shown in FIG. 1 is illustrative only. In other embodiments of the present invention, more or fewer layers providing additional or reduced functionality can be included. Indeed, the present labeling of the layers shown in FIG. 1 is arbitrary and merely illustrative.
  • In a subsumption architecture, complicated intelligent behavior is decomposed into many simple behavior modules that are in turn organized into layers. Each layer implements a particular goal of the agent, with each higher layer being increasingly abstract. Each layer's goal subsumes that of the underlying layers, thus the name of the architecture. Like the present invention, subsumption architecture is a bottom-up design.
  • each horizontal layer can synchronously access all of the sensor data and generate reactive actions for the actuators.
  • separate tasks can suppress (or overrule) inputs or inhibit outputs.
  • the lowest layers work like fast-adapting mechanisms while the higher layers work to achieve an overall goal.
  • the architecture of the present invention invokes the same bottom-up structure and each layer can access sensor data, but the data is gathered asynchronously.
  • data in the present invention is made more versatile by an abstraction process. By doing so, the data is placed into a format that can be utilized by a number of different behavior modules simultaneously.
  • Data can also be used with different levels of abstraction in a variety of different locations and, according to one embodiment of the present invention, behaviors associated with the architecture for asynchronous data streaming are not limited to being reactive but may also be dynamic and/or deliberate. Behaviors of the present invention are synchronous and arranged in a tree structure, allowing multiple tasks to occupy the same level of priority. Higher behaviors arbitrate the actions requested by each sub-behavior rather than subsume the lower-level behaviors.
  • a behavior layer 170 can be comprised of a plurality of different behavior modules, each of which can asynchronously access the data 130 collected by a plurality of sensors 115 associated with the plurality of hardware modules 110 .
  • FIG. 2 shows a behavior engine 280 asynchronously receiving inputs from a plurality of behavior modules 270 , a plurality of data modules 230 , and a plurality of hardware modules 210 .
  • the behavior engine 280 also incorporates input from an operator interface 240 .
  • the behavior engine 280 enables various behavior modules to arbitrate actions and tasks requested by any sub-behavior or behavior of an equal level.
  • Each hardware module 210 is coupled to a physical sensor or other input device and collects data asynchronously from other hardware modules.
  • a plurality of hardware modules and sensors interact to provide a web of data that can be asynchronously accessed by the behavior modules 270 .
  • the hardware modules not only are responsible for data collection but also report on whether a specific type of hardware exists and whether the data that is collected is valid. In essence, the hardware modules read data from the corresponding physical hardware sensor or some other type of input device and assess its validity.
  • These hardware devices can operate autonomously or be tied to a direct input from a system operator or another component.
  • the hardware modules can operate on a variety of communication mediums such as radio, Internet, USB, and the like. Each hardware module runs in parallel and asynchronously, thus allowing data to be collected at a speed that is optimized for the underlying hardware in the overall computing environment.
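
A minimal sketch of that arrangement, assuming hypothetical sensor drivers: each hardware module runs in its own thread, polls its device at a rate suited to the hardware, vets validity, and publishes into a shared store that behaviors read at their own pace. Names, rates, and the random stand-in readings are illustrative.

```python
import threading, time, random

class HardwareModule(threading.Thread):
    """One thread per physical sensor, collecting at the hardware's own rate."""
    def __init__(self, name, read_fn, period_s, store):
        super().__init__(daemon=True)
        self.name, self.read_fn = name, read_fn
        self.period_s, self.store = period_s, store

    def run(self):
        while True:
            value = self.read_fn()
            if value is not None:            # the module also vets validity
                self.store[self.name] = value
            time.sleep(self.period_s)        # rate tuned to the hardware

store = {}
HardwareModule("range", lambda: random.uniform(0.5, 10.0), 0.05, store).start()
HardwareModule("thermal", lambda: random.uniform(15, 40), 0.5, store).start()
time.sleep(1.0)
print(store)  # behavior modules read this store asynchronously
```
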
  • the data modules 230 gain data from each hardware module 210 as well as from other data modules 230 and prepare a high level abstraction of the data that can be later used by all of the behavior modules 270 .
  • Each data module 230 runs independently or asynchronously from each other unrelated data module.
  • data modules 230 are organized into a directed graph architecture in which individual modules can either push or pull data depending on the environment in which they are operating. Accordingly, each data module represents a specific abstraction of the data.
  • the directed data graph is formed by establishing higher-level data modules as an aggregation of a set of lower-level data modules.
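
As a sketch only, such a directed data graph could be expressed as nodes that each represent a specific abstraction, with a higher-level module pulling from the lower-level modules it aggregates. The concrete abstractions shown are invented examples, not the patent's modules.

```python
class DataModule:
    """A node in the directed data graph; pulls from its input modules."""
    def __init__(self, name, compute, inputs=()):
        self.name, self.compute, self.inputs = name, compute, inputs

    def pull(self):
        # Pull from lower-level modules, then apply this module's abstraction.
        return self.compute(*[m.pull() for m in self.inputs])

raw = DataModule("raw_ranges", lambda: [2.1, 9.9, 2.0])
filtered = DataModule("filtered", lambda r: [x for x in r if x < 8.0], (raw,))
nearest = DataModule("nearest", lambda f: min(f), (filtered,))
print(nearest.pull())  # 2.0: an aggregation of two lower-level modules
```
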
  • the abstraction of data is accomplished by data manipulation to achieve its proper and specified format. Abstraction processing is performed asynchronously with any other type of data or hardware module, thus removing any unnecessary reliance between or among unrelated modules.
  • Behavior modules of the present invention make informed decisions as to what actions take place using the data continuously shared by the data modules.
  • the behavior modules 270 leverage the information gained by the data modules 230 . Since the underlying data collection is asynchronous in nature, the speed at which the behavior modules execute is not constrained by the speed of data collection, abstraction, and interpretation.
  • behaviors are organized into a hierarchy, where additional inputs to each behavior module may be the action or set of actions suggested by lower-level behaviors. However, each behavior module is designed in such a way that it remains encapsulated and does not require higher-level behaviors to operate.
  • the behavior modules arbitrate amongst themselves and among sub-behaviors to determine which action should take place. While there is a tree hierarchy among the behaviors, that is, there are behaviors and sub-behaviors, the behaviors at the same level possess equal priority and thus arbitrate among themselves to determine a course of action.
  • Sensor modules (not shown) which are associated with the hardware module 210 shown in FIG. 2 , operate asynchronously to collect and push data to the data modules 230 . Once the data is received by the data modules 230 , it is abstracted so that it can be equally accessed and utilized by a plurality of behavior modules 270 .
  • the asynchronous architecture of the present invention decouples the collection of data from one or more behaviors seeking a particular type of data.
  • the embodiments of the present invention enable data to be gathered at an optimal rate based on sensor or hardware capability.
  • Each sensor/hardware module operates independently. Accordingly, each behavior can operate at its maximum frequency and capability since it is not slowed by any sensor processing or data delay.
  • the asynchronous architecture of the present invention enables behaviors to be incorporated, modified, or removed freely without requiring any changes to the processes and procedures necessary for the gathering of information, as each sensor or hardware module runs independent of the behavior modules.
  • Because the data collected by the hardware modules 210 is independent of the data modules 230 and the behavior modules 270 , the data can be stored in a shared data structure, making it unnecessary to maintain redundant copies of the data. And as each behavior module, data module, and hardware module operates independently, each can run on a different processing unit or computer based on its individual computational requirements. As one of ordinary skill in the relevant art will appreciate, the various embodiments of the present invention and the asynchronous architecture for data streaming described above enhance the flexibility and capability of autonomous robotic behavior.
  • FIG. 3 is a high level hierarchical depiction of the asynchronous data streaming architecture of the present invention.
  • Mission logic 370 is achieved after the arbitration of a plurality of behaviors 340 , 350 which asynchronously receive and process abstract data collected by a plurality of sensors 310 , 315 .
  • Each behavior 340 , 350 can equally access and utilize various tiers of abstract data 320 , 325 , 330 , 335 , which are collected by a plurality of hardware sensors 310 , 315 .
  • Each tier of abstract data 320 , 325 , 330 , 335 modifies/transforms the data so that it may be universally used by one or more behaviors 340 , 350 .
  • FIG. 4 is a depiction of an asynchronous data streaming architecture for achieving a mission logic 470 of “follow target.”
  • the architecture includes a plurality of sensors 210 , the data collected by each of the sensors 230 and a plurality of behavior modules 270 each of which has its individual capability and task.
  • two separate range sensors 410 , 415 provide data for the various behavior modules.
  • the data collected by the range sensors is abstracted by a plurality of methods so as to provide a variety of tiers of data that can be accessed by the various modules.
  • the first tier of data is Filtered Range Data 420 while the second tier of data is an Ego Centric Map 425 .
  • a third tier, an Occupancy Grid 430 , and a fourth, a Change Analysis Map 435 , are also provided.
  • Each of these tiers of abstracted data is formed from the raw data collected by the range sensors 410 , 415 .
  • the various tiers of data are equally accessible by each behavior module.
  • the Guarded Motion Module 440 accesses the Ego Centric Map tier 425 to make its determination or recommendation for action.
  • the Obstacle Avoidance module 442 seeks data from the Ego Centric Map data tier 425 .
  • the Follow Path module 444 and the Identify Laser Target module 446 access data from the Occupancy Grid data tier 430 and the Change Analysis Map data tier 435 , respectively.
  • the Plug-in Behavior modules do not access abstract data from the range sensors 410 , 415 at all.
  • the Visual Target behavior module 448 and the Thermal Target Identification module 450 operate independently of the data collected from the range sensors 410 , 415 .
  • each and every behavior module 440 , 442 , 444 , 446 , 448 and 450 can equally access each and every tier of abstract data 420 , 425 , 430 , 435 asynchronously.
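
Expressed as data, the FIG. 4 wiring might resemble the sketch below: each behavior module subscribes to whichever abstraction tiers it needs, and any module could subscribe to any tier. The mapping mirrors the figure's names but is otherwise an illustrative assumption.

```python
# Which abstraction tiers each behavior module subscribes to (illustrative).
subscriptions = {
    "GuardedMotion":       ["EgoCentricMap"],
    "ObstacleAvoidance":   ["EgoCentricMap"],
    "FollowPath":          ["OccupancyGrid"],
    "IdentifyLaserTarget": ["ChangeAnalysisMap"],
    "VisualTargetId":      [],  # plug-in behaviors that ignore range data
    "ThermalTargetId":     [],
}

def tiers_for(behavior):
    return subscriptions.get(behavior, [])

print(tiers_for("FollowPath"))  # ['OccupancyGrid']
```
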
  • each behavior module is not limited to access to only one data tier.
  • the Follow Path module 444 accesses data from the Occupancy Grid tier 430 . While the Follow Path module 444 only accesses a single abstract data tier, it could easily access other data tiers such as the Ego Centric Map 425 or the Filtered Range Data 420 .
  • while the Thermal Target module 450 and the Visual Target Identification module 448 do not access any of the data tiers, any behavior module can access any abstract data tier as necessary.
  • each behavior module is provided with data necessary to make its recommendation for an action determination.
  • Each is equally situated to provide input to the mission logic, which in this case is to follow a target.
  • the actions are arbitrated to arrive at the desired outcome.
  • FIG. 4 also shows an overlay of the model architecture shown in FIG. 2 to give the reader a better appreciation for how the various modules interact in arriving at the mission logic.
  • the range sensors 410 , 415 are two examples of hardware modules 210 .
  • the four tiers of data including Filtered Range data 420 , Ego Centric Map data 425 , Occupancy Grid data 430 , and Change Analysis Map data 435 are modules of collected data 230 .
  • the behaviors including Guarded Motion 440 , Obstacle Avoidance 442 , Follow Path 444 , Identify Laser Target 446 , Visual Target Identification 448 , and Thermal Target Identification 450 are all behavior modules 270 .
  • the behavior engine 280 arbitrates the various inputs or action recommendations generated by each behavior module.
  • the asynchronous architecture described herein enables a mission logic/output to be specified as a unique combination of a plurality of behaviors, sensors, and behavior parameters.
  • the behavior modules of the present invention are modular and operate independently, and mission logic or a mission objective can be identified and formed as a combination of these independent modules.
  • the present invention seeks and incorporates hardware/sensors, abstract data, and various mission behavior outcomes to arrive at the mission logic objective.
  • a behavior engine provides a means to optimize the mission behavior dynamically by changing input parameters to the various behavior modules so as to interpret and to respond to various collected data and user input.
  • Suites of behaviors can also be compiled as separate libraries that can each be customized and distributed to different platforms and/or utilized on different missions while maintaining the same core behavior architecture. Accordingly, the core architecture remains unchanged over various application layers/behaviors that can be continually, and dynamically customized, repackaged and distributed.
  • the asynchronous architecture of the present invention provides cross-application platform technology that is dynamically reconfigurable such that a behavior or outcome can be prioritized based on decision logic embedded in one or more of the behavior modules.
  • a top-level function or behavior module of the overall mission architecture provides a means by which to prioritize different behavior outcomes and, as described herein, serves as the basis to arbitrate between conflicting behavior directives.
  • each application-centric layer cake uses the same basic behavior modules (modular and reusable modules) but orchestrates these individual behavior modules differently using unique behavior composition and prioritization specifications.
  • a user through what is referred to herein as a user control unit or user interface can change behavior goals dynamically and thus reconfigure the mission logic to arrive at the desired optimization and outcome.
  • FIG. 5 presents a high-level interaction block diagram of an illustrative arbitration process by which a mission logic/outcome is comprised of a plurality of individual/modular behavior inputs.
  • behavior engine 510 possesses predetermined behavior outcomes or receives a mission logic outcome from a user via a user control unit 575 .
  • the user control unit 575 can be communicatively coupled to the behavior engine 510 via a radio link 570 .
  • the behavior engine 510 issues logic directives to a variety and plurality of behavior modules 520 , 530 , 540 , 550 , 560 to assemble behaviors based, in one embodiment, on the use of distance and heading cues.
  • a behavior engine 510 communicates directly to a Wander module 540 and a follow module 530 .
  • the behavior engine 510 also communicates to the Follow module 530 and a Way Point Traverse module 520 via a Shared Control module 515 .
  • the output of the Wander module 540 , the Follow module 530 , and the Way Point Traverse module 520 is supplied as input to an Avoid module 550 which in turn provides an output to a Guarded Motion module 560 .
  • the Follow module 530 and the Way Point Traverse module 520 each receive heading information from the behavior engine 510 . This heading information may, in this example, be a general direction which the user wishes the robotic device to traverse.
  • the Follow module 530 (also referred to herein as target follow or lead/follow) can function in tandem (parallel) with the Way Point Traverse module 520 (or other modules) to calculate a path to the target or to specific targets along a path.
  • the Way Point module 520 (also sometimes referred to herein as the path follow module) receives a heading input from the Shared Control module 515 and uses a control loop to direct a specific robot heading so as to maintain its position on a path.
  • These heading directives can either be waypoint-based (where there are individual intermittent waypoints) or track-based (where the path is a continuous curve or line).
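
A waypoint-based directive of this kind reduces to a simple control loop; the sketch below steers toward the next intermittent waypoint with a proportional heading correction. The geometry is standard; the gain and coordinates are illustrative.

```python
import math

def heading_to(pos, waypoint):
    # Bearing from the current position to the waypoint, in degrees.
    dx, dy = waypoint[0] - pos[0], waypoint[1] - pos[1]
    return math.degrees(math.atan2(dy, dx))

def steer(current_heading, desired_heading, gain=0.5):
    # Wrap the heading error into [-180, 180) and correct proportionally.
    error = (desired_heading - current_heading + 180) % 360 - 180
    return current_heading + gain * error

pos, heading = (0.0, 0.0), 90.0
print(steer(heading, heading_to(pos, (10.0, 0.0))))  # turns toward the waypoint
```
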
  • the arbitration between path following and target following is accomplished by the Behavior Engine behavior at the top level.
  • the Shared Control module 515 is a behavior element that modifies heading as an input to the Way Point Traverse module 520 and the Follow module 530 . This allows an external planner (user) to influence or assume control from the path-following or target-following behavior. The extent of the heading change and the decision to either assume or augment the heading input is calculated within the Shared Control module 515 .
  • the Wander module 540 provides no heading information but nonetheless provides the Avoid module 550 with a general plan for surveying a general area.
  • the Wander module 540 provides a means by which to explore as well as to sort out challenging situations in which the device has failed to achieve established goals or directives of the Behavior Engine 510 in terms of a specific traverse. For example, a failure of the device to traverse a desired path, to make progress toward a certain target, or to accept user input can be addressed by wandering to some degree.
  • the Behavior Engine 510 remains responsible for directing a “Wander” behavior as it may impact the object's activity.
  • Wandering provides an element of randomness or resourcefulness that can be useful in a cluttered or highly dynamic environment in which deterministic path following or directive headings are ineffective.
  • Inputs from the various and independent behavior modules are used by the Avoid module 550 to determine a suitable output.
  • the Avoid module 550 has certain predefined or user-input locations which the robot is to avoid.
  • the Avoid module reactively navigates around either static or dynamic obstacles that may not have been accounted for in other behavior elements. As described hereafter, these obstacles can be also identified using an active position tag.
  • the inputs of desired heading, distance, and speed are used and processed by the Avoid module 550 to provide a direction of traverse that meets both the mission objectives and the avoidance criteria.
  • the Guarded Motion module 560 ensures that there are no collisions. While the Behavior Engine 510 has little input into the Guarded Motion module 560 , it can turn it on or off as well as determine the acceptable distances used as safety buffers between the object (robot) and various other objects (e.g. people, vehicles, plants, walls, etc.). In this example, the Guarded Motion module 560 may also prevent the robotic device from exceeding a particular angular orientation on a hill so that it will not tip over, from exceeding a safe speed, or from any other motion that would jeopardize the success and/or wellbeing of the platform.
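
As a sketch of that role, guarded motion can be modeled as a filter that never authors actions but only constrains them, enforcing a safety buffer and a tilt limit. The thresholds are invented placeholders for the patent's configurable distances.

```python
def guarded_motion(cmd_speed, nearest_obstacle_m, tilt_deg,
                   buffer_m=1.0, max_tilt_deg=20.0):
    if tilt_deg > max_tilt_deg:
        return 0.0  # refuse motion that could tip the platform over
    if nearest_obstacle_m < buffer_m:
        return 0.0  # inside the safety buffer: stop
    # Slow down smoothly as the buffer is approached.
    scale = min(1.0, (nearest_obstacle_m - buffer_m) / buffer_m)
    return cmd_speed * scale

print(guarded_motion(1.5, nearest_obstacle_m=1.4, tilt_deg=5.0))  # 0.6 m/s
```
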
  • the Guarded Motion module thereafter produces an output sent back to the Behavior Engine 510 for comparison with the overall mission objective. If the mission objective and the output of the Guarded Motion module 560 are aligned the Behavior engine 510 provides commands to the drive mechanism 580 which correspondingly turns the wheels 590 or acts on a similar device.
  • the Behavior Engine 510 in combination with the asynchronous architecture of the present invention, enables modular behaviors to modify collective outcomes so as to meet an overall mission objective.
  • the Behavior Engine 510 uses heading, distance and a plurality of other parameters to arbitrate between different lower level behavior elements such as those produced by the Follow module 530 , Way Point Traverse module 520 , Wander module 540 , Shared Control Module 515 , Guarded Motion Module 560 and the Avoid module 550 .
  • the Behavior Engine 510 does this, in one embodiment, by modifying the heading and distance inputs that go into each behavior to arrive at an optimal result that still meets mission objectives.
  • the Behavior Engine 510 can also completely turn off an individual behavior element in which case, no execution of that behavior occurs.
  • the Behavior Engine 510 can also execute two or more behaviors in parallel and then, based on the outputted heading and distance, decide how to select between the competing outputs or, in a different embodiment, combine them in some meaningful way. It is also possible to assign specific prioritization to individual active positioning tags so that certain tags (e.g. a hazard tag or avoid tag) have a defined priority over other tags (e.g. a follow tag). In this manner the Behavior Engine 510 allows the orchestration of behaviors to be in part dependent on a mission-level behavior specification but also to use tag prioritization to influence behavior coordination.
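
A minimal sketch of tag-aware arbitration, assuming a simple priority table: behaviors run concurrently, each proposal carries the tag that motivated it, and a hazard or avoid tag outranks a follow tag. Names, priorities, and the action format are illustrative.

```python
# Illustrative priorities: hazard/avoid tags outrank follow tags.
TAG_PRIORITY = {"hazard": 3, "avoid": 2, "follow": 1}

def arbitrate(proposals):
    """proposals: (tag, action) pairs from behaviors run in parallel."""
    return max(proposals, key=lambda p: TAG_PRIORITY.get(p[0], 0))[1]

proposals = [("follow", {"heading": 45.0, "speed": 1.0}),
             ("hazard", {"heading": 120.0, "speed": 0.3})]
print(arbitrate(proposals))  # the hazard-tagged proposal wins
```
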
  • the Wander module 540 was the lowest-priority mission objective, and its output may be modified by the new mission objective.
  • the new directive to move from one point to another required the Follow module 530 and a Way Point Traverse module 520 to produce a specific heading, speed, and distance. It is possible that the same heading, speed, and distance matched that which had been output by the Wander module 540 . Alternatively, the heading, speed and distance would not conflict with the Wander module output. However, should there be a conflict the output of the Follow module 530 and a Way Point Traverse module 520 would override that of the Wander module 540 . Thus, the Follow module 530 and a Way Point Traverse module 520 would be viewed as being in a higher layer than the Wander module 540 .
  • Another layer higher than all previous modules would be the Avoid module 550 .
  • This module ensures that the robotic platform avoids certain predetermined or communicated positions or hazards as identified by onboard sensors. For example, perhaps an onboard sensor identifies a thermal hot spot (fire) that should be avoided or a wireless communication conveys the precise location of an explosive device. The Avoid module 550 would avoid these positions.
  • the Guarded Motion module 560 is yet another, and higher, layer than even the Avoid module 550 . This module may, for example, guard the platform from certain types of motion that would jeopardize its safety.
  • Each module described herein accesses abstract and asynchronously collected data simultaneously to determine the optimal outcome for that particular module.
  • the architecture of the present invention arbitrates the outcomes based on a hierarchical structure determined by the overall mission objective.
  • the behavior engine 510 thereafter compares the recommended course of action based on the arbitrated behaviors of one or more behavior modules to that of the overall mission objective to confirm that the recommended course of action is in compliance with the overall mission objective. Once the directed course of action is confirmed by the behavior engine, it is thereafter conveyed to various hardware and logic components to implement commands.
  • UWB: ultra-wide band
  • RF: radio frequency
  • ID: identification
  • tag systems comprise a reader with an antenna, a transmitter, and software such as a driver and middleware.
  • One function of the UWB RFID system is to retrieve state and positional information (ID) generated by each tag (also known as a transponder).
  • Tags are usually affixed to objects so that it becomes possible to locate where the goods are without a direct line-of-sight given the low frequency nature of their transmission.
  • a tag can include additional information other than the ID. For example, using triangulation of the tag's position and the identity of a tag, heading and distance to the tag's location can be ascertained.
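
Once a tag's position has been triangulated, heading and distance to it follow from plane geometry, as in this small sketch (all coordinates are illustrative):

```python
import math

def range_bearing(robot_xy, tag_xy):
    # Distance and bearing from the robot to the tag's triangulated position.
    dx, dy = tag_xy[0] - robot_xy[0], tag_xy[1] - robot_xy[1]
    return math.hypot(dx, dy), math.degrees(math.atan2(dy, dx))

dist, bearing = range_bearing((0.0, 0.0), (3.0, 4.0))
print(dist, bearing)  # 5.0 m away at a bearing of about 53.1 degrees
```
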
  • a single tag can also be used as a beacon for returning to a specific location or carried by an individual or vehicle to affect a follow behavior from other like equipped objects.
  • active ranging technology is equally applicable to the present invention and is contemplated in its use.
  • the use of the term “UWB”, “tags” or “RFID tags,” or the like, is merely exemplary and should not be viewed as limiting the scope of the present invention.
  • an RFID and/or UWB tag can not only be associated with a piece of stationary infrastructure with a known, precise position, but can also provide active relative positioning between movable objects. For example, even if two or more tags are unaware of their precise positions, they can provide accurate relative position.
  • the tag can be connected to a centralized tracking system to convey interaction data. As a mobile object interacts with a tag of a known position, the variances in the object's positional data can be refined.
  • a tag can convey not only relative position between objects but relative motion between objects as well. Such tags possess low detectability and are not limited to line of sight, nor are they vulnerable to jamming.
  • tags offer relative position accuracy of approximately +/−12 cm for each interactive object outfitted with a tag.
  • object is not intended to be limiting in any way. While the present invention is described by way of examples in which objects may be represented by vehicles or cellular telephones, an object is to be interpreted as an arbitrary entity that can implement the inventive concepts presented herein. For example, an object can be a robot, vehicle, aircraft, ship, bicycle, or other device or entity that moves in relation to another.
  • the collaboration and communication described herein can involve multiple modalities of communication across a plurality of mediums.
  • the active position tags of the present invention can also provide range and bearing information. Using triangulation and trilateration between tags, a route can be established using a series of virtual waypoints. Tags can also be used to attract other objects or repulse objects creating a buffer zone. For example, a person wearing a tag can create a 4-foot buffer zone which will result in objects not entering the zone to protect the individual. Similarly, a series of tags can be used to line a ditch or similar hazard to ensure that the object will not enter a certain region.
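
A buffer zone of the kind described can be sketched as a point-in-circle test against each tag's exclusion radius; the 1.2 m radius approximates the 4-foot personal zone in the text, and the tag positions are invented.

```python
import math

# (tag position, exclusion radius in metres); values are illustrative.
tags = [((5.0, 5.0), 1.2),
        ((8.0, 2.0), 1.2)]

def position_allowed(xy):
    # A candidate position is rejected inside any tag's buffer zone.
    return all(math.dist(xy, center) > radius for center, radius in tags)

print(position_allowed((5.5, 5.0)))  # False: inside a protected zone
print(position_allowed((0.0, 0.0)))  # True
```
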
  • a process by which to asynchronously provide data to a plurality of behavior modules begins with the collection of data 610 .
  • data is collected asynchronously by a plurality of hardware modules or sensors but is used independently by behaviors. Said differently, behavior execution is independent of data collection.
  • one or more behavior modules can be combined 670 to form what is referred to as a behavior suite. Behavior suites can then be treated either independently or in conjunction with other behaviors in the determination of a recommended action. And finally, the behavior suites are executed 680 to achieve the desired mission outcome.
  • the Behavior Engine of the present invention is a reusable behavior framework. Elements such as 3D mapping, position fusion calculations, guarded motion, obstacle avoidance, follow, path planning and path following represent behaviors which the behavior engine can implement to achieve mission objectives. These behaviors can also be combined. For example, guarded motion, obstacle avoidance, and reactive follow work together to form a primitive reactive behavior core. This core behavior can thereafter couple with reusable interfaces to achieve higher level of outcomes such as 3D mapping, path planning, hazards and path trajectories.
  • reactive behaviors can be used across any object or vehicle or application.
  • the connectivity between reactive behaviors and application layer representations can be universal even when specific applications algorithms differ.
  • an implementation of these concepts may be vastly different but the core concepts will remain the same.
  • the behavior engine manages basic handoffs between behaviors such as guarded motion, obstacle avoidance, follow, path planning and path following. While a steering algorithm may change (e.g. Ackermann vs. skid steer), as may a follow algorithm (bread-crumb follow vs. regular follow), the interplay between the behaviors remains the same.
  • the architecture of the present invention facilitates the development and implementation of modular robotic behaviors.
  • the same platform can be quickly modified using plug-and-play modules to provide new capabilities without having to modify the collection and preparation of sensor data. Modules can be interchanged between platforms with confidence that each platform will provide to the newly incorporated module the information it needs to accomplish its mission.
  • the core architecture of the present invention comprised of the hardware layer, the data collection layer and the execution layer enables behaviors associated with the behavior layer to operate independently and to be modified dynamically.
  • Asynchronously collected data is abstracted so as to be usable by a plurality of behavior modules simultaneously.
  • the use of a set of abstracted data by one behavior module is completely independent of the simultaneous and continuous use of the same data, albeit abstracted differently, by a different behavior module.
  • the architecture of the present invention also enables behavior modules to be dynamically modified responsive to collected data. Should data indicate that the current parameters of a particular behavior are not compatible with the overall mission objective based on the collected data, the parameters can be modified without modifying the data collection means or processing. Once the module has been modified it can again access the collected data to initiate a responsive action.
  • one or more embodiments of the present invention may be implemented in a computer system as a program of instructions executable by a machine.
  • the program of instructions may take the form of one or more program codes for collecting data asynchronously, abstracting the data, and accessing the abstracted data asynchronously by one or more behavior modules, and then code for executing the modules.
  • micro-controllers with memory such as electrically erasable programmable read-only memory (EEPROM)
  • Computer-readable media in which instructions may be embodied include, but are not limited to, non-volatile storage media in various forms (e.g., optical, magnetic or semiconductor storage media) and carrier waves that may be used to transfer such formatted data and/or instructions through wireless, optical, or wired signaling media or any combination thereof.
  • the invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof.
  • the particular naming and division of the modules, managers, functions, systems, engines, layers, features, attributes, methodologies, and other aspects are not mandatory or significant, and the mechanisms that implement the invention or its features may have different names, divisions, and/or formats.
  • the modules, managers, functions, systems, engines, layers, features, attributes, methodologies, and other aspects of the invention can be implemented as software, hardware, firmware, or any combination of the three.
  • a component of the present invention is implemented as software
  • the component can be implemented as a script, as a standalone program, as part of a larger program, as a plurality of separate scripts and/or programs, as a statically or dynamically linked library, as a kernel loadable module, as a device driver, and/or in every and any other way known now or in the future to those of skill in the art of computer programming.
  • the present invention is in no way limited to implementation in any specific programming language, or for any specific operating system or environment. Accordingly, the disclosure of the present invention is intended to be illustrative, but not limiting, of the scope of the invention, which is set forth in the following claims.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Data Mining & Analysis (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Human Computer Interaction (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Fuzzy Systems (AREA)
  • Software Systems (AREA)
  • Mathematical Physics (AREA)
  • Multimedia (AREA)
  • Manipulator (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
  • Traffic Control Systems (AREA)
  • Navigation (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Toys (AREA)
  • General Factory Administration (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

An architecture comprising a hardware layer, a data collection layer and an execution layer lays the foundation for a behavioral layer that can asynchronously access abstracted data. A plurality of data sensors asynchronously collect data which is thereafter abstracted so as to be usable by one or more behavioral modules simultaneously. Each of the behaviors can be asynchronously executed as well as dynamically modified based on the collected abstracted data. Moreover, the behavior modules themselves are structured in a hierarchical manner among one or more layers such that the outputs of a behavior module associated with a lower layer may be the input to a behavior module of a higher layer. Conflicts between outputs of behavior modules are arbitrated and analyzed so as to conform with an overall mission objective.

Description

    RELATED APPLICATION
  • The present application is a continuation of and claims the benefit of priority to U.S. Non-Provisional patent application Ser. No. 13/597,791, which in turn claims priority to U.S. Provisional Application 61/529,206 filed Aug. 30, 2011, both of which are hereby incorporated by reference in their entirety for all purposes as if fully set forth herein. The present application is further related to the following commonly assigned patent applications: U.S. patent application Ser. No. 13/597,911 entitled “Vehicle Management System”, U.S. patent application Ser. No. 13/597,991 entitled “Modular Robotic Manipulation”, U.S. patent application Ser. No. 13/598,021 entitled “Graphical Rendition of Multi-Modal Data”, and U.S. patent application Ser. No. 13/598,114 entitled “Universal Payload Abstraction”, all of which were filed on Aug. 29, 2012.
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • Embodiments of the present invention relate, in general, to computer and software architecture and more particularly to systems and methods to facilitate robotic behaviors through an underlying framework of asynchronously updated data streams.
  • 2. Relevant Background
  • Robots have the potential to solve many problems in society by working in dangerous places or performing jobs that no one wants. One barrier to their widespread deployment is that they are mainly limited to tasks where it is possible to hand-program behaviors for every situation they may encounter. Simply stated, existing robots do not make behavioral decisions. For example, a child as it grows is subjected to numerous learning environments in which the behavioral outcome is supplied. For example, a parent or teacher informs the child how to respond based on their experience or societal norms. Eventually, and as the child matures, it applies these prior experiences to make new behavioral decisions on entirely new environmental situations. Behavior based robotics attempts to provide a robotic system with similar cognitive abilities.
  • It has been a long established goal and challenge to artificially model human intelligence. One early approach to solving this task is to first build a model of the environment and then explore solutions abstractly before enacting strategies in the real world. This approach places emphasis on symbolic representation and while to a human designer such an approach makes conceptual sense, to a robot which has little or no autonomy, it has little applicability.
  • One architecture associated with behavior based robotics is known as Subsumption. Subsumption architecture is a reactive robot architecture that decomposes complicated intelligent behavior into many “simple” behavior modules. Each of these behavior modules is in turn organized into layers wherein each layer implements a particular goal and wherein higher layers are increasingly abstract. Each layer's goal subsumes that of the underlying layers, e.g. the decision to move forward by the “eat-food layer” takes into account the decision of the lowest “obstacle-avoidance layer.” Thus subsumption architecture uses a bottom-up design. The general idea is that each behavior should function simultaneously but asynchronously with no dependence on the others. This independence theoretically reduces interference between behaviors and prevents excessive complexity.
  • In such an approach, each layer accesses all of the sensor data and generates actions for the actuators of the robot with the understanding that separate tasks can suppress inputs or inhibit outputs. By doing so the lowest layers can act as fast-adapting mechanisms (e.g., reflexes), while the higher layers work to achieve the overall goal.
  • Another robotic architecture known in the prior art is the Autonomous Robot Architecture (AuRA), a hybrid deliberative/reactive robot architecture. Actions that must be mediated by some symbolic representation of the environment are often called deliberative. In contrast, reactive strategies do not exhibit a steadfast reliance on internal models, but displace some of the role of representation onto the environment itself. Thus, reactive systems are characterized by a direct connection between sensors and effectors. Control is not mediated by an internal model but rather occurs as a low-level pairing between stimulus and response.
  • Structured tasks with predictable outcomes are best suited for a deliberative approach while environmentally dependent tasks are better suited to the reactive model. Instead of responding to entities within a model, as with a deliberative approach, a reactive robot can respond directly to its perception of the real world. Reactive systems therefore exhibit a direct connection between sensors and effectors and are best applied in complex, real-world domains where uncertainty cannot be effectively modeled.
  • Some behavior-based strategies use no explicit model of the environment but for more complicated domains it is necessary to find an appropriate balance between reactive and deliberative control.
  • Increasingly, researchers have abandoned the quest for high-level cognition and instead begun to model lower animal activity. Biology serves not only as inspiration for underlying methodologies, but also for actual robot hardware and sensors. For example, a simple household fly navigates using a compound eye comprised of 3,000 facets which operate in parallel to monitor visual motion. In response, an artificial robot eye has been constructed with 100 facets that can provide a 360-degree panoramic view. In another example, artificial bees have been crafted to simulate the dance patterns and sounds of real bees sufficiently well to actually communicate with other bees.
  • As conceived by one artificial intelligence researcher, “cognition is a chimera contrived by an observer who is necessarily biased by his/her own perspective on the environment. Cognition, as it is a subjective fabrication by an observer, cannot be measured or modeled scientifically.” Accordingly, the development of an artificial intelligence capable of cognition remains a challenge. While it remains debatable whether a machine can truly become cognitive, it is generally agreed that a bottom up behavioral approach lays the foundation for all artificial intelligence and a basis for future research. Therefore, a challenge remains to develop and implement a bottom up behavioral architecture that can be both deliberate and reactive in response to a multitude of sensory inputs. These and other challenges of the prior art are addressed by one or more embodiments of the present invention.
  • SUMMARY OF THE INVENTION
  • An architecture and associated methodology for asynchronous robotic behavior is described hereafter by way of example. An architecture comprising a hardware layer, a data collection layer and an execution layer lays the foundation for a behavioral layer that can asynchronously access abstracted data. A plurality of data sensors asynchronously collect data which is thereafter abstracted so as to be usable by one or more behavioral modules. Each of the behaviors can be asynchronously executed as well as dynamically modified based on the collected abstracted data.
  • According to one embodiment, behaviors are associated with one of two or more hierarchal behavioral levels. Should a conflict arise between the execution of two or more behaviors, the behavioral outcome is determined by behaviors associated with a higher level arbitrating over those associated with lower levels.
  • Another feature of the present invention rests in the architecture's ability to support modular behavioral modules. Unlike robotic architectures of the prior art, the present invention provides an architecture in which data collected by a variety of sources is simultaneously available to a plurality of behaviors. For the purpose of the present invention, a behavior is a process by which a specific output is achieved. A behavior may be to identify a target based on a thermal image or it may be to move the robot without hitting an obstacle. Clearly each of these tasks or behaviors requires different data; however, the sensors that collect the data may be common. Using the embodiments of the present invention, data collected by, for example, a thermal sensor can be used by both behaviors. The data is collected and processed to a form so that both behaviors can have equal asynchronous access, as in the illustrative sketch below. Moreover, the data remains available to other behavior modules that have yet to be connected to the robotic platform. In such a manner the operation of the sensors, processors and behaviors can all operate independently and efficiently.
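  • The following minimal sketch is illustrative only; the data shape, thresholds, and function names are assumptions made for this description and are not part of the claimed architecture. It shows a single thermal abstraction being read independently by a target-identification behavior and a motion behavior:

```python
# Hypothetical abstraction produced once from a single thermal sensor.
thermal_abstraction = {"max_temp_c": 37.2, "hot_pixels": [(4, 7), (5, 7)]}

def identify_target(data):
    # Target-identification behavior: looks for a thermal signature.
    return "target" if data["max_temp_c"] > 35.0 else "no target"

def avoid_heat(data):
    # Motion behavior: treats the same hot pixels as cells to avoid.
    return {"blocked_cells": data["hot_pixels"]}

# Both behaviors consume the same abstracted data independently.
print(identify_target(thermal_abstraction))
print(avoid_heat(thermal_abstraction))
```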
  • The behavior engine of the present invention provides a safe, reliable platform with onboard intelligence that enables reactive, dynamic and deliberate behaviors. Using an independent asynchronous data collection process, recursive sensor maps can be customized to allow access to data abstractions at multiple levels. Such abstracted data can be used by multiple behavior modules on multiple levels while still providing the ability to dynamically modify the behavioral outcomes based on collected data.
  • The features and advantages described in this disclosure and in the following detailed description are not all-inclusive. Many additional features and advantages will be apparent to one of ordinary skill in the relevant art in view of the drawings, specification, and claims hereof. Moreover, it should be noted that the language used in the specification has been principally selected for readability and instructional purposes and may not have been selected to delineate or circumscribe the inventive subject matter; reference to the claims is necessary to determine such inventive subject matter.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The aforementioned and other features and objects of the present invention and the manner of attaining them will become more apparent, and the invention itself will be best understood, by reference to the following description of one or more embodiments taken in conjunction with the accompanying drawings, wherein:
  • FIG. 1 shows an abstract block diagram depicting component layers of an asynchronous data streaming framework according to one embodiment of the present invention;
  • FIG. 2 presents a high level block diagram for a framework for asynchronous data streaming according to one embodiment of the present invention;
  • FIG. 3 is a high level block diagram showing general behavior abstraction according to one embodiment of the present invention;
  • FIG. 4 is a more detailed rendition of the behavior abstraction diagram of FIG. 3 with example behaviors and abstractions according to one embodiment of the present invention;
  • FIG. 5 is a high level block diagram of an example of behavior orchestration using one embodiment of asynchronous data streaming according to the present invention;
  • FIG. 6 is a flowchart showing one method embodiment of the present invention for asynchronous data streaming according to one embodiment of the present invention.
  • The Figures depict embodiments of the present invention for purposes of illustration only. One skilled in the art will readily recognize from the following discussion that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles of the invention described herein.
  • DESCRIPTION OF THE INVENTION
  • An architecture for asynchronous robotic behavior is hereafter disclosed by way of example. The asynchronous architecture of the present invention provides an underlying framework for the application and utilization of asynchronously collected and updated data streams. Asynchronously collected data is abstracted and thereafter provided to a plurality of behavior modules. These modules are combined to form behavior suites which can thereafter be asynchronously executed. These behavior suites operate in what is referred to as a behavior layer that operates on top of a core architecture. The core architecture combines hardware, data collection and execution layers to form a foundation on which a plurality of distinct behaviors can operate.
  • Embodiments of the present invention are hereafter described in detail with reference to the accompanying Figures. Although the invention has been described and illustrated with a certain degree of particularity, it is understood that the present disclosure has been made only by way of example and that numerous changes in the combination and arrangement of parts can be resorted to by those skilled in the art without departing from the spirit and scope of the invention.
  • The following description with reference to the accompanying drawings is provided to assist in a comprehensive understanding of exemplary embodiments of the present invention as defined by the claims and their equivalents. It includes various specific details to assist in that understanding but these are to be regarded as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the invention. Also, descriptions of well-known functions and constructions are omitted for clarity and conciseness.
  • Included in the description are flowcharts depicting examples of the methodology which may be used to asynchronously execute robotic behavior. In the following description, it will be understood that each block of the flowchart illustrations, and combinations of blocks in the flowchart illustrations, can be implemented by computer program instructions. These computer program instructions may be loaded onto a computer or other programmable apparatus to produce a machine such that the instructions that execute on the computer or other programmable apparatus create means for implementing the functions specified in the flowchart block or blocks. These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable apparatus to function in a particular manner such that the instructions stored in the computer-readable memory produce an article of manufacture, including instruction means that implement the function specified in the flowchart block or blocks. The computer program instructions may also be loaded onto a computer or other programmable apparatus to cause a series of operational steps to be performed in the computer or on the other programmable apparatus to produce a computer implemented process such that the instructions that execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart block or blocks.
  • Accordingly, blocks of the flowchart illustrations support combinations of means for performing the specified functions and combinations of steps for performing the specified functions. It will also be understood that each block of the flowchart illustrations, and combinations of blocks in the flowchart illustrations, can be implemented by special purpose hardware-based computer systems that perform the specified functions or steps, or combinations of special purpose hardware and computer instructions.
  • The terms and words used in the following description and claims are not limited to the bibliographical meanings, but, are merely used by the inventor to enable a clear and consistent understanding of the invention. Accordingly, it should be apparent to those skilled in the art that the following description of exemplary embodiments of the present invention are provided for illustration purposes only and not for the purpose of limiting the invention as defined by the appended claims and their equivalents.
  • Unless specifically stated otherwise, discussions herein using words such as “processing,” “computing,” “calculating,” “determining,” “presenting,” “displaying,” or the like may refer to actions or processes of a machine (e.g., a computer) that manipulates or transforms data represented as physical (e.g., electronic, magnetic, or optical) quantities within one or more memories (e.g., volatile memory, non-volatile memory, or a combination thereof), registers, or other machine components that receive, store, transmit, or display information. By the term “substantially,” it is meant that the recited characteristic, parameter, or value need not be achieved exactly, but that deviations or variations, including for example, tolerances, measurement error, measurement accuracy limitations and other factors known to those of skill in the art, may occur in amounts that do not preclude the effect the characteristic was intended to provide.
  • As used herein any reference to “one embodiment” or “an embodiment” means that a particular element, feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. The appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment.
  • As used herein, the terms “comprises,” “comprising,” “includes,” “including,” “has,” “having” or any other variation thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, article, or apparatus that comprises a list of elements is not necessarily limited to only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Further, unless expressly stated to the contrary, “or” refers to an inclusive or and not to an exclusive or. For example, a condition A or B is satisfied by any one of the following: A is true (or present) and B is false (or not present), A is false (or not present) and B is true (or present), and both A and B are true (or present).
  • In addition, use of the “a” or “an” are employed to describe elements and components of the embodiments herein. This is done merely for convenience and to give a general sense of the invention. This description should be read to include one or at least one and the singular also includes the plural unless it is obvious that it is meant otherwise.
  • Upon reading this disclosure, those of skill in the art will appreciate still additional alternative structural and functional designs for a system and a process for asynchronous robotic behavior through the disclosed principles herein. Thus, while particular embodiments and applications have been illustrated and described, it is to be understood that the disclosed embodiments are not limited to the precise construction and components disclosed herein. Various modifications, changes and variations, which will be apparent to those skilled in the art, may be made in the arrangement, operation and details of the method and apparatus disclosed herein without departing from the spirit and scope defined in the appended claims.
  • The behavior-based approach of the present invention is applicable for use in a wide variety of areas, including military applications, mining, space exploration, agriculture, factory automation, service industries, waste management, health care, disaster intervention and the home. To better understand the novelty of the present invention and what behavior-based robotics is, it may be helpful to explain what it is not. The behavior-based approach does not necessarily seek to produce cognition or a human-like thinking process. While these aims are admirable, they can be misleading. While it is natural for humans to model their own intelligence, humans are not aware of the myriad of internal processes that actually produce our intelligence.
  • A fundamental premise of the behavior-based approach is that sophisticated, high-level behavior can emerge from layered combinations of simple stimulus-response mappings. Instead of careful planning based on modeling, high-level behavior such as flocking or foraging can be built by blending low-level behaviors such as dispersion, aggregation, homing and wandering. Strategies can be built directly from behaviors, whereas plans must be based on an accurate model.
  • In the architectural style according to one or more embodiments of the present invention, behaviors are arranged into simple, asynchronous layers, each of which corresponds to a task. Higher layers equate to higher-level tasks, and each has the ability to override and interact with child (lower) layers. According to one embodiment of the present invention, behaviors are synchronous and arranged in a tree structure, allowing multiple tasks to occupy the same level or priority. Higher behaviors arbitrate the actions requested by each sub-behavior rather than subsume the lower-level behaviors as has been practiced in the prior art.
  • The subsumption architecture of the prior art gives each behavior direct, synchronous access to sensors, which the behaviors use to make reactive decisions. The architecture of the present invention also provides access to data collected from the sensors, but it is gathered asynchronously and transformed by a process to abstract the data into a more useful format. Moreover, in the present solution, the data from the sensors can be passed to and handled by any of a plurality of behaviors and can be used with different levels of abstraction in different places. Behaviors in the present architecture are not limited to being reactive; they may indeed be dynamic or deliberate. Furthermore, the output of one or more behaviors can be the input to another behavior.
  • One or more embodiments of the present invention provide a synchronous behavior hierarchy (tree structure) wherein higher-level behaviors arbitrate outputs and requested actions from lower-level behaviors, as sketched below. Behavior modules make decisions from abstracted sensor information that is collected asynchronously and abstracted based on the distinct needs of individual behaviors. The same sensor data can be abstracted differently for other distinct behaviors.
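  • A minimal sketch of such an arbitrating hierarchy appears below. All class, field, and threshold names are hypothetical; the patent does not prescribe a programming interface. Sibling behaviors return weighted action requests and the parent selects among them rather than subsuming them:

```python
from dataclasses import dataclass

@dataclass
class Action:
    heading: float  # desired heading, degrees
    speed: float    # desired speed, m/s
    weight: float   # urgency used during arbitration

class Behavior:
    """One node in the behavior tree. Children occupy the same level,
    possess equal priority, and are arbitrated by their parent."""
    def __init__(self, name, children=()):
        self.name = name
        self.children = list(children)

    def suggest(self, world):
        return None  # leaf behaviors override this

    def decide(self, world):
        candidates = [child.decide(world) for child in self.children]
        own = self.suggest(world)
        if own is not None:
            candidates.append(own)
        # Arbitrate rather than subsume: keep the most urgent request.
        return max(candidates, key=lambda a: a.weight)

class Wander(Behavior):
    def suggest(self, world):
        return Action(heading=world.get("free_heading", 0.0),
                      speed=0.5, weight=0.1)

class ObstacleAvoid(Behavior):
    def suggest(self, world):
        if world.get("obstacle_range", 99.0) < 1.0:
            return Action(heading=world["escape_heading"],
                          speed=0.2, weight=0.9)
        return Action(heading=0.0, speed=0.0, weight=0.0)

root = Behavior("mission", children=[Wander("wander"), ObstacleAvoid("avoid")])
print(root.decide({"obstacle_range": 0.6, "escape_heading": 90.0}))
```

  • Here the avoidance request wins only because it carries more weight; the wander suggestion is still computed and remains available to the arbiter.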
  • FIG. 1 shows an abstract block diagram depicting the component layers of an asynchronous data streaming framework according to one embodiment of the present invention. Four layers comprising a framework 100 for asynchronous data streaming include a hardware layer 110, a data collection layer 130, an execution layer 150, and a behavioral layer 170. As shown, the hardware layer 110 is also associated with a plurality of sensors 115. As one of reasonable skill in the relevant art will appreciate, the depiction shown in FIG. 1 is illustrative only. In other embodiments of the present invention, more or fewer layers providing additional or less functionality can be included. Indeed, the labeling of the layers shown in FIG. 1 is arbitrary and merely illustrative.
  • In a subsumption architecture, complicated intelligent behavior is decomposed into many simple behavior modules that are in turn organized into layers. Each layer implements a particular goal of the agent with each higher layer being increasingly abstract. Each layer's goal subsumes that of the underlying layers, thus the name of the architecture. Like the present invention, subsumption architecture is a bottom up design.
  • In a subsumption architecture, each horizontal layer can synchronously access all of the sensor data and generate reactive actions for the actuators. However, separate tasks can suppress (or overrule) inputs or inhibit outputs. Thus, the lowest layers work like fast-adapting mechanisms while the higher layers work to achieve an overall goal.
  • The architecture of the present invention invokes the same bottom-up structure and each layer can access sensor data, but the data is gathered asynchronously. Moreover, data in the present invention is made more versatile by an abstraction process. By doing so, the data is placed into a format that can be utilized by a number of different behavior modules simultaneously. Data can also be used with different levels of abstraction in a variety of different locations and, according to one embodiment of the present invention, behaviors associated with the architecture for asynchronous data streaming are not limited to being reactive but can also be dynamic and/or deliberate. Behaviors of the present invention are synchronous and arranged in a tree structure, allowing multiple tasks to occupy the same level or priority. Higher behaviors arbitrate the actions requested by each sub-behavior rather than subsume the lower-level behaviors.
  • Turning back to the architecture for asynchronous data streaming shown in FIG. 1, a behavior layer 170 can be comprised of a plurality of different behavior modules, each of which can asynchronously access the data 130 collected by the plurality of sensors 115 associated with the plurality of hardware modules 110.
  • To better understand the organization of the present invention and of the architecture for asynchronous data streaming, consider the high-level block diagram shown in FIG. 2. FIG. 2 shows a behavior engine 280 asynchronously receiving inputs from a plurality of behavior modules 270, a plurality of data modules 230, and a plurality of hardware modules 210. According to another embodiment of the present invention, the behavior engine 280 also incorporates input from an operator interface 240. The behavior engine 280 enables various behavior modules to arbitrate actions and tasks requested by any sub-behavior or behavior of an equal level.
  • Each hardware module 210 is coupled to a physical sensor or other input device and collects data asynchronously from other hardware modules. Thus, a plurality of hardware modules and sensors interact to provide a web of data that can be asynchronously accessed by the behavior modules 270. Moreover, the hardware modules are not only responsible for data collection but also report on whether a specific type of hardware exists and whether the data that is collected is valid. In essence, the hardware modules read data from the corresponding physical hardware sensor or some other type of input device and assess its validity. These hardware devices can operate autonomously or be tied to a direct input from a system operator or another component. According to another embodiment of the present invention the hardware modules can operate on a variety of communication mediums such as radio, Internet, USB, and the like. Each hardware module runs in parallel and asynchronously, thus allowing data to be collected at a speed that is optimized for the underlying hardware in the overall computing environment.
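  • The following sketch suggests one way such hardware modules might be realized, here with one thread per module; the names, rates, and store layout are assumptions for illustration, not the patent's implementation:

```python
import random
import threading
import time

class HardwareModule(threading.Thread):
    """Polls one physical sensor at its own rate and publishes the latest
    reading, with a validity flag, into a shared store."""
    def __init__(self, name, store, period_s, read_fn):
        super().__init__(daemon=True)
        self.name, self.store = name, store
        self.period_s, self.read_fn = period_s, read_fn

    def run(self):
        while True:
            value = self.read_fn()
            self.store[self.name] = {
                "value": value,
                "valid": value is not None,  # the module assesses validity itself
                "stamp": time.time(),
            }
            time.sleep(self.period_s)        # each module runs at its own rate

shared = {}
# Stand-ins for a rangefinder at 20 Hz and a thermal sensor at 2 Hz.
HardwareModule("range", shared, 0.05, lambda: random.uniform(0.2, 5.0)).start()
HardwareModule("thermal", shared, 0.5, lambda: random.uniform(15.0, 40.0)).start()
time.sleep(1.0)
print(shared)  # behaviors read this at any time without waiting on a sensor
```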
  • The data modules 230 gain data from each hardware module 210 as well as from other data modules 230 and prepare a high level abstraction of the data that can be later used by all of the behavior modules 270. Each data module 230 runs independently and asynchronously from each unrelated data module. According to one embodiment of the present invention, data modules 230 are organized into a directed graph architecture in which individual modules can either push or pull data depending on the environment in which they are operating. Accordingly, each data module represents a specific abstraction of the data. The directed data graph is formed by establishing higher-level data modules as an aggregation of a set of lower-level data modules. The abstraction of data is accomplished by data manipulation to achieve its proper and specified format. Abstraction processing is performed asynchronously with any other type of data or hardware module, thus removing any unnecessary dependence between or among unrelated modules.
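  • One possible rendering of such a directed data graph is sketched below. The tier names echo the FIG. 4 example, but the pull-based implementation is hypothetical:

```python
# Data modules arranged as a directed graph: a higher-level abstraction
# pulls from lower-level modules on demand and applies its transform.
class DataModule:
    def __init__(self, name, inputs=(), transform=lambda *xs: xs):
        self.name = name
        self.inputs = list(inputs)      # lower-level data modules
        self.transform = transform      # abstraction applied to upstream data

    def pull(self):
        upstream = [m.pull() for m in self.inputs]
        return self.transform(*upstream)

raw = DataModule("raw_range", transform=lambda: [3.0, 2.5, 0.4, 4.1])
filtered = DataModule("filtered_range", [raw],
                      transform=lambda r: [d for d in r if d > 0.5])
ego_map = DataModule("ego_centric_map", [filtered],
                     transform=lambda r: {i: d for i, d in enumerate(r)})
print(ego_map.pull())   # {0: 3.0, 1: 2.5, 2: 4.1}
```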
  • Behavior modules of the present invention make informed decisions as to what actions take place using the data continuously shared by the data modules. Thus the behavior modules 270 leverage the information gained by the data modules 230. Since the underlying data collection is asynchronous in nature, the speed at which the behavior modules execute is not constrained by the speed of data collection, abstraction, and interpretation. According to one embodiment of the present invention, behaviors are organized into a hierarchy, where additional inputs to each behavior module may be the action or set of actions suggested by lower-level behaviors. However, each behavior module is designed in such a way that it remains encapsulated and does not require higher-level behaviors to operate.
  • The behavior modules arbitrate amongst themselves and among sub-behaviors to determine which action should take place. While there is a tree hierarchy among the behaviors, that is, there are behaviors and sub-behaviors, the behaviors at the same level possess equal priority and thus arbitrate among themselves to determine a course of action.
  • Sensor modules (not shown) which are associated with the hardware module 210 shown in FIG. 2, operate asynchronously to collect and push data to the data modules 230. Once the data is received by the data modules 230, it is abstracted so that it can be equally accessed and utilized by a plurality of behavior modules 270.
  • The asynchronous architecture of the present invention decouples the collection of data from one or more behaviors seeking a particular type of data. The embodiments of the present invention enable data to be gathered at an optimal rate based on sensor or hardware capability. Each sensor/hardware module operates independently. Accordingly, each behavior can operate at its maximum frequency and capability since it is not slowed by any sensor processing or data delay. And unlike the architectures of the prior art, the asynchronous architecture of the present invention enables behaviors to be incorporated, modified, or removed freely without requiring any changes to the processes and procedures necessary for the gathering of information, as each sensor or hardware module runs independent of the behavior modules. Though the data collected by the hardware modules 210 is independent of the data modules 230 and the behavior modules 270, the data can be stored in a shared data structure, making it unnecessary to maintain redundant copies of the data. And as each behavior module, data module, and hardware module operates independently, each can run on different processing units or computers based on its individual computational requirements. As one of ordinary skill in the relevant art will appreciate, the various embodiments of the present invention and the asynchronous architecture for data streaming described above enhance the flexibility and capability of autonomous robotic behavior.
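  • As a hedged illustration of this decoupling (hypothetical names and thresholds), a behavior loop can run at its own frequency and read only the latest shared abstraction, so it is never throttled by sensor processing:

```python
import time

def obstacle_avoidance_loop(store, hz=50.0, cycles=3):
    period = 1.0 / hz
    for _ in range(cycles):
        reading = store.get("range")           # latest shared data, no blocking
        if reading and reading["valid"] and reading["value"] < 1.0:
            command = {"speed": 0.0}           # obstacle close: stop
        else:
            command = {"speed": 0.8}           # path clear: proceed
        print(command)                         # stand-in for the execution layer
        time.sleep(period)                     # the behavior's rate, not the sensor's

# A stand-in for the shared structure the hardware modules populate.
obstacle_avoidance_loop({"range": {"value": 0.6, "valid": True}}, hz=50.0)
```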
  • FIG. 3 is a high-level hierarchical depiction of the asynchronous data streaming architecture of the present invention. Mission logic 370 is achieved after the arbitration of a plurality of behaviors 340, 350 which asynchronously receive and process abstract data collected by a plurality of sensors 310, 315. Each behavior 340, 350 can equally access and utilize various tiers of abstract data 320, 325, 330, 335, which are collected by a plurality of hardware sensors 310, 315. Each tier of abstract data 320, 325, 330, 335 modifies/transforms the data so that it may be universally used by one or more behaviors 340, 350.
  • For illustration purposes, consider the hierarchical structure shown in FIG. 4. FIG. 4 is a depiction of an asynchronous data streaming architecture for achieving a mission logic 470 of “follow target.” As can be seen, the architecture includes a plurality of sensors 210, the data collected by each of the sensors 230, and a plurality of behavior modules 270, each of which has its individual capability and task. In this example, two separate range sensors 410, 415 provide data for the various behavior modules. The data collected by the range sensors is abstracted by a plurality of methods so as to provide a variety of tiers of data that can be accessed by the various modules. For example, the first tier of data is Filtered Range Data 420 while the second tier of data is an Ego Centric Map 425. In addition, an Occupancy Grid 430 and a Change Analysis Map 435 are also provided. Each of these tiers of abstracted data is formed from the raw data collected by the range sensors 410, 415.
  • The various tiers of data, also referred to herein as data modules or data collection modules 230, are equally accessible by each behavior module. In this example the Guarded Motion Module 440 accesses the Ego Centric Map tier 425 to make its determination or recommendation for action. Likewise, the Obstacle Avoidance module 442 seeks data from the Ego Centric Map data tier 425. Similarly, the Follow Path module 444 and the Identify Laser Target module 446 access data from the Occupancy Grid data tier 430 and the Change Analysis Map data tier 435, respectively. The Plug-in Behavior modules do not access abstract data from the range sensors 410, 415 at all. The Visual Target behavior module 448 and the Thermal Target Identification module 450 operate independently of the data collected from the range sensors 410, 415.
  • One of reasonable skill in the relevant art will appreciate that each and every behavior module 440, 442, 444, 446, 448 and 450 can equally access each and every tier of abstract data 420, 425, 430, 435 asynchronously. Moreover, each behavior module is not limited to access to only one data tier. For example, the Follow Path module 444 accesses data from the Occupancy Grid tier 430. While the Follow Path module 444 only accesses a single abstract data tier, it could easily access other data tiers such as the Ego Centric Map 425 or the Filtered Range Data 420. Just as two modules, the Thermal Target Module 450 and the Visual Target Identification module 448, do not access any of the data tiers, any behavior module can access any abstract data tier as necessary.
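  • The wiring just described might be recorded in a simple subscription table, as in the illustrative sketch below; the structure is hypothetical and only restates the FIG. 4 relationships:

```python
# Illustrative wiring of the FIG. 4 example: several behaviors share the
# same data tier, and plug-in behaviors may consume no range-derived tiers.
subscriptions = {
    "guarded_motion":        ["ego_centric_map"],
    "obstacle_avoidance":    ["ego_centric_map"],       # same tier, shared
    "follow_path":           ["occupancy_grid"],
    "identify_laser_target": ["change_analysis_map"],
    "visual_target":         [],                        # plug-in behavior
    "thermal_target":        [],                        # plug-in behavior
}
for behavior, tiers in subscriptions.items():
    print(behavior, "reads", tiers if tiers else "no range-derived tiers")
```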
  • The depiction in FIG. 4 shows that each behavior module is provided with data necessary to make its recommendation for an action determination. Each is equally situated to provide input to the mission logic, which in this case is to follow a target. In cases in which the action of one or more modules conflict, the actions are arbitrated to arrive at the desired outcome.
  • FIG. 4 also shows an overlay of the model architecture shown in FIG. 2 to give the reader a better appreciation for how the various modules interact in arriving at the mission logic. In this example, the range sensors 410, 415 are two examples of hardware modules 210. The four tiers of data including Filtered Range data 420, Ego Centric Map data 425, Occupancy Grid data 430, and Change Analysis Map data 435 are modules of collected data 230. Lastly, the behaviors including Guarded Motion 440, Obstacle Avoidance 442, Follow Path 444, Identify Laser Target 446, Target Visual Identification 448, and Thermal Target Identification 450 are all behavior modules 270. And while not shown, the behavior engine 280 arbitrates the various inputs or action recommendations generated by each behavior module.
  • Another aspect of the present invention is its unique ability to specify and reconfigure behavior orchestration. According to one embodiment of the present invention, the asynchronous architecture described herein enables a mission logic/output to be specified as a unique combination of a plurality of behaviors, sensors, and behavior parameters. As the behavior modules of the present invention are modular and operate independently, a mission logic or mission objective can be identified and formed as a combination of these independent modules. The present invention then seeks and incorporates hardware/sensors, abstract data, and various mission behavior outcomes to arrive at the mission logic objective. In this manner, a behavior engine provides a means to optimize the mission behavior dynamically by changing input parameters to the various behavior modules so as to interpret and respond to various collected data and user input.
  • Suites of behaviors can also be compiled as separate libraries that can each be customized and distributed to different platforms and/or utilized on different missions while maintaining the same core behavior architecture. Accordingly, the core architecture remains unchanged over various application layers/behaviors that can be continually and dynamically customized, repackaged and distributed. The asynchronous architecture of the present invention provides cross-application platform technology that is dynamically reconfigurable such that a behavior or outcome can be prioritized based on decision logic embedded in one or more of the behavior modules. A top-level function or behavior module of the overall mission architecture provides a means by which to prioritize different behavior outcomes and, as described herein, serves as the basis to arbitrate between conflicting behavior directives. In essence, each application-centric layer cake uses the same basic behavior modules (modular and reusable modules) but orchestrates these individual behavior modules differently using unique behavior composition and prioritization specifications. By doing so, a user, through what is referred to herein as a user control unit or user interface, can change behavior goals dynamically and thus reconfigure the mission logic to arrive at the desired optimization and outcome.
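  • A behavior composition and prioritization specification of this kind might, purely as a sketch, look like the following; the format, field names, and parameter values are assumptions for illustration:

```python
# Hypothetical mission specification: the same reusable behavior modules
# are orchestrated differently per mission by a spec the engine consumes.
follow_target_mission = {
    "objective": "follow target",
    "behaviors": {
        "guarded_motion":     {"priority": 4, "params": {"buffer_m": 0.5}},
        "obstacle_avoidance": {"priority": 3, "params": {"lookahead_m": 2.0}},
        "follow":             {"priority": 2, "params": {"standoff_m": 3.0}},
        "wander":             {"priority": 1, "params": {"area": "survey_zone"}},
    },
}

def reconfigure(spec, behavior, **params):
    """Dynamically retune one behavior without touching data collection."""
    spec["behaviors"][behavior]["params"].update(params)

reconfigure(follow_target_mission, "follow", standoff_m=5.0)
print(follow_target_mission["behaviors"]["follow"])
```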
  • FIG. 5 presents a high-level interaction block diagram of an illustrative arbitration process by which a mission logic/outcome is comprised of a plurality of individual/modular behavior inputs. As shown, behavior engine 510 possesses predetermined behavior outcomes or receives a mission logic outcome from a user via a user control unit 575. As illustrated in FIG. 5, the user control unit 575 can be communicatively coupled to the behavior engine 510 via a radio link 570. The behavior engine 510 issues logic directives to a variety and plurality of behavior modules 520, 530, 540, 550, 560 to assemble behaviors based, in one embodiment, on the use of distance and heading cues. By using standard inputs such as heading and distance the behavior engine can provide overall behavior orchestration. In the example depicted in FIG. 5 a behavior engine 510 communicates directly to a Wander module 540 and a Follow module 530. The behavior engine 510 also communicates to the Follow module 530 and a Way Point Traverse module 520 via a Shared Control module 515. The output of the Wander module 540, the Follow module 530, and the Way Point Traverse module 520 is supplied as input to an Avoid module 550 which in turn provides an output to a Guarded Motion module 560. The Follow module 530 and the Way Point Traverse module 520 each receive heading information from the behavior engine 510. This heading information may, in this example, be a general direction which the user wishes the robotic device to traverse.
  • Using the basic input of heading, the Follow module 530 and the Way Point Traverse module 520 provide the Avoid module 550 an initial heading, desired speed and distance. The Follow module 530 (also referred to herein as target follow or lead/follow) can function in tandem (parallel) with the Way Point Traverse module 520 (or other modules) to calculate a path to the target or to specific targets along a path. The Way Point module 520 (also sometimes referred to herein as the path follow module) receives a heading input from the Shared Control module 515 and uses a control loop to direct a specific robot heading so as to maintain its position on a path. These heading directives can either be waypoint-based (where there are individual intermittent waypoints) or track-based (where the path is a continuous curve or line). The arbitration between path following and target following is accomplished by the Behavior Engine at the top level. The Shared Control module 515 is a behavior element that modifies heading as an input to the Way Point Traverse module 520 and the Follow module 530. This allows an external planner (user) to influence or assume control from the path following or target following behavior. The extent of the heading change and the decision to either assume or augment the heading input is calculated within the Shared Control module 515.
  • The Wander module 540 provides no heading information but nonetheless provides the Avoid module 550 with a general plan for surveying a general area. The Wander module 540 provides a means by which to explore as well as sort out challenging situations in which the device has failed to achieve established goals or directives of the Behavior Engine 510 in terms of a specific traverse. For example, a failure of the device to traverse a desired path, to make progress toward a certain target, or to accept user input can be addressed by wandering to some degree. The Behavior Engine 510 remains responsible for directing a “Wander” behavior as it may impact the object's activity. Wandering provides an element of randomness or resourcefulness that can be useful in a cluttered or highly dynamic environment in which deterministic path following or directive headings are ineffective.
  • Inputs from the various and independent behavior modules are used by the Avoid module 550 to determine a suitable output. Presumably, the Avoid module 550 has certain predefined or user-input locations which the robot is to avoid. Essentially, the Avoid module reactively navigates around either static or dynamic obstacles that may not have been accounted for in other behavior elements. As described hereafter, these obstacles can also be identified using an active position tag. Thus the inputs of desired heading, distance, and speed are used and processed by the Avoid module 550 to provide a direction of traverse that both meets mission objectives and the avoidance criteria.
  • Finally, the output of the Avoid module 550 is provided as input to a Guarded Motion module 560. The Guarded Motion module 560 ensures that there are no collisions. While the Behavior Engine 510 has little input into the Guarded Motion module 560, it can turn it on or off as well as determine the acceptable distances used as safety buffers between the object (robot) and various other objects (e.g. people, vehicles, plants, walls, etc.). In this example, the Guarded Motion module 560 may also prevent the robotic device from exceeding a particular angular orientation on a hill so that it will not tip over, from speeding, or from any other motion that would jeopardize the success and/or wellbeing of the platform. The Guarded Motion module thereafter produces an output sent back to the Behavior Engine 510 for comparison with the overall mission objective. If the mission objective and the output of the Guarded Motion module 560 are aligned, the Behavior Engine 510 provides commands to the drive mechanism 580 which correspondingly turns the wheels 590 or acts on a similar device.
  • The Behavior Engine 510, in combination with the asynchronous architecture of the present invention, enables modular behaviors to modify collective outcomes so as to meet an overall mission objective. The Behavior Engine 510 uses heading, distance and a plurality of other parameters to arbitrate between different lower level behavior elements such as those produced by the Follow module 530, Way Point Traverse module 520, Wander module 540, Shared Control module 515, Guarded Motion module 560 and the Avoid module 550. The Behavior Engine 510 does this, in one embodiment, by modifying the heading and distance inputs that go into each behavior to arrive at an optimal result that still meets mission objectives. Alternatively, the Behavior Engine 510 can also completely turn off an individual behavior element, in which case no execution of that behavior occurs. Moreover, the Behavior Engine 510 can also execute two or more behaviors in parallel and then, based on the outputted heading and distance, decide how to select between the competing outputs or, in a different embodiment, combine them in some meaningful way. It is also possible to assign specific prioritization to individual active positioning tags so that certain tags (e.g. a hazard tag or avoid tag) have a defined priority over other tags (e.g. a follow tag). In this manner the Behavior Engine 510 allows the orchestration of behaviors to be in part dependent on a mission level behavior specification but also to use tag prioritization to influence behavior coordination.
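  • The flow of FIG. 5 can be suggested in miniature as follows. The functions, headings, and margins are hypothetical stand-ins; the point is only the ordering, in which lower-level suggestions pass through Avoid and then Guarded Motion before a drive command issues:

```python
def follow(target_heading):        # deliberate: head toward the target
    return {"heading": target_heading, "speed": 1.0}

def wander():                      # no heading preference, low speed
    return {"heading": None, "speed": 0.4}

def avoid(suggestion, hazard_headings, margin=15.0):
    # Steer away when the suggested heading falls near a known hazard.
    h = suggestion["heading"]
    if h is not None and any(abs(h - hz) < margin for hz in hazard_headings):
        return {"heading": h + 2 * margin, "speed": suggestion["speed"]}
    return suggestion

def guarded_motion(suggestion, min_clearance, clearance):
    if clearance < min_clearance:  # never permit a collision
        return {"heading": suggestion["heading"], "speed": 0.0}
    return suggestion

cmd = guarded_motion(avoid(follow(90.0), hazard_headings=[85.0]),
                     min_clearance=0.3, clearance=1.2)
print(cmd)  # heading shifted away from the hazard, speed preserved
```

  • The engine could equally zero out the Wander contribution or blend competing headings; the ordering, not the arithmetic, is the point of the sketch.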
  • Assume for example, and as illustrated above, that an operator communicated to the particular platform a desired outcome or goal to reposition a robotic device from one point to another. Prior to that input the platform had possessed a general directive to scan and survey a particular geographic region using the Wander module 540. Thus, the new directive dynamically provided new direction or goals that would modify the implementation of the currently operational behavior modules. In this case the Wander module 540 had the lowest priority or mission objective, and its output may be modified by the new mission objective.
  • The new directive to move from one point to another required the Follow module 530 and the Way Point Traverse module 520 to produce a specific heading, speed, and distance. It is possible that the same heading, speed, and distance matched that which had been output by the Wander module 540. Alternatively, the heading, speed and distance might simply not conflict with the Wander module output. However, should there be a conflict, the output of the Follow module 530 and the Way Point Traverse module 520 would override that of the Wander module 540. Thus, the Follow module 530 and the Way Point Traverse module 520 would be viewed as being in a higher layer than the Wander module 540.
  • Another layer higher than all previous modules would be the Avoid module 550. This module ensures that the robotic platform avoids certain predetermined or communicated positions or hazards as identified by onboard sensors. For example, perhaps an onboard sensor identifies a thermal hot spot (fire) that should be avoided or a wireless communication conveys the precise location of an explosive device. The Avoid module 550 would avoid these positions. And lastly, the Guarded Motion module 560 is yet another, and higher, layer than even the Avoid module 550. This module may, for example, guard the platform from certain types of motion that would jeopardize its safety. Each module described herein accesses abstract and asynchronously collected data simultaneously to determine the optimal outcome for that particular module.
  • Once each module has determined its optimal outcome based on required input data, that being either supplied by another behavior module or data that has been asynchronously collected by one or more sensors, the architecture of the present invention arbitrates the outcomes based on a hierarchical structure determined by the overall mission objective. The behavior engine 510 thereafter compares the recommended course of action based on the arbitrated behaviors of one or more behavior modules to that of the overall mission objective to confirm that the recommended course of action is in compliance with the overall mission objective. Once the directed course of action is confirmed by the behavior engine, it is thereafter conveyed to various hardware and logic components to implement commands.
  • One of reasonable skill in the relevant art will recognize that the example presented above and depicted in FIG. 5 is only one possible combination of a variety and plurality of modular behavior modules that can be implemented according to a hierarchical architecture using asynchronously collected data. Another aspect of the present invention is the use of one or more active position ultra wide band (UWB) transceivers or tags. These ultra wide band (UWB) radio frequency (RF) identification (ID) tag systems (collectively RFID) comprise a reader with an antenna, a transmitter, and software such as a driver and middleware. One function of the UWB RFID system is to retrieve state and positional information (ID) generated by each tag (also known as a transponder). Tags are usually affixed to objects so that it becomes possible to locate where the goods are without a direct line of sight, given the low frequency nature of their transmission. A tag can include additional information other than the ID. For example, using triangulation of the tag's position and the identity of a tag, heading and distance to the tag's location can be ascertained. A single tag can also be used as a beacon for returning to a specific location or carried by an individual or vehicle to effect a follow behavior from other like-equipped objects. As will be appreciated by one of reasonable skill in the relevant art, other active ranging technology is equally applicable to the present invention and is contemplated in its use. The use of the term “UWB”, “tags” or “RFID tags,” or the like, is merely exemplary and should not be viewed as limiting the scope of the present invention.
  • In one implementation of the present invention, a RFID and/or UWB tag can not only be associated with a piece of stationary infrastructure with a known, precise position, but can also provide active relative positioning between movable objects. For example, even if two or more tags are unaware of their precise positions, they can provide accurate relative positioning. Moreover, the tag can be connected to a centralized tracking system to convey interaction data. As a mobile object interacts with a tag of known position, the variances in the object's positional data can be refined. Likewise, a tag can convey not only relative position between objects but relative motion between objects as well. Such tags possess low detectability and are not limited to line of sight, nor are they vulnerable to jamming. And, depending on how they are mounted and the terrain in which they are implemented, a tag and tracking system can permit user/tag interaction anywhere from a 200-foot to a 2-mile radius of accurate positioning. Currently, tags offer relative position accuracy of approximately +/−12 cm for each interactive object outfitted with a tag. As will be appreciated by one of reasonable skill in the relevant art, the use of the term object is not intended to be limiting in any way. While the present invention is described by way of examples in which objects may be represented by vehicles or cellular telephones, an object is to be interpreted as an arbitrary entity that can implement the inventive concepts presented herein. For example, an object can be a robot, vehicle, aircraft, ship, bicycle, or other device or entity that moves in relation to another. The collaboration and communication described herein can involve multiple modalities of communication across a plurality of mediums.
  • The active position tags of the present invention can also provide range and bearing information. Using triangulation and trilateration between tags, a route can be established using a series of virtual waypoints. Tags can also be used to attract other objects or to repulse objects, creating a buffer zone. For example, a person wearing a tag can create a 4-foot buffer zone, which will result in objects not entering the zone, to protect the individual. Similarly, a series of tags can be used to line a ditch or similar hazard to ensure that the object will not enter a certain region.
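  • To make the tag-ranging idea concrete, the following sketch solves the standard two-dimensional trilateration problem from three tags of known position; this is textbook geometry offered for illustration and is not code from the present invention:

```python
import math

def trilaterate(p1, r1, p2, r2, p3, r3):
    """Solve for (x, y) given three tag positions (x_i, y_i) and ranges r_i."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # Linearize by subtracting the first circle equation from the others.
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1          # nonzero when the tags are not collinear
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    return x, y

# True position (3, 4); ranges computed from three fixed tags.
pos = trilaterate((0, 0), 5.0,
                  (10, 0), math.hypot(10 - 3, 4),
                  (0, 10), math.hypot(3, 10 - 4))
print(pos)  # approximately (3.0, 4.0)
```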
  • The process by which asynchronously collected data can be used, in one example, to determine robotic behavior in accordance with an overall mission logic is further illustrated by the flowchart of FIG. 6. According to one embodiment of the present invention, a process by which to asynchronously provide data to a plurality of behavior modules begins with the collection of data 610. As has been previously described, data is collected asynchronously by a plurality of hardware modules or sensors but is used independently by behaviors. Said differently, behavior execution is independent of data collection. As the data is collected it is abstracted 620 and provided for asynchronous access by a plurality of behavior modules 640. According to one embodiment of the present invention, one or more behavior modules can be combined 670 to form what is referred to as a behavior suite. Behavior suites can then be treated either independently or in conjunction with other behaviors in the determination of a recommended action. And finally, the behavior suites are executed 680 to achieve the desired mission outcome.
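  • The FIG. 6 flow can be summarized in a compact, purely illustrative sketch; step numbers follow the figure, while every function body is a hypothetical stand-in:

```python
def collect():                                   # 610: asynchronous in practice
    return {"range": [3.0, 0.4, 2.2], "thermal": [21.0, 35.5]}

def abstract(raw):                               # 620: abstract the collected data
    return {"nearest_obstacle": min(raw["range"]),
            "hot_spot": max(raw["thermal"]) > 30.0}

def guarded(data):                               # 640: behaviors read abstractions
    return {"speed": 0.0} if data["nearest_obstacle"] < 0.5 else {"speed": 1.0}

def avoid_heat(data):
    return {"heading_offset": 45.0} if data["hot_spot"] else {"heading_offset": 0.0}

def suite(*behaviors):                           # 670: combine into a behavior suite
    def run(data):
        merged = {}
        for b in behaviors:
            merged.update(b(data))
        return merged
    return run

execute = suite(guarded, avoid_heat)             # 680: execute the suite
print(execute(abstract(collect())))              # {'speed': 0.0, 'heading_offset': 45.0}
```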
  • The Behavior Engine of the present invention is a reusable behavior framework. Elements such as 3D mapping, position fusion calculations, guarded motion, obstacle avoidance, follow, path planning and path following represent behaviors which the behavior engine can implement to achieve mission objectives. These behaviors can also be combined. For example, guarded motion, obstacle avoidance, and reactive follow work together to form a primitive reactive behavior core. This core behavior can thereafter couple with reusable interfaces to achieve higher-level outcomes such as 3D mapping, path planning, hazards and path trajectories.
  • In such a manner reactive behaviors can be used across any object, vehicle or application. Moreover, the connectivity between reactive behaviors and application layer representations can be universal even when specific application algorithms differ. For example, implementations of these concepts may be vastly different but the core concepts will remain the same. The behavior engine manages basic handoffs between behaviors such as guarded motion, obstacle avoidance, follow, path planning and path following. While a steering algorithm may change (e.g. Ackermann vs. skid steer), as may a follow algorithm (breadcrumb follow vs. regular follow), the interplay between the behaviors remains the same.
  • The architecture of the present invention facilitates the development and implementation of modular robotic behaviors. By providing an underlying framework of asynchronously updated data streams that can be universally accessed by one or more behavior modules, the capabilities of a wide variety of robotic platforms can be modified and tailored to meet specific mission needs. Moreover, the same platform can be quickly modified using plug-and-play modules to provide new capabilities without having to modify the collection and preparation of sensor data. Modules can be interchanged between platforms with confidence that each platform will provide to the newly incorporated module the information it needs to accomplish its mission.
  • The core architecture of the present invention, comprised of the hardware layer, the data collection layer and the execution layer, enables behaviors associated with the behavior layer to operate independently and to be modified dynamically. Asynchronously collected data is abstracted so as to be useable by a plurality of behavior modules simultaneously. Moreover, the use of a set of abstracted data by one behavior module is completely independent of the simultaneous and continuous use of the same data, albeit abstracted differently, by a different behavior module.
  • The architecture of the present invention also enables behavior modules to be dynamically modified responsive to collected data. Should data indicate that the current parameters of a particular behavior are not compatible with the overall mission objective based on the collected data, the parameters can be modified without modifying the data collection means or processing. Once the module has been modified it can again access the collected data to initiate a responsive action.
  • In situations in which two or more actions developed by behaviors of the same hierarchal level conflict, a resolution is reached as to which behavior should be applied based on desired higher level behavioral outcomes. The architecture and associated methods of the present invention provide a significant advance over the prior art and address many of the challenges in robotic behavioral control.
  • As will be appreciated by one skilled in the relevant art, one preferred means of implementing the present invention is in a computer system as software, hardware or a combination thereof. These implementation methodologies are known within the art and the specifics of their application within the context of the present invention will be readily apparent to one of ordinary skill in the relevant art in light of this specification and as described below.
  • For example, one or more embodiments of the present invention may be implemented in a computer system as a program of instructions executable by a machine. In such an implementation, the program of instructions may take the form of one or more program codes for collecting data asynchronously, abstracting the data, accessing the abstracted data asynchronously by one or more behavior modules, and then code for executing the modules.
  • Some other possibilities for implementing aspects of the systems and methods include micro-controllers with memory (such as electronically erasable programmable read-only memory (EEPROM)), embedded microprocessors, firmware, software, etc. Computer-readable media in which instructions may be embodied include, but are not limited to, non-volatile storage media in various forms (e.g., optical, magnetic or semiconductor storage media) and carrier waves that may be used to transfer such formatted data and/or instructions through wireless, optical, or wired signaling media or any combination thereof.
  • As will be understood by those familiar with the art, the invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof. Likewise, the particular naming and division of the modules, managers, functions, systems, engines, layers, features, attributes, methodologies, and other aspects are not mandatory or significant, and the mechanisms that implement the invention or its features may have different names, divisions, and/or formats. Furthermore, as will be apparent to one of ordinary skill in the relevant art, the modules, managers, functions, systems, engines, layers, features, attributes, methodologies, and other aspects of the invention can be implemented as software, hardware, firmware, or any combination of the three. Of course, wherever a component of the present invention is implemented as software, the component can be implemented as a script, as a standalone program, as part of a larger program, as a plurality of separate scripts and/or programs, as a statically or dynamically linked library, as a kernel loadable module, as a device driver, and/or in every and any other way known now or in the future to those of skill in the art of computer programming. Additionally, the present invention is in no way limited to implementation in any specific programming language, or for any specific operating system or environment. Accordingly, the disclosure of the present invention is intended to be illustrative, but not limiting, of the scope of the invention, which is set forth in the following claims.

Claims (19)

We claim:
1. An architecture for asynchronous robotic behavior, comprising:
a hardware layer generating sensor data from a plurality of sensors wherein the plurality of sensors includes a plurality of ultra wide band (UWB) transceivers and wherein each of the UWB transceivers generates positional information;
a data collection layer wherein the data collection layer asynchronously collects sensor data from the plurality of sensors and thereafter abstracts the collected sensor data, making the abstracted collected sensor data simultaneously available to each of a plurality of behaviors;
a behavior layer including the plurality of behaviors wherein the behavior layer dynamically prioritizes each of the plurality of behaviors based, in part, on positional information; and
an execution layer wherein the execution layer asynchronously and in parallel executes one or more of the plurality of behaviors responsive to behavior level prioritization.
2. The architecture for asynchronous robotic behavior of claim 1, wherein the collection of sensor data and the prioritization of the plurality of behaviors are decoupled.
3. The architecture for asynchronous robotic behavior of claim 1, wherein the plurality of behaviors includes reactive behaviors and deliberate behaviors.
4. The architecture for asynchronous robotic behavior of claim 3, wherein the behavior layer dynamically prioritizes reactive and deliberate behaviors based on positional information.
5. The architecture for asynchronous robotic behavior of claim 3, wherein reactive behaviors are selected from the group consisting of guarded motion, collision avoidance, and follow path behaviors.
6. The architecture for asynchronous robotic behavior of claim 3, wherein deliberate behaviors include follow path and way point traverse behaviors.
7. The architecture for asynchronous robotic behavior of claim 3, wherein the behavior layer continuously prioritizes each of the plurality of behaviors based on positional information.
8. The architecture for asynchronous robotic behavior of claim 1, wherein the execution layer executes a deliberate behavior until directed by the behavior layer to modify the deliberate behavior with a reactive behavior based on positional information.
9. The architecture for asynchronous robotic behavior of claim 8, wherein priority with respect to the reactive behavior and the deliberate behavior changes in response to a change in positional information.
10. The architecture for asynchronous robotic behavior of claim 8, wherein the reactive behavior is a guarded motion behavior.
11. The architecture for asynchronous robotic behavior of claim 1, wherein each of the plurality of behaviors is associated with a priority based on positional information and the priority changes in response to a change in positional information.
12. The architecture for asynchronous robotic behavior of claim 11, wherein the execution layer executes a deliberative behavior, and, responsive to the change in positional information, the execution layer merges the deliberative behavior with a reactive behavior possessing a priority greater than that of the deliberative behavior.
13. The architecture for asynchronous robotic behavior of claim 12, wherein the deliberative behavior is a way point traverse behavior and the reactive behavior is an obstacle avoidance behavior.
14. The architecture for asynchronous robotic behavior of claim 12, wherein the deliberative behavior is a follow path behavior and the reactive behavior is an obstacle avoidance behavior.
15. The architecture for asynchronous robotic behavior of claim 12, wherein based on the change in positional information the deliberative behavior is modified to incorporate the reactive behavior.
16. A method for asynchronously executing robotic behaviors of an object, the method comprising:
collecting data asynchronously and in parallel from a plurality of sensors wherein the plurality of sensors includes a plurality of ultra wide band (UWB) transceivers and wherein each of the UWB transceivers generates positional information to determine a position of the object;
abstracting the collected data wherein the abstracted collected data is simultaneously available to each of a plurality of behavior modules, and wherein each behavior module is associated with a priority level that is continuously updated based on the position of the object;
arbitrating a conflict between two or more of the plurality of behavior modules based on the priority level associated with each behavior module;
responsive to arbitrating the conflict between two or more of the plurality of behavior modules, merging the two or more of the plurality of behavior modules, thereby forming a merged behavior module; and
executing by the object the merged behavior module.
17. The method for asynchronously executing robotic behaviors according to claim 16, wherein the two or more of the plurality of behavior modules include a reactive behavior and a deliberative behavior.
18. The method for asynchronously executing robotic behaviors according to claim 17, wherein the deliberative behavior is a way point traverse behavior and the reactive behavior is an obstacle avoidance behavior.
19. The method for asynchronously executing robotic behaviors according to claim 17, wherein the deliberative behavior is a follow path behavior and the reactive behavior is an obstacle avoidance behavior.
US14/941,199 2011-08-30 2015-11-13 Asynchronous Data Stream Framework Abandoned US20160075014A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/941,199 US20160075014A1 (en) 2011-08-30 2015-11-13 Asynchronous Data Stream Framework

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201161529206P 2011-08-30 2011-08-30
US13/597,791 US20130054023A1 (en) 2011-08-30 2012-08-29 Asynchronous Data Stream Framework
US14/941,199 US20160075014A1 (en) 2011-08-30 2015-11-13 Asynchronous Data Stream Framework

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US13/597,791 Continuation US20130054023A1 (en) 2011-08-30 2012-08-29 Asynchronous Data Stream Framework

Publications (1)

Publication Number Publication Date
US20160075014A1 true US20160075014A1 (en) 2016-03-17

Family

ID=47742944

Family Applications (9)

Application Number Title Priority Date Filing Date
US13/598,114 Active 2033-01-14 US8972053B2 (en) 2011-08-30 2012-08-29 Universal payload abstraction
US13/597,991 Active 2034-05-05 US9195911B2 (en) 2011-08-30 2012-08-29 Modular robotic manipulation
US13/597,791 Abandoned US20130054023A1 (en) 2011-08-30 2012-08-29 Asynchronous Data Stream Framework
US13/597,911 Expired - Fee Related US9053394B2 (en) 2011-08-30 2012-08-29 Vehicle management system
US13/598,021 Abandoned US20130050180A1 (en) 2011-08-30 2012-08-29 Graphical Rendition of Multi-Modal Data
US14/717,688 Active US9731417B2 (en) 2011-08-30 2015-05-20 Vehicle management system
US14/717,219 Active US9586314B2 (en) 2011-08-30 2015-05-20 Graphical rendition of multi-modal data
US14/918,059 Active US10226864B2 (en) 2011-08-30 2015-10-20 Modular robotic manipulation
US14/941,199 Abandoned US20160075014A1 (en) 2011-08-30 2015-11-13 Asynchronous Data Stream Framework

Country Status (2)

Country Link
US (9) US8972053B2 (en)
WO (3) WO2013033351A2 (en)

Families Citing this family (57)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10331136B2 (en) * 2006-02-27 2019-06-25 Perrone Robotics, Inc. General purpose robotics operating system with unmanned and autonomous vehicle extensions
US9833901B2 (en) * 2006-02-27 2017-12-05 Perrone Robotics, Inc. General purpose robotics operating system with unmanned and autonomous vehicle extensions
JP5649892B2 (en) * 2010-09-22 2015-01-07 トヨタ自動車株式会社 Section setting method, fuel consumption information generating device, and driving support device
US11334092B2 (en) * 2011-07-06 2022-05-17 Peloton Technology, Inc. Devices, systems, and methods for transmitting vehicle data
US8972053B2 (en) 2011-08-30 2015-03-03 5D Robotics, Inc. Universal payload abstraction
US8909404B2 (en) * 2013-03-15 2014-12-09 Ford Global Technologies, Llc Information display system and method
US9367811B2 (en) 2013-03-15 2016-06-14 Qualcomm Incorporated Context aware localization, mapping, and tracking
ITMI20131966A1 (en) * 2013-11-26 2015-05-27 Datalogic IP Tech Srl DISTANCE SENSOR WITHOUT CONTACT AND METHOD TO CARRY OUT A DISTANCE MEASURE WITHOUT CONTACT
WO2015085483A1 (en) 2013-12-10 2015-06-18 SZ DJI Technology Co., Ltd. Sensor fusion
US9841463B2 (en) * 2014-02-27 2017-12-12 Invently Automotive Inc. Method and system for predicting energy consumption of a vehicle using a statistical model
WO2015172131A1 (en) * 2014-05-09 2015-11-12 Carnegie Mellon University Systems and methods for modular units in electro-mechanical systems
WO2016033795A1 (en) 2014-09-05 2016-03-10 SZ DJI Technology Co., Ltd. Velocity control for an unmanned aerial vehicle
EP3008535B1 (en) 2014-09-05 2018-05-16 SZ DJI Technology Co., Ltd. Context-based flight mode selection
EP3428766B1 (en) 2014-09-05 2021-04-07 SZ DJI Technology Co., Ltd. Multi-sensor environmental mapping
US10071735B2 (en) * 2015-04-13 2018-09-11 Matthew Gage Merzig Airspeed adaptive cruise control for ground vehicles
US20190138021A1 (en) * 2015-04-13 2019-05-09 Matthew Gage Merzig Airspeed Adaptive Cruise Control for Ground Vehicles
US10379007B2 (en) 2015-06-24 2019-08-13 Perrone Robotics, Inc. Automated robotic test system for automated driving systems
US10035264B1 (en) 2015-07-13 2018-07-31 X Development Llc Real time robot implementation of state machine
US9922282B2 (en) * 2015-07-21 2018-03-20 Limitless Computing, Inc. Automated readiness evaluation system (ARES) for use with an unmanned aircraft system (UAS)
US10391712B2 (en) * 2016-02-18 2019-08-27 Xerox Corporation System and method for automated cleaning of parts produced by a three-dimensional object printer
WO2017169898A1 (en) * 2016-03-30 2017-10-05 パナソニックIpマネジメント株式会社 Data storage device, robot system, and data storage method
US10118696B1 (en) 2016-03-31 2018-11-06 Steven M. Hoffberg Steerable rotating projectile
US20170298849A1 (en) * 2016-04-15 2017-10-19 Ford Global Technologies, Llc System and method for enhanced operator control of fuel saving modes
ES2661067B1 (en) * 2016-06-20 2019-01-16 Erle Robotics S L METHOD OF DETERMINATION OF CONFIGURATION OF A MODULAR ROBOT
US10515079B2 (en) 2016-06-23 2019-12-24 Airwatch Llc Auto tuning data anomaly detection
US10650621B1 (en) 2016-09-13 2020-05-12 Iocurrents, Inc. Interfacing with a vehicular controller area network
CN108282504B (en) * 2017-01-05 2020-06-26 北京四维图新科技股份有限公司 Automatic driving map data acquisition vehicle, base station and cooperative acquisition system and method
US10162357B2 (en) 2017-03-15 2018-12-25 Toyota Research Institute, Inc. Distributed computing among vehicles
US10265844B2 (en) * 2017-03-24 2019-04-23 International Business Machines Corporation Creating assembly plans based on triggering events
CN107433591A (en) * 2017-08-01 2017-12-05 上海未来伙伴机器人有限公司 Various dimensions interact robot application control system and method
CN107351084B (en) * 2017-08-04 2020-05-19 哈尔滨工业大学 Space manipulator system error correction method for maintenance task
JP7087316B2 (en) * 2017-09-27 2022-06-21 オムロン株式会社 Information processing equipment, information processing methods and programs
FR3076047B1 (en) * 2017-12-22 2021-01-08 Michelin & Cie PROCESS FOR MANAGING A PLATOON OF TRUCKS BASED ON INFORMATION RELATING TO THE TIRES EQUIPPING THE TRUCKS DUDIT PLATOON
US11568236B2 (en) 2018-01-25 2023-01-31 The Research Foundation For The State University Of New York Framework and methods of diverse exploration for fast and safe policy improvement
WO2019173918A1 (en) * 2018-03-13 2019-09-19 Advanced Intelligent Systems Inc. System and method for configuring and servicing a robotic host platform
KR102061810B1 (en) * 2018-03-23 2020-01-02 단국대학교 산학협력단 System and Method for Processing Multi type Sensor Signal Based on Multi modal Deep Learning
US11712637B1 (en) 2018-03-23 2023-08-01 Steven M. Hoffberg Steerable disk or ball
US10594549B2 (en) * 2018-05-18 2020-03-17 Nant Holdings Ip, Llc Fine grained network management to edge device features
WO2020053454A1 (en) * 2018-09-12 2020-03-19 Erle Robotics, S.L. Controller for robots
US11720849B2 (en) 2018-11-15 2023-08-08 Corverity Corporation Method and system for managing navigational data for autonomous vehicles
US20200216066A1 (en) * 2019-01-04 2020-07-09 Delphi Technologies Ip Limited System and method for controlling vehicle propulsion
US10665251B1 (en) 2019-02-27 2020-05-26 International Business Machines Corporation Multi-modal anomaly detection
CN110134081B (en) * 2019-04-08 2020-09-04 华东计算技术研究所(中国电子科技集团公司第三十二研究所) Control system based on robot capability model
US11631331B1 (en) * 2019-06-03 2023-04-18 Smartdrive Systems, Inc. Systems and methods for providing lane-specific recommendations to a vehicle operator of a vehicle
US11711238B2 (en) * 2019-06-21 2023-07-25 Stevyn Pompelio Methods for operator control unit and payload communication
US11796330B2 (en) * 2020-02-11 2023-10-24 Delphi Technologies Ip Limited System and method for providing value recommendations to ride-hailing drivers
CN111300426B (en) * 2020-03-19 2022-05-31 深圳国信泰富科技有限公司 Control system of sensing head of highly intelligent humanoid robot
CN111300390B (en) * 2020-03-20 2021-03-23 南栖仙策(南京)科技有限公司 Intelligent mechanical arm control system based on reservoir sampling and double-channel inspection pool
US20210331686A1 (en) * 2020-04-22 2021-10-28 Uatc, Llc Systems and Methods for Handling Autonomous Vehicle Faults
WO2021247896A1 (en) * 2020-06-03 2021-12-09 Crooks Ricardo R Method for optimizing fuel consumption using real time traffic data
CN111660285A (en) * 2020-06-30 2020-09-15 佛山科学技术学院 Multi-robot cooperative control method, system, equipment and storage medium
JP2023542515A (en) * 2020-09-23 2023-10-10 デクステリティ・インコーポレーテッド Speed control based robot system
NL2026533B1 (en) * 2020-09-24 2022-05-30 Avular B V Modular robot control system
CN113159071B (en) * 2021-04-20 2022-06-21 复旦大学 Cross-modal image-text association anomaly detection method
GB2598049B (en) * 2021-06-30 2022-09-21 X Tend Robotics Inc Modular frame for an intelligent robot
US20230009466A1 (en) * 2021-07-09 2023-01-12 Booz Allen Hamilton Inc. Modular payload for unmanned vehicle
US11881064B2 (en) 2022-03-09 2024-01-23 Calamp Corp Technologies for determining driver efficiency

Family Cites Families (79)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4980626A (en) 1989-08-10 1990-12-25 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Method and apparatus for positioning a robotic end effector
US5216631A (en) 1990-11-02 1993-06-01 Sliwa Jr John W Microvibratory memory device
US5365516A (en) * 1991-08-16 1994-11-15 Pinpoint Communications, Inc. Communication system and method for determining the location of a transponder unit
US5655148A (en) 1994-05-27 1997-08-05 Microsoft Corporation Method for automatically configuring devices including a network adapter without manual intervention and without prior configuration information
US5748980A (en) 1994-05-27 1998-05-05 Microsoft Corporation System for configuring a computer system
US5835684A (en) 1994-11-09 1998-11-10 Amada Company, Ltd. Method for planning/controlling robot motion
IL124916A (en) * 1995-12-15 2002-02-10 Object Dynamics Corp Method and system for constructing software components
US5999989A (en) 1997-06-17 1999-12-07 Compaq Computer Corporation Plug-and-play
US6092021A (en) 1997-12-01 2000-07-18 Freightliner Corporation Fuel use efficiency system for a vehicle for assisting the driver to improve fuel economy
US6242880B1 (en) 1998-09-08 2001-06-05 Cimplus, Inc. Tolerance based motion control system
DE19910590A1 (en) * 1999-03-10 2000-09-14 Volkswagen Ag Distance control method and device for a vehicle
JP2001038663A (en) 1999-07-28 2001-02-13 Yamaha Motor Co Ltd Machine control system
US6216631B1 (en) 1999-08-12 2001-04-17 The Mitre Corporation Robotic manipulation system utilizing patterned granular motion
US6931546B1 (en) * 2000-01-28 2005-08-16 Network Associates, Inc. System and method for providing application services with controlled access into privileged processes
US6317686B1 (en) * 2000-07-21 2001-11-13 Bin Ran Method of providing travel time
US6442451B1 (en) * 2000-12-28 2002-08-27 Robotic Workspace Technologies, Inc. Versatile robot control system
US9053222B2 (en) * 2002-05-17 2015-06-09 Lawrence A. Lynn Patient safety processor
US6889118B2 (en) 2001-11-28 2005-05-03 Evolution Robotics, Inc. Hardware abstraction layer for a robot
US7210130B2 (en) * 2002-02-01 2007-04-24 John Fairweather System and method for parsing data
US7065638B1 (en) * 2002-07-08 2006-06-20 Silicon Motion, Inc. Table-driven hardware control system
US7152033B2 (en) * 2002-11-12 2006-12-19 Motorola, Inc. Method, system and module for multi-modal data fusion
WO2004056537A2 (en) 2002-12-19 2004-07-08 Koninklijke Philips Electronics N.V. System and method for controlling a robot
US7161169B2 (en) 2004-01-07 2007-01-09 International Business Machines Corporation Enhancement of electron and hole mobilities in <110> Si under biaxial compressive strain
US7720570B2 (en) 2004-10-01 2010-05-18 Redzone Robotics, Inc. Network architecture for remote robot with interchangeable tools
KR100599662B1 (en) 2004-10-05 2006-07-12 한국타이어 주식회사 Method for Quantitative Measuring of Handling Characteristics of a Vehicle/Tire
US20060161315A1 (en) * 2004-11-22 2006-07-20 Ron Lewis Vehicle position and performance tracking system using wireless communication
WO2006089307A2 (en) * 2005-02-18 2006-08-24 Irobot Corporation Autonomous surface cleaning robot for wet and dry cleaning
US9498647B2 (en) * 2005-09-23 2016-11-22 Allen B. Kantrowitz Fiducial marker system for subject movement compensation during medical treatment
US7778632B2 (en) * 2005-10-28 2010-08-17 Microsoft Corporation Multi-modal device capable of automated actions
US8712650B2 (en) * 2005-11-17 2014-04-29 Invent.Ly, Llc Power management systems and designs
US7925426B2 (en) * 2005-11-17 2011-04-12 Motility Systems Power management systems and devices
JP2007148835A (en) 2005-11-28 2007-06-14 Fujitsu Ten Ltd Object distinction device, notification controller, object distinction method and object distinction program
US7877198B2 (en) 2006-01-23 2011-01-25 General Electric Company System and method for identifying fuel savings opportunity in vehicles
US9195233B2 (en) * 2006-02-27 2015-11-24 Perrone Robotics, Inc. General purpose robotics operating system
US8073564B2 (en) 2006-07-05 2011-12-06 Battelle Energy Alliance, Llc Multi-robot control interface
US7587260B2 (en) 2006-07-05 2009-09-08 Battelle Energy Alliance, Llc Autonomous navigation system and method
US7668621B2 (en) 2006-07-05 2010-02-23 The United States Of America As Represented By The United States Department Of Energy Robotic guarded motion system and method
US7584020B2 (en) 2006-07-05 2009-09-01 Battelle Energy Alliance, Llc Occupancy change detection system and method
US7974738B2 (en) 2006-07-05 2011-07-05 Battelle Energy Alliance, Llc Robotics virtual rail system and method
US7620477B2 (en) 2006-07-05 2009-11-17 Battelle Energy Alliance, Llc Robotic intelligence kernel
US7211980B1 (en) 2006-07-05 2007-05-01 Battelle Energy Alliance, Llc Robotic follow system and method
WO2008032075A2 (en) * 2006-09-12 2008-03-20 Itis Holdings Plc Apparatus and method for implementing a road pricing scheme
US7778769B2 (en) 2006-11-27 2010-08-17 International Business Machines Corporation Method and system for calculating least-cost routes based on historical fuel efficiency, street mapping and location based services
EP2011536A1 (en) 2007-07-06 2009-01-07 Boehringer Ingelheim Pharma GmbH & Co. KG Inhaler
ATE459511T1 (en) 2007-09-12 2010-03-15 Harman Becker Automotive Sys METHOD AND SYSTEM FOR PROVIDING DRIVING INFORMATION TO THE DRIVER OF A VEHICLE
US20090082879A1 (en) * 2007-09-20 2009-03-26 Evolution Robotics Transferable intelligent control device
WO2009070069A1 (en) 2007-11-26 2009-06-04 Autoliv Development Ab A system for classifying objects in the vicinity of a vehicle
KR101356197B1 (en) 2007-12-12 2014-01-27 기아자동차주식회사 System for Guiding Fuel Economy Driving
US9007178B2 (en) * 2008-02-14 2015-04-14 Intermec Ip Corp. Utilization of motion and spatial identification in RFID systems
DE102008057142B4 (en) * 2008-04-29 2016-01-28 Siemens Aktiengesellschaft Method for computer-aided motion planning of a robot
US20090307772A1 (en) * 2008-05-21 2009-12-10 Honeywell International Inc. framework for scalable state estimation using multi network observations
ES2594231T3 (en) 2008-07-24 2016-12-16 Tomtom North America Inc. Device for anonymous alert from vehicle to vehicle initiated by driver
DE102008047143B4 (en) 2008-09-12 2010-09-09 Technische Universität Carolo-Wilhelmina Zu Braunschweig Method and device for determining a driving strategy
US8155868B1 (en) 2009-03-31 2012-04-10 Toyota Infotechnology Center Co., Ltd. Managing vehicle efficiency
JP5306024B2 (en) * 2009-04-02 2013-10-02 株式会社東芝 Ultrasonic inspection apparatus and ultrasonic inspection method
ES2424244T3 (en) * 2009-04-22 2013-09-30 Kuka Roboter Gmbh Procedure and device to regulate a manipulator
JP4788798B2 (en) * 2009-04-23 2011-10-05 トヨタ自動車株式会社 Object detection device
WO2010134824A1 (en) 2009-05-20 2010-11-25 Modulprodukter As Driving assistance device and vehicle system
US20100305806A1 (en) * 2009-06-02 2010-12-02 Chadwick Todd Hawley Portable Multi-Modal Emergency Situation Anomaly Detection and Response System
US8321125B2 (en) * 2009-06-24 2012-11-27 General Motors Llc System and method for providing route guidance to a requesting vehicle
US8248210B2 (en) * 2009-06-30 2012-08-21 Intermec Ip Corp. Method and system to determine the position, orientation, size, and movement of RFID tagged objects
US8651183B2 (en) * 2009-07-31 2014-02-18 Schlumberger Technology Corporation Robotic exploration of unknown surfaces
JP5135308B2 (en) 2009-09-09 2013-02-06 クラリオン株式会社 Energy consumption prediction method, energy consumption prediction device, and terminal device
US8421811B2 (en) * 2009-09-15 2013-04-16 David Odland Customized vehicle body
US20110224828A1 (en) * 2010-02-12 2011-09-15 Neuron Robotics, LLC Development platform for robotic systems
US8428780B2 (en) * 2010-03-01 2013-04-23 Honda Motor Co., Ltd. External force target generating device of legged mobile robot
US8190319B2 (en) 2010-06-08 2012-05-29 Ford Global Technologies, Llc Adaptive real-time driver advisory control for a hybrid electric vehicle to achieve fuel economy improvement
CA2720886A1 (en) 2010-11-12 2012-05-12 Crosswing Inc. Customizable virtual presence system
US8700202B2 (en) * 2010-11-30 2014-04-15 Trimble Navigation Limited System for positioning a tool in a work space
US9123035B2 (en) * 2011-04-22 2015-09-01 Angel A. Penilla Electric vehicle (EV) range extending charge systems, distributed networks of charge kiosks, and charge locating mobile apps
US9052230B2 (en) * 2011-05-13 2015-06-09 Chevron U.S.A. Inc Industrial process monitoring and imaging
US8744666B2 (en) * 2011-07-06 2014-06-03 Peloton Technology, Inc. Systems and methods for semi-autonomous vehicular convoys
TWI436179B (en) * 2011-07-22 2014-05-01 Ememe Robot Co Ltd Autonomous electronic device and method of controlling motion of the autonomous electronic device thereof
FR2983591B1 (en) * 2011-12-02 2014-01-03 Dassault Aviat APPARATUS FOR CONTROLLING A SURFACE AND ASSOCIATED METHOD
US9573276B2 (en) * 2012-02-15 2017-02-21 Kenneth Dean Stephens, Jr. Space exploration with human proxy robots
US9605952B2 (en) * 2012-03-08 2017-03-28 Quality Manufacturing Inc. Touch sensitive robotic gripper
US9606217B2 (en) * 2012-05-01 2017-03-28 5D Robotics, Inc. Collaborative spatial positioning
US9235212B2 (en) * 2012-05-01 2016-01-12 5D Robotics, Inc. Conflict resolution based on object behavioral determination and collaborative relative positioning
US9971339B2 (en) * 2012-09-26 2018-05-15 Apple Inc. Contact patch simulation

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060008976A1 (en) * 2004-01-14 2006-01-12 Taiwan Semiconductor Manufacturing Co., Ltd. Novel random access memory (RAM) capacitor in shallow trench isolation with improved electrical isolation to overlying gate electrodes
US20080009968A1 (en) * 2006-07-05 2008-01-10 Battelle Energy Alliance, Llc Generic robot architecture
US20100008270A1 (en) * 2008-07-11 2010-01-14 Gwangju Institute Of Science And Technology Method and System for Localization Using One-Way Ranging Technique
US20130054023A1 (en) * 2011-08-30 2013-02-28 5D Robotics, Inc. Asynchronous Data Stream Framework

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Microcontroller UART Tutorial, Accessed 19 May 2009, Society of Robots, http://www.societyofrobots.com/microcontroller_uart.shtml *

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018045448A1 (en) * 2016-09-06 2018-03-15 Advanced Intelligent Systems Inc. Mobile work station for transporting a plurality of articles
US20190248024A1 (en) 2016-09-06 2019-08-15 Advanced Intelligent Systems Inc. Mobile work station for transporting a plurality of articles
US10611036B2 (en) 2016-09-06 2020-04-07 Advanced Intelligent Systems Inc. Mobile work station for transporting a plurality of articles
US10884426B2 (en) 2016-09-06 2021-01-05 Advanced Intelligent Systems Inc. Mobile work station for transporting a plurality of articles
US10633190B2 (en) 2018-02-15 2020-04-28 Advanced Intelligent Systems Inc. Apparatus for supporting an article during transport
US10745219B2 (en) 2018-09-28 2020-08-18 Advanced Intelligent Systems Inc. Manipulator apparatus, methods, and systems with at least one cable
US10751888B2 (en) 2018-10-04 2020-08-25 Advanced Intelligent Systems Inc. Manipulator apparatus for operating on articles
US10645882B1 (en) 2018-10-29 2020-05-12 Advanced Intelligent Systems Inc. Method and apparatus for performing pruning operations using an autonomous vehicle
US10966374B2 (en) 2018-10-29 2021-04-06 Advanced Intelligent Systems Inc. Method and apparatus for performing pruning operations using an autonomous vehicle
US10676279B1 (en) 2018-11-20 2020-06-09 Advanced Intelligent Systems Inc. Systems, methods, and storage units for article transport and storage
WO2023021091A1 (en) * 2021-08-20 2023-02-23 Telefonaktiebolaget Lm Ericsson (Publ) Sequential behavior for intelligent control in subsumption-like architecture

Also Published As

Publication number Publication date
US20130054125A1 (en) 2013-02-28
US9195911B2 (en) 2015-11-24
US20160039091A1 (en) 2016-02-11
WO2013033351A2 (en) 2013-03-07
WO2013033354A3 (en) 2013-08-01
US10226864B2 (en) 2019-03-12
US20130050180A1 (en) 2013-02-28
US20150285646A1 (en) 2015-10-08
US8972053B2 (en) 2015-03-03
US20150269757A1 (en) 2015-09-24
WO2013033338A4 (en) 2013-08-29
WO2013033354A2 (en) 2013-03-07
US9731417B2 (en) 2017-08-15
US20130054024A1 (en) 2013-02-28
WO2013033338A2 (en) 2013-03-07
US20130054023A1 (en) 2013-02-28
WO2013033351A3 (en) 2013-06-27
US9053394B2 (en) 2015-06-09
US20130050121A1 (en) 2013-02-28
WO2013033338A3 (en) 2013-07-04
WO2013033351A4 (en) 2013-10-31
US9586314B2 (en) 2017-03-07

Similar Documents

Publication Publication Date Title
US20160075014A1 (en) Asynchronous Data Stream Framework
Khatib Real-time obstacle avoidance for manipulators and mobile robots
Saska et al. Coordination and navigation of heterogeneous MAV–UGV formations localized by a ‘hawk-eye’-like approach under a model predictive control scheme
Recchiuto et al. Post‐disaster assessment with unmanned aerial vehicles: A survey on practical implementations and research approaches
Wen et al. CL-MAPF: Multi-agent path finding for car-like robots with kinematic and spatiotemporal constraints
Saska et al. Formation control of unmanned micro aerial vehicles for straitened environments
Ravankar et al. Autonomous mapping and exploration with unmanned aerial vehicles using low cost sensors
Butzke et al. The University of Pennsylvania MAGIC 2010 multi‐robot unmanned vehicle system
Suzuki Recent researches on innovative drone technologies in robotics field
Rey et al. A novel robot co-worker system for paint factories without the need of existing robotic infrastructure
Chen On the trends of autonomous unmanned systems research
Martinez UAV cooperative decision and control: Challenges and practical approaches (Shima, T. and Rasmussen, S.; 2008) [Bookshelf]
Montero et al. Dynamic warning zone and a short-distance goal for autonomous robot navigation using deep reinforcement learning
Yazdani et al. Cognition-enabled robot control for mixed human-robot rescue teams
Sousa et al. Self-adaptive team of aquatic drones with a communication network for aquaculture
de-Dios et al. GRVC-CATEC: Aerial robot co-worker in plant servicing (ARCOW)
Smirnov et al. Smart M3-based robot interaction scenario for coalition work
Chen et al. Social crowd navigation of a mobile robot based on human trajectory prediction and hybrid sensing
Sardinha et al. Combining Lévy walks and flocking for cooperative surveillance using aerial swarms
Wei et al. Deep reinforcement learning with heuristic corrections for UGV navigation
Brill et al. The effective field of view paradigm: Adding representation to a reactive system
Von Hundelshausen et al. Cognitive navigation: an overview of three navigation paradigms leading to the concept of an affordance hierarchy
Palácios et al. Evaluation of mobile autonomous robot in trajectory optimization
Naik et al. Control system integration methods to maintain the position and speed of the robot in spatial forbidden areas
Chandrasekaran A selective sensor framework using sensor fusion and sensor maps to achieve complete coverage planning of a semi-autonomous robotic vehicle

Legal Events

Date Code Title Description
AS Assignment

Owner name: HUMATICS CORPORATION, MASSACHUSETTS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:5D ROBOTICS, INC.;REEL/FRAME:044753/0412

Effective date: 20180122

AS Assignment

Owner name: 5D ROBOTICS, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BRUEMMER, DAVID J.;NIELSEN, CURTIS W.;HARDIN, BENJAMIN C.;REEL/FRAME:045443/0648

Effective date: 20120828

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION