US20210311475A1 - Method, system and apparatus for handling operational constraints for control of unmanned vehicles - Google Patents

Info

Publication number
US20210311475A1
Authority
US
United States
Prior art keywords
unmanned vehicle
operational constraint
region
vehicle
operational
Prior art date
Legal status
Pending
Application number
US17/185,503
Inventor
Ryan Christopher Gariepy
Alex Bencz
Andrew Clifford Blakey
Shahab Kaynama
James Servos
Current Assignee
Rockwell Automation Technologies Inc
Original Assignee
Clearpath Robotics Inc
Priority date
Filing date
Publication date
Application filed by Clearpath Robotics Inc filed Critical Clearpath Robotics Inc
Priority to US17/185,503
Assigned to Clearpath Robotics Inc. reassignment Clearpath Robotics Inc. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: GARIEPY, RYAN CHRISTOPHER, BENCZ, ALEX, BLAKEY, ANDREW CLIFFORD, KAYNAMA, SHAHAB, SERVOS, JAMES
Publication of US20210311475A1
Assigned to ROCKWELL AUTOMATION TECHNOLOGIES, INC. reassignment ROCKWELL AUTOMATION TECHNOLOGIES, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: ROCKWELL AUTOMATION, INC.
Assigned to ROCKWELL AUTOMATION, INC. reassignment ROCKWELL AUTOMATION, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: CLEARPATH ROBOTICS, INC.
Assigned to ROCKWELL AUTOMATION, INC. reassignment ROCKWELL AUTOMATION, INC. CORRECTIVE ASSIGNMENT TO CORRECT THE CONVEYING PARTY'S NAME FROM CLEARPATH ROBOTICS, INC. TO CLEARPATH ROBOTICS INC. (WITHOUT THE COMMA) PREVIOUSLY RECORDED ON REEL 67944 FRAME 916. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT. Assignors: Clearpath Robotics Inc.

Classifications

    • G05D1/0088 — Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots, characterised by the autonomous decision making process, e.g. artificial intelligence, predefined behaviours
    • G01C21/3407 — Route searching; route guidance specially adapted for specific applications
    • B66F9/063 — Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes, movable with their loads on wheels or the like (e.g. fork-lift trucks), automatically guided
    • G05D1/0212 — Control of position or course in two dimensions, specially adapted to land vehicles, with means for defining a desired trajectory
    • G05D1/0223 — Control of position or course in two dimensions, specially adapted to land vehicles, with means for defining a desired trajectory involving speed control of the vehicle
    • G05D1/0274 — Control of position or course in two dimensions, specially adapted to land vehicles, using internal positioning means, using mapping information stored in a memory device
    • G05D1/0297 — Control of position or course in two dimensions, specially adapted to land vehicles, involving a plurality of land vehicles (e.g. fleet or convoy travelling); fleet control by controlling means in a control room
    • G05D2201/0216

Definitions

  • the specification relates generally to the control of unmanned vehicles, and specifically to a method, system and apparatus for handling operational constraints for the control of unmanned vehicles.
  • Mobile unmanned vehicles (also referred to as autonomous mobile robots or self-driving vehicles) operate in a wide variety of environments.
  • Such environments can have various physical characteristics that the vehicles may be required to navigate around, interact with and the like, in order to operate successfully. Physical characteristics such as those mentioned above can generally be represented in maps of the environments stored by the vehicles.
  • the operating environments of the vehicles may also impose other restrictions on the operation of the vehicles that do not arise directly from physical characteristics of the environments, or that arise from physical characteristics that are not readily detectable by the vehicles. Such restrictions are less suitable for representation in maps, and can therefore render the operation of self-driving vehicles difficult.
  • the specification is generally directed to systems, apparatuses and methods for generating and deploying operational constraints from a computing device to at least one self-driving vehicle.
  • a computing device in communication with one or more self-driving vehicles stores operational constraints associated with respective regions of an environment in which the self-driving vehicles are to operate.
  • the operational constraints each contain a property defining a constraint on the operation of self-driving vehicles within the relevant region.
  • the computing device provides operational constraints to a self-driving vehicle, either at the vehicle's request, or along with task assignments for the vehicle, or both, to control how the self-driving vehicle will operate in the region of the environment associated with the transmitted operational constraint.
  • a system comprising: at least one mobile unmanned vehicle for deployment in an environment; a computing device connected to the at least one unmanned vehicle via a network, the computing device storing, in a memory, a plurality of operational constraints; each operational constraint including (i) a type identifier, (ii) an indication of a region of the environment, and (iii) a property defining a constraint on the operation of the at least one unmanned vehicle within the region; the computing device configured to: receive a request from one of the at least one mobile unmanned vehicle, the request identifying an operational constraint; responsive to receiving the request, retrieve an operational constraint from the memory based on the request; and send the retrieved operational constraint to the one of the at least one mobile unmanned vehicle.
  • a method in a system having at least one mobile unmanned vehicle for deployment in an environment and a computing device connected to the at least one unmanned vehicle via a network, the method comprising: storing, in a memory of the computing device, a plurality of operational constraints; each operational constraint including (i) a type identifier, (ii) an indication of a region of the environment, and (iii) a property defining a constraint on the operation of the at least one unmanned vehicle within the region; at the computing device: receiving a request from one of the at least one mobile unmanned vehicle, the request identifying an operational constraint; responsive to receiving the request, retrieving an operational constraint from the memory based on the request; and sending the retrieved operational constraint to the one of the at least one mobile unmanned vehicle.
  • a non-transitory computer-readable medium storing computer-readable instructions for execution by a processor of a computing device for causing the computing device to perform a method comprising: storing a plurality of operational constraints; each operational constraint including (i) a type identifier, (ii) an indication of a region of an environment in which at least one mobile unmanned vehicle is to be deployed, and (iii) a property defining a constraint on the operation of the at least one unmanned vehicle within the region; receiving a request from one of the at least one mobile unmanned vehicle, the request identifying an operational constraint; responsive to receiving the request, retrieving an operational constraint from the memory based on the request; and sending the retrieved operational constraint to the one of the at least one mobile unmanned vehicle via the network.
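  • By way of illustration, the store/request/retrieve/send flow recited above can be sketched as follows (a minimal sketch only; the class and function names below are assumptions, not taken from the specification):

      from dataclasses import dataclass
      from typing import List, Optional, Tuple

      @dataclass
      class OperationalConstraint:
          type_id: str                       # (i) type identifier, e.g. "speed-limit"
          region: List[Tuple[float, float]]  # (ii) region of the environment (polygon vertices)
          properties: dict                   # (iii) constraint on operation, e.g. {"max_speed": 1.5}

      class ConstraintStore:
          """Stands in for the memory of the computing device."""
          def __init__(self) -> None:
              self._constraints: List[OperationalConstraint] = []

          def add(self, constraint: OperationalConstraint) -> None:
              self._constraints.append(constraint)

          def handle_request(self, type_id: Optional[str] = None) -> List[OperationalConstraint]:
              # Receive a request identifying an operational constraint (by type
              # here), retrieve matching constraints, and return them for
              # transmission to the requesting vehicle.
              return [c for c in self._constraints
                      if type_id is None or c.type_id == type_id]

      store = ConstraintStore()
      store.add(OperationalConstraint("speed-limit",
                                      [(0, 0), (10, 0), (10, 5), (0, 5)],
                                      {"max_speed": 1.5}))
      print(store.handle_request("speed-limit"))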
  • FIG. 1 depicts a system for controlling unmanned vehicles, according to a non-limiting embodiment
  • FIG. 2 depicts certain components of an unmanned vehicle of the system of FIG. 1 , according to a non-limiting embodiment
  • FIG. 3 depicts certain internal components of the computing device of FIG. 1 , according to a non-limiting embodiment
  • FIG. 4 depicts a method of receiving and storing operational constraints in the system of FIG. 1 , according to a non-limiting embodiment
  • FIG. 5 depicts example interfaces presented by the computing device of FIG. 1 during the method of FIG. 4 , according to a non-limiting embodiment
  • FIG. 6 depicts an example data structure for storing the operational constraints received in the method of FIG. 4 , according to a non-limiting embodiment
  • FIG. 7 depicts a method of deploying the operational constraints received in the method of FIG. 4 , according to a non-limiting embodiment.
  • FIG. 1 depicts a system 100 including a plurality of self-driving vehicles, referred to herein as mobile unmanned vehicles 104-1, 104-2 and 104-3 (collectively referred to as unmanned vehicles 104, and generically referred to as an unmanned vehicle 104, or simply a vehicle 104; similar nomenclature is used for other reference numerals herein) for deployment in a facility, such as a manufacturing facility, warehouse or the like.
  • the facility can be any one of, or any suitable combination of, a single building, a combination of buildings, an outdoor area, and the like.
  • a greater or smaller number of unmanned vehicles 104 may be included in system 100 than the three shown in FIG. 1 .
  • Unmanned vehicles 104 can have a wide variety of operational characteristics (e.g. maximum payload, dimensions, weight, maximum speed, battery life, and the like).
  • System 100 also includes a computing device 108 for connection to unmanned vehicles 104 via a network 112 .
  • Computing device 108 can be connected to network 112 via, for example, a wired link 113 , although wired link 113 can be any suitable combination of wired and wireless links in other embodiments.
  • Unmanned vehicles 104 can be connected to network 112 via respective wireless links 114 - 1 , 114 - 2 and 114 - 3 .
  • Links 114 can be any suitable combination of wired and wireless links in other examples, although generally wireless links are preferable to reduce or eliminate obstacles to the free movement of unmanned vehicles 104 about the facility.
  • Network 112 can be any suitable one of, or any suitable combination of, wired and wireless networks, including local area networks (LAN or WLAN), wide area networks (WAN) such as the Internet, and mobile networks (e.g. GSM, LTE and the like).
  • Computing device 108 can control unmanned vehicles 104 , for example by instructing unmanned vehicles 104 to carry out tasks within the facility.
  • the nature of the tasks performed by unmanned vehicles 104 under the control of computing device 108 is not particularly limited.
  • the tasks assigned to unmanned vehicles 104 require unmanned vehicles 104 to perform various actions at various locations within the facility. Data defining the actions and locations are provided to unmanned vehicles 104 by computing device 108 via network 112 .
  • an unmanned vehicle 104 can be instructed to simply travel to a specific location.
  • an unmanned vehicle 104 can be instructed to travel to a specified location and pick up, drop off, or otherwise manipulate, an item (e.g. a tool, container, and the like), or perform any other suitable action (e.g. park, begin a mapping algorithm, and so on).
  • Locations include any regions within the facility bounded by coordinates. Such regions can be three-dimensional (i.e. volumes), two-dimensional (i.e. areas), one-dimensional (i.e. lines) or zero-dimensional (i.e. points).
  • a first location 120 is illustrated, which may be employed to store items, such as an item 116 (e.g. a container).
  • Location 120 can be an area defined on a floor of the facility for storage of items.
  • a second location 124 is also illustrated, containing, for example, a work station where materials are to be removed from or placed in item 116 , or where item 116 is to be labelled or otherwise modified. A wide variety of other work station activities will occur to those skilled in the art (e.g. welding stations, paint spray booths, and so on).
  • a third location 128 is also illustrated in FIG. 1 . In the present example, third location 128 contains a conveyor apparatus, which may carry item 116 to another part of the facility.
  • a vehicle 104 When a vehicle 104 is assigned a task by computing device 108 , that vehicle 104 is configured to generate a path for completing the task (e.g. a path leading from the vehicle's current location to the end location of the task; the path may include one or more intermediate locations between the start location and the end location).
  • computing device 108 can assist the vehicle 104 in path generation (also referred to as path planning), or can generate the path without the involvement of the vehicle 104 and send the completed path to the vehicle 104 for execution.
  • Path generation can be based on, for example, a map of the facility stored at one or both of computing device 108 and vehicles 104 .
  • Path generation may also depend on attributes of the relevant vehicle 104 .
  • the map may indicate that a certain area of the facility contains constricted areas unsuitable for vehicles 104 greater than certain dimensions; if a vehicle 104 has dimensions greater than those of the constricted areas, a path may therefore be generated for that vehicle 104 that avoids the constricted areas.
  • system 100 is configured to receive, store and deploy to vehicles 104 a wide variety of such other information, referred to broadly herein as operational constraints for vehicles 104 .
  • Referring now to FIG. 2, an example unmanned vehicle 104 is shown.
  • unmanned vehicle 104 - 3 is depicted according to a non-limiting embodiment.
  • Other vehicles 104 need not be identical to vehicle 104 - 3 as depicted, but are generally as described below.
  • Unmanned vehicle 104 - 3 is depicted as a terrestrial vehicle, although it is contemplated that unmanned vehicles 104 can also include aerial vehicles and watercraft.
  • Unmanned vehicle 104 - 3 includes a chassis 200 containing or otherwise supporting various other components, including one or more locomotive devices 204 .
  • Devices 204 in the present example are wheels, although in other embodiments any suitable locomotive device, or combination thereof, may be employed (e.g. tracks, propellers, and the like).
  • Locomotive devices 204 are powered by one or more motors (not shown) contained within chassis 200 .
  • the motors of unmanned vehicle 104 - 3 can be electric motors, internal combustion engines, or any other suitable motor or combination of motors.
  • the motors drive the locomotive devices 204 by drawing power from an energy storage device (not shown) supported on or within chassis 200 .
  • the nature of the energy storage device can vary based on the nature of the motors.
  • the energy storage can include batteries, combustible fuel tanks, or any suitable combination thereof.
  • Unmanned vehicle 104 - 3 also includes a load-bearing surface 208 (also referred to as a payload surface), for carrying an item such as item 116 thereon.
  • load-bearing surface 208 can be replaced or supplemented with other payload-bearing equipment, such as a cradle, a manipulator arm, or the like.
  • Unmanned vehicle 104 - 3 can also include a variety of sensors.
  • such sensors include at least one load cell 212 coupled to payload surface 208 , for measuring a force exerted on payload surface 208 (e.g. by an item being carried by unmanned vehicle 104 - 3 ).
  • the sensors of unmanned vehicle 104 - 3 can also include machine vision sensors 216 , such as any suitable one of, or any suitable combination of, barcode scanners, laser-based sensing devices (e.g. a LIDAR sensor), cameras and the like.
  • Unmanned vehicle 104 - 3 can also include a location sensor (not shown) such as a GPS sensor, for detecting the location of unmanned vehicle 104 - 3 with respect to a frame of reference.
  • the frame of reference is not particularly limited, and may be, for example, a global frame of reference (e.g. GPS coordinates), or a facility-specific frame of reference.
  • Other sensors that can be provided with unmanned vehicle 104 - 3 include accelerometers, fuel-level or battery-level sensors, and the like.
  • Unmanned vehicle 104 - 3 can also include a control panel 220 , as well as anchors 224 for securing items or other equipment to chassis 200 , or for lifting chassis 200 (e.g. for maintenance). Unmanned vehicle 104 - 3 can also include any of a variety of other features, such as indicator lights 228 .
  • unmanned vehicle 104 - 3 includes a central processing unit (CPU) 250 , also referred to as a processor 250 , interconnected with a non-transitory computer-readable medium such as a memory 254 .
  • processor 250 and memory 254 are generally comprised of one or more integrated circuits (ICs), and can have a variety of structures, as will now occur to those skilled in the art (for example, more than one CPU can be provided).
  • Memory 254 can be any suitable combination of volatile (e.g. Random Access Memory (“RAM”)) and non-volatile (e.g. read only memory (“ROM”), Electrically Erasable Programmable Read Only Memory (“EEPROM”), flash memory, magnetic computer storage device, or optical disc) memory.
  • Unmanned vehicle 104-3 also includes a communications interface 258 (e.g. a network interface controller or NIC) interconnected with processor 250. Via communications interface 258, link 114-3 and network 112, processor 250 can send and receive data to and from computing device 108. For example, unmanned vehicle 104-3 can send updated location data to computing device 108, and receive operational constraints from computing device 108.
  • processor 250 is interconnected with the other components of unmanned vehicle 104 - 3 mentioned above, such as sensors 212 and 216 and control panel 220 .
  • Memory 254 stores a plurality of computer-readable programming instructions, executable by processor 250, in the form of various applications, including a vehicle control application 262.
  • processor 250 can execute the instructions of application 262 (and any other suitable applications stored in memory 254 ) in order to perform various actions defined within the instructions.
  • processor 250, and more generally any vehicle 104, is said to be “configured to” perform certain actions. It will be understood that vehicles 104 are so configured via the execution of the instructions of the applications stored in memory 254.
  • Memory 254 also stores a cache 264 , to be discussed in greater detail below.
  • Computing device 108 can be any one of, or any combination of, a variety of computing devices. Such devices include desktop computers, servers, mobile computers such as laptops and tablet computers, and the like. Computing device 108 therefore includes at least one central processing unit (CPU), also referred to herein as a processor, 300 .
  • Processor 300 is interconnected with a non-transitory computer-readable medium such as a memory 304 .
  • Processor 300 is also interconnected with a communications interface 308 .
  • Processor 300 and memory 304 are generally comprised of one or more integrated circuits (ICs), and can have a variety of structures, as will now occur to those skilled in the art (for example, more than one CPU can be provided).
  • Memory 304 can be any suitable combination of volatile (e.g. Random Access Memory (“RAM”)) and non-volatile (e.g. read only memory (“ROM”), Electrically Erasable Programmable Read Only Memory (“EEPROM”), flash memory, magnetic computer storage device, or optical disc) memory.
  • Communications interface 308 allows computing device 108 to connect with other computing devices (e.g. unmanned vehicles 104 ) via network 112 .
  • Communications interface 308 therefore includes any necessary hardware (e.g. network interface controllers (NICs), radio units, and the like) to communicate with network 112 over link 113 .
  • Computing device 108 can also include input and output devices, such as keyboards, mice, displays, and the like (not shown).
  • Memory 304 stores a plurality of computer-readable programming instructions, executable by processor 300 , in the form of various applications, including an operational constraints handling application 312 .
  • processor 300 can execute the instructions of application 312 (and any other suitable applications) in order to perform various actions defined within the instructions.
  • processor 300, and more generally computing device 108, are said to be “configured to” perform those actions. It will be understood that they are so configured via the execution of the instructions of the applications stored in memory 304.
  • Memory 304 also stores various types of data for retrieval, processing and updating during the execution of application 312 .
  • memory 304 stores an operational constraints database 316 .
  • Memory 304 may also store other data (not shown), such as a map of the facility of FIG. 1 , as well as vehicle attributes, location-related data, and the like.
  • Turning now to FIG. 4, a method 400 for generating and storing operational constraints is illustrated.
  • Method 400 will be described in connection with its performance in system 100, although it is contemplated that method 400 can also be performed in other suitable systems.
  • the blocks of method 400 as described below are performed by computing device 108 , via the execution of application 312 by processor 300 . In other embodiments, however, method 400 can also be performed by any of unmanned vehicles 104 (that is, by processor 250 of a given vehicle 104 ).
  • At block 405, computing device 108 is configured to receive and store operational constraint type definitions, also referred to as operational constraint templates or masters.
  • Each operational constraint master defines a type of operational constraint, as well as the properties that can be assigned to that type of operational constraint.
  • the nature of the receipt of operational constraint masters at block 405 is not particularly limited.
  • the masters may be received at processor 300 via input devices such as a keyboard and mouse.
  • the operational constraint masters are stored in database 316 .
  • Table 1 illustrates examples of operational constraint masters.
  • a speed limit operational constraint type provides a definition for creating speed limit operational constraints that apply to the facility of FIG. 1 .
  • each speed limit operational constraint (also referred to herein as zone) defines a space on the map of the facility where that constraint applies.
  • the space may be a volume, an area, a line, or a point, and can be defined in a variety of ways (e.g. by coordinates).
  • the location properties for zones correspond to “real” physical spaces in the facility.
  • Each speed limit zone also defines an upper speed limit for unmanned vehicles 104 within the zone, a lower speed limit, and a time period during which the operational constraint applies. It will be appreciated that a wide variety of properties may be defined for each operational constraint type. Further, some properties may be indicated as mandatory, while others may be optional (e.g. a lower speed limit and the time property may be optional).
  • each one-way zone includes a location, as well as a direction, with or without an associated tolerance.
  • a one-way zone may state that unmanned vehicles in a certain area of the facility must travel in a direction ten degrees east of north, plus or minus five degrees.
  • a variety of other ways may also be employed to represent directions and tolerances.
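  • As a hedged illustration of the masters and zones described above, the following sketch models a master as the set of properties (mandatory or optional) permitted for a zone type, and validates a one-way zone against its master. All field names are assumptions for illustration; the specification does not prescribe a concrete data format:

      from dataclasses import dataclass
      from typing import Dict

      # Master: defines a zone type and which properties it accepts.
      # property name -> mandatory?
      MASTERS: Dict[str, Dict[str, bool]] = {
          "speed-limit": {"location": True, "max_speed": True,
                          "min_speed": False, "time_period": False},
          "one-way":     {"location": True, "direction": True,
                          "tolerance": False},
      }

      @dataclass
      class Zone:
          zone_type: str
          properties: Dict[str, object]

      def validate_zone(zone: Zone) -> None:
          """Check a zone instance against its master before storing it."""
          master = MASTERS[zone.zone_type]
          for name, mandatory in master.items():
              if mandatory and name not in zone.properties:
                  raise ValueError(f"missing mandatory property: {name}")
          for name in zone.properties:
              if name not in master:
                  raise ValueError(f"unknown property for {zone.zone_type}: {name}")

      # A one-way zone requiring travel 10 degrees east of north, +/- 5 degrees.
      zone = Zone("one-way", {"location": [(0, 0), (4, 0), (4, 20), (0, 20)],
                              "direction": 10.0, "tolerance": 5.0})
      validate_zone(zone)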
  • At block 410, computing device 108 is configured to receive a request to edit operational constraints.
  • Computing device 108 can receive such a request in the form of input data from a keyboard or mouse, or via network 112 from another computing device.
  • the request can be received from a vehicle 104 .
  • At block 415, computing device 108 is configured to determine whether the request received at block 410 is a request to edit an existing zone, or to create a new zone.
  • the request received at block 410 can be a selection of one of several user-selectable elements of a graphical user interface (GUI) presented on a display connected to processor 300 .
  • the determination at block 415 can therefore include a determination of which selectable element was selected (e.g. an element corresponding to zone creation, or an element corresponding to zone editing).
  • When the request is a request to create a new zone, performance of method 400 advances to block 420, at which computing device 108 retrieves the zone types defined by the masters received and stored at block 405.
  • computing device 108 retrieves the zone types “speed limit” and “one-way” and presents those zone types for selection.
  • Presenting zone types for selection can involve controlling a display to render an interface 500 , shown in FIG. 5 , including selectable elements 504 and 508 corresponding to each retrieved zone type.
  • computing device 108 can present the available zone types for selection via other means, such as by transmitting the zone types via network 112 to another computing device.
  • computing device 108 is configured to receive a selection of one of the zone types presented at block 420 , and in response to retrieve and present the properties defined by the master corresponding to the selected zone type.
  • an updated interface 512 can be rendered on a display connected to processor 300 , in which selectable elements are provided for defining the location and direction properties defined in Table 1.
  • the location property can be defined by drawing (e.g. via input data received from a mouse, touch screen or the like) an area, volume or the like 516 on a rendering of the map of the facility.
  • the location can be specified by received input data in the form of coordinates on the map.
  • The direction property of the one-way zone can be specified, for example, by setting an angle 520 relative to an indicator of cardinal north 524 (or any other known direction).
  • Angle 520 can be specified directly on the map shown in interface 512 in some embodiments, rather than as a separate interface element from the map (as shown in FIG. 5 ).
  • the direction can be specified by way of input data identifying an angle and a tolerance.
  • computing device 108 is configured to store the newly received zone in database 316 .
  • When the request received at block 410 is determined (at block 415) to be a request to edit an existing zone, at block 435 computing device 108 can be configured to retrieve the existing zones in database 316 and present the zones for selection.
  • the existing zones can be retrieved and presented based on a zone type; for example, between blocks 415 and 435 computing device 108 can present an interface similar to interface 500 for receiving a selection of which zone type is to be edited.
  • computing device 108 Upon receipt of a selection of a specific zone to edit at block 440 , computing device 108 retrieves the properties of that zone and presents the retrieved properties (e.g. on a display).
  • the display of zone properties at block 440 can be implemented via the presentation of an interface such as interface 512 shown in FIG. 5 .
  • Performance of method 400 then proceeds to block 430 , at which properties for the relevant zone are received and stored as described above.
  • Editing properties presented at blocks 440 and 430 can include deleting properties.
  • When the editing inputs received via an interface such as interface 512 include the removal of all properties for a zone (or selection of a “delete zone” element), the receipt and storage of properties at block 430 involves removing the zone from memory.
  • A plurality of operational constraints, or zones, can thus be maintained in database 316, each having a type, a location, and one or more properties.
  • computing device 108 is configured to perform a validation or simulation at block 435 , after receiving updates to operational constraints (e.g. new operational constraints or edited operational constraints).
  • computing device 108 can be configured to detect conflicts in the operational constraints, such as one-way zones with incompatible direction properties (e.g. opposite directions).
  • Computing device 108 can also be configured to detect potential conflicts between zones that are not overlapping but adjacent to each other. For example, when two speed limit zones are in close proximity, and one zone has a greater minimum speed than the maximum speed of the other zone, computing device 108 can compare the difference between the required speeds of the zones to the known accelerations and decelerations of unmanned vehicles 104 .
  • Computing device 108 can thus determine whether the proximity of the zones, in combination with their differing requirements, would result in unmanned vehicles 104 being unable to comply with the operational constraints when traversing both zones (e.g. because the vehicles 104 cannot accelerate or decelerate quickly enough). When conflicts or potential conflicts are detected, computing device 108 can generate a warning message, for example on the display mentioned in connection with FIG. 5 .
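  • The adjacency check described above can be sketched as follows, assuming a constant-deceleration kinematic model and illustrative names (neither is prescribed by the specification):

      def min_transition_distance(speed_in: float, speed_out: float,
                                  decel: float) -> float:
          # Distance needed to slow from speed_in to speed_out at constant
          # deceleration: d = (v_in^2 - v_out^2) / (2 * a)
          return max(0.0, (speed_in ** 2 - speed_out ** 2) / (2.0 * decel))

      def zones_conflict(min_speed_a: float, max_speed_b: float,
                         gap_m: float, decel_mps2: float) -> bool:
          """True when a vehicle leaving zone A at its minimum required speed
          cannot slow to zone B's maximum speed within the gap between them."""
          if min_speed_a <= max_speed_b:
              return False  # no deceleration needed; no conflict
          return min_transition_distance(min_speed_a, max_speed_b, decel_mps2) > gap_m

      # Zone A requires at least 2.0 m/s; adjacent zone B allows at most
      # 0.5 m/s; the zones are 0.5 m apart; the vehicle decelerates at 1 m/s^2.
      print(zones_conflict(2.0, 0.5, gap_m=0.5, decel_mps2=1.0))  # True: warn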
  • the zones and their properties can be stored in a wide variety of ways. For example, turning to FIG. 6 , three layers 600 , 604 and 608 of zones are depicted as stored in database 316 .
  • The layers can be stored, for example, as a plurality of image files (e.g. vector or raster-based images). In other embodiments, the layers can be stored in a variety of other formats, such as tables of coordinates or the like.
  • each layer can store the locations of zones of a particular type as polygons (or volumes, points, lines or the like).
  • Layer 600 can contain one-way zone locations, layer 604 can contain speed limit zone locations, and layer 608 can contain locations for another type of zone (e.g. emergency aisles in the facility).
  • Each layer (e.g. each image file) thus stores the locations of zones of a single type; database 316 can contain an index linking layers 600, 604, 608 and properties 620.
  • location and zone type can be considered primary properties of the zones, as those are the properties defining which layer the zone is depicted in, while the remaining properties can be considered secondary properties.
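  • A minimal sketch of the layered storage scheme of FIG. 6, assuming one layer per zone type holding geometry, plus an index from zone identifiers to secondary properties 620 (the concrete structure is an assumption):

      from typing import Dict, List, Tuple

      Polygon = List[Tuple[float, float]]

      # Primary properties (zone type and location) select the layer.
      layers: Dict[str, Dict[str, Polygon]] = {
          "one-way":         {"zone-1": [(0, 0), (4, 0), (4, 20), (0, 20)]},
          "speed-limit":     {"zone-2": [(5, 0), (15, 0), (15, 10), (5, 10)]},
          "emergency-aisle": {},
      }

      # Secondary properties are kept in an index keyed by zone identifier,
      # standing in for properties 620 in database 316.
      properties_index: Dict[str, Dict[str, object]] = {
          "zone-1": {"direction": 10.0, "tolerance": 5.0},
          "zone-2": {"max_speed": 1.5, "min_speed": 0.5},
      }

      def zone_record(zone_type: str, zone_id: str) -> Dict[str, object]:
          """Join a zone's geometry (from its layer) with its secondary properties."""
          return {"type": zone_type,
                  "location": layers[zone_type][zone_id],
                  **properties_index.get(zone_id, {})}

      print(zone_record("speed-limit", "zone-2"))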
  • A wide variety of other zone types and properties are contemplated, in addition to those discussed above.
  • Other examples of zone types and corresponding properties include: stop before proceeding (e.g. with properties specifying a length of time to stop); restricted areas, such as the above-mentioned emergency aisles (e.g. with properties specifying that such zones are active only when an alarm is sounding in the facility); parking zones indicating areas where vehicles 104 may dock for periods of inactivity; height-restricted zones (e.g. with maximum permitted vehicle heights); weight-restricted zones (e.g. with maximum permitted vehicle weights); map quality zones (e.g. with properties indicating a level of confidence in the map in each zone).
  • Other zone types and properties will also occur to those skilled in the art.
  • For example, restricted area zones may include properties identifying classes of vehicles 104, or individual vehicles 104, that are prohibited from, or permitted to, enter such areas.
  • Further examples of zone types include physical features of the facility that are undetectable by unmanned vehicles 104 and are therefore less well-suited for representation in the map. For example, ramps or other such features of the facility may be represented as operational constraints. Further examples of properties include transitional properties associated with the boundaries of zones. For example, a speed limit zone may include a secondary speed limit property that is applied when an unmanned vehicle is within a predetermined distance of the edge of the zone.
  • Computing device 108 can also be configured to allocate tasks to unmanned vehicles 104 based on operational constraints. For example, computing device 108 can receive a task for assignment, and retrieve operational constraints associated with a location identified in the task. In combination with unmanned vehicle characteristics, computing device 108 can then select an unmanned vehicle 104 to perform the task (e.g. picking up an item in a specific location). For example, computing device 108 may exclude certain unmanned vehicles 104 from the above-mentioned selection when the task to be assigned lies within a height-restricted zone with a maximum height that is smaller than the known height of those unmanned vehicles 104, as sketched below.
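  • The height-restriction exclusion mentioned above might be sketched as follows (illustrative names; the selection logic in the specification is not limited to this form):

      from dataclasses import dataclass
      from typing import List, Optional

      @dataclass
      class Vehicle:
          vehicle_id: str
          height_m: float

      def eligible_vehicles(vehicles: List[Vehicle],
                            max_height_m: Optional[float]) -> List[Vehicle]:
          """Filter the fleet against a height-restricted zone's property,
          if such a zone covers the task's location."""
          if max_height_m is None:
              return list(vehicles)
          return [v for v in vehicles if v.height_m <= max_height_m]

      fleet = [Vehicle("104-1", 1.2), Vehicle("104-2", 2.1), Vehicle("104-3", 0.9)]
      # Task location falls inside a zone with a 1.5 m height limit.
      print([v.vehicle_id for v in eligible_vehicles(fleet, 1.5)])  # ['104-1', '104-3']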
  • Turning to FIG. 7, a method 700 for deploying operational constraints is illustrated. Method 700 will be described in connection with its performance in system 100, although it is contemplated that method 700 can also be performed in other suitable systems.
  • At block 705, a vehicle 104 is configured to determine whether operational constraint data is required.
  • the determination at block 705 can take various forms, and generally involves determining whether a navigational process executed by the processor 250 of the vehicle 104 requires operational constraint data.
  • the vehicle 104 can begin executing a path planning (i.e. path generation) process that incorporates operational constraint data, following receipt of a task assignment from computing device 108 .
  • the vehicle 104 can perform a path execution process to travel a previously generated path.
  • the path execution process can incorporate operational constraint data.
  • the path generation and path execution processes can incorporate different operational constraints. For example, path generation may require the use of one-way zones, whereas path execution data may require the use of speed limit zones (which can be ignored during path generation in some embodiments).
  • the determination at block 705 can be a determination of whether a current location of the vehicle 104 is acceptable (i.e. complies with operational constraints).
  • the vehicle 104 can initiate a mapping or localization process and determine that operational constraint data is required to complete the process.
  • the vehicle 104 determines, at block 710 , whether the required operational constraint data is present in cache 264 . It is therefore contemplated that when the determination at block 705 is affirmative, the vehicle 104 is configured to identify required operational constraint data, such as a zone type, a location, or the like. For example, when block 705 is performed in connection with a path execution process, the required location may be a portion of the path (or the entire path), and the zone types may include any zone types relevant to path execution (e.g. speed limits).
  • the performance of block 710 thus includes examining the contents of cache 264 for operational constraints corresponding to any requirements identified at block 705 .
  • the determination at block 710 can include simply determining whether operational constraints corresponding to those identified at block 705 are present in cache 264 .
  • the vehicle 104 can also determine whether the required constraints are valid, for example by determining the age of those constraints in cache 264 .
  • The vehicle 104 may also send a request (not shown) to computing device 108 to retrieve a timestamp indicating the last time database 316 was modified. If that timestamp is more recent than the last update to cache 264, the determination at block 710 can be negative (even if the required constraints are present in cache 264).
  • When the determination at block 710 is affirmative, the vehicle 104 proceeds to block 715 and retrieves, from cache 264, the operational constraint data identified as being required at block 705.
  • When the determination at block 710 is negative, the vehicle 104 proceeds to block 720.
  • At block 720, the vehicle 104 is configured to transmit a request for operational constraint data to computing device 108.
  • the request can contain one or more of a location (e.g. the current location of vehicle 104 , another specified location in the facility, including a set of locations such as a path, and the like), and a zone type.
  • the request can include identifiers (e.g. of a zone type and location) corresponding to the required data identified at block 705 .
  • the location can also be omitted in order to request all available data for the facility.
  • zone types can also be omitted from the request to request data for all available types.
  • vehicle 104 can send a request without a location or zone type in order to request all available zone data.
  • computing device 108 can store, in database 316 , identifiers of zone types in association with identifiers of vehicles 104 .
  • vehicles 104 can omit zone types from requests and receive zone data of a specific type (or set of types) based on the above associations.
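  • Blocks 705-720 on the vehicle side might be sketched as follows, assuming a simple age-based staleness test for cache 264 and a dictionary request format (both are assumptions, not taken from the specification):

      import time
      from typing import Dict, List, Optional, Tuple

      class ConstraintCache:
          """Stands in for cache 264 in memory 254 of a vehicle 104."""
          def __init__(self, max_age_s: float = 300.0) -> None:
              self.max_age_s = max_age_s
              self._entries: Dict[str, Tuple[float, List[dict]]] = {}

          def put(self, zone_type: str, zones: List[dict]) -> None:
              self._entries[zone_type] = (time.time(), zones)

          def get_if_fresh(self, zone_type: str) -> Optional[List[dict]]:
              entry = self._entries.get(zone_type)
              if entry is None:
                  return None                 # block 710: not cached
              stored_at, zones = entry
              if time.time() - stored_at > self.max_age_s:
                  return None                 # block 710: cached but stale
              return zones                    # block 715: use cached data

      def build_request(zone_type: Optional[str] = None,
                        location: Optional[List[Tuple[float, float]]] = None) -> dict:
          """Block 720: omitting zone_type or location requests all available data."""
          request: dict = {}
          if zone_type is not None:
              request["zone_type"] = zone_type
          if location is not None:
              request["location"] = location
          return request

      cache = ConstraintCache()
      if cache.get_if_fresh("speed-limit") is None:
          print(build_request("speed-limit", [(0, 0), (10, 10)]))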
  • At block 725, computing device 108 is configured to receive the request via network 112.
  • At block 730, computing device 108 is configured to retrieve the data identified in the request and send the requested data to the vehicle 104.
  • Computing device 108 can be configured to retrieve the data in a variety of ways. For example, computing device 108 can be configured to select any zone having the type specified in the request and intersecting the location specified in the request. It will now be apparent that if no type was specified in the request, zones of all types may be retrieved at block 730 .
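  • The retrieval at block 730 can be illustrated with axis-aligned bounding boxes standing in for the zones' regions (an assumption made for brevity; the specification permits arbitrary volumes, areas, lines and points):

      from typing import List, Optional, Tuple

      Box = Tuple[float, float, float, float]  # (x_min, y_min, x_max, y_max)

      def boxes_intersect(a: Box, b: Box) -> bool:
          return a[0] <= b[2] and b[0] <= a[2] and a[1] <= b[3] and b[1] <= a[3]

      def retrieve(zones: List[dict], location: Optional[Box],
                   zone_type: Optional[str]) -> List[dict]:
          """Select zones matching the requested type (if any) whose regions
          intersect the requested location; if no type was specified, zones of
          all types may be retrieved, and if no location was specified, all
          regions match."""
          return [z for z in zones
                  if (zone_type is None or z["type"] == zone_type)
                  and (location is None or boxes_intersect(z["region"], location))]

      zones = [{"type": "speed-limit", "region": (0, 0, 10, 10), "max_speed": 1.5},
               {"type": "one-way", "region": (20, 0, 30, 10), "direction": 10.0}]
      print(retrieve(zones, location=(5, 5, 25, 8), zone_type=None))  # both match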
  • At block 735, the vehicle 104 is configured to receive the operational constraints sent by computing device 108 and store the operational constraints in cache 264. Proceeding then to block 740, the vehicle 104 is configured to update its operation based on the operational constraint data received from computing device 108 or retrieved from cache 264 at block 715. Updating the operation of the vehicle 104 can include updating the vehicle's trajectory, performing a preconfigured action, sending a signal to another device, or any of a wide variety of other operational behaviours. In general, the vehicle 104 is configured to complete the process that gave rise to the requirement identified at block 705.
  • The vehicle 104 is therefore configured, at block 740, to resume the process that led to the affirmative determination at block 705.
  • The received (or retrieved) operational constraint data, such as a speed limit, can be applied to that process; for example, during path generation the data can be incorporated by rerouting the path to avoid travelling against the direction mandated by one-way zones.
  • the vehicle 104 can be configured to initiate a predetermined behaviour based on the operational constraint data received from computing device 108 or retrieved from cache 264 .
  • each vehicle 104 can maintain, in memory 254 , sets of instructions defining specific routines, such as a sequence of movements for parking or docking the vehicle 104 .
  • a type of operational constraint can define parking zones within the facility, and thus at block 740 a vehicle 104 can be configured, having requested parking zone data, to initiate a parking routine upon determining that its current location is within a parking zone.
  • computing device 108 itself can perform the determination at block 705 .
  • For example, computing device 108, rather than vehicles 104, may be responsible for generating paths for vehicles 104.
  • the receipt of a request to generate a path can result in an affirmative determination at block 705 (at computing device 108 rather than a vehicle 104 ).
  • computing device 108 can be configured to perform blocks 720 , 725 , 730 (which would be performed internally within computing device 108 ) and 740 by retrieving operational constraints required for path generation, generating a path and sending the path to the relevant vehicle 104 .
  • Blocks 710 , 715 and 735 would be omitted from such embodiments.
  • vehicles 104 may request, at block 720 , a partial operational constraint or a binary decision, rather than complete operational constraint data.
  • a vehicle 104 can request a speed limit for the vehicle's current location.
  • Computing device 108 can transmit only a speed limit to vehicle 104 (or indeed, any other requested property), rather than the zone to which the speed limit applies.
  • the vehicle 104 can request confirmation that a current location of the vehicle is acceptable (i.e. complies with operational constraints).
  • Computing device 108 can perform the determination locally and send an indication of whether or not the vehicle's current location is acceptable (that is, without sending any operational constraints).
  • vehicles 104 can update operational constraints.
  • Vehicles 104 are equipped with sensors and can therefore gather data about their environments; vehicles 104 can thus be configured to compare data gathered via their sensors to operational constraint data in cache 264. When the comparison reveals that the operational constraint data does not match the sensed data, a vehicle 104 can send a request to computing device 108 to update an operational constraint (e.g. a request received by computing device 108 at block 410 of method 400).
  • a vehicle 104 may have a sensor capable of measuring the height of surrounding objects in the facility. The vehicle may therefore measure the height clearance in a region of the facility and determine that the height clearance specified in a corresponding operational constraint does not match the measurement.
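  • The mismatch-and-update path described above might be sketched as follows (the tolerance value and request format are assumptions):

      from typing import Optional

      def check_clearance(cached_max_height_m: float, sensed_clearance_m: float,
                          tolerance_m: float = 0.05) -> Optional[dict]:
          """Return an update request when sensed data contradicts the cached
          height-restriction property, or None when the constraint still
          matches the environment (to be received at block 410 of method 400)."""
          if abs(cached_max_height_m - sensed_clearance_m) <= tolerance_m:
              return None
          return {"action": "edit-zone",
                  "zone_type": "height-restricted",
                  "property": "max_height_m",
                  "value": sensed_clearance_m}

      # Cached constraint says 2.0 m, but the sensor measures only 1.7 m.
      print(check_clearance(2.0, 1.7))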

Abstract

Systems, methods and apparatus are provided for handling operational constraints for unmanned vehicles. The system includes: a plurality of mobile unmanned vehicles for deployment in an environment; a computing device connected to the plurality of unmanned vehicles via a network, the computing device storing, in a memory, a plurality of operational constraints; each operational constraint including (i) a type identifier, (ii) an indication of a region of the environment, and (iii) a property defining a constraint on the operation of the unmanned vehicles within the region. The computing device is configured to: receive a request from one of the mobile unmanned vehicles, the request identifying an operational constraint; responsive to receiving the request, retrieve an operational constraint from the memory based on the request; and send the retrieved operational constraint to the one of the mobile unmanned vehicles.

Description

    CROSS-REFERENCE TO RELATED APPLICATION
  • This application is a continuation of U.S. patent application Ser. No. 16/290,380, filed on Mar. 1, 2019, which is a continuation of U.S. patent application Ser. No. 15/167,002, filed on May 27, 2016, which claims the benefit of U.S. Provisional Application No. 62/168,511, filed on May 29, 2015. The entire contents of each of U.S. patent application Ser. No. 16/290,380, U.S. patent application Ser. No. 15/167,002 and U.S. Provisional Application No. 62/168,511 are hereby incorporated by reference.
  • FIELD
  • The specification relates generally to the control of unmanned vehicles, and specifically to a method, system and apparatus for handling operational constraints for the control of unmanned vehicles.
  • BACKGROUND
  • Mobile unmanned vehicles (also referred to as autonomous mobile robots or self-driving vehicles) operate in a wide variety of environments. Such environments can have various physical characteristics that the vehicles may be required to navigate around, interact with and the like, in order to operate successfully. Physical characteristics such as those mentioned above can generally be represented in maps of the environments stored by the vehicles. The operating environments of the vehicles, however, may also impose other restrictions on the operation of the vehicles that do not arise directly from physical characteristics of the environments, or that arise from physical characteristics that are not readily detectable by the vehicles. Such restrictions are less suitable for representation in maps, and can therefore render the operation of self-driving vehicles difficult.
  • SUMMARY
  • The specification is generally directed to systems, apparatuses and methods for generating and deploying operational constraints from a computing device to at least one self-driving vehicle. For example, a computing device in communication with one or more self-driving vehicles stores operational constraints associated with respective regions of an environment in which the self-driving vehicles are to operate. The operational constraints each contain a property defining a constraint on the operation of self-driving vehicles within the relevant region. The computing device provides operational constraints to a self-driving vehicle, either at the vehicle's request, or along with task assignments for the vehicle, or both, to control how the self-driving vehicle will operate in the region of the environment associated with the transmitted operational constraint.
  • According to an aspect of the specification, a system is provided, comprising: at least one mobile unmanned vehicle for deployment in an environment; a computing device connected to the at least one unmanned vehicle via a network, the computing device storing, in a memory, a plurality of operational constraints; each operational constraint including (i) a type identifier, (ii) an indication of a region of the environment, and (iii) a property defining a constraint on the operation of the at least one unmanned vehicle within the region; the computing device configured to: receive a request from one of the at least one mobile unmanned vehicle, the request identifying an operational constraint; responsive to receiving the request, retrieve an operational constraint from the memory based on the request; and send the retrieved operational constraint to the one of the at least one mobile unmanned vehicle.
  • According to another aspect of the specification, a method is provided in a system having at least one mobile unmanned vehicle for deployment in an environment and a computing device connected to the at least one unmanned vehicle via a network, the method comprising: storing, in a memory of the computing device, a plurality of operational constraints; each operational constraint including (i) a type identifier, (ii) an indication of a region of the environment, and (iii) a property defining a constraint on the operation of the at least one unmanned vehicle within the region; at the computing device: receiving a request from one of the at least one mobile unmanned vehicle, the request identifying an operational constraint; responsive to receiving the request, retrieving an operational constraint from the memory based on the request; and sending the retrieved operational constraint to the one of the at least one mobile unmanned vehicle.
  • According to a further aspect of the specification, a non-transitory computer-readable medium is provided storing computer-readable instructions for execution by a processor of a computing device for causing the computing device to perform a method comprising: storing a plurality of operational constraints; each operational constraint including (i) a type identifier, (ii) an indication of a region of an environment in which at least one mobile unmanned vehicle is to be deployed, and (iii) a property defining a constraint on the operation of the at least one unmanned vehicle within the region; receiving a request from one of the at least one mobile unmanned vehicle, the request identifying an operational constraint; responsive to receiving the request, retrieving an operational constraint from the memory based on the request; and sending the retrieved operational constraint to the one of the at least one mobile unmanned vehicle via the network.
  • BRIEF DESCRIPTION OF DRAWINGS
  • Embodiments are described with reference to the following figures, in which:
  • FIG. 1 depicts a system for controlling unmanned vehicles, according to a non-limiting embodiment;
  • FIG. 2 depicts certain components of an unmanned vehicle of the system of FIG. 1, according to a non-limiting embodiment;
  • FIG. 3 depicts certain internal components of the computing device of FIG. 1, according to a non-limiting embodiment;
  • FIG. 4 depicts a method of receiving and storing operational constraints in the system of FIG. 1, according to a non-limiting embodiment;
  • FIG. 5 depicts example interfaces presented by the computing device of FIG. 1 during the method of FIG. 4, according to a non-limiting embodiment;
  • FIG. 6 depicts an example data structure for storing the operational constraints received in the method of FIG. 4, according to a non-limiting embodiment;
  • FIG. 7 depicts a method of deploying the operational constraints received in the method of FIG. 4, according to a non-limiting embodiment.
  • DETAILED DESCRIPTION OF THE EMBODIMENTS
  • FIG. 1 depicts a system 100 including a plurality of self-driving vehicles, referred to herein as mobile unmanned vehicles 104-1, 104-2 and 104-3 (collectively referred to as unmanned vehicles 104, and generically referred to as an unmanned vehicle 104, or simply a vehicle 104; similar nomenclature is used for other reference numerals herein) for deployment in a facility, such as a manufacturing facility, warehouse or the like. The facility can be any one of, or any suitable combination of, a single building, a combination of buildings, an outdoor area, and the like. A greater or smaller number of unmanned vehicles 104 may be included in system 100 than the three shown in FIG. 1. Unmanned vehicles 104 can have a wide variety of operational characteristics (e.g. maximum payload, dimensions, weight, maximum speed, battery life, and the like).
  • System 100 also includes a computing device 108 for connection to unmanned vehicles 104 via a network 112. Computing device 108 can be connected to network 112 via, for example, a wired link 113, although wired link 113 can be any suitable combination of wired and wireless links in other embodiments. Unmanned vehicles 104 can be connected to network 112 via respective wireless links 114-1, 114-2 and 114-3. Links 114 can be any suitable combination of wired and wireless links in other examples, although generally wireless links are preferable to reduce or eliminate obstacles to the free movement of unmanned vehicles 104 about the facility. Network 112 can be any suitable one of, or any suitable combination of, wired and wireless networks, including local area networks (LAN or WLAN), wide area networks (WAN) such as the Internet, and mobile networks (e.g. GSM, LTE and the like).
  • Computing device 108 can control unmanned vehicles 104, for example by instructing unmanned vehicles 104 to carry out tasks within the facility. The nature of the tasks performed by unmanned vehicles 104 under the control of computing device 108 is not particularly limited. In general, the tasks assigned to unmanned vehicles 104 require unmanned vehicles 104 to perform various actions at various locations within the facility. Data defining the actions and locations are provided to unmanned vehicles 104 by computing device 108 via network 112.
  • The actions, items and locations mentioned above are not particularly limited. For example, an unmanned vehicle 104 can be instructed to simply travel to a specific location. In other examples, an unmanned vehicle 104 can be instructed to travel to a specified location and pick up, drop off, or otherwise manipulate, an item (e.g. a tool, container, and the like), or perform any other suitable action (e.g. park, begin a mapping algorithm, and so on). Locations include any regions within the facility bounded by coordinates. Such regions can be three-dimensional (i.e. volumes), two-dimensional (i.e. areas), one-dimensional (i.e. lines) or zero-dimensional (i.e. points).
  • In the present example, a first location 120 is illustrated, which may be employed to store items, such as an item 116 (e.g. a container). Location 120 can be an area defined on a floor of the facility for storage of items. A second location 124 is also illustrated, containing, for example, a work station where materials are to be removed from or placed in item 116, or where item 116 is to be labelled or otherwise modified. A wide variety of other work station activities will occur to those skilled in the art (e.g. welding stations, paint spray booths, and so on). A third location 128 is also illustrated in FIG. 1. In the present example, third location 128 contains a conveyor apparatus, which may carry item 116 to another part of the facility.
  • When a vehicle 104 is assigned a task by computing device 108, that vehicle 104 is configured to generate a path for completing the task (e.g. a path leading from the vehicle's current location to the end location of the task; the path may include one or more intermediate locations between the start location and the end location). In some embodiments, computing device 108 can assist the vehicle 104 in path generation (also referred to as path planning), or can generate the path without the involvement of the vehicle 104 and send the completed path to the vehicle 104 for execution.
  • Generation of the above-mentioned paths can be based on, for example, a map of the facility stored at one or both of computing device 108 and vehicles 104. Path generation may also depend on attributes of the relevant vehicle 104. For example, the map may indicate that a certain area of the facility contains constricted areas unsuitable for vehicles 104 greater than certain dimensions; if a vehicle 104 has dimensions greater than those of the constricted areas, a path may therefore be generated for that vehicle 104 that avoids the constricted areas.
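  • By way of non-limiting illustration, one simple way to honour such dimensional restrictions during path generation is to discard map edges whose clearance is smaller than the vehicle before planning. The sketch below assumes a hypothetical edge-list map representation; none of the names are drawn from the specification.

```python
def passable_edges(map_edges, vehicle_width):
    """Keep only map graph edges wide enough for the vehicle; running a
    planner over the remaining edges yields paths that avoid constricted
    areas. Each edge is assumed to carry a 'clearance' value in metres."""
    return [edge for edge in map_edges if edge["clearance"] >= vehicle_width]

# Example: a 0.9 m wide vehicle cannot use the 0.8 m aisle.
edges = [{"id": "aisle-1", "clearance": 0.8}, {"id": "aisle-2", "clearance": 1.4}]
print(passable_edges(edges, 0.9))  # [{'id': 'aisle-2', 'clearance': 1.4}]
```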
  • Various other information can also impact not only the generation of paths for vehicles 104, but also the execution of those paths by vehicles 104 and the performance of actions by vehicles 104 in connection with the paths. Such other information may not be amenable to storage in the above-mentioned map (e.g. because the information may not relate directly to physical features of the facility). As will be discussed in greater detail below, system 100 is configured to receive, store and deploy to vehicles 104 a wide variety of such other information, referred to broadly herein as operational constraints for vehicles 104.
  • Before describing the handling of operational constraints by system 100 in greater detail, an example vehicle 104 and certain internal components of computing device 108 will be described.
  • Referring now to FIG. 2, an example unmanned vehicle 104 is shown. In particular, unmanned vehicle 104-3 is depicted according to a non-limiting embodiment. Other vehicles 104 need not be identical to vehicle 104-3 as depicted, but are generally as described below. Unmanned vehicle 104-3 is depicted as a terrestrial vehicle, although it is contemplated that unmanned vehicles 104 can also include aerial vehicles and watercraft. Unmanned vehicle 104-3 includes a chassis 200 containing or otherwise supporting various other components, including one or more locomotive devices 204. Devices 204 in the present example are wheels, although in other embodiments any suitable locomotive device, or combination thereof, may be employed (e.g. tracks, propellers, and the like).
  • Locomotive devices 204 are powered by one or more motors (not shown) contained within chassis 200. The motors of unmanned vehicle 104-3 can be electric motors, internal combustion engines, or any other suitable motor or combination of motors. In general, the motors drive the locomotive devices 204 by drawing power from an energy storage device (not shown) supported on or within chassis 200. The nature of the energy storage device can vary based on the nature of the motors. For example, the energy storage can include batteries, combustible fuel tanks, or any suitable combination thereof.
  • Unmanned vehicle 104-3 also includes a load-bearing surface 208 (also referred to as a payload surface), for carrying an item such as item 116 thereon. In some examples, payload surface 208 can be replaced or supplemented with other payload-bearing equipment, such as a cradle, a manipulator arm, or the like.
  • Unmanned vehicle 104-3 can also include a variety of sensors. In the present example, such sensors include at least one load cell 212 coupled to payload surface 208, for measuring a force exerted on payload surface 208 (e.g. by an item being carried by unmanned vehicle 104-3). The sensors of unmanned vehicle 104-3 can also include machine vision sensors 216, such as any suitable one of, or any suitable combination of, barcode scanners, laser-based sensing devices (e.g. a LIDAR sensor), cameras and the like. Unmanned vehicle 104-3 can also include a location sensor (not shown) such as a GPS sensor, for detecting the location of unmanned vehicle 104-3 with respect to a frame of reference. The frame of reference is not particularly limited, and may be, for example, a global frame of reference (e.g. GPS coordinates), or a facility-specific frame of reference. Other sensors that can be provided with unmanned vehicle 104-3 include accelerometers, fuel-level or battery-level sensors, and the like.
  • Unmanned vehicle 104-3 can also include a control panel 220, as well as anchors 224 for securing items or other equipment to chassis 200, or for lifting chassis 200 (e.g. for maintenance). Unmanned vehicle 104-3 can also include any of a variety of other features, such as indicator lights 228.
  • In addition, unmanned vehicle 104-3 includes a central processing unit (CPU) 250, also referred to as a processor 250, interconnected with a non-transitory computer-readable medium such as a memory 254. Processor 250 and memory 254 are generally comprised of one or more integrated circuits (ICs), and can have a variety of structures, as will now occur to those skilled in the art (for example, more than one CPU can be provided). Memory 254 can be any suitable combination of volatile (e.g. Random Access Memory (“RAM”)) and non-volatile (e.g. read only memory (“ROM”), Electrically Erasable Programmable Read Only Memory (“EEPROM”), flash memory, magnetic computer storage device, or optical disc) memory.
  • Unmanned vehicle 104-3 also includes a communications interface 258 (e.g. a network interface controller or NIC) interconnected with processor 250. Via communications interface 258, link 114-3 and network 112, processor 250 can send and receive data to and from computing device 108. For example, unmanned vehicle 104-3 can send updated location data to computing device 108, and receive operational constraints from computing device 108.
  • Additionally, processor 250 is interconnected with the other components of unmanned vehicle 104-3 mentioned above, such as sensors 212 and 216 and control panel 220.
  • Memory 254 stores a plurality of computer-readable programming instructions, executable by processor 250, in the form of various applications, including a vehicle control application 262. As will be understood by those skilled in the art, processor 250 can execute the instructions of application 262 (and any other suitable applications stored in memory 254) in order to perform various actions defined within the instructions. In the description below processor 250, and more generally any vehicle 104, is said to be “configured to” perform certain actions. It will be understood that vehicles 104 are so configured via the execution of the instructions of the applications stored in memory 254. Memory 254 also stores a cache 264, to be discussed in greater detail below.
  • Turning now to FIG. 3, certain internal components of computing device 108 are illustrated. Computing device 108 can be any one of, or any combination of, a variety of computing devices. Such devices include desktop computers, servers, mobile computers such as laptops and tablet computers, and the like. Computing device 108 therefore includes at least one central processing unit (CPU), also referred to herein as a processor, 300. Processor 300 is interconnected with a non-transitory computer-readable medium such as a memory 304. Processor 300 is also interconnected with a communications interface 308.
  • Processor 300 and memory 304 are generally comprised of one or more integrated circuits (ICs), and can have a variety of structures, as will now occur to those skilled in the art (for example, more than one CPU can be provided). Memory 304 can be any suitable combination of volatile (e.g. Random Access Memory (“RAM”)) and non-volatile (e.g. read only memory (“ROM”), Electrically Erasable Programmable Read Only Memory (“EEPROM”), flash memory, magnetic computer storage device, or optical disc) memory.
  • Communications interface 308 allows computing device 108 to connect with other computing devices (e.g. unmanned vehicles 104) via network 112. Communications interface 308 therefore includes any necessary hardware (e.g. network interface controllers (NICs), radio units, and the like) to communicate with network 112 over link 113. Computing device 108 can also include input and output devices, such as keyboards, mice, displays, and the like (not shown).
  • Memory 304 stores a plurality of computer-readable programming instructions, executable by processor 300, in the form of various applications, including an operational constraints handling application 312. As will be understood by those skilled in the art, processor 300 can execute the instructions of application 312 (and any other suitable applications) in order to perform various actions defined within the instructions. In the description below processor 300, and more generally computing device 108, are said to be “configured to” perform those actions. It will be understood that they are so configured via the execution of the instructions of the applications stored in memory 304.
  • Memory 304, in the present example, also stores various types of data for retrieval, processing and updating during the execution of application 312. In particular, memory 304 stores an operational constraints database 316. Memory 304 may also store other data (not shown), such as a map of the facility of FIG. 1, as well as vehicle attributes, location-related data, and the like.
  • Turning now to FIG. 4, a method 400 for generating and storing operational constraints is illustrated. The performance of method 400 will be described in connection with its performance in system 100, although it is contemplated that method 400 can also be performed in other suitable systems. The blocks of method 400 as described below are performed by computing device 108, via the execution of application 312 by processor 300. In other embodiments, however, method 400 can also be performed by any of unmanned vehicles 104 (that is, by processor 250 of a given vehicle 104).
  • Beginning at block 405, computing device 108 is configured to receive and store operational constraint type definitions, also referred to as operational constraint templates or masters. Each operational constraint master defines a type of operational constraint, as well as the properties that can be assigned to that type of operational constraint. The nature of the receipt of operational constraint masters at block 405 is not particularly limited. For example, the masters may be received at processor 300 via input devices such as a keyboard and mouse. Upon receipt the operational constraint masters are stored in database 316.
  • Table 1 illustrates examples of operational constraint masters.
  • TABLE 1
    Operational Constraint Masters

    Type          Property Identifiers    Property Definitions
    Speed Limit   Location                space on map
                  Upper limit             speed in m/s
                  Lower limit             speed in m/s
                  Time                    period (start time, end time)
    One-Way       Location                space on map
                  Direction               direction on map; tolerance
  • In particular, two types of operational constraints are illustrated above. A speed limit operational constraint type provides a definition for creating speed limit operational constraints that apply to the facility of FIG. 1. As required by the master above, each speed limit operational constraint (also referred to herein as a zone) defines a space on the map of the facility where that constraint applies. The space may be a volume, an area, a line, or a point, and can be defined in a variety of ways (e.g. by coordinates). Thus, the location properties for zones correspond to “real” physical spaces in the facility. Each speed limit zone also defines an upper speed limit for unmanned vehicles 104 within the zone, a lower speed limit, and a time period during which the operational constraint applies. It will be appreciated that a wide variety of properties may be defined for each operational constraint type. Further, some properties may be indicated as mandatory, while others may be optional (e.g. a lower speed limit and the time property may be optional).
  • Another example of an operational constraint type is a one-way operational constraint. Based on the master above, each one-way zone includes a location, as well as a direction, with or without an associated tolerance. For example, a one-way zone may state that unmanned vehicles in a certain area of the facility must travel in a direction ten degrees east of north, plus or minus five degrees. As will now be apparent to those skilled in the art, a variety of other ways may also be employed to represent directions and tolerances.
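  • By way of non-limiting illustration, the masters of Table 1 and the zones instantiated from them lend themselves to simple structured representations. The following sketch is purely illustrative; the names and data shapes are hypothetical and do not appear in the specification.

```python
# Operational constraint masters: each entry names a zone type and the
# properties a zone of that type may define (cf. Table 1).
CONSTRAINT_MASTERS = {
    "speed_limit": {
        "location": "space on map",        # mandatory
        "upper_limit": "speed in m/s",     # mandatory
        "lower_limit": "speed in m/s",     # optional
        "time": "period (start, end)",     # optional
    },
    "one_way": {
        "location": "space on map",
        "direction": "direction on map, with optional tolerance",
    },
}

# A speed limit zone instantiated from the "speed_limit" master.
speed_zone = {
    "type": "speed_limit",
    "location": [(0.0, 0.0), (10.0, 0.0), (10.0, 5.0), (0.0, 5.0)],  # polygon
    "upper_limit": 1.5,           # m/s
    "lower_limit": 0.5,           # m/s
    "time": ("08:00", "18:00"),   # active period
}
```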
  • Having stored operational constraint templates, at block 410 computing device 108 is configured to receive a request to edit operational constraints. For example, computing device 108 can receive such a request in the form of input data from a keyboard or mouse, or via network 112 from another computing device. In some examples, the request can be received from a vehicle 104.
  • At block 415, computing device 108 is configured to determine whether the request received at block 410 is a request to edit an existing zone, or to create a new zone. For example, the request received at block 410 can be a selection of one of several user-selectable elements of a graphical user interface (GUI) presented on a display connected to processor 300. The determination at block 415 can therefore include a determination of which selectable element was selected (e.g. an element corresponding to zone creation, or an element corresponding to zone editing).
  • When the request received at block 410 is a request to create a new zone, performance of method 400 advances to block 420, at which computing device 108 retrieves the zone types defined by the masters received and stored at block 405. Thus, if Table 1 represents the currently stored zone masters, at block 420 computing device 108 retrieves the zone types “speed limit” and “one-way” and presents those zone types for selection. Presenting zone types for selection can involve controlling a display to render an interface 500, shown in FIG. 5, including selectable elements 504 and 508 corresponding to each retrieved zone type.
  • In other embodiments, rather than rendering the available zone types for selection at block 420, computing device 108 can present the available zone types for selection via other means, such as by transmitting the zone types via network 112 to another computing device.
  • Returning to FIG. 4, following the performance of block 420, computing device 108 is configured to receive a selection of one of the zone types presented at block 420, and in response to retrieve and present the properties defined by the master corresponding to the selected zone type. Thus, referring again to FIG. 5, an updated interface 512 can be rendered on a display connected to processor 300, in which selectable elements are provided for defining the location and direction properties defined in Table 1. For example, the location property can be defined by drawing (e.g. via input data received from a mouse, touch screen or the like) an area, volume or the like 516 on a rendering of the map of the facility. In other embodiments, the location can be specified by received input data in the form of coordinates on the map.
  • The direction property of the one-way zone can be specified, for example, by setting an angle 520 relative to an indicator of cardinal north 524 (or any other known direction). Angle 520 can be specified directly on the map shown in interface 512 in some embodiments, rather than as a separate interface element from the map (as shown in FIG. 5). In other embodiments, the direction can be specified by way of input data identifying an angle and a tolerance. Returning to FIG. 4, at block 430 computing device 108 is configured to store the newly received zone in database 316.
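  • A direction with a tolerance reduces to a simple angular comparison at run time. The following is a minimal, non-limiting sketch (hypothetical names; angles in degrees, measured clockwise from north):

```python
def heading_complies(heading_deg, zone_direction_deg, tolerance_deg):
    """Return True if a vehicle heading satisfies a one-way zone's direction
    property within its tolerance."""
    # Smallest signed angular difference, normalized to [-180, 180).
    diff = (heading_deg - zone_direction_deg + 180.0) % 360.0 - 180.0
    return abs(diff) <= tolerance_deg

# The example above: ten degrees east of north, plus or minus five degrees.
assert heading_complies(12.0, 10.0, 5.0)       # within tolerance
assert not heading_complies(20.0, 10.0, 5.0)   # outside tolerance
```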
  • If, on the other hand, the request received at block 410 is determined (at block 415) to be a request to edit an existing zone, then at block 435 computing device 108 can be configured to retrieve the existing zones in database 316 and present the zones for selection. In some embodiments, the existing zones can be retrieved and presented based on a zone type; for example, between blocks 415 and 435 computing device 108 can present an interface similar to interface 500 for receiving a selection of which zone type is to be edited.
  • Upon receipt of a selection of a specific zone to edit at block 440, computing device 108 retrieves the properties of that zone and presents the retrieved properties (e.g. on a display). The display of zone properties at block 440 can be implemented via the presentation of an interface such as interface 512 shown in FIG. 5. Performance of method 400 then proceeds to block 430, at which properties for the relevant zone are received and stored as described above. Editing properties presented at blocks 440 and 430 can include deleting properties. When the editing inputs received via an interface such as interface 512 include the removal of all properties for a zone (or selection of a “delete zone” element), the receipt and storage of properties at block 430 involves removing the zone from memory.
  • As a result of repeated performances of method 400 (or, at least, repeated performances of blocks 420-430), a plurality of operational constraints, or zones, can be maintained in database 316, each having a type, a location, and one or more properties.
  • In some embodiments, computing device 108 is configured to perform a validation or simulation at block 435, after receiving updates to operational constraints (e.g. new operational constraints or edited operational constraints). In particular, computing device 108 can be configured to detect conflicts in the operational constraints, such as one-way zones with incompatible direction properties (e.g. opposite directions). Computing device 108 can also be configured to detect potential conflicts between zones that are not overlapping but adjacent to each other. For example, when two speed limit zones are in close proximity, and one zone has a greater minimum speed than the maximum speed of the other zone, computing device 108 can compare the difference between the required speeds of the zones to the known accelerations and decelerations of unmanned vehicles 104. Computing device 108 can thus determine whether the proximity of the zones, in combination with their differing requirements, would result in unmanned vehicles 104 being unable to comply with the operational constraints when traversing both zones (e.g. because the vehicles 104 cannot accelerate or decelerate quickly enough). When conflicts or potential conflicts are detected, computing device 108 can generate a warning message, for example on the display mentioned in connection with FIG. 5.
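  • The adjacency check described above can be expressed with the standard constant-deceleration kinematics v_b² = v_a² − 2ad. The following non-limiting sketch (hypothetical names) flags two neighbouring speed limit zones as conflicting when the vehicle cannot shed the required speed over the gap between them:

```python
def zones_conflict(min_speed_a, max_speed_b, gap_m, max_decel):
    """Return True if a vehicle leaving zone A at A's minimum required speed
    cannot slow to zone B's maximum speed over the gap between the zones,
    given the vehicle's maximum deceleration in m/s^2."""
    if max_speed_b >= min_speed_a:
        return False          # no slowing required between the zones
    if gap_m <= 0.0:
        return True           # adjacent zones leave no room to decelerate
    # From v_b^2 = v_a^2 - 2*a*d, the deceleration required over the gap:
    required_decel = (min_speed_a ** 2 - max_speed_b ** 2) / (2.0 * gap_m)
    return required_decel > max_decel

# Zone A requires at least 2.0 m/s; zone B allows at most 1.0 m/s; the zones
# are 0.5 m apart and the vehicle can brake at 2.0 m/s^2: conflict.
print(zones_conflict(2.0, 1.0, 0.5, 2.0))  # True (3.0 m/s^2 would be needed)
```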
  • The zones and their properties can be stored in a wide variety of ways. For example, turning to FIG. 6, three layers 600, 604 and 608 of zones are depicted as stored in database 316. The layers can be depicted, for example, in a plurality of image files (e.g. vector or raster-based images). In other embodiments, the layers can be stored in a variety of other formats, such as tables of coordinates or the like. In the present example, each layer can store the locations of zones of a particular type as polygons (or volumes, points, lines or the like). Thus, layer 600 can contain one-way zone locations (e.g. locations for two zones, 612 and 616, are visible), layer 604 can contain speed limit locations, and layer 608 can contain locations for another type of zone (e.g. emergency aisles in the facility). Each layer (e.g. image file) can contain the remaining properties 620 corresponding to the zones, or can contain references to those properties stored separately in database 316. In other embodiments, database 316 can contain an index linking layers 600, 604, 608 and properties 620. Thus, in some embodiments, location and zone type can be considered primary properties of the zones, as those are the properties defining which layer the zone is depicted in, while the remaining properties can be considered secondary properties.
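  • As a non-limiting sketch of the layered arrangement of FIG. 6 (hypothetical names and data shapes), each layer can be modelled as a mapping from zone identifiers to locations, with secondary properties indexed separately:

```python
# One layer per zone type (primary properties: type and location); the
# remaining (secondary) properties are indexed separately.
layers = {
    "one_way": {
        612: [(0, 0), (4, 0), (4, 2), (0, 2)],   # polygon for zone 612
        616: [(6, 0), (9, 0), (9, 3), (6, 3)],   # polygon for zone 616
    },
    "speed_limit": {},
    "emergency_aisle": {},
}

secondary_properties = {
    612: {"direction": 10.0, "tolerance": 5.0},
    616: {"direction": 190.0, "tolerance": 5.0},
}

def zone(zone_id):
    """Join a zone's location (from its layer) with its secondary properties."""
    for zone_type, locations in layers.items():
        if zone_id in locations:
            return {"type": zone_type,
                    "location": locations[zone_id],
                    **secondary_properties.get(zone_id, {})}
    return None
```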
  • A wide variety of zone types and properties are contemplated, in addition to those discussed above. Other examples of zone types and corresponding properties (beyond those already discussed) include: stop before proceeding (e.g. with properties specifying a length of time to stop); restricted areas, such as the above-mentioned emergency aisles (e.g. with properties specifying that such zones are active only when an alarm is sounding in the facility); parking zones indicating areas where vehicles 104 may dock for periods of inactivity; height-restricted zones (e.g. with maximum permitted vehicle heights); weight-restricted zones (e.g. with maximum permitted vehicle weights); map quality zones (e.g. with properties indicating a level of confidence in the map in each zone). Other zone types and properties will also occur to those skilled in the art. As a further example, restricted area zones may include properties identifying classes of vehicles 104, or individual vehicles 104, that are prohibited from entering, or permitted to enter, such areas.
  • Still other examples of zone types include undetectable (by unmanned vehicles 104) physical features of the facility that are therefore less well-suited for representation in the map. For example, ramps or other such features of the facility may be represented as operational constraints. Further examples of properties include transitional properties associated with the boundaries of zones. For example, a speed limit zone may include a secondary speed limit property that is applied when an unmanned vehicle is within a predetermined distance of the edge of the zone.
  • Computing device 108 can also be configured to allocate tasks to unmanned vehicles 104 based on operational constraints. For example, computing device 108 can receive a task for assignment, and retrieve operational constraints associated with a location identified in the task. In combination with unmanned vehicle characteristics, computing device 108 can then select an unmanned vehicle 104 to perform the task (e.g. picking up an item in a specific location). For example, computing device 108 may exclude certain unmanned vehicles 104 from the above-mentioned selection when the task to be assigned lies within a height-restricted zone with a maximum height that is smaller than the known height of those unmanned vehicles 104.
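  • A non-limiting sketch of such exclusion follows; the point-in-region test is reduced to an axis-aligned bounding box purely for brevity, and all names are hypothetical:

```python
def contains(box, point):
    """Axis-aligned bounding box membership test (a stand-in for a full
    point-in-polygon check against the zone's location property)."""
    (x1, y1), (x2, y2) = box
    x, y = point
    return x1 <= x <= x2 and y1 <= y <= y2

def eligible_vehicles(vehicles, task_location, zones):
    """Exclude vehicles taller than the clearance of any height-restricted
    zone containing the task location."""
    blocking = [z for z in zones
                if z["type"] == "height_restricted"
                and contains(z["location"], task_location)]
    max_height = min((z["max_height"] for z in blocking), default=float("inf"))
    return [v for v in vehicles if v["height"] <= max_height]

fleet = [{"id": "104-1", "height": 1.2}, {"id": "104-2", "height": 2.1}]
zones = [{"type": "height_restricted",
          "location": ((0, 0), (10, 10)), "max_height": 1.5}]
print(eligible_vehicles(fleet, (3, 4), zones))  # only vehicle 104-1 remains
```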
  • Turning now to FIG. 7, a method 700 for deploying operational constraints is illustrated. Method 700 will be described in connection with its performance in system 100, although it is contemplated that method 700 can also be performed in other suitable systems.
  • Beginning at block 705, a vehicle 104 is configured to determine whether operational constraint data is required. The determination at block 705 can take various forms, and generally involves determining whether a navigational process executed by the processor 250 of the vehicle 104 requires operational constraint data. For example, the vehicle 104 can begin executing a path planning (i.e. path generation) process that incorporates operational constraint data, following receipt of a task assignment from computing device 108. In another example, the vehicle 104 can perform a path execution process to travel a previously generated path. The path execution process can incorporate operational constraint data. In some examples, the path generation and path execution processes can incorporate different operational constraints. For example, path generation may require the use of one-way zones, whereas path execution may require the use of speed limit zones (which can be ignored during path generation in some embodiments).
  • In other examples, the determination at block 705 can be a determination of whether a current location of the vehicle 104 is acceptable (i.e. complies with operational constraints). In still other examples, the vehicle 104 can initiate a mapping or localization process and determine that operational constraint data is required to complete the process.
  • When the determination at block 705 is negative, the unmanned vehicle 104 continues operating as before. When the determination at block 705 is affirmative, however, the vehicle 104 determines, at block 710, whether the required operational constraint data is present in cache 264. It is therefore contemplated that when the determination at block 705 is affirmative, the vehicle 104 is configured to identify required operational constraint data, such as a zone type, a location, or the like. For example, when block 705 is performed in connection with a path execution process, the required location may be a portion of the path (or the entire path), and the zone types may include any zone types relevant to path execution (e.g. speed limits).
  • The performance of block 710 thus includes examining the contents of cache 264 for operational constraints corresponding to any requirements identified at block 705. In some embodiments, the determination at block 710 can include simply determining whether operational constraints corresponding to those identified at block 705 are present in cache 264. In other embodiments, when the required operational constraints are present in cache 264, the vehicle 104 can also determine whether the required constraints are valid, for example by determining the age of those constraints in cache 264. The vehicle 104 may also send a request (not shown) to computing device 108 to retrieve a timestamp indicating the last time database 316 was modified. If the timestamp is more recent than the age of cache 264, the determination at block 710 can be negative (even if the required constraints are present in cache 264).
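  • A non-limiting sketch of the cache check at block 710 follows (hypothetical names; the staleness policy shown is illustrative only):

```python
import time

CACHE_TTL_S = 60.0  # hypothetical maximum acceptable cache age, in seconds

def cached_constraints(cache, zone_type, location_key, db_modified_ts):
    """Return cached constraint data if present and still valid (blocks
    710/715); returning None forces a request to the computing device
    (block 720). db_modified_ts is the last-modified timestamp reported
    by the computing device for the constraints database."""
    entry = cache.get((zone_type, location_key))
    if entry is None:
        return None                      # not cached at all
    if entry["fetched_at"] < db_modified_ts:
        return None                      # database changed since caching
    if time.time() - entry["fetched_at"] > CACHE_TTL_S:
        return None                      # entry considered stale
    return entry["data"]
```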
  • When the determination at block 710 is affirmative, the vehicle 104 proceeds to block 715 and retrieves, from cache 264, the operational constraint data identified as being required at block 705. When the determination at block 710 is negative, however, the vehicle 104 proceeds to block 720.
  • At block 720, the vehicle 104 is configured to transmit a request for operational constraint data to computing device 108. The request can contain one or more of a location (e.g. the current location of vehicle 104, another specified location in the facility, including a set of locations such as a path, and the like), and a zone type. As will now be apparent to those skilled in the art, the request can include identifiers (e.g. of a zone type and location) corresponding to the required data identified at block 705. The location can also be omitted in order to request all available data for the facility. Likewise, zone types can also be omitted from the request to request data for all available types. Thus, vehicle 104 can send a request without a location or zone type in order to request all available zone data. In further embodiments, computing device 108 can store, in database 316, identifiers of zone types in association with identifiers of vehicles 104. Thus, vehicles 104 can omit zone types from requests and receive zone data of a specific type (or set of types) based on the above associations.
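  • The request can therefore be represented as a small structure whose fields are all optional. A non-limiting sketch (hypothetical names):

```python
def build_constraint_request(location=None, zone_types=None):
    """Assemble a request for block 720. Omitting location asks for all
    available data for the facility; omitting zone_types asks for all
    types (or for the types associated with this vehicle, where the
    computing device stores such associations)."""
    request = {}
    if location is not None:
        request["location"] = location          # point, path, or region
    if zone_types is not None:
        request["zone_types"] = list(zone_types)
    return request

# All speed limit zones intersecting a planned path:
req = build_constraint_request(location=[(0, 0), (5, 0), (5, 5)],
                               zone_types=["speed_limit"])
```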
  • At block 725, computing device 108 is configured to receive the request via network 112. At block 730, responsive to receiving the request at block 725, computing device 108 is configured to retrieve the data identified in the request and send the requested data to the vehicle 104. Computing device 108 can be configured to retrieve the data in a variety of ways. For example, computing device 108 can be configured to select any zone having the type specified in the request and intersecting the location specified in the request. It will now be apparent that if no type was specified in the request, zones of all types may be retrieved at block 730.
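  • Server-side retrieval at block 730 then reduces to filtering the database by whichever fields are actually present in the request. A non-limiting sketch, with the geometric intersection test reduced to axis-aligned bounding boxes for brevity (all names hypothetical; locations are given as bounding boxes here):

```python
def boxes_intersect(box_a, box_b):
    """Axis-aligned box overlap, standing in for full polygon intersection."""
    (ax1, ay1), (ax2, ay2) = box_a
    (bx1, by1), (bx2, by2) = box_b
    return ax1 <= bx2 and bx1 <= ax2 and ay1 <= by2 and by1 <= ay2

def select_zones(database, request):
    """Select zones matching the requested type (if any) and intersecting
    the requested location (if any); with both fields absent, all zones
    are returned."""
    results = []
    for zone in database:
        if "zone_types" in request and zone["type"] not in request["zone_types"]:
            continue
        if "location" in request and not boxes_intersect(zone["location"],
                                                         request["location"]):
            continue
        results.append(zone)
    return results
```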
  • At block 735, the vehicle 104 is configured to receive the operational constraints sent by computing device 108 and store the operational constraints in cache 264. Proceeding then to block 740, the vehicle 104 is configured to update its operation based on the operational constraint data received from computing device 108 or retrieved from cache 264 at block 715. Updating the operation of the vehicle 104 can include updating the vehicle's trajectory, performing a preconfigured action, sending a signal to another device, or any of a wide variety of other operational behaviours. In general, the vehicle 104 is configured to complete the process that gave rise to the requirement identified at block 705.
  • The vehicle 104 is therefore configured, at block 740, to resume the process that led to the affirmative determination at block 705. When the process was a path execution process, the received (or retrieved) operational constraint data, such as a speed limit, can be incorporated into the path execution process to set a speed of the vehicle 104 during execution of the path. When the process was a path generation process, the operational constraint data can be incorporated into the process, for example by rerouting the path to avoid travelling against the direction mandated by one-way zones.
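  • For the path execution case, incorporating a speed limit can be as simple as clamping the commanded speed to the limits of whichever zones contain the vehicle's current pose. A non-limiting sketch (hypothetical names):

```python
def commanded_speed(desired_speed, active_zones):
    """Clamp the commanded speed (block 740) to the upper and lower limits
    of any speed limit zones containing the vehicle's current pose."""
    upper = min((z.get("upper_limit", float("inf")) for z in active_zones),
                default=float("inf"))
    lower = max((z.get("lower_limit", 0.0) for z in active_zones), default=0.0)
    return max(lower, min(desired_speed, upper))

# A 2.0 m/s desired speed inside a 0.5-1.5 m/s zone is clamped to 1.5 m/s.
print(commanded_speed(2.0, [{"upper_limit": 1.5, "lower_limit": 0.5}]))
```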
  • In further embodiments, the vehicle 104 can be configured to initiate a predetermined behaviour based on the operational constraint data received from computing device 108 or retrieved from cache 264. For example, each vehicle 104 can maintain, in memory 254, sets of instructions defining specific routines, such as a sequence of movements for parking or docking the vehicle 104. In some embodiments, a type of operational constraint can define parking zones within the facility, and thus at block 740 a vehicle 104 can be configured, having requested parking zone data, to initiate a parking routine upon determining that its current location is within a parking zone.
  • Variations to the above systems and methods are contemplated. For example, in some embodiments, computing device 108 itself can perform the determination at block 705. For example, in some systems computing device 108, rather than vehicles 104, is responsible for generating paths for vehicles 104. Thus, the receipt of a request to generate a path can result in an affirmative determination at block 705 (at computing device 108 rather than a vehicle 104). Responsive to such a determination, computing device 108 can be configured to perform blocks 720, 725, 730 (which would be performed internally within computing device 108) and 740 by retrieving operational constraints required for path generation, generating a path and sending the path to the relevant vehicle 104. Blocks 710, 715 and 735 would be omitted from such embodiments.
  • In a further variation, vehicles 104 may request, at block 720, a partial operational constraint or a binary decision, rather than complete operational constraint data. For example, a vehicle 104 can request a speed limit for the vehicle's current location. In response, computing device 108 can transmit only a speed limit to vehicle 104 (or indeed, any other requested property), rather than the zone to which the speed limit applies. In another example, the vehicle 104 can request confirmation that a current location of the vehicle is acceptable (i.e. complies with operational constraints). Rather than providing the operational constraints to the vehicle 104, computing device 108 can perform the determination locally and send an indication of whether or not the vehicle's current location is acceptable (that is, without sending any operational constraints).
  • In a further variation, vehicles 104 can update operational constraints. As noted earlier, vehicles 104 are equipped with sensors and can thus gather data about their environments. Vehicles 104 can thus be configured to compare data gathered via their sensors to operational constraint data in cache 264. When the comparison reveals that the operational constraint data does not match the sensed data, a vehicle 104 can send a request to computing device 108 to update an operational constraint (e.g. a request received by computing device 108 at block 410 of method 400). As an example of such a vehicle-driven update, a vehicle 104 may have a sensor capable of measuring the height of surrounding objects in the facility. The vehicle may therefore measure the height clearance in a region of the facility and determine that the height clearance specified in a corresponding operational constraint does not match the measurement.
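  • A non-limiting sketch of such a vehicle-driven update, using the height clearance example above, follows (hypothetical names; the tolerance shown is illustrative only):

```python
CLEARANCE_TOLERANCE_M = 0.05  # hypothetical measurement tolerance

def check_height_clearance(measured_clearance_m, constraint, send_request):
    """Compare a sensed height clearance against the cached constraint and,
    on a mismatch, ask the computing device to update the zone (a request
    of the kind received at block 410 of method 400). send_request stands
    in for the vehicle's transport to the computing device."""
    if abs(measured_clearance_m - constraint["max_height"]) > CLEARANCE_TOLERANCE_M:
        send_request({
            "action": "edit_zone",
            "zone_id": constraint["zone_id"],
            "properties": {"max_height": measured_clearance_m},
        })

# Example: cached clearance 2.0 m, measured 1.7 m -> an update is requested.
check_height_clearance(1.7, {"zone_id": 612, "max_height": 2.0}, print)
```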
  • The scope of the claims should not be limited by the embodiments set forth in the above examples, but should be given the broadest interpretation consistent with the description as a whole.

Claims (21)

1.-19. (canceled)
20. An unmanned vehicle for deployment in an environment, the unmanned vehicle comprising a processor and at least one sensor in communication with the processor, the processor configured to:
operate the unmanned vehicle to travel in a region of the environment;
determine an operational constraint associated with the region, wherein the operational constraint comprises (i) class identifier data that identifies a restriction within the region of the environment, and (ii) property data defining an operation of the unmanned vehicle within the region;
collect sensor data, via the at least one sensor, in association with the region;
compare the sensor data to the operational constraint;
in response to determining a mismatch between the sensor data and the operational constraint, update the operational constraint based on the sensor data; and
adjust operation of the unmanned vehicle based on the updated operational constraint.
21. The unmanned vehicle of claim 20, further comprising a memory in communication with the processor, wherein the memory stores a cache of operational constraints, and updating the operational constraint based on the sensor data comprises the processor being configured to update the operational constraint stored in the cache.
22. The unmanned vehicle of claim 21, further comprising a communication interface, and wherein updating the operational constraint based on the sensor data in response to determining the mismatch further comprises the processor being configured to:
transmit, using the communication interface via a network, a request to a computing device to update the operational constraint based on the sensor data.
23. The unmanned vehicle of claim 20, wherein the at least one sensor comprises one or more of a barcode scanner, a laser-based sensing device, a camera, or a location sensor.
24. The unmanned vehicle of claim 20, wherein the at least one sensor is configured to measure a height of surrounding objects.
25. The unmanned vehicle of claim 24, wherein the class identifier in the operational constraint indicates a height-restricted zone including a maximum vehicle height clearance, and the processor is configured to:
measure, via the at least one sensor, a height clearance in the region;
determine a mismatch between the maximum vehicle height clearance in the operational constraint and the measured height clearance; and
in response to determining the mismatch, update the operational constraint by updating the class identifier to include a new maximum vehicle height clearance corresponding to the measured height clearance.
26. The unmanned vehicle of claim 25, wherein the vehicle is travelling along a path that intersects the region, and in response to updating the class identifier in the operational constraint, the processor is further configured to:
determine that a height of the vehicle is greater than the new maximum vehicle height clearance in the updated operational constraint;
generate an updated path that avoids the region of the environment; and
adjust operation of the unmanned vehicle by controlling the unmanned vehicle to follow the updated path.
27. The unmanned vehicle of claim 26, wherein the processor is further configured to generate the path and the updated path in order to complete a task.
28. The unmanned vehicle of claim 20, wherein the region is at least one of an area or a volume within the environment.
29. A method for operating an unmanned vehicle deployed in an environment, the method comprising:
operating, via a processor of the unmanned vehicle, the unmanned vehicle to travel in a region of the environment;
determining, via the processor, an operational constraint associated with the region, wherein the operational constraint comprises (i) class identifier data that identifies a restriction within the region of the environment, and (ii) property data defining an operation of the unmanned vehicle within the region;
collecting, via at least one sensor of the unmanned vehicle coupled to the processor, sensor data in association with the region;
comparing the sensor data to the operational constraint;
in response to determining a mismatch between the sensor data and the operational constraint, updating the operational constraint based on the sensor data; and
adjusting, via the processor, operation of the unmanned vehicle based on the updated operational constraint.
30. The method of claim 29, wherein the unmanned vehicle further comprises a memory in communication with the processor, and the method further comprises:
storing, in the memory, a cache of operational constraints; and
updating the operational constraint stored in the cache.
31. The method of claim 30, wherein the unmanned vehicle further comprises a communication interface in communication with the processor, and in response to determining the mismatch, the method further comprises:
transmitting, via a network using the communication interface, a request to a computing device to update the operational constraint based on the sensor data.
32. The method of claim 29, wherein the at least one sensor comprises one or more of a barcode scanner, a laser-based sensing device, a camera, or a location sensor.
33. The method of claim 29, wherein the at least one sensor is configured to measure a height of surrounding objects.
34. The method of claim 29, wherein the class identifier of the operational constraint comprises a height-restricted zone that includes a maximum vehicle height clearance, and the method further comprises:
measuring, via the at least one sensor, a height clearance in the region;
determining a mismatch between the maximum vehicle height clearance in the operational constraint and the measured height clearance; and
in response to determining the mismatch, updating the operational constraint by updating the class identifier to include a new maximum vehicle height clearance corresponding to the measured height clearance.
35. The method of claim 34, wherein operating the unmanned vehicle to travel in a region of the environment comprises controlling the unmanned vehicle to follow a path that intersects the region.
36. The method of claim 35, wherein in response to updating the class identifier included in the operational constraint, the method further comprises:
determining that a height of the vehicle is greater than the new maximum vehicle height clearance in the updated operational constraint;
generating an updated path for the vehicle that avoids the region of the environment; and
adjusting operation of the unmanned vehicle by controlling the unmanned vehicle to follow the updated path.
37. The method of claim 29, further comprising:
generating the path and the updated path in order to complete a task.
38. The method of claim 29, wherein the region is at least one of an area or a volume within the environment.
39. A non-transitory computer-readable medium storing computer-readable instructions for execution by a processor of an unmanned vehicle for causing the unmanned vehicle to perform a method, the method comprising:
operating the unmanned vehicle to travel in a region of an environment;
determining an operational constraint associated with the region, wherein the operational constraint comprises (i) class identifier data that identifies a restriction within the region of the environment, and (ii) property data defining an operation of the unmanned vehicle within the region;
collecting, via at least one sensor of the unmanned vehicle, sensor data in association with the region;
comparing the sensor data to the operational constraint;
in response to determining a mismatch between the sensor data and the operational constraint, updating the operational constraint based on the sensor data; and
adjusting operation of the unmanned vehicle based on the updated operational constraint.
US17/185,503 2015-05-29 2021-02-25 Method, system and apparatus for handling operational constraints for control of unmanned vehicles Pending US20210311475A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/185,503 US20210311475A1 (en) 2015-05-29 2021-02-25 Method, system and apparatus for handling operational constraints for control of unmanned vehicles

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201562168511P 2015-05-29 2015-05-29
US15/167,002 US10241515B2 (en) 2015-05-29 2016-05-27 Method, system and apparatus for handling operational constraints for control of unmanned vehicles
US16/290,380 US10990100B2 (en) 2015-05-29 2019-03-01 Method, system and apparatus for handling operational constraints for control of unmanned vehicles
US17/185,503 US20210311475A1 (en) 2015-05-29 2021-02-25 Method, system and apparatus for handling operational constraints for control of unmanned vehicles

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US16/290,380 Continuation US10990100B2 (en) 2015-05-29 2019-03-01 Method, system and apparatus for handling operational constraints for control of unmanned vehicles

Publications (1)

Publication Number Publication Date
US20210311475A1 true US20210311475A1 (en) 2021-10-07

Family

ID=57398459

Family Applications (4)

Application Number Title Priority Date Filing Date
US15/166,533 Abandoned US20160349754A1 (en) 2015-05-29 2016-05-27 Method, system and apparatus for controlling self-driving vehicles
US15/167,002 Active 2036-06-22 US10241515B2 (en) 2015-05-29 2016-05-27 Method, system and apparatus for handling operational constraints for control of unmanned vehicles
US16/290,380 Active 2036-11-24 US10990100B2 (en) 2015-05-29 2019-03-01 Method, system and apparatus for handling operational constraints for control of unmanned vehicles
US17/185,503 Pending US20210311475A1 (en) 2015-05-29 2021-02-25 Method, system and apparatus for handling operational constraints for control of unmanned vehicles

Family Applications Before (3)

Application Number Title Priority Date Filing Date
US15/166,533 Abandoned US20160349754A1 (en) 2015-05-29 2016-05-27 Method, system and apparatus for controlling self-driving vehicles
US15/167,002 Active 2036-06-22 US10241515B2 (en) 2015-05-29 2016-05-27 Method, system and apparatus for handling operational constraints for control of unmanned vehicles
US16/290,380 Active 2036-11-24 US10990100B2 (en) 2015-05-29 2019-03-01 Method, system and apparatus for handling operational constraints for control of unmanned vehicles

Country Status (1)

Country Link
US (4) US20160349754A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210270632A1 (en) * 2020-02-24 2021-09-02 Neutron Holdings, Inc. Dba Lime Vehicle operation zone detection

Families Citing this family (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10431099B2 (en) * 2014-02-21 2019-10-01 FLIR Belgium BVBA Collision avoidance systems and methods
US9891630B2 (en) * 2014-10-24 2018-02-13 Clearpath Robotics, Inc. Variable reference frames in unmanned vehicles
US11586208B2 (en) 2014-10-24 2023-02-21 Clearpath Robotics Inc. Systems and methods for executing a task with an unmanned vehicle
US9804594B2 (en) * 2014-11-07 2017-10-31 Clearpath Robotics, Inc. Self-calibrating sensors and actuators for unmanned vehicles
US12084824B2 (en) 2015-03-06 2024-09-10 Walmart Apollo, Llc Shopping facility assistance systems, devices and methods
WO2016142794A1 (en) 2015-03-06 2016-09-15 Wal-Mart Stores, Inc Item monitoring system and method
US20180099846A1 (en) 2015-03-06 2018-04-12 Wal-Mart Stores, Inc. Method and apparatus for transporting a plurality of stacked motorized transport units
US20160255969A1 (en) 2015-03-06 2016-09-08 Wal-Mart Stores, Inc. Shopping facility assistance systems, devices and methods pertaining to movement of a mobile retail product display
CA2961938A1 (en) * 2016-04-01 2017-10-01 Wal-Mart Stores, Inc. Systems and methods for moving pallets via unmanned motorized unit-guided forklifts
WO2018039337A1 (en) 2016-08-23 2018-03-01 Canvas Technology, Inc. Autonomous cart for manufacturing and warehouse applications
US10585440B1 (en) 2017-01-23 2020-03-10 Clearpath Robotics Inc. Systems and methods for using human-operated material-transport vehicles with fleet-management systems
EP3367315A1 (en) * 2017-02-28 2018-08-29 TRUMPF Werkzeugmaschinen GmbH + Co. KG Production system and process for printing and transporting production items by means of an unmanned aerial vehicle
US10260890B2 (en) * 2017-04-21 2019-04-16 X Development Llc Aisle-based roadmap generation
US11760221B2 (en) 2017-06-27 2023-09-19 A9.Com, Inc. Charging systems and methods for autonomous carts
US10793369B2 (en) 2017-07-12 2020-10-06 A9.Com, Inc. Conveyor system for autonomous robot
US11417111B2 (en) 2017-12-22 2022-08-16 Terra Scientia, Llc Method, system and material for detecting objects of high interest with laser scanning systems
CN108415426A (en) * 2018-02-09 2018-08-17 深圳市七布创新科技有限公司 A kind of mobile control system and method
US10919700B2 (en) * 2018-02-21 2021-02-16 Spacesaver Corporation Mobile storage system with direct wireless connectivity
USD873173S1 (en) * 2018-08-28 2020-01-21 Asi Technologies, Inc. Automated guided vehicle
US11656626B2 (en) 2018-11-12 2023-05-23 Robotic Research Opco, Llc Autonomous truck loading for mining and construction applications
US11644843B2 (en) * 2018-11-12 2023-05-09 Robotic Research Opco, Llc Learning mechanism for autonomous trucks for mining and construction applications
US11353865B2 (en) 2018-11-13 2022-06-07 Robotic Research Opco, Llc Coordination of mining and construction vehicles via scripting control
US20220019235A1 (en) * 2018-11-29 2022-01-20 Nec Corporation Route search support apparatus, route search support method, and computer-readable recording medium
CN113678080A (en) * 2019-01-28 2021-11-19 维克多动力学股份公司 Robotic vehicle with safety measures
US10752157B1 (en) * 2019-03-15 2020-08-25 Fetch Robotics, Inc. Compact payload stopper for mobile robot comprising conveyor
DE102019212399A1 (en) * 2019-08-20 2021-02-25 Zf Friedrichshafen Ag Method for adapting a speed of an autonomous vehicle
US11713977B2 (en) * 2019-12-19 2023-08-01 Canon Kabushiki Kaisha Information processing apparatus, information processing method, and medium
CN111061278B (en) * 2019-12-31 2022-12-30 苏州极智嘉机器人有限公司 Path planning method and device, computer equipment and storage medium
NO346249B1 (en) * 2020-03-31 2022-05-09 Autostore Tech As Section based speed reduction
US12103773B2 (en) 2020-10-19 2024-10-01 Gideon Brothers d.o.o. Dynamic traversal protocol selection by autonomous robots in a facility context
CN113848869B (en) * 2021-10-20 2023-03-07 北京三快在线科技有限公司 Unmanned equipment control method and device, storage medium and electronic equipment

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140188324A1 (en) * 2011-08-29 2014-07-03 Crown Equipment Corporation Vehicular navigation control interface
US20140365258A1 (en) * 2012-02-08 2014-12-11 Adept Technology, Inc. Job management system for a fleet of autonomous mobile robots
US20150309485A1 (en) * 2013-03-11 2015-10-29 Hitachi, Ltd. Autonomous Control Device
US9358975B1 (en) * 2015-04-10 2016-06-07 Google Inc. Virtual moving safety limits for vehicles transporting objects

Family Cites Families (50)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5023790A (en) 1989-02-17 1991-06-11 Whs Robotics Automatic guided vehicle system
US20080183599A1 (en) 1998-11-18 2008-07-31 Visible Inventory, Inc. Inventory control and communication system
JP3484104B2 (en) 1998-12-25 2004-01-06 平田機工株式会社 Automatic warehouse and automatic warehouse management method
JP2001121461A (en) 1999-10-26 2001-05-08 Denso Corp Robot system
KR100640105B1 (en) 2001-04-19 2006-10-30 무라타 기카이 가부시키가이샤 Automated guided vehicle, automated guided vehicle system and wafer conveyance method
SE526913C2 (en) 2003-01-02 2005-11-15 Arnex Navigation Systems Ab Procedure in the form of intelligent functions for vehicles and automatic loading machines regarding mapping of terrain and material volumes, obstacle detection and control of vehicles and work tools
US7962192B2 (en) 2005-09-30 2011-06-14 Restoration Robotics, Inc. Systems and methods for aligning a tool with a desired location or object
US9436184B2 (en) 2006-06-09 2016-09-06 Amazon Technologies, Inc. Method and system for transporting inventory items
US7873469B2 (en) 2006-06-19 2011-01-18 Kiva Systems, Inc. System and method for managing mobile drive units
US7693757B2 (en) 2006-09-21 2010-04-06 International Business Machines Corporation System and method for performing inventory using a mobile inventory robot
JP4576445B2 (en) 2007-04-12 2010-11-10 パナソニック株式会社 Autonomous mobile device and program for autonomous mobile device
JP5047709B2 (en) 2007-07-04 2012-10-10 株式会社日立製作所 Moving device, system, moving method, and moving program
US8190295B1 (en) 2008-05-14 2012-05-29 Sandia Corporation Apparatus and method for modifying the operation of a robotic vehicle in a real environment, to emulate the operation of the robotic vehicle operating in a mixed reality environment
TWM348676U (en) 2008-07-22 2009-01-11 Iner Aec Executive Yuan Environmental survey robot
US7972102B2 (en) 2008-07-24 2011-07-05 Marine Terminals Corporation Automated marine container terminal and system
US8909466B2 (en) 2008-08-01 2014-12-09 Environmental Systems Research Institute, Inc. System and method for hybrid off-board navigation
US8126642B2 (en) 2008-10-24 2012-02-28 Gray & Company, Inc. Control and systems for autonomously driven vehicles
TWI680928B (en) 2009-04-10 2020-01-01 美商辛波提克有限責任公司 Vertical lift system and method for transferring uncontained case unit to and from a multilevel storage structure
FI20095713A (en) * 2009-06-24 2010-12-25 Sandvik Mining & Constr Oy Determination of driving route for arranging automatic control of a moving mining machine
GB2494081B (en) 2010-05-20 2015-11-11 Irobot Corp Mobile human interface robot
EP2668008A4 (en) 2011-01-28 2018-01-24 Intouch Technologies, Inc. Interfacing with a mobile telepresence robot
US20130086215A1 (en) 2011-05-13 2013-04-04 HNTB Holdings, Ltd. Managing large datasets obtained through a survey-data-acquisition process
US8886359B2 (en) 2011-05-17 2014-11-11 Fanuc Corporation Robot and spot welding robot with learning control function
US8583361B2 (en) * 2011-08-24 2013-11-12 Modular Mining Systems, Inc. Guided maneuvering of a mining vehicle to a target destination
DE102012003690A1 (en) 2012-02-23 2013-08-29 Kuka Roboter Gmbh Mobile robot
US9463574B2 (en) 2012-03-01 2016-10-11 Irobot Corporation Mobile inspection robot
US20140040431A1 (en) 2012-08-06 2014-02-06 General Electric Company Systems and methods for an opc ua server
US9220651B2 (en) 2012-09-28 2015-12-29 Elwha Llc Automated systems, devices, and methods for transporting and supporting patients
US9804576B2 (en) 2013-02-27 2017-10-31 Rockwell Automation Technologies, Inc. Recognition-based industrial automation control with position and derivative decision reference
US8965561B2 (en) 2013-03-15 2015-02-24 Cybernet Systems Corporation Automated warehousing using robotic forklifts
US9141107B2 (en) 2013-04-10 2015-09-22 Google Inc. Mapping active and inactive construction zones for autonomous driving
WO2015052830A1 (en) 2013-10-11 2015-04-16 株式会社日立製作所 Transport vehicle control device and transport vehicle control method
US9452531B2 (en) 2014-02-04 2016-09-27 Microsoft Technology Licensing, Llc Controlling a robot in the presence of a moving object
US9720410B2 (en) 2014-03-03 2017-08-01 Waymo Llc Remote assistance for autonomous vehicles in predetermined situations
US9465388B1 (en) 2014-03-03 2016-10-11 Google Inc. Remote assistance for an autonomous vehicle in low confidence situations
US9486917B2 (en) 2014-04-30 2016-11-08 The Boeing Company Mobile automated assembly tool for aircraft structures
US9996976B2 (en) * 2014-05-05 2018-06-12 Avigilon Fortress Corporation System and method for real-time overlay of map features onto a video feed
US9280153B1 (en) 2014-09-19 2016-03-08 Amazon Technologies, Inc. Inventory holder load detection and/or stabilization
US9824592B2 (en) 2014-09-22 2017-11-21 Vinveli Unmanned Systems, Inc. Method and apparatus for ensuring the operation and integrity of a three-dimensional integrated logistical system
CN107108122B (en) 2014-10-14 2019-10-18 新生代机器人公司 Storage material handling system
US9487356B1 (en) * 2015-03-02 2016-11-08 Amazon Technologies, Inc. Managing low-frequency inventory items in a fulfillment center
US9649766B2 (en) 2015-03-17 2017-05-16 Amazon Technologies, Inc. Systems and methods to facilitate human/robot interaction
CN107580691B (en) 2015-05-06 2021-03-19 克朗设备公司 Industrial vehicle for identifying fault occurrence sequenced tag
US9682481B2 (en) 2015-10-26 2017-06-20 X Development Llc Communication of information regarding a robot using an optical identifier
US9632502B1 (en) 2015-11-04 2017-04-25 Zoox, Inc. Machine-learning systems and techniques to optimize teleoperation and/or planner decisions
US9969495B2 (en) 2016-04-29 2018-05-15 United Parcel Service Of America, Inc. Unmanned aerial vehicle pick-up and delivery systems
KR102565501B1 (en) 2016-08-01 2023-08-11 삼성전자주식회사 A robotic cleaner, a refrigerator, a system of delivery of a container and a method of delivering and retrieving of a container of the refrigerator using the robotic cleaner
US10317119B2 (en) 2016-08-25 2019-06-11 Amazon Technologiess, Inc. Transportable climate-controlled units for fulfillment of perishable goods
US11097422B2 (en) 2017-02-07 2021-08-24 Veo Robotics, Inc. Safety-rated multi-cell workspace mapping and monitoring
JP7319958B2 (en) 2017-07-28 2023-08-02 ニューロ・インコーポレーテッド Designing Adaptive Compartments for Autonomous and Semi-Autonomous Vehicles

Also Published As

Publication number Publication date
US10241515B2 (en) 2019-03-26
US20160349749A1 (en) 2016-12-01
US20160349754A1 (en) 2016-12-01
US20190265704A1 (en) 2019-08-29
US10990100B2 (en) 2021-04-27

Similar Documents

Publication Publication Date Title
US20210311475A1 (en) Method, system and apparatus for handling operational constraints for control of unmanned vehicles
US10120390B2 (en) System, computing device, and method for unmanned vehicle fleet control
US11383382B2 (en) Safety system for integrated human/robotic environments
US10366366B1 (en) Entity tracking for kiva robotic floors
US11145206B2 (en) Roadmap segmentation for robotic device coordination
US11460863B2 (en) Systems and methods for unmanned vehicle fleet control
US9764470B2 (en) Selective deployment of robots to perform mapping
US20180306591A1 (en) Aisle-based Roadmap Generation
US10628790B1 (en) Automated floor expansion using an unmanned fiducial marker placement unit
JP2021039450A (en) System and method for design assist, and program
US12094351B2 (en) Systems and methods for providing obstacle information to aircraft operator displays
JP2023173092A (en) Object detection system, object detection method, and object detection program
US20240019872A1 (en) Movement control support device and method
US11994407B2 (en) Evaluation of a ground region for landing a robot
JP2020170438A (en) Work plan making system
KR102128551B1 (en) Apparatus for modeling of reconnaissance behavior and the method thereof
US20240233380A1 (en) Image processing apparatus, method, and program
JP2021181371A (en) Lifting support system, lifting support method, and lifting support program
JP2023167434A (en) Lifting support system, lifting support method, and lifting support program
Bossen Autonomous Drone Searching for People and Animals
JP2024113258A (en) Information processing system, information processing method, and information processing program
CN114510035A (en) Robot remote surveying method, device and storage medium
Wang et al. Offline perching location selection for quadrotor UAV in urban environment
CN114365199A (en) Modeling of underground construction sites
CN118192534A (en) Mobile body management device, management method, and storage medium

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

AS Assignment

Owner name: CLEARPATH ROBOTICS INC., CANADA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:GARIEPY, RYAN CHRISTOPHER;BENCZ, ALEX;BLAKEY, ANDREW CLIFFORD;AND OTHERS;SIGNING DATES FROM 20150603 TO 20150608;REEL/FRAME:057444/0206

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: ADVISORY ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STPP Information on status: patent application and granting procedure in general

Free format text: WITHDRAW FROM ISSUE AWAITING ACTION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

AS Assignment

Owner name: ROCKWELL AUTOMATION TECHNOLOGIES, INC., OHIO

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:ROCKWELL AUTOMATION, INC.;REEL/FRAME:067944/0982

Effective date: 20240625

Owner name: ROCKWELL AUTOMATION, INC., WISCONSIN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:CLEARPATH ROBOTICS, INC.;REEL/FRAME:067944/0916

Effective date: 20240621

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

AS Assignment

Owner name: ROCKWELL AUTOMATION, INC., WISCONSIN

Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE CONVEYING PARTY'S NAME FROM CLEARPATH ROBOTICS, INC. TO CLEARPATH ROBOTICS INC. (WITHOUT THE COMMA) PREVIOUSLY RECORDED ON REEL 67944 FRAME 916. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT;ASSIGNOR:CLEARPATH ROBOTICS INC.;REEL/FRAME:068233/0542

Effective date: 20240621

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED