WO2023019038A1 - Autonomous transport vehicle with vision system - Google Patents

Autonomous transport vehicle with vision system

Info

Publication number
WO2023019038A1
WO2023019038A1 (PCT Application No. PCT/US2022/072592)
Authority
WO
WIPO (PCT)
Prior art keywords
controller
payload
vehicle
guided vehicle
pose
Prior art date
Application number
PCT/US2022/072592
Other languages
French (fr)
Inventor
Akram ZAHDEH
Paul Besl
David GRATIANO
Alan Phillips
Stephen DEBARYSHE
Original Assignee
Symbotic Llc
Priority date
Filing date
Publication date
Priority claimed from US17/804,026 external-priority patent/US20230050980A1/en
Priority claimed from US17/804,039 external-priority patent/US20230107709A1/en
Application filed by Symbotic Llc filed Critical Symbotic Llc
Priority to KR1020237044782A priority Critical patent/KR20240046119A/en
Priority to CN202280052290.7A priority patent/CN117794845A/en
Priority to EP22856718.6A priority patent/EP4384470A1/en
Priority to CA3220378A priority patent/CA3220378A1/en
Publication of WO2023019038A1 publication Critical patent/WO2023019038A1/en

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B66 HOISTING; LIFTING; HAULING
    • B66F HOISTING, LIFTING, HAULING OR PUSHING, NOT OTHERWISE PROVIDED FOR, e.g. DEVICES WHICH APPLY A LIFTING OR PUSHING FORCE DIRECTLY TO THE SURFACE OF A LOAD
    • B66F9/00 Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes
    • B66F9/06 Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes movable, with their loads, on wheels or the like, e.g. fork-lift trucks
    • B66F9/063 Automatically guided
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B65 CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
    • B65G TRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
    • B65G1/00 Storing articles, individually or in orderly arrangement, in warehouses or magazines
    • B65G1/02 Storage devices
    • B65G1/04 Storage devices mechanical
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B65 CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
    • B65G TRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
    • B65G1/00 Storing articles, individually or in orderly arrangement, in warehouses or magazines
    • B65G1/02 Storage devices
    • B65G1/04 Storage devices mechanical
    • B65G1/0492 Storage devices mechanical with cars adapted to travel in storage aisles
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B65 CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
    • B65G TRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
    • B65G1/00 Storing articles, individually or in orderly arrangement, in warehouses or magazines
    • B65G1/02 Storage devices
    • B65G1/04 Storage devices mechanical
    • B65G1/06 Storage devices mechanical with means for presenting articles for removal at predetermined position or level
    • B65G1/065 Storage devices mechanical with means for presenting articles for removal at predetermined position or level with self propelled cars
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B65 CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
    • B65G TRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
    • B65G1/00 Storing articles, individually or in orderly arrangement, in warehouses or magazines
    • B65G1/02 Storage devices
    • B65G1/04 Storage devices mechanical
    • B65G1/137 Storage devices mechanical with arrangements or automatic control means for selecting which articles are to be removed
    • B65G1/1373 Storage devices mechanical with arrangements or automatic control means for selecting which articles are to be removed for fulfilling orders in warehouses
    • B65G1/1375 Storage devices mechanical with arrangements or automatic control means for selecting which articles are to be removed for fulfilling orders in warehouses the orders being assembled on a commissioning stacker-crane or truck
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B66 HOISTING; LIFTING; HAULING
    • B66F HOISTING, LIFTING, HAULING OR PUSHING, NOT OTHERWISE PROVIDED FOR, e.g. DEVICES WHICH APPLY A LIFTING OR PUSHING FORCE DIRECTLY TO THE SURFACE OF A LOAD
    • B66F9/00 Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes
    • B66F9/06 Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes movable, with their loads, on wheels or the like, e.g. fork-lift trucks
    • B66F9/075 Constructional features or details
    • B66F9/0755 Position control; Position detectors
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B66 HOISTING; LIFTING; HAULING
    • B66F HOISTING, LIFTING, HAULING OR PUSHING, NOT OTHERWISE PROVIDED FOR, e.g. DEVICES WHICH APPLY A LIFTING OR PUSHING FORCE DIRECTLY TO THE SURFACE OF A LOAD
    • B66F9/00 Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes
    • B66F9/06 Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes movable, with their loads, on wheels or the like, e.g. fork-lift trucks
    • B66F9/075 Constructional features or details
    • B66F9/07568 Steering arrangements
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B66 HOISTING; LIFTING; HAULING
    • B66F HOISTING, LIFTING, HAULING OR PUSHING, NOT OTHERWISE PROVIDED FOR, e.g. DEVICES WHICH APPLY A LIFTING OR PUSHING FORCE DIRECTLY TO THE SURFACE OF A LOAD
    • B66F9/00 Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes
    • B66F9/06 Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes movable, with their loads, on wheels or the like, e.g. fork-lift trucks
    • B66F9/075 Constructional features or details
    • B66F9/20 Means for actuating or controlling masts, platforms, or forks
    • B66F9/24 Electrical devices or systems
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0259 Control of position or course in two dimensions specially adapted to land vehicles using magnetic or electromagnetic means
    • G05D1/0261 Control of position or course in two dimensions specially adapted to land vehicles using magnetic or electromagnetic means using magnetic plots
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/20 Control system inputs
    • G05D1/24 Arrangements for determining position or orientation
    • G05D1/243 Means capturing signals occurring naturally from the environment, e.g. ambient optical, acoustic, gravitational or magnetic signals

Definitions

  • the disclosed embodiment generally relates to material handling systems, and more particularly, to transports for automated storage and retrieval systems.
  • Generally automated storage and retrieval systems employ autonomous vehicles that transport goods within the automated storage and retrieval system. These autonomous vehicles are guided throughout the automated storage and retrieval system by location beacons, capacitive or inductive proximity sensors, line following sensors, reflective beam sensors and other narrowly focused beam type sensors. These sensors may provide limited information for effecting navigation of the autonomous vehicles through the storage and retrieval system or provide limited information with respect to identification and discrimination of hazards that may be present throughout the automated storage and retrieval system.
  • autonomous transport vehicles in logistics/warehouse facilities are generally manufactured to have a predetermined form factor for an assigned task in a given environment.
  • These autonomous transport vehicles are constructed of a bespoke cast or machined chassis/frame.
  • the other components e.g., wheels, transfer arms, etc.
  • the transfer arms and payload bay of these autonomous transport vehicles may include numerous components (sensors, encoders, etc.) and motor assemblies for transferring payloads to and from the autonomous transport vehicles as well as for justifying payloads within the payload bay.
  • the motors and sensors may be substantially directly and continuously coupled to a power supply of the autonomous transport vehicle such as through an electrical bus bar.
  • FIG. 1A is a schematic block diagram of an exemplary storage and retrieval system facility incorporating aspects of the disclosed embodiment
  • Fig. 1B is a plan view illustration of the exemplary storage and retrieval system facility of Fig. 1A incorporating aspects of the disclosed embodiment;
  • FIG. 2 is an exemplary perspective illustration of an autonomous guided vehicle of the exemplary storage and retrieval system facility of Fig. 1A in accordance with aspects of the disclosed embodiment
  • FIGs. 3A and 3B are exemplary perspective illustrations of portions of the autonomous guided vehicle of Fig. 2 in accordance with aspects of the disclosed embodiment
  • Fig. 4A is an exemplary plan view illustration of the autonomous guided vehicle of Fig. 2 in accordance with aspects of the disclosed embodiment
  • Fig. 4B is an exemplary perspective illustration of a portion of the autonomous guided vehicle of Fig. 2 in accordance with aspects of the disclosed embodiment
  • Fig. 4C is an exemplary perspective illustration of a portion of the autonomous guided vehicle of Fig. 2 in accordance with aspects of the disclosed embodiment
  • Fig. 4D is an exemplary plan view illustration of the autonomous guided vehicle of Fig. 2 in accordance with aspects of the disclosed embodiment
  • FIGs. 5A, 5B, and 5C are exemplary illustrations of pose and location estimation in accordance with aspects of the disclosed embodiment
  • Fig. 6 is an exemplary plan view illustration of a portion of the autonomous guided vehicle of Fig. 2 in accordance with aspects of the disclosed embodiment
  • FIGs. 7A and 7B are respectively plan and perspective illustrations of a case unit illustrating a shelf invariant front face detection in accordance with aspects of the disclosed embodiment
  • FIG. 8 is an exemplary illustration of data captured by a supplemental sensor system of the autonomous guided vehicle of Fig. 2 in accordance with aspects of the disclosed embodiment;
  • Fig. 9A is an exemplary stereo vision image captured by a supplemental sensor system of the autonomous guided vehicle of Fig. 2 in accordance with aspects of the disclosed embodiment;
  • Fig. 9B is an exemplary augmented stereo vision image captured by a supplemental sensor system of the autonomous guided vehicle of Fig. 2 in accordance with aspects of the disclosed embodiment
  • Fig. 10A is an exemplary augmented image captured by a supplemental sensor system of the autonomous guided vehicle of Fig. 2 in accordance with aspects of the disclosed embodiment
  • Fig. 10B is an exemplary augmented stereo vision image captured by a supplemental sensor system of the autonomous guided vehicle of Fig. 2 in accordance with aspects of the disclosed embodiment
  • Fig. 11 is an exemplary block diagram illustrating a sensor selection depending on an operation mode of the autonomous guided vehicle of Fig. 2 in accordance with aspects of the disclosed embodiment
  • Fig. 12 is an exemplary flow diagram in accordance with aspects of the disclosed embodiment.
  • Fig. 13 is an exemplary flow diagram of a vision analysis effected by the autonomous guided vehicle of Fig. 2 in accordance with aspects of the disclosed embodiment
  • Fig. 14 is an exemplary flow diagram in accordance with aspects of the disclosed embodiment
  • Fig. 15 is an exemplary image captured by a supplemental sensor system of the autonomous guided vehicle of Fig. 2 in accordance with aspects of the disclosed embodiment
  • Fig. 16 is an exemplary flow diagram of an image analysis effected by the autonomous guided vehicle of Fig. 2 in accordance with aspects of the disclosed embodiment
  • Fig. 17 is an exemplary flow diagram of an image analysis collaboratively effected with a supplemental sensor system of the autonomous guided vehicle of Fig. 2 and an operator in accordance with aspects of the disclosed embodiment;
  • Fig. 18 is an exemplary flow diagram of an image analysis collaboratively effected with a supplemental sensor system of the autonomous guided vehicle of Fig. 2 and an operator in accordance with aspects of the disclosed embodiment;
  • Fig. 19 is an exemplary schematic block diagram of a portion of the autonomous transport vehicle of Fig. 2 in accordance with aspects of the disclosed embodiment
  • Fig. 20 is an exemplary schematic block diagram of a portion of the autonomous guided vehicle of Fig. 2 in accordance with aspects of the disclosed embodiment
  • Fig. 21 is an exemplary schematic charging logic block diagram for the autonomous transport vehicle of Fig. 2 in accordance with aspects of the disclosed embodiment
  • Fig. 22 is an exemplary protection circuit of the autonomous transport vehicle of Fig. 2 in accordance with aspects of the disclosed embodiment
  • Fig. 23 is an exemplary flow diagram in accordance with aspects of the disclosed embodiment.
  • Fig. 24 is an exemplary flow diagram in accordance with aspects of the disclosed embodiment.
  • Figs. 25A, 25B, and 25C are collectively an exemplary schematic of a control system of the autonomous transport vehicle of Fig. 2 in accordance with aspects of the disclosed embodiment;
  • Fig. 26 is an exemplary schematic illustration of a portion of the control system of Figs. 25A, 25B, and 25C in accordance with aspects of the disclosed embodiment
  • Fig. 27 is an exemplary flow diagram in accordance with aspects of the disclosed embodiment.
  • Fig. 28 is an exemplary flow diagram in accordance with aspects of the disclosed embodiment.

DETAILED DESCRIPTION
  • FIGs. 1A and 1B illustrate an exemplary automated storage and retrieval system 100 in accordance with aspects of the disclosed embodiment.
  • Although the aspects of the disclosed embodiment will be described with reference to the drawings, it should be understood that the aspects of the disclosed embodiment can be embodied in many forms.
  • any suitable size, shape or type of elements or materials could be used.
  • an autonomous transport vehicle 110 (also referred to herein as an autonomous guided vehicle) having a physical characteristic sensor system 276 that at least in part effects determination of at least one of a vehicle navigation pose or location and a payload pose or location.
  • the autonomous transport vehicle 110 includes a supplemental or auxiliary navigation sensor system 288 that supplements the information from the physical characteristic sensor system 276 to at least one of verify and increase the accuracy of the vehicle navigation pose or location and the payload pose or location.
  • the supplemental navigation sensor system 288 includes a vision system 400 that effects a reduction (e.g., compared to automated transport of case units with conventional vehicles lacking the supplemental sensor system described herein) in case unit transport errors and an increase in storage and retrieval system 100 operation efficiency.
  • the aspects of the disclosed embodiment also provide for an autonomous transport vehicle 110 having an autonomous navigation/operation sensor system 270 that effects at least in part determination of at least one of a vehicle navigation pose or location and a payload pose or location.
  • the autonomous transport vehicle 110 further includes a supplemental or auxiliary hazard sensor system 290 that supplements the information from the autonomous navigation/operation sensor system 270 for opportunistically determining or discriminating a presence of a predetermined physical characteristic of at least one object or spatial feature 299 (see, e.g., Figs. 4D and 15) within at least a portion of the facility 100 which the autonomous transport vehicle 110 is navigating (i.e., controller 122 is programmed to command the autonomous transport vehicle to different positions in the facility associated with effecting one or more predetermined payload autonomous transfer tasks).
  • controller 122 is programmed to command the autonomous transport vehicle to different positions in the facility associated with effecting one or more predetermined payload autonomous transfer tasks.
  • the vehicle navigates to the different positions with the navigation system and operates to effect the predetermined transfer tasks at the different positions separate and distinct from the captured image data by the supplemental hazard sensor system 290 in the different positions.
  • the opportunistic determination/discrimination of the presence of the predetermined physical characteristic of the object or spatial feature 299, incidental or peripheral to the vehicle 110 executing navigation and transfer, causes the controller 122 to selectably reconfigure the autonomous transport vehicle 110 from an autonomous state to a collaborative vehicle state for collaboration with an operator so as to finalize discrimination of the object or spatial feature 299 as a hazard and identify a mitigation action of the vehicle with respect to the hazard (i.e., the collaborative state is supplemental (auxiliary) to the autonomous state of the vehicle, wherein in the autonomous state the vehicle autonomously effects each of the one or more predetermined payload autonomous transfer tasks and in the auxiliary/collaborative state the vehicle collaborates with the operator to discriminate and mitigate hazards as described herein).
  • the supplemental navigation sensor system 288 and the supplemental hazard sensor system 290 may be used in conjunction with each other or separately and may form a common vision system 400 or separate vision systems.
  • the supplemental hazard sensor system 290 may include sensors from the supplemental navigation sensor system 288 or vice versa (i.e., the supplemental navigation sensor system 288 and the supplemental hazard sensor system 290 share common sensors between the two sensor systems).
  • the autonomous transport vehicle 110 includes at least stereo vision that is focused on at least a payload bed (or bay or area) 210B of the autonomous transport vehicle 110 so that a controller (such as one or more of a control server 120 of the storage and retrieval system 100, a controller 122 of the autonomous transport vehicle 110, or any other suitable controller) or human operator of the storage and retrieval system 100 monitors case unit CU movement to and from the payload bed 210B.
  • the autonomous transport vehicle 110 includes one or more imaging radar systems that independently measure (s) a size and a center point of front faces of case units CU disposed in storage spaces 130S on storage shelves of the storage level structure 130L.
  • the autonomous transport vehicle may include one or more other navigation and/or vision sensors to effect case unit transfer to and from the payload bed 210B and navigation of the autonomous transport vehicle 110 throughout a respective storage structure level 130L.
  • imaged or viewed objects described by one or more of supplemental information, supplemental vehicle navigation pose or location, and supplemental payload pose or location, from the supplemental sensor system are coapted (e.g., fit/combined) to a reference model (or maps - such as model 400VM) of one or more of surrounding facility features and interfacing facility features so as to enhance, via the one or more of the supplemental information, the supplemental vehicle navigation pose or location, and the supplemental payload pose or location, the resolution of one or more of vehicle navigation pose or location information and payload pose or location information.
  • the autonomous transport vehicle 110 may include a forward looking stereo (e.g., with respect to a direction of travel of the autonomous transport vehicle 110) vision system and a rearward looking (e.g., with respect to the direction of travel) vision system that are configured to effect localization of the autonomous transport vehicle 110 within the storage structure level 130L by detecting any suitable navigation markers or fiducials (e.g., floor tape/lines, structural beams of the storage structure level, storage facility features, etc.) in combination with a storage level floor map and storage structure information (e.g., a virtual model 400VM of locations of columns, storage shelves, storage buffers, floor joints, etc.).
  • the storage level map (or model) and storage structure information embody the location (s) of the navigation markers so that upon recognition of the markers by the vision system 400 the autonomous transport vehicle 110 determines its localized position within the storage and retrieval system 100.
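  • For illustration only (not part of the patent disclosure), a minimal Python sketch of marker-based localization of the general kind described above follows; it fits marker positions detected by a vision system to their known map positions with a 2D least-squares alignment. All function and variable names are hypothetical.

        # Illustrative sketch only: estimating the vehicle's planar pose from
        # navigation markers recognized by the vision system, given the markers'
        # known positions in a storage-level map (e.g., model 400VM).
        import numpy as np

        def estimate_vehicle_pose(detected_xy, map_xy):
            """detected_xy: Nx2 marker positions measured in the vehicle frame.
            map_xy: Nx2 known positions of the same markers in the facility frame.
            Returns (x, y, heading) of the vehicle in the facility frame."""
            d = np.asarray(detected_xy, dtype=float)
            m = np.asarray(map_xy, dtype=float)
            d_c, m_c = d - d.mean(axis=0), m - m.mean(axis=0)
            # 2D Kabsch/Procrustes fit: rotation mapping vehicle-frame points to map frame.
            h = d_c.T @ m_c
            u, _, vt = np.linalg.svd(h)
            r = vt.T @ u.T
            if np.linalg.det(r) < 0:          # guard against a reflection solution
                vt[-1, :] *= -1
                r = vt.T @ u.T
            t = m.mean(axis=0) - r @ d.mean(axis=0)
            heading = np.arctan2(r[1, 0], r[0, 0])
            return t[0], t[1], heading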
  • the autonomous transport vehicle 110 may include one or more cameras that face upward for detecting any suitable navigation markers or fiducials located on a ceiling of the storage structure level 130L and determining a localization of the autonomous transport vehicle 110 using the storage level floor map and storage structure information.
  • the autonomous transport vehicle 110 may include at least one sideways looking traffic monitoring camera that is configured to monitor autonomous transport vehicle traffic along transfer decks 130B of the storage and retrieval system 100 to facilitate autonomous transport vehicle 110 entry to a transfer deck 130B and merging of the autonomous transport vehicle 110 with other autonomous transport vehicles 110 already travelling along the transfer deck(s) 130B.
  • the autonomous transport vehicle 110 may also include a forward looking (e.g., with respect to a direction of travel of the autonomous transport vehicle 110) or omnidirectional (x, y, z, 0) vision system and/or a rearward looking (e.g., with respect to the direction of travel) vision system that is configured to effect imaging (available for continuous or periodical) for monitoring (supplemental to autonomous navigating sensor system 270) of the areas or spaces along autonomous travel paths of the autonomous transport vehicle 110 within, e.g., a storage structure level 130L and detecting any objects/hazards that may encroach on the bot travel path.
  • the vision system 400 may effect imaging for supplemental monitoring and detection (of the objects/hazards) by the controller 122 so that monitoring and detection is performed resident on (e.g., onboard) the autonomous transport vehicle 110, such as by employment of a reference storage level floor map and storage structure information (e.g., a virtual model 400VM of locations of columns, storage shelves, storage buffers, floor joints, etc.); and from indication by the controller 122 of such detection and in collaboration with a remote operator remotely accessing the vision system effecting collaborative monitoring/detecting/identifying/discriminating/mitigating of the object 299 (see Fig. 15) with the vehicle 110 in the collaborative state.
  • a determination of the object(s)/hazard(s) type(s) is effected, upon indication by the controller, by a remote operator receiving the images/video of the object/hazard transmitted from/by the autonomous transport vehicle 110 to the user interface UI.
  • the autonomous transport vehicle 110 includes a vision system controller 122VC disposed onboard the autonomous transport vehicle and communicably coupled to the vision system 400 of the autonomous transport vehicle 110.
  • the vision system controller 122VC is configured with model based vision in that the vision system controller 122VC simulates/models the storage and retrieval system 100 (e.g., based on any suitable information such as computer aided drafting (CAD) data of the storage and retrieval system structure or other suitable data stored in memory or accessible by the vision system controller 122VC that effects modeling/simulation of the storage and retrieval system 100) and compares the data obtained with the vision system 400 to the model/simulation of the storage and retrieval system structure to effect one or more of bot localization and imaging of the object/hazard.
  • the autonomous transport vehicle 110 is configured to compare what it "sees” with the vision system 400 substantially directly with what the autonomous transport vehicle 110 expects to "see” based on the simulation of the (reference) storage and retrieval system structure.
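  • For illustration only (not part of the patent disclosure), the following hedged sketch shows one way a "sees" versus "expects to see" comparison against a reference model such as virtual model 400VM might be organized; the names (detections, expected_features, tolerance) are hypothetical.

        # Illustrative sketch only: nearest-neighbour comparison of detected features
        # against features expected from the reference model at the current pose.
        import math

        def compare_to_model(detections, expected_features, tolerance=0.05):
            """detections / expected_features: lists of (x, y) points in the facility frame.
            Returns (matched pairs, unmatched detections); unmatched detections are
            candidates for further discrimination as objects/hazards."""
            matched, unmatched = [], []
            for d in detections:
                best, best_dist = None, float("inf")
                for e in expected_features:
                    dist = math.hypot(d[0] - e[0], d[1] - e[1])
                    if dist < best_dist:
                        best, best_dist = e, dist
                if best is not None and best_dist <= tolerance:
                    matched.append((d, best))   # residuals usable for localization refinement
                else:
                    unmatched.append(d)         # not in the model: possible object/hazard
            return matched, unmatched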
  • the supplemental sensor system also effects augmented reality operator inspection of the storage and retrieval system environment as well as remote control of the autonomous transport vehicle 110 as will be described herein.
  • the supplemental navigation sensor system 288 and/or the supplemental hazard sensor system 290 includes a vision system 400 that effects transmission (e.g., streaming live video, time stamped images, or any other suitable manner of transmission) of images/video to a remote operator for identification of the object/hazard present within the facility 100 (e.g., an object extending across the bot travel path, blocking the bot, proximate the bot within a predetermined distance) which is "unknown" (i.e., unidentifiable) by the autonomous transport vehicle 110.
  • a controller (such as one or more of a control server 120 of the storage and retrieval system 100, a controller 122 of the autonomous transport vehicle 110, the vision system controller 122VC, or any other suitable controller) or human operator of the storage and retrieval system 100 monitors, via the vision system 400, the bot travel paths as the autonomous transport vehicle 110 navigates the facility to perform autonomous storage and retrieval tasks in accordance with the controller 122 commands.
  • the vehicle 110 opportunistically discovers any objects/hazards within the facility 100 which could (based on predetermined initially identified criteria programmed in the controller 122) disrupt bot operations and/or traffic of other bots also navigating the facility 100 autonomously performing storage and retrieval tasks (i.e., the controller is configured so that determination of presence of object/hazard is coincident, at least in part, with, but supplemental and peripheral to bot actions (demanded for) effecting each of the one or more predetermined payload autonomous transfer tasks).
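  • For illustration only (not part of the patent disclosure), a minimal sketch of the autonomous-to-collaborative reconfiguration described above follows; the operator/user-interface calls are hypothetical placeholders, not an actual API.

        # Illustrative sketch only: switch from the autonomous state to a collaborative
        # state when an unidentifiable object/hazard is detected, stream images to a
        # remote operator, and apply the operator-selected mitigation action.
        from enum import Enum, auto

        class VehicleState(Enum):
            AUTONOMOUS = auto()
            COLLABORATIVE = auto()

        class HazardSupervisor:
            def __init__(self, operator_ui):
                self.state = VehicleState.AUTONOMOUS
                self.operator_ui = operator_ui          # hypothetical UI proxy

            def on_detection(self, detection):
                # Detection is incidental/peripheral to the vehicle's transfer tasks.
                if detection.get("identified", False):
                    return None                         # known feature: stay autonomous
                self.state = VehicleState.COLLABORATIVE
                self.operator_ui.send_images(detection["images"])
                action = self.operator_ui.wait_for_mitigation()   # e.g., "resume", "reroute", "stop"
                self.state = VehicleState.AUTONOMOUS
                return action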
  • each autonomous transport vehicle 110 is configured with a comprehensive power management section 444 (also referred to herein as a power distribution unit - see Fig. 19).
  • the power distribution unit 444 is configured to manage power needs of the autonomous transport vehicle 110 so as to preserve higher level functions/operations of the autonomous transport vehicle 110, the higher level functions being preserved depending on a charge level of a power supply of the autonomous transport vehicle 110.
  • control and drive operations may be preserved so that the autonomous transport vehicle 110 traverses to a charging station or maintenance location while other lower level functions of the autonomous transport vehicle (e.g., not needed for the traverse to the charging station or maintenance location) are shut down.
  • Managing low level systems of the autonomous transport vehicle 110 conserves charge of the onboard vehicle power source to improve the operational time of the autonomous transport vehicle 110 between charging operations and preserves autonomous transport vehicle controller functionality.
  • the power distribution unit 444 may also be configured to control a charge mode of a power supply 481 of the autonomous transport vehicle so as to maximize a number of charge cycles of the power supply 481.
  • the power distribution unit 444 monitors current draw for components (e.g., motors, sensors, controllers, etc. that are communicably coupled to the power source 481 on "branch circuits") of the autonomous transport vehicle 110 and manages (e.g., switches on and off) the power supply to each of the components to conserve the charge (e.g., energy usage) of the power supply 481.
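  • For illustration only (not part of the patent disclosure), a hedged sketch of charge-level-based branch-circuit management of the general kind described above follows; the branch names, priorities, and thresholds are assumptions, not values from the disclosure.

        # Illustrative sketch only: keep higher-level functions powered as the state of
        # charge of power supply 481 drops, shedding lower-priority branch circuits first.
        BRANCH_PRIORITY = {            # lower number = preserved longest
            "controller": 0,
            "drive_motors": 1,
            "navigation_sensors": 1,
            "transfer_arm_motors": 2,
            "justification_motors": 2,
            "auxiliary_loads": 3,
        }

        def branches_to_power(state_of_charge):
            """state_of_charge: 0.0-1.0. Returns the branch circuits to keep energized."""
            if state_of_charge > 0.50:
                max_priority = 3       # normal operation: everything powered
            elif state_of_charge > 0.25:
                max_priority = 2       # defer non-essential loads
            else:
                max_priority = 1       # preserve traverse to a charging/maintenance location
            return {name for name, p in BRANCH_PRIORITY.items() if p <= max_priority}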
  • the power distribution unit 444 may be configured to provide electric circuit fault protection (e.g., short circuit protection, over-voltage protection, over-current protection, etc.) for components of the autonomous transport vehicle 110 that are communicably coupled to the power supply 481 as loop devices or loop powered devices (e.g., a loop powered device is an electronic device that is connected in a transmitter loop, such as a current loop, without the need to have a separate or independent power source, where the electronic device employs the power from the current flowing in the loop for its operation).
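  • For illustration only (not part of the patent disclosure), a simple software-side fault check for a branch circuit is sketched below; the limits are hypothetical, and practical protection of this kind would normally also be implemented in hardware.

        # Illustrative sketch only: flag over-current/over-voltage conditions on a
        # branch circuit feeding a loop powered device.
        def check_branch(current_a, voltage_v, current_limit_a=2.0, v_min=20.0, v_max=28.0):
            """Returns a list of detected fault conditions for the branch circuit."""
            faults = []
            if current_a > current_limit_a:
                faults.append("over-current")              # possible short circuit
            if voltage_v > v_max:
                faults.append("over-voltage")
            if voltage_v < v_min and current_a > 0.0:
                faults.append("under-voltage / possible short")
            return faults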
  • the automated storage and retrieval system 100 in Figs. 1A and 1B may be disposed in a retail distribution center or warehouse, for example, to fulfill orders received from retail stores for replenishment goods shipped in cases, packages, and/or parcels.
  • case, package and parcel are used interchangeably herein and as noted before may be any container that may be used for shipping and may be filled with one or more product units by the producer.
  • Case or cases as used herein means case, package or parcel units not stored in trays, on totes, etc. (e.g., uncontained).
  • case units CU may include cases of items/unit (e.g., case of soup cans, boxes of cereal, etc.) or individual item/units that are adapted to be taken off of or placed on a pallet.
  • shipping cases or case units e.g., cartons, barrels, boxes, crates, jugs, shrink wrapped trays or groups or any other suitable device for holding case units
  • Case units may also include totes, boxes, and/or containers of one or more individual goods, unpacked/decommissioned (generally referred to as breakpack goods) from original packaging and placed into the tote, boxes, and/or containers (collectively referred to as totes) with one or more other individual goods of mixed or common types at an order fill station.
  • the content of each pallet may be uniform (e.g. each pallet holds a predetermined number of the same item - one pallet holds soup and another pallet holds cereal).
  • the cases of such pallet load may be substantially similar or in other words, homogenous cases (e.g. similar dimensions), and may have the same SKU (otherwise, as noted before the pallets may be "rainbow" pallets having layers formed of homogeneous cases).
  • the pallets may contain any suitable number and combination of different case units (e.g., each pallet may hold different types of case units - a pallet holds a combination of canned soup, cereal, beverage packs, cosmetics and household cleaners).
  • the cases combined onto a single pallet may have different dimensions and/or different SKU's.
  • the automated storage and retrieval system 100 may be generally described as a storage and retrieval engine 190 coupled to a palletizer 162.
  • the storage and retrieval system 100 may be configured for installation in, for example, existing warehouse structures or adapted to new warehouse structures.
  • the automated storage and retrieval system 100 shown in Figs. 1A and 1B is representative and may include, for example, in-feed and out-feed conveyors terminating on respective transfer stations 170, 160, lift module(s) 150A, 150B, a storage structure 130, and a number of autonomous transport vehicles 110 (also referred to herein as "bots").
  • the storage and retrieval engine 190 is formed at least by the storage structure 130 and the autonomous transport vehicles 110 (and in some aspect the lift modules 150A, 150B; however in other aspects the lift modules 150A, 150B may form vertical sequencers in addition to the storage and retrieval engine 190 as described in United States patent application number 17/091,265 filed on November 6, 2020 and titled "Pallet Building System with Flexible Sequencing, " the disclosure of which is incorporated herein by reference in its entirety).
  • the storage and retrieval system 100 may also include robot or bot transfer stations (not shown) that may provide an interface between the autonomous transport vehicles 110 and the lift module(s) 150A, 150B.
  • the storage structure 130 may include multiple levels of storage rack modules where each storage structure level 130L of the storage structure 130 includes respective picking aisles 130A, and transfer decks 130B for transferring case units between any of the storage areas of the storage structure 130 and a shelf of the lift module (s) 150A, 150B.
  • the picking aisles 130A are in one aspect configured to provide guided travel of the autonomous transport vehicles 110 (such as along rails 130AR) while in other aspects the picking aisles are configured to provide unrestrained travel of the autonomous transport vehicle 110 (e.g., the picking aisles are open and undeterministic with respect to autonomous transport vehicle 110 guidance/travel).
  • the transfer decks 130B have open and undeterministic bot support travel surfaces along which the autonomous transport vehicles 110 travel under guidance and control provided by bot steering (as will be described herein).
  • the transfer decks have multiple lanes between which the autonomous transport vehicles 110 freely transition for accessing the picking aisles 130A and/or lift modules 150A, 150B.
  • open and undeterministic denotes the travel surface of the picking aisle and/or the transfer deck has no mechanical restraints (such as guide rails) that delimit the travel of the autonomous transport vehicle 110 to any given path along the travel surface. It is noted that while the aspects of the disclosed embodiment are described with respect to a multilevel storage array, the aspects of the disclosed embodiment may be equally applied to a single level storage array that is disposed on a facility floor or elevated above the facility floor.
  • the picking aisles 130A, and transfer decks 130B also allow the autonomous transport vehicles 110 to place case units CU into picking stock and to retrieve ordered case units CU (and define the different positions where the bot performs autonomous tasks, though any number of locations in the storage structure (e.g., decks, aisles, storage racks, etc.) can be one or more of the different positions).
  • each level may also include respective bot transfer stations 140.
  • the autonomous transport vehicles 110 may be configured to place case units, such as the above described retail merchandise, into picking stock in the one or more storage structure levels 130L of the storage structure 130 and then selectively retrieve ordered case units for shipping the ordered case units to, for example, a store or other suitable location.
  • the in-feed transfer stations 170 and out-feed transfer stations 160 may operate together with their respective lift module(s) 150A, 150B for bi-directionally transferring case units CU to and from one or more storage structure levels 130L of the storage structure 130. It is noted that while the lift modules 150A, 150B may be described as being dedicated inbound lift modules 150A and outbound lift modules 150B, in alternate aspects each of the lift modules 150A, 150B may be used for both inbound and outbound transfer of case units from the storage and retrieval system 100.
  • the storage and retrieval system 100 may include multiple in-feed and out-feed lift modules 150A, 150B that are accessible by, for example, autonomous transport vehicles 110 of the storage and retrieval system 100 so that one or more case unit(s), uncontained (e.g., case unit(s) are not held in trays), or contained (within a tray or tote) can be transferred from a lift module 150A, 150B to each storage space on a respective level and from each storage space to any one of the lift modules 150A, 150B on a respective level.
  • the autonomous transport vehicles 110 may be configured to transfer the case units between the storage spaces 130S (e.g., located in the picking aisles 130A or other suitable storage space/case unit buffer disposed along the transfer deck 130B) and the lift modules 150A, 150B.
  • the lift modules 150A, 150B include at least one movable payload support that may move the case unit (s) between the in-feed and out-feed transfer stations 160, 170 and the respective level of the storage space where the case unit (s) is stored and retrieved.
  • the lift module(s) may have any suitable configuration, such as for example reciprocating lift, or any other suitable configuration.
  • the lift module (s) 150A, 150B include any suitable controller (such as control server 120 or other suitable controller coupled to control server 120, warehouse management system 2500, and/or palletizer controller 164, 164') and may form a sequencer or sorter in a manner similar to that described in United States patent application number 16/444,592 filed on June 18, 2019 and titled "Vertical Sequencer for Product Order Fulfillment" (the disclosure of which is incorporated herein by reference in its entirety).
  • the automated storage and retrieval system may include a control system, comprising for example one or more control servers 120 that are communicably connected to the in-feed and out-feed conveyors and transfer stations 170, 160, the lift modules 150A, 150B, and the autonomous transport vehicles 110 via a suitable communication and control network 180.
  • the communication and control network 180 may have any suitable architecture which, for example, may incorporate various programmable logic controllers (PLC) such as for commanding the operations of the in-feed and out-feed conveyors and transfer stations 170, 160, the lift modules 150A, 150B, and other suitable system automation.
  • the control server 120 may include high level programming that effects a case management system (CMS) managing the case flow system.
  • the network 180 may further include suitable communication for effecting a bi-directional interface with the autonomous transport vehicles 110.
  • the autonomous transport vehicles 110 may include an on-board processor/controller 122 (which is configured to effect at least control and safety functions of the autonomous transport vehicle 110 - see also Figs. 10A-10C).
  • the network 180 may include a suitable bi-directional communication suite enabling the autonomous transport vehicle controller 122 to request or receive commands from the control server 120 for effecting desired transport (e.g. placing into storage locations or retrieving from storage locations) of case units and to send desired autonomous transport vehicle 110 information and data including autonomous transport vehicle 110 ephemeris, status and other desired data, to the control server 120.
  • control server 120 may be further connected to a warehouse management system 2500 for providing, for example, inventory management, and customer order fulfillment information to the CMS level program of control server 120.
  • the control server 120, and/or the warehouse management system 2500 allow for a degree of collaborative control, at least of bots 110, via a user interface UI, as will be further described below.
  • a suitable example of an automated storage and retrieval system arranged for holding and storing case units is described in U.S. Patent No. 9,096,375, issued on August 4, 2015, the disclosure of which is incorporated by reference herein in its entirety.
  • the autonomous transport vehicle 110 (which may also be referred to herein as an autonomous guided vehicle or bot) includes a vehicle frame or chassis 200 (referred to herein as a frame) with a power supply 481 mounted therein and an integral payload support or bed 210B.
  • the frame 200 has a front end 200E1 and a back end 200E2 that define a longitudinal axis LAX of the autonomous transport vehicle 110.
  • the frame 200 may be constructed of any suitable material (e.g., steel, aluminum, composites, etc.).
  • powered sections are connected to the frame 200, where each powered section is powered by the power supply 481.
  • the powered sections include a drive section 261D, a payload handling section 210 (also referred to herein as a case handling assembly 210), and a peripheral electronics section 778.
  • the payload handling section or case handling assembly 210 is configured to handle cases/payloads transported by the autonomous transport vehicle 110.
  • the case handling assembly 210 has at least one payload handling actuator (e.g., transfer arm 210A) configured so that actuation of the payload handling actuator effects transfer of the payload (e.g., case unit) to and from the payload bed 210B, of the frame, and a storage (e.g., storage spaces 130S of storage shelves) in the facility.
  • the case handling assembly 210 includes the payload bed 210B (also referred to herein as a payload bay or payload hold) and is configured so as to move the payload bed in direction VER; in other aspects where the payload bed 210B is formed by the frame 200 the payload bed may be fixed/stationary in direction VER. As may be realized, payloads are placed on the payload bed 210B.
  • the transfer arm 210A is configured to (autonomously) transfer a payload (such as a case unit CU), with a flat undeterministic seating surface seated in the payload bed 210B, to and from the payload bed 210B of the autonomous guided vehicle 110 and a storage location (such as storage space 130S on storage shelf 555 (see Fig. 5A), a shelf of lift module 150A, 150B, buffer, transfer station, and/or any other suitable storage location), of the payload CU, in a storage array SA, where the storage location 130S, in the storage array SA, is separate and distinct from the transfer arm 210A and the payload bed 210B.
  • the transfer arm 210A is configured with extension motors 667A-667C and lift motor(s) 669 that configure the transfer arm 210A to extend laterally in direction LAT and/or vertically in direction VER to transport payloads to and from the payload bed 210B.
  • the payload bed 210B includes a front and rear justification module 210ARJ, 210AFJ configured to justify case units along the longitudinal axis LAX and laterally in direction LAT anywhere within the payload bed 210B.
  • the payload bed includes justification arms JAR (Figs. 10A and 10C) that are driven along the longitudinal axis by respective justification motors 668B, 668E so as to justify the case unit(s) CU along the longitudinal axis LAX.
  • Pushers JPS and pullers JPP may be movably mounted to the justification arms so as to be driven by respective motors 668A, 668C, 668D, 668F in direction LAT so as to justify the case unit (s) CU in direction LAT.
  • One or more of the motors 668A-668F may also be operated to clamp or grip the case unit(s) CU held in the payload bed 210B such as during case unit transport by the vehicle 110.
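  • For illustration only (not part of the patent disclosure), a hedged sketch of one possible justification sequence within the payload bed 210B follows; the motion-command API (move_justification_arms, move_pushers_pullers, clamp) is a hypothetical placeholder.

        # Illustrative sketch only: justify along the longitudinal axis LAX with the
        # justification arms JAR (motors 668B, 668E), then laterally in direction LAT
        # with the pushers JPS / pullers JPP (motors 668A, 668C, 668D, 668F), then
        # clamp the case unit(s) for transport.
        def justify_and_clamp(bed, target_lax_mm, target_lat_mm, clamp_force_n):
            bed.move_justification_arms(axis="LAX", position_mm=target_lax_mm)
            bed.move_pushers_pullers(axis="LAT", position_mm=target_lat_mm)
            bed.clamp(force_n=clamp_force_n)    # grip the case unit(s) during vehicle transport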
  • Examples of suitable payload beds 210B and transfer arms 210A and/or autonomous transport vehicles to which the aspects of the disclosed embodiment may be applied can be found in United States provisional patent application number 63/236,591, having attorney docket number 1127P015753-US (-#3) filed on August 24, 2021 and titled “Autonomous Transport Vehicle” as well as United States pre-grant publication number 2012/0189416 published on July 26, 2012 (United States patent application number 13/326,952 filed on December 15, 2011) and titled "Automated Bot with Transfer Arm”; United States patent number 7591630 issued on September 22, 2009 titled “Materials-Handling System Using Autonomous Transfer and Transport Vehicles”; United States patent number 7991505 issued on August 2, 2011 titled “Materials-Handling System Using Autonomous Transfer and Transport Vehicles”; United States patent number 9561905 issued on February 7, 2017 titled “Autonomous Transport Vehicle”; United States patent number 9082112 issued on July 14, 2015 titled "
  • the frame 200 includes one or more idler wheels or casters 250 disposed adjacent the front end 200E1.
  • the frame 200 also includes one or more drive wheels 260 disposed adjacent the back end 200E2.
  • the position of the casters 250 and drive wheels 260 may be reversed (e.g., the drive wheels 260 are disposed at the front end 200E1 and the casters 250 are disposed at the back end 200E2).
  • the autonomous transport vehicle 110 is configured to travel with the front end 200E1 leading the direction of travel or with the back end 200E2 leading the direction of travel.
  • casters 250A, 250B are located at respective front corners of the frame 200 at the front end 200E1 and drive wheels 260A, 260B (which are substantially similar to drive wheel 260 described herein) are located at respective back corners of the frame 200 at the back end 200E2 (e.g., a support wheel is located at each of the four corners of the frame 200) so that the autonomous transport vehicle 110 stably traverses the transfer deck(s) 130B and picking aisles 130A of the storage structure 130.
  • the autonomous transport vehicle 110 includes a drive section 261D, connected to the frame 200, having motors 261M that power (or drive) drive wheels 260 (supporting the autonomous transport vehicle 110 on a traverse/rolling surface 284), where the drive wheels 260 effect vehicle traverse on the traverse surface 284 moving the autonomous transport vehicle 110 over the traverse surface 284 in a facility (e.g., such as a warehouse, store, etc.) under autonomous guidance.
  • the drive section 261D has at least a pair of traction drive wheels 260 (also referred to as drive wheels 260 - see drive wheels 260A, 260B) astride the drive section 261D.
  • the drive wheels 260 have a fully independent suspension 280 coupling each drive wheel 260A, 260B of the at least pair of drive wheels 260 to the frame 200, with at least one intervening pivot link (described herein) between at least one drive wheel 260A, 260B and the frame 200 configured to maintain a substantially steady state traction contact patch between the at least one drive wheel 260A, 260B and rolling/travel surface 284 (also referred to as autonomous vehicle travel surface 284- see, e.g., Figs.
  • the frame 200 includes one or more casters 250 disposed adjacent the front end 200E1.
  • a caster 250 is located adjacent each front corner of the frame 200 so that in combination with the drive wheels 260 disposed at each rear corner of the frame 200, the frame 200 stably traverses the transfer deck 130B and picking aisles 130A of the storage structure 130.
  • each caster 250 comprises a motorized (e.g., active/motorized steering) caster 600M; however, in other aspects the caster 250 may be a passive (e.g., un-motorized) caster.
  • the motorized caster 600M includes a caster wheel 610 coupled to a fixed geometry wheel fork 640 (Fig.
  • Each motorized caster 600M is configured to actively pivot its respective caster wheel 610 (independent of the pivoting of other wheels of other motorized casters) in direction 690 about caster pivot axis 691 to at least assist in effecting a change in the travel direction of the autonomous transport vehicle 110.
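  • For illustration only (not part of the patent disclosure), the following sketch shows one way a steer angle could be computed for each motorized caster from a commanded planar motion, using standard rigid-body kinematics; the mount positions are assumptions.

        # Illustrative sketch only: pivot each motorized caster 600M so its wheel aligns
        # with the local velocity of its mounting point for a commanded (vx, vy, yaw_rate).
        import math

        def caster_steer_angles(vx, vy, yaw_rate, caster_mounts):
            """caster_mounts: dict name -> (px, py) mount position [m] in the vehicle frame.
            Returns dict name -> steer angle [rad] about the caster pivot axis."""
            angles = {}
            for name, (px, py) in caster_mounts.items():
                vpx = vx - yaw_rate * py      # rigid-body velocity of the mount point
                vpy = vy + yaw_rate * px
                angles[name] = math.atan2(vpy, vpx)
            return angles

        # e.g., front casters 250A, 250B (mount positions assumed for illustration):
        # caster_steer_angles(1.0, 0.0, 0.2, {"250A": (0.6, 0.3), "250B": (0.6, -0.3)})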
  • the autonomous transport vehicle 110 includes a physical characteristic sensor system 270 (also referred to as an autonomous navigation operation sensor system) connected to the frame 200.
  • the physical characteristic sensor system 270 has electro-magnetic sensors.
  • Each of the electro-magnetic sensors is responsive to interaction or interface of a sensor emitted or generated electro-magnetic beam or field with a physical characteristic (e.g., of the storage structure or a transient object such as a case unit CU, debris, etc.), where the electro-magnetic beam or field is disturbed by interaction or interface with the physical characteristic.
  • the disturbance in the electro-magnetic beam is detected by and effects sensing by the electro-magnetic sensor of the physical characteristic, wherein the physical characteristic sensor system 270 is configured to generate sensor data embodying at least one of a vehicle navigation pose or location (relative to the storage and retrieval system or facility in which the autonomous transport vehicle 110 operates) information and payload pose or location (relative to a storage location 130S or the payload bed 210B) information.
  • the physical characteristic sensor system 270 includes an autonomous pose and navigation sensor that includes, for exemplary purposes only, one or more of laser sensor(s) 271, ultrasonic sensor(s) 272, bar code scanner(s) 273, position sensor(s) 274, line sensor(s) 275, vehicle proximity sensor(s) 278, or any other suitable sensors for sensing a position of the vehicle 110.
  • the at least one payload handling sensor includes case sensors 278 (e.g., for sensing case units within the payload bed 210B onboard the vehicle 110 or on a storage shelf off-board the vehicle 110), arm proximity sensor(s) 277, or any other suitable sensors for sensing a payload (e.g., case unit CU) and its location/pose during autonomous transport vehicle handling of the payload CU.
  • supplemental navigation sensor system 288 may form a portion of the physical characteristic sensor system 270.
  • Suitable examples of sensors that may be included in the physical characteristic sensor system 270 are described in United States provisional patent application number 63/236,591 having attorney docket number 1127P015753-US (-#3) titled “Autonomous Transport Vehicle” and filed on August 24, 2021, as well as United States patent numbers 8,425,173 titled “Autonomous Transport for Storage and Retrieval Systems” issued on April 23, 2013, 9,008,884 titled “Bot Position Sensing” issued on April 14, 2015, and 9,946,265 titled “Bot Having High Speed Stability” issued on April 17, 2018, the disclosures of which are incorporated herein by reference in their entireties.
  • the sensors of the physical characteristic sensor system 270 may be configured to provide the autonomous transport vehicle 110 with, for example, awareness of its environment (in up to six degrees of freedom X, Y, Z, Rx, Ry, Rz - see Fig. 2) and external objects, as well as the monitor and control of internal subsystems.
  • the sensors may provide guidance information, payload information, or any other suitable information for use in operation of the autonomous transport vehicle 110 such as described herein and/or as described in, for example, United States provisional patent application having attorney docket number 1127P015753-US (-#3) titled “Autonomous Transport Vehicle” and having United States provisional application number 63/236,591 filed on August 24, 2021, the disclosure of which is incorporated herein by reference in its entirety.
  • the bar code scanner(s) 273 may be mounted on the autonomous transport vehicle 110 in any suitable location.
  • the bar code scanner(s) 273 may be configured to provide an absolute location of the autonomous transport vehicle 110 within the storage structure 130.
  • the bar code scanner(s) 273 may be configured to verify aisle references and locations on the transfer decks by, for example, reading bar codes located on, for example, the transfer decks, picking aisles and transfer station floors to verify a location of the autonomous transport vehicle 110.
  • the bar code scanner(s) 273 may also be configured to read bar codes located on items stored in the shelves 555.
  • the position sensors 274 may be mounted to the autonomous transport vehicle 110 at any suitable location.
  • the position sensors 274 may be configured to detect reference datum features (or count the slats 520L of the storage shelves 555) (e.g. see Fig. 5A) for determining a location of the vehicle 110 with respect to the shelving of, for example, the picking aisles 130A (or a buffer/transfer station located adjacent the transfer deck 130B or lift 150).
  • the reference datum information may be used by the controller 122 to, for example, correct the vehicle's odometry and allow the autonomous transport vehicle 110 to stop with the support tines 210AT of the transfer arm 210A positioned for insertion into the spaces between the slats 520L (see, e.g., Fig. 5A).
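  • As a minimal illustrative sketch only (not from the original disclosure; the slat pitch, starting offset, and function name are assumptions introduced here), the odometry correction at a detected reference datum may be expressed as follows:

        def datum_position_m(slat_index: int, slat_pitch_m: float, first_slat_offset_m: float) -> float:
            """Known aisle position of the detected reference datum (slat edge).

            On detection, the vehicle odometry estimate is reset to this value so that
            the support tines 210AT align with the spaces between slats; the pitch and
            offset are assumed facility constants, not values from the disclosure."""
            return first_slat_offset_m + slat_index * slat_pitch_m

        # e.g., the 15th slat at a 0.3 m pitch starting 1.2 m into the aisle -> 5.7 m
        print(datum_position_m(15, 0.3, 1.2))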
  • the vehicle 110 may include position sensors 274 on the drive (rear) end 200E2 and the driven (front) end 200E1 of the autonomous transport vehicle 110 to allow for reference datum detection regardless of which end of the autonomous transport vehicle 110 faces the direction of travel.
  • the line sensors 275 may be any suitable sensors mounted to the autonomous transport vehicle 110 in any suitable location, such as for exemplary purposes only, on the frame 200 disposed adjacent the drive (rear) and driven (front) ends 200E2, 200E1 of the autonomous transport vehicle 110.
  • the line sensors 275 may be diffuse infrared sensors.
  • the line sensors 275 may be configured to detect guidance lines 900 (see Figs. 9A and 15) provided on, for example, the floor of the transfer decks 130B.
  • the autonomous transport vehicle 110 may be configured to follow the guidance lines when travelling on the transfer decks 130B and to define the ends of turns when the vehicle is transitioning on or off the transfer decks 130B.
  • the line sensors 275 may also allow the vehicle 110 to detect index references for determining absolute localization where the index references are generated by crossed guidance lines (see Fig. 9A and 15).
  • the case sensors 276 may include case overhang sensors and/or other suitable sensors configured to detect the location/pose of a case unit CU within the payload bed 210B.
  • the case sensors 276 may be any suitable sensors that are positioned on the vehicle so that the sensor(s) field of view(s) span the payload bed 210B adjacent the top surface of the support tines 210AT (see Figs. 4A and 4B).
  • the case sensors 276 may be disposed at the edge of the payload bed 210B (e.g., adjacent a transport opening 1199 of the payload bed 210B) to detect any case units CU that are at least partially extending outside of the payload bed 210B.
  • the arm proximity sensors 277 may be mounted to the autonomous transport vehicle 110 in any suitable location, such as for example, on the transfer arm 210A.
  • the arm proximity sensors 277 may be configured to sense objects around the transfer arm 210A and/or support tines 210AT of the transfer arm 210A as the transfer arm 210A is raised/lowered and/or as the support tines 210AT are extended/retracted.
  • the laser sensors 271 and ultrasonic sensors 272 may be configured to allow the autonomous transport vehicle 110 to locate itself relative to each case unit forming the load carried by the autonomous transport vehicle 110 before the case units are picked from, for example, the storage shelves 555 and/or lift 150 (or any other location suitable for retrieving payload).
  • the laser sensors 271 and ultrasonic sensors 272 may also allow the vehicle to locate itself relative to empty storage locations 130S for placing case units in those empty storage locations 130S.
  • the laser sensors 271 and ultrasonic sensors 272 may also allow the autonomous transport vehicle 110 to confirm that a storage space (or other load depositing location) is empty before the payload carried by the autonomous transport vehicle 110 is deposited in, for example, the storage space 130S.
  • the laser sensor 271 may be mounted to the autonomous transport vehicle 110 at a suitable location for detecting edges of items to be transferred to (or from) the autonomous transport vehicle 110.
  • the laser sensor 271 may work in conjunction with, for example, retro-reflective tape (or other suitable reflective surface, coating or material) located at, for example, the back of the shelves 555 to enable the sensor to "see" all the way to the back of the storage shelves 555.
  • the reflective tape located at the back of the storage shelves allows the laser sensor 271 to be substantially unaffected by the color, reflectiveness, roundness, or other suitable characteristics of the items located on the shelves 555.
  • the ultrasonic sensor 272 may be configured to measure a distance from the autonomous transport vehicle 110 to the first item in a predetermined storage area of the shelves 555 to allow the autonomous transport vehicle 110 to determine the picking depth (e.g. the distance the support tines 210AT travel into the shelves 555 for picking the item(s) off of the shelves 555).
  • One or more of the laser sensors 271 and ultrasonic sensors 272 may allow for detection of case orientation (e.g. skewing of cases within the storage shelves 555) by, for example, measuring the distance between the autonomous transport vehicle 110 and a front surface of the case units to be picked as the autonomous transport vehicle 110 comes to a stop adjacent the case units to be picked.
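  • As a minimal illustrative sketch only (the function name and the assumption that two range readings are taken a known distance apart are introduced here, not taken from the disclosure), the case skew/yaw noted above may be estimated from two range measurements across the case front face:

        import math

        def estimate_case_yaw(range_a_m: float, range_b_m: float, spacing_m: float) -> float:
            """Estimate case yaw (radians) from two range readings to the case front
            face taken at points separated by spacing_m along the direction of travel;
            a yaw of 0 means the front face is parallel to the travel direction."""
            return math.atan2(range_b_m - range_a_m, spacing_m)

        # e.g., readings of 0.305 m and 0.320 m taken 0.250 m apart -> roughly 3.4 degrees of skew
        print(math.degrees(estimate_case_yaw(0.305, 0.320, 0.250)))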
  • the case sensors may allow verification of placement of a case unit on, for example, a storage shelf 555 by, for example, scanning the case unit after it is placed on the shelf.
  • Vehicle proximity sensors 278 may also be disposed on the frame 200 for determining the location of the autonomous transport vehicle 110 in the picking aisle 130A and/or relative to lifts 150.
  • the vehicle proximity sensors 278 are located on the autonomous transport vehicle 110 so as to sense targets or position determining features disposed on rails 130AR on which the vehicle 110 travels through the picking aisles 130A (and/or on walls of transfer areas 195 and/or lift 150 access location).
  • the positions of the targets on the rails 130AR are known so as to form incremental or absolute encoders along the rails 130AR.
  • the vehicle proximity sensors 278 sense the targets and provide sensor data to the controller 122 so that the controller 122 determines the position of the autonomous transport vehicle 110 along the picking aisle 130A based on the sensed targets.
  • the sensors of the physical characteristic sensing system 270 are communicably coupled to the controller 122 of the autonomous transport vehicle 110.
  • the controller 122 is operably connected to the drive section 261D and/or the transfer arm 210A.
  • the controller 122 is configured to determine from the information of the physical characteristic sensor system 270 vehicle pose and location (e.g., in up to six degrees of freedom, X, Y, Z, Rx, Ry, Rz) effecting independent guidance of the autonomous transport vehicle 110 traversing the storage and retrieval facility/system 100.
  • the controller 122 is also configured to determine from the information of the physical characteristic sensor system 270 payload (e.g., case unit CU) pose and location (onboard or off-board the autonomous transport vehicle 110) effecting independent underpick (e.g., lifting of the case unit CU from underneath the case unit CU) and place of the payload CU to and from a storage location 130S and independent underpick and place of the payload CU in the payload bed 210B.
  • the autonomous transport vehicle 110 includes a supplemental or auxiliary navigation sensor system 288, connected to the frame 200.
  • the supplemental navigation sensor system 288 supplements the physical characteristic sensor system 270.
  • the supplemental navigation sensor system 288 is, at least in part, a vision system 400 with cameras disposed to capture image data informing the at least one of a vehicle navigation pose or location (relative to the storage and retrieval system structure or facility in which the vehicle 110 operates) and payload pose or location (relative to the storage locations or payload bed 210B) that supplements the information of the physical characteristic sensor system 270.
  • the term "camera” described herein is a still imaging or video imaging device that includes one or more of a two-dimensional camera, a two dimensional camera with RGB (red, green, blue) pixels, a three-dimensional camera with XYZ+A definition (where XYZ is the three-dimensional reference frame of the camera and A is one of a radar return strength, a time of flight stamp, or other distance determination stamp/indicator), and an RGB/XYZ camera which includes both RGB and three-dimensional coordinate system information, non-limiting examples of which are provided herein. It should be understood that while the vision system 400 is described herein with respect to the autonomous transport vehicle 110 in other aspects the vision system 400 may be applied to a load handling device 150LHD (Fig.
  • the vision system 400 includes one or more of the following: case unit monitoring cameras 410A, 410B (collectively referred to as monitoring cameras 410), forward navigation cameras 420A, 420B and rearward navigation cameras 430A, 430B (collectively referred to herein as navigation cameras 430), one or more three-dimensional imaging system 440A, 440B, one or more case edge detection sensors 450A, 450B, one or more traffic monitoring cameras 460A, 460B (collectively referred to herein as traffic monitoring cameras 460), and one or more out of plane (e.g., upward or downward facing) localization cameras 477A, 477B (collectively referred to herein as localization cameras 477) (noting the downward facing cameras may supplement the line following sensors 275 of the physical characteristic sensor system 270 and provide a broader field of view than the line following sensors 275 so as to effect guidance/traverse of the vehicle 110 to place the guide lines within the field of view).
  • Images (static images and/or dynamic video images) from the different vision system 400 cameras are requested from the vision system controller 122VC by the controller 122 as desired for any given autonomous transport vehicle 110 task. For example, images are obtained by the controller 122 from at least one or more of the forward and rearward navigation cameras 420A, 420B, 430A, 430B to effect navigation of the autonomous transport vehicle along the transfer deck 130B and picking aisles 130A.
  • the controller 122 may obtain images from one or more of the three-dimensional imaging system 440A, 440B, the case edge detection sensors 450A, 450B, and the case unit monitoring cameras 410A, 410B, where these are employed to effect case handling by the vehicle 110.
  • Case handling includes picking and placing case units from case unit holding locations (such as case unit localization, verification of the case unit, and verification of placement of the case unit in the payload bed 210B and/or at a case unit holding location such as a storage shelf or buffer location).
  • Images from the out of plane localization cameras 477A, 477B may be obtained by the controller 122 to effect navigation of the autonomous transport vehicle and/or to provide data (e.g., image data) supplemental to localization/navigation data from the one or more of the forward and rearward navigation cameras 420A, 420B, 430A, 430B.
  • Images from the one or more traffic monitoring camera 460A, 460B may be obtained by the controller 122, where the traffic monitoring cameras 460 are employed to effect travel transitions of the autonomous transport vehicle 110 from a picking aisle 130A to the transfer deck 130B (e.g., entry to the transfer deck 130B and merging of the autonomous transport vehicle 110 with other autonomous transport vehicles travelling along the transfer deck 130B).
  • the case unit monitoring cameras 410A, 410B are any suitable high resolution or low resolution video cameras (where video images that include more than about 480 vertical scan lines and are captured at more than about 50 frames/second are considered high resolution).
  • the case unit monitoring cameras 410A, 410B are arranged relative to each other to form a stereo vision camera system that is configured to monitor case unit CU ingress to and egress from the payload bed 210B.
  • the case unit monitoring cameras 410A, 410B are coupled to the frame 200 in any suitable manner and are focused at least on the payload bed 210B.
  • the case unit monitoring cameras 410A, 410B are coupled to the transfer arm 210A so as to move in direction LAT with the transfer arm 210A (such as when picking and placing case units CU) and are positioned so as to be focused on the payload bed 210B and support tines 210AT of the transfer arm 210A.
  • the case unit monitoring cameras 410A, 410B effect at least in part one or more of case unit determination, case unit localization, case unit position verification, and verification of the case unit justification features (e.g., justification blades 471 and pushers 470) and case transfer features (e.g., tines 210AT, pullers 472, and payload bed floor 473).
  • the case unit monitoring cameras 410A, 410B detect one or more of case unit length CL, CL1, CL2, CL3, a case unit height CH1, CH2, CH3, and a case unit yaw YW (e.g., relative to the transfer arm 210A extension/retraction direction LAT).
  • the data from the case handling sensors may also provide the location/positions of the pushers 470, pullers 472, and justification blades 471, such as where the payload bed 210B is empty (e.g., not holding a case unit).
  • the case unit monitoring cameras 410A, 410B are also configured to effect, with the vision system controller 122VC, a determination of a front face case center point FFCP (e.g., in the X, Y, and Z directions with the case units disposed on a shelf or other holding area off-board the vehicle 110) relative to a reference location of the autonomous transport vehicle 110.
  • the reference location of the autonomous transport vehicle 110 may be defined by one or more justification surfaces of the payload bed 210B or the centerline CLPB of the payload bed 210B.
  • the front face case center point FFCP may be determined along the longitudinal axis LAX (e.g. in the Y direction) relative to a centerline CLPB of the payload bed 210B.
  • the front face case center point FFCP may be determined along the vertical axis VER (e.g. in the Z direction) relative to a case unit support plane PSP of the payload bed 210B (Fig. 4B - formed by one or more of the tines 210AT of the transfer arm 210A and the payload bed floor 473).
  • the front face case center point FFCP may be determined along the lateral axis LAT (e.g. in the X direction) relative to a justification plane surface JPP of the pushers 470 (Fig. 4A).
  • Determination of the front face case center point FFCP of the case units CU located on a storage shelf 555 or other case unit holding location provides, as non-limiting examples, for localization of the autonomous transport vehicle 110 relative to case units CU to be picked, mapping locations of case units within the storage structure (e.g., such as in a manner similar to that described in United States patent number 9,242,800 issued on January 26, 2016 titled "Storage and retrieval system case unit detection", the disclosure of which is incorporated herein by reference in its entirety), and/or pick and place accuracy relative to other case units on the storage shelf 555 (e.g., so as to maintain predetermined gap sizes between case units).
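  • As a minimal illustrative sketch only (the class and field names are assumptions, not the disclosed method), the front face case center point may be expressed as offsets from the vehicle reference surfaces described above:

        from dataclasses import dataclass

        @dataclass
        class Point3:
            x_m: float  # along lateral axis LAT (arm extension direction)
            y_m: float  # along longitudinal axis LAX
            z_m: float  # along vertical axis VER

        def ffcp_offsets(ffcp: Point3, justification_plane_x_m: float,
                         bed_centerline_y_m: float, support_plane_z_m: float) -> Point3:
            """Express a detected front face case center point FFCP as offsets from the
            justification plane JPP, the payload bed centerline CLPB, and the case unit
            support plane PSP (all reference values are assumed inputs)."""
            return Point3(x_m=ffcp.x_m - justification_plane_x_m,
                          y_m=ffcp.y_m - bed_centerline_y_m,
                          z_m=ffcp.z_m - support_plane_z_m)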
  • the determination of the front face case center point FFCP also effects a comparison of the "real world" environment in which the autonomous transport vehicle 110 is operating with the virtual model 400VM so that the controller 122 of the autonomous transport vehicle 110 compares what it "sees" with the vision system 400 substantially directly with what the autonomous transport vehicle 110 expects to "see" based on the simulation of the storage and retrieval system structure.
  • the object (case unit) and characteristics determined by the vision system controller 122VC are coapted (combined, overlaid) to the virtual model 400VM enhancing resolution, in up to six degrees of freedom resolution, of the object pose with respect to a facility reference frame.
  • registration of the cameras of the vision system 400 with the facility reference frame allows for enhanced resolution of vehicle 110 pose and/or location with respect to both a global reference (facility features rendered in the virtual model 400VM) and the imaged object. More particularly, object position discrepancies or anomalies apparent and identified upon coapting the object image and virtual model (e.g., edge spacing between case unit fiducial edges or case unit inclination or skew, with respect to the rack slats 520L of the virtual model 400VM), if greater than a predetermined nominal threshold, describe an errant pose of one or more of the case, rack, and/or vehicle 110. Discrimination as to whether the errancy is with the pose/location of the case, rack, or vehicle 110 (or more than one of these) is determined via comparison with pose data from the sensors 270 and the supplemental navigation sensor system 288.
  • the vision system 400 may determine the one case is skewed and provide the enhanced case position information to the controller 122 for operating the transfer arm 210A and positioning the transfer arm 210A so as to pick the one case based on the enhanced resolution of the case pose and location.
  • where the edge of a case is offset from a slat 520L by more than a predetermined threshold, the vision system 400 may generate a position error for the case; if the offset is within the threshold, the supplemental information from the supplemental navigation sensor system 288 enhances the pose/location resolution (e.g., an offset substantially equal to the determined pose/location of the case with respect to the slat 520L and the vehicle 110 payload bed 210B / transfer arm 210A frame).
  • the vision system may generate the case position error; however, if two or more juxtaposed cases are determined to be skewed relative to the slat 520L edges the vision system may generate a vehicle 110 pose error and effect repositioning of the vehicle 110 (e.g., correct the position of the vehicle 110 based on an offset determined from the supplemental navigation sensor system 288 supplemental information) or a service message to an operator (e.g., where the vision system 400 effects a "dash cam" collaborative mode (as described herein) that provides for remote control of the vehicle 110 by an operator with images (still and/or real time video) from the vision system being conveyed to the operator to effect the remote control operation).
  • the vehicle 110 may be stopped (e.g., does not traverse the picking aisle 130A or transfer deck 130B) until the operator initiates remote control of the vehicle 110.
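  • A hedged sketch of the discrimination described above (the threshold value, function name, and return labels are assumptions; the disclosure does not specify an implementation):

        def classify_offsets(case_offsets_m, threshold_m=0.01):
            """Classify per-case offsets of case edges from their nearest slat edge:
            no offsets beyond the threshold simply refine pose resolution; a single
            offending case suggests a case position error; two or more juxtaposed
            offending cases suggest a vehicle (or rack) pose error."""
            offending = [o for o in case_offsets_m if abs(o) > threshold_m]
            if not offending:
                return "within_threshold"
            if len(offending) == 1:
                return "case_position_error"
            return "vehicle_pose_error"  # may trigger repositioning or an operator service message

        print(classify_offsets([0.002, 0.018, 0.000]))  # -> case_position_error
        print(classify_offsets([0.022, 0.019, 0.001]))  # -> vehicle_pose_error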
  • the case unit monitoring cameras 410A, 410B may also provide feedback with respect to the positions of the case unit justification features and case transfer features of the autonomous transport vehicle 110 prior to and/or after picking/placing a case unit from, for example, a storage shelf or other holding locations (e.g., for verifying the locations/positions of the justification features and the case transfer features so as to effect pick/place of the case unit with the transfer arm 210A without transfer arm obstruction).
  • the case unit monitoring cameras 410A, 410B have a field of view that encompasses the payload bed 210B.
  • the vision system controller 122VC is configured to receive sensor data from the case unit monitoring cameras 410A, 410B and determine, with any suitable image recognition algorithms stored in a memory of or accessible by the vision system controller 122VC, positions of the pushers 470, justification blades 471, pullers 472, tines 210AT, and/or any other features of the payload bed 210B that engage a case unit held on the payload bed 210B.
  • the positions of the pushers 470, justification blades 471, pullers 472, tines 210AT, and/or any other features of the payload bed 210B may be employed by the controller 122 to verify a respective position of the pushers 470, justification blades 471, pullers 472, tines 210AT, and/or any other features of the payload bed 210B as determined by motor encoders or other respective position sensors; while in some aspects the positions determined by the vision system controller 122VC may be employed as a redundancy in the event of encoder/position sensor malfunction.
  • the justification position of the case unit CU within the payload bed 210B may also be verified by the case unit monitoring cameras 410A, 410B.
  • the vision system controller 122VC is configured to receive sensor data from the case unit monitoring cameras 410A, 410B and determine, with any suitable image recognition algorithms stored in a memory of or accessible by the vision system controller 122VC, a position of the case unit in the X, Y, Z directions relative to, for example, one or more of the centerline CLPB of the payload bed 210B, a reference/home position of the justification plane surface JPP of the pushers 470, and the case unit support plane PSP.
  • position determination of the case unit CU within the payload bed 210B effects at least place accuracy relative to other case units on the storage shelf 555 (e.g., so as to maintain predetermined gap sizes between case units).
  • the one or more three-dimensional imaging system 440A, 440B includes any suitable three-dimensional imager(s) including but not limited to, e.g., time-of-flight cameras, imaging radar systems, light detection and ranging (LIDAR), etc.
  • the one or more three-dimensional imaging system 440A, 440B may effect, with the vision system controller 122VC, a determination of a size (e.g., height and width) of the front face (i.e., the front face surface) of a case unit CU and front face case center point FFCP (e.g., in the X, Y, and Z directions) relative to a reference location of the autonomous transport vehicle 110 and invariant of a shelf supporting the case unit CU (e.g., the one or more three-dimensional imaging system 440A, 440B effects case unit CU location without reference to the shelf supporting the case unit CU and effects a determination as to whether the case unit is supported on a shelf through a determination of a shelf invariant characteristic of the case units).
  • the determination of the front face surface and case center point FFCP also effects a comparison of the "real world" environment in which the autonomous transport vehicle 110 is operating with the virtual model 400VM so that the controller 122 of the autonomous transport vehicle 110 compares what it "sees" with the vision system 400 substantially directly with what the autonomous transport vehicle 110 expects to "see" based on the simulation of the storage and retrieval system structure.
  • the image data obtained from the one or more three-dimensional imaging system 440A, 440B may supplement the image data from the cameras 410A, 410B in the event data from the cameras 410A, 410B is incomplete or missing.
  • the one or more three-dimensional imaging system 440A, 440B has a respective field of view that extends past the payload bed 210B substantially in direction LAT so that each three-dimensional imaging system 440A, 440B is disposed to sense case units CU adjacent to but external of the payload bed 210B (such as case units CU arranged so as to extend in one or more rows along a length of a picking aisle 130A (see Fig. 5A) or substrate buffer/transfer stations (similar in configuration to storage racks 599 and shelves 555 thereof disposed along the picking aisles 130A) arranged along the transfer deck 130B).
  • the field of view 440AF, 440BF of each three-dimensional imaging system 440A, 440B encompasses a volume of space 440AV, 440BV that extends a height 670 of a pick range of the autonomous transport vehicle 110 (e.g., a range/height in direction VER - Figs. and 8 - in which the arm 210A can move to pick/place case units to a shelf 555 or stacked shelves accessible from a common rolling surface 284 (e.g., of the transfer deck 130B or picking aisle 130A - see Fig. 2) on which the autonomous transport vehicle 110 rides).
  • the one or more three-dimensional imaging system 440A, 440B provides sensor data to the vision system controller 122VC that embodies at least the front face surfaces 800A, 800B, 800C of case units CU1, CU2, CU3, where such front face surface detection is detected/determined without reference to and regardless of the presence of a shelf supporting the case units.
  • the vision system controller 122VC determines if the case unit CU detected is disposed on a shelf with other case units through a determination of a shelf invariant characteristic common to each case unit disposed on the same shelf.
  • a case unit sitting/seated on a shelf 555 has a front face or front face surface 800 that is visible to the one or more three-dimensional imaging system 440A, 440B (and to the case unit monitoring cameras 410A, 410B).
  • the vision system controller 122VC determines a front face normal vector N that is normal to the front face surface 800.
  • the vision system controller 122VC (with any suitable image processing algorithms thereof) determines the bottom edge 777 (and vector B thereof) of the front face surface 800, where a shelf invariant characteristic of the case unit CU is derived from the front face normal vector N and the bottom edge 777.
  • an UP or Z axis vector U can be determined from the cross product of vectors N and B as follows:
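  • (The expression itself is not reproduced in this extract; a plausible reconstruction from the vectors defined above, offered as an assumption rather than as the original equation, is U = N × B, normalized to unit length where a unit up vector is required, i.e. U = (N × B) / |N × B|.)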
  • a center point P of the bottom edge 777 is determined by vision system controller 122VC (with any suitable image processing algorithms thereof) and a scalar equation of a plane (that represents the bottom surface of the case unit CU seated on the shelf 555) can be written as follows:
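  • (Again the expression is not reproduced here; a plausible reconstruction, offered as an assumption, is U · X = d for any point X on the bottom plane, with the offset d = U · P, equivalently U · (X − P) = 0.)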
  • (U, d) is the shelf invariant characteristic that is common to any case unit seated on the same shelf 555 (e.g., any case unit seated on the same shelf has the same shelf invariant feature vector within a predetermined tolerance).
  • the vision system controller 122VC can determine whether the case units CU1, CU2, CU3 (see Fig. 8) are disposed on the same shelf by scanning of case units CU1, CU2, CU3 with at least the one or more three-dimensional imaging system 440A, 440B and determining the shelf invariant characteristic.
  • the determination of the shelf invariant characteristic may effect, at least in part, comparison between what the vision system 400 of the autonomous transport vehicle 110 "sees” substantially directly with what the autonomous transport vehicle 110 expects to "see” based on the simulation of the storage and retrieval system structure. Determination of the shelf invariant characteristic may also effect placement of case units on the plane of the shelf 555 as determined from the shelf invariant characteristic.
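  • A minimal sketch of the shelf invariant computation and comparison described above (the tolerance values and function names are assumptions, and normalization of U is likewise an assumption):

        import numpy as np

        def shelf_invariant(front_face_normal, bottom_edge_vector, bottom_edge_center):
            """Return (U, d): U from the cross product of the front face normal N and the
            bottom edge vector B (normalized here by assumption), and d = U . P for the
            bottom edge center point P."""
            n = np.asarray(front_face_normal, dtype=float)
            b = np.asarray(bottom_edge_vector, dtype=float)
            p = np.asarray(bottom_edge_center, dtype=float)
            u = np.cross(n, b)
            u = u / np.linalg.norm(u)
            return u, float(np.dot(u, p))

        def on_same_shelf(inv_a, inv_b, angle_tol=0.05, offset_tol_m=0.02):
            """Two case units are taken to sit on the same shelf when their invariants
            agree within (assumed) tolerances: same up direction and same plane offset."""
            (u_a, d_a), (u_b, d_b) = inv_a, inv_b
            return float(np.dot(u_a, u_b)) > 1.0 - angle_tol and abs(d_a - d_b) < offset_tol_m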
  • the forward navigation cameras 420A, 420B are any suitable cameras configured to provide object detection and ranging.
  • the forward navigation cameras 420A, 420B may be placed on opposite sides of the longitudinal centerline LAXCL of the autonomous transport vehicle 110 and spaced apart by any suitable distance so that the forward facing fields of view 420AF, 420BF provide the autonomous transport vehicle 110 with stereo vision.
  • the forward navigation cameras 420A, 420B are any suitable high resolution or low resolution video cameras (where video images that include more than about 480 vertical scan lines and are captured at more than about 50 frames/second are considered high resolution), time-of-flight cameras, laser ranging cameras, or any other suitable cameras configured to provide object detection and ranging for effecting autonomous vehicle traverse along the transfer deck 130B and picking aisles 130A.
  • the rearward navigation cameras 430A, 430B may be substantially similar to the forward navigation cameras.
  • the forward navigation cameras 420A, 420B and the rear navigation cameras 430A, 430B provide for autonomous transport vehicle 110 navigation with obstacle detection and avoidance (with either end 200E1 of the autonomous transport vehicle 110 leading a direction of travel or trailing the direction of travel) as well as localization of the autonomous transport vehicle within the storage and retrieval system 100. Localization of the autonomous transport vehicle 110 may be effected by one or more of the forward navigation cameras 420A, 420B and the rearward navigation cameras 430A, 430B by detection of lines 900 on the travel/rolling surface 284 and/or by detection of suitable storage structure, including but not limited to storage rack (or other) structure 999.
  • the line detection and/or storage structure detection may be compared to floor maps and structure information (e.g., stored in a memory of or accessible by) of the vision system controller 122VC.
  • the forward navigation cameras 420A, 420B and the rearward navigation cameras 430A, 430B may also send signals to the controller 122 (inclusive of or through the vision system controller 122VC) so that as objects approach the autonomous transport vehicle 110 (with the autonomous transport vehicle 110 stopped or in motion) the autonomous transport vehicle 110 may be maneuvered (e.g., on the undeterministic rolling surface of the transfer deck 130B or within the picking aisle 130A (which may have a deterministic or undeterministic rolling surface)) to avoid the approaching object (e.g., another autonomous transport vehicle, case unit, or other transient object within the storage and retrieval system 100).
  • the forward navigation cameras 420A, 420B and the rear navigation cameras 430A, 430B may also provide for convoys of vehicles 110 along the picking aisles 130A or transfer deck 130B, where one vehicle 110 follows another vehicle 110 at predetermined fixed distances.
  • Fig. 1B illustrates a three vehicle 110 convoy where one vehicle closely follows another vehicle at the predetermined fixed distance.
  • the one or more case edge detection sensors 450A, 450B are any suitable sensors such as laser measurement sensors configured to scan the shelves of the storage and retrieval system to verify the shelves are clear for placing case units CU, or to verify a case unit size and position before picking the case unit CU. While one case edge detection sensor 450A, 450B is illustrated on each side of the payload bed 210B centerline CLPB (see Fig. 4A) there may be more or less than two case edge detection sensors placed at any suitable locations on the autonomous transport vehicle 110 so that the vehicle 110 can traverse by and scan case units CU with the front end 200E1 leading a direction of vehicle travel or the rear/back end 200E2 leading the direction of vehicle travel.
  • the one or more traffic monitoring cameras 460A, 460B are disposed on the frame 200 so that a respective field of view 460AF, 460BF faces laterally in lateral direction LAT1. While the one or more traffic monitoring cameras 460A, 460B are illustrated as being adjacent a transfer opening 1199 of the transfer bed 210B (e.g., on the pick side from which the arm 210A of the autonomous transport vehicle 110 extends), in other aspects there may be traffic monitoring cameras disposed on the non-pick side of the frame 200 so that a field of view of the traffic monitoring cameras faces laterally in direction LAT2.
  • the traffic monitoring cameras 460A, 460B provide for an autonomous merging of autonomous transport vehicles 110 exiting, for example, a picking aisle 130A or lift transfer area 195 onto the transfer deck 130B (see Fig. 1B).
  • the autonomous transport vehicle 110 leaving the lift transfer area 195 detects autonomous transport vehicle 110T travelling along the transfer deck 130B.
  • the controller 122 autonomously strategizes merging (e.g., entering the transfer deck in front of or behind the autonomous transport vehicle 110T, acceleration onto the transfer deck based on a speed of the approaching vehicle 110T, etc.) onto the transfer deck based on information (e.g., distance, speed, etc.) of the vehicle 110T gathered by the traffic monitoring cameras 460A, 460B and communicated to and processed by the vision system controller 122VC.
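  • A hedged sketch only of such a merge decision (the nominal acceleration interval, clearance value, and kinematic simplifications are assumptions; the disclosure does not specify a particular strategy):

        def plan_merge(gap_m: float, deck_vehicle_speed_mps: float, own_accel_mps2: float,
                       min_clearance_m: float = 1.5, merge_time_s: float = 2.0) -> str:
            """Merge ahead of the approaching deck vehicle only if the gap that remains
            after an assumed nominal acceleration interval stays above a minimum
            clearance; otherwise yield and merge behind it."""
            distance_closed_m = deck_vehicle_speed_mps * merge_time_s - 0.5 * own_accel_mps2 * merge_time_s ** 2
            return "merge_ahead" if gap_m - distance_closed_m > min_clearance_m else "merge_behind"

        print(plan_merge(gap_m=8.0, deck_vehicle_speed_mps=2.0, own_accel_mps2=1.0))  # -> merge_ahead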
  • the one or more out of plane (e.g., upward or downward facing) localization cameras 477A, 477B are disposed on the frame 200 of the autonomous transport vehicle 110 so as to sense/detect location fiducials (e.g., location marks 971, lines 900, etc.) disposed on a ceiling 991 of the storage and retrieval system or on the rolling surface 284 of the storage and retrieval system.
  • the location fiducials have known locations within the storage and retrieval system and may provide unique identification marks/patterns that are recognized by the vision system controller 122VC (e.g., processing data obtained from the localization cameras 477A, 477B).
  • the vision system controller 122VC compares the detected location fiducial to known location fiducials (e.g., stored in a memory of or accessible to the vision system controller 122VC) to determine a location of the autonomous transport vehicle 110 within the storage structure 130.
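  • A minimal sketch of the fiducial comparison described above (the map structure and function name are assumptions):

        def locate_from_fiducial(detected_mark_id, known_fiducials):
            """Look up the facility position of a detected location fiducial in a map of
            known fiducials (assumed to be stored in, or accessible to, the vision system
            controller 122VC); returns None if the mark is not recognized."""
            return known_fiducials.get(detected_mark_id)

        print(locate_from_fiducial("aisle3_mark7", {"aisle3_mark7": (12.4, 3.1, 0.0)}))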
  • the cameras of the supplemental navigation sensor system 288 may be calibrated in any suitable manner (such as by, e.g., an intrinsic and extrinsic camera calibration) to effect sensing of case units CU, storage structure (e.g., shelves, columns, etc.), and other structural features of the storage and retrieval system.
  • known objects such as case units CU1, CU2, CU3 (or storage system structure) (e.g., having known physical characteristics such as shape, size, etc.) may be placed within the field of view of a camera (or the vehicle 110 may be positioned so that the known objects are within the field of view of the camera) of the supplemental navigation sensor system 288.
  • These known objects may be imaged by the camera from several angles/view points to calibrate each camera so that the vision system controller 122VC is configured to detect the known objects based on sensor signals from the calibrated camera.
  • Figs. 5A-5C are exemplary images captured from one of the case unit monitoring cameras 410A, 410B from, for exemplary purposes, three different view points.
  • the physical characteristics/parameters (e.g., shape, length, width, height, etc.) of the case units CU1, CU2, CU3 are known by the vision system controller 122VC (e.g., the physical characteristics of the different case units CU1, CU2, CU3 are stored in a memory of or accessible to the vision system controller 122VC).
  • the vision system controller 122VC is provided with intrinsic and extrinsic camera and case unit parameters that effect calibration of the case unit monitoring camera 410A, 410B. For example, from the images the vision system controller 122VC registers (e.g., stores in memory) a perspective of the case units CU1, CU2, CU3 relative to the case unit monitoring camera 410A, 410B.
  • the vision system controller 122VC estimates the pose of the case units CU1, CU2, CU3 relative to the case unit monitoring camera 410A, 410B and estimates the pose of the case units CU1, CU2, CU3 relative to each other.
  • the pose estimates PE of the respective case units CU1, CU2, CU3 are illustrated in Figs. 5A-C as being overlaid on the respective case units CU1, CU2, CU3.
  • the vehicle 110 is moved so that any suitable number of view points of the case units CU1, CU2, CU3 are obtained/imaged by the case unit monitoring camera 410A, 410B to effect a convergence of the case unit characteristics/parameters (e.g., estimated by the vision system controller 122VC) for each of the known case units CU1, CU2, CU3.
  • the case unit monitoring camera 410A, 410B is calibrated. The calibration process is repeated for the other case unit monitoring camera 410A, 410B.
  • the vision system controller 122VC is configured with three-dimensional rays for each pixel in each of the case unit monitoring cameras 410A, 410B as well as an estimate of the three-dimensional baseline line segment separating the cameras and the relative pose of the case unit monitoring cameras 410A, 410B relative to each other.
  • the vision system controller 122VC is configured to employ the three-dimensional rays for each pixel in each of the case unit monitoring cameras 410A, 410B, the estimate of the three-dimensional baseline line segment separating the cameras, and the relative pose of the case unit monitoring cameras 410A, 410B relative to each other so that the case unit monitoring cameras 410A, 410B form a passive stereo vision sensor such as where there are common features visible within the fields of view 410AF, 410BF of the case unit monitoring cameras 410A, 410B.
  • the calibration of the case unit monitoring cameras 410A, 410B was described with respect to case units CUI, CU2, CU3 but may be performed with respect to any suitable structure (e.g., permanent or transient) of the storage and retrieval system 100 in a substantially similar manner.
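  • As an illustration only (OpenCV is used here purely as an example library; the disclosure does not name one, and the parameter names are assumptions), an intrinsic/extrinsic calibration from several views of known objects might be sketched as:

        import cv2
        import numpy as np

        def calibrate_from_known_objects(object_points_per_view, image_points_per_view, image_size):
            """object_points_per_view: list of (N, 3) float32 arrays of known 3D feature
            coordinates on the known case units/structure; image_points_per_view: list of
            (N, 2) float32 arrays of the corresponding detected pixel locations from each
            viewpoint; image_size: (width, height) in pixels."""
            rms, camera_matrix, dist_coeffs, rvecs, tvecs = cv2.calibrateCamera(
                object_points_per_view, image_points_per_view, image_size, None, None)
            return rms, camera_matrix, dist_coeffs, rvecs, tvecs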
  • vehicle localization (e.g., positioning of the vehicle at a predetermined location along a picking aisle 130A or along the transfer deck 130B relative to a pick/place location)
  • the controller 122 is configured to what may be referred to as "grossly" locate the vehicle 110 relative to a pick/place location by employing one or more sensors of the physical characteristic sensor system 270.
  • the controller 122 is configured to employ the supplemental (e.g., pixel level) position information obtained from the vision system controller 122VC of the supplemental navigation sensor system 288 to what may be referred to as "fine tune" the vehicle pose and location relative to the pick/place location so that positioning of the vehicle 110 and case units CU placed to storage locations 130S by the vehicle 110 may be held to smaller tolerances (i.e., increased position accuracy) compared to positioning of the vehicle 110 or case units CU with the physical characteristic sensor system 270 alone.
  • the pixel level positioning provided by the supplemental navigation sensor system 288 has a higher positioning definition/resolution than the electro-magnetic sensor resolution provided by the physical characteristic sensor system 270.
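  • A minimal sketch of the coarse-then-fine positioning described above (the clamp value and function name are assumptions, and only a positional correction is shown):

        def refine_position(coarse_xyz_m, vision_correction_m, max_correction_m=0.05):
            """Apply the pixel-level correction from the supplemental navigation sensor
            system 288 to the coarse position from the physical characteristic sensor
            system 270, clamping the correction to an assumed sanity limit."""
            clamp = lambda v: max(-max_correction_m, min(max_correction_m, v))
            return tuple(p + clamp(c) for p, c in zip(coarse_xyz_m, vision_correction_m))

        print(refine_position((12.400, 3.150, 0.000), (0.012, -0.004, 0.001)))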
  • lighting sources may be provided on the vehicle 110 to illuminate the case units (or other structure) to effect the calibration of the cameras in the manner noted above.
  • the lighting may be a diffuse lighting or the lighting may have known pattern(s) that are projected on the surface(s) of the case units (or other structure) so that the case unit (or other structure) parameters may be extracted from the images and convergence of the case unit (or other structure) parameters is obtained by the vision system controller 122VC.
  • Suitable markers may also be placed on the case units/structure to facilitate feature extraction from the images obtained by the case unit monitoring cameras 410A, 410B and effect calibration of the case unit monitoring cameras 410A, 410B.
  • Calibration of the other cameras (e.g., the forward and rearward navigation cameras 420A, 420B, 430A, 430B, the traffic monitoring camera(s) 460A, 460B, and the out of plane localization camera(s) 477A, 477B, etc.) of the supplemental navigation sensor system 288 may be effected in a manner similar to that described above.
  • the vision system controller 122VC of the autonomous transport vehicle 110 is configured to dynamically select and access information from different sensors (or groups of sensors) from the supplemental navigation sensor system 288 depending on vehicle 110 operation.
  • Fig. 11 is an illustration showing non-exhaustive sensor groupings 1111-1114 and associated non-exhaustive vehicle operations in which the sensor groups may be accessed by the vision system controller 122VC to effect that vehicle operation.
  • Exemplary sensor group 1111 includes the rearward navigation cameras 430A, 430B.
  • Exemplary sensor group 1112 includes the forward navigation cameras 420A, 420B.
  • Exemplary sensor group 1113 includes the out of plane cameras 477A, 477B.
  • Exemplary sensor group 1114 includes the case unit monitoring cameras 410A, 410B.
  • sensor groups 1111, 1113 may be employed by the vision system controller 122VC (and controller 122) for vehicle operations where the rear end 200E2 of the vehicle 110 leads a direction of vehicle travel (e.g., backward travel on the transfer deck 130B).
  • the sensor groups 1112, 1113 may be employed by the vision system controller 122VC (and controller 122) for vehicle operations where the front end 200E1 of the vehicle 110 leads a direction of vehicle travel (e.g., forward travel on the transfer deck 130B).
  • the sensor groups 1112, 1114 may be employed by the vision system controller 122VC (and controller 122) for vehicle operations where the front end 200E1 of the vehicle 110 leads a direction of vehicle travel (e.g., forward travel along a picking aisle 130A).
  • the sensor groups 1111, 1114 may be employed by the vision system controller 122VC (and controller 122) for vehicle operations where the rear end 200E2 of the vehicle 110 leads a direction of vehicle travel (e.g., backward travel along a picking aisle 130A).
  • the sensor group 1114 may be employed by the vision system controller 122VC (and controller 122) for vehicle operations where the transfer arm 210A loads or unloads a case unit CU to or from the payload bed 210B (e.g., pick place operations).
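  • A minimal sketch of the dynamic sensor-group selection described above (the operation keys and group names are assumptions; the group numbers follow the Fig. 11 groupings listed above):

        SENSOR_GROUPS = {
            "deck_travel_rear_leading":   ["rearward_navigation_cameras", "out_of_plane_cameras"],         # 1111, 1113
            "deck_travel_front_leading":  ["forward_navigation_cameras", "out_of_plane_cameras"],          # 1112, 1113
            "aisle_travel_front_leading": ["forward_navigation_cameras", "case_unit_monitoring_cameras"],  # 1112, 1114
            "aisle_travel_rear_leading":  ["rearward_navigation_cameras", "case_unit_monitoring_cameras"], # 1111, 1114
            "pick_place":                 ["case_unit_monitoring_cameras"],                                # 1114
        }

        def select_sensors(operation: str):
            """Return the camera groups the vision system controller would access for the
            given vehicle operation."""
            return SENSOR_GROUPS[operation]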
  • the autonomous transport vehicle 110 includes the supplemental hazard sensor system 290.
  • the supplemental hazard sensor system 290 is connected to the frame 200 of the autonomous transport vehicle 110 to provide the bot operational control of the autonomous transport vehicle 110 in collaboration with an operator.
  • the supplemental hazard sensor system 290 provides data (images)
  • the vision system data is registered by the vision system controller 122VC that a) determines information characteristics (in turn provided to the controller 122), or b) passes the information to the controller 122 without it being characterized (e.g., as an object within predetermined criteria), with characterization then done by the controller 122.
  • the controller 122 determines whether to switch to the collaborative state. After switching, the collaborative operation is effected by a user accessing the supplemental hazard sensor system 290 via the vision system controller 122VC and/or the controller 122.
  • the supplemental hazard sensor system 290 may be considered as providing a collaborative mode of operation of the autonomous transport vehicle 110.
  • the supplemental hazard sensor system 290 supplements the autonomous navigation/operation sensor system 270 and/or the supplemental sensor system 288, with the supplemental hazard sensor system 290 configured to effect collaborative discriminating and mitigation of objects/hazards, e.g., encroaching upon the travel/rolling surface 284.
  • the supplemental hazard sensor system 290 forms, at least in part, the vision system 400 and includes at least one camera 292.
  • the term "camera” described herein is a still imaging or video imaging device that includes one or more of a two- dimensional camera, a two dimensional camera with RGB (red, green, blue) pixels, a three-dimensional camera with XYZ+A definition (where XYZ is the three-dimensional reference frame of the camera and A is a radar return strength or time-of-flight stamp), and an RGB/XYZ camera which includes both RGB and three-dimensional coordinate system information, non-limiting examples of which are provided herein.
  • the at least one camera 292 of the vision system 400 is disposed to capture image data informing objects and/or spatial features 299 (having intrinsic physical characteristics) within at least a portion of the facility 100 viewed by the at least one camera 292 with the autonomous transport vehicle 110 in the different positions in the facility 100 while executing autonomous navigation and transfer tasks.
  • the at least one camera 292 is illustrated in Fig. 4D, for exemplary purposes only, as being separate and distinct from the cameras illustrated in Fig. 4A; however, the at least one camera 292 may be part of the system illustrated in Fig. 4A (e.g., camera 292 on end 200E1 of the vehicle 110 may be camera 477A in Fig. 4A; camera 292 on end 200E2 of the vehicle 110 may be camera 477B in Fig. 4A; and cameras 292 facing laterally in direction LAT1 in Fig. 4D may be cameras 460A, 460B in Fig. 4A).
  • the vision system 400 includes the at least one camera 292. It is noted that although the aspects of the present disclosure are described with respect to a forward facing camera (i.e., a camera that faces in the direction of travel with the end 200E1 of the autonomous transport vehicle 110 leading), the camera(s) may be positioned to face in any direction (rearward, sideways, up, down, etc.) for up to 360° monitoring about the autonomous transport vehicle 110.
  • the at least one camera 292 may be placed on the longitudinal centerline LAXCL, on either side of the longitudinal centerline LAXCL, more than one camera 292 may be placed on opposite sides of the longitudinal centerline LAXCL of the autonomous transport vehicle 110 so that the field of view 292F provides the autonomous transport vehicle 110 with stereo vision (e.g., such as cameras 420A, 420B), or any other suitable configuration.
  • the at least one camera 292 is any suitable camera configured to provide object or spatial feature 299 detection.
  • the at least one camera 292 is any suitable high resolution or low resolution video camera, 3D imaging system, time-of-flight camera, laser ranging camera, or any other suitable camera configured to provide detection of the object or spatial feature 299 within at least a portion of the facility 100 viewed by the at least one camera 292 with the autonomous transport vehicle 110 in the different positions in the facility 100 while executing autonomous navigation and transfer tasks.
  • the at least one camera 292 provides for imaging and detection (with either end 200E1, 200E2 of the autonomous transport vehicle 110 leading a direction of travel or trailing the direction of travel).
  • the object or spatial feature 299 detection may be compared to reference floor maps and structure information (e.g., stored in a memory of or accessible by) of the vision system controller 122VC.
  • the at least one camera 292 may also send signals to the controller 122 (inclusive of or through the vision system controller 122VC) so that as the autonomous transport vehicle 110 approaches the object or spatial feature 299, the autonomous transport vehicle 110 initiates an autonomous stop (i.e., in an autonomous operation state) or may enter a collaborative operation state so as to be stopped by an operator or maneuvered (e.g., on the undeterministic rolling surface of the transfer deck 130B or within the picking aisle 130A (which may have a deterministic or undeterministic rolling surface)) by the operator in order to identify the object or spatial feature 299 (e.g., another malfunctioning autonomous transport vehicle, dropped case unit, debris, spill, or other transient object within the storage and retrieval system 100).
  • the camera(s) 292 of the supplemental hazard sensor system 290 may be calibrated in any suitable manner (such as by, e.g., an intrinsic and extrinsic camera calibration) to effect sensing/detection of the objects or spatial features 299 in the storage and retrieval system 100.
  • known objects such as case units CU1, CU2, CU3 (or storage system structure) (e.g., having known physical characteristics such as shape, size, etc.) may be placed within the field of view 292F of a camera 292 (or the autonomous transport vehicle 110 may be positioned so that the known objects are within the field of view 292F of the camera 292) of the supplemental hazard sensor system 290.
  • These known objects may be imaged by the camera 292 from several angles/view points to calibrate each camera so that the vision system controller 122VC is configured to determine when an "unknown" (i.e., unidentifiable) object based on sensor signals from the calibrated camera is within the field of view 292F.
  • Figs. 5B and 5C are exemplary images captured from the camera(s) 292 from, for exemplary purposes, two different view points.
  • the physical characteristics/parameters (e.g., shape, length, width, height, etc.) of the case units CU1, CU2, CU3 are known by the vision system controller 122VC (e.g., the physical characteristics of the different case units CU1, CU2, CU3 are stored in a memory of or accessible to the vision system controller 122VC).
  • the vision system controller 122VC is provided with intrinsic and extrinsic camera and case unit parameters that effect calibration of the camera(s) 292.
  • the autonomous transport vehicle 110 is moved so that any suitable number of view points of the case units CU1, CU2, CU3 are obtained/imaged by the camera(s) 292 to effect a convergence of the case unit characteristics/parameters (e.g., estimated by the vision system controller 122VC) for each of the known case units CU1, CU2, CU3.
  • the camera(s) 292 is calibrated. With the camera(s) 292 calibrated the vision system controller 122VC is configured with three-dimensional rays for each pixel in each of the camera(s) 292.
  • the calibration of the camera(s) 292 was described with respect to case units CU1, CU2, CU3 but may be performed with respect to any suitable structure (e.g., permanent or transient) of the storage and retrieval system 100 in a substantially similar manner.
  • the autonomous transport vehicle 110 may opportunistically detect (incidental or peripheral to predetermined autonomous tasks, e.g., autonomous picking/placing payload at storage, travel to transfer station and/or charge station for autonomous payload pick/place/transfer at the transfer station, and/or autonomous charging at the charge station) other objects within the facility 100 (e.g., other bots, dropped case units, spills, debris, etc.).
  • the vision system controller 122VC is configured to employ the supplemental navigation sensor system 288 and/or the supplemental hazard sensor system 290 (i.e., imaging information obtained from the cameras of one or more of the supplemental sensor systems) to determine whether the objects are "unknown” (i.e., whether the objects or spatial features 299 are not expected to be within an area or space along the autonomous travel path of the autonomous transport vehicle 110).
  • the vision system 400 of the supplemental navigation sensor system 288 and/or supplemental hazard sensor system 290 configures the autonomous transport vehicle 110 with a virtual model 400VM of an operating environment 401 in which the autonomous transport vehicle 110 operates.
  • the vision system controller 122VC is programmed with a reference representation 400VMR of predetermined features (e.g., the fixed/permanent structure of and/or transient objects in the storage structure 130 of the storage and retrieval system described herein and included in the virtual model 400VM), where the reference representation 400VMR of the predetermined features defines the form or location of at least part of the facility or storage structure 130 traversed by the autonomous transport vehicle 110.
  • the virtual model 400VM (and the reference representation 400VMR of predetermined features thereof) of the operating environment 401 is stored in any suitable memory of the autonomous transport vehicle (such as a memory of the vision system controller 122VC) or in a memory accessible to the vision system controller 122VC.
  • the virtual model 400VM provides the autonomous transport vehicle 110 with the dimensions, locations, etc. of at least the fixed (e.g., permanent) structural components in the operating environment 401.
  • the operating environment 401 and the virtual model 400VM thereof includes at least fixed/permanent structure (e.g., transfer deck 130B, picking aisles 130A, storage spaces 130S, case unit transfer areas, case unit buffer locations, vehicle charging locations, support columns, etc.) of one or more storage structure levels 130L; in one or more aspects, the operating environment 401 and the virtual model 400VM include the fixed structure of the one or more storage structure levels 130L and at least some transitory structure (e.g., case units CU stored or otherwise located at case unit holding locations of the storage and retrieval system 100, etc.) of and located within the storage level 130L on which the autonomous transport vehicle 110 operates; in one or more other aspects the operating environment 401 and the virtual model 400VM includes at least the fixed structure and at least some transitory structure (e.g., case units) of one or more levels 130L of the storage structure 130 on which the autonomous transport vehicle 110 could operate; and in still other aspects, the operating environment 401 and virtual model 400VM includes the entirety of the storage structure and at least some of the transitory structure therein.
  • the autonomous transport vehicle 110 may have stored thereon (or in a memory accessible thereby) a portion of the virtual model 400VM that corresponds with a portion of the operating environment in which the autonomous transport vehicle 110 operates.
  • the autonomous transport vehicle 110 has stored thereon (or in a memory accessible thereby) only a portion of the virtual model 400VM corresponding to a storage structure level 130L on which the autonomous transport vehicle is disposed.
  • the virtual model 400VM of the operating environment 401 may be dynamically updated in any suitable manner to facilitate autonomous transport vehicle 110 operations in the storage structure 130.
  • the vision system controller 122VC is updated (e.g., such as by the controller 122 and/or wirelessly by control server 120) to include a portion of the virtual model 400VM corresponding to the other different storage structure level 130L.
  • the virtual model 400VM may be dynamically updated as case units are added and removed from the storage structure 130 so as to provide a dynamic virtual model case unit map that indicates the predetermined (expected) location of the case units CU that are to be transferred by the autonomous transport vehicles 110.
  • the predetermined (expected) locations of the case units within the storage structure may not be included in the virtual model 400VM; however, the predetermined (expected) locations, sizes, SKUs, etc. of one or more case units to be transferred by an autonomous transport vehicle 110 are communicated from, for example, controller 120 to the autonomous transport vehicle 110, where the vision system 400 (and the vision system controller 122VC) effect verification of case unit(s) at the predetermined location as described herein (e.g., the vision system 400 compares what it expects to "see" with what it actually "sees" to verify the correct case unit(s) are being transferred) and/or for detection/identification of another malfunctioning autonomous transport vehicle, dropped case unit, debris, spill, or other transient object within the storage and retrieval system 100.
  • the vision system controller 122VC is configured to register image data captured by the supplemental navigation sensor system 288 and generate, from the captured image data, at least one image (e.g., still image and/or video image) of one or more features of the predetermined features (e.g., the fixed/permanent structure of and/or transient objects in the storage structure 130 of the storage and retrieval system described herein).
  • the at least one image (see, e.g., Figs. 5A-5C, 9A, 10A, and 10B for exemplary images) is formatted as a virtual representation VR of the one or more (imaged) predetermined features so as to provide a comparison (in at least one but up to the six degrees of freedom X, Y, Z, Rx, Ry, Rz) to one or more corresponding references (e.g., a corresponding feature of the virtual model 400VM that serves as a reference for identifying the form and/or location of the imaged predetermined feature) of the predetermined features of the reference representation 400VMR.
  • Fig. 13 is an exemplary flow diagram of the comparison where at least one model 400VM of the storage and retrieval system is stored within or accessible to the vision system controller 122VC.
  • a storage facility information model, a storage structure/array information model, and a case input station model are provided, but in other aspects any suitable models and number of models may be employed to provide the vision system controller 122VC with virtual information pertaining to the operating environment of the autonomous transport vehicles 110.
  • the different models may be combined to provide the vision system controller 122VC with a complete virtual operating environment in which the autonomous transport vehicle 110 operates.
  • the sensors of the vision system 400 (as described herein) also provide sensor data to the vision system controller 122VC.
  • the sensor data, which embodies the virtual representation VR images, is processed with any suitable image processing methods to detect a region of interest and/or edge features of objects in the image.
  • the vision system controller 122VC predicts, within the model 400VM, a field of view of the sensor(s) providing the image data and determines, within the predicted field of view, regions of interest and edges of objects.
  • the regions of interest and edges of the virtual model 400VM are compared with the regions of interest and edges of the virtual representation VR for pose and location determination of one or more of the autonomous transport vehicle 110 and case units (payloads) as described herein.
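By way of illustration only (the patent does not give an implementation, and all function and variable names below are hypothetical), the comparison of model-derived features against image-derived features can be sketched in Python as a least-squares planar alignment whose output is the pose variance discussed above:

```python
import numpy as np

def pose_variance_2d(model_pts, image_pts):
    """Estimate a planar pose variance (dx, dy, dtheta) that best aligns
    model-predicted feature points with image-derived feature points.

    model_pts, image_pts: (N, 2) arrays of corresponding feature locations,
    e.g., edge/corner centroids expressed in the vehicle reference frame.
    Returns the least-squares rigid transform mapping model -> image.
    """
    model_pts = np.asarray(model_pts, dtype=float)
    image_pts = np.asarray(image_pts, dtype=float)

    # Center both point sets about their centroids.
    mc, ic = model_pts.mean(axis=0), image_pts.mean(axis=0)
    m0, i0 = model_pts - mc, image_pts - ic

    # Closed-form planar rotation (Procrustes / Kabsch in 2-D).
    h = m0.T @ i0
    u, _, vt = np.linalg.svd(h)
    d = np.sign(np.linalg.det(vt.T @ u.T))
    r = vt.T @ np.diag([1.0, d]) @ u.T

    dtheta = np.arctan2(r[1, 0], r[0, 0])
    dx, dy = ic - r @ mc
    return dx, dy, dtheta

# Example: observed features shifted ~2 cm in x and rotated ~1 degree.
model = [[0.0, 0.0], [1.0, 0.0], [1.0, 0.5], [0.0, 0.5]]
theta = np.deg2rad(1.0)
rot = np.array([[np.cos(theta), -np.sin(theta)], [np.sin(theta), np.cos(theta)]])
observed = (np.array(model) @ rot.T) + [0.02, 0.0]
print(pose_variance_2d(model, observed))  # ~ (0.02, 0.0, 0.0175 rad)
```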
  • the vision system controller 122VC is configured (as described herein with at least part of the virtual model 400VM and with suitable imaging processing non-transitory computer program code) so that the virtual representation VR, of the imaged one or more features (e.g., in Fig. 9A the imaged features are the storage and retrieval system rack/column structure, in Fig. 10A the imaged features are the case units CU1-CU3, and in Fig. 10B the imaged features are the case units, storage rack structure, and a portion of the payload bed 210B) of the predetermined features, is effected resident on the autonomous transport vehicle 110, and comparison between the virtual representation VR of the one or more imaged predetermined features and the one or more corresponding reference predetermined features RPF (e.g., presented in a reference presentation RPP of the virtual model 400VM) is effected resident on the autonomous transport vehicle 110 (see Figs. 9A and 10A).
  • the autonomous transport vehicle 110 pose determination and navigation is autonomous and decoupled from and independent of each system controller (e.g., control server 120 or other suitable controller of the storage and retrieval system) that sends commands to the autonomous transport vehicle 110.
  • the controller 122 is configured to employ the supplemental (e.g., pixel level) position information obtained from the vision system controller 122VC of the supplemental navigation sensor system 288 to what may be referred to as "fine tune" the vehicle pose and location relative to the pick/place location so that positioning of the vehicle 110 and case units CU placed to storage locations 130S by the vehicle 110 may be held to smaller tolerances (i.e., increased position accuracy) compared to positioning of the vehicle 110 or case units CU with the physical characteristic sensor system 270 alone.
  • the fine tuning of the autonomous transport vehicle 110 pose and location is effected by the vision system controller 122VC, where the vision system controller 122VC is configured to confirm autonomous transport vehicle 110 pose and location information registered by the vision system controller 122VC from the physical characteristic sensor system 270 based on the comparison between the virtual representation VR and the reference representation RPP.
  • the comparison between the virtual representation VR and the reference representation RPP by the vision system controller 122VC builds confidence in the data generated by the physical characteristic sensor system 270 by verifying the accuracy of the data with the information obtained from the supplemental navigation sensor system 288.
  • the vision system controller 122VC is configured to identify a variance in the autonomous guided vehicle pose and location based on the comparison between the virtual representation VR and the reference representation RPP, and update (e.g., modify the pose and/or location information from the physical characteristic sensor system 270) or complete (if the pose and/or location information from the physical characteristic sensor system 270 is missing) autonomous transport vehicle 110 pose or location information from the physical characteristic sensor system 270 (e.g., to effect final positioning of the autonomous transport vehicle 110 at a predetermined commanded position) based on the variance.
  • the vision system controller 122VC is configured to determine a pose error in the information from the physical characteristic sensor system 270 and fidelity of the autonomous guided vehicle 110 pose and location information from the physical characteristic sensor system 270 based on at least one of the identified variance and an image analysis of at least one image (from the vision system 400 of the supplemental navigation sensor system 288), and assign a confidence value according to at least one of the pose error and the fidelity. Where the confidence value is below a predetermined threshold, the vision system controller 122VC is configured to switch autonomous guided vehicle navigation based on pose and location information generated from the virtual representation VR in place of pose and location information from the physical characteristic sensor system 270.
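A minimal sketch, under assumed names and an assumed (illustrative) confidence weighting, of how a confidence value could gate the switch from physical-characteristic-sensor pose information to virtual-representation pose information:

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class Pose:
    x: float
    y: float
    yaw: float

def select_navigation_pose(sensor_pose: Pose, vision_pose: Pose,
                           pose_error: float, fidelity: float,
                           confidence_threshold: float = 0.8) -> Tuple[Pose, str]:
    """Assign a confidence value to the physical-characteristic-sensor pose and
    fall back to the vision-derived pose when confidence is too low.

    pose_error: magnitude of the identified variance (e.g., metres).
    fidelity:   0..1 score from image analysis of the sensor data quality.
    """
    # Illustrative weighting only: larger variance and lower fidelity both
    # reduce confidence in the physical characteristic sensor system data.
    confidence = fidelity / (1.0 + pose_error)

    if confidence >= confidence_threshold:
        return sensor_pose, "physical_characteristic_sensor_system"
    # Below threshold: navigate on pose generated from the virtual representation.
    return vision_pose, "virtual_representation"

pose, source = select_navigation_pose(
    Pose(10.02, 4.98, 0.01), Pose(10.00, 5.00, 0.00),
    pose_error=0.35, fidelity=0.9)
print(source)  # -> "virtual_representation" (0.9 / 1.35 = ~0.67 < 0.8)
```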
  • the switching from the physical characteristic sensor system pose and location information to the virtual representation VR pose and location information may be effected by the vision system controller 122VC (or controller 122), by de-selecting the pose and location information, generated from the physical characteristic sensor system 270, and selecting/entering pose and location information from the virtual representation VR in a kinematic/dynamic algorithm (such as described in United States patent application number 16/144,668 titled “Storage and Retrieval System” and filed on September 27, 2018, the disclosure of which is incorporated herein by reference in its entirety).
  • where the vision system controller 122VC effects the above-noted switching, the vision system controller 122VC is configured to continue autonomous transport vehicle 110 navigation to any suitable destination (such as a payload place destination, charging destination, etc.); while in other aspects the vision system controller 122VC is configured to select an autonomous transport vehicle 110 safe path and trajectory bringing the autonomous transport vehicle 110 from a position at switching to a safe location 157 (the safe location being a dedicated induction/extraction area of a transfer deck, a lift transfer area, or other area of the transfer deck 130B or picking aisle 130A at which the autonomous transport vehicle 110 may be accessed by an operator without obstructing operation of other autonomous transport vehicles 110 operating in the storage and retrieval system 100) for shut down of the autonomous transport vehicle 110; while in still other aspects, the vision system controller 122VC is configured to initiate communication to an operator of the storage and retrieval system 100 identifying autonomous transport vehicle 110 kinematic data and identify a destination of the autonomous transport vehicle 110 for operator selection (e.g., presented on user interface UI).
  • the operator may select or switch control of the autonomous guided vehicle (e.g., through the user interface UI) from automatic operation to either quasi automatic operation (e.g., the autonomous transport vehicle 110 operates autonomously with limited manual input) or manual operation (e.g., the operator remotely controls operation of the autonomous transport vehicle 110 through the user interface UI).
  • the user interface UI may include a capacitive touch pad/screen, joystick, haptic screen, or other input device that conveys kinematic directional commands (e.g., turn, acceleration, deceleration, etc.) and/or pick place commands from the user interface UI to the autonomous guided vehicle 110 to effect operator control inputs in the quasi automatic operational and manual operational modes of the autonomous transport vehicle 110.
  • the vision system controller 122VC may be configured to apply the variance as an offset that is automatically applied to the data from the physical characteristic sensor system 270 to grossly position the autonomous transport vehicle 110 based on the data from the physical characteristic sensor system 270 as modified by the offset, where comparison between the virtual representation VR and the reference representation RPP verifies the validity of the offset and adjusts the offset (and autonomous transport vehicle 110 pose and location) according to any variance. Where the variance reaches a predetermined threshold, the vision system controller 122VC may alert a user of the storage and retrieval system 100 that the autonomous guided vehicle 110 may be due for servicing.
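The offset behavior described above could look like the following sketch; the smoothing policy, threshold value, and names are assumptions rather than the patent's implementation:

```python
class OffsetCorrector:
    """Maintains a running offset applied to physical-characteristic-sensor
    positions; flags the vehicle for service if the variance grows too large.
    (Illustrative only; names and the blending policy are assumptions.)"""

    def __init__(self, service_threshold: float = 0.05, smoothing: float = 0.2):
        self.offset = 0.0                      # metres, applied along the travel axis
        self.service_threshold = service_threshold
        self.smoothing = smoothing

    def corrected_position(self, sensor_position: float) -> float:
        # Gross positioning: sensor reading adjusted by the current offset.
        return sensor_position + self.offset

    def update(self, variance: float) -> bool:
        """variance = vision-derived position minus offset-corrected position.
        Returns True when the vehicle should be flagged for servicing."""
        # Blend the newly observed variance into the stored offset.
        self.offset += self.smoothing * variance
        return abs(variance) >= self.service_threshold

corrector = OffsetCorrector()
needs_service = corrector.update(variance=0.08)   # 8 cm disagreement observed
print(corrector.offset, needs_service)            # 0.016 True
```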
  • the vision system controller 122VC is configured to effect a similar pose and location error identification for the case units CU, such as held in storage locations 130S or other holding areas of the storage and retrieval system.
  • the vision system controller 122VC is configured to confirm payload pose and location information registered by the vision system controller 122VC from the physical characteristic sensor system 270 based on the comparison between the virtual representation VR and the reference representation RPP of the virtual model 400VM.
  • the vision system controller 122VC is configured to identify a variance in the payload (case unit) pose and location based on the comparison between the virtual representation VR and the reference representation RPP, and update (e.g., modify the pose and/or location information from the physical characteristic sensor system 270) or complete (if the pose and/or location information from the physical characteristic sensor system 270 is missing) payload pose or location information from the physical characteristic sensor system based on the variance.
  • the vision system controller 122VC is configured to determine a pose error in the information from the physical characteristic sensor system 270 and fidelity of the payload pose and location information from the physical characteristic sensor system 270 based on at least one of the identified variance and an image analysis of the at least one image from the vision system 400 of the supplemental navigation sensor system 288.
  • the vision system controller 122VC assigns a confidence value according to at least one of the payload pose error and the fidelity. With the confidence value below a predetermined threshold, the vision system controller 122VC switches autonomous transport vehicle 110 payload handling based on pose and location information generated from the virtual representation VR in place of pose and location information from the physical characteristic sensor system 270.
  • the vision system controller 122VC is configured to, in some aspects, continue autonomous guided vehicle handling to a predetermined destination (such as a payload placement location or an area of the storage and retrieval system where the payload may be inspected by an operator); in other aspects the vision system controller 122VC is configured to initiate communication to an operator identifying payload data along with an operator selection of autonomous guided vehicle control from automatic payload handling operation to quasi automatic payload handling operation (where the operator provides limited input to transfer arm 210A and traverse movements of the autonomous guided vehicle) or manual payload handling operation (where the operator manually controls movement of the transfer arm 210A and traverse movements of the autonomous guided vehicle) via the user interface device UI.
  • the vision system controller 122VC is configured to transmit, via a wireless communication system (such as network 180) communicably coupling the vision system controller 122VC and an operator interface UI, a simulation image combining the virtual representation VR of the one or more imaged predetermined features and one or more corresponding reference predetermined features of a reference presentation RPP presenting the operator with an augmented reality image in real time (see Fig. 10A, where reference predetermined features include the shelves 555 and the virtual representations include those of the case units CU1-CU3).
  • the vision system 400 of the supplemental navigation sensor system 288 provides a "dashboard camera” (or dash-camera) that transmits video and/or still images from the vehicle 110 to an operator to allow remote operation or monitoring of the vehicle 110. It is noted that the vision system 400 may also operate as a data recorder that periodically sends still images obtained from the vision system cameras to a memory of the user interface UI, where the still images are stored/cached for operator review (e.g., in addition to providing a real-time video stream the vision system 400 provides for non-real time review of the still images).
  • the still images may be captured and transmitted to the user interface for storage at any suitable interval such as, for example, every second, every ten seconds, every thirty seconds, every minute, or at any other suitable time intervals (exclusive of real time video stream recording), where the periodicity of the still image capture/recording maintains suitable communication bandwidth between, for example, the control server 120 and the bots 110 (noting that in accordance with aspects of the disclosed embodiment, the number of bots 110 operating/transferring case units in the storage and retrieval system 100 may be on the order of hundreds to thousands of bots 110).
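A sketch of the periodic still-image capture described above (the interval values, callables, and names are illustrative assumptions):

```python
import time

def run_still_image_recorder(capture_frame, store_image, interval_s: float = 10.0,
                             run_for_s: float = 60.0) -> None:
    """Periodically capture a still frame and hand it to storage, independent of
    any real-time video stream, so that per-vehicle bandwidth stays bounded.

    capture_frame: callable returning the current camera frame (bytes/array).
    store_image:   callable accepting (timestamp, frame), e.g., a cache uploader.
    """
    deadline = time.monotonic() + run_for_s
    next_shot = time.monotonic()
    while time.monotonic() < deadline:
        now = time.monotonic()
        if now >= next_shot:
            store_image(time.time(), capture_frame())
            next_shot = now + interval_s       # e.g., 1 s, 10 s, 30 s, 60 s
        time.sleep(0.05)                       # avoid busy-waiting

# Example wiring with stub callables:
frames = []
run_still_image_recorder(capture_frame=lambda: b"<jpeg>",
                         store_image=lambda ts, f: frames.append((ts, f)),
                         interval_s=1.0, run_for_s=3.0)
print(len(frames))  # ~3 stills captured over 3 seconds
```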
  • the user interface UI with the record of stored still images provides for an interactive presentation/data interface where a user reviews the still images to determine how or why an event (e.g., such as a case mis-pick, bot breakage, product spill, debris presence on the transfer deck, etc.) occurred and what transpired prior to and/or after the event.
  • the vision system controller 122VC is configured to receive real time operator commands (e.g., from the user interface UI) to the traversing autonomous guided vehicle 110, which commands are responsive to the real time augmented reality image (see Figs. 9A and 10A), and changes in the real time augmented reality image transmitted to the operator by the vision system controller 122VC.
  • the video or still images may be stored (and time stamped) in a memory onboard the vehicle 110 and sent to control server 120 and/or an operator on request; in other aspects the video and/or still images may be broadcast or otherwise transmitted in real time for viewing on a user interface UI (as described herein) accessible to the operator.
  • the vision system controller 122VC is also configured to register image data captured by the supplemental hazard sensor system 290 and generate, from the captured image data, at least one image (e.g., still image and/or video image) of one or more object or spatial feature 299 showing the predetermined physical characteristic.
  • the at least one image may be formatted as a virtual representation VR of the one or more object or spatial feature 299 (see Figs. 4D and 15) so as to provide a comparison (in at least one but up to the six degrees of freedom X, Y, Z, Rx, Ry, Rz) to one or more corresponding reference predetermined features of the reference representation 400VMR.
  • the controller 122VC is configured to verify (via the comparison) the presence of the predetermined physical characteristic of the object or spatial feature 299 based on the comparison between the virtual representation and the reference representation (i.e., compare to determine whether the object is "known" or "unknown").
  • the controller 122VC determines a dimension of the predetermined physical characteristic and commands (e.g., through the controller 122) the autonomous transport vehicle 110 to stop in a predetermined location relative to the object 299 (i.e., a trajectory is determined to autonomously place the bot in a predetermined position relative to the object or spatial feature 299) based on a position of the object or spatial feature 299 determined from the comparison (as may be realized, the commanded stop interrupts the vehicle's previous autonomous commands and automatic routine, in effect diverting the bot from automatic tasking).
  • the controller 122 selectably reconfigures the autonomous transport vehicle 110 from an autonomous state to a collaborative vehicle state so as to finalize discrimination of the object or spatial feature 299 as a hazard and identify a mitigation action of the vehicle with respect to the hazard (i.e., selectably switches the autonomous transport vehicle 110 from an autonomous operation state to a collaborative operation state and identifies whether the vehicle can mitigate the hazard, e.g., remove a disabled vehicle or act as a signal/beacon to warn other vehicles performing autonomous tasks).
  • the autonomous transport vehicle 110 is disposed to receive operator commands for the autonomous transport vehicle 110 to continue effecting vehicle operation for discriminating and mitigation of the object or spatial feature 299.
  • the autonomous transport vehicle 110 may not include the reference map (e.g., virtual model 400VM).
  • the controller 122VC determines a position of the object within a reference frame of the at least one camera 292, which is calibrated and has a predetermined relationship to the autonomous transport vehicle 110. From the object pose in camera reference frame, the controller 122VC determines presence of the predetermined physical characteristic of object 299 (i.e., whether the object 299 is extended across bot path, blocks the bot, or is proximate, within a predetermined distance, to the bot path to be deemed an obstacle or hazard).
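Assuming a calibrated camera-to-vehicle transform and hypothetical names, determining whether a detected object lies close enough to the planned path to be treated as an obstacle might be sketched as:

```python
import numpy as np

def is_hazard_on_path(object_pos_cam, cam_to_vehicle, path_points, clearance=0.5):
    """Decide whether an object detected in the camera frame is close enough to
    the planned path to be treated as an obstacle/hazard.

    object_pos_cam: (x, y, z) of the object in the camera reference frame.
    cam_to_vehicle: 4x4 homogeneous transform from camera frame to vehicle frame
                    (known from calibration of the camera on the vehicle).
    path_points:    (N, 2) upcoming path points in the vehicle frame.
    clearance:      lateral distance (m) within which the object blocks travel.
    """
    p_cam = np.array([*object_pos_cam, 1.0])
    p_veh = (np.asarray(cam_to_vehicle) @ p_cam)[:2]      # project to floor plane
    dists = np.linalg.norm(np.asarray(path_points) - p_veh, axis=1)
    return bool(dists.min() <= clearance)

# Camera mounted 0.3 m ahead of the vehicle origin, axes aligned (assumed):
T = np.eye(4)
T[0, 3] = 0.3
path = [[d, 0.0] for d in np.linspace(0.0, 5.0, 26)]      # straight-ahead path
print(is_hazard_on_path((2.0, 0.1, 0.0), T, path))        # True: ~0.1 m offset
print(is_hazard_on_path((2.0, 1.5, 0.0), T, path))        # False: well clear
```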
  • upon determination of presence of an object and switching from the autonomous state to the collaborative vehicle state, the controller 122VC is configured to initiate a transmission communicating image/video of the presence of the predetermined physical characteristic to an operator (user) interface UI for collaborative operator operation of the autonomous transport vehicle 110, as will be further described below (here the vehicle 110 is configured as an observation platform and pointer for a user in collaborative mode; the vehicle 110 in this mode is also a pointer for other bots executing autonomous operation, which identify the pointer bot (e.g., via control system 120, or beacon) and reroute automatically to avoid the area until further command, and, if avoidance is not available, stop ahead of encountering the object/hazard).
  • the vision system controller 122VC is configured (as described herein with at least part of the virtual model 400VM and with suitable imaging processing non-transitory computer program code) so that the virtual representation VR, of the imaged object or spatial feature 299 is effected resident on the autonomous transport vehicle 110, and comparison between the virtual representation VR of the one or more imaged object or spatial feature 299 and the one or more corresponding reference predetermined features RPF (e.g., presented in a reference presentation RPP of the virtual model 400VM) is effected resident on the autonomous transport vehicle 110 (see Fig. 15).
  • the comparison between the virtual representation VR and the reference representation RPP by the vision system controller 122VC verifies whether the object or spatial feature 299 is "unknown".
  • the vision system controller 122VC is configured to determine a dimension of the object or spatial feature 299 based on image analysis of at least one image (from the vision system 400 of the supplemental hazard sensor system 290). Where the dimensions are unidentifiable, the vision system controller 122VC is configured to switch the autonomous transport vehicle 110 into the collaborative operation state for collaborative discrimination of the object 299 with the operator. The switching from the autonomous to the collaborative state may be effected by the vision system controller 122VC (or controller 122), by selectably reconfiguring the autonomous transport vehicle 110 from an autonomous vehicle to a collaborative vehicle (i.e., selectably switches the autonomous transport vehicle 110 from an autonomous operation state to a collaborative operation state).
  • the controller 122 is configured to continue autonomous transport vehicle 110 navigation to any suitable destination relative to the detected object, applying a trajectory to the autonomous transport vehicle 110 that brings the autonomous transport vehicle 110 to a zero velocity within a predetermined time period where motion of the autonomous transport vehicle 110 along the trajectory is coordinated with "known" and "unknown” objects located relative to the autonomous transport vehicle 110.
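A simple, assumed constant-deceleration profile illustrating a trajectory that brings the vehicle to zero velocity within a predetermined time period (not the patent's trajectory planner; names and values are illustrative):

```python
def stop_trajectory(v0: float, t_stop: float, dt: float = 0.1):
    """Constant-deceleration velocity profile bringing the vehicle from v0 (m/s)
    to zero velocity within t_stop seconds; returns (times, velocities, distance)."""
    a = v0 / t_stop                       # required deceleration magnitude
    times, velocities = [], []
    t = 0.0
    while t <= t_stop + 1e-9:
        times.append(round(t, 3))
        velocities.append(max(v0 - a * t, 0.0))
        t += dt
    stop_distance = 0.5 * v0 * t_stop     # area under the linear velocity ramp
    return times, velocities, stop_distance

_, _, d = stop_trajectory(v0=2.0, t_stop=1.5)
print(d)   # 1.5 m needed to stop; compare against the distance to the detected object
```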
  • the vision system controller 122VC initiates communication to the operator of the storage and retrieval system 100 displaying the object or spatial feature 299 on the user interface UI for the operator to discriminate the object 299 and determine a mitigation action such as maintenance (e.g., clean-up of a spill, removal of a malfunctioning bot, etc.) and a location of the autonomous transport vehicle 110 (e.g., presented on user interface UI).
  • the controller 122 may initiate a signal/beacon to at least another bot(s) so as to alert the other bot(s) of a traffic obstacle and to avoid the obstacle or indicate a detour area (thus, in effect, the supplemental hazard sensor system 290 provides for a hazard pointer/indicator mode of one bot to others on the same level).
  • the signal/beacon is sent via a local communication transmission to a system area bot task manager, managing tasks of nearby bots, or bots within a predetermined distance of the pointer bot.
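An illustrative sketch (names and radius are assumptions) of selecting which nearby bots, within a predetermined distance of the pointer bot and on the same level, receive the hazard/detour beacon:

```python
import math

def bots_to_alert(pointer_bot_pos, other_bots, alert_radius_m=15.0):
    """Select which nearby bots should receive the hazard/detour beacon: those on
    the same level within a predetermined distance of the pointer bot.

    other_bots: mapping of bot_id -> (x, y, level).
    """
    px, py, plevel = pointer_bot_pos
    alerted = []
    for bot_id, (x, y, level) in other_bots.items():
        if level == plevel and math.hypot(x - px, y - py) <= alert_radius_m:
            alerted.append(bot_id)
    return alerted

fleet = {"bot_7": (12.0, 3.0, 2), "bot_9": (40.0, 3.0, 2), "bot_11": (11.0, 2.0, 3)}
print(bots_to_alert((10.0, 2.0, 2), fleet))   # ['bot_7'] -- same level, within 15 m
```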
  • the controller 122 is configured, based on object information from the vision system 400 and vision system controller 122VC, to select an autonomous transport vehicle 110 safe path and trajectory bringing the autonomous transport vehicle 110 from a position at switching to a location 157 where the operator may view the object 299 without further obstructing operation of other autonomous transport vehicles 110 operating in the storage and retrieval system 100.
  • the vision system controller is configured, based on object information from the vision system 400 and vision system controller 122VC, to select an autonomous transport vehicle 110 safe path and trajectory bringing the autonomous transport vehicle 110 from a position at switching to a location 157 where the operator may view the object 299 without further obstructing operation of other autonomous transport vehicles 110 operating in the storage and retrieval system 100.
  • the operator may select or switch control of the autonomous guided vehicle (e.g., through the user interface UI) from automatic operation to collaborative operation (e.g., the operator remotely controls operation of the autonomous transport vehicle 110 through the user interface UI).
  • the user interface UI may include a capacitive touch pad/screen, joystick, haptic screen, or other input device that conveys kinematic directional commands (e.g., turn, acceleration, deceleration, etc.) from the user interface UI to the autonomous transport vehicle 110 to effect operator control inputs in the collaborative operational mode of the autonomous transport vehicle 110.
  • the vision system 400 of the supplemental hazard sensor system 290 provides a "dashboard camera" (or dash-camera) that transmits video and/or still images from the autonomous transport vehicle 110 to an operator (through user interface UI) to allow remote operation or monitoring of the area relative to the autonomous transport vehicle 110 in a manner similar to that described herein with respect to supplemental navigation sensor system 288.
  • the vision system controller 122VC (and/or controller 122) is in one or more aspects configured to provide remote viewing with the vision system 400, where such remote viewing may be presented to an operator in augmented reality or in any other suitable manner (such as unaugmented).
  • the autonomous transport vehicle 110 is communicably connected to the warehouse management system 2500 (e.g., via the control server 120) over the network 180 (or any other suitable wireless network).
  • the warehouse management system 2500 includes one or more warehouse control center user interfaces UI.
  • the warehouse control center user interfaces UI may be any suitable interfaces such as desktop computers, laptop computers, tablets, smart phones, virtual reality headsets, or any other suitable user interface configured to present visual and/or aural data obtained from the autonomous transport vehicle 110.
  • the vehicle 110 may include one or more microphones MCP (Fig. 2) where the one or more microphones and/or remote viewing may assist in preventative maintenance/troubleshooting diagnostics for storage and retrieval system components such as the vehicle 110, other vehicles, lifts, storage shelves, etc.
  • the warehouse control center user interfaces UI are configured so that warehouse control center users request or are otherwise supplied (such as upon detection of an unidentifiable object 299) with images from the autonomous transport vehicle 110 and so that the requested/supplied images are viewed on the warehouse control center user interfaces UI.
  • the images supplied and/or requested may be live video streams, pre-recorded images (saved in any suitable memory of the autonomous transport vehicle 110 or warehouse management system 2500), or images (e.g., one or more static images and/or dynamic video images) that correspond to a specified (either user selectable or preset) time interval or number of images taken on demand substantially in real time with a respective image request.
  • live video stream and/or image capture provided by the vision system 400 and vision system controller 122VC may provide for real-time remote controlled operation (e.g., teleoperation) of the autonomous transport vehicle 110 by a warehouse control center user through the warehouse control center user interface UI.
  • the live video is streamed from the vision system 400 of the supplemental navigation sensor system 288 and/or the supplemental hazard sensor system 290 to the user interface UI as a conventional video stream (e.g., the image is presented on the user interface without augmentation, what the camera "sees” is what is presented) as illustrated in Figs. 9A and 15.
  • Fig. 9A illustrates a live video that is streamed without augmentation from both the forward navigation cameras 420A, 420B (a similar video stream may be provided by the rearward navigation cameras 430A, 430B but in the opposite direction); while Fig. 15 illustrates a live video that is streamed without augmentation from the forward camera 292/477A (a similar video stream may be provided by the rearward camera 292/477B but in the opposite direction). Similar video may be streamed from any of the cameras of the supplemental navigation sensor system 288 and/or supplemental hazard sensor system 290 described herein. While Fig. 9A illustrates a side by side presentation of the forward navigation cameras 420A, 420B, the video stream, where requested by the user, may be for but one of the forward navigation cameras 420A, 420B.
  • images from the right side forward navigation camera 420A may be presented in a viewfinder of the virtual reality headset corresponding to the user's right eye and images from the left side forward navigation camera 420B may be presented in a viewfinder of the virtual reality headset corresponding to the user's left eye.
  • the live video is streamed from the vision system 400 of the supplemental navigation sensor system 288 to the user interface UI as an augmented reality video stream (e.g., a combination of live video and virtual objects are presented in the streamed video) as illustrated in Fig. 10A.
  • Fig. 10A illustrates a live video that is streamed with augmentation from one of the case unit monitoring cameras 410A, 410B (a similar video stream may be provided by the other of the case unit monitoring cameras 410A, 410B but offset by the separation distance between the cameras 410A, 410B).
  • Similar augmented video may be streamed from any of the cameras of the supplemental navigation sensor system 288 described herein.
  • the case units CU1, CU2, CU3 are presented to the user through the user interface UI in the live video stream as the case units are captured by the one of the case unit monitoring cameras 410A, 410B.
  • Virtual representations of the shelf 555 and slats 520L on which the case units CU1, CU2, CU3 are seated may be inserted into the live video stream by the vision system controller 122VC or other suitable controller (such as control server 120) to augment the live video stream.
  • the virtual representations of the shelf 555 and slats 520L may be virtually inserted into the live video stream such as where portions of the structure are not within the field of view 410AF, 410BF of the case unit monitoring cameras 410A, 410B (or a field of view of whichever camera of the supplemental navigation sensor system 288 is capturing the video).
  • the virtual representations of the storage and retrieval structure may be virtually inserted into the live video streams to supplement/augment the live video stream with information that may be useful to the user (e.g., to provide a completed "picture" of what is being "observed” by the autonomous transport vehicle) where such information is not captured by cameras or not clearly discernable in the camera image data.
  • the virtual representations of the storage and retrieval structure that are virtually inserted into the live video stream are obtained by the vision system controller 122VC (or control server 120) from the virtual model 400VM.
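For illustration only, inserting virtual structure from a model into a live frame amounts to projecting model points into the camera image; the pinhole projection below uses assumed intrinsics, geometry, and names:

```python
import numpy as np

def project_model_points(points_world, world_to_cam, fx, fy, cx, cy):
    """Project 3-D points from the virtual model into pixel coordinates so they
    can be drawn over the live camera frame as an augmentation overlay.

    points_world: (N, 3) model points (e.g., shelf/slat edge samples).
    world_to_cam: 4x4 homogeneous transform from the model frame to the camera
                  frame (derived from the vehicle pose and camera calibration).
    fx, fy, cx, cy: pinhole intrinsics of the augmenting camera.
    """
    pts = np.hstack([np.asarray(points_world, float),
                     np.ones((len(points_world), 1))])
    cam = (np.asarray(world_to_cam) @ pts.T).T[:, :3]
    cam = cam[cam[:, 2] > 0]                      # keep points in front of camera
    u = fx * cam[:, 0] / cam[:, 2] + cx
    v = fy * cam[:, 1] / cam[:, 2] + cy
    return np.stack([u, v], axis=1)

# A shelf edge 2 m ahead of the camera, spanning 1 m laterally (assumed geometry):
edge = [[x, 0.2, 2.0] for x in np.linspace(-0.5, 0.5, 5)]
pixels = project_model_points(edge, np.eye(4), fx=600, fy=600, cx=320, cy=240)
print(pixels.round(1))   # pixel positions at which to draw the virtual shelf edge
```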
  • the video streams may be augmented to provide the operator with a transport path VTP and/or destination location indicator DL that provide the operator with guidance as to a destination location of the autonomous transport vehicle 110.
  • the transport path VTP and destination location indicator DL may also be presented in the video streams with the autonomous transport vehicle operating in the automatic/autonomous and quasi automatic operation modes to provide an operator with an indication of the planned route and destination.
  • the method includes providing the autonomous transport vehicle 110 (Fig. 12, Block 1200) as described herein.
  • Sensor data is generated (Fig. 12, Block 1205) with the physical characteristic sensor system 270 where, as described herein, the sensor data embodies at least one of a vehicle navigation pose or location information and payload pose or location information.
  • Image data is captured (Fig. 12, Block 1210) with the supplemental navigation sensor system 288 where, as described herein, the image data informs the at least one of a vehicle navigation pose or location and payload pose or location supplement to the information of the physical characteristic sensor system 270.
  • the method may also include determining, with the vision system controller 122VC, vehicle pose and location from the information of the physical characteristic sensor system 270 (Fig. 12, Block 1220), effecting independent guidance of the autonomous transport vehicle 110 traversing the storage and retrieval system 100 facility.
  • the vision system controller 122VC may also determine, from the information of the physical characteristic sensor system 270, payload (e.g., case unit CU) pose and location (Fig. 12, Block 1225), effecting independent underpick and place of the payload to and from the storage location and independent underpick and place of the payload in the payload bed 210B as described herein.
  • the vision system controller 122VC may also register the captured image data and generate therefrom at least one image of one or more features of the predetermined features (Fig. 12, Block 1215) where, as described herein, the at least one image is formatted as a virtual representation VR of the one or more predetermined features so as to provide comparison to one or more corresponding references (e.g., a corresponding feature of the virtual model 400VM that serves as a reference for identifying the form and/or location of the imaged predetermined feature) of the predetermined features of the reference representation 400VMR.
  • the vision system controller 122VC is configured so that the virtual representation VR, of the imaged one or more features of the predetermined features, is effected resident on the autonomous transport vehicle 110, and the comparison between the virtual representation VR of the one or more imaged predetermined features and the one or more corresponding reference predetermined features (of the reference representation 400VMR) is effected resident on the autonomous transport vehicle 110.
  • the vision system controller 122VC may confirm autonomous guided vehicle pose and location information or payload pose and location information (Fig. 12, Block 1230) registered by the vision system controller 122VC from the physical characteristic sensor system 270 based on the comparison between the virtual representation VR and the reference representation 400VMR.
  • the vision system controller 122VC may identify a variance in the autonomous transport vehicle 110 pose and location or a variance in the payload pose and location (Fig. 12, Block 1235) based on the comparison between the virtual representation VR and the reference representation 400VMR, and update or complete the autonomous transport vehicle 110 pose or location information, or update or complete the payload pose and location information, from the physical characteristic sensor system 270 based on the variance.
  • in the method, the vision system controller 122VC may determine a pose error (for the autonomous guided vehicle and/or the payload) (Fig. 12, Block 1240) in the information from the physical characteristic sensor system 270 and fidelity of the pose and location information (for the autonomous guided vehicle and/or the payload) from the physical characteristic sensor system 270 based on at least one of the identified variance and image analysis of the at least one image (e.g., from the vision system 400), and assign a confidence value according to at least one of the pose error and the fidelity.
  • the vision system controller 122VC switches payload handling based on pose and location information generated from the virtual representation VR in place of pose and location information from the physical characteristic sensor system 270; and/or with the confidence value below a predetermined threshold, the vision system controller 122VC switches autonomous guided vehicle 110 navigation based on pose and location information generated from the virtual representation VR in place of pose and location information from the physical characteristic sensor system 270.
  • the controller is configured to: continue autonomous guided vehicle navigation to destination or select an autonomous guided vehicle safe path and trajectory bringing the autonomous guided vehicle from a position at switching to a safe location for shut down, or initiate communication to an operator identifying autonomous guided vehicle kinematic data and a destination for operator selection of autonomous guided vehicle control from automatic operation to quasi automatic operation or manual operation via a user interface device; and/or continue autonomous guided vehicle handling to destination, or initiate communication to an operator identifying payload data along with an operator selection of autonomous guided vehicle control from automatic payload handling operation to quasi automatic payload handling operation or manual payload handling operation via a user interface device.
  • the controller transmits, via a wireless communication system (such as network 180) communicably coupling the vision system controller 122VC and the operator/user interface UI, a simulation image (see Figs. 9A, 10A, 10B) (Fig. 12, Block 1245) combining the virtual representation VR of the one or more imaged predetermined features and one or more corresponding reference predetermined features RPF of a reference presentation RPR presenting the operator with an augmented reality image in real time.
  • the vision system controller 122VC receives real time operator commands to the traversing autonomous guided vehicle 110, which commands are responsive to the real time augmented reality image (see Figs. 9A, 10A, 10B), and changes in the real time augmented reality image transmitted to the operator by the vision system controller 122VC.
  • referring to Figs. 1A, 1B, 2, 4A, 4B, and 14, an example of an autonomous transport vehicle 110 case unit(s) transfer transaction, including a case unit(s) multi-pick and place operation with on-the-fly sortation of the case units for creating a mixed pallet load MPL (e.g., a pallet load having mixed cases or cases having different stock keeping units as shown in Fig. 1B) according to a predetermined order out sequence, will be described in accordance with aspects of the disclosed embodiment.
  • the autonomous transport vehicle 110 picks at least a first case unit CUA from a first shelf of a first storage location 130S1 of picking aisle 130A1 (Fig. 14, Block 1400). As described above, localization of the autonomous transport vehicle 110 relative to the case unit CUA in storage location 130S1 is effected with the physical characteristic sensor system 270 and/or the supplemental navigation sensor system 288 in the manner described herein.
  • the autonomous transport vehicle 110 traverses the picking aisle 130A1 and buffers the at least first case unit CUA within the payload bed 210B (Fig. 14, Block 1410).
  • the autonomous transport vehicle 110 traverses the picking aisle 130A1 to a second storage location 130S2 and picks at least a second case unit CUB that is different than the at least first case unit CUA (Fig. 14, Block 1420). While the at least second case unit CUB is described as being in the same picking aisle 130A1 as the at least first case unit CUA, in other aspects the at least second case unit CUB may be in a different aisle or any other suitable holding location (e.g., transfer station, buffer, inbound lift, etc.) of the storage and retrieval system.
  • the at least first case unit CUA and the at least second case unit CUB may comprise more than one case in an ordered sequence corresponding to a predetermined case out order sequence of mixed cases.
  • the autonomous guided vehicle 110 traverses the picking aisle 130A1 and/or transfer deck 130B, with both the at least first case unit CUA and the at least second case unit CUB held within the payload bed 210B, to a predetermined destination (such as outbound lift 150B1).
  • the positions of the at least first case unit CUA and the at least second case unit CUB within the payload bed 210B may be monitored by at least one or more of the case unit monitoring cameras 410A, 410B, one or more three-dimensional imaging systems 440A, 440B, and one or more case edge detection sensors 450A, 450B, and the case units may be arranged relative to one another within the payload bed 210B (e.g., with the justification blades 471, pushers 470, and/or pullers 472) based on data obtained from the at least one or more of the case unit monitoring cameras 410A, 410B, one or more three-dimensional imaging systems 440A, 440B, and one or more case edge detection sensors 450A, 450B (e.g., the supplemental navigation sensor system 288 at least in part effects on-the-fly justification and/or sortation of case units onboard the vehicle 110 in a manner substantially similar to that described in United States patent number 10,850,921, the disclosure of which was previously incorporated herein by reference in its entirety).
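A sketch, under assumed single-axis geometry and hypothetical names, of how on-the-fly sortation could order buffered case units in the payload bed according to a predetermined case-out sequence:

```python
def plan_onboard_sortation(buffered_cases, order_out_sequence, slot_pitch_m=0.35):
    """Compute target positions along the payload bed that place buffered cases
    in the predetermined case-out order (an illustrative sketch; the slot pitch
    and single-axis layout are assumptions).

    buffered_cases:     mapping case_id -> detected position (m) along the bed.
    order_out_sequence: case ids in the order they must leave the vehicle.
    Returns a list of (case_id, current_position, target_position) moves.
    """
    moves = []
    for slot, case_id in enumerate(order_out_sequence):
        target = slot * slot_pitch_m
        current = buffered_cases[case_id]
        if abs(current - target) > 0.01:          # only move if out of tolerance
            moves.append((case_id, current, target))
    return moves

detected = {"CUA": 0.36, "CUB": 0.02}             # positions from the vision system
print(plan_onboard_sortation(detected, ["CUA", "CUB"]))
# -> [('CUA', 0.36, 0.0), ('CUB', 0.02, 0.35)]  swap to match the out sequence
```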
  • the autonomous transport vehicle 110 is localized (e.g., positioned) relative to the destination location with the physical characteristic sensor system 270 and/or the supplemental navigation sensor system 288 in the manner described herein.
  • the autonomous transport vehicle 110 places the at least first case unit CUA and/or the at least second case unit CUB (Fig. 14, Block 1430) where the transfer arm 210A is moved based on data obtained by one or more of the physical characteristic sensor system 270 and/or the supplemental navigation sensor system 288.
  • the method includes providing the autonomous transport vehicle 110 (Fig. 16, Block 1700) as described herein.
  • the autonomous transport vehicle 110 is configured to autonomously navigate to different positions with the navigation system and operates to effect predetermined transfer tasks at the different positions (Fig. 16, Block 1705) while incidentally capturing image data (Fig. 16, Block 1710) with the supplemental hazard sensor system 290.
  • the image data informs objects and/or spatial features 299 (having intrinsic physical characteristics) within at least a portion of the facility 100 viewed by the at least one camera 292 of the supplemental hazard sensor system 290 with the autonomous transport vehicle 110 in the different positions in the facility 100.
  • the method may also include determining, with the vision system controller 122VC, from the information of the supplemental hazard sensor system 290, presence of a predetermined physical characteristic of at least one object or spatial feature (Fig. 16, Block 1715), and in response thereto, selectably reconfiguring the vehicle from an autonomous state to a collaborative vehicle state (Fig. 16, Block 1720) for collaboration with an operator, where the vehicle in the collaborative state is disposed to receive operator commands for the vehicle to continue effecting vehicle operation so as to finalize discrimination of the object or spatial feature 299 as a hazard (Fig. 16, Block 1725) and identify a mitigation action of the vehicle with respect to the hazard (Fig. 16, Block 1730) as described herein.
  • the vision system controller 122VC may also register the captured image data and generate therefrom at least one image of the presence of a predetermined physical characteristic of the at least one object or spatial feature 299 (Fig. 16, Block 1735) where, as described herein, the at least one image is formatted as a virtual representation VR of the predetermined physical characteristic of the at least one object or spatial feature 299 so as to provide comparison to one or more corresponding references (e.g., a corresponding feature of the virtual model 400VM that serves as a reference for identifying the form and/or location of the imaged object or spatial feature 299) of the predetermined features of the reference representation 400VMR.
  • the vision system controller 122VC is configured so that the virtual representation VR, of the imaged object or spatial feature 299, is effected resident on (e.g., onboard) the autonomous transport vehicle 110, and the comparison between the virtual representation VR of the object or spatial feature 299 and the one or more corresponding reference predetermined features (of the reference representation 400VMR) is effected resident on the autonomous transport vehicle 110.
  • the vision system controller 122VC may determine presence of an unknown physical characteristic of the at least one object or spatial feature and switch the autonomous transport vehicle 110 from an autonomous operation state to a collaborative operation state.
  • the controller 122 is configured to: stop the autonomous transport vehicle 110 relative to the object or spatial feature 299 or select an autonomous guided vehicle path and trajectory bringing the autonomous transport vehicle 110 from a position at switching to a location 157 to initiate communication to an operator for identifying the object or spatial feature 299 via a user interface device UI.
  • the controller 122VC transmits, via a wireless communication system (such as network 180) communicably coupling the vision system controller 122VC and the operator/user interface UI, an image (see Fig. 15) (Fig. 16, Block 1740) combining the virtual representation VR of the one or more imaged object or spatial feature 299 and one or more corresponding reference predetermined features RPF of a reference presentation RPR presenting the operator with an augmented (or un-augmented) reality image in real time.
  • the controller 122 receives real time operator commands to the autonomous transport vehicle 110, which commands are responsive to the real time augmented reality or unaugmented image (see Fig. 15), and changes in the real time augmented reality or un-augmented image transmitted to the operator by the vision system controller 122VC.
  • the autonomous transport vehicle 110 includes the controller 122 that is coupled respectively to the drive section 261D, the case handling assembly 210, the peripheral electronics section 778, and other components/features of the autonomous transport vehicle 110 that are described herein so as to form a control system 122CS (see Figs. 25A-25C).
  • the control system 122CS effects each autonomous operation of the autonomous transport vehicle 110 described herein.
  • the control system 122CS may be configured to provide communications, supervisory control, vehicle localization, vehicle navigation and motion control, payload sensing, payload transfer, and vehicle power management as described herein. In this and other aspects, the control system may also be configured to provide any suitable services to the vehicle 110.
  • the control system 122CS includes any suitable non- transitory program code and/or firmware that configure the vehicle 110 to perform the vehicle operations described herein.
  • the control system 122CS may be configured for (but is not limited to) one or more of remote updating of control system firmware/software, remote debugging of the vehicle 110, remote operation of the vehicle 110, tracking a position of the vehicle 110, tracking operational status of the vehicle 110, and tracking any other suitable information pertaining to the vehicle 110.
  • the control system 122CS is a distributed control system that includes, as described herein, the controller 122, the vision system controller 122VC, and the power management section 444 (which includes the switching device 449 and the monitoring and control device 447).
  • the vision system controller 122VC and the power management section 444 are at least partially integral to the controller 122; while in other aspects one or more of the vision system controller 122VC and the power management section 444 are separate from but communicably coupled to the controller 122.
  • Components of the control system may be distributed throughout the autonomous transport vehicle 110 and communicably coupled to the controller 122 in any suitable manner (such as described in Figs. 25A-25C).
  • the controller 122 includes at least one of an autonomous navigation control section 122N and an autonomous payload handling control section 122H.
  • the autonomous navigation control section 122N is configured to register and hold in a volatile memory (such as memory 446 of a comprehensive power management section 444 of the controller 122) autonomous guided vehicle state and pose navigation information that is deterministic (and provided in real time) of and describes current and predicted state, pose, and location of the autonomous transport vehicle 110.
  • the autonomous transport vehicle state and pose navigation information includes both historic and current autonomous guided vehicle state and pose navigation information.
  • the state, pose, and location information is deterministic (and provided in real time) and describes the current and predicted state, pose, and location in up to six degrees of freedom X, Y, Z, Rx, Ry, Rz so that the historic, current and predicted state, pose, and location is described in full.
  • the autonomous payload handling control section 122H is configured to register and hold in the volatile memory (such as memory 446) current payload identity, state, and pose information (e.g., both historic and current).
  • the payload identity, state, and pose information describes historic and current payload identity, payload pose and state location relative to a frame of reference of the autonomous transport vehicle (e.g., such as the X, Y, Z coordinate axes and suitable datum surfaces within the payload bed 210B), and pick/place locations of current and historic payloads.
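The registers described above might be represented by data structures along the following lines (the field names and granularity are assumptions, not the patent's data layout):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class VehiclePose:
    """Full six-degree-of-freedom pose: X, Y, Z translation and Rx, Ry, Rz rotation."""
    x: float
    y: float
    z: float
    rx: float
    ry: float
    rz: float
    timestamp: float = 0.0

@dataclass
class NavigationRegister:
    """Volatile register of historic, current, and predicted vehicle state/pose."""
    historic: List[VehiclePose] = field(default_factory=list)
    current: VehiclePose = field(default_factory=lambda: VehiclePose(0, 0, 0, 0, 0, 0))
    predicted: VehiclePose = field(default_factory=lambda: VehiclePose(0, 0, 0, 0, 0, 0))

    def register(self, new_pose: VehiclePose, predicted: VehiclePose) -> None:
        self.historic.append(self.current)  # the prior current pose becomes history
        self.current = new_pose
        self.predicted = predicted

@dataclass
class PayloadRegister:
    """Volatile register of payload identity, state, and pose in the vehicle frame."""
    payload_id: str
    state: str                              # e.g., "buffered", "transferring", "placed"
    pose_in_bed: VehiclePose
    pick_location: str
    place_location: str

nav = NavigationRegister()
nav.register(VehiclePose(12.0, 3.5, 0.0, 0.0, 0.0, 1.57, timestamp=100.0),
             VehiclePose(12.5, 3.5, 0.0, 0.0, 0.0, 1.57, timestamp=100.1))
print(len(nav.historic), nav.current.rz)    # 1 1.57
```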
  • the controller 122 comprises a comprehensive power management section 444 (also referred to as a power distribution unit - see also Fig. 26) that is separate and distinct from each other section (such as the vision system controller 122VC) of the controller 122.
  • the power distribution unit 444 is communicably connected to the power supply 481 so as to monitor a charge level (e.g., voltage level or current level) of the power supply 481.
  • the power distribution unit 444 is connected to each respective branch circuit 482 (also referred to herein as a branch power circuit - see Fig.
  • the power distribution unit 444 is configured to comprehensively manage power consumption to each respective branch circuit 482 based on demand level of each branch circuit 482 relative to the charge level available from the power supply 481.
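An illustrative priority-based allocation policy (not the patent's algorithm; names and values are assumptions) showing how branch circuit demand could be managed against the power available from the supply:

```python
def manage_branch_power(available_w: float, branch_demands_w: dict,
                        priorities: dict) -> dict:
    """Allocate the power available from the supply across branch circuits by
    priority, shedding or throttling lower-priority branches when total demand
    exceeds what the charge level supports.

    branch_demands_w: branch name -> requested power (W).
    priorities:       branch name -> smaller number = higher priority.
    Returns branch name -> granted power (W).
    """
    granted = {}
    remaining = available_w
    for branch in sorted(branch_demands_w, key=lambda b: priorities[b]):
        grant = min(branch_demands_w[branch], remaining)
        granted[branch] = grant
        remaining -= grant
    return granted

demands = {"drive_wheels": 400.0, "transfer_arm": 250.0, "lights": 60.0}
priority = {"drive_wheels": 0, "transfer_arm": 1, "lights": 2}
print(manage_branch_power(600.0, demands, priority))
# {'drive_wheels': 400.0, 'transfer_arm': 200.0, 'lights': 0.0}
```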
  • the power distribution unit 444 includes a monitoring and control device 447 (referred to herein as monitoring device 447), a switching device 449 (having switches 449S), a memory 446, a wireless communications module 445, and an analog to digital converter 448 (referred to herein as AD converter 448).
  • the monitoring device 447 is any suitable processing device configured to monitor at least the current usage and fuse status of the branch power circuits 482 and control shutdown of one or more selected branch power circuits 482 as described herein.
  • the monitoring device 447 is one or more of a field-programmable gate array (FPGA), a complex programmable logic device (CPLD), a system on chip integrated circuit (SOC), and a central processing unit (CPU).
• the monitoring device 447 operates independently of the controller 122 and vision system controller 122VC, and the monitoring device 447 is programmed with non-transitory code to manage (e.g., at least power distribution to) one or more low level systems of the autonomous transport vehicle 110.
  • the power distribution unit 444 is configured to communicate with and control at least one branch device 483.
• the power distribution unit 444 is communicably coupled to one or more of the analog sensors 483C (e.g., case edge detection sensors, line following sensors 275, and other analog sensors as described herein), the digital sensors 483B (e.g., cameras 410, 440, 450 of the vision system 400 and other digital sensors described herein), lights 483A, casters 250, drive/traction wheels 260, transfer arm 210A extension motors 667A-667C, transfer arm lift motors 669, payload justification motors 668A-668F of the payload bed 210B/transfer arm 210A, suspension lock motors, and any other suitable features of the autonomous transport vehicle 110.
• the power distribution unit 444 may receive commands from the controller 122 to actuate one or more of the analog sensors 483C and the digital sensors 483B so that the one or more of the analog sensors 483C and the digital sensors 483B obtain one or more of pose and location information of the autonomous transport vehicle within the storage and retrieval system 100 storage structure 130 in a manner substantially similar to that described herein, and in United States patent numbers 8,425,173 titled “Autonomous Transport for Storage and Retrieval Systems” issued on April 23, 2013; 9,008,884 titled “Bot Position Sensing” issued on April 14, 2015; and 9,946,265 titled “Bot Having High Speed Stability” issued on April 17, 2018, the disclosures of which are incorporated herein by reference in their entireties.
  • the power distribution unit 444 is configured to process and filter (in any suitable manner) the sensor data obtained by the one or more of the analog sensors 483C and the digital sensors 483B.
  • the power distribution unit 444 may also be configured to process and filter (in any suitable manner) control signals sent by the controller 122 (or vision system controller 122VC) to the one or more of the analog sensors 483C and the digital sensors 483B.
  • the power distribution unit 444 includes the AD converter 448 to effect conversion of the analog sensor data to digital sensor data for filtering and processing by the power distribution unit 444.
  • the autonomous transport vehicle may include lights 483A (Fig. 20, see also lighting/LED in Figs. 25A-25C) that are coupled to the frame 200 (or any other location of the autonomous transport vehicle 110) and that illuminate portions of the storage structure 130 adjacent the autonomous transport vehicle 110.
  • the power distribution unit 444 is configured to control operation of the lights 483A.
  • the power distribution unit 444 is configured to provide a pulse width modulation control signal to the lights 483A to actuate the lights 483A in a manner that minimizes power consumption.
• the pulse width modulation control signal is configured to minimize an amount of power drawn from the power supply 481 for illuminating the lights 483A for a given autonomous transport vehicle task (e.g., reading a barcode with the vision system 400, detecting case unit features with the vision system, or illumination of a portion of the storage and retrieval system 100 for remote operator viewing effected by the vision system as described herein).
  • the lights 483A may be any suitable lights including but not limited to light emitting diodes (LED).
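• the pulse width modulation control of the lights 483A described above may be illustrated with the following minimal sketch (Python, illustrative only; the task names, duty cycle values, and the write_pwm interface are assumptions for illustration and are not taken from this disclosure):

```python
# Hypothetical sketch: choose a PWM duty cycle for the lights 483A based on the
# current vehicle task so that average power drawn from the power supply 481 is
# kept low. Task names, duty values, and write_pwm() are illustrative assumptions.
LIGHT_DUTY_BY_TASK = {
    "barcode_read": 0.80,          # brighter, short-duration illumination
    "case_feature_detect": 0.50,
    "remote_operator_view": 0.30,
}

def set_light_pwm(light_branch, task, frequency_hz=1000):
    """Apply a task-sized duty cycle to the lighting branch circuit."""
    duty = LIGHT_DUTY_BY_TASK.get(task, 0.10)  # dim idle level by default
    light_branch.write_pwm(frequency_hz=frequency_hz, duty_cycle=duty)
    return duty
```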
  • the power distribution unit 444 is configured to manage power needs of the autonomous transport vehicle 110 so as to preserve higher level functions/operations of the autonomous transport vehicle 110.
• the power distribution unit 444 is configured so as to comprehensively manage a demand charge level of each respective branch power circuit 482 (on which respective branch devices 483A-483F ... 483n (collectively referred to as branch devices 483, where n denotes an integer representing a maximum number of branch devices) are disposed) with respect to other branch power circuits 482 and the charge level available from the power supply 481, switching the branch power circuits 482 on and off in a predetermined pattern.
  • the predetermined pattern (e.g., for switching off the branch power circuits 482) is arranged to switch off branch power circuits 482 with a decrease in the available charge level from the power supply 481, so as to maximize available charge level from the power supply 481 directed to the controller 122.
  • the predetermined pattern is arranged to switch off the branch power circuits 482 with the decrease in the available charge level from the power supply 481 so that the available charge level directed to the controller 122 is equal to or exceeds the demand charge level of the controller 122 for a maximum time based on the available charge level of the power supply 481 (e.g., to preserve operation of the controller 122).
• the monitoring device 447 of the power distribution unit 444 is configured to monitor the voltage of the power supply 481 (Fig. 23, Block 23800) as described herein and shut down components/systems (e.g., analog sensors, digital sensors, drive systems, communications systems, etc.) of the autonomous transport vehicle 110 in a sequenced shutdown order where each shutdown operation in the sequenced shutdown order depends on a respective threshold voltage of the power supply.
• the power supply 481 has a fully charged voltage of V1. With the power distribution unit 444 detecting the voltage V1, the components/systems of the autonomous transport vehicle 110 are substantially fully operational to effect transport of case units throughout the storage structure 130.
• the voltage of the power supply 481 may drop (and the power distribution unit 444 detects such voltage drop) to a first predetermined threshold voltage V2 (where V2 is less than V1).
• where the power distribution unit 444, monitoring the power supply 481 voltage, detects that the power supply voltage drops to a voltage equal to about the first predetermined threshold voltage V2 (Fig. 23), the power distribution unit 444 may operate the switches 449S to remove power from (e.g., shut down) branch power circuits 482 corresponding to case unit handling components/systems (e.g., arm extension drives 667, payload justification drives 668, arm lift drives 669, case unit sensors, arm/case unit justification position sensors, suspension locks, etc.) of the autonomous transport vehicle 110 (Fig. 23, Block 23820) so that remaining power of the power supply 481 may be employed to effect traverse of the autonomous transport vehicle to a charging station/location or other predetermined location within the storage structure 130.
  • the controller 122 may effect traverse of the autonomous transport vehicle to a safe location as described herein (e.g., a predetermined location of the storage and retrieval system where the autonomous vehicle may be accessed by an operator for maintenance or removal from the storage structure 130).
  • the power distribution unit 444 continues to monitor the voltage of the power supply 481 for a drop in the power supply voltage to a subsequent (e.g., next) lower threshold voltage (Fig. 23, Block 23830).
• the power distribution unit 444 operates the switches 449S to remove power from (e.g., shut down) branch power circuits 482 (such as circuits 483D, 483F) corresponding to drives/systems that effect vehicle traverse (e.g., the right and left drive/traction wheels 260A, 260B (Figs. 2 and 21), caster wheel steering drives 600M (Fig. 2), traction control system 666 (Fig. 21), and sensors and sensor controllers effecting vehicle navigation (e.g., vision system, line following sensors, etc. such as provided with sensor system 270)) (Fig. 23, Block 23840) so that remaining power of the power supply 481 may be employed to effect operation of the controller 122 of the autonomous transport vehicle 110.
  • primary communications between the autonomous transport vehicle 110 and the control server 120 and/or an operator may also be shut down to preserve power for the controller 122.
  • the communications module 445 of the power distribution unit 444 operates to maintain a secondary communications channel between the controller 122 and the control server 120 and/or an operator (e.g., via the laptop, smart phone/tablet, etc.).
  • the power distribution unit 444 continues to monitor the voltage of the power supply 481 for the next subsequent lower threshold voltage (Fig. 23, Block 23850). For example, where a threshold voltage V4 (where V4 is less than V3) of the power supply 481 is detected by the power distribution unit 444, the power distribution unit 444 is configured to initiate shutting down of the controller 122 (Fig. 23, Block 23860) so that the controller 122 (and its software) is not adversely affected by a loss of power or an under-voltage/under-current failure.
  • the controller 122 is configured so that upon indication from (e.g., a prediction by) the power distribution unit 444 of imminent decrease in available charge level, directed from the power supply 481 to the controller 122, to less than a demand level of the controller 122, the controller 122 enters suspension of operation and hibernation. With the controller 122 in suspension and hibernating (e.g., shut down) the power distribution unit 444 may also shut itself down so that substantially all operations of the autonomous transport vehicle 110 are suspended.
  • the threshold voltage V4 is described above as the "lowest threshold voltage” such that detection of the threshold voltage V4 initiates shutdown of the controller 122.
• the above shut down sequence effected by the power distribution unit 444 is exemplary only and in other aspects there may be any suitable number of threshold voltages at which any suitable number of corresponding vehicle components/systems are shut down to preserve power of the power supply 481.
  • Blocks 23830 and 23840 of Fig. 23 may be repeated in a loop until the next to lowest threshold voltage is reached.
  • each threshold voltage in the descending values of threshold voltages is known to the power distribution unit 444 (such as stored in memory 446 and accessible by the monitoring device 447) such that the loop ends when the next to lowest threshold voltage is reached.
  • the autonomous transport vehicle 110 has a power supply 481 with a fully charged voltage of about 46V (in other aspects the fully charged voltage may be more or less than about 46V).
  • the power distribution unit 444 monitors the voltage output by the power supply 481 during autonomous transport vehicle 110 operation in a manner similar to that described above with respect to Fig. 23.
  • the power distribution unit 444 operates the switches 449S to disable the traction motors 261M and other features (e.g., sensors associated with navigation/traverse of the autonomous transport vehicle) of the autonomous transport vehicle so that driving of the autonomous transport vehicle is disabled.
  • the power distribution unit 444 continues to monitor the output voltage of the power supply 481 for the next lowest threshold voltage of about 20V (in other aspects the output voltage may be more or less than about 20V). Upon detection of the threshold voltage of about 20V, the power distribution unit 444 effects, through the controller 122, positioning of any case units CU carried by the autonomous transport vehicle 110 to a known safe state (e.g., retracted into the payload bed 210B in a predetermined justified location) within the payload bed 210B.
• the controller 122 may effect extension of the transfer arm 210A to place the case unit(s) CU at the destination location rather than retract the case unit(s) CU into the payload bed 210B (noting that after placement of the case unit(s) CU the transfer arm 210A is retracted within the payload bed 210B to a safe/home position).
• the power distribution unit 444 is configured to operate the switches 449S, upon detection of the next lowest threshold voltage of about 18V of the power supply 481 (in other aspects the output voltage may be more or less than about 18V), so as to shut down the vision system 400 and other 24V peripheral power supplies (e.g., including but not limited to case detection sensors, vehicle localization sensors, hot swap circuitry, etc.).
• upon detection of the next lowest power supply 481 output threshold voltage of about 14V (in other aspects the output voltage may be more or less than about 14V), the power distribution unit 444 is configured to operate the switches 449S to disable onboard and off-board communications (e.g., wireless communications module 445 and onboard Ethernet communications) of the autonomous transport vehicle 110.
  • the power distribution unit 444 continues to monitor the power supply 481 output voltage for the next lowest threshold voltage of about 12V (in other aspects the output voltage may be more or less than about 12V), and upon detection of the about 12V output voltage the power distribution unit 444 turns off lighting (e.g., LEDs) of the autonomous transport vehicle 110 and provides command signals to the controller 122 so that the controller 122 is placed into hibernation/sleep as described above.
• upon detection of the lowest power supply 481 output threshold voltage of about 10V (in other aspects the output voltage may be more or less than about 10V), the power distribution unit 444 effects a complete shutdown of the autonomous transport vehicle 110 such that the controller 122, the vision system controller 122VC, and other suitable programmable devices (e.g., FPGAs, CPLDs, SOCs, CPUs, etc.) of the autonomous transport vehicle 110 are turned off/shut down.
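• as a non-limiting illustration of the sequenced, voltage-threshold-keyed shutdown described above, the following sketch (Python) walks the example thresholds of about 20V, 18V, 14V, 12V, and 10V in order; the step names, the read_supply_voltage and shut_down_branch interfaces, and the poll interval are hypothetical:

```python
import time

# Example descending thresholds (volts) paired with the branch/system groups shut
# down at each step, per the example values in the text; group names are illustrative.
SHUTDOWN_LADDER = [
    (20.0, "case_handling_drives_and_sensors"),
    (18.0, "vision_system_and_24v_peripherals"),
    (14.0, "onboard_and_offboard_communications"),
    (12.0, "lighting_then_controller_hibernate"),
    (10.0, "complete_vehicle_shutdown"),
]

def run_shutdown_monitor(read_supply_voltage, shut_down_branch, poll_s=0.1):
    """Fire each shutdown step once, in order, as the supply voltage falls
    through the step's threshold; return after the final step has run."""
    pending = list(SHUTDOWN_LADDER)
    while pending:
        voltage = read_supply_voltage()
        while pending and voltage <= pending[0][0]:
            _, step = pending.pop(0)
            shut_down_branch(step)
        time.sleep(poll_s)
```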
  • the monitoring device 447 is configured to substantially continuously (e.g., with the autonomous transport vehicle 110 in operation) monitor power supply 481 operation and status.
  • the monitoring device 447 is configured to substantially continuously (or at any suitable predetermined time intervals) monitor a voltage of the power supply 481 (e.g., with any suitable voltage sensors) and communicate a low voltage condition (e.g., the voltage has dropped below a predetermined voltage level) to the controller 122 so that the controller 122 may effect a safe state of the autonomous transport vehicle 110.
• the controller 122 is configured (e.g., via the monitoring device 447) so that upon indication from the power distribution unit 444 of imminent decrease in available charge level of the power supply 481, directed from the power supply 481 to the branch power circuit of the drive section 261D (see Fig. 21), the controller 122 commands the drive section 261D so as to navigate the autonomous transport vehicle 110 along a predetermined auxiliary path AUXP and auxiliary trajectory AUXT (known to be safe, nonconflicting with other vehicles 110, and not impeding or blocking other vehicle paths, pass-through locations, or destination locations - see Fig. 1B) to a predetermined bot auxiliary stop location 157 in the storage and retrieval facility (e.g., structure) 130.
  • the predetermined auxiliary stop location 157 is a safe, uncongested area of a transport deck 130B or picking aisle 130A or a human access zone (such as described in United States patent number 10,088,840 titled “Automated Storage and Retrieval System with Integral Secured Personnel Access Zones and Remote Rover Shutdown” issued on October 2, 2018, the disclosure of which is incorporated herein by reference in its entirety).
  • the controller 122 is configured so that upon indication from the power distribution unit 444 of imminent decrease in available charge level of the power supply 481, directed from the power supply 481 to the branch circuit of the payload handling section 210 (see Fig. 21) the controller 122 is configured to command the payload handling section 210 to move the payload handling actuator or transfer arm 210A (e.g., via one or more of arm extension drives 667 and arm lift drives 669), and any payload thereon (e.g., via payload justification drives 668), to a predetermined safe payload position in the payload bed 210B.
  • the safe payload position may be such that the payload does not overhang outside of the payload bed and is securely held within the payload bed 210B.
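• a minimal sketch (Python, with hypothetical interfaces) of the controller's response to an indicated imminent charge shortfall on a given branch circuit, per the two cases described above, might look like:

```python
# Hypothetical dispatch: which safe-state action the controller 122 commands
# depends on which branch circuit the predicted charge shortfall affects.
def on_imminent_charge_shortfall(affected_branch, vehicle):
    if affected_branch == "drive_section_261D":
        # Traverse the predetermined auxiliary path/trajectory to stop location 157.
        vehicle.navigate_to(vehicle.auxiliary_stop_location)
    elif affected_branch == "payload_handling_210":
        # Retract the transfer arm and justify any payload to the safe position
        # within the payload bed 210B.
        vehicle.transfer_arm.move_payload_to_safe_position()
```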
  • the controller 122 may also be configured to actively monitor a health status of the autonomous transport vehicle 110 and effect onboard diagnostics of vehicle systems.
  • vehicle system health is monitored in any suitable manner such as by monitoring current used and fuse status of the vehicle systems (and the branch power circuits 482 of which the branch devices 483 are a part).
  • the controller 122 includes at least one of a vehicle health status monitor 447V, a drive section health status monitor 447D, a payload handling section health monitor 447H, and a peripheral electronics section health monitor 447P.
  • the vehicle health status monitor 447V, the drive section health status monitor 447D, the payload handling section health monitor 447H, and the peripheral electronics section health monitor 447P may be sections of the monitoring device 447.
  • the controller also includes a health status register section 447M, which may be a section of the memory 446 (or memory 122M or any other suitable memory accessible by the controller 122).
  • the vehicle health status monitor 447V may monitor dynamic responses of the frame 200 and wheel suspension, such as with any suitable vehicle health sensors (such as accelerometers) coupled to the frame (e.g., such as described in United States provisional patent application number 63/213,589 titled "Autonomous Transport Vehicle with Synergistic Vehicle Dynamic Response” and filed on June 22, 2021, the disclosure of which is incorporated herein by reference in its entirety). Where a dynamic response is outside of a predetermined range the vehicle health status monitor 447V may effect (through controller 122) a maintenance request (e.g., presented on user interface UI) to an operator of the storage and retrieval system 100. In other aspects, any suitable characteristics of the vehicle may be monitored by the vehicle health status monitor 447V.
  • the drive section health status monitor 447D may monitor power drawn by the motors 261M of the drive section 261D, drive section sensor (e.g., wheel encoders, etc.) status, and a status of the traction control system 666. Where the power usage of the motors 261M, drive section sensor responsiveness, and/or a traction control system response is outside of predetermined operating characteristics the drive section health status monitor 447D may effect (through controller 122) a maintenance request (e.g., presented on user interface UI) to an operator of the storage and retrieval system 100.
• the payload handling section health monitor 447H may monitor power drawn by the motors (e.g., extension, lift, justification, etc.) of the case handling assembly 210 and a status of the case handling assembly sensors. Where the power usage of the case handling assembly motors and/or a case handling assembly sensor response is outside of predetermined operating characteristics the payload handling section health monitor 447H may effect (through controller 122) a maintenance request (e.g., presented on user interface UI) to an operator of the storage and retrieval system 100.
  • the peripheral electronics section health monitor 447P may monitor the sensor system 270 and the at least one peripheral motor 777. Where the power usage of at least one peripheral motor 777 and/or a sensor (of the sensor system 270) response is outside of predetermined operating characteristics the peripheral electronics section health monitor 447P may effect (through controller 122) a maintenance request (e.g., presented on user interface UI) to an operator of the storage and retrieval system 100.
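• the range-check pattern shared by the health status monitors described above may be sketched as follows (Python; the monitored quantities, limit values, and request_maintenance interface are illustrative assumptions, not values from this disclosure):

```python
# Hypothetical limits for quantities monitored by the health monitors
# 447V/447D/447H/447P; any reading outside its range raises a maintenance request.
HEALTH_LIMITS = {
    "frame_dynamic_response_g": (0.0, 2.5),       # vehicle health monitor 447V
    "drive_motor_power_w": (0.0, 400.0),          # drive section monitor 447D
    "case_handling_motor_power_w": (0.0, 250.0),  # payload handling monitor 447H
    "peripheral_motor_power_w": (0.0, 50.0),      # peripheral electronics monitor 447P
}

def check_health(readings, request_maintenance):
    """Compare each reading to its predetermined range and request maintenance
    (e.g., presented on user interface UI) for any out-of-range value."""
    for name, value in readings.items():
        low, high = HEALTH_LIMITS[name]
        if not (low <= value <= high):
            request_maintenance(f"{name} out of range: {value}")
```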
  • the power distribution unit 444 is configured to monitor current in the branch power circuits 482 (in any suitable manner, such as directly with ammeters or indirectly by monitoring voltage and/or resistance of the respective branch power circuits 482) and a status of the respective fuses 484 of the branch power circuits 482.
• Real-time feedback (e.g., input data relating to current and fuse status is processed by the monitoring device 447 within milliseconds so that the processed data is available substantially immediately as feedback) to the controller 122 and control server 120 is provided to effect autonomous transport vehicle 110 operator and/or service/maintenance requests.
  • the real time feedback effected by the monitoring device 447 monitoring at least the branch power circuit 482 current and fuse 484 status provides for onboard diagnostics and health monitoring of the autonomous transport vehicle systems.
• the power distribution unit 444 is configured to detect the fuse 484 status (e.g., inoperable or operable) based on, for example, current of the respective branch power circuit 482. Where there is an absence of current detected in the respective branch power circuit 482 the monitoring device 447 determines that the fuse 484 is inoperable and in need of service (i.e., a fault state (see, e.g., Fig. 5) is detected), otherwise where current is detected the fuse 484 is operable.
  • the monitoring device 447 provides the fuse status (e.g., fault state) as feedback to, for example, the control server 120 and/or an operator through the communications module 445 so that servicing of the autonomous transport vehicle 110 can be scheduled.
• the power distribution unit 444 is configured to monitor each branch power circuit 482 separately from each other branch power circuit 482 so that where a fuse is determined to be inoperable the monitoring device 447 also identifies the branch power circuit 482 of which the fuse is a part so as to reduce downtime and troubleshooting of the autonomous transport vehicle 110 for fuse 484 replacement.
  • An increased current within a branch power circuit may be indicative of an impending drive motor fault, an impending bearing fault, or other impending electrical/mechanical fault.
  • each branch power circuit is monitored separately so that where an increased current is detected the corresponding branch power circuit 482 is also identified.
  • the monitoring device 447 provides the increased current value (e.g., fault state) and identifies the branch power circuit 482 with the overcurrent therein to, for example, the control server 120 and/or an operator through the communications module 445 so that servicing of the autonomous transport vehicle 110 can be scheduled.
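• the per-branch current/fuse diagnostic described above may be sketched as follows (Python, illustrative only; the interfaces and the overcurrent limit are assumptions): an absence of current on an energized branch is treated as an inoperable fuse 484, while an overcurrent is reported as an impending fault, in both cases identifying the affected branch power circuit 482:

```python
# Hypothetical per-branch diagnostic consistent with the description above.
def diagnose_branch(branch_id, current_amps, energized, overcurrent_limit_amps,
                    report_fault):
    """Report a fuse fault or impending overcurrent fault for one branch circuit."""
    if energized and current_amps == 0.0:
        report_fault(branch_id, "fuse 484 inoperable - service required")
    elif current_amps > overcurrent_limit_amps:
        report_fault(branch_id,
                     f"overcurrent {current_amps:.2f} A - possible impending fault")
```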
• the power distribution unit 444 is configured to monitor voltage regulators 490, branch device central processing units (CPUs) 491, and/or position sensors 492 of peripheral devices (e.g., such as transfer arm 210A, payload justification pushers/pullers, wheel encoders, navigation sensor systems (as described herein), and payload positioning sensor systems (as described herein)). It is noted that suitable examples of payload justification pushers/pullers are described in, for example, United States provisional patent application number 63/236,591 having attorney docket number 1127P015753-US (-#3) filed on August 24, 2021 and titled “Autonomous Transport Vehicle”; United States pre-grant publication number 2012/0189416 published on July 26, 2012 (United States patent application number 13/326, 52 filed on December 15, 2011) and titled “Automated Bot with Transfer Arm”; and United States patent number 7,591,630 issued on September 22, 2009 titled “Materials-Handling System Using Autonomous Transfer and Transport Vehicles”.
• the monitoring device 447 is configured to monitor communications between the position sensors 492 and the controller 122, communications between the branch device controller(s) 491 and the controller 122, and the voltage from the voltage regulators 490. Where communication is expected from a sensor 492 and/or branch device controller 491 but is not received, the monitoring device 447 may register a fault (e.g., time stamped) in the memory 446 and communicate such fault state (e.g., with the communications module 445) to the control server 120 and/or operator effecting a maintenance request.
• the monitoring unit 447 may continue to monitor and register faults from the branch device 483/branch power circuit 482 and send a service requested message to the control server 120 or operator depending on a frequency of the faults or any other suitable criteria.
• the monitoring device 447 is configured to monitor a voltage of a voltage regulator 490 for one or more power branch circuits 482 in any suitable manner (such as feedback from the voltage regulator or a voltmeter). Where there is an over-voltage or under-voltage detected by the monitoring device 447, the monitoring device 447 may register a fault (e.g., time stamped) in the memory 446 and communicate such fault state (e.g., with the communications module 445) to the control server 120 and/or operator effecting a maintenance request.
• the monitoring unit 447 may continue to monitor and register faults from the voltage regulator 490 and send a service requested message to the control server 120 or operator depending on a frequency of the faults or any other suitable criteria (such as a magnitude of the over-voltage or under-voltage).
  • the power distribution device 444 of the controller 122 is configured as a boot device so that at autonomous transport vehicle 110 cold startup (initialization) the monitoring device 447 is brought online before other sections of the controller 122 and vision system controller 122VC so as to set initial (safe) states of the autonomous transport vehicle 110 prior to boot-up of the controller 122 and vision system controller 122VC.
• the controller 122 is configured so that upon indication from the power distribution unit 444 of imminent decrease in available power supply charge level, directed from the power supply 481 to the controller 122, to less than a demand level of the controller 122, the controller 122 configures at least one of the autonomous guided vehicle state and pose navigation information and the payload identity, state, and pose information, held in respective registry and memory (e.g., such as memory 446 or other memories 122M of corresponding ones of the autonomous navigation control section 122N, the autonomous payload handling control section 122H, and the vision system control section (e.g., vision system controller 122VC)), into an initialization file 122F (Fig. 2) available on reboot of the controller 122.
• the controller 122 may also be configured so that upon indication from the power distribution unit 444 of imminent decrease in available power supply charge level, directed from the power supply 481 to the controller 122, to less than a demand level of the controller 122, the controller 122 configures stored health status information from the at least one of the vehicle health status monitor 447V, the drive section health status monitor 447D, the payload handling section health monitor 447H, and the peripheral electronics section health monitor 447P in the health status register section 447M (such as in memory 122M or memory 446) into an initialization file 122F available on reboot of the controller 122.
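• a minimal sketch (Python; the file name and field layout are assumptions) of writing the navigation, payload, and health status registers into an initialization file before hibernation, and restoring them on reboot, is shown below:

```python
import json

INIT_FILE = "init_122F.json"  # hypothetical file name for initialization file 122F

def write_initialization_file(nav_state, payload_state, health_registers):
    """Snapshot the registered state/pose and health information before hibernation."""
    with open(INIT_FILE, "w") as f:
        json.dump({"navigation": nav_state,
                   "payload": payload_state,
                   "health": health_registers}, f)

def load_initialization_file():
    """Restore the snapshot on reboot of the controller 122."""
    with open(INIT_FILE) as f:
        return json.load(f)
```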
  • the monitoring device 447 of the power distribution unit 444 is configured to control power up sequencing of the controller 122 sections (e.g., the autonomous navigation control section 122N, the autonomous payload handling control section 122H, and vision system controller 122VC), and branch devices 483 (e.g., sensors, drive motors, caster motors, transfer arm motors, justification device motors, payload bed 210B motors, etc.).
  • the sequencing may be that the vision system controller 122VC is powered up before the autonomous navigation control section 122N and the branch devices are powered up last; however, in other aspects any suitable power sequence may be employed such that control devices are powered up before the devices they control.
• an exemplary autonomous transport vehicle 110 power up or cold startup process will be described with the power distribution device 444 as a boot device.
• power to the autonomous transport vehicle 110 is turned on (Fig. 27, Block 2200) and the power distribution device 444 monitors the output voltage of the power supply 481 and determines if the output voltage is greater than a startup threshold voltage V1a (Fig. 27, Block 2205) of about 16V (in other aspects the startup threshold voltage V1a may be more or less than about 16V).
• the power distribution unit 444 operates switches 449S so that power is provided to, for example, the controller 122, the vision system controller 122VC, the wireless communications module 445, and the other suitable programmable devices (e.g., FPGAs, CPLDs, SOCs, CPUs, etc.) of the autonomous transport vehicle 110 (Fig. 27, Block 2210).
• the initialization file 122F (described above) may be employed on startup of the controller 122, the vision system controller 122VC, the wireless communications module 445, and the other suitable programmable devices (e.g., FPGAs, CPLDs, SOCs, CPUs, etc.) (Fig. 27).
• the power distribution unit 444 continues to monitor the voltage output by the power supply 481 and where the output voltage is detected as being above a next higher startup threshold voltage V2a (Fig. 27, Block 2220) of about 18V (in other aspects the startup threshold voltage V2a may be more or less than about 18V), the power distribution unit 444 operates switches 449S to turn on the lighting (e.g., LEDs - see Figs. 10A-10C) of the autonomous transport vehicle 110 (Fig. 27, Block 2225). Where the next higher startup threshold voltage V2a has not been reached the power distribution unit 444 continues to monitor the power supply 481 output voltage until the next higher startup threshold voltage V2a is reached (such as with the autonomous transport vehicle 110 being charged), or until a shutdown sequence is initiated (see Fig. 23 described herein).
• with the power distribution unit 444 continuing to monitor the voltage output of the power supply 481, and with a next higher startup threshold voltage V3a detected by the power distribution unit (Fig. 27, Block 2230), the power distribution unit 444 operates the switches 449S so as to power up/turn on the case handling drives of, for example, the front and rear justification modules 210ARJ, 210AFJ, payload bed 210B, and transfer arm 210A (Fig. 27, Block 2235) as well as 24V peripherals and instruments (see Figs. 25A-25C) of the autonomous transport vehicle 110.
  • the threshold voltage V3a may be about 24V but in other aspects the threshold voltage V3a may be more or less than about 24V.
• where the startup threshold voltage V3a has not been reached, the power distribution unit 444 continues to monitor the power supply 481 output voltage until the next higher startup threshold voltage V3a is reached (such as with the autonomous transport vehicle 110 being charged), or until a shutdown sequence is initiated (see Fig. 23 described herein).
• with the power distribution unit 444 monitoring the voltage output of the power supply 481, and with detection of a next higher startup threshold voltage V4a (Fig. 27, Block 2240), the power distribution unit 444 operates the switches 449S so as to power up/turn on the traction drive motors 261M (Fig. 27, Block 2245).
  • the threshold voltage V4a may be about 28V but in other aspects the threshold voltage V4a may be more or less than about 28V.
• where the startup threshold voltage V4a has not been reached, the power distribution unit 444 continues to monitor the power supply 481 output voltage until the next higher startup threshold voltage V4a is reached (such as with the autonomous transport vehicle 110 being charged), or until a shutdown sequence is initiated (see Fig. 23 described herein).
  • the power distribution unit 444 is configured (e.g., with any suitable non-transitory computer program code) to power up the components of the autonomous transport vehicle 110 in the manner/sequence described above with respect to Fig. 27.
  • the power distribution unit 444 is configured so that control devices are powered up before the devices they control.
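• the ascending startup ladder of Fig. 27 described above may be sketched as follows (Python; the example threshold values of about 16V, 18V, 24V, and 28V are taken from the text, while the group names and enable_branch interface are illustrative assumptions):

```python
import time

# Ascending startup thresholds (volts) paired with the groups enabled at each step.
STARTUP_LADDER = [
    (16.0, "controller_vision_controller_and_comms"),
    (18.0, "lighting"),
    (24.0, "case_handling_drives_and_24v_peripherals"),
    (28.0, "traction_drive_motors"),
]

def run_startup_monitor(read_supply_voltage, enable_branch, poll_s=0.1):
    """Enable each group once, in order, as the supply voltage rises through
    the group's startup threshold (control devices before controlled devices)."""
    pending = list(STARTUP_LADDER)
    while pending:
        voltage = read_supply_voltage()
        while pending and voltage >= pending[0][0]:
            _, group = pending.pop(0)
            enable_branch(group)
        time.sleep(poll_s)
```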
  • the controller 122 may be configured to effect one or more of onboard power supply charge mode, active control of inrush current to branch devices 483 (e.g., lower level system of the autonomous transport vehicle), and regenerative power supply 481 charging.
• with the autonomous transport vehicle 110 at a charging station (Fig. 28, Block 1300), the power distribution unit 444 detects the presence of the traverse surface charging pad(s) (see Fig. 21 and Fig. 28, Block 1310).
• the power distribution unit 444, as described herein, is configured to monitor the output voltage of the power supply 481 and effect control tasks based on the output voltage level.
  • control of power supply 481 charging is based on the output voltage of the power supply 481 detected by the power distribution unit 444.
  • the monitoring device 447 of the power distribution unit 444 is configured to control a low level charging logic of the autonomous transport vehicle 110.
• An exemplary charging logic block diagram for the power distribution unit 444 is illustrated in Fig. 21. As can be seen in Fig. 21, the autonomous transport vehicle 110 is configured with vehicle mounted charging contacts that receive charging current from a charging pad located on a traverse surface of the transfer deck 130B, picking aisle 130A, and/or any other suitable traverse surface of the storage and retrieval system on which the autonomous transport vehicle 110 travels.
• the traverse surface mounted charging pad and the vehicle mounted charging contacts are substantially similar to those described in United States patent number 9,469,208 titled “Rover Charging System” and issued on October 18, 2016; United States patent number 11,001,444 titled “Storage and Retrieval System Rover Interface” and issued on May 11, 2021; and United States patent application number 14/209,086 titled “Rover Charging System” and filed on March 13, 2014.
  • the autonomous transport vehicle 110 may also be configured with remote charging ports mounted to the front end 200E1 or rear end 200E2 of the frame 200 that engage (e.g., plug into) corresponding charge ports mounted to the storage structure 130 or a hand-held plug which an operator plugs into the remote charging ports of the autonomous transport vehicle 110.
  • the monitoring device 447 controls a charge mode/rate of the power supply 481 so as to maximize a number of charge cycles of the power supply 481.
  • the monitoring device 447 is configured to effect one or more of a trickle charge mode (e.g., having a charge rate below a set threshold voltage), a slow charge mode, and an ultra-high-speed (e.g., high current) charge mode, where the charging current is limited by the monitoring device 447 to a set maximum charge voltage threshold to substantially prevent adverse effects on the power supply 481 from charging.
  • the charging current and voltage may be dependent on a capacity of and type of the power supply 481.
  • the power supply 481 may have any suitable voltage and charge capacity and may be an ultra-capacitor or any other suitable power source (e.g., lithium ion battery pack, lead acid battery pack, etc.). As can also be seen in Fig. 6, the autonomous transport vehicle 110 includes suitable active reverse voltage protection for the power supply 481.
• where the power distribution unit 444 detects that the output voltage from the power supply 481 is below a threshold charging voltage V1c (Fig. 28, Block 1320) of about 23V (in other aspects the threshold charging voltage V1c may be more or less than 23V), the monitoring device 447 of the power distribution unit 444 effects a limited current charging of the power supply 481 (Fig. 28, Block 1330).
  • the limited charging current may be the slow charging mode described above.
• the slow charge mode described above may have a charge current higher than that of the trickle charging mode but lower than a full charge current.
• the power distribution unit 444 continues to monitor the output voltage of the power supply 481 during charging and with the detection of the output voltage of the power supply 481 being at or above the threshold charging voltage V1c (Fig. 28, Block 1320), the monitoring device 447 of the power distribution unit 444 effects another charging mode, such as the full charge current mode (Fig. 28, Block 1350).
• the power distribution unit 444 monitors the output voltage of the power supply 481 during charging at full charge current and where the output voltage is at or greater than a next higher threshold charging voltage V2c (Fig. 28, Block 1340) of about 44V (in other aspects the output voltage may be more or less than about 44V), the monitoring device 447 of the power distribution unit 444 terminates charging.
• the monitoring device 447 may effect the trickle charge mode so as to maintain the power supply 481 at peak/maximum charge with the vehicle charge contacts of the autonomous transport vehicle 110 engaged/coupled with the traverse surface charging pad(s) (see Fig. 21).
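• the charge-mode selection described above may be sketched as follows (Python; the example thresholds of about 23V and 44V are from the text, while the mode names are illustrative): below V1c a limited (slow) charge current is used, between V1c and V2c the full charge current is used, and at or above V2c charging is terminated or held at a trickle:

```python
# Hypothetical mode selection keyed to the power supply 481 output voltage.
def select_charge_mode(v_out, v1c=23.0, v2c=44.0):
    if v_out < v1c:
        return "limited_current"       # slow charge below threshold V1c
    if v_out < v2c:
        return "full_current"          # full charge current up to V2c
    return "trickle_or_terminate"      # hold at peak charge once V2c is reached
```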
  • the autonomous transport vehicle 110 includes one or more of current inrush protection, over voltage/current protection, and under voltage/current protection.
  • the autonomous transport vehicle 110 may include hot swap circuitry (substantially similar to that described in United States patent number 9,469,208 titled “Rover Charging System” and issued on October 18, 2016; United States patent number 11,001,444 titled “Storage and Retrieval System Rover Interface” and issued on May 11, 2021; and United States patent application number 14/209,086 titled “Rover Charging System” and filed on March 13, 2014) that is configured to effect autonomous transport vehicle 110 roll-on and roll-off of the traverse surface mounted charging pads regardless of an energization status of the traverse surface mounted charging pads.
  • the power distribution unit 444 is configured to actively control inrush current to the branch devices 483A-483F ... 483n (collectively referred to as branch devices 483, where n denotes an integer representing a maximum number of branch devices) of the respective branch power circuits 482, where the power distribution unit 444 receives from the controller 122 (and the controller 122 is configured to generate) a pulse width modulation signal that effects active control of the switches 449S to limit the inrush current (such as from charging or power surges) to the branch devices 483.
  • the power distribution unit 444 may operate one or more of the switches 449S so as to open the one or more switches to prevent inrush current from flowing to the branch devices 483.
  • one or more of the branch power circuits includes an electrical protection circuit 700 configured to protect the branch device 483 (a sensor is illustrated in Fig. 22 for exemplary purposes but in other aspects any suitable branch device, such as those described herein, may be provided).
  • the electrical protection circuit 700 is configured to substantially protect the branch device 483 (and any controls/measurement instruments devices associated therewith) from, for example, short circuits, over-voltage, and over-current.
  • the electrical protection circuit 700 for exemplary purposes only, includes an adjustable three-terminal positive-voltage regulator 710 and a single resistor 720.
  • the voltage regulator 710 is configured to supply more than about 1.5 A over an output-voltage range of about 1.25 V to about 37 V.
  • the voltage regulator 710 with the resistor 720 coupled thereto limits the current to about 27 mA by leveraging the internal reference voltage of the voltage regulator 710.
• the insertion of the electrical protection circuit 700 into the branch power circuit 482 substantially does not affect the about 4 mA to about 20 mA signal while providing control/measurement protection to devices disposed both upstream and downstream (with respect to the flow of current) of the electrical protection circuit 700.
• the configuration of the electrical protection circuit 700 is exemplary only and the electrical protection circuit 700 may be configured with any suitable voltage regulator and resistor (having suitable specifications) for providing control/measurement protection for signals that are less than about 4 mA or more than about 20 mA.
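• as an illustrative check (assuming the internal reference voltage of the adjustable regulator 710 is about 1.25 V, which is consistent with the about 1.25 V lower end of its output-voltage range noted above but is an assumption here), the current limit follows from I_limit = V_ref / R_720, so a resistor 720 of roughly 1.25 V / 0.027 A ≈ 46 Ω yields the about 27 mA limit described above; selecting R_720 by the same relation sets limits outside the about 4 mA to about 20 mA signal band.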
  • the power distribution unit 444 is configured to effect regenerative charging of the power supply 481.
  • the back electromotive force (EMF) voltage produced by the respective motors 261M is fed back into the respective branch power circuit 483E, 483F.
  • the monitoring device 447 may operate the switches 449S (such as the Vcap_IN switch - see Fig. 21) so that the back EMF voltage (and current) regeneratively charges the power supply 481. With the motors 261M under power to drive the drive wheels 260A, 260B the monitoring device 447 may close the Vcap_IN switch to prevent power drain from the power supply 481.
  • the power distribution unit 444 includes the wireless communication module 445.
  • the wireless communication module 445 may be configured for any suitable wireless communication including, but not limited to, Wi-Fi, Bluetooth, cellular, etc.
• the wireless communication module 445 configures the power distribution unit 444 so as to control, at least in part, for example, communication between the autonomous transport vehicle 110 and other features of the storage and retrieval system including but not limited to the control server 120 over any suitable network such as network 180.
  • the wireless communication module 445 and monitoring device 447 configure the power distribution unit 444 as a secondary processor/controller such as where processing function errors of the controller 122 (e.g., such as safety related functions including remote shutdown, communications or other general component errors) are detected by the monitoring device 447.
• the power distribution unit 444 maintains (secondary) communication between the control server 120 (and operators of the storage and retrieval system 100) and the different components of the autonomous transport vehicle 110 (e.g., through the communication module 445) so that the autonomous transport vehicle 110 can be remotely shut down or driven (either autonomously, semi-autonomously, or under manual remote control of an operator in a manner described herein) to any suitable destination location.
• the wireless communication module 445 also provides for “over the air” programming of the controller 122, vision system controller 122VC and updating firmware/programming of the monitoring device 447 or other suitable programmable devices (e.g., FPGAs, CPLDs, SOCs, CPUs, etc.) of the autonomous transport vehicle 110.
  • the power distribution unit 444 includes any suitable memory 446 that may buffer the software updates for installation in the monitoring device 447, controller 122, vision system controller 122VC and/or other suitable programmable devices (e.g., FPGAs, CPLDs, SOCs, CPUs, etc.).
• the wireless communication module 445 of the power distribution unit 444 may also be configured as an Ethernet switch or bridge.
• the wireless communication modules 445 of the autonomous transport vehicles 110 travelling throughout the storage structure 130 may form a mesh network.
• wireless communications from, for example, the control server 120 or other suitable device such as a laptop, smart phone/tablet, etc. may be extended to a range that covers substantially an entirety of the storage structure 130 without dedicated Ethernet switches and bridges being disposed throughout (e.g., mounted to) the storage structure 130 in fixed/predetermined locations.
• the method includes providing the autonomous transport vehicle 110 as described herein (Fig. 24, Block 24900). Autonomous operation of the autonomous transport vehicle 110 is effected with the controller 122 (Fig. 24, Block 24910) and a charge level of the power supply 481 of the autonomous transport vehicle 110 is monitored by the power distribution unit 444 (Fig. 24, Block 24920) as described herein.
  • the method may also include, as described herein, the switching of the branch power circuits 482 on and off in the predetermined pattern (such as described herein) based on the demand charge level of each respective branch power circuit 482 with respect to other branch power circuits 482 and the charge level available from the power supply 481 (Fig. 24, Block 24930).
• upon indication from the power distribution unit 444 of imminent decrease in available power supply charge level, directed from the power supply 481 to the branch circuit 482 of the drive section 261D (see Fig. 21) and/or the case handling assembly 210, the controller 122 commands the drive section 261D to move the autonomous transport vehicle 110 to a safe location and/or commands the case handling assembly 210 to move the payload to a safe location (Fig. 24, Block 24960) as described herein.
• upon indication from the power distribution unit 444 of imminent decrease in available power supply charge level, directed from the power supply 481 to the controller 122, to less than a demand level of the controller 122, the controller 122 creates at least one initialization file (Fig. 24, Block 24940).
  • the controller 122 may configure at least one of the autonomous guided vehicle state and pose navigation information and the payload identity, state, and pose information, held in respective registry and memory (e.g., such as memory 446 or other memories 122M of corresponding ones of the autonomous navigation control section 122N, the autonomous payload handling control section 122H, and the vision system control section (e.g., vision system controller 122VC)) of corresponding controller sections, into an initialization file 122F available on reboot of the controller 122.
• the controller 122 may store health status information from the at least one of the vehicle health status monitor 447V, the drive section health status monitor 447D, the payload handling section health monitor 447H, and the peripheral electronics section health monitor 447P, held in the health status register section 447M, into the initialization file 122F (or a different initialization file) available on reboot of the controller 122.
• an autonomous guided vehicle comprises:
• a frame with a payload hold;
  • a drive section coupled to the frame with drive wheels supporting the autonomous guided vehicle on a traverse surface, the drive wheels effect vehicle traverse on the traverse surface moving the autonomous guided vehicle over the traverse surface in a facility;
  • a payload handler coupled to the frame configured to transfer a payload, with a flat undeterministic seating surface seated in the payload hold, to and from the payload hold of the autonomous guided vehicle and a storage location, of the payload, in a storage array;
  • a physical characteristic sensor system connected to the frame having electro-magnetic sensors, each responsive to interaction or interface of a sensor emitted or generated electromagnetic beam or field with a physical characteristic, the electromagnetic beam or field being disturbed by interaction or interface with the physical characteristic, and which disturbance is detected by and effects sensing by the electro-magnetic sensor of the physical characteristic, wherein the physical characteristic sensor system is configured to generate sensor data embodying at least one of a vehicle navigation pose or location information and payload pose or location information; and
  • a supplemental sensor system connected to the frame, that supplements the physical characteristic sensor system, the supplemental sensor system being, at least in part, a vision system with cameras disposed to capture image data informing the at least one of a vehicle navigation pose or location and payload pose or location supplement to the information of the physical characteristic sensor system.
  • the autonomous guided vehicle further comprises a controller connected to the frame, operably connected to the drive section or the payload handler, and communicably connected to the physical characteristic sensor system, wherein the controller is configured to determine from the information of the physical characteristic sensor system vehicle pose and location effecting independent guidance of the autonomous guided vehicle traversing the facility.
  • the controller is configured to determine from the information of the physical characteristic sensor system payload pose and location effecting independent underpick and place of the payload to and from the storage location and independent underpick and place of the payload in the payload hold.
  • the controller is programmed with a reference representation of predetermined features defining at least part of the facility traversed through by the autonomous guided vehicle.
  • the controller is configured to register the captured image data and generate therefrom at least one image of one or more features of the predetermined features, the at least one image being formatted as a virtual representation of the one or more predetermined features so as to provide comparison to one or more corresponding reference of the predetermined features of the reference representation.
  • the controller is configured so that the virtual representation, of the imaged one or more features of the predetermined features, is effected resident on the autonomous guided vehicle, and comparison between the virtual representation of the one or more imaged predetermined features and the one or more corresponding reference predetermined features is effected resident on the autonomous guided vehicle.
  • the controller is configured to confirm autonomous guided vehicle pose and location information registered by the controller from the physical characteristic sensor system based on the comparison between the virtual representation and the reference representation.
  • the controller is configured to identify a variance in the autonomous guided vehicle pose and location based on the comparison between the virtual representation and the reference representation, and update or complete autonomous guided vehicle pose or location information from the physical characteristic sensor system based on the variance.
  • the controller is configured to determine a pose error in the information from the physical characteristic sensor system and fidelity of the autonomous guided vehicle pose and location information from the physical characteristic sensor system based on at least one of the identified variance and analysis of the at least one image, and assign a confidence value according to at least one of the pose error and the fidelity.
  • the controller is configured so that with the confidence value below a predetermined threshold, the controller switches autonomous guided vehicle navigation based on pose and location information generated from the virtual representation in place of pose and location information from the physical characteristic sensor system.
• the controller is configured to: continue autonomous guided vehicle navigation to destination, or initiate communication to an operator identifying autonomous guided vehicle kinematic data and a destination for operator selection of autonomous guided vehicle control from automatic operation to quasi automatic operation or manual operation via a user interface device.
  • the controller is configured to confirm payload pose and location information registered by the controller from the physical characteristic sensor system based on the comparison between the virtual representation and the reference representation.
  • the controller is configured to identify a variance in the payload pose and location based on the comparison between the virtual representation and the reference representation, and update or complete payload pose or location information from the physical characteristic sensor system based on the variance.
  • the controller is configured to determine a pose error in the information from the physical characteristic sensor system and fidelity of the payload pose and location information from the physical characteristic sensor system based on at least one of the identified variance and analysis of the at least one image, and assign a confidence value according to at least one of the pose error and the fidelity.
  • the controller is configured so that with the confidence value below a predetermined threshold, the controller switches autonomous guided vehicle payload handling based on pose and location information generated from the virtual representation in place of pose and location information from the physical characteristic sensor system.
• the controller is configured to: continue autonomous guided vehicle handling to destination, or initiate communication to an operator identifying payload data along with an operator selection of autonomous guided vehicle control from automatic payload handling operation to quasi automatic payload handling operation or manual payload handling operation via a user interface device.
  • the controller is configured to transmit, via a wireless communication system communicably coupling the controller and an operator interface, a simulation image combining the virtual representation of the one or more imaged predetermined features and one or more corresponding reference predetermined features of a reference presentation presenting the operator with an augmented reality image in real time.
  • the controller is configured to receive real time operator commands to the traversing autonomous guided vehicle, which commands are responsive to the real time augmented reality image, and changes in the real time augmented reality image transmitted to the operator by the controller.
  • the supplemental sensor system at least in part effects on-the-fly justification and/or sortation of case units onboard the autonomous guided vehicle.
  • imaged or viewed objects described by one or more of supplemental information, supplemental vehicle navigation pose or location, and supplemental payload pose or location, from the supplemental sensor system are coapted to a reference model of one or more of surrounding facility features and interfacing facility features so as to enhance, via the one or more of the supplemental information, the supplemental vehicle navigation pose or location, and the supplemental payload pose or location resolution of one or more of the vehicle navigation pose or location information and the payload pose or location information.
  • an autonomous guided vehicle comprises:
  • a drive section coupled to the frame with drive wheels supporting the vehicle on a traverse surface, the drive wheels effect vehicle traverse on the traverse surface moving the autonomous guided vehicle over the traverse surface in a facility;
  • a payload handler coupled to the frame configured to transfer a payload, with a flat undeterministic seating surface seated in the payload hold, to and from the payload hold of the autonomous guided vehicle and a storage location, of the payload, in a storage array;
  • a physical characteristic sensor system connected to the frame having electro-magnetic sensors, each responsive to interaction or interface of a sensor emitted or generated electromagnetic beam or field with a physical characteristic, the electromagnetic beam or field being disturbed by interaction or interface with the physical characteristic, and which disturbance is detected by and effects sensing by the electro-magnetic sensor of the physical characteristic, wherein the physical characteristic sensor system is configured to generate sensor data embodying at least one of a vehicle navigation pose or location information and payload pose or location information; and
  • an auxiliary sensor system connected to the frame, that is separate and distinct from the physical characteristic sensor system, the auxiliary sensor system being, at least in part, a vision system with cameras disposed to capture image data informing the at least one of a vehicle navigation pose or location and payload pose or location which image data is auxiliary information to the information of the physical characteristic sensor system.
  • the autonomous guided vehicle further comprises a controller connected to the frame, operably connected to the drive section or the payload handler, and communicably connected to the physical characteristic sensor system, wherein the controller is configured to determine from the information of the physical characteristic sensor system vehicle pose and location effecting independent guidance of the autonomous guided vehicle traversing the facility.
  • the controller is configured to determine from the information of the physical characteristic sensor system payload pose and location effecting independent underpick and place of the payload to and from the storage location and independent underpick and place of the payload in the payload hold.
  • the controller is programmed with a reference representation of predetermined features defining at least part of the facility traversed through by the autonomous guided vehicle.
  • the controller is configured to register the captured image data and generate therefrom at least one image of one or more features of the predetermined features, the at least one image being formatted as a virtual representation of the one or more predetermined features so as to provide comparison to one or more corresponding reference of the predetermined features of the reference representation.
  • the controller is configured so that the virtual representation, of the imaged one or more features of the predetermined features, is effected resident on the autonomous guided vehicle, and comparison between the virtual representation of the one or more imaged predetermined features and the one or more corresponding reference predetermined features is effected resident on the autonomous guided vehicle.
  • the controller is configured to confirm autonomous guided vehicle pose and location information registered by the controller from the physical characteristic sensor system based on the comparison between the virtual representation and the reference representation.
  • the controller is configured to identify a variance in the autonomous guided vehicle pose and location based on the comparison between the virtual representation and the reference representation, and update or complete autonomous guided vehicle pose or location information from the physical characteristic sensor system based on the variance.
  • the controller is configured to determine a pose error in the information from the physical characteristic sensor system and fidelity of the autonomous guided vehicle pose and location information from the physical characteristic sensor system based on at least one of the identified variance and analysis of the at least one image, and assign a confidence value according to at least one of the pose error and the fidelity.
  • the controller is configured so that with the confidence value below a predetermined threshold, the controller switches autonomous guided vehicle navigation based on pose and location information generated from the virtual representation in place of pose and location information from the physical characteristic sensor system.
  • the controller is configured to: initiate communication to an operator identifying autonomous guided vehicle kinematic data and a destination for operator selection of autonomous guided vehicle control from automatic operation to quasi automatic operation or manual operation via a user interface device.
  • the controller is configured to confirm payload pose and location information registered by the controller from the physical characteristic sensor system based on the comparison between the virtual representation and the reference representation.
  • the controller is configured to identify a variance in the payload pose and location based on the comparison between the virtual representation and the reference representation, and update or complete payload pose or location information from the physical characteristic sensor system based on the variance.
  • the controller is configured to determine a pose error in the information from the physical characteristic sensor system and fidelity of the payload pose and location information from the physical characteristic sensor system based on at least one of the identified variance and analysis of the at least one image, and assign a confidence value according to at least one of the pose error and the fidelity.
  • the controller is configured so that with the confidence value below a predetermined threshold, the controller switches autonomous guided vehicle payload handling based on pose and location information generated from the virtual representation in place of pose and location information from the physical characteristic sensor system.
  • the controller is configured to: continue autonomous guided vehicle handling to destination, or initiate communication to an operator identifying payload data along with an operator selection of autonomous guided vehicle control from automatic payload handling operation to quasi automatic payload handling operation or manual payload handling operation via a user interface device.
  • the controller is configured to transmit, via a wireless communication system communicably coupling the controller and an operator interface, a simulation image combining the virtual representation of the one or more imaged predetermined features and one or more corresponding reference predetermined features of a reference presentation presenting the operator with an augmented reality image in real time.
  • the controller is configured to receive real time operator commands to the traversing autonomous guided vehicle, which commands are responsive to the real time augmented reality image, and changes in the real time augmented reality image transmitted to the operator by the controller.
  • the supplemental sensor system at least in part effects on-the-fly justification and/or sortation of case units onboard the autonomous guided vehicle.
  • imaged or viewed objects described by one or more of supplemental information, supplemental vehicle navigation pose or location, and supplemental payload pose or location, from the auxiliary sensor system are coapted to a reference model of one or more of surrounding facility features and interfacing facility features so as to enhance, via the one or more of the supplemental information, the supplemental vehicle navigation pose or location, and the supplemental payload pose or location resolution of one or more of the vehicle navigation pose or location information and the payload pose or location information.
  • a method comprises:
  • a drive section coupled to the frame with drive wheels supporting the autonomous guided vehicle on a traverse surface, the drive wheels effect vehicle traverse on the traverse surface moving the autonomous guided vehicle over the traverse surface in a facility, and
  • a payload handler coupled to the frame configured to transfer a payload, with a flat undeterministic seating surface seated in the payload hold, to and from the payload hold of the autonomous guided vehicle and a storage location, of the payload, in a storage array;
  • generating sensor data with a physical characteristic sensor system, the sensor data embodying at least one of a vehicle navigation pose or location information and payload pose or location information, wherein the physical characteristic sensor system is connected to the frame and has electro-magnetic sensors, each responsive to interaction or interface of a sensor emitted or generated electro-magnetic beam or field with a physical characteristic, the electro-magnetic beam or field being disturbed by interaction or interface with the physical characteristic, and which disturbance is detected by and effects sensing by the electro-magnetic sensor of the physical characteristic; and
  • capturing image data with a supplemental sensor system, the image data informing the at least one of a vehicle navigation pose or location and payload pose or location supplement to the information of the physical characteristic sensor system, wherein the supplemental sensor system is connected to the frame and supplements the physical characteristic sensor system, the supplemental sensor system being, at least in part, a vision system with cameras disposed to capture the image data.
  • the method further comprises determining, with a controller, from the information of the physical characteristic sensor system vehicle pose and location effecting independent guidance of the autonomous guided vehicle traversing the facility, wherein the controller is connected to the frame and operably connected to the drive section or the payload handler, and communicably connected to the physical characteristic sensor system.
  • the method further comprises, with the controller, determining from the information of the physical characteristic sensor system payload pose and location effecting independent underpick and place of the payload to and from the storage location and independent underpick and place of the payload in the payload hold.
  • the controller is programmed with a reference representation of predetermined features defining at least part of the facility traversed through by the autonomous guided vehicle.
  • the method further comprises, with the controller, registering the captured image data and generating therefrom at least one image of one or more features of the predetermined features, the at least one image being formatted as a virtual representation of the one or more predetermined features so as to provide comparison to one or more corresponding reference of the predetermined features of the reference representation.
  • the controller is configured so that the virtual representation, of the imaged one or more features of the predetermined features, is effected resident on the autonomous guided vehicle, and comparison between the virtual representation of the one or more imaged predetermined features and the one or more corresponding reference predetermined features is effected resident on the autonomous guided vehicle.
  • the method further comprises, with the controller, confirming autonomous guided vehicle pose and location information registered by the controller from the physical characteristic sensor system based on the comparison between the virtual representation and the reference representation.
  • the method further comprises, with the controller, identifying a variance in the autonomous guided vehicle pose and location based on the comparison between the virtual representation and the reference representation, and updating or completing autonomous guided vehicle pose or location information from the physical characteristic sensor system based on the variance.
  • the controller determines a pose error in the information from the physical characteristic sensor system and fidelity of the autonomous guided vehicle pose and location information from the physical characteristic sensor system based on at least one of the identified variance and analysis of the at least one image, and assigns a confidence value according to at least one of the pose error and the fidelity.
  • the controller switches autonomous guided vehicle navigation based on pose and location information generated from the virtual representation in place of pose and location information from the physical characteristic sensor system.
  • the controller is configured to: continue autonomous guided vehicle navigation to destination or select an autonomous guided vehicle safe path and trajectory bringing the autonomous guided vehicle from a position at switching to a safe location for shut down, or
  • initiate communication to an operator identifying autonomous guided vehicle kinematic data and a destination for operator selection of autonomous guided vehicle control from automatic operation to quasi automatic operation or manual operation via a user interface device.
  • the controller confirms payload pose and location information registered by the controller from the physical characteristic sensor system based on the comparison between the virtual representation and the reference representation.
  • the controller identifies a variance in the payload pose and location based on the comparison between the virtual representation and the reference representation, and updates or completes payload pose or location information from the physical characteristic sensor system based on the variance.
  • the controller determines a pose error in the information from the physical characteristic sensor system and fidelity of the payload pose and location information from the physical characteristic sensor system based on at least one of the identified variance and analysis of the at least one image, and assigns a confidence value according to at least one of the pose error and the fidelity.
  • the controller switches autonomous guided vehicle payload handling based on pose and location information generated from the virtual representation in place of pose and location information from the physical characteristic sensor system.
  • the controller is configured to: initiate communication to an operator identifying payload data along with an operator selection of autonomous guided vehicle control from automatic payload handling operation to quasi automatic payload handling operation or manual payload handling operation via a user interface device.
  • the controller transmits, via a wireless communication system communicably coupling the controller and an operator interface, a simulation image combining the virtual representation of the one or more imaged predetermined features and one or more corresponding reference predetermined features of a reference presentation presenting the operator with an augmented reality image in real time.
  • the controller receives real time operator commands to the traversing autonomous guided vehicle, which commands are responsive to the real time augmented reality image, and changes in the real time augmented reality image transmitted to the operator by the controller.
  • the controller effects, with at least the supplemental sensor system, justification and/or sortation of case units onboard the autonomous guided vehicle.
  • imaged or viewed objects described by one or more of supplemental information, supplemental vehicle navigation pose or location, and supplemental payload pose or location, from the supplemental sensor system are coapted to a reference model of one or more of surrounding facility features and interfacing facility features so as to enhance, via the one or more of the supplemental information, the supplemental vehicle navigation pose or location, and the supplemental payload pose or location resolution of one or more of the vehicle navigation pose or location information and the payload pose or location information.
  • an autonomous guided vehicle comprises:
  • a frame with a payload hold;
  • a drive section coupled to the frame with drive wheels supporting the vehicle on a traverse surface, the drive wheels effect vehicle traverse on the traverse surface moving the vehicle over the traverse surface in a facility;
  • a payload handler coupled to the frame configured to transfer a payload to and from the payload hold of the vehicle and a storage location, of the payload, in a storage array;
  • a supplemental sensor system connected to the frame for collaboration of the vehicle and an operator, wherein the supplemental sensor system supplements a vehicle autonomous navigation/operation sensor system configured to at least collect sensory data embodying vehicle pose and location information for auto navigation by the vehicle of the facility,
  • the supplemental sensor system is, at least in part, a vision system with at least one camera disposed to capture image data informing objects and/or spatial features within at least a portion of the facility viewed by the at least one camera with the vehicle in different positions in the facility; and
  • a controller connected to the frame and communicably coupled to the supplemental sensor system so as to register the information from the image data of the at least one camera, and the controller is configured to determine, from the information, presence of a predetermined physical characteristic of at least one object or spatial feature, and in response thereto, selectably reconfigure the vehicle from an autonomous state to a collaborative vehicle state disposed to receive operator commands for the vehicle to continue effecting vehicle operation.
  • the predetermined physical characteristic is that the at least one object or spatial feature extends across at least part of, the traverse surface, a vehicle traverse path across the traverse surface or through space of the vehicle or another different vehicle traversing the traverse surface.
  • the controller is programmed with a reference representation of predetermined features defining at least in part the facility traversed through by the vehicle.
  • the controller is configured to register the captured image data and generate therefrom at least one image of the at least one object or spatial feature showing the predetermined physical characteristic.
  • the at least one image is formatted as a virtual representation of the at least one object or spatial feature so as to provide comparison to one or more reference features of the predetermined features of the reference representation.
  • the controller is configured to identify the presence of the predetermined physical characteristic of the object or spatial feature based on the comparison between the virtual representation and the reference representation, determine a dimension of the predetermined physical characteristic and command the vehicle to stop in a predetermined trajectory based on a position of the object or spatial features determined from the comparison.
  • a stop position in the predetermined trajectory maintains the object or spatial reference within a field of view of the at least one camera and continued imaging of the predetermined physical characteristic, and initiates a signal to at least another vehicle of one or more of a traffic obstacle, an area to avoid, or a detour area.
  • the predetermined physical characteristic is determined by the controller by determining a position of the object within a reference frame of the at least one camera, that is calibrated and has a predetermined relationship to the vehicle, and determining, from the object pose in the reference frame of the at least one camera, presence of the predetermined physical characteristic of the object.
  • the controller is configured such that, upon identification of presence and switch from the autonomous state to the collaborative vehicle state, the controller initiates transmission communicating the image, and identification of presence of the predetermined physical characteristic, to an operator interface for operator collaboration operation of the vehicle.
  • the controller is configured to apply a trajectory to the autonomous guided vehicle that brings the autonomous guided vehicle to a zero velocity within a predetermined time period where motion of the autonomous guided vehicle along the trajectory is coordinated with location of the objects and/or spatial features.
  • the capture of image data informing objects and/or spatial features is opportunistic during transfer of a payload to/from the payload hold of the vehicle or a storage location in a storage array.
  • the controller is programmed to command the vehicle to the different positions in the facility associated with the vehicle effecting one or more predetermined payload autonomous transfer tasks, wherein each of the one or more predetermined payload autonomous transfer tasks is a separate and distinct task from the capture image data viewed by the at least one camera in the different positions.
  • the controller is configured so that determination of presence of the predetermined physical characteristic of the at least one object or spatial feature is, coincident at least in part with, but supplemental and peripheral to vehicle actions effecting each of the one or more predetermined payload auto transfer tasks.
  • the controller is configured so that determination of presence of the predetermined physical characteristic of the at least one object or spatial feature is, opportunistic to vehicle actions effecting each of the one or more predetermined payload auto transfer tasks.
  • At least one of the one or more predetermined payload auto transfer tasks is effected at at least one of the different positions.
  • the collaborative vehicle state is supplemental to the autonomous state of the vehicle effecting each of the one or more predetermined payload auto transfer tasks.
  • a method comprises:
  • a drive section coupled to the frame with drive wheels supporting the vehicle on a traverse surface, the drive wheels effect vehicle traverse on the traverse surface moving the vehicle over the traverse surface in a facility;
  • a payload handler coupled to the frame configured to transfer a payload to and from the payload hold of the vehicle and a storage location, of the payload, in a storage array;
  • the predetermined physical characteristic is that the at least one object or spatial feature extends across at least part of, the traverse surface, a vehicle traverse path across the traverse surface or through space of the vehicle or another different vehicle traversing the traverse surface.
  • the controller is programmed with a reference representation of predetermined features defining at least in part the facility traversed through by the vehicle.
  • the method further comprises generating, from the registered captured image data, at least one image of the at least one object or spatial feature showing the predetermined physical characteristic.
  • the at least one image is formatted as a virtual representation of the at least one object or spatial feature, the method further comprising comparing the virtual representation to one or more reference features of the predetermined features of the reference representation.
  • the method further comprises identifying, with the controller, the presence of the predetermined physical characteristic of the object or spatial feature based on the comparison between the virtual representation and the reference representation, determining a dimension of the predetermined physical characteristic, and commanding the vehicle to stop in a predetermined trajectory based on a position of the object or spatial features determined from the comparison.
  • the method further comprises, with the vehicle in a stop position in the predetermined trajectory, maintaining the object or spatial reference within a field of view of the at least one camera and continued imaging of the predetermined physical characteristic, and initiating a signal to at least another vehicle of one or more of a traffic obstacle, an area to avoid, or a detour area.
  • the predetermined physical characteristic is determined by the controller by determining a position of the object within a reference frame of the at least one camera, that is calibrated and has a predetermined relationship to the vehicle, and determining, from the object pose in the reference frame of the at least one camera, presence of the predetermined physical characteristic of the object.
  • the controller is configured such that identification of presence of the predetermined physical characteristic of the at least one object or spatial feature, and switch from the autonomous state to the collaborative vehicle state, initiates transmission communicating the image, and identification of presence of the predetermined physical characteristic, to an operator interface for operator collaboration operation of the vehicle.
  • the method further comprises applying, with the controller, a trajectory to the autonomous guided vehicle bringing the autonomous guided vehicle to a zero velocity within a predetermined time period, where motion of the autonomous guided vehicle along the trajectory is coordinated with a location of the objects and/or spatial features.
  • the capture of image data informing objects and/or spatial features is opportunistic during transfer of a payload to/from the payload hold of the vehicle or a storage location in a storage array.
  • an autonomous guided vehicle comprises:
  • a payload handling section with at least one payload handling actuator configured so that actuation of the at least one payload handling actuator effects transfer of a payload to and from a payload bed, of the vehicle chassis, and a storage in the facility;
  • a peripheral electronics section having at least one of an autonomous pose and navigation sensor, at least one of a payload handling sensor, and at least one peripheral motor, the at least one peripheral motor being separate and distinct from each of the motors of the drive section and each actuator of the payload handling section;
  • a controller communicably coupled respectively to the drive section, the payload handling section, and the peripheral electronics section so as to effect each autonomous operation of the autonomous guided vehicle, wherein the controller comprises a comprehensive power management section communicably connected to the power supply so as to monitor a charge level of the power supply, and
  • the comprehensive power management section is connected to each respective branch circuit of the drive section, the payload handling section, and the peripheral electronics section respectively powering the drive section, the payload handling section, and the peripheral electronics section from the power supply, the comprehensive power management section being configured to manage power consumption of the branch circuits based on a demand level of each branch circuit relative to the charge level available from the power supply.
  • the comprehensive power management section is configured so as to manage a demand charge level of each respective branch circuit switching each respective branch circuit on or off in a predetermined pattern based on the demand charge level of each respective branch circuit with respect to other branch circuits and the charge level available from the power supply.
  • the predetermined pattern is arranged to switch off branch circuits with a decrease in the available charge level from the power supply, so as to maximize available charge level from the power supply directed to the controller.
  • the predetermined pattern is arranged to switch off branch circuits with a decrease in the available charge level from the power supply so that the available charge level directed to the controller is equal to or exceeds the demand charge level of the controller for a maximum time based on the available charge level of the power supply.
  • the controller has at least one of:
  • an autonomous navigation control section configured to register and hold in volatile memory autonomous guided vehicle state and pose navigation information, historic and current, that is deterministic of and describing current and predicted state, pose, and location of the autonomous guided vehicle; and
  • an autonomous payload handling control section configured to register and hold in volatile memory current payload identity, state, and pose information, historic and current;
  • the controller is configured so that upon indication from the comprehensive power management section of imminent decrease in available charge level, directed from the power supply to the controller, to less than demand level of the controller, the controller configures at least one of the autonomous guided vehicle state and pose navigation information and the payload identity, state, and pose information, held in respective registry and memory of corresponding controller sections, into an initialization file available on reboot of the controller (a non-limiting illustrative sketch of such state preservation follows this listing).
  • the controller is configured so that upon indication from the comprehensive power management section of imminent decrease in available charge level, directed from the power supply to the controller, to less than demand level of the controller, the controller enters suspension of operation and hibernation.
  • the controller is configured so that upon indication from the comprehensive power management section of imminent decrease in available charge level, directed from the power supply to the branch circuit of the drive section, the controller is configured to command the drive section so as to navigate the autonomous guided vehicle along a predetermined auxiliary path and auxiliary trajectory to a predetermined autonomous guided vehicle auxiliary stop location in the facility.
  • the controller is configured so that upon indication from the comprehensive power management section of imminent decrease in available charge level, directed from the power supply to the branch circuit of the payload handling section, the controller is configured to command the payload handling section to move the payload handling actuator, and any payload thereon, to a predetermined safe payload position in the payload bed.
  • the controller includes at least one of:
  • the controller is configured so that upon indication from the comprehensive power management section of imminent decrease in available charge level, directed from the power supply to the controller, to less than demand level of the controller, the controller configures stored health status information from the at least one of the vehicle health status monitor, the drive section health status monitor, the payload handling section health monitor, and the peripheral electronics section health monitor in the health status register section into an initialization file available on reboot of the controller.
  • the power supply is an ultra-capacitor, or the charge level is a voltage level.
  • the method comprises:
  • an autonomous guided vehicle with a vehicle chassis with a power supply mounted thereon and powered sections connected to the chassis and each powered by the power supply, the powered sections including:
  • a payload handling section with at least one payload handling actuator configured so that actuation of the at least one payload handling actuator effects transfer of a payload to and from a payload bed, of the vehicle chassis, and a storage in the facility;
  • a peripheral electronics section having at least one of an autonomous pose and navigation sensor, at least one of a payload handling sensor, and at least one peripheral motor, the at least one peripheral motor being separate and distinct from each of the motors of the drive section and each actuator of the payload handling section;
  • the comprehensive power management section manages a demand charge level of each respective branch circuit switching each respective branch circuit on or off in a predetermined pattern based on the demand charge level of each respective branch circuit with respect to other branch circuits and the charge level available from the power supply.
  • the predetermined pattern is arranged to switch off branch circuits with a decrease in the available charge level from the power supply, so as to maximize available charge level from the power supply directed to the controller.
  • the predetermined pattern is arranged to switch off branch circuits with a decrease in the available charge level from the power supply so that the available charge level directed to the controller is equal to or exceeds the demand charge level of the controller for a maximum time based on the available charge level of the power supply.
  • the method further comprises at least one of:
  • upon indication from the comprehensive power management section of imminent decrease in available charge level, directed from the power supply to the controller, to less than demand level of the controller, the controller configures at least one of the autonomous guided vehicle state and pose navigation information and the payload identity, state, and pose information, held in respective registry and memory of corresponding controller sections, into an initialization file available on reboot of the controller.
  • upon indication from the comprehensive power management section of imminent decrease in available charge level, directed from the power supply to the branch circuit of the drive section, the controller commands the drive section so as to navigate the autonomous guided vehicle along a predetermined auxiliary path and auxiliary trajectory to a predetermined autonomous guided vehicle auxiliary stop location in the facility.
  • upon indication from the comprehensive power management section of imminent decrease in available charge level, directed from the power supply to the branch circuit of the payload handling section, the controller commands the payload handling section to move the payload handling actuator, and any payload thereon, to a predetermined safe payload position in the payload bed.
  • upon indication from the comprehensive power management section of imminent decrease in available charge level, directed from the power supply to the controller, to less than demand level of the controller, the controller configures stored health status information from the at least one of the vehicle health status monitor, the drive section health status monitor, the payload handling section health monitor, and the peripheral electronics section health monitor in the health status register section into an initialization file available on reboot of the controller.
  • the power supply is an ultra-capacitor, or the charge level is a voltage level.
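Purely by way of a non-limiting illustrative sketch, and not as a description of any actual implementation of the disclosed embodiment, the state preservation behavior recited above (configuring volatile vehicle state, pose, and payload information into an initialization file on indication of an imminent decrease in available charge level, and rebuilding that state on reboot) could be approximated as follows; the file location, class name, and fields are assumptions introduced only for illustration:

```python
import json
from dataclasses import dataclass, asdict, field
from pathlib import Path

INIT_FILE = Path("/var/bot/init_state.json")   # hypothetical location

@dataclass
class ControllerState:
    vehicle_pose: tuple = (0.0, 0.0, 0.0)       # current x, y, heading
    predicted_pose: tuple = (0.0, 0.0, 0.0)     # predicted pose at shutdown
    payload_id: str = ""                        # identity of payload onboard
    payload_pose: tuple = (0.0, 0.0, 0.0)       # payload pose in the payload bed
    health_status: dict = field(default_factory=dict)

def on_imminent_power_loss(state: ControllerState) -> None:
    """Configure volatile registry/memory contents into an initialization
    file, after which the controller would suspend operation (hibernate)."""
    INIT_FILE.parent.mkdir(parents=True, exist_ok=True)
    INIT_FILE.write_text(json.dumps(asdict(state)))

def on_reboot() -> ControllerState:
    """Rebuild the controller state from the initialization file, if present;
    otherwise start from defaults (cold start)."""
    if INIT_FILE.exists():
        return ControllerState(**json.loads(INIT_FILE.read_text()))
    return ControllerState()
```

In such a sketch, the initialization file restored on reboot stands in for the registry and memory contents of the corresponding controller sections.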

Landscapes

  • Engineering & Computer Science (AREA)
  • Transportation (AREA)
  • Structural Engineering (AREA)
  • Mechanical Engineering (AREA)
  • Geology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Civil Engineering (AREA)
  • Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Combustion & Propulsion (AREA)
  • Chemical & Material Sciences (AREA)
  • Electromagnetism (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

An autonomous guided vehicle includes a frame, a drive section, a payload handler, a sensor system, and a supplemental sensor system. The sensor system has electro-magnetic sensors, each responsive to interaction or interface of a sensor emitted or generated electro-magnetic beam or field with a physical characteristic, the electro-magnetic beam or field being disturbed by interaction or interface with the physical characteristic, and which disturbance is detected by and effects sensing of the physical characteristic. The sensor system generates sensor data embodying at least one of a vehicle navigation pose or location information and payload pose or location information. The supplemental sensor system supplements the sensor system, and is, at least in part, a vision system with cameras disposed to capture image data informing the at least one of a vehicle navigation pose or location and payload pose or location supplement to the information of the sensor system.

Description

AUTONOMOUS TRANSPORT VEHICLE WITH VISION SYSTEM
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application is a non-provisional of and claims the benefit of United States provisional patent application number 63/232,546 filed on August 12, 2021, United States provisional patent application number 63/232,531 filed on August 12, 2021, and United States provisional patent application number 63/251,398 filed on October 1, 2021, the disclosures of which are incorporated herein by reference in their entireties.
BACKGROUND
1. Field
[0002] The disclosed embodiment generally relates to material handling systems, and more particularly, to transports for automated storage and retrieval systems.
2. Brief Description of Related Developments
[0003] Generally automated storage and retrieval systems employ autonomous vehicles that transport goods within the automated storage and retrieval system. These autonomous vehicles are guided throughout the automated storage and retrieval system by location beacons, capacitive or inductive proximity sensors, line following sensors, reflective beam sensors and other narrowly focused beam type sensors. These sensors may provide limited information for effecting navigation of the autonomous vehicles through the storage and retrieval system or provide limited information with respect to identification and discrimination of hazards that may be present throughout the automated storage and retrieval system.
[0004] In addition, autonomous transport vehicles in logistics/warehouse facilities are generally manufactured to have a predetermined form factor for an assigned task in a given environment. These autonomous transport vehicles are constructed of a bespoke cast or machined chassis/frame. The other components (e.g., wheels, transfer arms, etc.), some of which may also be bespoke assemblies/components, are mounted to the frame and are carried with the frame as the autonomous transport vehicle traverses along a traverse surface. The transfer arms and payload bay of these autonomous transport vehicles may include numerous components (sensors, encoders, etc.) and motor assemblies for transferring payloads to and from the autonomous transport vehicles as well as for justifying payloads within the payload bay. The motors and sensors may be substantially directly and continuously coupled to a power supply of the autonomous transport vehicle such as through an electrical bus bar.
BRIEF DESCRIPTION OF THE DRAWINGS
[0005] The foregoing aspects and other features of the disclosed embodiment are explained in the following description, taken in connection with the accompanying drawings, wherein:
[0006] Fig. 1A is a schematic block diagram of an exemplary storage and retrieval system facility incorporating aspects of the disclosed embodiment;
[0007] Fig. 1B is a plan view illustration of the exemplary storage and retrieval system facility of Fig. 1A incorporating aspects of the disclosed embodiment;
[0008] Fig. 2 is an exemplary perspective illustration of an autonomous guided vehicle of the exemplary storage and retrieval system facility of Fig. 1A in accordance with aspects of the disclosed embodiment;
[0009] Figs. 3A and 3B are exemplary perspective illustrations of portions of the autonomous guided vehicle of Fig. 2 in accordance with aspects of the disclosed embodiment;
[0010] Fig. 4A is an exemplary plan view illustration of the autonomous guided vehicle of Fig. 2 in accordance with aspects of the disclosed embodiment;
[0011] Fig. 4B is an exemplary perspective illustration of a portion of the autonomous guided vehicle of Fig. 2 in accordance with aspects of the disclosed embodiment;
[0012] Fig. 4C is an exemplary perspective illustration of a portion of the autonomous guided vehicle of Fig. 2 in accordance with aspects of the disclosed embodiment;
[0013] Fig. 4D is an exemplary plan view illustration of the autonomous guided vehicle of Fig. 2 in accordance with aspects of the disclosed embodiment;
[0014] Figs. 5A, 5B, and 5C are collectively an exemplary illustration of pose and location estimation in accordance with aspects of the disclosed embodiment;
[0015] Fig. 6 is an exemplary plan view illustration of a portion of the autonomous guided vehicle of Fig. 2 in accordance with aspects of the disclosed embodiment;
[0016] Figs. 7A and 7B are respectively plan and perspective illustrations of a case unit illustrating a shelf invariant front face detection in accordance with aspects of the disclosed embodiment;
[0017] Fig. 8 is an exemplary illustration of data captured by a supplemental sensor system of the autonomous guided vehicle of Fig. 2 in accordance with aspects of the disclosed embodiment;
[0018] Fig. 9A is an exemplary stereo vision image captured by a supplemental sensor system of the autonomous guided vehicle of Fig. 2 in accordance with aspects of the disclosed embodiment;
[0019] Fig. 9B is an exemplary augmented stereo vision image captured by a supplemental sensor system of the autonomous guided vehicle of Fig. 2 in accordance with aspects of the disclosed embodiment;
[0020] Fig. 10A is an exemplary augmented image captured by a supplemental sensor system of the autonomous guided vehicle of Fig. 2 in accordance with aspects of the disclosed embodiment;
[0021] Fig. 10B is an exemplary augmented stereo vision image captured by a supplemental sensor system of the autonomous guided vehicle of Fig. 2 in accordance with aspects of the disclosed embodiment;
[0022] Fig. 11 is an exemplary block diagram illustrating a sensor selection depending on an operation mode of the autonomous guided vehicle of Fig. 2 in accordance with aspects of the disclosed embodiment;
[0023] Fig. 12 is an exemplary flow diagram in accordance with aspects of the disclosed embodiment;
[0024] Fig. 13 is an exemplary flow diagram of a vision analysis effected by the autonomous guided vehicle of Fig. 2 in accordance with aspects of the disclosed embodiment;
[0025] Fig. 14 is an exemplary flow diagram in accordance with aspects of the disclosed embodiment;
[0026] Fig. 15 is an exemplary image captured by a supplemental sensor system of the autonomous guided vehicle of Fig. 2 in accordance with aspects of the disclosed embodiment;
[0027] Fig. 16 is an exemplary flow diagram of an image analysis effected by the autonomous guided vehicle of Fig. 2 in accordance with aspects of the disclosed embodiment;
[0028] Fig. 17 is an exemplary flow diagram of an image analysis collaboratively effected with a supplemental sensor system of the autonomous guided vehicle of Fig. 2 and an operator in accordance with aspects of the disclosed embodiment;
[0029] Fig. 18 is an exemplary flow diagram of an image analysis collaboratively effected with a supplemental sensor system of the autonomous guided vehicle of Fig. 2 and an operator in accordance with aspects of the disclosed embodiment;
[0030] Fig. 19 is an exemplary schematic block diagram of a portion of the autonomous transport vehicle of Fig. 2 in accordance with aspects of the disclosed embodiment;
[0031] Fig. 20 is an exemplary schematic block diagram of a portion of the autonomous guided vehicle of Fig. 2 in accordance with aspects of the disclosed embodiment;
[0032] Fig. 21 is an exemplary schematic charging logic block diagram for the autonomous transport vehicle of Fig. 2 in accordance with aspects of the disclosed embodiment;
[0033] Fig. 22 is an exemplary protection circuit of the autonomous transport vehicle of Fig. 2 in accordance with aspects of the disclosed embodiment;
[0034] Fig. 23 is an exemplary flow diagram in accordance with aspects of the disclosed embodiment;
[0035] Fig. 24 is an exemplary flow diagram in accordance with aspects of the disclosed embodiment;
[0036] Figs. 25A, 25B, and 25C are collectively an exemplary schematic of a control system of the autonomous transport vehicle of Fig. 2 in accordance with aspects of the disclosed embodiment;
[0037] Fig. 26 is an exemplary schematic illustration of a portion of the control system of Figs. 25A, 25B, and 25C in accordance with aspects of the disclosed embodiment;
[0038] Fig. 27 is an exemplary flow diagram in accordance with aspects of the disclosed embodiment; and
[0039] Fig. 28 is an exemplary flow diagram in accordance with aspects of the disclosed embodiment.
DETAILED DESCRIPTION
[0040] Figs. 1A and 1B illustrate an exemplary automated storage and retrieval system 100 in accordance with aspects of the disclosed embodiment. Although the aspects of the disclosed embodiment will be described with reference to the drawings, it should be understood that the aspects of the disclosed embodiment can be embodied in many forms. In addition, any suitable size, shape or type of elements or materials could be used.
[0041] The aspects of the disclosed embodiment provide for an autonomous transport vehicle 110 (also referred to herein as an autonomous guided vehicle) having a physical characteristic sensor system 276 that at least in part effects determination of at least one of a vehicle navigation pose or location and a payload pose or location. The autonomous transport vehicle 110 includes a supplemental or auxiliary navigation sensor system 288 that supplements the information from the physical characteristic sensor system 276 to at least one of verify and increase the accuracy of the vehicle navigation pose or location and the payload pose or location.
[0042] In accordance with the aspects of the disclosed embodiment the supplemental navigation sensor system 288 includes a vision system 400 that effects a reduction (e.g., compared to automated transport of case units with conventional vehicles lacking the supplemental sensor system described herein) in case unit transport errors and an increase in storage and retrieval system 100 operation efficiency.
[0043] The aspects of the disclosed embodiment also provide for an autonomous transport vehicle 110 having an autonomous navigation/operation sensor system 270 that effects at least in part determination of at least one of a vehicle navigation pose or location and a payload pose or location. The autonomous transport vehicle 110 further includes a supplemental or auxiliary hazard sensor system 290 that supplements the information from the autonomous navigation/operation sensor system 270 for opportunistically determining or discriminating a presence of a predetermined physical characteristic of at least one object or spatial feature 299 (see, e.g., Figs. 4D and 15) within at least a portion of the facility 100 which the autonomous transport vehicle 110 is navigating (i.e., controller 122 is programmed to command the autonomous transport vehicle to different positions in the facility associated with effecting one or more predetermined payload autonomous transfer tasks). The vehicle navigates to the different positions with the navigation system and operates to effect the predetermined transfer tasks at the different positions separate and distinct from the captured image data by the supplemental hazard sensor system 290 in the different positions. The opportunistic determination/discrimination of the presence of the predetermined physical characteristic of the object or spatial feature 299, incidental or peripheral to the vehicle 110 executing navigation and transfer, causes the controller 122 to selectably reconfigure the autonomous transport vehicle 110 from an autonomous state to a collaborative vehicle state for collaboration with an operator so as to finalize discrimination of the object or spatial feature 299 as a hazard and identify a mitigation action of the vehicle with respect to the hazard (i.e., the collaborative state is supplemental (auxiliary) to the autonomous state of the vehicle, wherein in the autonomous state the vehicle autonomously effects each of the one or more predetermined payload autonomous transfer tasks and in the auxiliary/collaborative state the vehicle collaborates with the operator to discriminate and mitigate hazards as described herein).
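As a non-limiting sketch only (the state names, stop time, and constant-deceleration model below are illustrative assumptions rather than the disclosed implementation), the reconfiguration from the autonomous state to the collaborative vehicle state, together with a stop trajectory reaching zero velocity within a predetermined time, may be pictured as:

```python
from enum import Enum, auto

class VehicleState(Enum):
    AUTONOMOUS = auto()
    COLLABORATIVE = auto()

STOP_TIME_S = 2.0  # hypothetical predetermined time to reach zero velocity

def on_detection(is_known_feature: bool, velocity_mps: float, state: VehicleState):
    """Return the (possibly reconfigured) vehicle state and the constant
    deceleration needed to bring the vehicle to rest within STOP_TIME_S."""
    if is_known_feature:
        return state, 0.0            # expected feature: remain in the current state
    decel_mps2 = velocity_mps / STOP_TIME_S
    return VehicleState.COLLABORATIVE, decel_mps2

# Example: travelling at 1.5 m/s when an unidentifiable object is imaged, the
# vehicle switches to the collaborative state and decelerates at 0.75 m/s^2.
```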
[0044] It is noted that the supplemental navigation sensor system 288 and the supplemental hazard sensor system 290 may be used in conjunction with each other or separately and may form a common vision system 400 or separate vision systems. In still other aspects, the supplemental hazard sensor system 290 may include sensors from the supplemental navigation sensor system 288 or vice versa (i.e., the supplemental navigation sensor system 288 and the supplemental hazard sensor system 290 share common sensors between the two sensor systems).
[0045] In accordance with the aspects of the disclosed embodiment, the autonomous transport vehicle 110 includes at least stereo vision that is focused on at least a payload bed (or bay or area) 210B of the autonomous transport vehicle 110 so that a controller (such as one or more of a control server 120 of the storage and retrieval system 100, a controller 122 of the autonomous transport vehicle 110, or any other suitable controller) or human operator of the storage and retrieval system 100 monitors case unit CU movement to and from the payload bed 210B. The autonomous transport vehicle 110 includes one or more imaging radar systems that independently measure (s) a size and a center point of front faces of case units CU disposed in storage spaces 130S on storage shelves of the storage level structure 130L. As will be described herein, the autonomous transport vehicle may include one or more other navigation and/or vision sensors to effect case unit transfer to and from the payload bed 210B and navigation of the autonomous transport vehicle 110 throughout a respective storage structure level 130L. As will be described further below, imaged or viewed objects described by one or more of supplemental information, supplemental vehicle navigation pose or location, and supplemental payload pose or location, from the supplemental sensor system, are coapted (e.g., fit/combined) to a reference model (or maps - such as model 400VM) of one or more of surrounding facility features and interfacing facility features so as to enhance, via the one or more of the supplemental information, the supplemental vehicle navigation pose or location, and the supplemental payload pose or location resolution of one or more of vehicle navigation pose or location information and payload pose or location information.
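By way of a simplified, non-limiting example of the kind of measurement described above (the corner coordinates, units, and frame are hypothetical), the size and center point of a case unit front face could be derived from two detected corner points as follows:

```python
def front_face_metrics(corner_a, corner_b):
    """corner_a, corner_b: (y, z) coordinates of two opposite corners of a
    case-unit front face as seen from the aisle (metres, shelf-facing frame).
    Returns (width, height, center)."""
    (ya, za), (yb, zb) = corner_a, corner_b
    width = abs(yb - ya)
    height = abs(zb - za)
    center = ((ya + yb) / 2.0, (za + zb) / 2.0)
    return width, height, center

# Example: corners at (0.10, 0.00) and (0.55, 0.30) describe a front face
# 0.45 m wide and 0.30 m tall, centered at (0.325, 0.15).
```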
[0046] For example, referring to Fig. 4A, the autonomous transport vehicle 110 may include a forward looking stereo (e.g., with respect to a direction of travel of the autonomous transport vehicle 110) vision system and a rearward looking (e.g., with respect to the direction of travel) vision system that are configured to effect localization of the autonomous transport vehicle 110 within the storage structure level 130L by detecting any suitable navigation markers or fiducials (e.g., floor tape/lines, structural beams of the storage structure level, storage facility features, etc.) in combination with a storage level floor map and storage structure information (e.g., a virtual model 400VM of locations of columns, storage shelves, storage buffers, floor joints, etc.). Here, the storage level map (or model) and storage structure information embody the location (s) of the navigation markers so that upon recognition of the markers by the vision system 400 the autonomous transport vehicle 110 determines its localized position within the storage and retrieval system 100. The autonomous transport vehicle 110 may include one or more cameras that face upward for detecting any suitable navigation markers or fiducials located on a ceiling of the storage structure level 130L and determining a localization of the autonomous transport vehicle 110 using the storage level floor map and storage structure information. The autonomous transport vehicle 110 may include at least one sideways looking traffic monitoring camera that is configured to monitor autonomous transport vehicle traffic along transfer decks 130B of the storage and retrieval system 100 to facilitate autonomous transport vehicle 110 entry to a transfer deck 130B and merging of the autonomous transport vehicle 110 with other autonomous transport vehicles 110 already travelling along the transfer deck(s) 130B.
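A minimal, non-limiting sketch of such marker-based localization against a stored floor map is shown below; the marker identifiers, map coordinates, and the simplifying assumption that the vehicle is aligned with the facility axes are illustrative only and not part of the disclosed embodiment:

```python
# marker identifier -> known (x, y) position of the marker in the facility
# frame, as recorded in the reference floor map / virtual model
MARKER_MAP = {"column_A17": (12.0, 3.5), "column_A18": (14.0, 3.5)}

def localize(marker_id: str, marker_offset_xy: tuple):
    """marker_offset_xy is the marker position measured in the vehicle frame
    by the vision system. Returns the vehicle (x, y) in the facility frame,
    or None if the marker is absent from the reference map."""
    if marker_id not in MARKER_MAP:
        return None                  # unrecognized marker: defer to other sensors
    mx, my = MARKER_MAP[marker_id]
    ox, oy = marker_offset_xy
    return (mx - ox, my - oy)

# Example: seeing "column_A17" 1.2 m ahead and 0.5 m to the left places the
# vehicle at (10.8, 3.0) in the facility frame.
```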
[0047] Referring to Fig. 4D, the autonomous transport vehicle 110 may also include a forward looking (e.g., with respect to a direction of travel of the autonomous transport vehicle 110) or omnidirectional (x, y, z, 0) vision system and/or a rearward looking (e.g., with respect to the direction of travel) vision system that is configured to effect imaging (available for continuous or periodical) for monitoring (supplemental to autonomous navigating sensor system 270) of the areas or spaces along autonomous travel paths of the autonomous transport vehicle 110 within, e.g., a storage structure level 130L and detecting any objects/hazards that may encroach on the bot travel path. In one aspect, as will be described below, the vision system 400 may effect imaging for supplemental monitoring and detection (of the objects/hazards) by the controller 122 so that monitoring and detection is performed resident on (e.g., onboard) the autonomous transport vehicle 110, such as by employment of a reference storage level floor map and storage structure information (e.g., a virtual model 400VM of locations of columns, storage shelves, storage buffers, floor joints, etc.); and from indication by the controller 122 of such detection and in collaboration with a remote operator remotely accessing the vision system effecting collaborative monitoring/detecting/identifying/discriminating/mitigating of the object 299 (see Fig. 15) with the vehicle 110 in the collaborative state. Where the vision system 400 of the autonomous transport vehicle 110 senses or detects the presence of objects/hazards which are not present in the reference storage level map and storage structure information, a determination of the object (s)/hazard (s) type(s) is effected upon indication by the controller by a remote operator receiving the images/video of the object/hazard transmitted from/by the autonomous transport vehicle 110 to the user interface UI.
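By way of non-limiting illustration (the function name, data shapes, and the send_to_ui callable below are hypothetical), screening imaged features against the reference model and escalating features that are not present in that model to the operator interface could be sketched as:

```python
def screen_detections(detected_ids, reference_ids, image_bytes, send_to_ui):
    """detected_ids: feature identifiers extracted from the captured image;
    reference_ids: identifiers expected at the current pose according to the
    reference storage level map. Features not in the reference set are
    treated as potential objects/hazards and forwarded to the operator
    interface via the supplied send_to_ui callable."""
    unknown = set(detected_ids) - set(reference_ids)
    if unknown:
        send_to_ui({"unknown_features": sorted(unknown), "image": image_bytes})
    return unknown  # empty set: everything imaged matched the reference model
```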
[0048] As will be described herein, the autonomous transport vehicle 110 includes a vision system controller 122VC disposed onboard the autonomous transport vehicle and communicably coupled to the vision system 400 of the autonomous transport vehicle 110. The vision system controller 122VC is configured with model based vision in that the vision system controller 122VC simulates/models the storage and retrieval system 100 (e.g., based on any suitable information such as computer aided drafting (CAD) data of the storage and retrieval system structure or other suitable data stored in memory or accessible by the vision system controller 122VC that effects modeling/simulation of the storage and retrieval system 100) and compares the data obtained with the vision system 400 to the model/simulation of the storage and retrieval system structure to effect one or more or bot localization and imaging of the object/hazard. Here the autonomous transport vehicle 110 is configured to compare what it "sees" with the vision system 400 substantially directly with what the autonomous transport vehicle 110 expects to "see" based on the simulation of the (reference) storage and retrieval system structure.
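A non-limiting sketch of such a model-based comparison is given below; the confidence formula, tolerance, and threshold are illustrative assumptions, with the confidence-based switching of the navigation pose source following the behavior described elsewhere in this disclosure:

```python
import math

CONFIDENCE_THRESHOLD = 0.5   # hypothetical switching threshold

def select_pose(sensor_pose_xy, vision_pose_xy, tolerance_m=0.05):
    """Compare the pose registered from the physical characteristic sensor
    system with the pose implied by the vision system's virtual
    representation; derive a simple confidence value from their planar
    variance and select the pose source accordingly."""
    variance = math.dist(sensor_pose_xy, vision_pose_xy)
    confidence = tolerance_m / max(variance, tolerance_m)  # 1.0 when within tolerance
    if confidence < CONFIDENCE_THRESHOLD:
        # sensor-system data deemed unreliable: navigate on the vision-derived pose
        return vision_pose_xy, confidence
    return sensor_pose_xy, confidence

# Example: a 0.2 m disagreement with a 0.05 m tolerance yields a confidence
# of 0.25, so navigation switches to the vision-derived pose.
```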
[0049] The supplemental sensor system also effects augmented reality operator inspection of the storage and retrieval system environment as well as remote control of the autonomous transport vehicle 110 as will be described herein.
[0050] In accordance with the aspects of the disclosed embodiment the supplemental navigation sensor system 288 and/or the supplemental hazard sensor system 290 includes a vision system 400 that effects transmission (e.g., streaming live video, time stamped images, or any other suitable manner of transmission) of images/video to a remote operator for identification of the object/hazard present within the facility 100 (e.g., an object extending across the bot travel path, blocking the bot, proximate the bot within a predetermined distance) which is "unknown" (i.e., unidentifiable) by the autonomous transport vehicle 110. In accordance with the aspects of the disclosed embodiment, a controller (such as one or more of a control server 120 of the storage and retrieval system 100, a controller 122 of the autonomous transport vehicle 110, the vision system controller 122VC, or any other suitable controller) or human operator of the storage and retrieval system 100 monitors, via the vision system 400, the bot travel paths as the autonomous transport vehicle 110 navigates the facility to perform autonomous storage and retrieval tasks in accordance with the controller 122 commands. Further, and incidental to effecting the autonomous tasks, the vehicle 110 opportunistically discovers any objects/hazards within the facility 100 which could (based on predetermined initially identified criteria programmed in the controller 122) disrupt bot operations and/or traffic of other bots also navigating the facility 100 autonomously performing storage and retrieval tasks (i.e., the controller is configured so that determination of presence of object/hazard is coincident, at least in part, with, but supplemental and peripheral to bot actions (demanded for) effecting each of the one or more predetermined payload autonomous transfer tasks).
[0051] Referring to Figs. 1A, 1B and 19, the aspects of the disclosed embodiment provide for an automated storage and retrieval system 100 having autonomous transport vehicles 110. Each autonomous transport vehicle 110 is configured with a comprehensive power management section 444 (also referred to herein as a power distribution unit - see Fig. 19). The power distribution unit 444 is configured to manage power needs of the autonomous transport vehicle 110 so as to preserve higher level functions/operations of the autonomous transport vehicle 110, the higher level functions being preserved depending on a charge level of a power supply of the autonomous transport vehicle 110. For example, control and drive operations may be preserved so that the autonomous transport vehicle 110 traverses to a charging station or maintenance location while other lower level functions of the autonomous transport vehicle (e.g., not needed for the traverse to the charging station or maintenance location) are shut down. Managing low level systems of the autonomous transport vehicle 110 conserves charge of the onboard vehicle power source to improve the operational time of the autonomous transport vehicle 110 between charging operations and preserves autonomous transport vehicle controller functionality.
[0052] The power distribution unit 444 may also be configured to control a charge mode of a power supply 481 of the autonomous transport vehicle so as to maximize a number of charge cycles of the power supply 481. The power distribution unit 444 monitors current draw for components (e.g., motors, sensors, controllers, etc. that are communicably coupled to the power source 481 on "branch circuits") of the autonomous transport vehicle 110 and manages (e.g., switches on and off) the power supply to each of the components to conserve the charge (e.g., energy usage) of the power supply 481.
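By way of non-limiting illustration only, the following sketch shows the kind of prioritized, per-branch load management described above, in which higher level functions (control and drive) are preserved at low charge while lower level branches are switched off. The branch names, priorities, current draws, and charge thresholds are assumptions of the sketch and not values of the disclosed power distribution unit.

```python
# Illustrative prioritized load shedding for a power distribution unit
# (hypothetical branches, priorities, and thresholds).
BRANCHES = {
    # branch name: (priority, nominal current draw in A) -- illustrative only
    "controller":    (0, 1.5),
    "drive_motors":  (0, 8.0),
    "transfer_arm":  (1, 6.0),
    "vision_system": (2, 2.0),
    "aux_lighting":  (3, 0.5),
}

def branches_to_power(charge_pct: float) -> set:
    """Return the branch circuits left switched on at a given charge level."""
    if charge_pct > 50:
        allowed = 3   # everything on
    elif charge_pct > 20:
        allowed = 1   # keep control/drive and payload handling
    else:
        allowed = 0   # preserve only control and drive to reach a charger
    return {b for b, (prio, _amps) in BRANCHES.items() if prio <= allowed}

print(branches_to_power(15.0))   # {'controller', 'drive_motors'}
```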
[0053] The power distribution unit 444 may be configured to provide electric circuit fault protection (e.g., short circuit protection, over-voltage protection, over-current protection, etc.) for components of the autonomous transport vehicle 110 that are communicably coupled to the power supply 481 as loop devices or loop powered devices. Here, a loop powered device is an electronic device that is connected in a transmitter loop, such as a current loop, without the need to have a separate or independent power source, where the electronic device employs the power from the current flowing in the loop for its operation.
[0054] In accordance with the aspects of the disclosed embodiment, the automated storage and retrieval system 100 in Figs. 1A and 1B may be disposed in a retail distribution center or warehouse, for example, to fulfill orders received from retail stores for replenishment goods shipped in cases, packages, and/or parcels. The terms case, package and parcel are used interchangeably herein and as noted before may be any container that may be used for shipping and may be filled with one or more product units by the producer. Case or cases as used herein means case, package or parcel units not stored in trays, on totes, etc. (e.g., uncontained). It is noted that the case units CU (also referred to herein as mixed cases, cases, and shipping units) may include cases of items/units (e.g., case of soup cans, boxes of cereal, etc.) or individual items/units that are adapted to be taken off of or placed on a pallet. In accordance with the exemplary embodiments, shipping cases or case units (e.g., cartons, barrels, boxes, crates, jugs, shrink wrapped trays or groups or any other suitable device for holding case units) may have variable sizes and may be used to hold case units in shipping and may be configured so they are capable of being palletized for shipping. Case units may also include totes, boxes, and/or containers of one or more individual goods, unpacked/decommissioned (generally referred to as breakpack goods) from original packaging and placed into the totes, boxes, and/or containers (collectively referred to as totes) with one or more other individual goods of mixed or common types at an order fill station. It is noted that when, for example, incoming bundles or pallets (e.g., from manufacturers or suppliers of case units) arrive at the storage and retrieval system for replenishment of the automated storage and retrieval system 100, the content of each pallet may be uniform (e.g., each pallet holds a predetermined number of the same item - one pallet holds soup and another pallet holds cereal). As may be realized, the cases of such pallet load may be substantially similar or, in other words, homogenous cases (e.g., similar dimensions), and may have the same SKU (otherwise, as noted before, the pallets may be "rainbow" pallets having layers formed of homogeneous cases). As pallets leave the storage and retrieval system, with cases or totes filling replenishment orders, the pallets may contain any suitable number and combination of different case units (e.g., each pallet may hold different types of case units - a pallet holds a combination of canned soup, cereal, beverage packs, cosmetics and household cleaners). The cases combined onto a single pallet may have different dimensions and/or different SKUs.

[0055] The automated storage and retrieval system 100 may be generally described as a storage and retrieval engine 190 coupled to a palletizer 162. In greater detail now, and with reference still to Figs. 1A and 1B, the storage and retrieval system 100 may be configured for installation in, for example, existing warehouse structures or adapted to new warehouse structures. As noted before, the automated storage and retrieval system 100 shown in Figs. 1A and 1B is representative and may include for example, in-feed and out-feed conveyors terminating on respective transfer stations 170, 160, lift module(s) 150A, 150B, a storage structure 130, and a number of autonomous transport vehicles 110 (also referred to herein as "bots").
It is noted that the storage and retrieval engine 190 is formed at least by the storage structure 130 and the autonomous transport vehicles 110 (and in some aspect the lift modules 150A, 150B; however in other aspects the lift modules 150A, 150B may form vertical sequencers in addition to the storage and retrieval engine 190 as described in United States patent application number 17/091,265 filed on November 6, 2020 and titled "Pallet Building System with Flexible Sequencing, " the disclosure of which is incorporated herein by reference in its entirety). In alternate aspects, the storage and retrieval system 100 may also include robot or bot transfer stations (not shown) that may provide an interface between the autonomous transport vehicles 110 and the lift module(s) 150A, 150B. The storage structure 130 may include multiple levels of storage rack modules where each storage structure level 130L of the storage structure 130 includes respective picking aisles 130A, and transfer decks 130B for transferring case units between any of the storage areas of the storage structure 130 and a shelf of the lift module (s) 150A, 150B. The picking aisles 130A are in one aspect configured to provide guided travel of the autonomous transport vehicles 110 (such as along rails 130AR) while in other aspects the picking aisles are configured to provide unrestrained travel of the autonomous transport vehicle 110 (e.g., the picking aisles are open and undeterministic with respect to autonomous transport vehicle 110 guidance/travel). The transfer decks 130B have open and undeterministic bot support travel surfaces along which the autonomous transport vehicles 110 travel under guidance and control provided by bot steering (as will be described herein). In one or more aspects, the transfer decks have multiple lanes between which the autonomous transport vehicles 110 freely transition for accessing the picking aisles 130A and/or lift modules 150A, 150B. As used herein, "open and undeterministic" denotes the travel surface of the picking aisle and/or the transfer deck has no mechanical restraints (such as guide rails) that delimit the travel of the autonomous transport vehicle 110 to any given path along the travel surface. It is noted that while the aspects of the disclosed embodiment are described with respect to a multilevel storage array, the aspects of the disclosed embodiment may be equally applied to a single level storage array that is disposed on a facility floor or elevated above the facility floor.
[0056] The picking aisles 130A, and transfer decks 130B also allow the autonomous transport vehicles 110 to place case units CU into picking stock and to retrieve ordered case units CU (and define the different positions where the bot performs autonomous tasks, though any number of locations in the storage structure (e.g., decks, aisles, storage racks, etc.) can be one or more of the different positions). In alternate aspects, each level may also include respective bot transfer stations 140. The autonomous transport vehicles 110 may be configured to place case units, such as the above described retail merchandise, into picking stock in the one or more storage structure levels 130L of the storage structure 130 and then selectively retrieve ordered case units for shipping the ordered case units to, for example, a store or other suitable location. The in-feed transfer stations 170 and out-feed transfer stations 160 may operate together with their respective lift module(s) 150A, 150B for bi-directionally transferring case units CU to and from one or more storage structure levels 130L of the storage structure 130. It is noted that while the lift modules 150A, 150B may be described as being dedicated inbound lift modules 150A and outbound lift modules 150B, in alternate aspects each of the lift modules 150A, 150B may be used for both inbound and outbound transfer of case units from the storage and retrieval system 100.
[0057] As may be realized, the storage and retrieval system 100 may include multiple in-feed and out-feed lift modules 150A, 150B that are accessible by, for example, autonomous transport vehicles 110 of the storage and retrieval system 100 so that one or more case unit(s), uncontained (e.g., case unit(s) are not held in trays), or contained (within a tray or tote) can be transferred from a lift module 150A, 150B to each storage space on a respective level and from each storage space to any one of the lift modules 150A, 150B on a respective level. The autonomous transport vehicles 110 may be configured to transfer the case units between the storage spaces 130S (e.g., located in the picking aisles 130A or other suitable storage space/case unit buffer disposed along the transfer deck 130B) and the lift modules 150A, 150B. Generally, the lift modules 150A, 150B include at least one movable payload support that may move the case unit (s) between the in-feed and out-feed transfer stations 160, 170 and the respective level of the storage space where the case unit (s) is stored and retrieved. The lift module(s) may have any suitable configuration, such as for example reciprocating lift, or any other suitable configuration. The lift module (s) 150A, 150B include any suitable controller (such as control server 120 or other suitable controller coupled to control server 120, warehouse management system 2500, and/or palletizer controller 164, 164') and may form a sequencer or sorter in a manner similar to that described in United States patent application number 16/444,592 filed on June 18, 2019 and titled "Vertical Sequencer for Product Order Fulfillment" (the disclosure of which is incorporated herein by reference in its entirety).
[0058] The automated storage and retrieval system may include a control system, comprising for example one or more control servers 120 that are communicably connected to the in-feed and out-feed conveyors and transfer stations 170, 160, the lift modules 150A, 150B, and the autonomous transport vehicles 110 via a suitable communication and control network 180. The communication and control network 180 may have any suitable architecture which, for example, may incorporate various programmable logic controllers (PLC) such as for commanding the operations of the in-feed and out-feed conveyors and transfer stations 170, 160, the lift modules 150A, 150B, and other suitable system automation. The control server 120 may include high level programming that effects a case management system (CMS) managing the case flow system. The network 180 may further include suitable communication for effecting a bi-directional interface with the autonomous transport vehicles 110. For example, the autonomous transport vehicles 110 may include an on-board processor/controller 122 (which is configured to effect at least control and safety functions of the autonomous transport vehicle 110 - see also Figs. 10A-10C). The network 180 may include a suitable bi-directional communication suite enabling the autonomous transport vehicle controller 122 to request or receive commands from the control server 120 for effecting desired transport (e.g. placing into storage locations or retrieving from storage locations) of case units and to send desired autonomous transport vehicle 110 information and data including autonomous transport vehicle 110 ephemeris, status and other desired data, to the control server 120. As seen in Figs. 1A and 1B, the control server 120 may be further connected to a warehouse management system 2500 for providing, for example, inventory management, and customer order fulfillment information to the CMS level program of control server 120. As noted before, the control server 120, and/or the warehouse management system 2500 allow for a degree of collaborative control, at least of bots 110, via a user interface UI, as will be further described below. A suitable example of an automated storage and retrieval system arranged for holding and storing case units is described in U.S. Patent No. 9,096,375, issued on August 4, 2015, the disclosure of which is incorporated by reference herein in its entirety.

[0059] Referring now to Figs. 1A, 1B, and 2, the autonomous transport vehicle 110 (which may also be referred to herein as an autonomous guided vehicle or bot) includes a vehicle frame or chassis 200 (referred to herein as a frame) with a power supply 481 mounted therein and an integral payload support or bed 210B. The frame 200 has a front end 200E1 and a back end 200E2 that define a longitudinal axis LAX of the autonomous transport vehicle 110. The frame 200 may be constructed of any suitable material (e.g., steel, aluminum, composites, etc.). As described herein, powered sections are connected to the frame 200, where each powered section is powered by the power supply 481. The powered sections include a drive section 261D, a payload handling section 210 (also referred to herein as a case handling assembly 210), and a peripheral electronics section 778.
[0060] The payload handling section or case handling assembly 210 is configured to handle cases/payloads transported by the autonomous transport vehicle 110. The case handling assembly 210 has at least one payload handling actuator (e.g., transfer arm 210A) configured so that actuation of the payload handling actuator effects transfer of the payload (e.g., case unit) to and from the payload bed 210B, of the frame, and a storage location (e.g., storage spaces 130S of storage shelves) in the facility. In some aspects, the case handling assembly 210 includes the payload bed 210B (also referred to herein as a payload bay or payload hold) and is configured so as to move the payload bed in direction VER; in other aspects where the payload bed 210B is formed by the frame 200 the payload bed may be fixed/stationary in direction VER. As may be realized, payloads are placed on the payload bed 210B.
[0061] The transfer arm 210A is configured to (autonomously) transfer a payload (such as a case unit CU), with a flat undeterministic seating surface seated in the payload bed 210B, to and from the payload bed 210B of the autonomous guided vehicle 110 and a storage location (such as storage space 130S on storage shelf 555 (see Fig. 5A), a shelf of lift module 150A, 150B, buffer, transfer station, and/or any other suitable storage location), of the payload CU, in a storage array SA, where the storage location 130S, in the storage array SA, is separate and distinct from the transfer arm 210A and the payload bed 210B. The transfer arm 210A is configured with extension motors 667A-667C and lift motor(s) 669 that configure the transfer arm 210A to extend laterally in direction LAT and/or vertically in direction VER to transport payloads to and from the payload bed 210B. The payload bed 210B includes front and rear justification modules 210ARJ, 210AFJ configured to justify case units along the longitudinal axis LAX and laterally in direction LAT anywhere within the payload bed 210B. For example, the payload bed includes justification arms JAR (Figs. 10A and 10C) that are driven along the longitudinal axis by respective justification motors 668B, 668E so as to justify the case unit(s) CU along the longitudinal axis LAX. Pushers JPS and pullers JPP (Figs. 10A and 10C) may be movably mounted to the justification arms so as to be driven by respective motors 668A, 668C, 668D, 668F in direction LAT so as to justify the case unit(s) CU in direction LAT. One or more of the motors 668A-668F may also be operated to clamp or grip the case unit(s) CU held in the payload bed 210B such as during case unit transport by the vehicle 110.
[0062] Examples of suitable payload beds 210B and transfer arms 210A and/or autonomous transport vehicles to which the aspects of the disclosed embodiment may be applied can be found in United States provisional patent application number 63/236,591, having attorney docket number 1127P015753-US (-#3) filed on August 24, 2021 and titled "Autonomous Transport Vehicle" as well as United States pre-grant publication number 2012/0189416 published on July 26, 2012 (United States patent application number 13/326,952 filed on December 15, 2011) and titled "Automated Bot with Transfer Arm"; United States patent number 7591630 issued on September 22, 2009 titled "Materials-Handling System Using Autonomous Transfer and Transport Vehicles"; United States patent number 7991505 issued on August 2, 2011 titled "Materials-Handling System Using Autonomous Transfer and Transport Vehicles"; United States patent number 9561905 issued on February 7, 2017 titled "Autonomous Transport Vehicle"; United States patent number 9082112 issued on July 14, 2015 titled "Autonomous Transport Vehicle Charging System"; United States patent number 9850079 issued on December 26, 2017 titled "Storage and Retrieval System Transport Vehicle"; United States patent number 9187244 issued on November 17, 2015 titled "Bot Payload Alignment and Sensing"; United States patent number 9499338 issued on November 22, 2016 titled "Automated Bot Transfer Arm Drive System"; United States patent number 8965619 issued on February 24, 2015 titled "Bot Having High Speed Stability"; United States patent number 9008884 issued on April 14, 2015 titled "Bot Position Sensing"; United States patent number 8425173 issued on April 23, 2013 titled "Autonomous Transports for Storage and Retrieval Systems"; and United States patent number 8696010 issued on April 15, 2014 titled "Suspension System for Autonomous Transports", the disclosures of which are incorporated herein by reference in their entireties.
[0063] The frame 200 includes one or more idler wheels or casters 250 disposed adjacent the front end 200E1. The frame 200 also includes one or more drive wheels 260 disposed adjacent the back end 200E2. In other aspects, the position of the casters 250 and drive wheels 260 may be reversed (e.g., the drive wheels 260 are disposed at the front end 200E1 and the casters 250 are disposed at the back end 200E2). It is noted that in some aspects, the autonomous transport vehicle 110 is configured to travel with the front end 200E1 leading the direction of travel or with the back end 200E2 leading the direction of travel. In one aspect, casters 250A, 250B (which are substantially similar to caster 250 described herein) are located at respective front corners of the frame 200 at the front end 200E1 and drive wheels 260A, 260B (which are substantially similar to drive wheel 260 described herein) are located at respective back corners of the frame 200 at the back end 200E2 (e.g., a support wheel is located at each of the four corners of the frame 200) so that the autonomous transport vehicle 110 stably traverses the transfer deck(s) 130B and picking aisles 130A of the storage structure 130.

[0064] The autonomous transport vehicle 110 includes a drive section 261D, connected to the frame 200, having motors 261M that power (or drive) drive wheels 260 (supporting the autonomous transport vehicle 110 on a traverse/rolling surface 284), where the drive wheels 260 effect vehicle traverse on the traverse surface 284 moving the autonomous transport vehicle 110 over the traverse surface 284 in a facility (e.g., such as a warehouse, store, etc.) under autonomous guidance. The drive section 261D has at least a pair of traction drive wheels 260 (also referred to as drive wheels 260 - see drive wheels 260A, 260B) astride the drive section 261D. The drive wheels 260 have a fully independent suspension 280 coupling each drive wheel 260A, 260B of the at least pair of drive wheels 260 to the frame 200, with at least one intervening pivot link (described herein) between at least one drive wheel 260A, 260B and the frame 200 configured to maintain a substantially steady state traction contact patch between the at least one drive wheel 260A, 260B and rolling/travel surface 284 (also referred to as autonomous vehicle travel surface 284 - see, e.g., Figs. 4D, 9A, 9B, and 15) over rolling surface transients (e.g., bumps, surface transitions, etc.). Suitable examples of the fully independent suspension 280 can be found in United States provisional patent application number 63/213,589 titled "Autonomous Transport Vehicle with Synergistic Vehicle Dynamic Response" (having attorney docket number 1127P015753-US (-#2)) filed on June 22, 2021, the disclosure of which is incorporated herein by reference in its entirety.

[0065] As described above, and also referring to Figs. 3A and 3B, the frame 200 includes one or more casters 250 disposed adjacent the front end 200E1. In one aspect, a caster 250 is located adjacent each front corner of the frame 200 so that in combination with the drive wheels 260 disposed at each rear corner of the frame 200, the frame 200 stably traverses the transfer deck 130B and picking aisles 130A of the storage structure 130. Referring to Figs. 2, 3A and 3B, in one aspect, each caster 250 comprises a motorized (e.g., active/motorized steering) caster 600M; however, in other aspects the caster 250 may be a passive (e.g., un-motorized) caster.
In one aspect, the motorized caster 600M includes a caster wheel 610 coupled to a fixed geometry wheel fork 640 (Fig. 3A); while in other aspects the caster wheel 610 is coupled to a variable geometry or articulated (e.g., suspension) fork 740. Each motorized caster 600M is configured to actively pivot its respective caster wheel 610 (independent of the pivoting of other wheels of other motorized casters) in direction 690 about caster pivot axis 691 to at least assist in effecting a change in the travel direction of the autonomous transport vehicle 110. Suitable examples of casters can be found in United States provisional patent application number 63/213,589 filed on June 22, 2021 (previously incorporated herein by reference in its entirety) and United States provisional patent application number 63/193,188 titled "Autonomous Transport Vehicle with Steering" having attorney docket number 1127P015753-US (-#5) filed on May 26, 2021, the disclosure of which is incorporated herein by reference in its entirety.

[0066] The autonomous transport vehicle 110 includes a physical characteristic sensor system 270 (also referred to as an autonomous navigation operation sensor system) connected to the frame 200. The physical characteristic sensor system 270 has electro-magnetic sensors. Each of the electro-magnetic sensors is responsive to interaction or interface of a sensor emitted or generated electro-magnetic beam or field with a physical characteristic (e.g., of the storage structure or a transient object such as a case unit CU, debris, etc.), where the electro-magnetic beam or field is disturbed by interaction or interface with the physical characteristic. The disturbance in the electro-magnetic beam is detected by and effects sensing by the electro-magnetic sensor of the physical characteristic, wherein the physical characteristic sensor system 270 is configured to generate sensor data embodying at least one of a vehicle navigation pose or location (relative to the storage and retrieval system or facility in which the autonomous transport vehicle 110 operates) information and payload pose or location (relative to a storage location 130S or the payload bed 210B) information.
[0067] The physical characteristic sensor system 270 includes an autonomous pose and navigation sensor that includes, for exemplary purposes only, one or more of laser sensor(s) 271, ultrasonic sensor(s) 272, bar code scanner(s) 273, position sensor(s) 274, line sensor(s) 275, vehicle proximity sensor(s) 278, or any other suitable sensors for sensing a position of the vehicle 110. The at least one payload handling sensor, for exemplary purposes, includes case sensors 276 (e.g., for sensing case units within the payload bed 210B onboard the vehicle 110 or on a storage shelf off-board the vehicle 110), arm proximity sensor(s) 277, or any other suitable sensors for sensing a payload (e.g., case unit CU) and its location/pose during autonomous transport vehicle handling of the payload CU. In some aspects, supplemental navigation sensor system 288 may form a portion of the physical characteristic sensor system 270. Suitable examples of sensors that may be included in the physical characteristic sensor system 270 are described in United States provisional patent application number 63/236,591 having attorney docket number 1127P015753-US (-#3) titled "Autonomous Transport Vehicle" and filed on August 24, 2021, as well as United States patent numbers 8,425,173 titled "Autonomous Transport for Storage and Retrieval Systems" issued on April 23, 2013, 9,008,884 titled "Bot Position Sensing" issued on April 14, 2015, and 9,946,265 titled "Bot Having High Speed Stability" issued on April 17, 2018, the disclosures of which are incorporated herein by reference in their entireties.
[0068] Referring also to Figs. 25A, 25B, and 25C, the sensors of the physical characteristic sensor system 270 may be configured to provide the autonomous transport vehicle 110 with, for example, awareness of its environment (in up to six degrees of freedom X, Y, Z, Rx, Ry, Rz - see Fig. 2) and external objects, as well as the monitor and control of internal subsystems. For example, the sensors may provide guidance information, payload information, or any other suitable information for use in operation of the autonomous transport vehicle 110 such as described herein and/or as described in, for example, United States provisional patent application having attorney docket number 1127P015753-US (-#3) titled "Autonomous Transport Vehicle" and having United States provisional application number 63/236,591 filed on August 24, 2021, the disclosure of which is incorporated herein by reference in its entirety.
[0069] Still referring to Fig. 2, the bar code scanner(s) 273 may be mounted on the autonomous transport vehicle 110 in any suitable location. The bar code scanner(s) 273 may be configured to provide an absolute location of the autonomous transport vehicle 110 within the storage structure 130. The bar code scanner(s) 273 may be configured to verify aisle references and locations on the transfer decks by, for example, reading bar codes located on, for example, the transfer decks, picking aisles and transfer station floors to verify a location of the autonomous transport vehicle 110. The bar code scanner(s) 273 may also be configured to read bar codes located on items stored in the shelves 555.
[0070] The position sensors 274 may be mounted to the autonomous transport vehicle 110 at any suitable location. The position sensors 274 may be configured to detect reference datum features (or count the slats 520L of the storage shelves 555) (e.g. see Fig. 5A) for determining a location of the vehicle 110 with respect to the shelving of, for example, the picking aisles 130A (or a buffer/transfer station located adjacent the transfer deck 130B or lift 150) . The reference datum information may be used by the controller 122 to, for example, correct the vehicle's odometry and allow the autonomous transport vehicle 110 to stop with the support tines 210AT of the transfer arm 210A positioned for insertion into the spaces between the slats 520L (see, e.g., Fig. 5A). In one exemplary embodiment, the vehicle 110 may include position sensors 274 on the drive (rear) end 200E2 and the driven (front) end 200E1 of the autonomous transport vehicle 110 to allow for reference datum detection regardless of which end of the autonomous transport vehicle 110 is facing the direction the vehicle is travelling.
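By way of non-limiting illustration only, the following sketch shows the kind of odometry correction described above: detection of reference datum features (here, slat edges at a known pitch) yields an absolute along-aisle position that removes accumulated wheel odometry drift before the transfer arm is aligned with the slats. The slat pitch, aisle origin, and numeric values are assumptions of the sketch and not facility data.

```python
# Illustrative odometry correction from reference datum (slat) detection.
SLAT_PITCH = 0.1524  # metres between slat leading edges; assumed value
AISLE_ORIGIN = 12.0  # facility Y coordinate of slat index 0; assumed value

def corrected_position(odometry_y: float, slat_index: int, offset_in_slat: float) -> float:
    """Return the corrected along-aisle position from the last detected slat.

    odometry_y     -- position estimated by wheel odometry (used only to
                      report the drift being removed)
    slat_index     -- count of slat edges detected since entering the aisle
    offset_in_slat -- measured distance past the detected slat edge
    """
    absolute_y = AISLE_ORIGIN + slat_index * SLAT_PITCH + offset_in_slat
    drift = odometry_y - absolute_y
    print(f"removed {drift * 1000:.1f} mm of odometry drift")
    return absolute_y

corrected_position(odometry_y=14.32, slat_index=15, offset_in_slat=0.02)
```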
[0071] The line sensors 275 may be any suitable sensors mounted to the autonomous transport vehicle 110 in any suitable location, such as for exemplary purposes only, on the frame 200 disposed adjacent the drive (rear) and driven (front) ends 200E2, 200E1 of the autonomous transport vehicle 110. For exemplary purposes only, the line sensors 275 may be diffuse infrared sensors. The line sensors 275 may be configured to detect guidance lines 900 (see Figs. 9A and 15) provided on, for example, the floor of the transfer decks 130B. The autonomous transport vehicle 110 may be configured to follow the guidance lines when travelling on the transfer decks 130B and defining ends of turns when the vehicle is transitioning on or off the transfer decks 130B. The line sensors 275 may also allow the vehicle 110 to detect index references for determining absolute localization where the index references are generated by crossed guidance lines (see Fig. 9A and 15).
[0072] The case sensors 276 may include case overhang sensors and/or other suitable sensors configured to detect the location/pose of a case unit CU within the payload bed 210B. The case sensors 276 may be any suitable sensors that are positioned on the vehicle so that the sensor(s) field of view(s) span the payload bed 210B adjacent the top surface of the support tines 210AT (see Figs. 4A and 4B). The case sensors 276 may be disposed at the edge of the payload bed 210B (e.g., adjacent a transport opening 1199 of the payload bed 210B) to detect any case units CU that are at least partially extending outside of the payload bed 210B.
[0073] The arm proximity sensors 277 may be mounted to the autonomous transport vehicle 110 in any suitable location, such as for example, on the transfer arm 210A. The arm proximity sensors 277 may be configured to sense objects around the transfer arm 210A and/or support tines 210AT of the transfer arm 210A as the transfer arm 210A is raised/lowered and/or as the support tines 210AT are extended/retracted.
[0074] The laser sensors 271 and ultrasonic sensors 272 may be configured to allow the autonomous transport vehicle 110 to locate itself relative to each case unit forming the load carried by the autonomous transport vehicle 110 before the case units are picked from, for example, the storage shelves 555 and/or lift 150 (or any other location suitable for retrieving payload). The laser sensors 271 and ultrasonic sensors 272 may also allow the vehicle to locate itself relative to empty storage locations 130S for placing case units in those empty storage locations 130S. The laser sensors 271 and ultrasonic sensors 272 may also allow the autonomous transport vehicle 110 to confirm that a storage space (or other load depositing location) is empty before the payload carried by the autonomous transport vehicle 110 is deposited in, for example, the storage space 130S. In one example, the laser sensor 271 may be mounted to the autonomous transport vehicle 110 at a suitable location for detecting edges of items to be transferred to (or from) the autonomous transport vehicle 110. The laser sensor 271 may work in conjunction with, for example, retro-reflective tape (or other suitable reflective surface, coating or material) located at, for example, the back of the shelves 555 to enable the sensor to "see" all the way to the back of the storage shelves 555. The reflective tape located at the back of the storage shelves allows the laser sensor 271 to be substantially unaffected by the color, reflectiveness, roundness, or other suitable characteristics of the items located on the shelves 555. The ultrasonic sensor 272 may be configured to measure a distance from the autonomous transport vehicle 110 to the first item in a predetermined storage area of the shelves 555 to allow the autonomous transport vehicle 110 to determine the picking depth (e.g. the distance the support tines 210AT travel into the shelves 555 for picking the item(s) off of the shelves 555). One or more of the laser sensors 271 and ultrasonic sensors 272 may allow for detection of case orientation (e.g. skewing of cases within the storage shelves 555) by, for example, measuring the distance between the autonomous transport vehicle 110 and a front surface of the case units to be picked as the autonomous transport vehicle 110 comes to a stop adjacent the case units to be picked. The case sensors may allow verification of placement of a case unit on, for example, a storage shelf 555 by, for example, scanning the case unit after it is placed on the shelf.
[0075] Vehicle proximity sensors 278 may also be disposed on the frame 200 for determining the location of the autonomous transport vehicle 110 in the picking aisle 130A and/or relative to lifts 150. The vehicle proximity sensors 278 are located on the autonomous transport vehicle 110 so as to sense targets or position determining features disposed on rails 130AR on which the vehicle 110 travels through the picking aisles 130A (and/or on walls of transfer areas 195 and/or lift 150 access location). The targets on the rails 130AR are located at known positions so as to form incremental or absolute encoders along the rails 130AR. The vehicle proximity sensors 278 sense the targets and provide sensor data to the controller 122 so that the controller 122 determines the position of the autonomous transport vehicle 110 along the picking aisle 130A based on the sensed targets.
[0076] The sensors of the physical characteristic sensing system 270 are communicably coupled to the controller 122 of the autonomous transport vehicle 110. As described herein, the controller 122 is operably connected to the drive section 261D and/or the transfer arm 210A. The controller 122 is configured to determine from the information of the physical characteristic sensor system 270 vehicle pose and location (e.g., in up to six degrees of freedom, X, Y, Z, Rx, Ry, Rz) effecting independent guidance of the autonomous transport vehicle 110 traversing the storage and retrieval facility/system 100. The controller 122 is also configured to determine from the information of the physical characteristic sensor system 270 payload (e.g., case unit CU) pose and location (onboard or off-board the autonomous transport vehicle 110) effecting independent underpick (e.g., lifting of the case unit CU from underneath the case unit CU) and place of the payload CU to and from a storage location 130S and independent underpick and place of the payload CU in the payload bed 210B.
[0077] Referring to Figs. 1A, 1B, 2, 4A, and 4B, as described above, the autonomous transport vehicle 110 includes a supplemental or auxiliary navigation sensor system 288, connected to the frame 200. The supplemental navigation sensor system 288 supplements the physical characteristic sensor system 270. The supplemental navigation sensor system 288 is, at least in part, a vision system 400 with cameras disposed to capture image data informing the at least one of a vehicle navigation pose or location (relative to the storage and retrieval system structure or facility in which the vehicle 110 operates) and payload pose or location (relative to the storage locations or payload bed 210B) that supplements the information of the physical characteristic sensor system 270. It is noted that the term "camera" described herein is a still imaging or video imaging device that includes one or more of a two-dimensional camera, a two-dimensional camera with RGB (red, green, blue) pixels, a three-dimensional camera with XYZ+A definition (where XYZ is the three-dimensional reference frame of the camera and A is one of a radar return strength, a time of flight stamp, or other distance determination stamp/indicator), and an RGB/XYZ camera which includes both RGB and three-dimensional coordinate system information, non-limiting examples of which are provided herein. It should be understood that while the vision system 400 is described herein with respect to the autonomous transport vehicle 110, in other aspects the vision system 400 may be applied to a load handling device 150LHD (Fig. 1 - which may be substantially similar to the payload bed 210B of the autonomous transport vehicle 110) of a vertical lift 150 or a pallet builder of the infeed transfer stations 170. Suitable examples of lift load handling devices in which the vision system 400 may be incorporated are described in United States patent number 10,947,060 titled "Vertical Sequencer for Product Order Fulfilment" and issued on March 16, 2021, the disclosure of which is incorporated herein by reference in its entirety.
[0078] Referring to Figs. 2, 4A, 4B, 25A, 25B, and 25C, the vision system 400 includes one or more of the following: case unit monitoring cameras 410A, 410B (collectively referred to herein as monitoring cameras 410), forward navigation cameras 420A, 420B and rearward navigation cameras 430A, 430B (collectively referred to herein as navigation cameras 430), one or more three-dimensional imaging system 440A, 440B, one or more case edge detection sensors 450A, 450B, one or more traffic monitoring camera 460A, 460B (collectively referred to herein as traffic monitoring cameras 460), and one or more out of plane (e.g., upward or downward facing) localization cameras 477A, 477B (collectively referred to herein as localization cameras 477) (noting the downward facing cameras may supplement the line following sensors 275 of the physical characteristic sensor system 270 and provide a broader field of view than the line following sensors 275 so as to effect guidance/traverse of the vehicle 110 to place the guide lines 900 (see Fig. 9A) back within the field of view of the line following sensors 275 in the event the vehicle path strays from the guide line 900, removing the guide line 900 from the line following sensor 275 field of view). The out of plane localization cameras 477 may be employed with the line following sensors 275 and provide a broader field of view than the line following sensors 275 to place the autonomous transport vehicle 110 back on a followed line if the autonomous transport vehicle 110 strays from the followed line to a point outside the detection area of the line following sensor 275. Images (static images and/or dynamic video images) from the different vision system 400 cameras are requested from the vision system controller 122VC by the controller 122 as desired for any given autonomous transport vehicle 110 task. For example, images are obtained by the controller 122 from at least one or more of the forward and rearward navigation cameras 420A, 420B, 430A, 430B to effect navigation of the autonomous transport vehicle along the transfer deck 130B and picking aisles 130A.
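By way of non-limiting illustration only, the following sketch shows one way a controller could request images only from the cameras relevant to a given task, as described above for navigation, case handling, localization, and deck merging. The task names, camera groupings, and frame-grabbing interface are assumptions of the sketch and not the actual controller interface.

```python
# Illustrative task-driven camera selection (hypothetical task names).
CAMERAS_BY_TASK = {
    "navigate":      ["420A", "420B", "430A", "430B"],                # fore/aft navigation
    "case_handling": ["410A", "410B", "440A", "440B", "450A", "450B"],
    "localize":      ["477A", "477B"],                                 # out of plane cameras
    "deck_merge":    ["460A", "460B"],                                 # traffic monitoring
}

def request_images(task: str, grab_frame) -> dict:
    """Request one frame from each camera assigned to the current task."""
    return {cam: grab_frame(cam) for cam in CAMERAS_BY_TASK.get(task, [])}

# Example with a stand-in frame grabber:
frames = request_images("deck_merge", grab_frame=lambda cam: f"frame_from_{cam}")
print(frames)
```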
[0079] As another example, the controller 122 may obtain images from one or more of the three-dimensional imaging system 440A, 440B, where the case edge detection sensors 450A, 450B, and the case unit monitoring cameras 410A, 410B are employed to effect case handling by the vehicle 110. Case handling includes picking and placing case units from case unit holding locations (such as case unit localization, verification of the case unit, and verification of placement of the case unit in the payload bed 210B and/or at a case unit holding location such as a storage shelf or buffer location).
[0080] Images from the out of plane localization cameras 477A, 477B may be obtained by the controller 122 to effect navigation of the autonomous transport vehicle and/or to provide data (e.g., image data) supplemental to localization/navigation data from the one or more of the forward and rearward navigation cameras 420A, 420B, 430A, 430B. Images from the one or more traffic monitoring camera 460A, 460B may be obtained by the controller 122, where the traffic monitoring cameras 460 are employed to effect travel transitions of the autonomous transport vehicle 110 from a picking aisle 130A to the transfer deck 130B (e.g., entry to the transfer deck 130B and merging of the autonomous transport vehicle 110 with other autonomous transport vehicles travelling along the transfer deck 130B).
[0081] The case unit monitoring cameras 410A, 410B are any suitable high resolution or low resolution video cameras (where video images that include more than about 480 vertical scan lines and are captured at more than about 50 frames/second are considered high resolution). The case unit monitoring cameras 410A, 410B are arranged relative to each other to form a stereo vision camera system that is configured to monitor case unit CU ingress to and egress from the payload bed 210B. The case unit monitoring cameras 410A, 410B are coupled to the frame 200 in any suitable manner and are focused at least on the payload bed 210B. In one or more aspects, the case unit monitoring cameras 410A, 410B are coupled to the transfer arm 210A so as to move in direction LAT with the transfer arm 210A (such as when picking and placing case units CU) and are positioned so as to be focused on the payload bed 210B and support tines 210AT of the transfer arm 210A.
[0082] Referring also to Fig. 5A, the case unit monitoring cameras 410A, 410B effect at least in part one or more of case unit determination, case unit localization, case unit position verification, and verification of the case unit justification features (e.g., justification blades 471 and pushers 470) and case transfer features (e.g., tines 210AT, pullers 472, and payload bed floor 473). For example, the case unit monitoring cameras 410A, 410B detect one or more of case unit length CL, CL1, CL2, CL3, a case unit height CH1, CH2, CH3, and a case unit yaw YW (e.g., relative to the transfer arm 210A extension/retraction direction LAT). The data from the case handling sensors (e.g., noted above) may also provide the location/positions of the pushers 470, pullers 472, and justification blades 471, such as where the payload bed 210B is empty (e.g., not holding a case unit).
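By way of non-limiting illustration only, the following sketch shows how a case unit length CL and yaw YW could be derived from the two lower front-face corners of a case measured by the stereo monitoring cameras in the vehicle frame (X along direction LAT, Y along axis LAX). The corner coordinates are assumptions of the sketch, not measured data.

```python
# Illustrative case length and yaw from two detected front-face corners.
import math

def front_face_metrics(left_corner, right_corner):
    """Return (length, yaw in degrees) of a case front face.

    A yaw of 0 means the front face is parallel to the vehicle longitudinal
    axis LAX (i.e., square to the transfer arm extension direction LAT).
    """
    dx = right_corner[0] - left_corner[0]   # along LAT
    dy = right_corner[1] - left_corner[1]   # along LAX
    length = math.hypot(dx, dy)             # case unit length CL
    yaw_deg = math.degrees(math.atan2(dx, dy))
    return length, yaw_deg

# A case about 0.40 m long, skewed roughly 2.9 degrees toward the vehicle.
print(front_face_metrics((0.61, 0.00), (0.63, 0.40)))
```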
[0083] The case unit monitoring cameras 410A, 410B are also configured to effect, with the vision system controller 122VC, a determination of a front face case center point FFCP (e.g., in the X, Y, and Z directions with the case units disposed on a shelf or other holding area off-board the vehicle 110) relative to a reference location of the autonomous transport vehicle 110. The reference location of the autonomous transport vehicle 110 may be defined by one or more justification surfaces of the payload bed 210B or the centerline CLPB of the payload bed 210B. For example, the front face case center point FFCP may be determined along the longitudinal axis LAX (e.g., in the Y direction) relative to a centerline CLPB of the payload bed 210B (Fig. 4A). The front face case center point FFCP may be determined along the vertical axis VER (e.g., in the Z direction) relative to a case unit support plane PSP of the payload bed 210B (Fig. 4B - formed by one or more of the tines 210AT of the transfer arm 210A and the payload bed floor 473). The front face case center point FFCP may be determined along the lateral axis LAT (e.g., in the X direction) relative to a justification plane surface JPP of the pushers 470 (Fig. 4A). Determination of the front face case center point FFCP of the case units CU located on a storage shelf 555 (see Fig. 5A) or other case unit holding location provides, as non-limiting examples, for localization of the autonomous transport vehicle 110 relative to case units CU to be picked, mapping locations of case units within the storage structure (e.g., such as in a manner similar to that described in United States patent number 9,242,800 issued on January 26, 2016 titled "Storage and retrieval system case unit detection", the disclosure of which is incorporated herein by reference in its entirety), and/or pick and place accuracy relative to other case units on the storage shelf 555 (e.g., so as to maintain predetermined gap sizes between case units). The determination of the front face case center point FFCP also effects a comparison of the "real world" environment in which the autonomous transport vehicle 110 is operating with the virtual model 400VM so that the controller 122 of the autonomous transport vehicle 110 compares what it "sees" with the vision system 400 substantially directly with what the autonomous transport vehicle 110 expects to "see" based on the simulation of the storage and retrieval system structure. Moreover, in one aspect, illustrated in Fig. 5A, the object (case unit) and characteristics determined by the vision system controller 122VC are coapted (combined, overlayed) to the virtual model 400VM enhancing resolution, in up to six degrees of freedom resolution, of the object pose with respect to a facility reference frame. As may be realized, registration of the cameras of the vision system 400 with the facility reference frame allows for enhanced resolution of vehicle 110 pose and/or location with respect to both a global reference (facility features rendered in the virtual model 400VM) and the imaged object. More particularly, object position discrepancies or anomalies apparent and identified upon coapting the object image and virtual model (e.g., edge spacing between case unit fiducial edges or case unit inclination or skew, with respect to the rack slats 520L of the virtual model 400VM), if greater than a predetermined nominal threshold, describe an errant pose of one or more of case, rack, and/or vehicle 110.
Discrimination as to whether the errancy is with the pose/location of the case, the rack, or the vehicle 110 (or more than one of these) is determined via comparison with pose data from the sensors 270 and the supplemental navigation sensor system 288.
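By way of non-limiting illustration only, the following sketch expresses a detected front face case center point FFCP relative to the vehicle reference surfaces named above: the payload bed centerline CLPB (Y direction), the case unit support plane PSP (Z direction), and the pusher justification plane JPP (X direction). All coordinates and reference values are assumptions of the sketch.

```python
# Illustrative FFCP offsets in the vehicle frame (hypothetical values).
from dataclasses import dataclass

@dataclass
class VehicleReferences:
    clpb_y: float   # payload bed centerline CLPB, vehicle Y (along LAX)
    psp_z: float    # case unit support plane PSP, vehicle Z (along VER)
    jpp_x: float    # pusher justification plane JPP, vehicle X (along LAT)

def ffcp_offsets(ffcp_xyz, ref: VehicleReferences) -> dict:
    x, y, z = ffcp_xyz  # FFCP measured in the vehicle frame
    return {
        "lat_from_JPP": x - ref.jpp_x,    # extension distance for the transfer arm
        "lax_from_CLPB": y - ref.clpb_y,  # sideways offset to center on the case
        "ver_from_PSP": z - ref.psp_z,    # lift height for the support tines
    }

print(ffcp_offsets((0.95, 0.12, 0.31),
                   VehicleReferences(clpb_y=0.0, psp_z=0.25, jpp_x=0.30)))
```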
[0084] As an example of the above-noted enhanced resolution, if one case unit disposed on a shelf imaged by the vision system 400 is turned compared to juxtaposed case units on the same shelf (also imaged by the vision system) and to the virtual model 400VM, the vision system 400 may determine the one case is skewed and provide the enhanced case position information to the controller 122 for operating the transfer arm 210A and positioning the transfer arm 210A so as to pick the one case based on the enhanced resolution of the case pose and location. As another example, if the edge of a case is offset from a slat 520L (see
Figs. 5A-5C) edge by more than a predetermined threshold, the vision system 400 may generate a position error for the case; noting that if the offset is within the threshold, the supplemental information from the supplemental navigation sensor system 288 enhances the pose/location resolution (e.g., an offset substantially equal to the determined pose/location of the case with respect to the slat 520L and vehicle 110 payload bed 210B transfer arm 210A frame). It is further noted that if only one case is skewed/offset relative to the slat 520L edges the vision system may generate the case position error; however, if two or more juxtaposed cases are determined to be skewed relative to the slat 520L edges the vision system may generate a vehicle 110 pose error and effect repositioning of the vehicle 110 (e.g., correct the position of the vehicle 110 based on an offset determined from the supplemental navigation sensor system 288 supplemental information) or a service message to an operator (e.g., where the vision system 400 effects a "dash cam" collaborative mode (as described herein) that provides for remote control of the vehicle 110 by an operator with images (still and/or real time video) from the vision system being conveyed to the operator to effect the remote control operation). The vehicle 110 may be stopped (e.g., does not traverse the picking aisle 130A or transfer deck 130B) until the operator initiates remote control of the vehicle 110.
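By way of non-limiting illustration only, the following sketch implements the discrimination just described: case edge offsets from the virtual-model slat edges within the threshold are used to enhance pose/location resolution, a single errant case produces a case position error, and several juxtaposed errant cases are attributed to an errant vehicle pose. The threshold and offset values are assumptions of the sketch.

```python
# Illustrative case/vehicle pose error discrimination (assumed threshold).
OFFSET_THRESHOLD = 0.015  # metres; assumed predetermined nominal threshold

def discriminate(case_edge_offsets: dict) -> str:
    errant = [c for c, off in case_edge_offsets.items() if abs(off) > OFFSET_THRESHOLD]
    if not errant:
        return "within tolerance: use offsets to enhance pose/location resolution"
    if len(errant) == 1:
        return f"case position error for {errant[0]}"
    # Several juxtaposed cases apparently skewed -> suspect the vehicle pose.
    return "vehicle pose error: reposition vehicle or hand off to remote operator"

print(discriminate({"CU1": 0.004, "CU2": 0.006}))   # within tolerance
print(discriminate({"CU1": 0.031, "CU2": 0.002}))   # single case error
print(discriminate({"CU1": 0.028, "CU2": 0.034}))   # vehicle pose error
```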
[0085] The case unit monitoring cameras 410A, 410B may also provide feedback with respect to the positions of the case unit justification features and case transfer features of the autonomous transport vehicle 110 prior to and/or after picking/placing a case unit from, for example, a storage shelf or other holding locations (e.g., for verifying the locations/positions of the justification features and the case transfer features so as to effect pick/place of the case unit with the transfer arm 210A without transfer arm obstruction). For example, as noted above, the case unit monitoring cameras 410A, 410B have a field of view that encompasses the payload bed 210B. The vision system controller 122VC is configured to receive sensor data from the case unit monitoring cameras 410A, 410B and determine, with any suitable image recognition algorithms stored in a memory of or accessible by the vision system controller 122VC, positions of the pushers 470, justification blades 471, pullers 472, tines 210AT, and/or any other features of the payload bed 210B that engage a case unit held on the payload bed 210B. The positions of the pushers 470, justification blades 471, pullers 472, tines 210AT, and/or any other features of the payload bed 210B may be employed by the controller 122 to verify a respective position of the pushers 470, justification blades 471, pullers 472, tines 210AT, and/or any other features of the payload bed 210B as determined by motor encoders or other respective position sensors; while in some aspects the positions determined by the vision system controller 122VC may be employed as a redundancy in the event of encoder/position sensor malfunction.
[0086] The justification position of the case unit CU within the payload bed 210B may also be verified by the case unit monitoring cameras 410A, 410B. For example, referring also to Fig. 4C, the vision system controller 122VC is configured to receive sensor data from the case unit monitoring cameras 410A, 410B and determine, with any suitable image recognition algorithms stored in a memory of or accessible by the vision system controller 122VC, a position of the case unit in the X, Y, Z directions relative to, for example, one or more of the centerline CLPB of the payload bed 210B, a reference/home position of the justification plane surface JPP of the pushers 470, and the case unit support plane PSP. Here, position determination of the case unit CU within the payload bed 210B effects at least place accuracy relative to other case units on the storage shelf 555 (e.g., so as to maintain predetermined gap sizes between case units).
[0087] Referring to Figs. 2, 4A, 4B, 6, 7A, 7B, and 8, the one or more three-dimensional imaging system 440A, 440B includes any suitable three-dimensional imager(s) including but not limited to, e.g., time-of-flight cameras, imaging radar systems, light detection and ranging (LIDAR), etc. The one or more three-dimensional imaging system 440A, 440B may effect, with the vision system controller 122VC, a determination of a size (e.g., height and width) of the front face (i.e., the front face surface) of a case unit CU and front face case center point FFCP (e.g., in the X, Y, and Z directions) relative to a reference location of the autonomous transport vehicle 110 and invariant of a shelf supporting the case unit CU (e.g., the one or more three-dimensional imaging system 440A, 440B effects case unit CU location without reference to the shelf supporting the case unit CU and effects a determination as to whether the case unit is supported on a shelf through a determination of a shelf invariant characteristic of the case units). Here, the determination of the front face surface and case center point FFCP also effects a comparison of the "real world" environment in which the autonomous transport vehicle 110 is operating with the virtual model 400VM so that the controller 122 of the autonomous transport vehicle 110 compares what it "sees" with the vision system 400 substantially directly with what the autonomous transport vehicle 110 expects to "see" based on the simulation of the storage and retrieval system structure. The image data obtained from the one or more three-dimensional imaging system 440A, 440B may supplement the image data from the cameras 410A, 410B in the event data from the cameras 410A, 410B is incomplete or missing.
[0088] As illustrated in Fig. 6, the one or more three-dimensional imaging system 440A, 440B has a respective field of view that extends past the payload bed 210B substantially in direction LAT so that each three-dimensional imaging system 440A, 440B is disposed to sense case units CU adjacent to but external of the payload bed 210B (such as case units CU arranged so as to extend in one or more rows along a length of a picking aisle 130A (see Fig. 5A) or substrate buffer/transfer stations (similar in configuration to storage racks 599 and shelves 555 thereof disposed along the picking aisles 130A) arranged along the transfer deck 130B). The field of view 440AF, 440BF of each three-dimensional imaging system 440A, 440B encompasses a volume of space 440AV, 440BV that extends a height 670 of a pick range of the autonomous transport vehicle 110 (e.g., a range/height in direction VER - see Fig. 8 - in which the arm 210A can move to pick/place case units to a shelf 555 or stacked shelves accessible from a common rolling surface 284 (e.g., of the transfer deck 130B or picking aisle 130A - see Fig. 2) on which the autonomous transport vehicle 110 rides). Here, as can be seen in Fig. 8, the one or more three-dimensional imaging system 440A, 440B provides sensor data to the vision system controller 122VC that embodies at least the front face surfaces 800A, 800B, 800C of case units CU1, CU2, CU3, where such front face surface detection is detected/determined without reference to and regardless of the presence of a shelf supporting the case units. The vision system controller 122VC determines if the case unit CU detected is disposed on a shelf with other case units through a determination of a shelf invariant characteristic common to each case unit disposed on the same shelf. Here, for case units CU with substantially vertically orientated faces, extraction of a front face normal vector (e.g., such as by planar fit) and a bottom edge of the front face (e.g., such as by region edge detection) provides for a planar equation for the shelf in the autonomous transport vehicle coordinate system X, Y, Z.
[0089] As can be seen in Figs. 7A and 7B, a case unit sitting/seated on a shelf 555 has a front face or front face surface 800 that is visible to the one or more three-dimensional imaging system 440A, 440B (and to the case unit monitoring cameras 410A, 410B). From the detected front face surface 800 the vision system controller 122VC determines a front face normal vector N that is normal to the front face surface 800. Also from the detected front face surface 800, the vision system controller 122VC (with any suitable image processing algorithms thereof) determines the bottom edge 777 (and vector B thereof) of the front face surface 800, where a shelf invariant characteristic of the case unit CU is derived from the front face normal vector N and the bottom edge 777. For example, an UP or Z axis vector U can be determined from the cross product of vectors N and B as follows:
[0090] U = N x B [eq. 1]
[0091] A center point P of the bottom edge 777 is determined by the vision system controller 122VC (with any suitable image processing algorithms thereof) and a scalar equation of a plane (that represents the bottom surface of the case unit CU seated on the shelf 555) can be written as follows:
[0092] d = U · P [eq. 2]
[0093] Where (U, d) is the shelf invariant characteristic that is common to any case unit seated on the same shelf 555 (e.g., any case unit seated on the same shelf has the same shelf invariant feature vector within a predetermined tolerance). Here, the vision system controller 122VC can determine whether the case units CU1, CU2, CU3 (see Fig. 8) are disposed on the same shelf by scanning of case units CU1, CU2, CU3 with at least the one or more three-dimensional imaging system 440A, 440B and determining the shelf invariant characteristic. The determination of the shelf invariant characteristic may effect, at least in part, comparison between what the vision system 400 of the autonomous transport vehicle 110 "sees" substantially directly with what the autonomous transport vehicle 110 expects to "see" based on the simulation of the storage and retrieval system structure. Determination of the shelf invariant characteristic may also effect placement of case units on the plane of the shelf 555 as determined from the shelf invariant characteristic.
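By way of a non-limiting illustration only (not part of the patent disclosure), the shelf invariant computation of equations 1 and 2 may be sketched as follows, assuming the front face normal N, the bottom edge direction B, and the bottom edge center point P have already been extracted (e.g., by planar fit and region edge detection) in the vehicle coordinate system X, Y, Z; the function names and tolerances are illustrative assumptions:

    import numpy as np

    def shelf_invariant(face_normal, bottom_edge_direction, bottom_edge_center):
        # Unit front face normal N and unit bottom edge direction B.
        n = face_normal / np.linalg.norm(face_normal)
        b = bottom_edge_direction / np.linalg.norm(bottom_edge_direction)
        u = np.cross(n, b)                         # eq. 1: U = N x B (up/Z axis vector)
        u = u / np.linalg.norm(u)
        d = float(np.dot(u, bottom_edge_center))   # eq. 2: d = U . P (plane offset)
        return u, d

    def on_same_shelf(invariant_a, invariant_b, angle_tol=0.05, offset_tol=0.01):
        # Two case units share a shelf when their (U, d) pairs agree within tolerance.
        (u1, d1), (u2, d2) = invariant_a, invariant_b
        return float(np.dot(u1, u2)) > 1.0 - angle_tol and abs(d1 - d2) < offset_tol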
[0094] Referring to Figs. 2 and 4A, the forward navigation cameras 420A, 420B are any suitable cameras configured to provide object detection and ranging. The forward navigation cameras 420A, 420B may be placed on opposite sides of the longitudinal centerline LAXCL of the autonomous transport vehicle 110 and spaced apart by any suitable distance so that the forward facing fields of view 420AF, 420BF provide the autonomous transport vehicle 110 with stereo vision. The forward navigation cameras 420A, 420B are any suitable high resolution or low resolution video cameras (where video images that include more than about 480 vertical scan lines and are captured at more than about 50 frames/second are considered high resolution), time-of-flight cameras, laser ranging cameras, or any other suitable cameras configured to provide object detection and ranging for effecting autonomous vehicle traverse along the transfer deck 130B and picking aisles 130A. The rearward navigation cameras 430A, 430B may be substantially similar to the forward navigation cameras. The forward navigation cameras 420A, 420B and the rearward navigation cameras 430A, 430B provide for autonomous transport vehicle 110 navigation with obstacle detection and avoidance (with either end 200E1 of the autonomous transport vehicle 110 leading a direction of travel or trailing the direction of travel) as well as localization of the autonomous transport vehicle within the storage and retrieval system 100. Localization of the autonomous transport vehicle 110 may be effected by one or more of the forward navigation cameras 420A, 420B and the rearward navigation cameras 430A, 430B by detection of lines 900 on the travel/rolling surface 284 and/or by detection of suitable storage structure, including but not limited to storage rack (or other) structure 999. The line detection and/or storage structure detection may be compared to floor maps and structure information (e.g., stored in a memory of, or accessible by, the vision system controller 122VC). The forward navigation cameras 420A, 420B and the rearward navigation cameras 430A, 430B may also send signals to the controller 122 (inclusive of or through the vision system controller 122VC) so that as objects approach the autonomous transport vehicle 110 (with the autonomous transport vehicle 110 stopped or in motion) the autonomous transport vehicle 110 may be maneuvered (e.g., on the undeterministic rolling surface of the transfer deck 130B or within the picking aisle 130A (which may have a deterministic or undeterministic rolling surface)) to avoid the approaching object (e.g., another autonomous transport vehicle, case unit, or other transient object within the storage and retrieval system 100).
[0095] The forward navigation cameras 420A, 420B and the rearward navigation cameras 430A, 430B may also provide for convoys of vehicles 110 along the picking aisles 130A or transfer deck 130B, where one vehicle 110 follows another vehicle 110 at predetermined fixed distances. As an example, Fig. 1B illustrates a three vehicle 110 convoy where one vehicle closely follows another vehicle at the predetermined fixed distance.
[0096] Still referring to Figs. 2 and 4A, the one or more case edge detection sensors 450A, 450B are any suitable sensors such as laser measurement sensors configured to scan the shelves of the storage and retrieval system to verify the shelves are clear for placing case units CU, or to verify a case unit size and position before picking the case unit CU. While one case edge detection sensor 450A, 450B is illustrated on each side of the payload bed 210B centerline CLPB (see Fig. 4A) there may be more or fewer than two case edge detection sensors placed at any suitable locations on the autonomous transport vehicle 110 so that the vehicle 110 can traverse by and scan case units CU with the front end 200E1 leading a direction of vehicle travel or the rear/back end 200E2 leading the direction of vehicle travel.
[0097] The one or more traffic monitoring cameras 460A, 460B are disposed on the frame 200 so that a respective field of view 460AF, 460BF faces laterally in lateral direction LAT1. While the one or more traffic monitoring cameras 460A, 460B are illustrated as being adjacent a transfer opening 1199 of the payload bed 210B (e.g., on the pick side from which the arm 210A of the autonomous transport vehicle 110 extends), in other aspects there may be traffic monitoring cameras disposed on the non-pick side of the frame 200 so that a field of view of the traffic monitoring cameras faces laterally in direction LAT2. The traffic monitoring cameras 460A, 460B provide for an autonomous merging of autonomous transport vehicles 110 exiting, for example, a picking aisle 130A or lift transfer area 195 onto the transfer deck 130B (see Fig. 1B). For example, the autonomous transport vehicle 110 leaving the lift transfer area 195 (Fig. 1B) detects autonomous transport vehicle 110T travelling along the transfer deck 130B. Here, the controller 122 autonomously strategizes merging (e.g., entering the transfer deck in front of or behind the autonomous transport vehicle 110T, acceleration onto the transfer deck based on a speed of the approaching vehicle 110T, etc.) onto the transfer deck based on information (e.g., distance, speed, etc.) of the vehicle 110T gathered by the traffic monitoring cameras 460A, 460B and communicated to and processed by the vision system controller 122VC.
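For illustration only (the decision criteria and names below are assumptions, not part of the disclosure), the merge strategizing described above may be sketched as a simple gap-acceptance check on the approaching vehicle's distance and speed as reported by the traffic monitoring cameras:

    def plan_merge(approach_distance_m, approach_speed_mps, merge_time_s=2.0, margin_s=1.0):
        # Decide whether the exiting vehicle enters the transfer deck ahead of or
        # behind the approaching vehicle, or waits at the aisle/lift transfer area exit.
        if approach_speed_mps <= 0.0:
            return "merge_ahead"                  # approaching vehicle stopped or receding
        time_to_arrival = approach_distance_m / approach_speed_mps
        if time_to_arrival > merge_time_s + margin_s:
            return "merge_ahead"                  # gap is large enough to enter in front
        if time_to_arrival < merge_time_s:
            return "merge_behind"                 # let the deck vehicle pass, then enter
        return "wait"                             # marginal gap; hold position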
[0098] The one or more out of plane (e.g., upward or downward facing) localization cameras 477A, 477B are disposed on the frame 200 of the autonomous transport vehicle 110 so as to sense/detect location fiducials (e.g., location marks 971, lines 900, etc.) disposed on a ceiling 991 of the storage and retrieval system or on the rolling surface 284 of the storage and retrieval system. The location fiducials have known locations within the storage and retrieval system and may provide unique identification marks/patterns that are recognized by the vision system controller 122VC (e.g., processing data obtained from the localization cameras 477A, 477B). Based on the location fiducial detected, the vision system controller 122VC compares the detected location fiducial to known location fiducials (e.g., stored in a memory of or accessible to the vision system controller 122VC) to determine a location of the autonomous transport vehicle 110 within the storage structure 130.
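A minimal sketch of the fiducial lookup described above follows; the map contents and helper names are illustrative assumptions (the patent leaves the comparison method open):

    import numpy as np

    # Known fiducial positions in storage structure coordinates, e.g., id -> (x, y).
    KNOWN_FIDUCIALS = {
        "mark_A1": np.array([12.5, 3.0]),
        "mark_A2": np.array([18.0, 3.0]),
    }

    def localize_from_fiducial(fiducial_id, offset_in_vehicle_frame):
        # offset_in_vehicle_frame: (dx, dy) of the detected fiducial relative to the
        # vehicle, derived from the calibrated localization camera 477A, 477B image.
        if fiducial_id not in KNOWN_FIDUCIALS:
            return None  # unrecognized mark; rely on other localization sources
        return KNOWN_FIDUCIALS[fiducial_id] - np.asarray(offset_in_vehicle_frame)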
[0099] The cameras of the supplemental navigation sensor system 288 may be calibrated in any suitable manner (such as by, e.g., an intrinsic and extrinsic camera calibration) to effect sensing of case units CU, storage structure (e.g., shelves, columns, etc.), and other structural features of the storage and retrieval system. Referring to Figs. 4A, 4B, 5A, 5B, and 5C, known objects, such as case units CU1, CU2, CU3 or storage system structure (e.g., having known physical characteristics such as shape, size, etc.), may be placed within the field of view of a camera (or the vehicle 110 may be positioned so that the known objects are within the field of view of the camera) of the supplemental navigation sensor system 288. These known objects may be imaged by the camera from several angles/view points to calibrate each camera so that the vision system controller 122VC is configured to detect the known objects based on sensor signals from the calibrated camera.
[0100] For example, calibration of case unit monitoring cameras 410A, 410B will be described with respect to case units CU1, CU2, CU3 having known physical characteristics/parameters. Figs. 5A-5C are exemplary images captured from one of the case unit monitoring cameras 410A, 410B from, for exemplary purposes, three different view points. Here, physical characteristics/parameters (e.g., shape, length, width, height, etc.) of the case units CU1, CU2, CU3 are known by the vision system controller 122VC (e.g., the physical characteristics of the different case units CU1, CU2, CU3 are stored in a memory of or accessible to the vision system controller 122VC). Based on the, for example, three (or more) different view points of the case units CU1, CU2, CU3 in the images of Figs. 5A-5C, the vision system controller 122VC is provided with intrinsic and extrinsic camera and case unit parameters that effect calibration of the case unit monitoring camera 410A, 410B. For example, from the images the vision system controller 122VC registers (e.g., stores in memory) a perspective of the case units CU1, CU2, CU3 relative to the case unit monitoring camera 410A, 410B. The vision system controller 122VC estimates the pose of the case units CU1, CU2, CU3 relative to the case unit monitoring camera 410A, 410B and estimates the pose of the case units CU1, CU2, CU3 relative to each other. The pose estimates PE of the respective case units CU1, CU2, CU3 are illustrated in Figs. 5A-5C as being overlaid on the respective case units CU1, CU2, CU3.
[0101] The vehicle 110 is moved so that any suitable number of view points of the case units CU1, CU2, CU3 are obtained/imaged by the case unit monitoring camera 410A, 410B to effect a convergence of the case unit characteristics/parameters (e.g., estimated by the vision system controller 122VC) for each of the known case units CU1, CU2, CU3. Upon convergence of the case unit parameters, the case unit monitoring camera 410A, 410B is calibrated. The calibration process is repeated for the other case unit monitoring camera 410A, 410B. With both of the case unit monitoring cameras 410A, 410B calibrated, the vision system controller 122VC is configured with three-dimensional rays for each pixel in each of the case unit monitoring cameras 410A, 410B as well as an estimate of the three-dimensional baseline line segment separating the cameras and the relative pose of the case unit monitoring cameras 410A, 410B relative to each other. The vision system controller 122VC is configured to employ the three-dimensional rays for each pixel in each of the case unit monitoring cameras 410A, 410B, the estimate of the three-dimensional baseline line segment separating the cameras, and the relative pose of the case unit monitoring cameras 410A, 410B relative to each other so that the case unit monitoring cameras 410A, 410B form a passive stereo vision sensor such as where there are common features visible within the fields of view 410AF, 410BF of the case unit monitoring cameras 410A, 410B. As noted above, the calibration of the case unit monitoring cameras 410A, 410B was described with respect to case units CU1, CU2, CU3 but may be performed with respect to any suitable structure (e.g., permanent or transient) of the storage and retrieval system 100 in a substantially similar manner.
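As a hedged illustration only (OpenCV is used here as one possible toolset; the patent does not prescribe it), the multi-viewpoint calibration from a known case unit could be sketched as follows, where the 3D case corners come from the stored case dimensions and the 2D corners are detected in each viewpoint image:

    import numpy as np
    import cv2

    def calibrate_from_known_case(case_corners_3d, image_corners_per_view, image_size):
        # case_corners_3d: (N, 3) corner coordinates of the known case unit (from its
        # stored shape/size). image_corners_per_view: list of (N, 2) detected corners,
        # one array per viewpoint. image_size: (width, height) in pixels.
        object_points = [case_corners_3d.astype(np.float32)] * len(image_corners_per_view)
        image_points = [c.astype(np.float32) for c in image_corners_per_view]
        rms, camera_matrix, dist_coeffs, rvecs, tvecs = cv2.calibrateCamera(
            object_points, image_points, image_size, None, None)
        # rvecs/tvecs are the per-viewpoint pose estimates of the case relative to the
        # camera; a small rms reprojection error indicates the parameters have converged.
        return camera_matrix, dist_coeffs, list(zip(rvecs, tvecs)), rms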
[0102] As may be realized, vehicle localization (e.g., positioning of the vehicle at a predetermined location along a picking aisle 130A or along the transfer deck 130B relative to a pick/place location) effected by the physical characteristic sensor system 270 may be enhanced with the pixel level position determination effected by the supplemental navigation sensor system 288. Here, the controller 122 is configured to what may be referred to as "grossly" locate the vehicle 110 relative to a pick/place location by employing one or more sensors of the physical characteristic sensor system 270. The controller 122 is configured to employ the supplemental (e.g., pixel level) position information obtained from the vision system controller 122VC of the supplemental navigation sensor system 288 to what may be referred to as "fine tune" the vehicle pose and location relative to the pick/place location so that positioning of the vehicle 110 and case units CU placed to storage locations 130S by the vehicle 110 may be held to smaller tolerances (i.e., increased position accuracy) compared to positioning of the vehicle 110 or case units CU with the physical characteristic sensor system 270 alone. Here, the pixel level positioning provided by the supplemental navigation sensor system 288 has a higher positioning definition/resolution than the electro-magnetic sensor resolution provided by the physical characteristic sensor system 270.
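Purely as an illustrative sketch (the clamp value and names are assumptions), the gross-then-fine positioning described above amounts to correcting a coarse pose from the physical characteristic sensor system with a bounded pixel-level correction from the vision system:

    import numpy as np

    def refine_vehicle_pose(coarse_pose_xy_yaw, vision_correction_xy_yaw, max_step=0.05):
        # coarse_pose_xy_yaw: pose from the physical characteristic sensor system 270.
        # vision_correction_xy_yaw: residual measured by the supplemental navigation
        # sensor system 288 between expected and observed features.
        correction = np.clip(np.asarray(vision_correction_xy_yaw), -max_step, max_step)
        return np.asarray(coarse_pose_xy_yaw) + correction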
[0103] In aspects where the case units may be dimly lit, lighting sources may be provided on the vehicle 110 to illuminate the case units (or other structure) to effect the calibration of the cameras in the manner noted above. The lighting may be a diffuse lighting or the lighting may have a known pattern(s) that is projected on the surface(s) of the case units (or other structure) so that the case unit (or other structure) parameters may be extracted from the images and convergence of the case unit (or other structure) parameters is obtained by the vision system controller 122VC. Suitable markers (e.g., calibration stickers located at known locations on the case units or other structure) may also be placed on the case units/structure to facilitate feature extraction from the images obtained by the case unit monitoring cameras 410A, 410B and effect calibration of the case unit monitoring cameras 410A, 410B. Calibration of the other cameras (e.g., the forward and rearward navigation cameras 420A, 420B, 430A, 430B, the traffic monitoring camera(s) 460A, 460B, and the out of plane localization camera(s) 477A, 477B, etc.) of the supplemental navigation sensor system 288 may be effected in a manner similar to that described above.

[0104] Referring to Figs. 1A, 2, 4A, 4B, and 11, the vision system controller 122VC of the autonomous transport vehicle 110 is configured to dynamically select and access information from different sensors (or groups of sensors) from the supplemental navigation sensor system 288 depending on vehicle 110 operation. Fig. 11 is an illustration showing non-exhaustive sensor groupings 1111-1114 and associated non-exhaustive vehicle operations in which the sensor groups may be accessed by the vision system controller 122VC to effect that vehicle operation. Exemplary sensor group 1111 includes the rear navigation cameras 430A, 430B. Exemplary sensor group 1112 includes the forward navigation cameras 420A, 420B. Exemplary sensor group 1113 includes the out of plane cameras 477A, 477B. Exemplary sensor group 1114 includes the case unit monitoring cameras 410A, 410B. For exemplary purposes only, sensor groups 1111, 1113 may be employed by the vision system controller 122VC (and controller 122) for vehicle operations where the rear end 200E2 of the vehicle 110 leads a direction of vehicle travel (e.g., backward travel on the transfer deck 130B). The sensor groups 1112, 1113 may be employed by the vision system controller 122VC (and controller 122) for vehicle operations where the front end 200E1 of the vehicle 110 leads a direction of vehicle travel (e.g., forward travel on the transfer deck 130B). The sensor groups 1112, 1114 may be employed by the vision system controller 122VC (and controller 122) for vehicle operations where the front end 200E1 of the vehicle 110 leads a direction of vehicle travel (e.g., forward travel along a picking aisle 130A). The sensor groups 1111, 1114 may be employed by the vision system controller 122VC (and controller 122) for vehicle operations where the rear end 200E2 of the vehicle 110 leads a direction of vehicle travel (e.g., backward travel along a picking aisle 130A). The sensor group 1114 may be employed by the vision system controller 122VC (and controller 122) for vehicle operations where the transfer arm 210A loads or unloads a case unit CU to or from the payload bed 210B (e.g., pick/place operations).
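The grouping logic of Fig. 11 may be illustrated with a simple lookup (the operation labels and camera identifiers below are assumptions used only for this sketch):

    SENSOR_GROUPS = {
        1111: ("rear_nav_cam_A", "rear_nav_cam_B"),
        1112: ("fwd_nav_cam_A", "fwd_nav_cam_B"),
        1113: ("out_of_plane_cam_A", "out_of_plane_cam_B"),
        1114: ("case_monitor_cam_A", "case_monitor_cam_B"),
    }

    OPERATION_TO_GROUPS = {
        "deck_travel_rear_leading":   (1111, 1113),
        "deck_travel_front_leading":  (1112, 1113),
        "aisle_travel_front_leading": (1112, 1114),
        "aisle_travel_rear_leading":  (1111, 1114),
        "pick_place":                 (1114,),
    }

    def active_sensors(operation):
        # Return the camera identifiers accessed for the given vehicle operation.
        return [cam for group in OPERATION_TO_GROUPS[operation]
                for cam in SENSOR_GROUPS[group]]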
[0105] Referring also to Figs. 1A, 1B, 2, 4D, 17, and 18, as described above, the autonomous transport vehicle 110 includes the supplemental hazard sensor system 290. The supplemental hazard sensor system 290 is connected to the frame 200 of the autonomous transport vehicle 110 to provide the bot with operational control of the autonomous transport vehicle 110 in collaboration with an operator. The supplemental hazard sensor system 290 provides data (images) to the vision system 400. The vision system data is registered by the vision system controller 122VC, which a) determines information characteristics (in turn provided to the controller 122), or b) passes the information to the controller 122 without being characterized (object within predetermined criteria), with characterization done by the controller 122. In either a) or b) it is the controller 122 that determines selection to switch to the collaborative state. After switching, the collaborative operation is effected by a user accessing the supplemental hazard sensor system 290 via the vision system controller 122VC and/or the controller 122. In its simplest form, however, the supplemental hazard sensor system 290 may be considered as providing a collaborative mode of operation of the autonomous transport vehicle 110. The supplemental hazard sensor system 290 supplements the autonomous navigation/operation sensor system 270 and/or the supplemental sensor system 298, with the supplemental hazard sensor system 290 configured to effect collaborative discriminating and mitigation of objects/hazards, e.g., encroaching upon the travel/rolling surface 284. The supplemental hazard sensor system 290 forms, at least in part, the vision system 400 and includes at least one camera 292. It is noted that the term "camera" described herein is a still imaging or video imaging device that includes one or more of a two-dimensional camera, a two dimensional camera with RGB (red, green, blue) pixels, a three-dimensional camera with XYZ+A definition (where XYZ is the three-dimensional reference frame of the camera and A is a radar return strength or time-of-flight stamp), and an RGB/XYZ camera which includes both RGB and three-dimensional coordinate system information, non-limiting examples of which are provided herein. The at least one camera 292 of the vision system 400 is disposed to capture image data informing objects and/or spatial features 299 (having intrinsic physical characteristics) within at least a portion of the facility 100 viewed by the at least one camera 292 with the autonomous transport vehicle 110 in the different positions in the facility 100 while executing autonomous navigation and transfer tasks. As may be realized, the at least one camera 292 is illustrated in Fig. 4D, for exemplary purposes only, as being separate and distinct from the cameras illustrated in Fig. 4A; however, the at least one camera 292 may be part of the system illustrated in Fig. 4A (e.g., camera 292 on end 200E1 of the vehicle 110 may be camera 477A in Fig. 4A; camera 292 on end 200E2 of the vehicle 110 may be camera 477B in Fig. 4A; and cameras 292 facing laterally in direction LAT1 in Fig. 4D may be cameras 460A, 460B in Fig. 4A).
[0106] As noted above, the vision system 400 includes the at least one camera 292. It is noted that although the aspects of the present disclosure are described with respect to a forward facing camera (i.e., a camera that faces in the direction of travel with the end 200E1 of the autonomous transport vehicle 110 leading), the camera(s) may be positioned to face in any direction (rearward, sideways, up, down, etc.) for up to 360° monitoring about the autonomous transport vehicle 110. The at least one camera 292 may be placed on the longitudinal centerline LAXCL, on either side of the longitudinal centerline LAXCL, more than one camera 292 may be placed on opposite sides of the longitudinal centerline LAXCL of the autonomous transport vehicle 110 so that the field of view 292F provides the autonomous transport vehicle 110 with stereo vision (e.g., such as cameras 420A, 420B), or any other suitable configuration. The at least one camera 292 is any suitable camera configured to provide object or spatial feature 299 detection. For example, the at least one camera 292 is any suitable high resolution or low resolution video camera, a 3D imaging system, time-of-flight camera, laser ranging camera, or any other suitable camera configured to provide detection of the object or spatial feature 299 within at least a portion of the facility 100 viewed by the at least one camera 292 with the autonomous transport vehicle 110 in the different positions in the facility 100 while executing autonomous navigation and transfer tasks. The at least one camera 292 provides for imaging and detection (with either end 200E1, 200E2 of the autonomous transport vehicle 110 leading a direction of travel or trailing the direction of travel). The object or spatial feature 299 detection may be compared to reference floor maps and structure information (e.g., stored in a memory of, or accessible by, the vision system controller 122VC). The at least one camera 292 may also send signals to the controller 122 (inclusive of or through the vision system controller 122VC) so that as the autonomous transport vehicle 110 approaches the object or spatial feature 299, the autonomous transport vehicle 110 initiates an autonomous stop (i.e., in an autonomous operation state) or may enter a collaborative operation state so as to be stopped by an operator or maneuvered (e.g., on the undeterministic rolling surface of the transfer deck 130B or within the picking aisle 130A, which may have a deterministic or undeterministic rolling surface) by the operator in order to identify the object or spatial feature 299 (e.g., another malfunctioning autonomous transport vehicle, dropped case unit, debris, spill, or other transient object within the storage and retrieval system 100).
[0107] The camera(s) 292 of the supplemental hazard sensor system 290 may be calibrated in any suitable manner (such as by, e.g., an intrinsic and extrinsic camera calibration) to effect sensing/detection of the objects or spatial features 299 in the storage and retrieval system 100. Referring to Figs. 4D and 5B-5C, known objects, such as case units CU1, CU2, CU3 or storage system structure (e.g., having known physical characteristics such as shape, size, etc.), may be placed within the field of view 292F of a camera 292 (or the autonomous transport vehicle 110 may be positioned so that the known objects are within the field of view 292F of the camera 292) of the supplemental hazard sensor system 290. These known objects may be imaged by the camera 292 from several angles/view points to calibrate each camera so that the vision system controller 122VC is configured to determine when an "unknown" (i.e., unidentifiable) object based on sensor signals from the calibrated camera is within the field of view 292F.
[0108] For example, calibration of the camera(s) 292 will be described with respect to case units CU1, CU2, CU3 having known physical characteristics/parameters. Figs. 5B and 5C are exemplary images captured from the camera(s) 292 from, for exemplary purposes, two different view points. Here, physical characteristics/parameters (e.g., shape, length, width, height, etc.) of the case units CU1, CU2, CU3 are stored so as to be "known" (i.e., identifiable) by the vision system controller 122VC (e.g., the physical characteristics of the different case units CU1, CU2, CU3 are stored in a memory of or accessible to the vision system controller 122VC). Based on the, for example, two (or more) different view points of the case units CU1, CU2, CU3 in the images of Figs. 5B and 5C, the vision system controller 122VC is provided with intrinsic and extrinsic camera and case unit parameters that effect calibration of the camera(s) 292.
[0109] The autonomous transport vehicle 110 is moved so that any suitable number of view points of the case units CU1, CU2, CU3 are obtained/imaged by the camera(s) 292 to effect a convergence of the case unit characteristics/parameters (e.g., estimated by the vision system controller 122VC) for each of the known case units CU1, CU2, CU3. Upon convergence of the case unit parameters, the camera(s) 292 is calibrated. With the camera(s) 292 calibrated, the vision system controller 122VC is configured with three-dimensional rays for each pixel in each of the camera(s) 292. As noted above, the calibration of the camera(s) 292 was described with respect to case units CU1, CU2, CU3 but may be performed with respect to any suitable structure (e.g., permanent or transient) of the storage and retrieval system 100 in a substantially similar manner.
[0110] As may be realized, where the autonomous transport vehicle 110 (that in one aspect is a payload/case transport and/or transfer robot) autonomously travels along a picking aisle 130A or along the transfer deck 130B, the autonomous transport vehicle 110 may opportunistically detect (incidental or peripheral to predetermined autonomous tasks, e.g., autonomous picking/placing payload at storage, travel to transfer station and/or charge station for autonomous payload pick/place/transfer at the transfer station, and/or autonomous charging at the charge station) other objects within the facility 100 (e.g., other bots, dropped case units, spills, debris, etc.). The vision system controller 122VC is configured to employ the supplemental navigation sensor system 288 and/or the supplemental hazard sensor system 290 (i.e., imaging information obtained from the cameras of one or more of the supplemental sensor systems) to determine whether the objects are "unknown" (i.e., whether the objects or spatial features 299 are not expected to be within an area or space along the autonomous travel path of the autonomous transport vehicle 110).
[0111] Referring to Figs. 1A, 2, 4A, 4B, 4D, 10A, and 10B, the vision system 400 of the supplemental navigation sensor system 288 and/or supplemental hazard sensor system 290 configures the autonomous transport vehicle 110 with a virtual model 400VM of an operating environment 401 in which the autonomous transport vehicle 110 operates. For example, the vision system controller 122VC is programmed with a reference representation 400VMR of predetermined features (e.g., the fixed/permanent structure of and/or transient objects in the storage structure 130 of the storage and retrieval system described herein and included in the virtual model 400VM), the reference representation 400VMR of the predetermined features defining the form or location of at least part of the facility or storage structure 130 traversed by the autonomous transport vehicle 110. Here the virtual model 400VM (and the reference representation 400VMR of predetermined features thereof) of the operating environment 401 is stored in any suitable memory of the autonomous transport vehicle (such as a memory of the vision system controller 122VC) or in a memory accessible to the vision system controller 122VC. The virtual model 400VM provides the autonomous transport vehicle 110 with the dimensions, locations, etc. of at least the fixed (e.g., permanent) structural components in the operating environment 401. The operating environment 401 and the virtual model 400VM thereof include at least fixed/permanent structure (e.g., transfer deck 130B, picking aisles 130A, storage spaces 130S, case unit transfer areas, case unit buffer locations, vehicle charging locations, support columns, etc.) of one or more storage structure levels 130L; in one or more aspects, the operating environment 401 and the virtual model 400VM include the fixed structure of the one or more storage structure levels 130L and at least some transitory structure (e.g., case units CU stored or otherwise located at case unit holding locations of the storage and retrieval system 100, etc.) of and located within the storage level 130L on which the autonomous transport vehicle 110 operates; in one or more other aspects the operating environment 401 and the virtual model 400VM include at least the fixed structure and at least some transitory structure (e.g., case units) of one or more levels 130L of the storage structure 130 on which the autonomous transport vehicle 110 could operate; and in still other aspects, the operating environment 401 and virtual model 400VM include the entirety of the storage structure and at least some of the transitory structure (such as transitory structure for a storage level on which the autonomous transport vehicle operates).
[0112] The autonomous transport vehicle 110 may have stored thereon (or in a memory accessible thereby) a portion of the virtual model 400VM that corresponds with a portion of the operating environment in which the autonomous transport vehicle 110 operates. For example, the autonomous transport vehicle 110 has stored thereon (or in a memory accessible thereby) only a portion of the virtual model 400VM corresponding to a storage structure level 130L on which the autonomous transport vehicle is disposed. The virtual model 400VM of the operating environment 401 may be dynamically updated in any suitable manner to facilitate autonomous transport vehicle 110 operations in the storage structure 130. For example, where the autonomous transport vehicle 110 is moved from one storage structure level 130L to another different storage structure level 130L the vision system controller 122VC is updated (e.g., such as by the controller 122 and/or wirelessly by control server 120) to include a portion of the virtual model 400VM corresponding to the other different storage structure level 130L. As another example, the virtual model 400VM may be dynamically updated as case units are added and removed from the storage structure 130 so as to provide a dynamic virtual model case unit map that indicates the predetermined (expected) location of the case units CU that are to be transferred by the autonomous transport vehicles 110. In other aspects, the predetermined (expected) locations of the case units within the storage structure may not be included in the virtual model 400VM; however, the predetermined (expected) locations, sizes, SKUs, etc. of one or more case units to be transferred by an autonomous transport vehicle 110 are communicated from, for example, controller 120 to the autonomous transport vehicle 110, where the vision system 400 (and the vision system controller 122VC) effects verification of case unit(s) at the predetermined location as described herein (e.g., the vision system 400 compares what it expects to "see" with what it actually "sees" to verify the correct case unit(s) are being transferred) and/or for detection/identification of another malfunctioning autonomous transport vehicle, dropped case unit, debris, spill, or other transient object within the storage and retrieval system 100.

[0113] The vision system controller 122VC is configured to register image data captured by the supplemental navigation sensor system 288 and generate, from the captured image data, at least one image (e.g., still image and/or video image) of one or more features of the predetermined features (e.g., the fixed/permanent structure of and/or transient objects in the storage structure 130 of the storage and retrieval system described herein). The at least one image (see, e.g., Figs. 5A-5C, 9A, 10A, and 10B for exemplary images) is formatted as a virtual representation VR of the one or more (imaged) predetermined features so as to provide a comparison (in at least one but up to the six degrees of freedom X, Y, Z, Rx, Ry, Rz) to one or more corresponding reference features (e.g., a corresponding feature of the virtual model 400VM that serves as a reference for identifying the form and/or location of the imaged predetermined feature) of the predetermined features of the reference representation 400VMR.
[0114] Fig. 13 is an exemplary flow diagram of the comparison where at least one model 400VM of the storage and retrieval system is stored within or accessible to the vision system controller 122VC. For exemplary purposes only, a storage facility information model, a storage structure/array information model, and a case input station model are provided but in other aspects any suitable models and number of models may be provided to provide the vision system controller 122VC with virtual information pertaining to the operating environment of the autonomous transport vehicles 110. The different models may be combined to provide the vision system controller 122VC with a complete virtual operating environment in which the autonomous transport vehicle 110 operates. The sensors of the vision system 400 (as described herein) also provide sensor data to the vision system controller 122VC. The sensor data, which embodies the virtual representation VR images, is processed with any suitable image processing methods to detect a region of interest and/or edge features of objects in the image. The vision system controller 122VC predicts, within the model 400VM, a field of view of the sensor(s) providing the image data and determines, within the predicted field of view, regions of interest and edges of objects. The regions of interest and edges of the virtual model 400VM are compared with the regions of interest and edges of the virtual representation VR to effect pose and location determination of one or more of the autonomous transport vehicle 110 and case units (payloads) as described herein.
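As one hedged sketch of the Fig. 13 comparison (the patent leaves the specific image processing methods open; edge extraction and phase correlation are used here only as stand-ins), the predicted view from the virtual model and the captured image may be reduced to edge maps and registered to yield an offset and a match score:

    import numpy as np
    import cv2

    def compare_view_to_model(captured_gray, predicted_model_gray):
        # Edge/region-of-interest extraction on the captured image and on the view
        # predicted from the virtual model 400VM for the same sensor field of view.
        observed_edges = cv2.Canny(captured_gray, 50, 150)
        expected_edges = cv2.Canny(predicted_model_gray, 50, 150)
        # Phase correlation returns the translation best aligning the two edge maps
        # and a response value usable as a match confidence.
        (dx, dy), response = cv2.phaseCorrelate(
            np.float32(observed_edges), np.float32(expected_edges))
        return np.array([dx, dy]), response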
[0115] The vision system controller 122VC is configured (as described herein with at least part of the virtual model 400VM and with suitable imaging processing non-transitory computer program code) so that the virtual representation VR, of the imaged one or more features (e.g., in Fig. 9A the imaged features are the storage and retrieval system rack/column structure, in Fig. 10A the imaged features are the case units CU1-CU3, and in Fig. 10B the imaged features are the case units, storage rack structure, and a portion of the payload bed 210B) of the predetermined features, is effected resident on the autonomous transport vehicle 110, and comparison between the virtual representation VR of the one or more imaged predetermined features and the one or more corresponding reference predetermined features RPF (e.g., presented in a reference presentation RPP of the virtual model 400VM) is effected resident on the autonomous transport vehicle 110 (see Figs. 9A and 10A). Here, the autonomous transport vehicle 110 pose determination and navigation is autonomous and decoupled from and independent of each system controller (e.g., control server 120 or other suitable controller of the storage and retrieval system) that sends commands to the autonomous transport vehicle 110.
[0116] As described herein, the controller 122 is configured to employ the supplemental (e.g., pixel level) position information obtained from the vision system controller 122VC of the supplemental navigation sensor system 288 to what may be referred to as "fine tune" the vehicle pose and location relative to the pick/place location so that positioning of the vehicle 110 and case units CU placed to storage locations 130S by the vehicle 110 may be held to smaller tolerances (i.e., increased position accuracy) compared to positioning of the vehicle 110 or case units CU with the physical characteristic sensor system 270 alone. The fine tuning of the autonomous transport vehicle 110 pose and location is effected by the vision system controller 122VC, where the vision system controller 122VC is configured to confirm autonomous transport vehicle 110 pose and location information registered by the vision system controller 122VC from the physical characteristic sensor system 270 based on the comparison between the virtual representation VR and the reference representation RPP.
[0117] The comparison between the virtual representation VR and the reference representation RPP by the vision system controller 122VC builds confidence in the data generated by the physical characteristic sensor system 270 by verifying the accuracy of the data with the information obtained from the supplemental navigation sensor system 288. Here, the vision system controller 122VC is configured to identify a variance in the autonomous guided vehicle pose and location based on the comparison between the virtual representation VR and the reference representation RPP, and update (e.g., modify the pose and/or location information from the physical characteristic sensor system 270) or complete (if the pose and/or location information from the physical characteristic sensor system 270 is missing) autonomous transport vehicle 110 pose or location information from the physical characteristic sensor system 270 (e.g., to effect finally positioning the autonomous transport vehicle 110 to a predetermined commanded position) based on the variance.
[0118] The vision system controller 122VC is configured to determine a pose error in the information from the physical characteristic sensor system 270 and fidelity of the autonomous guided vehicle 110 pose and location information from the physical characteristic sensor system 270 based on at least one of the identified variance and an image analysis of at least one image (from the vision system 400 of the supplemental navigation sensor system 288), and assign a confidence value according to at least one of the pose error and the fidelity. Where the confidence value is below a predetermined threshold, the vision system controller 122VC is configured to switch autonomous guided vehicle navigation based on pose and location information generated from the virtual representation VR in place of pose and location information from the physical characteristic sensor system 270. The switching from the physical characteristic sensor system pose and location information to the virtual representation VR pose and location information may be effected by the vision system controller 122VC (or controller 122), by de-selecting the pose and location information, generated from the physical characteristic sensor system 270, and selecting/entering pose and location information from the virtual representation VR in a kinematic/dynamic algorithm (such as described in United States patent application number 16/144,668 titled "Storage and Retrieval System" and filed on September 27, 2018, the disclosure of which is incorporated herein by reference in its entirety).
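A minimal sketch of the confidence-based switching described above follows; the scaling of the confidence value and the threshold are assumptions (the patent does not specify how the confidence value is computed):

    def select_pose_source(sensor_pose, vision_pose, pose_error, fidelity,
                           error_weight=0.5, threshold=0.6):
        # Return the pose fed to the kinematic/dynamic algorithm and the source used.
        # Larger pose error and lower fidelity both reduce confidence (0..1 scale assumed).
        confidence = max(0.0, min(1.0, fidelity - error_weight * pose_error))
        if confidence < threshold:
            return vision_pose, "virtual_representation", confidence
        return sensor_pose, "physical_characteristic_sensors", confidence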
[0119] After the vision system controller 122VC effects the above-noted switching, the vision system controller 122VC is configured to continue autonomous transport vehicle 110 navigation to any suitable destination (such as a payload place destination, charging destination, etc.); while in other aspects the vision system controller 122VC is configured to select an autonomous transport vehicle 110 safe path and trajectory bringing the autonomous transport vehicle 110 from a position at switching to a safe location 157 (the safe location being a dedicated induction/extraction area of a transfer deck, a lift transfer area, or other area of the transfer deck 130B or picking aisle 130A at which the autonomous transport vehicle 110 may be accessed by an operator without obstructing operation of other autonomous transport vehicles 110 operating in the storage and retrieval system 100) for shut down of the autonomous transport vehicle 110; while in still other aspects, the vision system controller 122VC is configured to initiate communication to an operator of the storage and retrieval system 100 identifying autonomous transport vehicle 110 kinematic data and identifying a destination of the autonomous transport vehicle 110 for operator selection (e.g., presented on user interface UI). Here the operator may select or switch control of the autonomous guided vehicle (e.g., through the user interface UI) from automatic operation to either quasi automatic operation (e.g., the autonomous transport vehicle 110 operates autonomously with limited manual input) or manual operation (e.g., the operator remotely controls operation of the autonomous transport vehicle 110 through the user interface UI). For example, the user interface UI may include a capacitive touch pad/screen, joystick, haptic screen, or other input device that conveys kinematic directional commands (e.g., turn, acceleration, deceleration, etc.) and/or pick place commands from the user interface UI to the autonomous guided vehicle 110 to effect operator control inputs in the quasi automatic operational and manual operational modes of the autonomous transport vehicle 110.
[0120] It is noted that where the variance described herein is persistent (to within a predetermined tolerance) the vision system controller 122VC may be configured to apply the variance as an offset that is automatically applied to the data from the physical characteristic sensor system 270 to grossly position the autonomous transport vehicle 110 based on the data from the physical characteristic sensor system 270 as modified by the offset, where comparison with the virtual representation VR and the reference representation RPP verifies the validity of the offset and adjusts the offset (and autonomous transport vehicle 110 pose and location) according to any variance. Where the variance reaches a predetermined threshold the vision system controller 122VC may alert a user of the storage and retrieval system 100 that the autonomous guided vehicle 110 may be due for servicing.
[0121] Still referring to Figs. 1A, 2, 4A, 4B, and 10A, while the pose and location error identification of the autonomous transport vehicle 110 is described above, the vision system controller 122VC is configured to effect a similar pose and location error identification for the case units CU, such as held in storage locations 130S or other holding areas of the storage and retrieval system. For example, the vision system controller 122VC is configured to confirm payload pose and location information registered by the vision system controller 122VC from the physical characteristic sensor system 270 based on the comparison between the virtual representation VR and the reference representation RPP of the virtual model 400VM. The vision system controller 122VC is configured to identify a variance in the payload (case unit) pose and location based on the comparison between the virtual representation VR and the reference representation RPP, and update (e.g., modify the pose and/or location information from the physical characteristic sensor system 270) or complete (if the pose and/or location information from the physical characteristic sensor system 270 is missing) payload pose or location information from the physical characteristic sensor system based on the variance.
[0122] The vision system controller 122VC is configured to determine a pose error in the information from the physical characteristic sensor system 270 and fidelity of the payload pose and location information from the physical characteristic sensor system 270 based on at least one of the identified variance and an image analysis of the at least one image from the vision system 400 of the supplemental navigation sensor system 288. The vision system controller 122VC assigns a confidence value according to at least one of the payload pose error and the fidelity. With the confidence value below a predetermined threshold, the vision system controller 122VC switches autonomous transport vehicle 110 payload handling based on pose and location information generated from the virtual representation VR in place of pose and location information from the physical characteristic sensor system 270.
[0123] After switching, the vision system controller 122VC is configured to, in some aspects, continue autonomous guided vehicle handling to a predetermined destination (such as a payload placement location or an area of the storage and retrieval system where the payload may be inspected by an operator); in other aspects the vision system controller 122VC is configured to initiate communication to an operator identifying payload data along with an operator selection of autonomous guided vehicle control from automatic payload handling operation to quasi automatic payload handling operation (where the operator provides limited input to transfer arm 210A and traverse movements of the autonomous guided vehicle) or manual payload handling operation (where the operator manually controls movement of the transfer arm 210A and traverse movements of the autonomous guided vehicle) via the user interface device UI.
[0124] In a manner similar to that described above, the vision system controller 122VC is configured to transmit, via a wireless communication system (such as network 180) communicably coupling the vision system controller 122VC and an operator interface UI, a simulation image combining the virtual representation VR of the one or more imaged predetermined features and one or more corresponding reference predetermined features of a reference presentation RPP, presenting the operator with an augmented reality image in real time (see Fig. 10A, where reference predetermined features include the shelves 555 and the virtual representations include those of the case units CU1-CU3). Here, the vision system 400 of the supplemental navigation sensor system 288 provides a "dashboard camera" (or dash-camera) that transmits video and/or still images from the vehicle 110 to an operator to allow remote operation or monitoring of the vehicle 110. It is noted that the vision system 400 may also operate as a data recorder that periodically sends still images obtained from the vision system cameras to a memory of the user interface UI, where the still images are stored/cached for operator review (e.g., in addition to providing a real-time video stream the vision system 400 provides for non-real time review of the still images). The still images may be captured and transmitted to the user interface for storage at any suitable interval such as, for example, every second, every ten seconds, every thirty seconds, every minute, or at any other suitable time intervals (exclusive of real time video stream recording), where the periodicity of the still image capture/recording maintains suitable communication bandwidth between, for example, the control server 120 and the bots 110 (noting that in accordance with aspects of the disclosed embodiment, the number of bots 110 operating/transferring case units in the storage and retrieval system 100 may be on the order of hundreds to thousands of bots 110). Here, the user interface UI with the record of stored still images provides for an interactive presentation/data interface where a user reviews the still images to determine how or why an event (e.g., such as a case mis-pick, bot breakage, product spill, debris presence on the transfer deck, etc.) occurred and what transpired prior to and/or after the event.
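For illustration only, the data-recorder behavior (periodic still capture alongside any live stream) could be sketched as below, where the capture and transmit callables are assumed interfaces to a vision system camera and the user interface UI cache:

    import time

    def record_stills(capture_frame, transmit_still, interval_s=10.0, stop=lambda: False):
        # Capture a time-stamped still every interval_s seconds and send it for caching,
        # keeping the transmission rate low to preserve communication bandwidth.
        next_shot = time.monotonic()
        while not stop():
            now = time.monotonic()
            if now >= next_shot:
                frame = capture_frame()                       # one frame from a vision camera
                transmit_still(frame, timestamp=time.time())  # cache at the user interface UI
                next_shot = now + interval_s
            time.sleep(0.05)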
[0125] The vision system controller 122VC is configured to receive real time operator commands (e.g., from the user interface UI) to the traversing autonomous guided vehicle 110, which commands are responsive to the real time augmented reality image (see Figs. 9A and 10A), and changes in the real time augmented reality image transmitted to the operator by the vision system controller 122VC. The video or still images may be stored (and time stamped) in a memory onboard the vehicle 110 and sent to control server 120 and/or an operator on request; in other aspects the video and/or still images may be broadcast or otherwise transmitted in real time for viewing on a user interface UI (as described herein) accessible to the operator.

[0126] The vision system controller 122VC is also configured to register image data captured by the supplemental hazard sensor system 290 and generate, from the captured image data, at least one image (e.g., still image and/or video image) of one or more object or spatial feature 299 showing the predetermined physical characteristic. The at least one image (see, e.g., Figs. 5B, 5C, and 15 for exemplary images) may be formatted as a virtual representation VR of the one or more object or spatial feature 299 (see Figs. 4D and 15) so as to provide a comparison (in at least one but up to the six degrees of freedom X, Y, Z, Rx, Ry, Rz (see Fig. 2)) to one or more corresponding reference features (e.g., a corresponding feature of the virtual model 400VM that serves as a reference for identifying the form and/or location of the imaged predetermined feature) of the predetermined features of the reference representation 400VMR. The controller 122VC is configured to verify (via the comparison) the presence of the predetermined physical characteristic of the object or spatial feature 299 based on the comparison between the virtual representation and the reference representation (i.e., compare to determine whether the object is "known" or "unknown"). Where the object or spatial feature 299 is verified by the controller 122VC as "unknown", the controller 122VC determines a dimension of the predetermined physical characteristic and commands (e.g., through the controller 122) the autonomous transport vehicle 110 to stop in a predetermined location relative to the object 299 (i.e., a trajectory is determined to autonomously place the bot in a predetermined position relative to the object or spatial feature 299) based on a position of the object or spatial feature 299 determined from the comparison (as may be realized, the command stop interrupts the automatic routine of the vehicle's previous autonomous commands, in effect diverting the bot from automatic tasking). In response to detecting the predetermined physical characteristic of at least one object or spatial feature 299, the controller 122 selectably reconfigures the autonomous transport vehicle 110 from an autonomous state to a collaborative vehicle state so as to finalize discrimination of the object or spatial feature 299 as a hazard and identify a mitigation action of the vehicle with respect to the hazard (i.e., selectably switches the autonomous transport vehicle 110 from an autonomous operation state to a collaborative operation state and identifies whether the vehicle can mitigate the hazard, e.g., remove a disabled vehicle or act as a signal/beacon to warn other vehicles performing autonomous tasks).
In the collaborative operation state, the autonomous transport vehicle 110 is disposed to receive operator commands for the autonomous transport vehicle 110 to continue effecting vehicle operation for discriminating and mitigation of the object or spatial feature 299.
[0127] In one aspect, the autonomous transport vehicle 110 may not include the reference map (e.g., virtual model 400VM). In this aspect, when the camera 292 detects an object or spatial feature 299, the controller 122VC determines a position of the object within a reference frame of the at least one camera 292, which is calibrated and has a predetermined relationship to the autonomous transport vehicle 110. From the object pose in the camera reference frame, the controller 122VC determines presence of the predetermined physical characteristic of object 299 (i.e., whether the object 299 is extended across the bot path, blocks the bot, or is proximate, within a predetermined distance, to the bot path to be deemed an obstacle or hazard). Upon determination of presence of an object and switch from the autonomous state to the collaborative vehicle state, the controller 122VC is configured to initiate transmission communicating image/video of the presence of the predetermined physical characteristic to an operator (user) interface UI for collaborative operator operation of the autonomous transport vehicle 110 as will be further described below. (Here the vehicle 110 is configured as an observation platform and pointer for a user in collaborative mode. The vehicle 110 in this mode is also a pointer for other bots executing in autonomous operation, which identify the pointer bot (e.g., via control system 120 or a beacon) and reroute automatically to avoid the area until further command and, if avoidance is not available, stop ahead of encountering the object/hazard.)
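A hedged sketch of the map-free check described above follows: the object position in the calibrated camera frame is transformed into the vehicle frame and tested against the upcoming path corridor; the transform and corridor half-width are illustrative assumptions:

    import numpy as np

    def is_hazard(object_in_camera, camera_to_vehicle, path_points, corridor_half_width=0.6):
        # object_in_camera: (3,) object position in the camera 292 reference frame.
        # camera_to_vehicle: 4x4 homogeneous transform from camera frame to vehicle frame
        # (known from the camera's predetermined relationship to the vehicle).
        # path_points: (N, 2) upcoming path positions in the vehicle frame.
        obj_vehicle = (camera_to_vehicle @ np.append(object_in_camera, 1.0))[:2]
        # Hazard if the object lies within the corridor around any upcoming path point.
        distances = np.linalg.norm(path_points - obj_vehicle, axis=1)
        return bool(np.min(distances) < corridor_half_width)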
[0128] The vision system controller 122VC is configured (as described herein with at least part of the virtual model 400VM and with suitable imaging processing non-transitory computer program code) so that the virtual representation VR, of the imaged object or spatial feature 299 is effected resident on the autonomous transport vehicle 110, and comparison between the virtual representation VR of the one or more imaged object or spatial feature 299 and the one or more corresponding reference predetermined features RPF (e.g., presented in a reference presentation RPP of the virtual model 400VM) is effected resident on the autonomous transport vehicle 110 (see Fig. 15). The comparison between the virtual representation VR and the reference representation RPP by the vision system controller 122VC verifies whether the object or spatial feature 299 is "unknown". The vision system controller 122VC is configured to determine a dimension of the object or spatial feature 299 based on image analysis of at least one image (from the vision system 400 of the supplemental hazard sensor system 290). Where the dimensions are unidentifiable, the vision system controller 122VC is configured to switch the autonomous transport vehicle 110 into the collaborative operation state for collaborative discrimination of the object 299 with the operator. The switching from the autonomous to the collaborative state may be effected by the vision system controller 122VC (or controller 122), by selectably reconfiguring the autonomous transport vehicle 110 from an autonomous vehicle to a collaborative vehicle (i.e., selectably switches the autonomous transport vehicle 110 from an autonomous operation state to a collaborative operation state).
[0129] In one aspect, with the above noted switching effected by the vision system controller 122VC (and controller 122), the controller 122 is configured to continue autonomous transport vehicle 110 navigation to any suitable destination relative to the detected object, applying a trajectory to the autonomous transport vehicle 110 that brings the autonomous transport vehicle 110 to a zero velocity within a predetermined time period where motion of the autonomous transport vehicle 110 along the trajectory is coordinated with "known" and "unknown" objects located relative to the autonomous transport vehicle 110. With the autonomous transport vehicle 110 stopped, the vision system controller 122VC initiates communication to the operator of the storage and retrieval system 100 displaying the object or spatial feature 299 on the user interface UI for the operator to discriminate the object 299 and determine a mitigation action such as maintenance (e.g., clean-up of a spill, removal of a malfunctioning bot, etc.) and a location of the autonomous transport vehicle 110 (e.g., presented on user interface UI). As noted above, in one aspect, the controller 122 may initiate a signal/beacon to at least another bot(s) so as to alert the other bot(s) of a traffic obstacle and to avoid the obstacle or indicate a detour area (thus, in effect, the supplemental hazard sensor system 290 provides for a hazard pointer/indicator mode of one bot to others on the same level). In one aspect, the signal/beacon is sent via a local communication transmission to a system area bot task manager, managing tasks of nearby bots, or bots within a predetermined distance of the pointer bot. In other aspects, the controller 122 is configured, based on object information from the vision system 400 and vision system controller 122VC, to select an autonomous transport vehicle 110 safe path and trajectory bringing the autonomous transport vehicle 110 from a position at switching to a location 157 where the operator may view the object 299 without further obstructing operation of other autonomous transport vehicles 110 operating in the storage and retrieval system 100. The vision system controller
122VC is configured to maintain the object or spatial feature 299 within the field of view 292F of at least one camera 292 and to continue imaging of the predetermined physical characteristic. [0130] In one aspect, the operator may select or switch control of the autonomous guided vehicle (e.g., through the user interface UI) from automatic operation to collaborative operation (e.g., the operator remotely controls operation of the autonomous transport vehicle 110 through the user interface UI). For example, the user interface UI may include a capacitive touch pad/screen, joystick, haptic screen, or other input device that conveys kinematic directional commands (e.g., turn, acceleration, deceleration, etc.) from the user interface UI to the autonomous transport vehicle 110 to effect operator control inputs in the collaborative operational mode of the autonomous transport vehicle 110. For example, the vision system 400 of the supplemental hazard sensor system 290 provides a "dashboard camera" (or dash-camera) that transmits video and/or still images from the autonomous transport vehicle 110 to an operator (through user interface UI) to allow remote operation or monitoring of the area relative to the autonomous transport vehicle 110 in a manner similar to that described herein with respect to supplemental navigation sensor system 288.
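As a non-limiting sketch of how such kinematic directional commands might be conveyed from the user interface UI to the vehicle in the collaborative mode (the message format, field names, transport, and limits below are illustrative assumptions only):

```python
import json
import time

def make_kinematic_command(turn_rate_dps: float, accel_mps2: float) -> bytes:
    """Encode a single operator control input as a timestamped JSON message."""
    msg = {
        "type": "kinematic_command",
        "timestamp": time.time(),
        "turn_rate_dps": turn_rate_dps,   # positive = turn right (assumed convention)
        "accel_mps2": accel_mps2,         # negative = decelerate
    }
    return json.dumps(msg).encode("utf-8")

def apply_kinematic_command(raw: bytes, max_accel_mps2: float = 1.0) -> dict:
    """Decode a command vehicle-side and clamp acceleration to an assumed safe limit."""
    msg = json.loads(raw.decode("utf-8"))
    msg["accel_mps2"] = max(-max_accel_mps2, min(max_accel_mps2, msg["accel_mps2"]))
    return msg

if __name__ == "__main__":
    raw = make_kinematic_command(turn_rate_dps=5.0, accel_mps2=2.5)
    print(apply_kinematic_command(raw))   # acceleration clamped to 1.0 m/s^2
```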
[0131] Referring to Figs. 2, 4A, 4B, 9A, 10A, and 15, the vision system controller 122VC (and/or controller 122) is in one or more aspects configured to provide remote viewing with the vision system 400, where such remote viewing may be presented to an operator in augmented reality or in any other suitable manner (such as unaugmented). For example, the autonomous transport vehicle 110 is communicably connected to the warehouse management system 2500 (e.g., via the control server 120) over the network 180 (or any other suitable wireless network). The warehouse management system 2500 includes one or more warehouse control center user interfaces UI. The warehouse control center user interfaces UI may be any suitable interfaces such as desktop computers, laptop computers, tablets, smart phones, virtual reality headsets, or any other suitable user interface configured to present visual and/or aural data obtained from the autonomous transport vehicle 110. In some aspects the vehicle 110 may include one or more microphones MCP (Fig. 2) where the one or more microphones and/or remote viewing may assist in preventative maintenance/troubleshooting diagnostics for storage and retrieval system components such as the vehicle 110, other vehicles, lifts, storage shelves, etc. The warehouse control center user interfaces UI are configured so that warehouse control center users request or are otherwise supplied (such as upon detection of an unidentifiable object 299) with images from the autonomous transport vehicle 110 and so that the requested/supplied images are viewed on the warehouse control center user interfaces UI.
[0132] The images supplied and/or requested may be live video streams, pre-recorded (and saved in any suitable memory of the autonomous transport vehicle 110 or warehouse management system 2500) images, or images (e.g., one or more static images and/or dynamic video images) that correspond to a specified (either user selectable or preset) time interval or number of images taken on demand substantially in real time with a respective image request. It is noted that live video stream and/or image capture provided by the vision system 400 and vision system controller 122VC may provide for real-time remote controlled operation (e.g., teleoperation) of the autonomous transport vehicle 110 by a warehouse control center user through the warehouse control center user interface UI.
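A non-limiting sketch of a request handler distinguishing the three image-request types noted above (live stream, recorded interval, on-demand capture); the function names and the in-memory store are hypothetical and storage/camera access is stubbed out:

```python
from datetime import datetime, timedelta

# In-memory stand-in for images saved onboard or at the warehouse management system.
RECORDED = []  # list of (timestamp, frame) pairs

def handle_image_request(kind: str, **kwargs):
    if kind == "live":
        return "live-stream"                       # a real system would return a stream handle
    if kind == "recorded":
        start = kwargs["end"] - timedelta(seconds=kwargs["interval_s"])
        return [frame for (t, frame) in RECORDED if start <= t <= kwargs["end"]]
    if kind == "on_demand":
        return [f"frame-{i}" for i in range(kwargs["count"])]   # capture now
    raise ValueError(f"unknown request type: {kind}")

if __name__ == "__main__":
    print(handle_image_request("on_demand", count=3))
    print(handle_image_request("recorded", end=datetime.now(), interval_s=30))
```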
[0133] In some aspects, the live video is streamed from the vision system 400 of the supplemental navigation sensor system 288 and/or the supplemental hazard sensor system 290 to the user interface UI as a conventional video stream (e.g., the image is presented on the user interface without augmentation, what the camera "sees" is what is presented) as illustrated in Figs. 9A and 15. In this aspect, Fig. 9A illustrates a live video that is streamed without augmentation from both the forward navigation cameras 420A, 420B (a similar video stream may be provided by the rearward navigation cameras 430A, 430B but in the opposite direction); while Fig. 15 illustrates a live video that is streamed without augmentation from the forward camera 292/477A (a similar video stream may be provided by the rearward camera 292/477B but in the opposite direction). Similar video may be streamed from any of the cameras of the supplemental navigation sensor system 288 and/or supplemental hazard sensor system 290 described herein. While Fig. 9A illustrates a side by side presentation of the forward navigation cameras 420A, 420B, the video stream, where requested by the user, may be for but one of the forward navigation cameras 420A, 420B. Where a virtual reality headset is employed by a user to view the streamed video, images from the right side forward navigation camera 420A may be presented in a viewfinder of the virtual reality headset corresponding to the user's right eye and images from the left side forward navigation camera 420B may be presented in a viewfinder of the virtual reality headset corresponding to the user's left eye.
[0134] In some aspects, the live video is streamed from the vision system 400 of the supplemental navigation sensor system 288 to the user interface UI as an augmented reality video stream (e.g., a combination of live video and virtual objects are presented in the streamed video) as illustrated in Fig. 10A. In this aspect, Fig. 10A illustrates a live video that is streamed with augmentation from one of the case unit monitoring cameras 410A, 410B (a similar video stream may be provided by the other of the case unit monitoring cameras 410A, 410B but offset by the separation distance between the cameras 410A, 410B). Similar augmented video may be streamed from any of the cameras of the supplemental navigation sensor system 288 described herein. In Fig. 10A the case units CU1, CU2, CU3 are presented to the user through the user interface UI in the live video stream as the case units are captured by the one of the case unit monitoring cameras 410A, 410B. Virtual representations of the shelf 555 and slats 520L on which the case units CU1, CU2, CU3 are seated may be inserted into the live video stream by the vision system controller 122VC or other suitable controller (such as control server 120) to augment the live video stream. The virtual representations of the shelf 555 and slats 520L (or other structure of the storage and retrieval system 100) may be virtually inserted into the live video stream such as where portions of the structure are not within the field of view 410AF, 410BF of the case unit monitoring cameras 410A, 410B (or a field of view of whichever camera of the supplemental navigation sensor system 288 is capturing the video). Here, the virtual representations of the storage and retrieval structure may be virtually inserted into the live video streams to supplement/augment the live video stream with information that may be useful to the user (e.g., to provide a completed "picture" of what is being "observed" by the autonomous transport vehicle) where such information is not captured by cameras or not clearly discernable in the camera image data. The virtual representations of the storage and retrieval structure that are virtually inserted into the live video stream are obtained by the vision system controller 122VC (or control server 120) from the virtual model 400VM. Also referring to Fig. 9B, where the autonomous transport vehicle 110 is under remote control operation, the video streams may be augmented to provide the operator with a transport path VTP and/or destination location indicator DL that provide the operator with guidance as to a destination location of the autonomous transport vehicle 110. The transport path VTP and destination location indicator DL may also be presented in the video streams with the autonomous transport vehicle operating in the automatic/autonomous and quasi automatic operation modes to provide an operator with an indication of the planned route and destination.
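A non-limiting sketch of the augmentation step described above, in which structure taken from the virtual model 400VM is composited into the stream when it is not visible to the camera; the frame representation and names are illustrative assumptions (a real implementation would render geometry into the video frames):

```python
# Minimal sketch: virtual structures not visible in a live frame are listed as overlays
# drawn from a reference virtual model, standing in for rendering them into the frame.
def augment_frame(frame_id: int, visible_structures: set, virtual_model: dict) -> dict:
    """Return the frame plus virtual overlays for model structures not visible in it."""
    overlays = []
    for name, geometry in virtual_model.items():
        if name not in visible_structures:
            # Structure not captured by the camera: insert its virtual representation.
            overlays.append({"name": name, "geometry": geometry, "source": "virtual_model"})
    return {"frame_id": frame_id, "overlays": overlays}

if __name__ == "__main__":
    model = {"shelf_555": "mesh:shelf", "slats_520L": "mesh:slats"}
    # The camera saw part of the shelf, but not the slats beneath the case units.
    print(augment_frame(42, visible_structures={"shelf_555"}, virtual_model=model))
```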
[0135] Referring to Figs. 1A, 2, 4A, 4B, 9A, 10A, and 12, an exemplary method will be described in accordance with aspects of the disclosed embodiment. The method includes providing the autonomous transport vehicle 110 (Fig. 12, Block 1200) as described herein. Sensor data is generated (Fig. 12, Block 1205) with the physical characteristic sensor system 270 where, as described herein, the sensor data embodies at least one of a vehicle navigation pose or location information and payload pose or location information. Image data is captured (Fig. 12, Block 1210) with the supplemental navigation sensor system 288 where, as described herein, the image data informs the at least one of a vehicle navigation pose or location and payload pose or location supplemental to the information of the physical characteristic sensor system 270.
[0136] The method may also include determining, with the vision system controller 122VC, from the information of the physical characteristic sensor system 270, vehicle pose and location (Fig. 12, Block 1220), effecting independent guidance of the autonomous transport vehicle 110 traversing the storage and retrieval system 100 facility. The vision system controller 122VC may also determine, from the information of the physical characteristic sensor system 270, payload (e.g., case unit CU) pose and location (Fig. 12, Block 1225), effecting independent underpick and place of the payload to and from the storage location and independent underpick and place of the payload in the payload bed 210B as described herein.
[0137] The vision system controller 122VC may also register the captured image data and generate therefrom at least one image of one or more features of the predetermined features (Fig. 12, Block 1215) where, as described herein, the at least one image is formatted as a virtual representation VR of the one or more predetermined features so as to provide comparison to one or more corresponding reference (e.g., a corresponding feature of the virtual model 400VM that serves as a reference for identifying the form and/or location of the imaged predetermined feature) of the predetermined features of the reference representation 400VMR. As described herein, the vision system controller 122VC is configured so that the virtual representation VR, of the imaged one or more features of the predetermined features, is effected resident on the autonomous transport vehicle 110, and the comparison between the virtual representation VR of the one or more imaged predetermined features and the one or more corresponding reference predetermined features (of the reference representation 400VMR) is effected resident on the autonomous transport vehicle 110. The vision system controller 122VC may confirm autonomous guided vehicle pose and location information or payload pose and location information (Fig. 12, Block 1230) registered by the vision system controller 122VC from the physical characteristic sensor system 270 based on the comparison between the virtual representation VR and the reference representation 400VMR.
[0138] The vision system controller 122VC may identify a variance in the autonomous transport vehicle 110 pose and location or a variance in the payload pose and location (Fig. 12, Block 1235) based on the comparison between the virtual representation VR and the reference representation 400VMR, and update or complete autonomous transport vehicle 110 pose or location information or update and complete the payload pose and location information from the physical characteristic sensor system 270 based on the variance. In the method, the vision system controller 122VC may determine a pose error (for the autonomous guided vehicle and/or the payload) (Fig. 12, Block 1240) in the information from the physical characteristic sensor system 270 and fidelity of the pose and location information (for the autonomous guided vehicle and/or the payload) from the physical characteristic sensor system 270 based on at least one of the identified variance and image analysis of the at least one image (e.g., from the vision system 400), and assign a confidence value according to at least one of the pose error and the fidelity. With the confidence value below a predetermined threshold, the vision system controller 122VC switches payload handling based on pose and location information generated from the virtual representation VR in place of pose and location information from the physical characteristic sensor system 270; and/or with the confidence value below a predetermined threshold, the vision system controller 122VC switches autonomous guided vehicle 110 navigation based on pose and location information generated from the virtual representation VR in place of pose and location information from the physical characteristic sensor system 270. After switching, the controller is configured to: continue autonomous guided vehicle navigation to destination or select an autonomous guided vehicle safe path and trajectory bringing the autonomous guided vehicle from a position at switching to a safe location for shut down, or initiate communication to an operator identifying autonomous guided vehicle kinematic data and a destination for operator selection of autonomous guided vehicle control from automatic operation to quasi automatic operation or manual operation via a user interface device; and/or continue autonomous guided vehicle handling to destination, or initiate communication to an operator identifying payload data along with an operator selection of autonomous guided vehicle control from automatic payload handling operation to quasi automatic payload handling operation or manual payload handling operation via a user interface device.
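A non-limiting sketch of the confidence test described above; the weighting, thresholds, and names are illustrative assumptions only:

```python
# Minimal sketch: a confidence value derived from pose error and sensor fidelity selects
# whether navigation/payload handling uses the physical characteristic sensor data or
# pose information generated from the vision system's virtual representation.
def confidence(pose_error_m: float, fidelity: float) -> float:
    """Combine pose error (meters) and fidelity (0..1) into a 0..1 confidence value."""
    error_term = max(0.0, 1.0 - pose_error_m / 0.10)  # assumed: 10 cm error zeroes this term
    return 0.5 * error_term + 0.5 * fidelity

def select_pose_source(pose_error_m: float, fidelity: float, threshold: float = 0.6) -> str:
    c = confidence(pose_error_m, fidelity)
    # Below the threshold, switch navigation/handling to the vision-derived pose.
    return "vision_virtual_representation" if c < threshold else "physical_characteristic_sensors"

if __name__ == "__main__":
    print(select_pose_source(pose_error_m=0.02, fidelity=0.9))  # sensor data retained
    print(select_pose_source(pose_error_m=0.12, fidelity=0.4))  # switched to vision pose
```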
[0139] The controller transmits, via a wireless communication system (such as network 180) communicably coupling the vision system controller 122VC and the operator/user interface UI, a simulation image (see Figs. 9A, 10A, 10B) (Fig. 12, Block 1245) combining the virtual representation VR of the one or more imaged predetermined features and one or more corresponding reference predetermined features RPF of a reference presentation RPR presenting the operator with an augmented reality image in real time. The vision system controller 122VC receives real time operator commands to the traversing autonomous guided vehicle 110, which commands are responsive to the real time augmented reality image (see Figs. 9A, 10A, 10B), and changes in the real time augmented reality image transmitted to the operator by the vision system controller 122VC.
[0140] Referring now to Figs. 1A, 1B, 2, 4A, 4B, and 14, an example of an autonomous transport vehicle 110 case unit(s) transfer transaction including a case unit(s) multi-pick and place operation with on-the-fly sortation of the case units for creating a mixed pallet load MPL (e.g., a pallet load having mixed cases or cases having different stock keeping units as shown in Fig. 1B) according to a predetermined order out sequence will be described in accordance with aspects of the disclosed embodiment. Suitable examples of multi-pick/place operations of the autonomous transport vehicle 110 in which the aspects of the disclosed embodiment may be employed are described in United States patent numbers 10,562,705 titled "Storage and Retrieval System" issued on February 18, 2020; 10,839,347 titled "Storage and Retrieval System" issued on November 17, 2020; 10,850,921 titled "Storage and Retrieval System" issued on December 1, 2020; 10,954,066 titled "Storage and Retrieval System" issued on March 23, 2021; and 10,974,897 titled "Storage and Retrieval System" issued on April 13, 2021, the disclosures of which are incorporated herein by reference in their entireties. The autonomous transport vehicle 110 picks at least a first case unit CUA from a first shelf of a first storage location 130S1 of picking aisle 130A1 (Fig. 14, Block 1400). As described above, localization of the autonomous transport vehicle 110 relative to the case unit CUA in storage location 130S1 is effected with the physical characteristic sensor system 270 and/or the supplemental navigation sensor system 288 in the manner described herein.
[0141] The autonomous transport vehicle 110 traverses the picking aisle 130A1 and buffers the at least first case unit CUA within the payload bed 210B (Fig. 14, Block 1410). The autonomous transport vehicle 110 traverses the picking aisle 130A1 to a second storage location 130S2 and picks at least a second case unit CUB that is different than the at least first case unit CUA (Fig. 14, Block 1420). While the at least second case unit CUB is described as being in the same picking aisle 130A1 as the at least first case unit CUA, in other aspects the at least second case unit CUB may be in a different aisle or any other suitable holding location (e.g., transfer station, buffer, inbound lift, etc.) of the storage and retrieval system. Localization of the autonomous transport vehicle 110 relative to the case unit CUB in storage location 130S2 is effected with the physical characteristic sensor system 270 and/or the supplemental navigation sensor system 288 in the manner described herein. The at least first case unit CUA and the at least second case unit CUB may comprise more than one case in an ordered sequence corresponding to a predetermined case out order sequence of mixed cases.
[0142] The autonomous guided vehicle 110 traverses the picking aisle 130A1 and/or transfer deck 130B, with both the at least first case unit CUA and the at least second case unit CUB held within the payload bed 210B, to a predetermined destination (such as outbound lift 150B1). The positions of the at least first case unit CUA and the at least second case unit CUB within the payload bed 210B may be monitored by at least one or more of the case unit monitoring cameras 410A, 410B, one or more three-dimensional imaging systems 440A, 440B, and one or more case edge detection sensors 450A, 450B, and arranged relative to one another (e.g., the supplemental navigation sensor system 288 at least in part effects on-the-fly justification and/or sortation of case units onboard the vehicle 110 in a manner substantially similar to that described in United States patent number 10,850,921, the disclosure of which was previously incorporated herein by reference in its entirety) within the payload bed 210B (e.g., with the justification blades 471, pushers 470, and/or pullers 472) based on data obtained from the at least one or more of the case unit monitoring cameras 410A, 410B, one or more three-dimensional imaging systems 440A, 440B, and one or more case edge detection sensors 450A, 450B. The autonomous transport vehicle 110 is localized (e.g., positioned) relative to the destination location with the physical characteristic sensor system 270 and/or the supplemental navigation sensor system 288 in the manner described herein. At the destination location the autonomous transport vehicle 110 places the at least first case unit CUA and/or the at least second case unit CUB (Fig. 14, Block 1430) where the transfer arm 210A is moved based on data obtained by one or more of the physical characteristic sensor system 270 and/or the supplemental navigation sensor system 288.
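A non-limiting sketch of the ordering step of on-the-fly sortation, in which buffered case units are arranged per the predetermined case out order sequence (the justification actuation itself, by blades, pushers, and pullers, is not modeled; all names are hypothetical):

```python
# Minimal sketch: reorder buffered case units to match a predetermined case-out order
# sequence prior to placement at the destination.
def sortation_plan(buffered_cases: list, order_out_sequence: list) -> list:
    """Return the buffered cases arranged per the order-out sequence."""
    rank = {sku: i for i, sku in enumerate(order_out_sequence)}
    return sorted(buffered_cases, key=lambda case: rank[case["sku"]])

if __name__ == "__main__":
    buffered = [{"sku": "CUB", "pos_mm": 300}, {"sku": "CUA", "pos_mm": 0}]
    print(sortation_plan(buffered, order_out_sequence=["CUA", "CUB"]))
```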
[0143] Referring to Figs. 1A, 2, 4D, 15, and 16, an exemplary method will be described in accordance with aspects of the disclosed embodiment. The method includes providing the autonomous transport vehicle 110 (Fig. 16, Block 1700) as described herein. The autonomous transport vehicle 110 is configured to autonomously navigate to different positions with the navigation system and operates to effect predetermined transfer tasks at the different positions (Fig. 16, Block 1705) while incidentally capturing image data (Fig. 16, Block 1710) with the supplemental hazard sensor system 290. As described herein, the image data informs objects and/or spatial features 299 (having intrinsic physical characteristics) within at least a portion of the facility 100 viewed by the at least one camera 292 of the supplemental hazard sensor system 290 with the autonomous transport vehicle 110 in the different positions in the facility 100.
[0144] The method may also include determining, with the vision system controller 122VC, from the information of the supplemental hazard sensor system 290 presence of a predetermined physical characteristic of at least one object or spatial feature (Fig. 16, Block 1715), and in response thereto, selectably reconfiguring the vehicle from an autonomous state to a collaborative vehicle state (Fig. 16, Block 1720) for collaboration with an operator, where the vehicle in the collaborative state is disposed to receive operator commands for the vehicle to continue effecting vehicle operation so as to finalize discrimination of the object or spatial feature 299 as a hazard (Fig. 16, Block 1725) and identify a mitigation action of the vehicle with respect to the hazard (Fig. 16, Block 1730) as described herein.
[0145] The vision system controller 122VC may also register the captured image data and generate therefrom at least one image of the presence of a predetermined physical characteristic of the at least one object or spatial feature 299 (Fig. 16, Block 1735) where, as described herein, the at least one image is formatted as a virtual representation VR of the predetermined physical characteristic of the at least one object or spatial feature 299 so as to provide comparison to one or more corresponding reference (e.g., a corresponding feature of the virtual model 400VM that serves as a reference for identifying the form and/or location of the imaged object or spatial feature 299) of the predetermined features of the reference representation 400VMR. As described herein, the vision system controller 122VC is configured so that the virtual representation VR, of the imaged object or spatial feature 299, is effected resident on (e.g., onboard) the autonomous transport vehicle 110, and the comparison between the virtual representation VR of the object or spatial feature 299 and the one or more corresponding reference predetermined features (of the reference representation 400VMR) is effected resident on the autonomous transport vehicle 110.
[0146] In the method, the vision system controller 122VC may determine presence of an unknown physical characteristic of the at least one object or spatial feature and switch the autonomous transport vehicle 110 from an autonomous operation state to a collaborative operation state. With the above noted switching effected, the controller 122 is configured to: stop the autonomous transport vehicle 110 relative to the object or spatial feature 299 or select an autonomous guided vehicle path and trajectory bringing the autonomous transport vehicle 110 from a position at switching to a location 157 to initiate communication to an operator for identifying the object or spatial feature 299 via a user interface device UI.
[0147] The controller 122VC transmits, via a wireless communication system (such as network 180) communicably coupling the vision system controller 122VC and the operator/user interface UI, an image (see Fig. 15) (Fig. 16, Block 1740) combining the virtual representation VR of the one or more imaged object or spatial feature 299 and one or more corresponding reference predetermined features RPF of a reference presentation RPR presenting the operator with an augmented (or un-augmented) reality image in real time. The controller 122 receives real time operator commands to the autonomous transport vehicle 110, which commands are responsive to the real time augmented reality or unaugmented image (see Fig. 15), and changes in the real time augmented reality or un-augmented image transmitted to the operator by the vision system controller 122VC.
[0148] Referring to Figs. 2, 19, 20, 25A-25C, and 26, the autonomous transport vehicle 110 includes the controller 122 that is coupled respectively to the drive section 261D, the case handling assembly 210, the peripheral electronics section 778, and other components/features of the autonomous transport vehicle 110 that are described herein so as to form a control system 122CS (see Figs. 25A-25C). The control system 122CS effects each autonomous operation of the autonomous transport vehicle 110 described herein. The control system 122CS may be configured to provide communications, supervisory control, vehicle localization, vehicle navigation and motion control, payload sensing, payload transfer, and vehicle power management as described herein. In this and other aspects, the control system may also be configured to provide any suitable services to the vehicle 110. The control system 122CS includes any suitable non-transitory program code and/or firmware that configure the vehicle 110 to perform the vehicle operations described herein. The control system 122CS may be configured for (but is not limited to) one or more of remote updating of control system firmware/software, remote debugging of the vehicle 110, remote operation of the vehicle 110, tracking a position of the vehicle 110, tracking operational status of the vehicle 110, and tracking any other suitable information pertaining to the vehicle 110.
[0149] As illustrated in, for example, Figs. 25A-25C, the control system 122CS is a distributed control system that includes, as described herein, the controller 122, the vision system controller 122VC, and the power management section 444 (which includes the switching device 449 and the monitoring and control device 447). In some aspects, one or more of the vision system controller 122VC and the power management section 444 are at least partially integral to the controller 122; while in other aspects one or more of the vision system controller 122VC and the power management section 444 are separate from but communicably coupled to the controller 122. Components of the control system (e.g., sensors, cameras, lighting, drives, motors, etc.) may be distributed throughout the autonomous transport vehicle 110 and communicably coupled to the controller 122 in any suitable manner (such as described in Figs. 25A-25C).
[0150] The controller 122 includes at least one of an autonomous navigation control section 122N and an autonomous payload handling control section 122H. The autonomous navigation control section 122N is configured to register and hold in a volatile memory (such as memory 446 of a comprehensive power management section 444 of the controller 122) autonomous guided vehicle state and pose navigation information that is deterministic (and provided in real time) of and describes current and predicted state, pose, and location of the autonomous transport vehicle 110. The autonomous transport vehicle state and pose navigation information includes both historic and current autonomous guided vehicle state and pose navigation information. The state, pose, and location information is deterministic (and provided in real time) and describes the current and predicted state, pose, and location in up to six degrees of freedom X, Y, Z, Rx, Ry, Rz so that the historic, current and predicted state, pose, and location is described in full. The autonomous payload handling control section 122H is configured to register and hold in the volatile memory (such as memory 446) current payload identity, state, and pose information (e.g., both historic and current). The payload identity, state, and pose information describes historic and current payload identity, payload pose and state location relative to a frame of reference of the autonomous transport vehicle (e.g., such as the X, Y, Z coordinate axes and suitable datum surfaces within the payload bed 210B), and pick/place locations of current and historic payloads.
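A non-limiting sketch of a volatile navigation registry holding current and historic six-degree-of-freedom vehicle pose and state, as described above; the data structures, bounds, and names are illustrative assumptions:

```python
# Minimal sketch: an in-process deque stands in for the controller's volatile memory,
# holding historic entries plus the current pose/state of the vehicle.
from collections import deque
from dataclasses import dataclass, field
from typing import Tuple

@dataclass
class VehiclePose:
    xyz: Tuple[float, float, float]   # X, Y, Z position in the facility frame
    rot: Tuple[float, float, float]   # Rx, Ry, Rz rotations
    state: str = "idle"

@dataclass
class NavigationRegistry:
    # Bounded volatile history; the newest entry is the current pose.
    history: deque = field(default_factory=lambda: deque(maxlen=1000))

    def register(self, pose: VehiclePose) -> None:
        self.history.append(pose)

    @property
    def current(self) -> VehiclePose:
        return self.history[-1]

if __name__ == "__main__":
    reg = NavigationRegistry()
    reg.register(VehiclePose((1.0, 2.0, 0.0), (0.0, 0.0, 90.0), state="traversing"))
    print(reg.current)
```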
[0151] As described herein the controller 122 comprises a comprehensive power management section 444 (also referred to as a power distribution unit - see also Fig. 26) that is separate and distinct from each other section (such as the vision system controller 122VC) of the controller 122. As will be described herein, the power distribution unit 444 is communicably connected to the power supply 481 so as to monitor a charge level (e.g., voltage level or current level) of the power supply 481. As also described herein, the power distribution unit 444 is connected to each respective branch circuit 482 (also referred to herein as a branch power circuit - see Fig. 26 as a non-limiting example) of the drive section 261D, the case handling assembly 210 and the peripheral electronics section 778 respectively powering the drive section 261D, the case handling assembly 210, and the peripheral electronics section 778 from the power supply 481. The power distribution unit 444 is configured to comprehensively manage power consumption to each respective branch circuit 482 based on demand level of each branch circuit 482 relative to the charge level available from the power supply 481.
[0152] The power distribution unit 444 includes a monitoring and control device 447 (referred to herein as monitoring device 447), a switching device 449 (having switches 449S), a memory 446, a wireless communications module 445, and an analog to digital converter 448 (referred to herein as AD converter 448). The monitoring device 447 is any suitable processing device configured to monitor at least the current usage and fuse status of the branch power circuits 482 and control shutdown of one or more selected branch power circuits 482 as described herein. For example, the monitoring device 447 is one or more of a field-programmable gate array (FPGA), a complex programmable logic device (CPLD), a system on chip integrated circuit (SOC), and a central processing unit (CPU). The monitoring device 447 operates independently of the controller 122 and vision system controller 122VC, and the monitoring device 447 is programmed with non-transitory code to manage (e.g., at least power distribution to) one or more low level systems of the autonomous transport vehicle 110. [0153] Referring to Figs. 1A, 1B, 2, 19, 20, 25A-25C, and 26, the power distribution unit 444 is configured to communicate with and control at least one branch device 483. For example, the power distribution unit 444 is communicably coupled to one or more of the analog sensors 483C (e.g., case edge detection sensors, line following sensors 275, and other analog sensors as described herein), the digital sensors 483B (e.g., cameras 410, 440, 450 of the vision system 400 and other digital sensors described herein), lights 483A, casters 250, drive/traction wheels 260, transfer arm 210A extension motors 667A-667C, transfer arm lift motors 669, payload justification motors 668A-668F of the payload bed 210B/transfer arm 210A, suspension lock motors, and any other suitable features of the autonomous transport vehicle 110 (see Figs. 20, 21, and 26) so as to provide power to (e.g., turn on and maintain powered operation of) the analog sensors 483C, the digital sensors 483B, and lights 483A, casters 250, drive/traction wheels 260, transfer arm 210A extension motors 667, transfer arm lift motors 669, payload justification motors 668 of the payload bed 210B/transfer arm 210A, suspension lock motors, and any other suitable features. Here, the power distribution unit 444 may receive commands from the controller 122 to actuate one or more of the analog sensors 483C and the digital sensors 483B so that the one or more of the analog sensors 483C and the digital sensors 483B obtain one or more of pose and location information of the autonomous transport vehicle within the storage and retrieval system 100 storage structure 130 in a manner substantially similar to that described herein, and in United States patent numbers 8,425,173 titled "Autonomous Transport for Storage and Retrieval Systems" issued on April 23, 2013; 9,008,884 titled "Bot Position Sensing" issued on April 14, 2015; and 9,946,265 titled "Bot Having High Speed Stability" issued on April 17, 2018, the disclosures of which are incorporated herein by reference in their entireties. The power distribution unit 444 is configured to process and filter (in any suitable manner) the sensor data obtained by the one or more of the analog sensors 483C and the digital sensors 483B.
The power distribution unit 444 may also be configured to process and filter (in any suitable manner) control signals sent by the controller 122 (or vision system controller 122VC) to the one or more of the analog sensors 483C and the digital sensors 483B. Where the sensor is an analog sensor 483C the power distribution unit 444 includes the AD converter 448 to effect conversion of the analog sensor data to digital sensor data for filtering and processing by the power distribution unit 444.
[0154] The autonomous transport vehicle may include lights 483A (Fig. 20, see also lighting/LED in Figs. 25A-25C) that are coupled to the frame 200 (or any other location of the autonomous transport vehicle 110) and that illuminate portions of the storage structure 130 adjacent the autonomous transport vehicle 110. The power distribution unit 444 is configured to control operation of the lights 483A. For example, the power distribution unit 444 is configured to provide a pulse width modulation control signal to the lights 483A to actuate the lights 483A in a manner that minimizes power consumption. Here, the pulse width modulation control signal is configured to minimize an amount of power drawn from the power supply 481 for illuminating the lights 483A for a given autonomous transport vehicle task (e.g., reading a barcode with the vision system 400, detecting case unit features with the vision system, illumination of a portion of the storage and retrieval system 100 for remote operator viewing effected by the vision system as described herein). The lights 483A may be any suitable lights including but not limited to light emitting diodes (LED).
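A non-limiting sketch of task-dependent pulse width modulation of the lights 483A; the task names, duty cycles, and frequency are illustrative assumptions only:

```python
# Minimal sketch: select a PWM duty cycle per vehicle task so that only the illumination
# needed for the current task is drawn from the power supply.
TASK_DUTY_CYCLE = {
    "barcode_read": 0.80,         # bright illumination for short exposures
    "case_feature_detect": 0.50,
    "remote_viewing": 0.30,
    "idle": 0.0,
}

def light_pwm_command(task: str, frequency_hz: int = 1000) -> dict:
    """Return a PWM command (duty cycle and frequency) for the given vehicle task."""
    duty = TASK_DUTY_CYCLE.get(task, 0.0)
    return {"frequency_hz": frequency_hz, "duty_cycle": duty}

if __name__ == "__main__":
    print(light_pwm_command("barcode_read"))
    print(light_pwm_command("idle"))
```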
[0155] Still referring to Figs. 2, 19, 20, 25A-25C, and 26, the power distribution unit 444 is configured to manage power needs of the autonomous transport vehicle 110 so as to preserve higher level functions/operations of the autonomous transport vehicle 110. The power distribution unit 444 is configured so as to comprehensively manage a demand charge level of each respective branch power circuit 482 (on which respective branch devices 483A-483F ... 483n (collectively referred to as branch devices 483, where n denotes an integer representing a maximum number of branch devices) are disposed - see Figs. 19, 20, 21, 25A-25C and 26) switching off each of the branch power circuits 482 in a predetermined pattern based on the demand level of each respective branch circuit with respect to other branch power circuits 482 and the charge level available from the power supply 481. The predetermined pattern (e.g., for switching off the branch power circuits 482) is arranged to switch off branch power circuits 482 with a decrease in the available charge level from the power supply 481, so as to maximize available charge level from the power supply 481 directed to the controller 122. The predetermined pattern is arranged to switch off the branch power circuits 482 with the decrease in the available charge level from the power supply 481 so that the available charge level directed to the controller 122 is equal to or exceeds the demand charge level of the controller 122 for a maximum time based on the available charge level of the power supply 481 (e.g., to preserve operation of the controller 122).
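A non-limiting sketch of such a predetermined shed pattern, in which lower-priority branch power circuits are switched off first until the remaining demand fits the available charge level; the branch names, priorities, and demand figures are illustrative assumptions:

```python
# Minimal sketch: shed branch power circuits, lowest priority first, until total demand
# fits within the available charge level, preserving the controller branch as long as possible.
# (priority, name, demand) — lower priority numbers are shed first.
BRANCHES = [
    (1, "peripheral_lighting", 15.0),
    (2, "payload_handling", 120.0),
    (3, "drive_section", 200.0),
    (4, "controller", 40.0),        # highest priority, shed last
]

def shed_branches(available: float) -> list:
    """Return the branch names to switch off for the given available charge level."""
    switched_off = []
    total_demand = sum(demand for _, _, demand in BRANCHES)
    for priority, name, demand in sorted(BRANCHES):
        if total_demand <= available:
            break
        switched_off.append(name)
        total_demand -= demand
    return switched_off

if __name__ == "__main__":
    print(shed_branches(available=400.0))  # nothing shed
    print(shed_branches(available=150.0))  # lighting, payload handling, then drive section shed
```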
[0156] As an example of branch power circuit 482 shut down and preservation of controller 122 operation, the monitoring device 447 of the power distribution unit 444 is configured to monitor the voltage of the power supply 481 (Fig. 23, Block 23800) as described herein and shut down components/systems (e.g., analog sensors, digital sensors, drive systems, communications systems, etc.) of the autonomous transport vehicle 110 in a sequenced shutdown order where each shutdown operation in the sequenced shutdown order depends on a respective threshold voltage of the power supply. For example, the power supply 481 has a fully charged voltage of V1. With the power distribution unit 444 detecting the voltage V1 the components/systems of the autonomous transport vehicle 110 are substantially fully operational to effect transport of case units throughout the storage structure 130.
[0157] With operation of the autonomous transport vehicle 110, the voltage of the power supply 481 may drop (and the power distribution unit 444 detects such voltage drop) to a first predetermined threshold voltage V2 (where V2 is less than V1). The power distribution unit 444 monitoring the power supply 481 voltage detects that power supply voltage drops to a voltage equal to about the first predetermined threshold voltage V2 (Fig. 23, Block 23810); and with the power supply 481 voltage at about the first predetermined threshold voltage V2 the power distribution unit 444 may operate the switches 449S to remove power from (e.g., shut down) branch power circuits 482 corresponding to case unit handling components/systems (e.g., arm extension drives 667, payload justification drives 668, arm lift drives 669, case unit sensors, arm/case unit justification position sensors, suspension locks, etc.) of the autonomous transport vehicle 110 (Fig. 23, Block 23820) so that remaining power of the power supply 481 may be employed to effect traverse of the autonomous transport vehicle to a charging station/location or other predetermined location within the storage structure 130. In aspects where power supply 481 charge is not sufficient to complete traverse of the autonomous transport vehicle 110 to a charging station, the controller 122 may effect traverse of the autonomous transport vehicle to a safe location as described herein (e.g., a predetermined location of the storage and retrieval system where the autonomous vehicle may be accessed by an operator for maintenance or removal from the storage structure 130). Suitable examples of charging stations that may be disposed in the storage and retrieval system are described in United States patent number 9,469,208 titled "Rover Charging System" and issued on October 18, 2016; United States patent number 11,001,444 titled "Storage and Retrieval System Rover Interface" and issued on May 11, 2021; and United States patent application number 14/209,086 titled "Rover Charging System" and filed on March 13, 2014, the disclosures of which are incorporated herein by reference in their entireties. [0158] The power distribution unit 444 continues to monitor the voltage of the power supply 481 for a drop in the power supply voltage to a subsequent (e.g., next) lower threshold voltage (Fig. 23, Block 23830). For example, where a threshold voltage of the power supply 481 of V3 (where V3 is less than V2) is detected by the power distribution unit 444, the power distribution unit 444 operates the switches 449S to remove power from (e.g., shut down) branch power circuits 482 (such as circuits 483D, 483F) corresponding to drives/systems that effect vehicle traverse (e.g., the right and left drive/traction wheels 260A, 260B (Figs. 2 and 21), caster wheel steering drives 600M (Fig. 2), traction control system 666 (Fig. 21), sensors and sensor controllers effecting vehicle navigation (e.g., vision system, line following sensors, etc. such as provided with sensor system 270)) (Fig. 23, Block 23840) so that remaining power of the power supply 481 may be employed to effect operation of the controller 122 of the autonomous transport vehicle 110. Here, primary communications between the autonomous transport vehicle 110 and the control server 120 and/or an operator may also be shut down to preserve power for the controller 122.
As described above, the communications module 445 of the power distribution unit 444 operates to maintain a secondary communications channel between the controller 122 and the control server 120 and/or an operator (e.g., via the laptop, smart phone/tablet, etc.).
[0159] As above, the power distribution unit 444 continues to monitor the voltage of the power supply 481 for the next subsequent lower threshold voltage (Fig. 23, Block 23850). For example, where a threshold voltage V4 (where V4 is less than V3) of the power supply 481 is detected by the power distribution unit 444, the power distribution unit 444 is configured to initiate shutting down of the controller 122 (Fig. 23, Block 23860) so that the controller 122 (and its software) is not adversely affected by a loss of power or an under-voltage/under-current failure. Here, the controller 122 is configured so that upon indication from (e.g., a prediction by) the power distribution unit 444 of imminent decrease in available charge level, directed from the power supply 481 to the controller 122, to less than a demand level of the controller 122, the controller 122 enters suspension of operation and hibernation. With the controller 122 in suspension and hibernating (e.g., shut down) the power distribution unit 444 may also shut itself down so that substantially all operations of the autonomous transport vehicle 110 are suspended.
[0160] It is noted that the threshold voltage V4 is described above as the "lowest threshold voltage" such that detection of the threshold voltage V4 initiates shutdown of the controller 122. However, it should be understood that the above shut down sequence effected by the power distribution unit 444 is exemplary only and in other aspects there may be any suitable number of threshold voltages at which any suitable number of corresponding vehicle components/systems are shut down to preserve power of the power supply 481. For example, Blocks 23830 and 23840 of Fig. 23 may be repeated in a loop until the next to lowest threshold voltage is reached. Here, each threshold voltage in the descending values of threshold voltages is known to the power distribution unit 444 (such as stored in memory 446 and accessible by the monitoring device 447) such that the loop ends when the next to lowest threshold voltage is reached.
[0161] Still referring to Figs. 2, 19, 20, 25A-25C, and 26 another exemplary shutdown operation will be described. Here the autonomous transport vehicle 110 has a power supply 481 with a fully charged voltage of about 46V (in other aspects the fully charged voltage may be more or less than about 46V). The power distribution unit 444 monitors the voltage output by the power supply 481 during autonomous transport vehicle 110 operation in a manner similar to that described above with respect to Fig. 23. Here, with the power supply 481 output at a threshold voltage of about 22V (in other aspects the output voltage may be more or less than about 22V) the power distribution unit 444 operates the switches 449S to disable the traction motors 261M and other features (e.g., sensors associated with navigation/traverse of the autonomous transport vehicle) of the autonomous transport vehicle so that driving of the autonomous transport vehicle is disabled.
[0162] The power distribution unit 444 continues to monitor the output voltage of the power supply 481 for the next lowest threshold voltage of about 20V (in other aspects the output voltage may be more or less than about 20V). Upon detection of the threshold voltage of about 20V, the power distribution unit 444 effects, through the controller 122, positioning of any case units CU carried by the autonomous transport vehicle 110 to a known safe state (e.g., retracted into the payload bed 210B in a predetermined justified location) within the payload bed 210B. In other aspects, where the autonomous transport vehicle 110 is located in front of a predetermined destination/place location for the case unit(s) CU being carried by the autonomous transport vehicle 110, the controller 122 may effect extension of the transfer arm 210A to place the case unit(s) CU at the destination location rather than retract the case unit(s) CU into the payload bed 210B (noting that after placement of the case unit(s) CU the transfer arm 210A is retracted within the payload bed 210B to a safe/home position).
[0163] The power distribution unit 444 is configured to operate the switches 449S, upon detection of the next lowest threshold voltage of about 18V of the power supply 481 (in other aspects the output voltage may be more or less than about 18V), so as to shut down the vision system 400 and other 24V peripheral power supplies (e.g., including but not limited to case detection sensors, vehicle localization sensors, hot swap circuitry, etc.). Upon detection of the next lowest power supply 481 output threshold voltage of about 14V (in other aspects the output voltage may be more or less than about 14V) the power distribution unit 444 is configured to operate the switches 449S to disable onboard and off-board communications (e.g., wireless communications module 445 and onboard Ethernet communications) of the autonomous transport vehicle 110. The power distribution unit 444 continues to monitor the power supply 481 output voltage for the next lowest threshold voltage of about 12V (in other aspects the output voltage may be more or less than about 12V), and upon detection of the about 12V output voltage the power distribution unit 444 turns off lighting (e.g., LEDs) of the autonomous transport vehicle 110 and provides command signals to the controller 122 so that the controller 122 is placed into hibernation/sleep as described above. Upon detection of the lowest power supply 481 output threshold voltage of about 10V (in other aspects the output voltage may be more or less than about 10V) by the power distribution unit 444, the power distribution unit 444 effects a complete shutdown of the autonomous transport vehicle 110 such that the controller 122, the vision system controller 122VC, and other suitable programmable devices (e.g., FPGAs, CPLDs, SOCs, CPUs, etc.) of the autonomous transport vehicle 110 are turned off/shut down.
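A non-limiting, table-driven sketch of the descending-threshold shutdown sequence exemplified above; the action hooks are stubs and the thresholds simply mirror the illustrative voltages of this example:

```python
# Minimal sketch: the measured supply voltage is compared against a descending threshold
# table; every action whose threshold has been reached is returned for execution.
SHUTDOWN_TABLE = [
    (22.0, "disable traction motors and navigation sensors"),
    (20.0, "move any payload to a safe state in the payload bed"),
    (18.0, "shut down vision system and 24V peripheral supplies"),
    (14.0, "disable onboard and off-board communications"),
    (12.0, "turn off lighting and place controller in hibernation"),
    (10.0, "complete vehicle shutdown"),
]

def shutdown_actions(measured_voltage: float) -> list:
    """Return every shutdown action whose threshold the measured voltage has reached."""
    return [action for threshold, action in SHUTDOWN_TABLE if measured_voltage <= threshold]

if __name__ == "__main__":
    print(shutdown_actions(21.0))   # only the traction/navigation step has been reached
    print(shutdown_actions(13.5))   # every step down through disabling communications
```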
[0164] The monitoring device 447 is configured to substantially continuously (e.g., with the autonomous transport vehicle 110 in operation) monitor power supply 481 operation and status. For example, the monitoring device 447 is configured to substantially continuously (or at any suitable predetermined time intervals) monitor a voltage of the power supply 481 (e.g., with any suitable voltage sensors) and communicate a low voltage condition (e.g., the voltage has dropped below a predetermined voltage level) to the controller 122 so that the controller 122 may effect a safe state of the autonomous transport vehicle 110. For example, the controller 122 is configured (e.g., via the monitoring device 447) so that upon indication from the power distribution unit 444 of imminent decrease in available charge level of the power supply 481, directed from the power supply 481 to the branch power circuit of the drive section 261D (see Fig. 21), the controller 122 is configured to command the drive section 261D so as to navigate the autonomous transport vehicle 110 along a predetermined auxiliary path AUXP and auxiliary trajectory AUXT (known to be safe, nonconflicting with other vehicles 110, and not impeding or blocking other vehicle paths, pass-through locations, or destination locations - see Fig. 1B) to a predetermined bot auxiliary stop location 157 in the storage and retrieval facility (e.g., structure) 130. The predetermined auxiliary stop location 157 is a safe, uncongested area of a transfer deck 130B or picking aisle 130A or a human access zone (such as described in United States patent number 10,088,840 titled "Automated Storage and Retrieval System with Integral Secured Personnel Access Zones and Remote Rover Shutdown" issued on October 2, 2018, the disclosure of which is incorporated herein by reference in its entirety).
[0165] The controller 122 is configured so that upon indication from the power distribution unit 444 of imminent decrease in available charge level of the power supply 481, directed from the power supply 481 to the branch circuit of the payload handling section 210 (see Fig. 21) the controller 122 is configured to command the payload handling section 210 to move the payload handling actuator or transfer arm 210A (e.g., via one or more of arm extension drives 667 and arm lift drives 669), and any payload thereon (e.g., via payload justification drives 668), to a predetermined safe payload position in the payload bed 210B. The safe payload position may be such that the payload does not overhang outside of the payload bed and is securely held within the payload bed 210B.
[0166] Referring to Figs. 1A, 1B, 2, 19, 20, and 21, as described herein the controller 122 may also be configured to actively monitor a health status of the autonomous transport vehicle 110 and effect onboard diagnostics of vehicle systems. As an example, vehicle system health is monitored in any suitable manner such as by monitoring current used and fuse status of the vehicle systems (and the branch power circuits 482 of which the branch devices 483 are a part). Here the controller 122 includes at least one of a vehicle health status monitor 447V, a drive section health status monitor 447D, a payload handling section health monitor 447H, and a peripheral electronics section health monitor 447P. The vehicle health status monitor 447V, the drive section health status monitor 447D, the payload handling section health monitor 447H, and the peripheral electronics section health monitor 447P may be sections of the monitoring device 447. The controller also includes a health status register section 447M, which may be a section of the memory 446 (or memory 122M or any other suitable memory accessible by the controller 122).
[0167] The vehicle health status monitor 447V may monitor dynamic responses of the frame 200 and wheel suspension, such as with any suitable vehicle health sensors (such as accelerometers) coupled to the frame (e.g., such as described in United States provisional patent application number 63/213,589 titled "Autonomous Transport Vehicle with Synergistic Vehicle Dynamic Response" and filed on June 22, 2021, the disclosure of which is incorporated herein by reference in its entirety). Where a dynamic response is outside of a predetermined range the vehicle health status monitor 447V may effect (through controller 122) a maintenance request (e.g., presented on user interface UI) to an operator of the storage and retrieval system 100. In other aspects, any suitable characteristics of the vehicle may be monitored by the vehicle health status monitor 447V.
[0168] The drive section health status monitor 447D may monitor power drawn by the motors 261M of the drive section 261D, drive section sensor (e.g., wheel encoders, etc.) status, and a status of the traction control system 666. Where the power usage of the motors 261M, drive section sensor responsiveness, and/or a traction control system response is outside of predetermined operating characteristics the drive section health status monitor 447D may effect (through controller 122) a maintenance request (e.g., presented on user interface UI) to an operator of the storage and retrieval system 100.
[0169] The payload handling section health monitor 447H may monitor power drawn by the motors (e.g., extension, lift, justification, etc.) of the case handling assembly 210 and a status of the case handling assembly sensors. Where the power usage of the case handling assembly motors and/or a case handling assembly sensor response is outside of predetermined operating characteristics the payload handling section health monitor 447H may effect (through controller 122) a maintenance request (e.g., presented on user interface UI) to an operator of the storage and retrieval system 100.
[0170] The peripheral electronics section health monitor 447P may monitor the sensor system 270 and the at least one peripheral motor 777. Where the power usage of at least one peripheral motor 777 and/or a sensor (of the sensor system 270) response is outside of predetermined operating characteristics the peripheral electronics section health monitor 447P may effect (through controller 122) a maintenance request (e.g., presented on user interface UI) to an operator of the storage and retrieval system 100.
[0171] As a non-limiting example of health monitoring, the power distribution unit 444 is configured to monitor current in the branch power circuits 482 (in any suitable manner, such as directly with ammeters or indirectly by monitoring voltage and/or resistance of the respective branch power circuits 482) and a status of the respective fuses 484 of the branch power circuits 482. Real-time feedback (e.g., input data relating to current and fuse status is processed by the monitoring device 447 within milliseconds so that the processed data is available substantially immediately as feedback) is provided to one or more of the controller 122 and control server 120 to effect autonomous transport vehicle 110 operator and/or service/maintenance requests.
[0172] The real time feedback effected by the monitoring device 447 monitoring at least the branch power circuit 482 current and fuse 484 status provides for onboard diagnostics and health monitoring of the autonomous transport vehicle systems. The power distribution unit 444 is configured to detect the fuse 484 status (e.g., inoperable or operable) based on, for example, current of the respective branch power circuit 482. Where there is an absence of current detected in the respective branch power circuit 482 the monitoring device 447 determines that the fuse 484 is inoperable (i.e., a fault state (see, e.g., Fig. 5) is detected) and in need of service; otherwise, where current is detected, the fuse 484 is operable. The monitoring device 447 provides the fuse status (e.g., fault state) as feedback to, for example, the control server 120 and/or an operator through the communications module 445 so that servicing of the autonomous transport vehicle 110 can be scheduled. As may be realized, the power distribution unit 444 is configured to monitor each branch power circuit 482 separately from each other branch power circuit 482 so that where a fuse is determined to be inoperable the monitoring device 447 also identifies the branch power circuit 482 of which the fuse is a part so as to reduce downtime and troubleshooting of the autonomous transport vehicle 110 for fuse 484 replacement.
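A non-limiting sketch of the per-branch current/fuse check described above; the branch identifiers, rated current, and overcurrent margin are illustrative assumptions:

```python
# Minimal sketch: absence of current in a branch power circuit is read as an inoperable
# fuse and reported as a fault identifying that branch, while an elevated current is
# reported as a possible impending electrical/mechanical fault.
def check_branch(branch_id: int, current_a: float, rated_current_a: float) -> dict:
    if current_a <= 0.0:
        status = "fuse_inoperable"          # no current: fuse blown, service required
    elif current_a > 1.2 * rated_current_a:
        status = "overcurrent_warning"      # possible impending drive/bearing fault
    else:
        status = "ok"
    return {"branch": branch_id, "current_a": current_a, "status": status}

if __name__ == "__main__":
    print(check_branch(482, current_a=0.0, rated_current_a=5.0))
    print(check_branch(483, current_a=7.2, rated_current_a=5.0))
```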
[0173] An increased current within a branch power circuit, as detected by the monitoring device 447 may be indicative of an impending drive motor fault, an impending bearing fault, or other impending electrical/mechanical fault. As noted above, each branch power circuit is monitored separately so that where an increased current is detected the corresponding branch power circuit 482 is also identified. The monitoring device 447 provides the increased current value (e.g., fault state) and identifies the branch power circuit 482 with the overcurrent therein to, for example, the control server 120 and/or an operator through the communications module 445 so that servicing of the autonomous transport vehicle 110 can be scheduled. [0174] The power distribution unit 444 is configured to monitor voltage regulators 490, branch device central processing units (CPUs) 491, and/or position sensors 492 of peripheral devices (e.g., such as transfer arm 210A, payload justification pushers/pullers, wheel encoders, navigation sensor systems (as described herein), payload positioning sensor systems (as described herein) (it is noted that suitable examples of payload justification pushers/pullers are described in, for example United States provisional patent application number 63/236,591 having attorney docket number 1127P015753-US (-#3) filed on August 24, 2021 and titled "Autonomous Transport Vehicle" as well as United States pre-grant publication number 2012/0189416 published on July 26, 2012 (United States patent application number 13/326, 52 filed on December 15, 2011) and titled "Automated Bot with Transfer Arm"; United States patent number 7591630 issued on September 22, 2009 titled "Materials-Handling System Using Autonomous Transfer and Transport Vehicles"; United States patent number 7991505 issued on August 2, 2011 titled "Materials-Handling System Using Autonomous Transfer and Transport Vehicles"; United States patent number 9561905 issued on February 7, 2017 titled "Autonomous Transport Vehicle"; United States patent number 9082112 issued on July 14, 2015 titled "Autonomous Transport Vehicle Charging System"; United States patent number 9850079 issued on December 26, 2017 titled "Storage and Retrieval System Transport Vehicle"; United States patent number 9187244 issued on November 17, 2015 titled "Bot Payload Alignment and Sensing"; United States patent number 9,499,338 issued on November 22, 2016 titled "Automated Bot Transfer Arm Drive System"; United States patent number 8965619 issued on February 24, 2015 titled "Bot Having High Speed Stability"; United States patent number 9008884 issued on April 14, 2015 titled "Bot Position Sensing"; United States patent number 8425173 issued on April 23, 2013 titled "Autonomous Transports for Storage and Retrieval Systems"; and United States patent number 8696010 issued on April 15, 2014 titled "Suspension System for Autonomous Transports", the disclosures of which were previously incorporated herein by reference in their entireties). As an example, the monitoring device 447 is configured to monitor communications between the position sensors 492 and the controller 122, communications between the branch device controller(s) 491 and the controller 122, and the voltage from the voltage regulators 490. 
Where communication that is expected from a sensor 492 and/or branch device controller 491 is not received, the monitoring device 447 may register a fault (e.g., time stamped) in the memory 446 and communicate such fault state (e.g., with the communications module 445) to the control server 120 and/or operator, effecting a maintenance request. Where the branch device 483/branch power circuit 482 from which the fault is obtained is of a lower operational importance, the monitoring device 447 may continue to monitor and register faults from the branch device 483/branch power circuit 482 and send a service requested message to the control server 120 or operator depending on a frequency of the faults or any other suitable criteria.
[0175] As another example, the monitoring device 447 is configured to monitor a voltage of a voltage regulator 490 for one or more branch power circuits 482 in any suitable manner (such as feedback from the voltage regulator or a voltmeter). Where there is an over-voltage or under-voltage detected by the monitoring device 447, the monitoring device 447 may register a fault (e.g., time stamped) in the memory 446 and communicate such fault state (e.g., with the communications module 445) to the control server 120 and/or operator, effecting a maintenance request. Where the branch device 483/branch power circuit 482 from which the fault is obtained is of a lower operational importance, the monitoring device 447 may continue to monitor and register faults from the voltage regulator 490 and send a service requested message to the control server 120 or operator depending on a frequency of the faults or any other suitable criteria (such as a magnitude of the over-voltage or under-voltage).
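The time-stamped fault registration and frequency-based service request described for lower-importance branch devices may be sketched, purely as an illustration, as follows; the observation window, fault-count threshold, and function names are not specified in the description and are chosen here only for the example.

```python
# Hedged sketch: time-stamped fault registration (memory 446) with a
# frequency-based service request. Window and threshold values are assumptions.
import time
from collections import defaultdict, deque

FAULT_WINDOW_S = 3600.0      # assumed observation window (one hour)
FAULT_COUNT_THRESHOLD = 5    # assumed fault count that triggers a service request

fault_log = defaultdict(deque)  # branch_id -> timestamps of registered faults


def register_fault(branch_id: int, send_service_request) -> None:
    """Register a time-stamped fault and request service if faults are frequent."""
    now = time.time()
    log = fault_log[branch_id]
    log.append(now)
    # Keep only faults inside the observation window.
    while log and now - log[0] > FAULT_WINDOW_S:
        log.popleft()
    if len(log) >= FAULT_COUNT_THRESHOLD:
        # e.g., message forwarded via communications module 445 to control server 120
        send_service_request(branch_id, len(log))
```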
[0176] Still referring to Figs. 1A, 1B, 2, 19, 20, and 21, the power distribution unit 444 of the controller 122 is configured as a boot device so that, at autonomous transport vehicle 110 cold startup (initialization), the monitoring device 447 is brought online before other sections of the controller 122 and vision system controller 122VC so as to set initial (safe) states of the autonomous transport vehicle 110 prior to boot-up of the controller 122 and vision system controller 122VC. To effect initialization of the autonomous transport vehicle 110 the controller 122 is configured so that upon indication from the power distribution unit 444 of imminent decrease in available power supply charge level, directed from the power supply 481 to the controller 122, to less than a demand level of the controller 122, the controller 122 configures at least one of the autonomous guided vehicle state and pose navigation information and the payload identity, state, and pose information, held in respective registry and memory (e.g., such as memory 446 or other memories 122M of corresponding ones of the autonomous navigation control section 122N, the autonomous payload handling control section 122H, and the vision system control section (e.g., vision system controller 122VC)), into an initialization file 122F (Fig. 2) available on reboot of the controller 122. The controller 122 may also be configured so that, upon indication from the power distribution unit 444 of imminent decrease in available power supply charge level, directed from the power supply 481 to the controller 122, to less than a demand level of the controller 122, the controller 122 configures stored health status information from the at least one of the vehicle health status monitor 447V, the drive section health status monitor 447D, the payload handling section health monitor 447H, and the peripheral electronics section health monitor 447P in the health status register section 447M (such as in memory 122M or memory 446) into an initialization file 122F available on reboot of the controller 122.
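A minimal sketch of the initialization file handling is given below, assuming a JSON serialization (the description does not specify a file format) and hypothetical field names for the vehicle state/pose, payload identity/state/pose, and health status register contents.

```python
# Minimal sketch, assuming JSON for initialization file 122F; field names are
# hypothetical placeholders and not taken from the description.
import json


def write_initialization_file(path, vehicle_state, payload_info, health_status):
    """Snapshot registry/memory contents so they are available on reboot."""
    snapshot = {
        "vehicle_state_and_pose": vehicle_state,      # e.g., from navigation section 122N
        "payload_identity_state_pose": payload_info,  # e.g., from payload handling section 122H
        "health_status": health_status,               # e.g., from the health status register section
    }
    with open(path, "w") as f:
        json.dump(snapshot, f)


def read_initialization_file(path):
    """Restore the snapshot on reboot of the controller."""
    with open(path) as f:
        return json.load(f)
```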
[0177] On initialization of the autonomous transport vehicle, the monitoring device 447 of the power distribution unit 444 is configured to control power up sequencing of the controller 122 sections (e.g., the autonomous navigation control section 122N, the autonomous payload handling control section 122H, and vision system controller 122VC), and branch devices 483 (e.g., sensors, drive motors, caster motors, transfer arm motors, justification device motors, payload bed 210B motors, etc.). The sequencing may be that the vision system controller 122VC is powered up before the autonomous navigation control section 122N and the branch devices are powered up last; however, in other aspects any suitable power sequence may be employed such that control devices are powered up before the devices they control.
[0178] Referring also to Fig. 27, an exemplary autonomous transport vehicle 110 power up or cold startup process will be described with the power distribution unit 444 as a boot device. Here, power to the autonomous transport vehicle 110 is turned on (Fig. 27, Block 2200) and the power distribution unit 444 monitors the output voltage of the power supply 481 and determines if the output voltage is greater than a startup threshold voltage V1a (Fig. 27, Block 2205) of about 16V (in other aspects the startup threshold voltage V1a may be more or less than about 16V). Where the power supply 481 output voltage is greater than the startup threshold voltage V1a the power distribution unit 444 operates switches 449S so that power is provided to, for example, the controller 122, the vision system controller 122VC, the wireless communications module 445, and the other suitable programmable devices (e.g., FPGAs, CPLDs, SOCs, CPUs, etc.) of the autonomous transport vehicle 110 (Fig. 27, Block 2210). Here, the initialization file 122F (described above) may be employed on startup of the controller 122, the vision system controller 122VC, the wireless communications module 445, and the other suitable programmable devices (e.g., FPGAs, CPLDs, SOCs, CPUs, etc.) (Fig. 27, Block 2215) so that startup and operation of the controlled devices is effected based on information in the initialization file 122F.
[0179] The power distribution unit 444 continues to monitor the voltage output by the power supply 481 and, where the output voltage is detected as being above a next higher startup threshold voltage V2a (Fig. 27, Block 2220) of about 18V (in other aspects the startup threshold voltage V2a may be more or less than about 18V), the power distribution unit 444 operates switches 449S to turn on the lighting (e.g., LEDs - see Figs. 10A-10C) of the autonomous transport vehicle 110 (Fig. 27, Block 2225). Where the next higher startup threshold voltage V2a has not been reached, the power distribution unit 444 continues to monitor the power supply 481 output voltage until the next higher startup threshold voltage V2a is reached (such as with the autonomous transport vehicle 110 being charged), or until a shutdown sequence is initiated (see Fig. 23 described herein).
[0180] With the power distribution unit 444 continuing to monitor the voltage output of the power supply 481, and with a next higher startup threshold voltage V3a detected by the power distribution unit (Fig. 27, Block 2230), the power distribution unit 444 operates the switches 449S so as to power up/turn on the case handling drives of, for example, the rear and front justification modules 210ARJ, 210AFJ, the payload bed 210B, and the transfer arm 210A (Fig. 27, Block 2235) as well as 24V peripherals and instruments (see Figs. 25A-25C) of the autonomous transport vehicle 110. Here, the threshold voltage V3a may be about 24V but in other aspects the threshold voltage V3a may be more or less than about 24V. If the voltage output of the power supply 481 is less than about 24V, the power distribution unit 444 continues to monitor the power supply 481 output voltage until the next higher startup threshold voltage V3a is reached (such as with the autonomous transport vehicle 110 being charged), or until a shutdown sequence is initiated (see Fig. 23 described herein).
[0181] With the power distribution unit 444 monitoring the voltage output of the power supply 481, and with detection of a next higher startup threshold voltage V4a (Fig. 27, Block 2240), the power distribution unit 444 operates the switches 449S so as to power up/turn on the traction drive motors 261M (Fig. 27, Block 2245). Here, the threshold voltage V4a may be about 28V but in other aspects the threshold voltage V4a may be more or less than about 28V. Where the voltage output of the power supply 481 is less than about 28V the power distribution unit 444 continues to monitor the power supply 481 output voltage until the next higher startup threshold voltage V4a is reached (such as with the autonomous transport vehicle 110 being charged), or until a shutdown sequence is initiated (see Fig. 23 described herein).
[0182] As may be realized, where the threshold voltage V4a is detected by the power distribution unit 444 at cold start of the autonomous transport vehicle 110, the power distribution unit 444 is configured (e.g., with any suitable non-transitory computer program code) to power up the components of the autonomous transport vehicle 110 in the manner/sequence described above with respect to Fig. 27. Here, the power distribution unit 444 is configured so that control devices are powered up before the devices they control.
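The staged cold-start sequence of Fig. 27 may be illustrated, under the assumption that the example thresholds of about 16V, 18V, 24V, and 28V are used, by the following sketch; the stage labels and function names are placeholders rather than part of the described implementation.

```python
# Illustrative sketch of the staged cold-start sequence of Fig. 27, assuming the
# example thresholds (~16 V, ~18 V, ~24 V, ~28 V); actual values may differ.
POWER_UP_STAGES = [
    (16.0, "controllers"),     # V1a: controller 122, vision controller 122VC, comms 445, programmable devices
    (18.0, "lighting"),        # V2a: vehicle LEDs
    (24.0, "case_handling"),   # V3a: justification modules, payload bed, transfer arm, 24 V peripherals
    (28.0, "traction_drives"), # V4a: traction drive motors 261M
]


def power_up_step(supply_voltage: float, enabled: set, close_switches) -> set:
    """Enable each stage, in order, once the supply voltage reaches its threshold.

    Because stages are evaluated in ascending threshold order, control devices
    are powered up before the devices they control.
    """
    for threshold, stage in POWER_UP_STAGES:
        if supply_voltage >= threshold:
            if stage not in enabled:
                close_switches(stage)  # e.g., operate switches 449S for that stage
                enabled.add(stage)
        else:
            break  # higher stages wait until the supply charges further
    return enabled
```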
[0183] Referring to Figs. 2, 19, 20, 21, and 28, as described herein, the controller 122 may be configured to effect one or more of onboard power supply charge mode, active control of inrush current to branch devices 483 (e.g., lower level system of the autonomous transport vehicle), and regenerative power supply 481 charging.
[0184] With the autonomous transport vehicle 110 at a charging station (Fig. 28, Block 1300) the power distribution unit 444 detects the presence of the traverse surface charging pad(s) (see Fig. 21 and Fig. 28, Block 1310). The power distribution unit 444, as described herein, is configured to monitor the output voltage of the power supply 481 and effect control tasks based on the output voltage level. Here, control of power supply 481 charging is based on the output voltage of the power supply 481 detected by the power distribution unit 444. Here, the monitoring device 447 of the power distribution unit 444 is configured to control a low level charging logic of the autonomous transport vehicle 110. An exemplary charging logic block diagram for the power distribution unit 444 is illustrated in Fig. 21. As can be seen in Fig. 21, the autonomous transport vehicle 110 is configured with vehicle mounted charging contacts that receive charging current from a charging pad located on a traverse surface of the transfer deck 130B, picking aisle 130A, and/or any other suitable traverse surface of the storage and retrieval system on which the autonomous transport vehicle 110 travels. The traverse surface mounted charging pad and the vehicle mounted charging contacts are substantially similar to those described in United States patent number 9,469,208 titled "Rover Charging System" and issued on October 18, 2016; United States patent number 11,001,444 titled "Storage and Retrieval System Rover Interface" and issued on May 11, 2021; and United States patent application number 14/209,086 titled "Rover Charging System" and filed on March 13, 2014. The autonomous transport vehicle 110 may also be configured with remote charging ports mounted to the front end 200E1 or rear end 200E2 of the frame 200 that engage (e.g., plug into) corresponding charge ports mounted to the storage structure 130 or a hand-held plug which an operator plugs into the remote charging ports of the autonomous transport vehicle 110.
[0185] The monitoring device 447 controls a charge mode/rate of the power supply 481 so as to maximize a number of charge cycles of the power supply 481. For example, the monitoring device 447 is configured to effect one or more of a trickle charge mode (e.g., having a charge rate below a set threshold voltage), a slow charge mode, and an ultra-high-speed (e.g., high current) charge mode, where the charging current is limited by the monitoring device 447 to a set maximum charge voltage threshold to substantially prevent adverse effects on the power supply 481 from charging. Here the charging current and voltage may be dependent on a capacity of and type of the power supply 481. The power supply 481 may have any suitable voltage and charge capacity and may be an ultra-capacitor or any other suitable power source (e.g., lithium ion battery pack, lead acid battery pack, etc.). As can also be seen in Fig. 6, the autonomous transport vehicle 110 includes suitable active reverse voltage protection for the power supply 481.
[0186] As an example of charge rate control, with the vehicle charge contacts coupled with the traverse surface charging pad (see Fig. 21), where the power distribution unit 444 detects that the output voltage from the power supply 481 is below a threshold charging voltage V1c (Fig. 28, Block 1320) of about 23V (in other aspects the threshold charging voltage V1c may be more or less than 23V), the monitoring device 447 of the power distribution unit 444 effects a limited current charging of the power supply 481 (Fig. 28, Block 1330). For example, the limited charging current may be the slow charging mode described above. The slow charging mode described above may have a charge current higher than that of the trickle charging mode but lower than a full charge current. The power distribution unit 444 continues to monitor the output voltage of the power supply 481 during charging and, with the detection of the output voltage of the power supply 481 being at or above the threshold charging voltage V1c (Fig. 28, Block 1320), the monitoring device 447 of the power distribution unit 444 effects another charging mode, such as the full charge current mode (Fig. 28, Block 1350). The power distribution unit 444 monitors the output voltage of the power supply 481 during charging at full charge current and, where the output voltage is at or greater than a next higher threshold charging voltage V2c (Fig. 28, Block 1340) of about 44V (in other aspects the output voltage may be more or less than about 44V), the monitoring device 447 of the power distribution unit 444 terminates charging. In other aspects, upon detection of the output voltage being at or greater than about 44V, the monitoring device 447 may effect the trickle charge mode so as to maintain the power supply 481 at peak/maximum charge with the vehicle charge contacts of the autonomous transport vehicle 110 engaged/coupled with the traverse surface charging pad(s) (see Fig. 21).
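A simplified sketch of the charge-mode selection described above, assuming the example thresholds V1c of about 23V and V2c of about 44V, follows; the mode names are placeholders and the actual charging logic of Fig. 28 may differ.

```python
# Sketch only: charge-mode selection based on power supply output voltage,
# assuming the example thresholds V1c ~ 23 V and V2c ~ 44 V from Fig. 28.
def select_charge_mode(output_voltage: float) -> str:
    if output_voltage < 23.0:
        return "slow_charge"        # limited current while deeply discharged
    if output_voltage < 44.0:
        return "full_charge"        # full charge current up to V2c
    # At or above V2c either terminate charging or trickle charge to hold peak.
    return "trickle_or_terminate"
```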
[0187] Still referring to Figs. 2, 19, 20, and 21, the autonomous transport vehicle 110 includes one or more of current inrush protection, over voltage/current protection, and under voltage/current protection. For example, the autonomous transport vehicle 110 may include hot swap circuitry (substantially similar to that described in United States patent number 9,469,208 titled "Rover Charging System" and issued on October 18, 2016; United States patent number 11,001,444 titled "Storage and Retrieval System Rover Interface" and issued on May 11, 2021; and United States patent application number 14/209,086 titled "Rover Charging System" and filed on March 13, 2014) that is configured to effect autonomous transport vehicle 110 roll-on and roll-off of the traverse surface mounted charging pads regardless of an energization status of the traverse surface mounted charging pads. Here, the power distribution unit 444 is configured to actively control inrush current to the branch devices 483A-483F ... 483n (collectively referred to as branch devices 483, where n denotes an integer representing a maximum number of branch devices) of the respective branch power circuits 482, where the power distribution unit 444 receives from the controller 122 (and the controller 122 is configured to generate) a pulse width modulation signal that effects active control of the switches 449S to limit the inrush current (such as from charging or power surges) to the branch devices 483. For example, at initial contact between the vehicle charging contacts and the traverse surface mounted charging pad the power distribution unit 444 may operate one or more of the switches 449S so as to open the one or more switches to prevent inrush current from flowing to the branch devices 483.
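As a hedged illustration of the pulse width modulation based inrush limiting, a soft-start duty cycle ramp such as the following could be applied to a branch switch 449S at initial contact with the charging pad; the ramp shape and duration are assumptions, not values taken from the description.

```python
# Hedged sketch of PWM-based inrush limiting: the controller ramps the duty
# cycle applied to a branch switch so current rises gradually at initial contact.
# The linear ramp and 0.5 s ramp time are assumptions for illustration only.
def inrush_duty_ramp(elapsed_s: float, ramp_time_s: float = 0.5) -> float:
    """Duty cycle (0..1) applied to the branch switch during soft start."""
    if elapsed_s <= 0.0:
        return 0.0  # switch held open at first contact, blocking inrush current
    return min(1.0, elapsed_s / ramp_time_s)  # linear ramp to fully closed
```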
[0188] Referring to Figs. 2, 19, and 22, one or more of the branch power circuits includes an electrical protection circuit 700 configured to protect the branch device 483 (a sensor is illustrated in Fig. 22 for exemplary purposes but in other aspects any suitable branch device, such as those described herein, may be provided). The electrical protection circuit 700 is configured to substantially protect the branch device 483 (and any control/measurement instrument devices associated therewith) from, for example, short circuits, over-voltage, and over-current. For example, the branch device 483 (in this example a sensor) operates with an about 4 mA to about 20 mA signal. The electrical protection circuit 700, for exemplary purposes only, includes an adjustable three-terminal positive-voltage regulator 710 and a single resistor 720. The voltage regulator 710 is configured to supply more than about 1.5 A over an output-voltage range of about 1.25 V to about 37 V. The voltage regulator 710 with the resistor 720 coupled thereto limits the current to about 27 mA by leveraging the internal reference voltage of the voltage regulator 710. The insertion of the electrical protection circuit 700 into the branch power circuit 482 substantially does not affect the about 4 mA to about 20 mA signal while providing control/measurement protection to devices disposed both upstream and downstream (with respect to the flow of current) of the electrical protection circuit 700. It is again noted that the configuration of the electrical protection circuit 700 is exemplary only and that the electrical protection circuit 700 may be configured with any suitable voltage regulator and resistor (having suitable specifications) for providing control/measurement protection for signals that are less than about 4 mA or more than about 20 mA.
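As a worked example under the assumption that the regulator 710 is an LM317-class device with an internal reference of about 1.25 V (the description does not name a specific part), the resistor value that yields the stated limit of about 27 mA can be estimated as follows.

```python
# Worked example (assumption: a regulator with ~1.25 V internal reference, e.g.
# an LM317-class device wired as a current limiter). The ~27 mA limit comes from
# the description; the resistor value below is inferred, not stated.
V_REF = 1.25            # volts, regulator internal reference (assumed)
I_LIMIT = 0.027         # amperes, target current limit from the description
R = V_REF / I_LIMIT     # current-limit relation I = V_REF / R, so R ~ 46.3 ohms
print(f"R ~ {R:.1f} ohms for a {I_LIMIT * 1000:.0f} mA limit")
```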
[0189] Referring to Figs. 19, 20, and 21, the power distribution unit 444 is configured to effect regenerative charging of the power supply 481. For example, with the right and left drive wheels 260A, 260B rolling, but not under power (e.g., such as during braking), the back electromotive force (EMF) voltage produced by the respective motors 261M is fed back into the respective branch power circuit 483E, 483F. The monitoring device 447 may operate the switches 449S (such as the Vcap_IN switch - see Fig. 21) so that the back EMF voltage (and current) regeneratively charges the power supply 481. With the motors 261M under power to drive the drive wheels 260A, 260B the monitoring device 447 may close the Vcap_IN switch to prevent power drain from the power supply 481.
[0190] Referring to Figs. 2, 19, and 20, as described herein, the power distribution unit 444 includes the wireless communication module 445. The wireless communication module 445 may be configured for any suitable wireless communication including, but not limited to, Wi-Fi, Bluetooth, cellular, etc. The wireless communication module 445 configures the power distribution unit 444 so as to control at least in part, for example, communication between the autonomous transport vehicle 110 and other features of the storage and retrieval system including but not limited to the control server 120 over any suitable network such as network 180. Here, the wireless communication module 445 and monitoring device 447 configure the power distribution unit 444 as a secondary processor/controller such as where processing function errors of the controller 122 (e.g., such as safety related functions including remote shutdown, communications or other general component errors) are detected by the monitoring device 447. Where a controller 122 error occurs in communication or control effected by the controller 122, the power distribution unit 444 maintains (secondary) communication between the control server 120 (and operators of the storage and retrieval system 100) and the different components of the autonomous transport vehicle 110 (e.g., through the communication module 445) so that the autonomous transport vehicle 110 can be remotely shut down or driven (either autonomously, semi-autonomously, or under manual remote control of an operator) in a manner described herein to any suitable destination location.
[0191] The wireless communication module 445 also provides for "over the air" programming of the controller 122, vision system controller 122VC and updating firmware/programming of the monitoring device 447 or other suitable programmable devices (e.g., FPGAs, CPLDs, SOCs, CPUs, etc.) of the autonomous transport vehicle 110. Here an operator of the storage and retrieval system
100 may push or otherwise upload software updates to the autonomous vehicle 110 over the network 180 (which is at least in part a wireless network) through the control server 120 or with other suitable device such as a laptop, smart phone/tablet, etc. The power distribution unit 444 includes any suitable memory 446 that may buffer the software updates for installation in the monitoring device 447, controller 122, vision system controller 122VC and/or other suitable programmable devices (e.g., FPGAs, CPLDs, SOCs, CPUs, etc.).
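A minimal sketch of buffering an over-the-air update in memory 446 before installation might look as follows; the chunked transfer, hash verification, and function names are assumptions, as the description does not specify an update protocol.

```python
# Hedged sketch: buffering an over-the-air update in memory before installation.
# Chunking, SHA-256 verification, and the install hand-off are assumptions.
import hashlib
from typing import Iterable, Optional


def buffer_update(chunks: Iterable[bytes], expected_sha256: str) -> Optional[bytes]:
    """Accumulate update chunks and verify integrity before handing off to install."""
    image = b"".join(chunks)
    if hashlib.sha256(image).hexdigest() != expected_sha256:
        return None   # reject a corrupted update; keep running current firmware
    return image      # caller installs to e.g. controller 122, 122VC, or monitoring device 447
```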
[0192] The wireless communication module 445 of the power distribution unit 444 may also be configured as an Ethernet switch or bridge. Here, the wireless communication modules 445 of the autonomous transport vehicles 110 travelling throughout the storage structure 130 may form a mesh network. In this manner wireless communications from, for example, the control server 120 or other suitable device such as a laptop, smart phone/tablet, etc. may be extended to a range that covers substantially an entirety of the storage structure 130 without dedicated Ethernet switches and bridges being disposed throughout (e.g., mounted to) the storage structure 130 in fixed/predetermined locations.
[0193] Referring now to Figs. 1A, 2, 19, 20, 21, and 24 an exemplary method for autonomous guided vehicle power management will be described in accordance with aspects of the disclosed embodiment. The method includes providing the autonomous transport vehicle 110 as described herein (Fig. 24, Block 24900). Autonomous operation of the autonomous transport vehicle 110 is effected with the controller 122 (Fig. 24, Block 24910) and a charge level of the power supply 481 of the autonomous transport vehicle 110 is monitored by the power distribution unit 444 (Fig. 24, Block 24920) as described herein. The method may also include, as described herein, the switching of the branch power circuits 482 on and off in the predetermined pattern (such as described herein) based on the demand charge level of each respective branch power circuit 482 with respect to other branch power circuits 482 and the charge level available from the power supply 481 (Fig. 24, Block 24930).
[0194] Upon indication from the power distribution unit 444 of imminent decrease in available power supply charge level, directed from the power supply 481 to the branch circuit 482 of the drive section 261D (see Fig. 21) and/or the case handling assembly 210, the controller 122 commands the drive section 261D to move the autonomous transport vehicle 110 to a safe location and/or commands the case handling assembly 210 to move the payload to a safe location (Fig. 24, Block 24960) as described herein.
[0195] Upon indication from the power distribution unit 444 of imminent decrease in available power supply charge level, directed from the power supply 481 to the controller 122, to less than a demand level of the controller 122, the controller 122 enters suspension of operation and hibernation (Fig. 24, Block 24950) as described herein.
[0196] As also described herein, upon indication from the power distribution unit 444 of imminent decrease in available power supply charge level, directed from the power supply 481 to the controller 122, to less than a demand level of the controller 122, the controller 122 creates at least one initialization file (Fig. 24, Block 24940). As described herein, the controller 122 may configure at least one of the autonomous guided vehicle state and pose navigation information and the payload identity, state, and pose information, held in respective registry and memory (e.g., such as memory 446 or other memories 122M of corresponding ones of the autonomous navigation control section 122N, the autonomous payload handling control section 122H, and the vision system control section (e.g., vision system controller 122VC)) of corresponding controller sections, into an initialization file 122F available on reboot of the controller 122. The controller 122 may store health status information from the at least one of the vehicle health status monitor 447V, the drive section health status monitor 447D, the payload handling section health monitor 447H, and the peripheral electronics section health monitor 447P in the health status register section 447M into the initialization file 122F (or a different initialization file) available on reboot of the controller 122.
[0197] In accordance with one or more aspects of the disclosed embodiment an autonomous guided vehicle comprises:
[0198] a frame with a payload hold;
[0199] a drive section coupled to the frame with drive wheels supporting the autonomous guided vehicle on a traverse surface, the drive wheels effect vehicle traverse on the traverse surface moving the autonomous guided vehicle over the traverse surface in a facility;

[0200] a payload handler coupled to the frame configured to transfer a payload, with a flat undeterministic seating surface seated in the payload hold, to and from the payload hold of the autonomous guided vehicle and a storage location, of the payload, in a storage array;
[0201] a physical characteristic sensor system connected to the frame having electro-magnetic sensors, each responsive to interaction or interface of a sensor emitted or generated electromagnetic beam or field with a physical characteristic, the electromagnetic beam or field being disturbed by interaction or interface with the physical characteristic, and which disturbance is detected by and effects sensing by the electro-magnetic sensor of the physical characteristic, wherein the physical characteristic sensor system is configured to generate sensor data embodying at least one of a vehicle navigation pose or location information and payload pose or location information; and
[0202] a supplemental sensor system, connected to the frame, that supplements the physical characteristic sensor system, the supplemental sensor system being, at least in part, a vision system with cameras disposed to capture image data informing the at least one of a vehicle navigation pose or location and payload pose or location supplement to the information of the physical characteristic sensor system.
[0203] In accordance with one or more aspects of the disclosed embodiment the autonomous guided vehicle further comprises a controller connected to the frame, operably connected to the drive section or the payload handler, and communicably connected to the physical characteristic sensor system, wherein the controller is configured to determine from the information of the physical characteristic sensor system vehicle pose and location effecting independent guidance of the autonomous guided vehicle traversing the facility.
[0204] In accordance with one or more aspects of the disclosed embodiment the controller is configured to determine from the information of the physical characteristic sensor system payload pose and location effecting independent underpick and place of the payload to and from the storage location and independent underpick and place of the payload in the payload hold.
[0205] In accordance with one or more aspects of the disclosed embodiment the controller is programmed with a reference representation of predetermined features defining at least part of the facility traversed through by the autonomous guided vehicle.
[0206] In accordance with one or more aspects of the disclosed embodiment the controller is configured to register the captured image data and generate therefrom at least one image of one or more features of the predetermined features, the at least one image being formatted as a virtual representation of the one or more predetermined features so as to provide comparison to one or more corresponding reference of the predetermined features of the reference representation.
[0207] In accordance with one or more aspects of the disclosed embodiment the controller is configured so that the virtual representation, of the imaged one or more features of the predetermined features, is effected resident on the autonomous guided vehicle, and comparison between the virtual representation of the one or more imaged predetermined features and the one or more corresponding reference predetermined features is effected resident on the autonomous guided vehicle.
[0208] In accordance with one or more aspects of the disclosed embodiment the controller is configured to confirm autonomous guided vehicle pose and location information registered by the controller from the physical characteristic sensor system based on the comparison between the virtual representation and the reference representation.
[0209] In accordance with one or more aspects of the disclosed embodiment the controller is configured to identify a variance in the autonomous guided vehicle pose and location based on the comparison between the virtual representation and the reference representation, and update or complete autonomous guided vehicle pose or location information from the physical characteristic sensor system based on the variance.
[0210] In accordance with one or more aspects of the disclosed embodiment the controller is configured to determine a pose error in the information from the physical characteristic sensor system and fidelity of the autonomous guided vehicle pose and location information from the physical characteristic sensor system based on at least one of the identified variance and analysis of the at least one image, and assign a confidence value according to at least one of the pose error and the fidelity.
[0211] In accordance with one or more aspects of the disclosed embodiment the controller is configured so that with the confidence value below a predetermined threshold, the controller switches autonomous guided vehicle navigation based on pose and location information generated from the virtual representation in place of pose and location information from the physical characteristic sensor system.
[0212] In accordance with one or more aspects of the disclosed embodiment after switching, the controller is configured to:
[0213] continue autonomous guided vehicle navigation to destination, or
[0214] select an autonomous guided vehicle safe path and trajectory bringing the autonomous guided vehicle from a position at switching to a safe location for shut down, or
[0215] initiate communication to an operator identifying autonomous guided vehicle kinematic data and a destination for operator selection of autonomous guided vehicle control from automatic operation to quasi automatic operation or manual operation via a user interface device.
[0216] In accordance with one or more aspects of the disclosed embodiment the controller is configured to confirm payload pose and location information registered by the controller from the physical characteristic sensor system based on the comparison between the virtual representation and the reference representation.
[0217] In accordance with one or more aspects of the disclosed embodiment the controller is configured to identify a variance in the payload pose and location based on the comparison between the virtual representation and the reference representation, and update or complete payload pose or location information from the physical characteristic sensor system based on the variance.
[0218] In accordance with one or more aspects of the disclosed embodiment the controller is configured to determine a pose error in the information from the physical characteristic sensor system and fidelity of the payload pose and location information from the physical characteristic sensor system based on at least one of the identified variance and analysis of the at least one image, and assign a confidence value according to at least one of the pose error and the fidelity.
[0219] In accordance with one or more aspects of the disclosed embodiment the controller is configured so that with the confidence value below a predetermined threshold, the controller switches autonomous guided vehicle payload handling based on pose and location information generated from the virtual representation in place of pose and location information from the physical characteristic sensor system.
[0220] In accordance with one or more aspects of the disclosed embodiment after switching, the controller is configured to:

[0221] continue autonomous guided vehicle handling to destination, or
[0222] initiate communication to an operator identifying payload data along with an operator selection of autonomous guided vehicle control from automatic payload handling operation to quasi automatic payload handling operation or manual payload handling operation via a user interface device.
[0223] In accordance with one or more aspects of the disclosed embodiment the controller is configured to transmit, via a wireless communication system communicably coupling the controller and an operator interface, a simulation image combining the virtual representation of the one or more imaged predetermined features and one or more corresponding reference predetermined features of a reference presentation presenting the operator with an augmented reality image in real time.
[0224] In accordance with one or more aspects of the disclosed embodiment the controller is configured to receive real time operator commands to the traversing autonomous guided vehicle, which commands are responsive to the real time augmented reality image, and changes in the real time augmented reality image transmitted to the operator by the controller.
[0225] In accordance with one or more aspects of the disclosed embodiment the supplemental sensor system at least in part effects on-the-fly justification and/or sortation of case units onboard the autonomous guided vehicle.

[0226] In accordance with one or more aspects of the disclosed embodiment imaged or viewed objects described by one or more of supplemental information, supplemental vehicle navigation pose or location, and supplemental payload pose or location, from the supplemental sensor system, are coapted to a reference model of one or more of surrounding facility features and interfacing facility features so as to enhance, via the one or more of the supplemental information, the supplemental vehicle navigation pose or location, and the supplemental payload pose or location, resolution of one or more of the vehicle navigation pose or location information and the payload pose or location information.
[0227] In accordance with one or more aspects of the disclosed embodiment an autonomous guided vehicle comprises:
[0228] a frame with a payload hold;
[0229] a drive section coupled to the frame with drive wheels supporting the vehicle on a traverse surface, the drive wheels effect vehicle traverse on the traverse surface moving the autonomous guided vehicle over the traverse surface in a facility;
[0230] a payload handler coupled to the frame configured to transfer a payload, with a flat undeterministic seating surface seated in the payload hold, to and from the payload hold of the autonomous guided vehicle and a storage location, of the payload, in a storage array;
[0231] a physical characteristic sensor system connected to the frame having electro-magnetic sensors, each responsive to interaction or interface of a sensor emitted or generated electromagnetic beam or field with a physical characteristic, the electromagnetic beam or field being disturbed by interaction or interface with the physical characteristic, and which disturbance is detected by and effects sensing by the electro-magnetic sensor of the physical characteristic, wherein the physical characteristic sensor system is configured to generate sensor data embodying at least one of a vehicle navigation pose or location information and payload pose or location information; and
[0232] an auxiliary sensor system, connected to the frame, that is separate and distinct from the physical characteristic sensor system, the auxiliary sensor system being, at least in part, a vision system with cameras disposed to capture image data informing the at least one of a vehicle navigation pose or location and payload pose or location which image data is auxiliary information to the information of the physical characteristic sensor system.
[0233] In accordance with one or more aspects of the disclosed embodiment the autonomous guided vehicle further comprises a controller connected to the frame, operably connected to the drive section or the payload handler, and communicably connected to the physical characteristic sensor system, wherein the controller is configured to determine from the information of the physical characteristic sensor system vehicle pose and location effecting independent guidance of the autonomous guided vehicle traversing the facility.

[0234] In accordance with one or more aspects of the disclosed embodiment the controller is configured to determine from the information of the physical characteristic sensor system payload pose and location effecting independent underpick and place of the payload to and from the storage location and independent underpick and place of the payload in the payload hold.
[0235] In accordance with one or more aspects of the disclosed embodiment the controller is programmed with a reference representation of predetermined features defining at least part of the facility traversed through by the autonomous guided vehicle.
[0236] In accordance with one or more aspects of the disclosed embodiment the controller is configured to register the captured image data and generate therefrom at least one image of one or more features of the predetermined features, the at least one image being formatted as a virtual representation of the one or more predetermined features so as to provide comparison to one or more corresponding reference of the predetermined features of the reference representation.
[0237] In accordance with one or more aspects of the disclosed embodiment the controller is configured so that the virtual representation, of the imaged one or more features of the predetermined features, is effected resident on the autonomous guided vehicle, and comparison between the virtual representation of the one or more imaged predetermined features and the one or more corresponding reference predetermined features is effected resident on the autonomous guided vehicle.

[0238] In accordance with one or more aspects of the disclosed embodiment the controller is configured to confirm autonomous guided vehicle pose and location information registered by the controller from the physical characteristic sensor system based on the comparison between the virtual representation and the reference representation.
[0239] In accordance with one or more aspects of the disclosed embodiment the controller is configured to identify a variance in the autonomous guided vehicle pose and location based on the comparison between the virtual representation and the reference representation, and update or complete autonomous guided vehicle pose or location information from the physical characteristic sensor system based on the variance.
[0240] In accordance with one or more aspects of the disclosed embodiment the controller is configured to determine a pose error in the information from the physical characteristic sensor system and fidelity of the autonomous guided vehicle pose and location information from the physical characteristic sensor system based on at least one of the identified variance and analysis of the at least one image, and assign a confidence value according to at least one of the pose error and the fidelity.
[0241] In accordance with one or more aspects of the disclosed embodiment the controller is configured so that with the confidence value below a predetermined threshold, the controller switches autonomous guided vehicle navigation based on pose and location information generated from the virtual representation in place of pose and location information from the physical characteristic sensor system.
[0242] In accordance with one or more aspects of the disclosed embodiment after switching, the controller is configured to:
[0243] continue autonomous guided vehicle navigation to destination or select an autonomous guided vehicle safe path and trajectory bringing the autonomous guided vehicle from a position at switching to a safe location for shut down, or
[0244] initiate communication to an operator identifying autonomous guided vehicle kinematic data and a destination for operator selection of autonomous guided vehicle control from automatic operation to quasi automatic operation or manual operation via a user interface device.
[0245] In accordance with one or more aspects of the disclosed embodiment the controller is configured to confirm payload pose and location information registered by the controller from the physical characteristic sensor system based on the comparison between the virtual representation and the reference representation.
[0246] In accordance with one or more aspects of the disclosed embodiment the controller is configured to identify a variance in the payload pose and location based on the comparison between the virtual representation and the reference representation, and update or complete payload pose or location information from the physical characteristic sensor system based on the variance.

[0247] In accordance with one or more aspects of the disclosed embodiment the controller is configured to determine a pose error in the information from the physical characteristic sensor system and fidelity of the payload pose and location information from the physical characteristic sensor system based on at least one of the identified variance and analysis of the at least one image, and assign a confidence value according to at least one of the pose error and the fidelity.
[0248] In accordance with one or more aspects of the disclosed embodiment the controller is configured so that with the confidence value below a predetermined threshold, the controller switches autonomous guided vehicle payload handling based on pose and location information generated from the virtual representation in place of pose and location information from the physical characteristic sensor system.
[0249] In accordance with one or more aspects of the disclosed embodiment after switching, the controller is configured to:
[0250] continue autonomous guided vehicle handling to destination, or
[0251] initiate communication to an operator identifying payload data along with an operator selection of autonomous guided vehicle control from automatic payload handling operation to quasi automatic payload handling operation or manual payload handling operation via a user interface device.

[0252] In accordance with one or more aspects of the disclosed embodiment the controller is configured to transmit, via a wireless communication system communicably coupling the controller and an operator interface, a simulation image combining the virtual representation of the one or more imaged predetermined features and one or more corresponding reference predetermined features of a reference presentation presenting the operator with an augmented reality image in real time.
[0253] In accordance with one or more aspects of the disclosed embodiment the controller is configured to receive real time operator commands to the traversing autonomous guided vehicle, which commands are responsive to the real time augmented reality image, and changes in the real time augmented reality image transmitted to the operator by the controller.
[0254] In accordance with one or more aspects of the disclosed embodiment the supplemental sensor system at least in part effects on-the-fly justification and/or sortation of case units onboard the autonomous guided vehicle.
[0255] In accordance with one or more aspects of the disclosed embodiment imaged or viewed objects described by one or more of supplemental information, supplemental vehicle navigation pose or location, and supplemental payload pose or location, from the auxiliary sensor system, are coapted to a reference model of one or more of surrounding facility features and interfacing facility features so as to enhance, via the one or more of the supplemental information, the supplemental vehicle navigation pose or location, and the supplemental payload pose or location, resolution of one or more of the vehicle navigation pose or location information and the payload pose or location information.
[0256] In accordance with one or more aspects of the disclosed embodiment a method comprises:
[0257] providing an autonomous guided vehicle with:
[0258] a frame with a payload hold,
[0259] a drive section coupled to the frame with drive wheels supporting the autonomous guided vehicle on a traverse surface, the drive wheels effect vehicle traverse on the traverse surface moving the autonomous guided vehicle over the traverse surface in a facility, and
[0260] a payload handler coupled to the frame configured to transfer a payload, with a flat undeterministic seating surface seated in the payload hold, to and from the payload hold of the autonomous guided vehicle and a storage location, of the payload, in a storage array;
[0261] generating sensor data with a physical characteristic sensor system, the sensor data embodying at least one of a vehicle navigation pose or location information and payload pose or location information, wherein the physical characteristic sensor system is connected to the frame and has electro-magnetic sensors, each responsive to interaction or interface of a sensor emitted or generated electro-magnetic beam or field with a physical characteristic, the electro-magnetic beam or field being disturbed by interaction or interface with the physical characteristic, and which disturbance is detected by and effects sensing by the electro-magnetic sensor of the physical characteristic; and
[0262] capturing image data with a supplemental sensor system, the image data informing the at least one of a vehicle navigation pose or location and payload pose or location supplement to the information of the physical characteristic sensor system, wherein the supplemental sensor system is connected to the frame and supplements the physical characteristic sensor system, the supplemental sensor system being, at least in part, a vision system with cameras disposed to capture the image data.
[0263] In accordance with one or more aspects of the disclosed embodiment the method further comprises determining, with a controller, from the information of the physical characteristic sensor system vehicle pose and location effecting independent guidance of the autonomous guided vehicle traversing the facility, wherein the controller is connected to the frame and operably connected to the drive section or the payload handler, and communicably connected to the physical characteristic sensor system.
[0264] In accordance with one or more aspects of the disclosed embodiment the method further comprises, with the controller, determining from the information of the physical characteristic sensor system payload pose and location effecting independent underpick and place of the payload to and from the storage location and independent underpick and place of the payload in the payload hold.
[0265] In accordance with one or more aspects of the disclosed embodiment the controller is programmed with a reference representation of predetermined features defining at least part of the facility traversed through by the autonomous guided vehicle.
[0266] In accordance with one or more aspects of the disclosed embodiment the method further comprises, with the controller, registering the captured image data and generating therefrom at least one image of one or more features of the predetermined features, the at least one image being formatted as a virtual representation of the one or more predetermined features so as to provide comparison to one or more corresponding reference of the predetermined features of the reference representation.
[0267] In accordance with one or more aspects of the disclosed embodiment the controller is configured so that the virtual representation, of the imaged one or more features of the predetermined features, is effected resident on the autonomous guided vehicle, and comparison between the virtual representation of the one or more imaged predetermined features and the one or more corresponding reference predetermined features is effected resident on the autonomous guided vehicle.
[0268] In accordance with one or more aspects of the disclosed embodiment the method further comprises, with the controller, confirming autonomous guided vehicle pose and location information registered by the controller from the physical characteristic sensor system based on the comparison between the virtual representation and the reference representation.
[0269] In accordance with one or more aspects of the disclosed embodiment the method further comprises, with the controller, identifying a variance in the autonomous guided vehicle pose and location based on the comparison between the virtual representation and the reference representation, and updating or completing autonomous guided vehicle pose or location information from the physical characteristic sensor system based on the variance.
[0270] In accordance with one or more aspects of the disclosed embodiment the controller determines a pose error in the information from the physical characteristic sensor system and fidelity of the autonomous guided vehicle pose and location information from the physical characteristic sensor system based on at least one of the identified variance and analysis of the at least one image, and assigns a confidence value according to at least one of the pose error and the fidelity.
[0271] In accordance with one or more aspects of the disclosed embodiment, with the confidence value below a predetermined threshold, the controller switches autonomous guided vehicle navigation based on pose and location information generated from the virtual representation in place of pose and location information from the physical characteristic sensor system.
[0272] In accordance with one or more aspects of the disclosed embodiment after switching, the controller is configured to:

[0273] continue autonomous guided vehicle navigation to destination or select an autonomous guided vehicle safe path and trajectory bringing the autonomous guided vehicle from a position at switching to a safe location for shut down, or
[0274] initiate communication to an operator identifying autonomous guided vehicle kinematic data and a destination for operator selection of autonomous guided vehicle control from automatic operation to quasi automatic operation or manual operation via a user interface device.
[0275] In accordance with one or more aspects of the disclosed embodiment the controller confirms payload pose and location information registered by the controller from the physical characteristic sensor system based on the comparison between the virtual representation and the reference representation.
[0276] In accordance with one or more aspects of the disclosed embodiment the controller identifies a variance in the payload pose and location based on the comparison between the virtual representation and the reference representation, and updates or completes payload pose or location information from the physical characteristic sensor system based on the variance.
[0277] In accordance with one or more aspects of the disclosed embodiment the controller determines a pose error in the information from the physical characteristic sensor system and fidelity of the payload pose and location information from the physical characteristic sensor system based on at least one of the identified variance and analysis of the at least one image, and assigns a confidence value according to at least one of the pose error and the fidelity.
[0278] In accordance with one or more aspects of the disclosed embodiment, with the confidence value below a predetermined threshold, the controller switches autonomous guided vehicle payload handling based on pose and location information generated from the virtual representation in place of pose and location information from the physical characteristic sensor system.
[0279] In accordance with one or more aspects of the disclosed embodiment after switching, the controller is configured to:
[0280] continue autonomous guided vehicle handling to destination, or
[0281] initiate communication to an operator identifying payload data along with an operator selection of autonomous guided vehicle control from automatic payload handling operation to quasi automatic payload handling operation or manual payload handling operation via a user interface device.
[0282] In accordance with one or more aspects of the disclosed embodiment the controller transmits, via a wireless communication system communicably coupling the controller and an operator interface, a simulation image combining the virtual representation of the one or more imaged predetermined features and one or more corresponding reference predetermined features of a reference presentation presenting the operator with an augmented reality image in real time.

[0283] In accordance with one or more aspects of the disclosed embodiment the controller receives real time operator commands to the traversing autonomous guided vehicle, which commands are responsive to the real time augmented reality image, and changes in the real time augmented reality image transmitted to the operator by the controller.
[0284] In accordance with one or more aspects of the disclosed embodiment the controller effects, with at least the supplemental sensor system, justification and/or sortation of case units onboard the autonomous guided vehicle.
[0285] In accordance with one or more aspects of the disclosed embodiment imaged or viewed objects described by one or more of supplemental information, supplemental vehicle navigation pose or location, and supplemental payload pose or location, from the supplemental sensor system, are coapted to a reference model of one or more of surrounding facility features and interfacing facility features so as to enhance, via the one or more of the supplemental information, the supplemental vehicle navigation pose or location, and the supplemental payload pose or location, resolution of one or more of the vehicle navigation pose or location information and the payload pose or location information.
[0286] In accordance with one or more aspects of the disclosed embodiment an autonomous guided vehicle comprises:
[0287] a frame with a payload hold;

[0288] a drive section coupled to the frame with drive wheels supporting the vehicle on a traverse surface, the drive wheels effect vehicle traverse on the traverse surface moving the vehicle over the traverse surface in a facility;
[0289] a payload handler coupled to the frame configured to transfer a payload to and from the payload hold of the vehicle and a storage location, of the payload, in a storage array;
[0290] a supplemental sensor system, connected to the frame for collaboration of the vehicle and an operator, which supplemental sensor system supplements a vehicle autonomous navigation/operation sensor system configured to at least collect sensory data embodying vehicle pose and location information for auto navigation by the vehicle of the facility,
[0291] wherein the supplemental sensor system is, at least in part, a vision system with at least one camera disposed to capture image data informing objects and/or spatial features within at least a portion of the facility viewed by the at least one camera with the vehicle in different positions in the facility; and
[0292] a controller connected to the frame and communicably coupled to the supplemental sensor system so as to register the information from the image data of the at least one camera, and the controller is configured to determine, from the information, presence of a predetermined physical characteristic of at least one object or spatial feature, and in response thereto, selectably reconfigure the vehicle from an autonomous state to a collaborative vehicle state disposed to receive operator commands for the vehicle to continue effecting vehicle operation.
[0293] In accordance with one or more aspects of the disclosed embodiment the predetermined physical characteristic is that the at least one object or spatial feature extends across at least part of the traverse surface, a vehicle traverse path across the traverse surface, or through space of the vehicle or another different vehicle traversing the traverse surface.
[0294] In accordance with one or more aspects of the disclosed embodiment the controller is programmed with a reference representation of predetermined features defining at least in part the facility traversed through by the vehicle.
[0295] In accordance with one or more aspects of the disclosed embodiment the controller is configured to register the captured image data and generate therefrom at least one image of the at least one object or spatial feature showing the predetermined physical characteristic.
[0296] In accordance with one or more aspects of the disclosed embodiment the at least one image is formatted as a virtual representation of the at least one object or spatial feature so as to provide comparison to one or more reference features of the predetermined features of the reference representation.
[0297] In accordance with one or more aspects of the disclosed embodiment the controller is configured to identify the presence of the predetermined physical characteristic of the object or spatial feature based on the comparison between the virtual representation and the reference representation, determine a dimension of the predetermined physical characteristic, and command the vehicle to stop in a predetermined trajectory based on a position of the object or spatial feature determined from the comparison.
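A minimal illustrative sketch of such a comparison, dimension estimate, and stop command, with assumed point-style representations, tolerances, and names (none of which are specified by the disclosure), may take a form such as:

```python
# Minimal sketch: compare a virtual representation of the imaged scene against the
# reference representation, flag an unexpected object, estimate its extent, and
# command a controlled stop. All values and names are assumptions for illustration.
import math

def compare_to_reference(virtual_points, reference_points, tolerance=0.05):
    """Return points present in the virtual representation but absent from the
    reference representation (candidate obstruction), using a simple distance test."""
    return [p for p in virtual_points
            if all(math.dist(p, r) > tolerance for r in reference_points)]

def evaluate_and_stop(virtual_points, reference_points, stop_cb):
    """Flag the predetermined physical characteristic, estimate its extent, and command a stop."""
    extraneous = compare_to_reference(virtual_points, reference_points)
    if not extraneous:
        return None
    xs = [p[0] for p in extraneous]
    ys = [p[1] for p in extraneous]
    dimension = (max(xs) - min(xs), max(ys) - min(ys))   # rough extent of the characteristic
    position = (sum(xs) / len(xs), sum(ys) / len(ys))    # reference point for the stop trajectory
    stop_cb(position)
    return dimension, position

reference = [(x * 0.5, 0.0) for x in range(10)]           # expected clear traverse surface
virtual = reference + [(2.0, 0.4), (2.1, 0.45)]           # imaged scene with a protrusion
print(evaluate_and_stop(virtual, reference, lambda pos: print("stop short of", pos)))
```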
[0298] In accordance with one or more aspects of the disclosed embodiment a stop position in the predetermined trajectory maintains the object or spatial reference within a field of view of the at least one camera and continued imaging of the predetermined physical characteristic, and initiates a signal to at least another vehicle of one or more of a traffic obstacle, an area to avoid, or a detour area.
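A minimal illustrative sketch of the signalling to other vehicles, with assumed message fields and an abstracted fleet communication transport (neither of which is specified by the disclosure), may take a form such as:

```python
# Minimal sketch: once stopped with the obstruction held in view, broadcast the
# hazard so other vehicles can avoid or detour around the area. Message fields and
# the transport abstraction are assumptions for illustration.
import json
import time

def build_hazard_message(vehicle_id, obstruction_pos, kind="traffic_obstacle"):
    return {
        "sender": vehicle_id,
        "kind": kind,                        # traffic_obstacle | avoid_area | detour_area
        "position": obstruction_pos,         # facility coordinates (m)
        "timestamp": time.time(),
    }

def broadcast(message, send_fn):
    """send_fn abstracts the fleet communication layer (assumed, not specified here)."""
    send_fn(json.dumps(message))

broadcast(build_hazard_message("agv-17", [24.5, 8.2]), send_fn=print)
```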
[0299] In accordance with one or more aspects of the disclosed embodiment the predetermined physical characteristic is determined by the controller by determining a position of the object within a reference frame of the at least one camera, that is calibrated and has a predetermined relationship to the vehicle, and from the object pose in the reference frame of the at least one camera determines presence of the predetermined physical characteristic of the object.
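A minimal illustrative sketch of expressing an object detected in the calibrated camera reference frame in the vehicle frame and testing it against a traverse path envelope, with assumed extrinsic calibration values and thresholds, may take a form such as:

```python
# Minimal sketch: the camera has a fixed (predetermined) calibrated pose relative to
# the vehicle, so an object position in the camera frame can be expressed in the
# vehicle frame and tested against the traverse path. Calibration numbers are assumed.
import numpy as np

# Assumed extrinsics: camera 0.3 m ahead of the vehicle origin and 0.5 m above the
# traverse surface; camera axes use the common x-right, y-down, z-forward convention.
R_VEHICLE_FROM_CAMERA = np.array([[0, 0, 1],
                                  [-1, 0, 0],
                                  [0, -1, 0]], dtype=float)
T_VEHICLE_FROM_CAMERA = np.array([0.30, 0.0, 0.50])

def object_in_vehicle_frame(p_camera):
    """Express an object position detected in the camera frame in the vehicle frame."""
    return R_VEHICLE_FROM_CAMERA @ np.asarray(p_camera, dtype=float) + T_VEHICLE_FROM_CAMERA

def intrudes_on_path(p_camera, half_width=0.6, lookahead=3.0, height_limit=0.3):
    """True if the detected object lies within the vehicle's traverse path envelope."""
    x, y, z = object_in_vehicle_frame(p_camera)
    return 0.0 < x < lookahead and abs(y) < half_width and z < height_limit

print(intrudes_on_path([0.2, 0.3, 1.5]))   # object slightly right, near the floor, 1.5 m ahead -> True
```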
[0300] In accordance with one or more aspects of the disclosed embodiment the controller is configured such that, upon identification of presence and switch from the autonomous state to the collaborative vehicle state, the controller initiates transmission communicating the image, identifying presence of the predetermined physical characteristic, to an operator interface for operator collaboration operation of the vehicle.
[0301] In accordance with one or more aspects of the disclosed embodiment the controller is configured to apply a trajectory to the autonomous guided vehicle that brings the autonomous guided vehicle to a zero velocity within a predetermined time period where motion of the autonomous guided vehicle along the trajectory is coordinated with location of the objects and/or spatial features.
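A minimal illustrative sketch of such a stopping trajectory, with assumed speeds, distances, and time budget (the predetermined time period and margin values are assumptions for illustration), may take a form such as:

```python
# Minimal sketch: choose a constant deceleration that brings the vehicle to zero
# velocity within a predetermined time budget while stopping short of the detected
# object. Numeric values are assumptions for illustration.
def stop_profile(speed, distance_to_object, time_limit, margin=0.5):
    """Return (deceleration, stop_time, stop_distance) for a constant-deceleration stop."""
    decel_for_time = speed / time_limit                       # stop no later than time_limit
    usable = max(distance_to_object - margin, 0.1)
    decel_for_distance = speed ** 2 / (2.0 * usable)          # stop before the object
    decel = max(decel_for_time, decel_for_distance)
    stop_time = speed / decel
    stop_distance = speed ** 2 / (2.0 * decel)
    return decel, stop_time, stop_distance

print(stop_profile(speed=1.5, distance_to_object=2.0, time_limit=2.0))
```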
[0302] In accordance with one or more aspects of the disclosed embodiment the capture of image data informing objects and/or spatial features is opportunistic during transfer of a payload to/from the payload hold of the vehicle or a storage location in a storage array.
[0303] In accordance with one or more aspects of the disclosed embodiment the controller is programmed to command the vehicle to the different positions in the facility associated with the vehicle effecting one or more predetermined payload autonomous transfer tasks, wherein each of the one or more predetermined payload autonomous transfer tasks is a separate and distinct task from the capture of image data viewed by the at least one camera in the different positions.
[0304] In accordance with one or more aspects of the disclosed embodiment the controller is configured so that determination of presence of the predetermined physical characteristic of the at least one object or spatial feature is coincident, at least in part, with, but supplemental and peripheral to, vehicle actions effecting each of the one or more predetermined payload auto transfer tasks.
[0305] In accordance with one or more aspects of the disclosed embodiment the controller is configured so that determination of presence of the predetermined physical characteristic of the at least one object or spatial feature is opportunistic to vehicle actions effecting each of the one or more predetermined payload auto transfer tasks.
[0306] In accordance with one or more aspects of the disclosed embodiment at least one of the one or more predetermined payload auto transfer tasks is effected at at least one of the different positions.
[0307] In accordance with one or more aspects of the disclosed embodiment the collaborative vehicle state is supplemental to the autonomous state of the vehicle effecting each of the one or more predetermined payload auto transfer tasks.
[0308] In accordance with one or more aspects of the disclosed embodiment a method comprises:
[0309] providing an autonomous guided vehicle with:
[0310] a frame with a payload hold;
[0311] a drive section coupled to the frame with drive wheels supporting the vehicle on a traverse surface, the drive wheels effect vehicle traverse on the traverse surface moving the vehicle over the traverse surface in a facility;
[0312] a payload handler coupled to the frame configured to transfer a payload to and from the payload hold of the vehicle and a storage location, of the payload, in a storage array;
[0313] generating, with a supplemental sensor system connected to the frame for collaboration of the vehicle and an operator, image data informing objects and/or spatial features within at least a portion of the facility viewed by the at least one camera with the vehicle in different positions in the facility, wherein the supplemental sensor system is, at least in part, a vision system with at least one camera disposed to capture image data and the supplemental sensor system supplements a vehicle autonomous navigation/operation sensor system configured to at least collect sensory data embodying vehicle pose and location information for auto navigation by the vehicle of the facility;
[0314] registering, with a controller connected to the frame and communicably coupled to the supplemental sensor system, the information from the image data of the at least one camera; and
[0315] determining, with the controller, from the information, presence of a predetermined physical characteristic of at least one object or spatial feature, and in response thereto, selectably reconfiguring the vehicle from an autonomous state to a collaborative vehicle state disposed to receive operator commands for the vehicle to continue effecting vehicle operation.
[0316] In accordance with one or more aspects of the disclosed embodiment the predetermined physical characteristic is that the at least one object or spatial feature extends across at least part of the traverse surface, a vehicle traverse path across the traverse surface, or through space of the vehicle or another different vehicle traversing the traverse surface.
[0317] In accordance with one or more aspects of the disclosed embodiment the controller is programmed with a reference representation of predetermined features defining at least in part the facility traversed through by the vehicle.
[0318] In accordance with one or more aspects of the disclosed embodiment the method further comprises generating, from the registered captured image data, at least one image of the at least one object or spatial feature showing the predetermined physical characteristic.
[0319] In accordance with one or more aspects of the disclosed embodiment the at least one image is formatted as a virtual representation of the at least one object or spatial feature, the method further comprising comparing the virtual representation to one or more reference features of the predetermined features of the reference representation.
[0320] In accordance with one or more aspects of the disclosed embodiment the method further comprises identifying, with the controller, the presence of the predetermined physical characteristic of the object or spatial feature based on the comparison between the virtual representation and the reference representation, determining a dimension of the predetermined physical characteristic, and commanding the vehicle to stop in a predetermined trajectory based on a position of the object or spatial features determined from the comparison.
[0321] In accordance with one or more aspects of the disclosed embodiment the method further comprises, with the vehicle in a stop position in the predetermined trajectory, maintaining the object or spatial reference within a field of view of the at least one camera and continuing imaging of the predetermined physical characteristic, and initiating a signal to at least another vehicle of one or more of a traffic obstacle, an area to avoid, or a detour area.
[0322] In accordance with one or more aspects of the disclosed embodiment the predetermined physical characteristic is determined by the controller by determining a position of the object within a reference frame of the at least one camera, that is calibrated and has a predetermined relationship to the vehicle, and from the object pose in the reference frame of the at least one camera determining presence of the predetermined physical characteristic of the object.
[0323] In accordance with one or more aspects of the disclosed embodiment the controller is configured such that, upon identification of presence of the predetermined physical characteristic of the at least one object or spatial feature and switch from the autonomous state to the collaborative vehicle state, the controller initiates transmission communicating the image, identifying presence of the predetermined physical characteristic, to an operator interface for operator collaboration operation of the vehicle.
[0324] In accordance with one or more aspects of the disclosed embodiment the method further comprises applying, with the controller, a trajectory to the autonomous guided vehicle bringing the autonomous guided vehicle to a zero velocity within a predetermined time period, where motion of the autonomous guided vehicle along the trajectory is coordinated with a location of the objects and/or spatial features.
[0325] In accordance with one or more aspects of the disclosed embodiment the capture of image data informing objects and/or spatial features is opportunistic during transfer of a payload to/from the payload hold of the vehicle or a storage location in a storage array.
[0326] In accordance with one or more aspects of the disclosed embodiment an autonomous guided vehicle comprises:
[0327] a vehicle chassis with a power supply mounted thereon and powered sections connected to the chassis and each powered by the power supply, the powered sections including:
[0328] a drive section with motors driving wheels, supporting the vehicle chassis, and disposed to traverse the autonomous guided vehicle on a traverse surface in a facility under autonomous guidance;
[0329] a payload handling section with at least one payload handling actuator configured so that actuation of the at least one payload handling actuator effects transfer of a payload to and from a payload bed, of the vehicle chassis, and a storage in the facility;
[0330] a peripheral electronics section having at least one of an autonomous pose and navigation sensor, at least one of a payload handling sensor, and at least one peripheral motor, the at least one peripheral motor being separate and distinct from each of the motors of the drive section and each actuator of the payload handling section; and
[0331] a controller communicably coupled respectively to the drive section, the payload handling section, and the peripheral electronics section so as to effect each autonomous operation of the autonomous guided vehicle, wherein the controller comprises a comprehensive power management section communicably connected to the power supply so as to monitor a charge level of the power supply, and
[0332] wherein the comprehensive power management section is connected to each respective branch circuit of the drive section, the payload handling section, and the peripheral electronics section respectively powering the drive section, the payload handling section, and the peripheral electronics section from the power supply, the comprehensive power management section being configured to manage power consumption of the branch circuits based on a demand level of each branch circuit relative to the charge level available from the power supply.
[0333] In accordance with one or more aspects of the disclosed embodiment the comprehensive power management section is configured so as to manage a demand charge level of each respective branch circuit switching each respective branch circuit on or off in a predetermined pattern based on the demand charge level of each respective branch circuit with respect to other branch circuits and the charge level available from the power supply.
[0334] In accordance with one or more aspects of the disclosed embodiment the predetermined pattern is arranged to switch off branch circuits with a decrease in the available charge level from the power supply, so as to maximize available charge level from the power supply directed to the controller.
[0335] In accordance with one or more aspects of the disclosed embodiment the predetermined pattern is arranged to switch off branch circuits with a decrease in the available charge level from the power supply so that the available charge level directed to the controller is equal to or exceeds the demand charge level of the controller for a maximum time based on the available charge level of the power supply.
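A minimal illustrative sketch of such a predetermined shedding pattern, with assumed branch demand levels and priorities (the numeric values, branch names, and shedding order are assumptions for illustration), may take a form such as:

```python
# Minimal sketch: as the available charge level falls, branch circuits are switched
# off in a predetermined priority order so the remaining charge is directed to the
# controller for as long as possible. Values and priorities are assumed.
BRANCHES = [                      # (name, demand level, priority: higher = shed first)
    ("peripheral_electronics", 40.0, 3),
    ("payload_handling",       80.0, 2),
    ("drive",                 120.0, 1),
    ("controller",             15.0, 0),   # never shed
]

def apply_shedding(available_charge):
    """Return the on/off state of each branch circuit for the given available charge level."""
    state = {}
    remaining = available_charge
    # power the most essential (lowest priority number) branches first
    for name, demand, _prio in sorted(BRANCHES, key=lambda b: b[2]):
        if remaining >= demand or name == "controller":
            state[name] = True
            remaining -= demand
        else:
            state[name] = False
    return state

print(apply_shedding(available_charge=150.0))   # all but payload handling and peripherals powered
print(apply_shedding(available_charge=30.0))    # only the controller remains powered
```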
[0336] In accordance with one or more aspects of the disclosed embodiment the controller has at least one of:
[0337] an autonomous navigation control section configured to register and hold in volatile memory autonomous guided vehicle state and pose navigation information, historic and current, that is deterministic of and describing current and predicted state, pose, and location of the autonomous guided vehicle; and
[0338] an autonomous payload handling control section configured to register and hold in volatile memory current payload identity, state, and pose information, historic and current;
[0339] wherein the controller is configured so that upon indication from the comprehensive power management section of imminent decrease in available charge level, directed from the power supply to the controller, to less than demand level of the controller, the controller configures at least one of the autonomous guided vehicle state and pose navigation information and the payload identity, state, and pose information, held in respective registry and memory of corresponding controller sections, into an initialization file available on reboot of the controller.
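A minimal illustrative sketch of snapshotting the volatile registers into an initialization file and restoring them on reboot, with an assumed file name and field layout (neither is specified by the disclosure), may take a form such as:

```python
# Minimal sketch: on an imminent-power-loss indication, snapshot the volatile
# navigation and payload registers into an initialization file that is read back on
# reboot. File name and field names are assumptions for illustration.
import json
import os

INIT_FILE = "agv_init_state.json"     # hypothetical path

def snapshot_state(nav_state, payload_state):
    """Write current/historic pose and payload registers so they survive a reboot."""
    with open(INIT_FILE, "w") as f:
        json.dump({"navigation": nav_state, "payload": payload_state}, f)

def restore_state():
    """On reboot, seed the controller sections from the initialization file if present."""
    if not os.path.exists(INIT_FILE):
        return None
    with open(INIT_FILE) as f:
        return json.load(f)

snapshot_state(
    nav_state={"pose": [12.4, 3.1, 1.57], "velocity": 0.0, "last_waypoint": "A7"},
    payload_state={"payload_id": "case-0042", "pose_in_bed": [0.1, 0.0], "state": "seated"},
)
print(restore_state())
```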
[0340] In accordance with one or more aspects of the disclosed embodiment the controller is configured so that upon indication from the comprehensive power management section of imminent decrease in available charge level, directed from the power supply to the controller, to less than demand level of the controller, the controller enters suspension of operation and hibernation.
[0341] In accordance with one or more aspects of the disclosed embodiment the controller is configured so that upon indication from the comprehensive power management section of imminent decrease in available charge level, directed from the power supply to the branch circuit of the drive section, the controller is configured to command the drive section so as to navigate the autonomous guided vehicle along a predetermined auxiliary path and auxiliary trajectory to a predetermined autonomous guided vehicle auxiliary stop location in the facility.
[0342] In accordance with one or more aspects of the disclosed embodiment the controller is configured so that upon indication from the comprehensive power management section of imminent decrease in available charge level, directed from the power supply to the branch circuit of the payload handling section, the controller is configured to command the payload handling section to move the payload handling actuator, and any payload thereon, to a predetermined safe payload position in the payload bed.
[0343] In accordance with one or more aspects of the disclosed embodiment the controller includes at least one of:
[0344] a vehicle health status monitor,
[0345] a drive section health status monitor,
[0346] a payload handling section health status monitor, and
[0347] a peripheral electronics section health status monitor; and
[0348] a health status register section; and
[0349] wherein the controller is configured so that upon indication from the comprehensive power management section of imminent decrease in available charge level, directed from the power supply to the controller, to less than demand level of the controller, the controller configures stored health status information from the at least one of the vehicle health status monitor, the drive section health status monitor, the payload handling section health status monitor, and the peripheral electronics section health status monitor in the health status register section into an initialization file available on reboot of the controller.
[0350] In accordance with one or more aspects of the disclosed embodiment the power supply is an ultra-capacitor, or the charge level is voltage level.
[0351] In accordance with one or more aspects of the disclosed embodiment a method for autonomous guided vehicle power management is provided. The method comprises:
[0352] providing an autonomous guided vehicle with a vehicle chassis with a power supply mounted thereon and powered sections connected to the chassis and each powered by the power supply, the powered sections including:
[0353] a drive section with motors driving wheels, supporting the vehicle chassis, and disposed to traverse the autonomous guided vehicle on a traverse surface in a facility under autonomous guidance;
[0354] a payload handling section with at least one payload handling actuator configured so that actuation of the at least one payload handling actuator effects transfer of a payload to and from a payload bed, of the vehicle chassis, and a storage in the facility;
[0355] a peripheral electronics section having at least one of an autonomous pose and navigation sensor, at least one of a payload handling sensor, and at least one peripheral motor, the at least one peripheral motor being separate and distinct from each of the motors of the drive section and each actuator of the payload handling section; and
[0356] effecting, with a controller communicably coupled respectively to the drive section, the payload handling section, and the peripheral electronics section, each autonomous operation of the autonomous guided vehicle; and
[0357] monitoring a charge level of the power supply with a comprehensive power management section of the controller, wherein the comprehensive power management section is connected to each respective branch circuit of the drive section, the payload handling section, and the peripheral electronics section respectively powering the drive section, the payload handling section, and the peripheral electronics section from the power supply, the comprehensive power management section manages power consumption of the branch circuits based on a demand level of each branch circuit relative to the charge level available from the power supply.
[0358] In accordance with one or more aspects of the disclosed embodiment the comprehensive power management section manages a demand charge level of each respective branch circuit switching each respective branch circuit on or off in a predetermined pattern based on the demand charge level of each respective branch circuit with respect to other branch circuits and the charge level available from the power supply.
[0359] In accordance with one or more aspects of the disclosed embodiment the predetermined pattern is arranged to switch off branch circuits with a decrease in the available charge level from the power supply, so as to maximize available charge level from the power supply directed to the controller.
[0360] In accordance with one or more aspects of the disclosed embodiment the predetermined pattern is arranged to switch off branch circuits with a decrease in the available charge level from the power supply so that the available charge level directed to the controller is equal to or exceeds the demand charge level of the controller for a maximum time based on the available charge level of the power supply.
[0361] In accordance with one or more aspects of the disclosed embodiment the method further comprises at least one of:
[0362] with an autonomous navigation control section of the controller, registering and holding in volatile memory autonomous guided vehicle state and pose navigation information, historic and current, that is deterministic of and describing current and predicted state, pose, and location of the autonomous guided vehicle; and
[0363] with an autonomous payload handling control section of the controller, registering and holding in volatile memory current payload identity, state, and pose information, historic and current;
[0364] wherein, upon indication from the comprehensive power management section of imminent decrease in available charge level, directed from the power supply to the controller, to less than demand level of the controller, the controller configures at least one of the autonomous guided vehicle state and pose navigation information and the payload identity, state, and pose information, held in respective registry and memory of corresponding controller sections, into an initialization file available on reboot of the controller.
[0365] In accordance with one or more aspects of the disclosed embodiment upon indication from the comprehensive power management section of imminent decrease in available charge level, directed from the power supply to the controller, to less than demand level of the controller, the controller enters suspension of operation and hibernation.
[0366] In accordance with one or more aspects of the disclosed embodiment upon indication from the comprehensive power management section of imminent decrease in available charge level, directed from the power supply to the branch circuit of the drive section, the controller commands the drive section so as to navigate the autonomous guided vehicle along a predetermined auxiliary path and auxiliary trajectory to a predetermined autonomous guided vehicle auxiliary stop location in the facility.
[0367] In accordance with one or more aspects of the disclosed embodiment upon indication from the comprehensive power management section of imminent decrease in available charge level, directed from the power supply to the branch circuit of the payload handling section, the controller commands the payload handling section to move the payload handling actuator, and any payload thereon, to a predetermined safe payload position in the payload bed.
[0368] In accordance with one or more aspects of the disclosed embodiment the method further comprises:
[0369] providing the controller with at least one of
[0370] a vehicle health status monitor,
[0371] a drive section health status monitor,
[0372] a payload handling section health status monitor, and
[0373] a peripheral electronics section health status monitor; and
[0374] a health status register section; and
[0375] wherein, upon indication from the comprehensive power management section of imminent decrease in available charge level, directed from the power supply to the controller, to less than demand level of the controller, the controller configures stored health status information from the at least one of the vehicle health status monitor, the drive section health status monitor, the payload handling section health status monitor, and the peripheral electronics section health status monitor in the health status register section into an initialization file available on reboot of the controller.
[0376] In accordance with one or more aspects of the disclosed embodiment the power supply is an ultra-capacitor, or the charge level is voltage level.
[0377] It should be understood that the foregoing description is only illustrative of the aspects of the disclosed embodiment. Various alternatives and modifications can be devised by those skilled in the art without departing from the aspects of the disclosed embodiment. Accordingly, the aspects of the disclosed embodiment are intended to embrace all such alternatives, modifications and variances that fall within the scope of any claims appended hereto. Further, the mere fact that different features are recited in mutually different dependent or independent claims does not indicate that a combination of these features cannot be advantageously used, such a combination remaining within the scope of the aspects of the disclosed embodiment.
[0378] What is claimed is:


CLAIMS
1. An autonomous guided vehicle comprising: a frame with a payload hold; a drive section coupled to the frame with drive wheels supporting the autonomous guided vehicle on a traverse surface, the drive wheels effect vehicle traverse on the traverse surface moving the autonomous guided vehicle over the traverse surface in a facility; a payload handler coupled to the frame configured to transfer a payload, with a flat undeterministic seating surface seated in the payload hold, to and from the payload hold of the autonomous guided vehicle and a storage location, of the payload, in a storage array; a physical characteristic sensor system connected to the frame having electro-magnetic sensors, each responsive to interaction or interface of a sensor emitted or generated electro-magnetic beam or field with a physical characteristic, the electro-magnetic beam or field being disturbed by interaction or interface with the physical characteristic, and which disturbance is detected by and effects sensing by the electro-magnetic sensor of the physical characteristic, wherein the physical characteristic sensor system is configured to generate sensor data embodying at least one of a vehicle navigation pose or location information and payload pose or location information; and a supplemental sensor system, connected to the frame, that supplements the physical characteristic sensor system, the supplemental sensor system being, at least in part, a vision system with cameras disposed to capture image data informing the at least one of a vehicle navigation pose or location and payload pose or location supplement to the information of the physical characteristic sensor system.
2. The autonomous guided vehicle of claim 1, further comprising a controller connected to the frame, operably connected to the drive section or the payload handler, and communicably connected to the physical characteristic sensor system, wherein the controller is configured to determine from the information of the physical characteristic sensor system vehicle pose and location effecting independent guidance of the autonomous guided vehicle traversing the facility.
3. The autonomous guided vehicle of claim 2, wherein the controller is configured to determine from the information of the physical characteristic sensor system payload pose and location effecting independent underpick and place of the payload to and from the storage location and independent underpick and place of the payload in the payload hold.
4. The autonomous guided vehicle of claim 2, wherein the controller is programmed with a reference representation of predetermined features defining at least part of the facility traversed through by the autonomous guided vehicle.
5. The autonomous guided vehicle of claim 4, wherein the controller is configured to register the captured image data and generate therefrom at least one image of one or more features of the predetermined features, the at least one image being formatted as a virtual representation of the one or more predetermined features so as to provide comparison to one or more corresponding reference of the predetermined features of the reference representation.
6. The autonomous guided vehicle of claim 5, wherein the controller is configured so that the virtual representation, of the imaged one or more features of the predetermined features, is effected resident on the autonomous guided vehicle, and comparison between the virtual representation of the one or more imaged predetermined features and the one or more corresponding reference predetermined features is effected resident on the autonomous guided vehicle.
7. The autonomous guided vehicle of claim 5, wherein the controller is configured to confirm autonomous guided vehicle pose and location information registered by the controller from the physical characteristic sensor system based on the comparison between the virtual representation and the reference representation.
8. The autonomous guided vehicle of claim 7, wherein the controller is configured to identify a variance in the autonomous guided vehicle pose and location based on the comparison between the virtual representation and the reference representation, and update or complete autonomous guided vehicle pose or location information from the physical characteristic sensor system based on the variance.
9. The autonomous guided vehicle of claim 8, wherein the controller is configured to determine a pose error in the information from the physical characteristic sensor system and fidelity of the autonomous guided vehicle pose and location information from the physical characteristic sensor system based on at least one of the identified variance and analysis of the at least one image, and assign a confidence value according to at least one of the pose error and the fidelity.
10. The autonomous guided vehicle of claim 9, wherein the controller is configured so that with the confidence value below a predetermined threshold, the controller switches autonomous guided vehicle navigation based on pose and location information generated from the virtual representation in place of pose and location information from the physical characteristic sensor system.
11. The autonomous guided vehicle of claim 10, wherein after switching, the controller is configured to: continue autonomous guided vehicle navigation to destination, or select an autonomous guided vehicle safe path and trajectory bringing the autonomous guided vehicle from a position at switching to a safe location for shut down, or initiate communication to an operator identifying autonomous guided vehicle kinematic data and a destination for operator selection of autonomous guided vehicle control from automatic operation to quasi automatic operation or manual operation via a user interface device.
12. The autonomous guided vehicle of claim 5, wherein the controller is configured to confirm payload pose and location information registered by the controller from the physical characteristic sensor system based on the comparison between the virtual representation and the reference representation.
13. The autonomous guided vehicle of claim 12, wherein the controller is configured to identify a variance in the payload pose and location based on the comparison between the virtual representation and the reference representation, and update or complete payload pose or location information from the physical characteristic sensor system based on the variance.
14. The autonomous guided vehicle of claim 13, wherein the controller is configured to determine a pose error in the information from the physical characteristic sensor system and fidelity of the payload pose and location information from the physical characteristic sensor system based on at least one of the identified variance and analysis of the at least one image, and assign a confidence value according to at least one of the pose error and the fidelity.
15. The autonomous guided vehicle of claim 14, wherein the controller is configured so that with the confidence value below a predetermined threshold, the controller switches autonomous guided vehicle payload handling based on pose and location information generated from the virtual representation in place of pose and location information from the physical characteristic sensor system.
16. The autonomous guided vehicle of claim 15, wherein after switching, the controller is configured to: continue autonomous guided vehicle handling to destination, or initiate communication to an operator identifying payload data along with an operator selection of autonomous guided vehicle control from automatic payload handling operation to quasi automatic payload handling operation or manual payload handling operation via a user interface device.
17. The autonomous guided vehicle of claim 2, wherein the controller is configured to transmit, via a wireless communication system communicably coupling the controller and an operator interface, a simulation image combining the virtual representation of the one or more imaged predetermined features and one or more corresponding reference predetermined features of a reference presentation presenting the operator with an augmented reality image in real time.
18. The autonomous guided vehicle of claim 17, wherein the controller is configured to receive real time operator commands to the traversing autonomous guided vehicle, which commands are responsive to the real time augmented reality image, and changes in the real time augmented reality image transmitted to the operator by the controller.
19. The autonomous guided vehicle of claim 1, wherein the supplemental sensor system at least in part effects on-the-fly justification and/or sortation of case units onboard the autonomous guided vehicle.
20. The autonomous guided vehicle of claim 1, wherein imaged or viewed objects described by one or more of supplemental information, supplemental vehicle navigation pose or location, and supplemental payload pose or location, from the supplemental sensor system, are coapted to a reference model of one or more of surrounding facility features and interfacing facility features so as to enhance, via the one or more of the supplemental information, the supplemental vehicle navigation pose or location, and the supplemental payload pose or location resolution of one or more of the vehicle navigation pose or location information and the payload pose or location information.
21. An autonomous guided vehicle comprising: a frame with a payload hold; a drive section coupled to the frame with drive wheels supporting the vehicle on a traverse surface, the drive wheels effect vehicle traverse on the traverse surface moving the autonomous guided vehicle over the traverse surface in a facility; a payload handler coupled to the frame configured to transfer a payload, with a flat undeterministic seating surface seated in the payload hold, to and from the payload hold of the autonomous guided vehicle and a storage location, of the payload, in a storage array; a physical characteristic sensor system connected to the frame having electro-magnetic sensors, each responsive to interaction or interface of a sensor emitted or generated electro-magnetic beam or field with a physical characteristic, the electro-magnetic beam or field being disturbed by interaction or interface with the physical characteristic, and which disturbance is detected by and effects sensing by the electro-magnetic sensor of the physical characteristic, wherein the physical characteristic sensor system is configured to generate sensor data embodying at least one of a vehicle navigation pose or location information and payload pose or location information; and an auxiliary sensor system, connected to the frame, that is separate and distinct from the physical characteristic sensor system, the auxiliary sensor system being, at least in part, a vision system with cameras disposed to capture image data informing the at least one of a vehicle navigation pose or location and payload pose or location which image data is auxiliary information to the information of the physical characteristic sensor system.
22. The autonomous guided vehicle of claim 21, further comprising a controller connected to the frame, operably connected to the drive section or the payload handler, and communicably connected to the physical characteristic sensor system, wherein the controller is configured to determine from the information of the physical characteristic sensor system vehicle pose and location effecting independent guidance of the autonomous guided vehicle traversing the facility.
23. The autonomous guided vehicle of claim 22, wherein the controller is configured to determine from the information of the physical characteristic sensor system payload pose and location effecting independent underpick and place of the payload to and from the storage location and independent underpick and place of the payload in the payload hold.
24. The autonomous guided vehicle of claim 22, wherein the controller is programmed with a reference representation of predetermined features defining at least part of the facility traversed through by the autonomous guided vehicle.
25. The autonomous guided vehicle of claim 24, wherein the controller is configured to register the captured image data and generate therefrom at least one image of one or more features of the predetermined features, the at least one image being formatted as a virtual representation of the one or more predetermined features so as to provide comparison to one or more corresponding reference of the predetermined features of the reference representation.
26. The autonomous guided vehicle of claim 25, wherein the controller is configured so that the virtual representation, of the imaged one or more features of the predetermined features, is effected resident on the autonomous guided vehicle, and comparison between the virtual representation of the one or more imaged predetermined features and the one or more corresponding reference predetermined features is effected resident on the autonomous guided vehicle.
27. The autonomous guided vehicle of claim 25, wherein the controller is configured to confirm autonomous guided vehicle pose and location information registered by the controller from the physical characteristic sensor system based on the comparison between the virtual representation and the reference representation.
28. The autonomous guided vehicle of claim 27, wherein the controller is configured to identify a variance in the autonomous guided vehicle pose and location based on the comparison between the virtual representation and the reference representation, and update or complete autonomous guided vehicle pose or location information from the physical characteristic sensor system based on the variance.
29. The autonomous guided vehicle of claim 28, wherein the controller is configured to determine a pose error in the information from the physical characteristic sensor system and fidelity of the autonomous guided vehicle pose and location information from the physical characteristic sensor system based on at least one of the identified variance and analysis of the at least one image, and assign a confidence value according to at least one of the pose error and the fidelity.
30. The autonomous guided vehicle of claim 29, wherein the controller is configured so that with the confidence value below a predetermined threshold, the controller switches autonomous guided vehicle navigation based on pose and location information generated from the virtual representation in place of pose and location information from the physical characteristic sensor system.
31. The autonomous guided vehicle of claim 30, wherein after switching, the controller is configured to: continue autonomous guided vehicle navigation to destination or select an autonomous guided vehicle safe path and trajectory bringing the autonomous guided vehicle from a position at switching to a safe location for shut down, or initiate communication to an operator identifying autonomous guided vehicle kinematic data and a destination for operator selection of autonomous guided vehicle control from automatic operation to quasi automatic operation or manual operation via a user interface device.
32. The autonomous guided vehicle of claim 25, wherein the controller is configured to confirm payload pose and location information registered by the controller from the physical characteristic sensor system based on the comparison between the virtual representation and the reference representation.
33. The autonomous guided vehicle of claim 32, wherein the controller is configured to identify a variance in the payload pose and location based on the comparison between the virtual representation and the reference representation, and update or complete payload pose or location information from the physical characteristic sensor system based on the variance.
34. The autonomous guided vehicle of claim 33, wherein the controller is configured to determine a pose error in the information from the physical characteristic sensor system and fidelity of the payload pose and location information from the physical characteristic sensor system based on at least one of the identified variance and analysis of the at least one image, and assign a confidence value according to at least one of the pose error and the fidelity.
35. The autonomous guided vehicle of claim 34, wherein the controller is configured so that with the confidence value below a predetermined threshold, the controller switches autonomous guided vehicle payload handling based on pose and location information generated from the virtual representation in place of pose and location information from the physical characteristic sensor system.
36. The autonomous guided vehicle of claim 35, wherein after switching, the controller is configured to: continue autonomous guided vehicle handling to destination, or initiate communication to an operator identifying payload data along with an operator selection of autonomous guided vehicle control from automatic payload handling operation to quasi automatic payload handling operation or manual payload handling operation via a user interface device.
37. The autonomous guided vehicle of claim 22, wherein the controller is configured to transmit, via a wireless communication system communicably coupling the controller and an operator interface, a simulation image combining the virtual representation of the one or more imaged predetermined features and one or more corresponding reference predetermined features of a reference presentation presenting the operator with an augmented reality image in real time.
38. The autonomous guided vehicle of claim 37, wherein the controller is configured to receive real time operator commands to the traversing autonomous guided vehicle, which commands are responsive to the real time augmented reality image, and changes in the real time augmented reality image transmitted to the operator by the controller.
39. The autonomous guided vehicle of claim 21, wherein the auxiliary sensor system at least in part effects on-the-fly justification and/or sortation of case units onboard the autonomous guided vehicle.
40. The autonomous guided vehicle of claim 21, wherein imaged or viewed objects described by one or more of supplemental information, supplemental vehicle navigation pose or location, and supplemental payload pose or location, from the auxiliary sensor system, are coapted to a reference model of one or more of surrounding facility features and interfacing facility features so as to enhance, via the one or more of the supplemental information, the supplemental vehicle navigation pose or location, and the supplemental payload pose or location resolution of one or more of the vehicle navigation pose or location information and the payload pose or location information.
41. A method comprising: providing an autonomous guided vehicle with: a frame with a payload hold, a drive section coupled to the frame with drive wheels supporting the autonomous guided vehicle on a traverse surface, the drive wheels effect vehicle traverse on the traverse surface moving the autonomous guided vehicle over the traverse surface in a facility, and a payload handler coupled to the frame configured to transfer a payload, with a flat undeterministic seating surface seated in the payload hold, to and from the payload hold of the autonomous guided vehicle and a storage location, of the payload, in a storage array; generating sensor data with a physical characteristic sensor system, the sensor data embodying at least one of a vehicle navigation pose or location information and payload pose or location information, wherein the physical characteristic sensor system is connected to the frame and has electro-magnetic sensors, each responsive to interaction or interface of a sensor emitted or generated electro-magnetic beam or field with a physical characteristic, the electro-magnetic beam or field being disturbed by interaction or interface with the physical characteristic, and which disturbance is detected by and effects sensing by the electro-magnetic sensor of the physical characteristic; and capturing image data with a supplemental sensor system, the image data informing the at least one of a vehicle navigation pose or location and payload pose or location supplement to the information of the physical characteristic sensor system, wherein the supplemental sensor system is connected to the frame and supplements the physical characteristic sensor system, the supplemental sensor system being, at least in part, a vision system with cameras disposed to capture the image data.
42. The method of claim 41, further comprising determining, with a controller, from the information of the physical characteristic sensor system vehicle pose and location effecting independent guidance of the autonomous guided vehicle traversing the facility, wherein the controller is connected to the frame and operably connected to the drive section or the payload handler, and communicably connected to the physical characteristic sensor system.
43. The method of claim 42, further comprising, with the controller, determining from the information of the physical characteristic sensor system payload pose and location effecting independent underpick and place of the payload to and from the storage location and independent underpick and place of the payload in the payload hold.
44. The method of claim 42, wherein the controller is programmed with a reference representation of predetermined features defining at least part of the facility traversed through by the autonomous guided vehicle.
45. The method of claim 44, further comprising, with the controller, registering the captured image data and generating therefrom at least one image of one or more features of the predetermined features, the at least one image being formatted as a virtual representation of the one or more predetermined features so as to provide comparison to one or more corresponding reference of the predetermined features of the reference representation.
46. The method of claim 45, wherein the controller is configured so that the virtual representation, of the imaged one or more features of the predetermined features, is effected resident on the autonomous guided vehicle, and comparison between the virtual representation of the one or more imaged predetermined features and the one or more corresponding reference predetermined features is effected resident on the autonomous guided vehicle.
47. The method of claim 45, further comprising, with the controller, confirming autonomous guided vehicle pose and location information registered by the controller from the physical characteristic sensor system based on the comparison between the virtual representation and the reference representation.
48. The method of claim 47, further comprising, with the controller, identifying a variance in the autonomous guided vehicle pose and location based on the comparison between the virtual representation and the reference representation, and updating or completing autonomous guided vehicle pose or location information from the physical characteristic sensor system based on the variance.
49. The method of claim 48, wherein the controller determines a pose error in the information from the physical characteristic sensor system and fidelity of the autonomous guided vehicle pose and location information from the physical characteristic sensor system based on at least one of the identified variance and analysis of the at least one image, and assigns a confidence value according to at least one of the pose error and the fidelity.
50. The method of claim 49, wherein, with the confidence value below a predetermined threshold, the controller switches autonomous guided vehicle navigation based on pose and location information generated from the virtual representation in place of pose and location information from the physical characteristic sensor system.
51. The method of claim 50, wherein after switching, the controller is configured to: continue autonomous guided vehicle navigation to destination or select an autonomous guided vehicle safe path and trajectory bringing the autonomous guided vehicle from a position at switching to a safe location for shut down, or initiate communication to an operator identifying autonomous guided vehicle kinematic data and a destination for operator selection of autonomous guided vehicle control from automatic operation to quasi automatic operation or manual operation via a user interface device.
52. The method of claim 45, wherein the controller confirms payload pose and location information registered by the controller from the physical characteristic sensor system based on the comparison between the virtual representation and the reference representation.
53. The method of claim 52, wherein the controller identifies a variance in the payload pose and location based on the comparison between the virtual representation and the reference representation, and updates or completes payload pose or location information from the physical characteristic sensor system based on the variance.
54. The method of claim 53, wherein the controller determines a pose error in the information from the physical characteristic sensor system and fidelity of the payload pose and location information from the physical characteristic sensor system based on at least one of the identified variance and analysis of the at least one image, and assigns a confidence value according to at least one of the pose error and the fidelity.
55. The method of claim 54, wherein, with the confidence value below a predetermined threshold, the controller switches autonomous guided vehicle payload handling based on pose and location information generated from the virtual representation in place of pose and location information from the physical characteristic sensor system.
56. The method of claim 55, wherein after switching, the controller is configured to: continue autonomous guided vehicle handling to destination, or initiate communication to an operator identifying payload data along with an operator selection of autonomous guided vehicle control from automatic payload handling operation to quasi automatic payload handling operation or manual payload handling operation via a user interface device.
57. The method of claim 42, wherein the controller transmits, via a wireless communication system communicably coupling the controller and an operator interface, a simulation image combining the virtual representation of the one or more imaged predetermined features and one or more corresponding reference predetermined features of a reference presentation presenting the operator with an augmented reality image in real time.
58. The method of claim 57, wherein the controller receives real time operator commands to the traversing autonomous guided vehicle, which commands are responsive to the real time augmented reality image, and changes in the real time augmented reality image transmitted to the operator by the controller.
59. The method of claim 41, wherein the supplemental sensor system at least in part effects on-the-fly justification and/or sortation of case units onboard the autonomous guided vehicle.
60. The method of claim 41, wherein imaged or viewed objects described by one or more of supplemental information, supplemental vehicle navigation pose or location, and supplemental payload pose or location, from the supplemental sensor system, are coapted to a reference model of one or more of surrounding facility features and interfacing facility features so as to enhance, via the one or more of the supplemental information, the supplemental vehicle navigation pose or location, and the supplemental payload pose or location resolution of one or more of the vehicle navigation pose or location information and the payload pose or location information.
61. An autonomous guided vehicle comprising: a frame with a payload hold; a drive section coupled to the frame with drive wheels supporting the vehicle on a traverse surface, the drive wheels effect vehicle traverse on the traverse surface moving the vehicle over the traverse surface in a facility; a payload handler coupled to the frame configured to transfer a payload to and from the payload hold of the vehicle and a storage location, of the payload, in a storage array; a supplemental sensor system, connected to the frame for collaboration of the vehicle and an operator, supplemental sensor system supplements a vehicle autonomous navigation/operation sensor system configured to at least collect sensory data embodying vehicle pose and location information for auto navigation by the vehicle of the facility, wherein the supplemental sensor system is, at least in part, a vision system with at least one camera disposed to capture image data informing objects and/or spatial features within at least a portion of the facility viewed by the at least one camera with the vehicle in different positions in the facility; and a controller connected to the frame and communicably coupled to the supplemental sensor system so as to register the information from the image data of the at least one camera, and the controller is configured to determine, from the information, presence of a predetermined physical characteristic of at least one object or spatial feature, and in response thereto, selectably reconfigure the vehicle from an autonomous state to a collaborative vehicle state disposed to receive operator commands for the vehicle to continue effecting vehicle operation.
62. The autonomous guided vehicle of claim 61, wherein the predetermined physical characteristic is that the at least one object or spatial feature extends across at least part of the traverse surface, a vehicle traverse path across the traverse surface, or through space of the vehicle or another different vehicle traversing the traverse surface.
63. The autonomous guided vehicle of claim 61, wherein the controller is programmed with a reference representation of predetermined features defining at least in part the facility traversed through by the vehicle.
64. The autonomous guided vehicle of claim 61, wherein the controller is configured to register the captured image data and generate therefrom at least one image of the at least one object or spatial feature showing the predetermined physical characteristic.
65. The autonomous guided vehicle of claim 64, wherein the at least one image is formatted as a virtual representation of the at least one object or spatial feature so as to provide comparison to one or more reference features of the predetermined features of the reference representation.
66. The autonomous guided vehicle of claim 65, wherein the controller is configured to identify the presence of the predetermined physical characteristic of the object or spatial feature based on the comparison between the virtual representation and the reference representation, determine a dimension of the predetermined physical characteristic and command the vehicle to stop in a predetermined trajectory based on a position of the object or spatial features determined from the comparison.
67. The autonomous guided vehicle of claim 66, wherein a stop position in the predetermined trajectory maintains the object or spatial reference within a field of view of the at least one camera and continued imaging of the predetermined physical characteristic, and initiates a signal to at least another vehicle of one or more of a traffic obstacle, an area to avoid, or a detour area.
68. The autonomous guided vehicle of claim 61, wherein the predetermined physical characteristic is determined by the controller by determining a position of the object within a reference frame of the at least one camera, that is calibrated and has a predetermined relationship to the vehicle, and from the object pose in the reference frame of the at least one camera determining presence of the predetermined physical characteristic of the object.
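Claim 68 resolves the object pose in the calibrated camera reference frame, which has a predetermined relationship to the vehicle. One conventional way to express such a relationship is a homogeneous transform; the sketch below assumes a fixed 4×4 camera-to-vehicle extrinsic matrix obtained from calibration, with placeholder values chosen purely for illustration.

```python
import numpy as np

# Hypothetical extrinsic calibration: pose of the camera in the vehicle frame
# (rotation and translation packed into a 4x4 homogeneous transform).
T_VEHICLE_FROM_CAMERA = np.array([
    [0.0,  0.0, 1.0, 0.45],   # camera z-axis points along the vehicle x-axis
    [-1.0, 0.0, 0.0, 0.10],
    [0.0, -1.0, 0.0, 0.35],
    [0.0,  0.0, 0.0, 1.0],
])

def object_in_vehicle_frame(p_camera_m: np.ndarray) -> np.ndarray:
    """Map an object position measured in the camera frame (metres)
    into the vehicle frame using the calibrated transform."""
    p_h = np.append(p_camera_m, 1.0)          # homogeneous coordinates
    return (T_VEHICLE_FROM_CAMERA @ p_h)[:3]

# Example: a point 2 m in front of the camera lands in the vehicle frame at:
print(object_in_vehicle_frame(np.array([0.0, 0.0, 2.0])))  # -> [2.45, 0.10, 0.35]
```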
69. The autonomous guided vehicle of claim 68, wherein the controller is configured such that identification of the presence of the predetermined physical characteristic and switch from the autonomous state to the collaborative vehicle state initiates a transmission communicating an image, and identification of the presence of the predetermined physical characteristic, to an operator interface for operator collaborative operation of the vehicle.
70. The autonomous guided vehicle of claim 61, wherein the controller is configured to apply a trajectory to the autonomous guided vehicle that brings the autonomous guided vehicle to a zero velocity within a predetermined time period where motion of the autonomous guided vehicle along the trajectory is coordinated with location of the objects and/or spatial features.
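For the trajectory of claim 70, a constant-deceleration profile is one simple way to reason about reaching zero velocity within a predetermined time while coordinating with the feature location: stopping from speed v within time T requires deceleration a = v/T and covers distance d = vT/2, which should remain short of the feature. The sketch below assumes exactly that profile; the names, margin, and numbers are illustrative.

```python
def stopping_profile(v_mps: float, t_stop_s: float) -> tuple[float, float]:
    """Constant-deceleration stop: returns (required deceleration in m/s^2,
    distance travelled in m) to reach zero velocity within t_stop_s."""
    a = v_mps / t_stop_s
    d = 0.5 * v_mps * t_stop_s
    return a, d

def trajectory_clears_feature(v_mps: float, t_stop_s: float,
                              distance_to_feature_m: float,
                              margin_m: float = 0.5) -> bool:
    """True if the stop completes before reaching the feature, with margin."""
    _, d = stopping_profile(v_mps, t_stop_s)
    return d + margin_m <= distance_to_feature_m

# Example: 2 m/s with 1.5 s allowed -> ~1.33 m/s^2 over 1.5 m; clears a feature 3 m away.
print(stopping_profile(2.0, 1.5), trajectory_clears_feature(2.0, 1.5, 3.0))
```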
71. The autonomous guided vehicle of claim 61, wherein the capture of image data informing objects and/or spatial features is opportunistic during transfer of a payload to/from the payload hold of the vehicle or a storage location in a storage array.
72. The autonomous guided vehicle of claim 61, wherein the controller is programmed to command the vehicle to the different positions in the facility associated with the vehicle effecting one or more predetermined payload autonomous transfer tasks, wherein each of the one or more predetermined payload autonomous transfer tasks is a separate and distinct task from the capture image data viewed by the at least one camera in the different positions.
73. The autonomous guided vehicle of claim 72, wherein the controller is configured so that determination of presence of the predetermined physical characteristic of the at least one object or spatial feature is coincident at least in part with, but supplemental and peripheral to, vehicle actions effecting each of the one or more predetermined payload auto transfer tasks.
74. The autonomous guided vehicle of claim 72, wherein the controller is configured so that determination of presence of the predetermined physical characteristic of the at least one object or spatial feature is opportunistic to vehicle actions effecting each of the one or more predetermined payload auto transfer tasks.
75. The autonomous guided vehicle of claim 74, wherein at least one of the one or more predetermined payload auto transfer tasks is effected at at least one of the different positions.
76. The autonomous guided vehicle of claim 61, wherein the collaborative vehicle state is supplemental to the autonomous state of the vehicle effecting each of the one or more predetermined payload auto transfer tasks.
77. A method comprising: providing an autonomous guided vehicle with: a frame with a payload hold; a drive section coupled to the frame with drive wheels supporting the vehicle on a traverse surface, the drive wheels effect vehicle traverse on the traverse surface moving the vehicle over the traverse surface in a facility; a payload handler coupled to the frame configured to transfer a payload to and from the payload hold of the vehicle and a storage location, of the payload, in a storage array; generating, with a supplemental sensor system connected to the frame for collaboration of the vehicle and an operator, image data informing objects and/or spatial features within at least a portion of the facility viewed by the at least one camera with the vehicle in different positions in the facility, wherein the supplemental sensor system is, at least in part, a vision system with at least one camera disposed to capture image data and the supplemental sensor system supplements a vehicle autonomous navigation/operation sensor system configured to at least collect sensory data embodying vehicle pose and location information for auto navigation by the vehicle of the facility; registering, with a controller connected to the frame and communicably coupled to the supplemental sensor system, the information from the image data of the at least one camera; and determining, with the controller, from the information, presence of a predetermined physical characteristic of at least one object or spatial feature, and in response thereto, selectably reconfiguring the vehicle from an autonomous state to a collaborative vehicle state disposed to receive operator commands for the vehicle to continue effecting vehicle operation.
78. The method of claim 77, wherein the predetermined physical characteristic is that the at least one object or spatial feature extends across at least part of, the traverse surface, a vehicle traverse path across the traverse surface or through space of the vehicle or another different vehicle traversing the traverse surface.
79. The method of claim 77, wherein the controller is programmed with a reference representation of predetermined features defining at least in part the facility traversed through by the vehicle.
80. The method of claim 77, further comprising generating, from the registered captured image data, at least one image of the at least one object or spatial feature showing the predetermined physical characteristic.
81. The method of claim 77, wherein the at least one image is formatted as a virtual representation of the at least one object or spatial feature, the method further comprising comparing the virtual representation to one or more reference features of the predetermined features of the reference representation.
82. The method of claim 81, further comprising identifying, with the controller, the presence of the predetermined physical characteristic of the object or spatial feature based on the comparison between the virtual representation and the reference representation, determining a dimension of the predetermined physical characteristic, and commanding the vehicle to stop in a predetermined trajectory based on a position of the object or spatial features determined from the comparison.
83. The method of claim 82, further comprising, with the vehicle in a stop position in the predetermined trajectory, maintaining the object or spatial reference within a field of view of the at least one camera and continuing imaging of the predetermined physical characteristic, and initiating a signal to at least another vehicle of one or more of a traffic obstacle, an area to avoid, or a detour area.
84. The method of claim 77, wherein the predetermined physical characteristic is determined by the controller by determining a position of the object within a reference frame of the at least one camera, that is calibrated and has a predetermined relationship to the vehicle, and from the object pose in the reference frame of the at least one camera determining presence of the predetermined physical characteristic of the object.
85. The method of claim 84, wherein the controller is configured such that identification of presence of the predetermined physical characteristic of the at least one object or spatial feature and switch from the autonomous state to the collaborative vehicle state initiates a transmission communicating an image, and identification of presence of the predetermined physical characteristic, to an operator interface for operator collaborative operation of the vehicle.
86. The method of claim 77, further comprising applying, with the controller, a trajectory to the autonomous guided vehicle bringing the autonomous guided vehicle to a zero velocity within a predetermined time period, where motion of the autonomous guided vehicle along the trajectory is coordinated with a location of the objects and/or spatial features.
87. The method of claim 77, wherein the capture of image data informing objects and/or spatial features is opportunistic during transfer of a payload to/from the payload hold of the vehicle or a storage location in a storage array.
88. An autonomous guided vehicle comprising: a vehicle chassis with a power supply mounted thereon and powered sections connected to the chassis and each powered by the power supply, the powered sections including: a drive section with motors driving wheels, supporting the vehicle chassis, and disposed to traverse the autonomous guided vehicle on a traverse surface in a facility under autonomous guidance; a payload handling section with at least one payload handling actuator configured so that actuation of the at least one payload handling actuator effects transfer of a payload to and from a payload bed, of the vehicle chassis, and a storage in the facility; a peripheral electronics section having at least one of an autonomous pose and navigation sensor, at least one of a payload handling sensor, and at least one peripheral motor, the at least one peripheral motor being separate and distinct from each of the motors of the drive section and each actuator of the payload handling section; and a controller communicably coupled respectively to the drive section, the payload handling section, and the peripheral electronics section so as to effect each autonomous operation of the autonomous guided vehicle, wherein the controller comprises a comprehensive power management section communicably connected to the power supply so as to monitor a charge level of the power supply, and wherein the comprehensive power management section is connected to each respective branch circuit of the drive section, the payload handling section, and the peripheral electronics section respectively powering the drive section, the payload handling section, and the peripheral electronics section from the power supply, the comprehensive power management section being configured to manage power consumption of the branch circuits based on a demand level of each branch circuit relative to the charge level available from the power supply.
89. The autonomous guided vehicle of claim 88, wherein the comprehensive power management section is configured so as to manage a demand charge level of each respective branch circuit by switching each respective branch circuit on or off in a predetermined pattern based on the demand charge level of each respective branch circuit with respect to other branch circuits and the charge level available from the power supply.
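As a rough illustration of the branch-circuit management in claims 88 and 89 (not the claimed implementation), a power manager might shed branch circuits in a fixed priority order whenever aggregate demand exceeds the charge available from the power supply, so that the controller's branch is preserved longest. The circuit names, priorities, and wattages below are assumptions.

```python
from dataclasses import dataclass

@dataclass
class BranchCircuit:
    name: str
    demand_w: float
    priority: int      # higher value = shed sooner (controller sheds last)
    enabled: bool = True

def manage_power(circuits: list[BranchCircuit], available_w: float) -> list[str]:
    """Switch branch circuits off, highest shed-priority first, until the
    total demand of the enabled circuits fits the available supply.
    Returns the names of the circuits left enabled."""
    for c in sorted(circuits, key=lambda c: c.priority, reverse=True):
        if sum(x.demand_w for x in circuits if x.enabled) <= available_w:
            break
        c.enabled = False
    return [c.name for c in circuits if c.enabled]

circuits = [
    BranchCircuit("controller", demand_w=60, priority=0),
    BranchCircuit("drive", demand_w=400, priority=2),
    BranchCircuit("payload_handling", demand_w=250, priority=2),
    BranchCircuit("peripheral_electronics", demand_w=120, priority=1),
]
# With only 200 W available, higher-priority loads are shed so the controller
# (and whatever else still fits) keeps running as long as possible.
print(manage_power(circuits, available_w=200))  # -> ['controller', 'peripheral_electronics']
```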
90. The autonomous guided vehicle of claim 89, wherein the predetermined pattern is arranged to switch off branch circuits with a decrease in the available charge level from the power supply, so as to maximize available charge level from the power supply directed to the controller.
91. The autonomous guided vehicle of claim 89, wherein the predetermined pattern is arranged to switch off branch circuits with a decrease in the available charge level from the power supply so that the available charge level directed to the controller is equal to or exceeds the demand charge level of the controller for a maximum time based on the available charge level of the power supply.
92. The autonomous guided vehicle of claim 88, wherein the controller has at least one of: an autonomous navigation control section configured to register and hold in volatile memory autonomous guided vehicle state and pose navigation information, historic and current, that is deterministic of and describing current and predicted state, pose, and location of the autonomous guided vehicle; and an autonomous payload handling control section configured to register and hold in volatile memory current payload identity, state, and pose information, historic and current; wherein the controller is configured so that upon indication from the comprehensive power management section of imminent decrease in available charge level, directed from the power supply to the controller, to less than demand level of the controller, the controller configures at least one of the autonomous guided vehicle state and pose navigation information and the payload identity, state, and pose information, held in respective registry and memory of corresponding controller sections, into an initialization file available on reboot of the controller.
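One way to picture the snapshot behavior of claim 92 is a small serializer that, on a low-charge indication, flushes the volatile navigation and payload registers to an initialization file that the controller reads back on reboot. The file name, field layout, and JSON encoding below are illustrative assumptions.

```python
import json
from pathlib import Path

INIT_FILE = Path("agv_init_state.json")   # hypothetical location

def snapshot_on_low_charge(nav_registers: dict, payload_registers: dict) -> None:
    """Persist volatile navigation/payload registers so that state, pose,
    and payload identity survive a controller power-down."""
    INIT_FILE.write_text(json.dumps({
        "navigation": nav_registers,      # e.g. pose, location, predicted state
        "payload": payload_registers,     # e.g. payload id, state, pose
    }))

def restore_on_reboot() -> dict:
    """Reload the initialization file (empty state if none was written)."""
    if INIT_FILE.exists():
        return json.loads(INIT_FILE.read_text())
    return {"navigation": {}, "payload": {}}

snapshot_on_low_charge(
    {"pose": [12.4, 3.1, 0.0], "location": "aisle_7", "state": "transit"},
    {"payload_id": "case_001234", "state": "onboard", "pose": "centered"},
)
print(restore_on_reboot()["navigation"]["location"])   # -> aisle_7
```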
93. The autonomous guided vehicle of claim 88, wherein the controller is configured so that upon indication from the comprehensive power management section of imminent decrease in available charge level, directed from the power supply to the controller, to less than demand level of the controller, the controller enters suspension of operation and hibernation.
94. The autonomous guided vehicle of claim 88, wherein the controller is configured so that upon indication from the comprehensive power management section of imminent decrease in available charge level, directed from the power supply to the branch circuit of the drive section, the controller is configured to command the drive section so as to navigate the autonomous guided vehicle along a predetermined auxiliary path and auxiliary trajectory to a predetermined autonomous guided vehicle auxiliary stop location in the facility.
95. The autonomous guided vehicle of claim 88, wherein the controller is configured so that upon indication from the comprehensive power management section of imminent decrease in available charge level, directed from the power supply to the branch circuit of the payload handling section, the controller is configured to command the payload handling section to move the payload handling actuator, and any payload thereon, to a predetermined safe payload position in the payload bed.
96. The autonomous guided vehicle of claim 88, wherein the controller includes at least one of: a vehicle health status monitor, a drive section health status monitor, a payload handling section health status monitor, and a peripheral electronics section health status monitor; and a health status register section; and wherein the controller is configured so that upon indication from the comprehensive power management section of imminent decrease in available charge level, directed from the power supply to the controller, to less than demand level of the controller, the controller configures stored health status information from the at least one of the vehicle health status monitor, the drive section health status monitor, the payload handling section health status monitor, and the peripheral electronics section health status monitor in the health status register section into an initialization file available on reboot of the controller.
97. The autonomous guided vehicle of claim 88, wherein the power supply is an ultra-capacitor, or the charge level is a voltage level.
98. A method for autonomous guided vehicle power management, the method comprising: providing an autonomous guided vehicle with a vehicle chassis with a power supply mounted thereon and powered sections connected to the chassis and each powered by the power supply, the powered sections including: a drive section with motors driving wheels, supporting the vehicle chassis, and disposed to traverse the autonomous guided vehicle on a traverse surface in a facility under autonomous guidance; a payload handling section with at least one payload handling actuator configured so that actuation of the at least one payload handling actuator effects transfer of a payload to and from a payload bed, of the vehicle chassis, and a storage in the facility; a peripheral electronics section having at least one of an autonomous pose and navigation sensor, at least one of a payload handling sensor, and at least one peripheral motor, the at least one peripheral motor being separate and distinct from each of the motors of the drive section and each actuator of the payload handling section; and effecting, with a controller communicably coupled respectively to the drive section, the payload handling section, and the peripheral electronics section, each autonomous operation of the autonomous guided vehicle; and monitoring a charge level of the power supply with a comprehensive power management section of the controller, wherein the comprehensive power management section is connected to each respective branch circuit of the drive section, the payload handling section, and the peripheral electronics section respectively powering the drive section, the payload handling section, and the peripheral electronics section from the power supply, the comprehensive power management section manages power consumption of the branch circuits based on a demand level of each branch circuit relative to the charge level available from the power supply.
99. The method of claim 98, wherein the comprehensive power management section manages a demand charge level of each respective branch circuit by switching each respective branch circuit on or off in a predetermined pattern based on the demand charge level of each respective branch circuit with respect to other branch circuits and the charge level available from the power supply.
100. The method of claim 99, wherein the predetermined pattern is arranged to switch off branch circuits with a decrease in the available charge level from the power supply, so as to maximize available charge level from the power supply directed to the controller.
101. The method of claim 99, wherein the predetermined pattern is arranged to switch off branch circuits with a decrease in the available charge level from the power supply so that the available charge level directed to the controller is equal to or exceeds the demand charge level of the controller for a maximum time based on the available charge level of the power supply.
102. The method of claim 98, further comprising at least one of: with an autonomous navigation control section of the controller, registering and holding in volatile memory autonomous guided vehicle state and pose navigation information, historic and current, that is deterministic of and describing current and predicted state, pose, and location of the autonomous guided vehicle; and with an autonomous payload handling control section of the controller, registering and holding in volatile memory current payload identity, state, and pose information, historic and current; wherein, upon indication from the comprehensive power management section of imminent decrease in available charge level, directed from the power supply to the controller, to less than demand level of the controller, the controller configures at least one of the autonomous guided vehicle state and pose navigation information and the payload identity, state, and pose information, held in respective registry and memory of corresponding controller sections, into an initialization file available on reboot of the controller.
103. The method of claim 98, wherein upon indication from the comprehensive power management section of imminent decrease in available charge level, directed from the power supply to the controller, to less than demand level of the controller, the controller enters suspension of operation and hibernation.
104. The method of claim 98, wherein upon indication from the comprehensive power management section of imminent decrease in available charge level, directed from the power supply to the branch circuit of the drive section, the controller commands the drive section so as to navigate the autonomous guided vehicle along a predetermined auxiliary path and auxiliary trajectory to a predetermined autonomous guided vehicle auxiliary stop location in the facility.
105. The method of claim 98, wherein upon indication from the comprehensive power management section of imminent decrease in available charge level, directed from the power supply to the branch circuit of the payload handling section, the controller commands the payload handling section to move the payload handling actuator, and any payload thereon, to a predetermined safe payload position in the payload bed.
106. The method of claim 98, further comprising: providing the controller with at least one of a vehicle health status monitor, a drive section health status monitor, a payload handling section health status monitor, and a peripheral electronics section health status monitor; and a health status register section; and wherein, upon indication from the comprehensive power management section of imminent decrease in available charge level, directed from the power supply to the controller, to less than demand level of the controller, the controller configures stored health status information from the at least one of the vehicle health status monitor, the drive section health status monitor, the payload handling section health status monitor, and the peripheral electronics section health status monitor in the health status register section into an initialization file available on reboot of the controller.
107. The method of claim 98, wherein the power supply is an ultra-capacitor, or the charge level is a voltage level.
PCT/US2022/072592 2021-08-12 2022-05-26 Autonomous transport vehicle with vision system WO2023019038A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
KR1020237044782A KR20240046119A (en) 2021-08-12 2022-05-26 Autonomous transport vehicle with vision system
CN202280052290.7A CN117794845A (en) 2021-08-12 2022-05-26 Autonomous transport vehicle with vision system
EP22856718.6A EP4384470A1 (en) 2021-08-12 2022-05-26 Autonomous transport vehicle with vision system
CA3220378A CA3220378A1 (en) 2021-08-12 2022-05-26 Autonomous transport vehicle with vision system

Applications Claiming Priority (10)

Application Number Priority Date Filing Date Title
US202163232531P 2021-08-12 2021-08-12
US202163232546P 2021-08-12 2021-08-12
US63/232,531 2021-08-12
US63/232,546 2021-08-12
US202163251398P 2021-10-01 2021-10-01
US63/251,398 2021-10-01
US17/804,039 2022-05-25
US17/804,026 US20230050980A1 (en) 2021-08-12 2022-05-25 Autonomous transport vehicle with vision system
US17/804,039 US20230107709A1 (en) 2021-10-01 2022-05-25 Autonomous transport vehicle with power management
US17/804,026 2022-05-25

Publications (1)

Publication Number Publication Date
WO2023019038A1 true WO2023019038A1 (en) 2023-02-16

Family

ID=85200328

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2022/072592 WO2023019038A1 (en) 2021-08-12 2022-05-26 Autonomous transport vehicle with vision system

Country Status (4)

Country Link
EP (1) EP4384470A1 (en)
KR (1) KR20240046119A (en)
CA (1) CA3220378A1 (en)
WO (1) WO2023019038A1 (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100102625A1 (en) * 2008-10-24 2010-04-29 The Boeing Company Intelligent Energy Management Architecture
US20160103449A1 (en) * 2013-05-27 2016-04-14 Renault S.A.S. Operating method for a vehicle in manual mode and in autonomous mode
US20210039897A1 (en) * 2015-01-23 2021-02-11 Symbotic Llc Storage and retrieval system transport vehicle
US20210114826A1 (en) * 2019-10-16 2021-04-22 Symbotic Canada, Ulc Vision-assisted robotized depalletizer

Also Published As

Publication number Publication date
EP4384470A1 (en) 2024-06-19
CA3220378A1 (en) 2023-02-16
KR20240046119A (en) 2024-04-08

Similar Documents

Publication Publication Date Title
US20230050980A1 (en) Autonomous transport vehicle with vision system
AU2017301538B2 (en) Inventory management
KR102101417B1 (en) Joint inventory monitoring
US10488523B2 (en) Using laser sensors to augment stereo sensor readings for robotic devices
US10289111B1 (en) Systems and methods for removing debris from warehouse floors
US10929800B1 (en) Modular automated inventory sorting and retrieving
CN108349079B (en) Information communication about a robot using an optical identifier
US10122995B2 (en) Systems and methods for generating and displaying a 3D model of items in a warehouse
US9665095B1 (en) Systems and methods for removing debris from warehouse floors
EP3391297B1 (en) Illuminating containers in an inventory system
US9527710B1 (en) Enhanced inventory holder
US20230107709A1 (en) Autonomous transport vehicle with power management
EP4384470A1 (en) Autonomous transport vehicle with vision system
CN117794845A (en) Autonomous transport vehicle with vision system
TW202307779A (en) Autonomous transport vehicle with vision system
US20240111308A1 (en) Logistics autonomous vehicle with robust object detection, localization and monitoring
TW202421549A (en) Logistics autonomous vehicle with robust object detection, localization and monitoring
WO2024107684A1 (en) Logistics autonomous vehicle with robust object detection, localization and monitoring

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 22856718; Country of ref document: EP; Kind code of ref document: A1)
WWE Wipo information: entry into national phase (Ref document number: 3220378; Country of ref document: CA)
WWE Wipo information: entry into national phase (Ref document number: 2023573211; Country of ref document: JP)
WWE Wipo information: entry into national phase (Ref document number: 202280052290.7; Country of ref document: CN)
NENP Non-entry into the national phase (Ref country code: DE)
ENP Entry into the national phase (Ref document number: 2022856718; Country of ref document: EP; Effective date: 20240312)