CN110914779A - Autonomous vehicle repositioning - Google Patents
- Publication number: CN110914779A
- Application number: CN201880047506.4A
- Authority: CN (China)
- Legal status: Pending (assumed from the register; not a legal conclusion)
Classifications
- G05D1/0027 — Remote control involving a plurality of vehicles, e.g. fleet or convoy travelling
- G01C21/3438 — Route searching for rendez-vous or ride sharing
- G05D1/0088 — Control characterized by autonomous decision making, e.g. artificial intelligence
- G05D1/0291 — Fleet control
- G05D1/0297 — Fleet control by controlling means in a control room
- G06F16/29 — Geographical information databases
- G06Q30/0205 — Market analysis by location or geographical consideration
- G06T7/0004 — Industrial image inspection
- G06V20/56 — Context or environment of the image exterior to a vehicle, using sensors mounted on the vehicle
- G06V20/582 — Recognition of traffic signs
- G06V20/584 — Recognition of vehicle lights or traffic lights
- G06V20/586 — Recognition of parking space
- G06V20/588 — Recognition of the road, e.g. lane markings
- G06V20/597 — Recognising the driver's state or behaviour, e.g. attention or drowsiness
- G08G1/0112 — Traffic parameters based on data from the vehicle, e.g. floating car data [FCD]
- G08G1/0175 — Identifying vehicles by photographing, e.g. when violating traffic rules
- G08G1/143 — Indication of available parking spaces inside the vehicles
- G08G1/148 — Management of a network of parking areas
- G08G1/20 — Monitoring the location of vehicles belonging to a group, e.g. fleet of vehicles
- G06N20/00 — Machine learning
- G06N5/025 — Extracting rules from data
- G06T2207/10032 — Satellite or aerial image; remote sensing
- G06T2207/10044 — Radar image
- G06T2207/30181 — Earth observation
- G06T2207/30184 — Infrastructure
- G06T2207/30248 — Vehicle exterior or interior
- G06T2207/30252 — Vehicle exterior; vicinity of vehicle
- G06T2207/30256 — Lane; road marking
Abstract
A platform for distributing and navigating a fleet of autonomous or semi-autonomous vehicles over a plurality of paths is provided herein. The platform may employ demand prediction algorithms and transitional relocation algorithms to allocate the autonomous or semi-autonomous fleet to execute orders and tasks.
Description
Cross-Reference to Related Applications
This application claims priority to U.S. provisional patent application No. 62/534,929, filed on July 20, 2017, the contents of which are incorporated herein by reference in their entirety.
Background
The field of autonomous and semi-autonomous vehicles is one of continuous innovation and development. Autonomous and semi-autonomous vehicles are used for many purposes, including warehouse inventory operations, home cleaning, hospital transport, sanitation, and military or defense applications.
Disclosure of Invention
A platform for distribution and navigation of a fleet of autonomous or semi-autonomous vehicles over a plurality of paths is provided herein, the platform comprising: a fleet comprising a plurality of autonomous or semi-autonomous vehicles, wherein each autonomous or semi-autonomous vehicle comprises: an autonomous or semi-autonomous propulsion system; a position sensor configured to measure a current vehicle position of the vehicle; a condition sensor configured to measure a current vehicle state; and a communication device configured to transmit the current vehicle position and the current vehicle state; and a server processor configured to provide a server application, the server application comprising: a database comprising a map of a plurality of paths, wherein each path is associated with path parameters including an autonomous driving safety factor and a speed coefficient; a communication module that receives the current vehicle position and the current vehicle state from the communication device; a scheduling module that assigns one or more of the plurality of autonomous or semi-autonomous vehicles to a task destination based at least on the current vehicle position and the current vehicle state; and a navigation module that applies a route calculation algorithm to determine a vehicle task route from the current vehicle position to the task destination based at least on the path parameters and the current vehicle state, wherein the vehicle task route includes at least a portion of one of the plurality of paths, and wherein the communication device further directs the autonomous or semi-autonomous propulsion system of the autonomous or semi-autonomous vehicle based on the vehicle task route.
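By way of illustration only, a route calculation algorithm of the kind described above could be sketched as a shortest-path search whose per-segment cost combines segment length with the autonomous driving safety factor and speed coefficient. The example graph, field names, and cost weighting below are assumptions for exposition, not part of the disclosure.

```python
import heapq

# Directed graph: node -> list of (neighbor, length_m, safety_factor, speed_coeff).
# Values are invented for illustration.
GRAPH = {
    "depot": [("a", 500, 0.9, 0.8), ("b", 300, 0.6, 0.9)],
    "a":     [("dest", 400, 0.95, 0.7)],
    "b":     [("dest", 700, 0.5, 1.0)],
    "dest":  [],
}

def segment_cost(length_m, safety, speed):
    # Penalize less-safe segments and reward faster ones; real weightings
    # would be tuned or learned from fleet data.
    return length_m * (2.0 - safety) / max(speed, 0.1)

def plan_route(start, goal):
    """Dijkstra over weighted segment costs."""
    frontier = [(0.0, start, [start])]
    seen = set()
    while frontier:
        cost, node, route = heapq.heappop(frontier)
        if node == goal:
            return cost, route
        if node in seen:
            continue
        seen.add(node)
        for nxt, length, safety, speed in GRAPH[node]:
            heapq.heappush(
                frontier,
                (cost + segment_cost(length, safety, speed), nxt, route + [nxt]),
            )
    return float("inf"), []

cost, route = plan_route("depot", "dest")
# The shorter-but-less-safe path via "b" loses to the safer path via "a".
```

Here the shorter path through the low-safety segment is rejected in favor of the safer alternative, which is the behavior the safety factor is meant to induce.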
In some embodiments, the server application further comprises a demand database comprising historical demand data associated with a geographic area, wherein the geographic area comprises at least the task destination. In some embodiments, the server application further comprises a demand prediction module that applies a prediction algorithm to determine a predicted demand schedule for each of the autonomous or semi-autonomous vehicles based at least on the historical demand data, wherein the predicted demand schedule comprises a predicted demand time period and a predicted demand task location in the geographic area. In some embodiments, the server application further comprises a transitional relocation module that assigns a transitional relocation mode to each of the plurality of autonomous or semi-autonomous vehicles based at least on one or more of the predicted demand task location, the predicted demand time period, the task destination, and the current vehicle state. In some embodiments, the transitional relocation mode comprises a replenishment station (depot) mode corresponding to a replenishment station location, a parking mode associated with one of a plurality of parking space locations, and a cruise (hover) mode associated with a set threshold cruise distance from the task destination or the predicted demand task location. In some embodiments, the database further comprises the plurality of parking space locations in the geographic area. In some embodiments, the application further comprises a parking allocation module that determines a selected parking space location for one or more of the plurality of autonomous or semi-autonomous vehicles based on at least one of the task destination and the predicted demand task location, the parking mode, and the plurality of parking space locations.
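A rule-based transitional relocation policy of the kind contemplated above might be sketched as follows. The thresholds, field names, and mode labels are illustrative assumptions only; the patent also contemplates machine-learned variants.

```python
from dataclasses import dataclass

@dataclass
class VehicleState:
    battery_pct: float            # current power level
    stock_pct: float              # remaining vehicle stock
    distance_to_demand_km: float  # distance to the predicted demand task location
    minutes_to_demand: float      # time until the predicted demand period begins

def assign_relocation_mode(v: VehicleState, cruise_threshold_km: float = 1.0) -> str:
    # Low power or stock -> return to the replenishment station (depot).
    if v.battery_pct < 20 or v.stock_pct < 10:
        return "replenishment_station"
    # Demand is imminent and nearby -> cruise within the threshold distance.
    if v.minutes_to_demand < 15 and v.distance_to_demand_km <= cruise_threshold_km:
        return "cruise"
    # Otherwise wait in an assigned parking space.
    return "parking"

mode = assign_relocation_mode(VehicleState(80, 90, 0.5, 10))
```

A healthy vehicle half a kilometer from demand that starts in ten minutes is sent cruising; the same vehicle with a depleted battery would instead be routed to the depot.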
In some embodiments, the navigation module further applies the route calculation algorithm to determine a vehicle relocation route from the task destination to: a replenishment station location, the determination based on the replenishment station mode; a selected parking space location, the determination based on the parking mode; or a vehicle cruise route, the determination based on the cruise mode. In some embodiments, the vehicle cruise route includes at least a portion of one of the plurality of paths within the set threshold cruise distance from the task destination or the predicted demand task location. In some embodiments, the communication device further directs the autonomous or semi-autonomous propulsion system of the autonomous or semi-autonomous vehicle to remain at the replenishment station location, at the selected parking space location, or within the vehicle cruise route for the predicted demand time period. In some embodiments, the route calculation algorithm comprises a machine learning algorithm, a rule-based algorithm, or both. In some embodiments, the prediction algorithm comprises a machine learning algorithm, a rule-based algorithm, or both. In some embodiments, the current vehicle state includes a vehicle power level, a vehicle stock level, a vehicle hardware state, or any combination thereof. In some embodiments, at least one of the speed coefficient and the autonomous driving safety factor comprises: a speed limit, an average speed, a time-dependent average speed, a number of intersections, a number of turns, a type of turn, an accident indicator, a stopped vehicle indicator, a number of lanes, a one-way street indicator, a cellular reception parameter, a road grade, a maximum road grade, an average pedestrian density, a maximum pedestrian density, a minimum pedestrian density, a time-dependent pedestrian density, an average rider density, an unprotected turn parameter, a road flatness parameter, a road visibility parameter, or any combination thereof.
In some embodiments, the autonomous or semi-autonomous vehicle further comprises a sensor configured to measure sensed data. In some embodiments, the database further stores at least one of the current vehicle position, the current vehicle state, and the sensed data. In some embodiments, the path parameters are based at least on the sensed data. In some embodiments, at least one of the safety factor and the speed coefficient is based on the sensed data. In some embodiments, the sensed data enables crowd-sourced determination of the safety factor and the speed coefficient. In some embodiments, the application further comprises a path parameter prediction module that predicts future path parameters based at least on the sensed data. In some embodiments, the route calculation algorithm further determines the vehicle task route based on the predicted path parameters. In some embodiments, the autonomous or semi-autonomous vehicle further comprises a sensor configured to measure sensed data, wherein the sensed data corresponds to a parking space status of one or more of the plurality of parking space locations in the geographic area. In some embodiments, the parking allocation module further determines the selected parking space location based on the parking space status. In some embodiments, the server application further comprises a display module that displays at least one of the current vehicle position, the current vehicle state, the task destination, the path parameters, the task route, the selected parking space location, and the predicted demand task location.
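The crowd-sourced determination of path parameters described above could, under one set of assumptions, aggregate per-segment reports from fleet vehicles into updated coefficients. The report schema and the mapping from observations to coefficients below are invented for illustration.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical sensed-data reports, one per vehicle pass over a path segment.
reports = [
    {"segment": "elm_st", "observed_speed_mph": 22, "hard_brakes": 0},
    {"segment": "elm_st", "observed_speed_mph": 18, "hard_brakes": 1},
    {"segment": "oak_ave", "observed_speed_mph": 35, "hard_brakes": 0},
]

def aggregate(reports, speed_limit_mph=35):
    by_segment = defaultdict(list)
    for r in reports:
        by_segment[r["segment"]].append(r)
    params = {}
    for seg, rs in by_segment.items():
        avg_speed = mean(r["observed_speed_mph"] for r in rs)
        brake_rate = mean(r["hard_brakes"] for r in rs)
        params[seg] = {
            # Speed coefficient: fraction of the limit the fleet actually achieves.
            "speed_coeff": round(avg_speed / speed_limit_mph, 2),
            # Safety factor degrades as hard-braking events accumulate.
            "safety_factor": round(max(0.0, 1.0 - brake_rate), 2),
        }
    return params

params = aggregate(reports)
```

As more vehicles traverse a segment, the averaged observations converge toward the segment's true drivable speed and risk profile, which is the point of crowd-sourcing the parameters.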
Another aspect provided herein is a fleet of vehicles comprising a plurality of autonomous or semi-autonomous vehicles operating autonomously or semi-autonomously and a fleet management module for coordinating the fleet, the fleet management module coordinating the activity and position of each autonomous or semi-autonomous vehicle in the fleet, the fleet configured to monitor, collect, and report data and capable of operating in an unstructured open or closed environment, each autonomous or semi-autonomous vehicle in the fleet comprising: a transmission system; a power system; a navigation module for navigation in the unstructured open or closed environment; at least one communication module adapted to transmit data from each autonomous or semi-autonomous vehicle to at least one of the fleet management module, a user, and other autonomous or semi-autonomous vehicles in the fleet, and to accept instructions from the fleet management module or the user; a sensor system comprising a plurality of sensors configured to detect the environment surrounding the autonomous or semi-autonomous vehicle; and at least one processor configured to manage the transmission system, the power system, the navigation module, the sensor system, and the at least one communication module, and to access data provided by the autonomous or semi-autonomous vehicle relating to its navigation.
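The per-vehicle reporting loop between the communication module and the fleet management module might carry a status payload along the following lines; the schema is an assumption for illustration only, not a format defined by the patent.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class StatusReport:
    vehicle_id: str
    lat: float          # current vehicle position
    lon: float
    battery_pct: float  # power system level
    hardware_ok: bool   # condition sensor summary

def encode(report: StatusReport) -> str:
    # Serialize for wireless transmission to the fleet management module.
    return json.dumps(asdict(report))

def decode(payload: str) -> StatusReport:
    # Reconstruct the report on the server side.
    return StatusReport(**json.loads(payload))

msg = encode(StatusReport("av-042", 37.77, -122.42, 64.0, True))
restored = decode(msg)
```

A round-trip through encode/decode preserves the report exactly, which is the minimal property any such wire format must satisfy.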
In some embodiments, the user comprises a fleet manager, a subcontracting provider, a service provider, a consumer, a business entity, a government entity, an individual, or a third party. In some embodiments, the fleet management module is controlled by the user. In some embodiments, the unstructured open environment is an unrestricted geographic area reachable by navigable pathways, including one or more of: public roads, private roads, bicycle lanes, open territory, open public land, open private land, sidewalks, lakes, rivers, streams, and open airspace; and the closed environment is a confined, enclosed, or semi-enclosed structure accessible by navigable pathways, including: open areas or rooms within commercial buildings, or the airspace within such areas or rooms (with or without structures or obstacles); public or private aisles; corridors; tunnels; ramps; elevators; conveyors; and walkways. In some embodiments, the navigation module controls the routing of the transmission system of each autonomous or semi-autonomous vehicle in the fleet in the unstructured open or closed environment.
In some embodiments, the communication is via wireless transmission. In some embodiments, each autonomous or semi-autonomous vehicle may be configured to accept wireless transmissions from the user. In some embodiments, the user's wireless transmission interactions are conducted via a mobile application, sent by an electronic device, and forwarded to the at least one communication module via one or more of: a central server, the fleet management module, and a mesh network. In some embodiments, fleet wireless transmission interactions from each autonomous or semi-autonomous vehicle communication module are forwarded to one or more users via a central server, the fleet management module, and/or a mesh network. In some embodiments, the fleet wireless transmission interactions from each autonomous or semi-autonomous vehicle include one or more of: road and route conditions, road and route information, traffic speed, traffic congestion, weather conditions, parking violations, public utility issues, street light issues, traffic light issues, the current state of street lights and traffic lights, pedestrian density, pedestrian traffic, animals, alternative vehicle traffic, area surveillance, channel conditions, bridge inspection, internal and external structural inspection, and foliage inspection.
In some embodiments, the electronic device comprises one or more of: a smartphone, a personal mobile device, a personal digital assistant (PDA), a desktop computer, a laptop computer, a tablet computer, and a wearable computing device. In some embodiments, the plurality of sensors includes one or more of: still cameras, video cameras, perspective projection sensors, microphones, infrared sensors, radar, lidar, altimeters, and depth detectors. In some embodiments, the sensor system further comprises a transmission system sensor configured to monitor drive mechanism performance or to monitor power system levels. In some embodiments, the sensors are further configured to report sensor readings remotely to the fleet management module via the at least one communication module. In some embodiments, each autonomous or semi-autonomous vehicle further comprises a storage or memory device in which data collected from the sensor system is retrievably stored. In some embodiments, each autonomous or semi-autonomous vehicle further comprises a communication port for wired communication between the autonomous or semi-autonomous vehicle and an external digital processing device.
In some embodiments, each autonomous or semi-autonomous vehicle further comprises a software module executed by the at least one processor to apply one or more algorithms to data collected from the plurality of sensors to access and store to a memory device one or more of: road and route conditions, road and route information, traffic speed, traffic congestion, weather conditions, parking violations, public utility problems, street light problems, traffic light problems, street light and traffic light current states, pedestrian density, pedestrian traffic, animals, alternative vehicle traffic, area monitoring, channel conditions, bridge inspections, internal and external structure inspections, and foliage inspections. In some embodiments, the at least one communication module is further configured to receive and respond to commands from a user to: select or change a monitored destination, select or change an order of monitored destinations, select or change a route to a destination to be monitored, report a geolocation of the autonomous or semi-autonomous vehicle, report a condition of the autonomous or semi-autonomous vehicle, report a speed of the autonomous or semi-autonomous vehicle, or report an ETA to the destination. In some embodiments, each autonomous or semi-autonomous vehicle is configured with a maximum speed range from 13mph to 90 mph.
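The receive-and-respond command interface described above can be sketched as a simple dispatcher. This is an illustrative sketch only; the class and command names (e.g., `CommandHandler`, `report_geolocation`) are assumptions and not part of this disclosure:

```python
from dataclasses import dataclass, field

@dataclass
class VehicleState:
    """Minimal mutable state a command can read or modify (illustrative)."""
    route: list = field(default_factory=list)
    geolocation: tuple = (0.0, 0.0)
    speed_mph: float = 0.0
    condition: str = "nominal"

class CommandHandler:
    """Dispatches user commands received over the communication module."""

    def __init__(self, state: VehicleState):
        self.state = state

    def handle(self, command: str, *args):
        if command == "change_route":
            self.state.route = list(args)   # new ordered list of destinations
            return self.state.route
        if command == "report_geolocation":
            return self.state.geolocation
        if command == "report_condition":
            return self.state.condition
        if command == "report_speed":
            return self.state.speed_mph
        raise ValueError(f"unknown command: {command}")

handler = CommandHandler(VehicleState(geolocation=(37.77, -122.42)))
handler.handle("change_route", "stop_a", "stop_b")
print(handler.handle("report_geolocation"))  # (37.77, -122.42)
```

A real communication module would deserialize such commands from wireless transmissions; this sketch shows only the dispatch step.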
In some embodiments, the fleet of vehicles is controlled directly by a user. In some embodiments, a plurality of the autonomous or semi-autonomous vehicles in the fleet are operated on behalf of a third party provider or a third party service provider. In some embodiments, a plurality of said autonomous or semi-autonomous vehicles in the fleet are further configured as part of at least one sub-fleet comprising a sub-plurality of autonomous or semi-autonomous vehicles, each sub-fleet configured to operate independently or with the fleet. In some embodiments, each autonomous or semi-autonomous vehicle is configured with a forward mode, a reverse mode, and a park mode. In some embodiments, a plurality of autonomous or semi-autonomous vehicles in the fleet are configured as secondary autonomous or semi-autonomous vehicles that are half the size of the other autonomous or semi-autonomous vehicles in the fleet, wherein a smaller secondary autonomous or semi-autonomous vehicle is an independent vehicle having all of the same capabilities as any other autonomous or semi-autonomous vehicle in the fleet. In some embodiments, the secondary autonomous or semi-autonomous vehicle may be configured for storage in one or more securable compartments provided in a plurality of autonomous or semi-autonomous vehicles in the fleet. In some embodiments, the secondary autonomous or semi-autonomous vehicle may be separable from the autonomous vehicle and configured for secondary duties.
In some embodiments, each autonomous or semi-autonomous vehicle is configured with a "crawl" or "creep" speed that includes a speed range of about 0.01mph to about 13.0 mph. In some embodiments, the secondary autonomous or semi-autonomous vehicle is configured with a maximum speed range from 13.0mph to about 90.0 mph. In some embodiments, the secondary autonomous or semi-autonomous vehicle is configured with a "crawl" or "creep" speed that includes a speed range between about 0.01mph to about 13.0 mph. In some embodiments, a secondary autonomous or semi-autonomous vehicle is configured with a sensor system that includes one or more of: still cameras, video cameras, lidar, radar, ultrasonic sensors, microphones, altimeters, and depth detectors. In some embodiments, the secondary autonomous or semi-autonomous vehicle is configured with internal computer processing capabilities. In some embodiments, the secondary autonomous vehicle is configured with a forward mode, a reverse mode, and a park mode.
Another aspect provided herein is a fleet of autonomous or semi-autonomous vehicles, comprising a plurality of autonomous or semi-autonomous vehicles operating autonomously or semi-autonomously and a fleet management module for coordinating the fleet of vehicles, the fleet management module coordinating the activities and locations of each autonomous or semi-autonomous vehicle in the fleet of vehicles, the fleet of vehicles configured to monitor, collect and report data and capable of operating in an unstructured open or closed environment, each autonomous or semi-autonomous vehicle in the fleet of vehicles comprising: a transmission system, a power system, a navigation module for navigating in an unstructured open or closed environment, at least one communication module configurable to: transmit data between and from each autonomous or semi-autonomous vehicle of the fleet of autonomous or semi-autonomous vehicles to at least one of: a fleet management module, a user, or other autonomous or semi-autonomous vehicles in the fleet, the transmitted data relating at least to environmental conditions surrounding the autonomous or semi-autonomous vehicle; store data from each autonomous or semi-autonomous vehicle to a memory device; and receive instructions from a fleet management module or a user; a sensor system comprising a plurality of sensors configured to detect the environment surrounding the autonomous or semi-autonomous vehicle; at least one processor configured to manage the transmission system, the power system, the navigation module, the sensor system, and the at least one communication module, and to access data provided by the autonomous or semi-autonomous vehicle, the data relating to navigation of the autonomous or semi-autonomous vehicle; and a software module executed by the at least one processor to apply one or more algorithms to data collected from the plurality of sensors to identify, record and store to the memory device one or more of: road
and path conditions, high precision map data, traffic speed, traffic congestion, weather conditions, parking violations, utility problems, street light problems, traffic light problems, street light and traffic light current states, pedestrian density, pedestrian traffic, animals, alternative vehicle traffic, area monitoring, channel conditions, bridge inspections, internal and external structure inspections, and foliage inspections.
The present disclosure relates to autonomous or semi-autonomous vehicle fleets comprising a plurality of vehicles, in particular vehicles for transporting or retrieving transports in unstructured outdoor or enclosed environments. The present disclosure also relates to a fleet of vehicles comprising a plurality of autonomous vehicles operating autonomously or semi-autonomously, each autonomous vehicle configured to: data is monitored, collected and reported while operating in an unstructured open or closed environment.
A vehicle fleet is provided herein, comprising a plurality of autonomous or semi-autonomous vehicles and a fleet management module (associated with a central server) for coordinating the vehicle fleet. A fleet management module coordinates the activity, location, and positioning of each autonomous or semi-autonomous vehicle in a fleet configured to monitor, collect, and report data while operating in an unstructured open or closed environment.
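The coordination role of the fleet management module might be sketched as a central registry of per-vehicle activity and location. This is an illustrative sketch only; the data structures and names (`Vehicle`, `FleetManagementModule`) are assumptions, not part of this disclosure:

```python
from dataclasses import dataclass

@dataclass
class Vehicle:
    """Per-vehicle record tracked by the fleet management module (illustrative)."""
    vehicle_id: str
    location: tuple          # (latitude, longitude)
    activity: str = "idle"

class FleetManagementModule:
    """Central registry tracking the activity and location of each vehicle."""

    def __init__(self):
        self.vehicles = {}

    def register(self, vehicle: Vehicle):
        self.vehicles[vehicle.vehicle_id] = vehicle

    def update(self, vehicle_id: str, location: tuple, activity: str):
        v = self.vehicles[vehicle_id]
        v.location, v.activity = location, activity

    def idle_vehicles(self):
        return [v for v in self.vehicles.values() if v.activity == "idle"]

fleet = FleetManagementModule()
fleet.register(Vehicle("av-1", (37.77, -122.42)))
fleet.register(Vehicle("av-2", (37.78, -122.41)))
fleet.update("av-1", (37.79, -122.40), "delivering")
print([v.vehicle_id for v in fleet.idle_vehicles()])  # ['av-2']
```

In a deployment, the `update` calls would be driven by the wireless transmissions described below rather than by direct function calls.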
In some embodiments, the fleet of vehicles is alternatively configured to: sell and transport goods, including a plurality of compartments for transporting or vending one or more goods; respond to scheduled or immediate/on-demand requests, and/or position itself based on anticipated demand; provide temperature-controlled compartments to contain hot or cold items; and carry preloaded cargo based on anticipated demand for where to go and what to carry.
In some embodiments, a fleet of vehicles is configured to enable a consumer, user, or multiple users to summon one or more autonomous or semi-autonomous vehicles through a mobile (phone/tablet/watch/laptop) application for a designated delivery or to bring a mobile marketplace to them. In some embodiments, the consumer, user, or users may choose to additionally specify an exact location on the map for the vehicle (e.g., by placing a pointer, etc.) for a designated delivery or pick-up.
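A summon request carrying a user-pinned map location could look like the following sketch; the payload fields and the nearest-vehicle heuristic are assumptions for illustration, not part of this disclosure:

```python
from dataclasses import dataclass

@dataclass
class SummonRequest:
    """Hypothetical request payload sent by the mobile application."""
    user_id: str
    service: str      # e.g., "delivery" or "mobile_marketplace"
    pin: tuple        # exact map location chosen by the user (lat, lon)

def nearest_vehicle(request, vehicle_positions):
    """Pick the fleet vehicle closest to the user's pinned location."""
    def sq_dist(pos):
        return (pos[0] - request.pin[0]) ** 2 + (pos[1] - request.pin[1]) ** 2
    return min(vehicle_positions, key=lambda vid: sq_dist(vehicle_positions[vid]))

req = SummonRequest("user-7", "delivery", pin=(37.775, -122.418))
positions = {"av-1": (37.70, -122.48), "av-2": (37.776, -122.419)}
print(nearest_vehicle(req, positions))  # av-2
```

A production dispatcher would also weigh vehicle availability, cargo, and routing; squared coordinate distance stands in here for a real routing metric.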
In some embodiments, a fleet of vehicles is configured to provide one or more services, such as: shipping services, advertising services, land survey services, patrol services, monitoring services, traffic survey services, signage and signal survey services, and building or road infrastructure survey services. In some embodiments, the vehicle fleet service includes a "white label" service, which relates to the shipment or representation of "white label" products or services.
In some embodiments, each autonomous or semi-autonomous vehicle in the fleet of vehicles is equipped with at least one processor capable of both high-level computing power for processing and low-level safety-critical computing power for controlling hardware. In some embodiments, each autonomous or semi-autonomous vehicle in the fleet includes a transmission system (e.g., a drive system having propulsion engines, wheels, wings, rotors, blowers, rockets, propellers, brakes, etc.) and a power source.
In some embodiments, each autonomous or semi-autonomous vehicle in the fleet includes a navigation module (e.g., digital map, GPS, etc.) for navigating in an unstructured open or closed environment. In some embodiments, each autonomous or semi-autonomous vehicle in the fleet of vehicles comprises at least one communication module adapted to transmit data from the autonomous or semi-autonomous vehicle to at least one of: fleet managers, users, or other autonomous or semi-autonomous vehicles.
In some embodiments, each autonomous or semi-autonomous vehicle in the fleet comprises: at least one communication module configurable to receive, store, and transmit data to a user or users and autonomous or semi-autonomous vehicles in a fleet of vehicles; transmitting data between autonomous or semi-autonomous vehicles of a fleet of vehicles; transmitting data between and among the user or users and an autonomous or semi-autonomous vehicle in a fleet of vehicles, the transmitted data relating to at least environmental conditions and vehicle fleet interaction; a sensor system comprising a plurality of sensors configured to evaluate an environment surrounding an autonomous or semi-autonomous vehicle; at least one processor configured to manage the transmission system, the power system, the navigation module, the sensor system, and the at least one communication module; and a software module executed by the at least one processor to apply one or more algorithms to data collected from the plurality of sensors to identify, record and store to the memory device one or more of: road and path conditions (damaged roads, potholes), construction, road congestion, detours, traffic flow, traffic speed, traffic congestion, accidents, road user behavior, weather conditions, parking violations, public utility problems, street light problems, traffic light problems, current status of street lights and traffic lights, signage problems, pedestrian density/traffic, pedestrian behavior, animals, alternative vehicle traffic (e.g., motorcycles, mopeds, bicycles, wheelchairs, strollers, etc.), consumer/pedestrian flow through areas, area monitoring, parking space utilization, bridge inspection, internal and external structure inspection, and foliage inspection.
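The software module that applies algorithms to sensor data to identify, record, and store the observation categories listed above might, in its simplest rule-based form, be sketched as follows. The thresholds, sensor keys, and rule set are invented for illustration; the disclosure does not specify them:

```python
# Each rule: (sensor reading key, predicate, observation label to store).
# All values below are illustrative assumptions.
OBSERVATION_RULES = [
    ("pothole_depth_cm", lambda v: v > 3.0, "road and path conditions"),
    ("mean_speed_mph", lambda v: v < 10.0, "traffic congestion"),
    ("pedestrians_per_frame", lambda v: v > 20, "pedestrian density"),
]

def classify(readings: dict):
    """Map raw sensor readings to observation records for the memory device."""
    records = []
    for key, predicate, label in OBSERVATION_RULES:
        if key in readings and predicate(readings[key]):
            records.append({"label": label, "value": readings[key]})
    return records

memory_device = classify({"pothole_depth_cm": 4.2, "mean_speed_mph": 35.0})
print([r["label"] for r in memory_device])  # ['road and path conditions']
```

In practice such rules would likely be replaced by learned models; the sketch shows only the identify-record-store flow.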
In some embodiments, the surveillance application may be extended to include detecting and/or identifying people, vehicles, objects, moving objects within certain areas, such as the number of cars in a parking lot, the number of consumers or people entering and leaving a building, and the like.
In some embodiments, the inspection application is extensible to include business, office, home, building, and structure inspections. In some embodiments, the monitoring application may be extended to include traffic information, such as: business name, address, business type, and real-time attributes, including how crowded a business, park, or shopping mall is at any given time. In some embodiments, HD maps and contextual maps are updated with data collected from the sensors, including construction areas, road closures, road works, crowded areas, and the like. In some embodiments, an unstructured open environment is an unrestricted geographic area reachable through navigable pathways, including: public roads, private roads, bicycle lanes, sidewalks, or open airspace. In some embodiments, the enclosed environment is a confined, enclosed or semi-enclosed structure accessible through a navigable pathway, including: open areas or rooms within commercial buildings (with or without structures or obstacles); airspace within commercial buildings or rooms (with or without structures or obstacles); a public or private aisle; a corridor; a tunnel; a ramp; an elevator; conveyors; and walkways.
In some embodiments, a navigation system controls routing of transmission systems of autonomous or semi-autonomous vehicles in a fleet of vehicles in an unstructured open or closed environment. In some embodiments, the communication to the user or users, the fleet management module, the autonomous or semi-autonomous vehicles in the fleet, the communication between autonomous or semi-autonomous vehicles of the fleet, the transmission of received, stored, and transmitted data between the user or users and the autonomous or semi-autonomous vehicles in the fleet, and the vehicle fleet interaction occurs via wireless transmission.
In some embodiments, the wireless transmission interaction of the user or users is conducted via a mobile application and is sent by the electronic device and forwarded to the at least one communication module via one or more of: a central server, a fleet management module, and a mesh network.
In some embodiments, the vehicle fleet wireless transmission interaction from each autonomous or semi-autonomous vehicle communication module is forwarded to the user or users via: a central server, a fleet management module, and/or a mesh network. In some embodiments, the electronic device comprises a phone, a personal mobile device, a Personal Digital Assistant (PDA), a mainframe computer, a desktop computer, a notebook computer, a tablet computer, and/or a wearable computing device (including a communications headset, smart glasses, contact lens or lenses, a digital watch, a bracelet, a ring, jewelry, or combinations thereof).
In some embodiments, the plurality of sensors includes one or more of: still cameras, video cameras, perspective projection sensors, microphones, infrared sensors, ultrasonic sensors, radar sensors, lidar sensors, altimeters, and depth detectors.
In some embodiments, the autonomous or semi-autonomous vehicles of the fleet of vehicles further comprise a transmission system sensor configured to: monitor drive train performance (e.g., the propulsion engine), monitor power system levels (e.g., battery, solar, gasoline, propane, etc.), or monitor transmission performance (e.g., gearbox, tires, pedals, brakes, rotors, blowers, propellers, etc.).
In some embodiments, the sensor is further configured to remotely transmit the sensor readings to a fleet manager via at least one communication module. In some embodiments, the sensor is further configured to report the sensor readings remotely to the user or users through the at least one communication module. In some embodiments, the at least one communication module is further configured to receive and respond to commands from a user or users to: selecting or changing a monitored destination, selecting or changing an order of monitored destinations, selecting or changing a route to a destination to be monitored, reporting a geographic location of an autonomous or semi-autonomous vehicle, reporting a condition of an autonomous or semi-autonomous vehicle (e.g., fuel supply, accident, component failure), reporting a speed of an autonomous or semi-autonomous vehicle, or reporting an ETA to a destination.
In some embodiments, the fleet of vehicles are configured as land vehicles. In some embodiments, the land vehicle autonomous or semi-autonomous vehicles in the fleet are configured with a maximum speed range from 13.0mph to about 90.0 mph.
In some embodiments, land vehicle autonomous or semi-autonomous vehicles in a fleet of vehicles are configured with a "crawl" or "creep" speed that includes a speed range between about 0.01mph to about 1.0 mph. In some embodiments, land vehicle autonomous or semi-autonomous vehicles in a fleet of vehicles are configured with "crawl" or "creep" speeds that include a speed range between about 0.01mph to about 5.0 mph. In some embodiments, land vehicle autonomous or semi-autonomous vehicles in a fleet of vehicles are configured with "crawl" or "creep" speeds that include a speed range between about 0.01mph to about 10.0 mph. In some embodiments, land vehicle autonomous or semi-autonomous vehicles in a fleet of vehicles are configured with "crawl" or "creep" speeds that include a speed range between about 0.01mph to about 13.0 mph. In some embodiments, the land vehicle autonomous or semi-autonomous vehicles in the fleet are configured with an operating speed range from about 0.01mph to about 90.0 mph.
In some embodiments, the maximum speed is determined by hardware and software present in the autonomous or semi-autonomous vehicle. In some embodiments, the maximum speed allows operation in open roads, bicycle paths, and other environments that accommodate higher speeds.
In some embodiments, the operating speed in any given environment is managed by onboard sensors that monitor environmental conditions, operating environment, etc. to determine the appropriate speed at any given time.
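Such sensor-managed speed selection can be sketched as a governor that clamps the requested speed to the tightest cap among the sensed conditions. The condition names and cap values below are assumptions for illustration, not values from this disclosure:

```python
# Hypothetical per-condition speed caps in mph (illustrative values only).
SPEED_CAPS_MPH = {
    "open_road": 90.0,
    "bike_lane": 13.0,
    "foot_traffic": 5.0,
    "automatic_parking": 1.0,
}

def operating_speed(requested_mph: float, sensed_conditions: list) -> float:
    """Clamp the requested speed to the tightest cap among sensed conditions."""
    cap = min((SPEED_CAPS_MPH[c] for c in sensed_conditions
               if c in SPEED_CAPS_MPH), default=90.0)
    return min(max(requested_mph, 0.0), cap)

print(operating_speed(45.0, ["open_road", "foot_traffic"]))  # 5.0
```

Taking the minimum over all sensed conditions mirrors the safety measure mentioned above: when sources conflict, the vehicle defaults to the slower speed.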
In some embodiments of the fleet of vehicles, the plurality of autonomous or semi-autonomous vehicles includes a secondary autonomous or semi-autonomous vehicle that may be configured as an independent vehicle capable of operating in a manner similar to any other autonomous or semi-autonomous vehicle in the fleet of vehicles.
In some embodiments of the fleet, the secondary autonomous or semi-autonomous vehicle is a component of the land vehicle that is separable from the land vehicle and configured for secondary duties, such as: obtaining a soil, water or air sample; acquiring a close-up photo; access to a small or restricted area that is too small for a larger autonomous or semi-autonomous vehicle to enter; or to transport the component or package from an autonomous or semi-autonomous vehicle on a street or sidewalk to a door, drop box, or nearby secondary location. In some embodiments, the secondary autonomous or semi-autonomous vehicle is configured to transport the component or package to an entrance of a building or an interior of a building.
In some embodiments, the secondary autonomous or semi-autonomous vehicle is a smaller land-based autonomous or semi-autonomous vehicle. In some embodiments, the secondary autonomous or semi-autonomous vehicle is a drone. In some embodiments, the secondary autonomous or semi-autonomous vehicle is a watercraft. In some embodiments, the secondary autonomous or semi-autonomous vehicle is transported in a storage compartment of the land vehicle autonomous or semi-autonomous vehicle. In some embodiments, the secondary autonomous or semi-autonomous vehicle is transported on top of the land vehicle autonomous or semi-autonomous vehicle. In some embodiments, the secondary autonomous or semi-autonomous vehicle is configured for automatic retrieval from a storage compartment of the land autonomous or semi-autonomous vehicle. In some embodiments, the secondary autonomous or semi-autonomous vehicle is configured for assisted automatic retrieval from a storage bay of the land autonomous vehicle, wherein the land autonomous or semi-autonomous vehicle provides a ramp, platform, or lift to assist in retrieving the secondary autonomous vehicle from the bay of the land autonomous or semi-autonomous vehicle.
In some embodiments, the secondary autonomous vehicle is configured with a maximum speed range from 1.0mph to about 13.0 mph. In some embodiments, the secondary autonomous vehicle is configured with a maximum speed range from 1.0mph to about 90.0 mph. In some embodiments, the secondary autonomous vehicle is configured with a "crawl" or "creep" speed that includes a speed range between approximately 0.01mph to 1.0 mph. In some embodiments, land autonomous or semi-autonomous vehicles in a fleet of vehicles are configured with "crawl" or "creep" speeds that include a speed range between about 0.01mph to about 5.0 mph. In some embodiments, land autonomous or semi-autonomous vehicles in a fleet of vehicles are configured with "crawl" or "creep" speeds that include a speed range between about 0.01mph to about 10.0 mph. In some embodiments, land autonomous or semi-autonomous vehicles in a fleet of vehicles are configured with "crawl" or "creep" speeds that include a speed range between about 0.01mph to about 13.0 mph.
In some embodiments, the fleet of vehicles is fully autonomous. In some embodiments, the fleet of vehicles is semi-autonomous. In some embodiments, the fleet of vehicles is controlled directly by the user or users. In some embodiments, a plurality of the autonomous vehicles in the fleet are operated on behalf of a third party supplier or service provider. In some embodiments, autonomous vehicles in the fleet are configured for land travel as land vehicles. In some embodiments, autonomous vehicles in the fleet are configured for travel over water as watercraft. In some embodiments, autonomous vehicles in the fleet are configured to hover over land or water as hovercraft vehicles. In some embodiments, autonomous vehicles in the fleet are configured to travel in the air as aerial drones or aerial hovercraft vehicles.
In some embodiments, a plurality of said autonomous or semi-autonomous vehicles in the fleet are further configured as part of a sub-fleet comprising a sub-plurality of autonomous or semi-autonomous vehicles; each sub-fleet is configured to operate independently or in conjunction with a fleet of vehicles.
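Partitioning a fleet into independently operating sub-fleets might be sketched as follows; the vehicle identifiers and the assignment-map representation are illustrative assumptions:

```python
def partition_fleet(vehicle_ids, assignments):
    """Group vehicle ids into sub-fleets by an assignment map.

    Vehicles without an explicit assignment remain in the main fleet,
    so each sub-fleet can operate independently or with the fleet.
    """
    sub_fleets = {}
    for vid in vehicle_ids:
        sub_fleets.setdefault(assignments.get(vid, "main"), []).append(vid)
    return sub_fleets

fleet_ids = ["av-1", "av-2", "av-3", "av-4"]
subs = partition_fleet(fleet_ids, {"av-1": "survey", "av-2": "survey"})
print(sorted(subs))    # ['main', 'survey']
print(subs["survey"])  # ['av-1', 'av-2']
```

This matches the non-limiting example of a survey company leasing a dedicated sub-fleet while the remaining vehicles continue general fleet operation.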
Drawings
The novel features believed characteristic of the invention are set forth with particularity in the appended claims. A better understanding of the features and advantages of the present invention will be obtained by reference to the following detailed description that sets forth illustrative embodiments, in which the principles of the invention are utilized, and the accompanying drawings of which:
FIG. 1 is an exemplary view of a fleet of autonomous vehicles including two sub-fleets;
FIG. 2 is a front view of an exemplary autonomous vehicle alongside a pedestrian, in accordance with some embodiments;
FIG. 3 is a right side view of an exemplary autonomous vehicle, according to some embodiments;
FIG. 4 is a left side view of an exemplary autonomous vehicle alongside a pedestrian, in accordance with some embodiments;
FIG. 5 is a rear view of an exemplary autonomous vehicle, according to some embodiments;
FIG. 6 is a flow diagram of an example fleet management control module, according to some embodiments;
FIG. 7 is a flow diagram of an example autonomous vehicle application, according to some embodiments;
FIG. 8 shows a non-limiting schematic diagram of a digital processing device; in this case, the device has one or more CPUs, a memory, a communication interface, and a display;
FIG. 9 shows a non-limiting schematic diagram of a web/mobile application providing system; in this case, the system provides a browser-based and/or local mobile user interface;
FIG. 10 shows a non-limiting schematic diagram of a cloud-based web/mobile application provisioning system; in this case, the system includes resilient load-balanced, auto-scaling web server and application server resources and a synchronously replicated database;
FIG. 11 illustrates a non-limiting schematic of a platform for determining real-time parking status for a plurality of parking locations;
FIG. 12 illustrates another non-limiting schematic of a platform for determining real-time parking status for a plurality of parking locations; and
FIG. 13 illustrates another non-limiting schematic of a platform for determining real-time parking status for multiple parking locations.
Detailed Description
Although governmental organizations, non-governmental organizations, and contractors are tasked with monitoring the status of infrastructure to ensure the maintenance and safety of public and private facilities, such manual inspections are costly and cumbersome given the extent of roads, walkways, and buildings involved. Although airborne infrastructure monitoring has been used to record and maintain agricultural and wilderness conditions, such manned systems are costly to operate at scale and may not be configurable for transport over, and inspection of, ground and urban infrastructure. Furthermore, such infrastructure monitoring systems are incompatible with the use of, and addition to, current infrastructure databases and support lists. Accordingly, provided herein is a platform for determining a non-navigational quality of at least one infrastructure.
An autonomous or semi-autonomous vehicle fleet is provided herein, comprising a plurality of autonomous or semi-autonomous vehicles operating autonomously or semi-autonomously, each configured to monitor, collect, and report data while operating in an unstructured open or closed environment, and a fleet management module for coordinating the autonomous or semi-autonomous vehicle fleet. Compared to manned vehicles, autonomous and semi-autonomous vehicles may need to collect and process additional types and forms of navigation data to detect and respond to the surrounding environment and to address challenges specific to these applications. Furthermore, additional types and forms of data may be required for unmanned autonomous and semi-autonomous vehicles, as such unmanned vehicles cannot rely on passenger overrides or instructions. Thus, there is currently an unmet need for systems, platforms, and methods for autonomous or semi-autonomous vehicles (whether manned or unmanned) to collect and process these additional types and forms of data and to navigate the vehicle based on this data.
An autonomous or semi-autonomous vehicle fleet herein may include a plurality of autonomous or semi-autonomous vehicles operating autonomously or semi-autonomously, each autonomous or semi-autonomous vehicle in the fleet configured to monitor, collect, and report data while being capable of operating in an unstructured open or closed system, and a fleet management module for coordinating the fleet of autonomous or semi-autonomous vehicles, each autonomous or semi-autonomous vehicle including a transmission system, a power system, a navigation module, a sensor module, at least one communication module, and at least one processor configured to manage the transmission system, the power system, the navigation module, the sensor module, and the at least one communication module.
Fleet of autonomous vehicles
According to fig. 1, provided herein is a fleet of autonomous or semi-autonomous vehicles 100 comprising a plurality of autonomous or semi-autonomous vehicles 101.
In some embodiments, the fleet of autonomous or semi-autonomous vehicles 100 includes at least a first sub-fleet (including a first fleet autonomous or semi-autonomous vehicle 101a) and a second fleet (including a second fleet autonomous or semi-autonomous vehicle 101 b). Each sub-fleet may include 1, 2, 3, 4, 5, 10, 15, 20, 50, 100, 200, 300, 400, 500, 700, 1000, 2000, 3000, 5000, 10000, or more autonomous or semi-autonomous vehicles 101. Two or more sub-fleets may operate independently or simultaneously.
In a non-limiting example of the operation of a sub-fleet of autonomous or semi-autonomous vehicles, an independent survey company rents or leases a sub-fleet comprising 10 autonomous or semi-autonomous vehicles 101, which autonomous or semi-autonomous vehicles 101 are partially or fully dedicated to the survey company's tasks and/or services. The sub-fleet may include a plurality of "white tag" vehicles that display the survey company's logo.
Autonomous or semi-autonomous vehicles
As shown in fig. 2-5, an exemplary autonomous or semi-autonomous vehicle 101 may be configured for land travel. The vehicle 101 may be about 2 to 5 feet wide. For stability, the vehicle 101 may also exhibit a low mass, a low center of gravity, or both.
In some embodiments, the vehicle 101 is configured to enable human interaction and/or override by a user or fleet operator. The autonomous or semi-autonomous vehicle 101 may be configured to allow the fleet operator direct control of the processors, transmissions, or sensors therein. This direct control may allow the vehicle 101 to be safely returned to a base station for maintenance. In some embodiments, vehicle 101 includes a plurality of securable compartments 102 configured for transporting goods or equipment.
Further, each autonomous or semi-autonomous vehicle 101 may include a transmission system configured to propel the autonomous or semi-autonomous vehicle 101. The transmission system may include an engine, a wheel, a tread, a wing, a rotor, a blower, a rocket, a propeller, a brake, a transmission, or any combination thereof. The transmission system may also include a power system configured to provide and/or store energy required to propel the autonomous or semi-autonomous vehicle 101.
According to fig. 3, a vehicle 101 may comprise a storage compartment 102. In some embodiments, the storage compartments 102 include 1, 2, 3, 4, 5, 6, 7, 8, 10, 15, 20, or more compartments 102. In some embodiments, storage compartments 102 comprise nested storage compartments, wherein a sub-storage compartment is located within another storage compartment 102. In some embodiments, the storage compartment 102 may be configured to carry a particular item or range of items. In some embodiments, the storage compartment 102 is configured to house a secondary autonomous or semi-autonomous vehicle.
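The nested storage compartments described above might be modeled as a simple recursive structure. This representation is an illustrative assumption, not a structure specified in the disclosure:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Compartment:
    """A securable compartment that may nest sub-compartments (illustrative)."""
    compartment_id: str
    contents: Optional[str] = None
    sub_compartments: List["Compartment"] = field(default_factory=list)

    def count(self) -> int:
        """Total number of compartments, including nested sub-compartments."""
        return 1 + sum(c.count() for c in self.sub_compartments)

# A bay holding two sub-compartments, one carrying a package.
bay = Compartment("bay-1", sub_compartments=[
    Compartment("bay-1a", contents="package"),
    Compartment("bay-1b"),
])
print(bay.count())  # 3
```

The same structure could hold a secondary vehicle by treating it as the `contents` of a sufficiently large compartment.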
Furthermore, according to fig. 3, the vehicle 101 may comprise sensors. The sensors may include one or more of still image cameras, video cameras, LiDAR, RADAR, ultrasound sensors, microphones, altimeters, and depth detectors. In some embodiments, the sensor 301 comprises a transmission system sensor configured to monitor at least one of performance and speed of the transmission system. The transmission system sensors may be configured to monitor power levels (e.g., battery, solar, gasoline, propane, etc.) or to monitor transmission performance (e.g., transmission, tires, pedals, brakes, rotors, blowers, propellers, etc.). In some embodiments, the sensor system is configured to monitor the surroundings of the vehicle 101 and collect data about the unstructured open or closed environment. Further, each vehicle 101 may include an internal processor for navigation and obstacle avoidance.
The vehicle may be configured for land use. In some embodiments, the vehicle comprises an automobile, a van, a tricycle, a truck, a trailer, a bus, a rail vehicle, a train, a tram, a boat, a ship, a watercraft, a ferry, a landing craft, a barge, a raft, an aerial drone, an aerial hovercraft, a land hovercraft, a water hovercraft, an airplane, a spacecraft, or any combination thereof. In some embodiments, the vehicle comprises a marine vehicle, wherein the transmission system comprises a gas engine, a turbine engine, an electric motor, a hybrid gas/electric engine, a propeller, a jet, or any combination thereof. In some embodiments, the vehicle comprises a hovercraft vehicle, wherein the transmission system comprises a blower, a gas engine, a turbine engine, an electric motor, a hybrid gas/electric engine, a propeller, or any combination thereof. In some embodiments, the vehicle comprises an aerial vehicle, wherein the transmission system comprises a wing, a rotor, a blower, a rocket, a propeller, a gas engine, a turbine engine, an electric motor, a hybrid gas/electric motor, or any combination thereof.
In some embodiments, the vehicle comprises a land vehicle having a maximum speed of about 13 miles per hour (mph) to about 90 mph. In some embodiments, the vehicle comprises a marine vehicle having a maximum speed of about 1 mph to about 45 mph. In some embodiments, the vehicle comprises a land or water hovercraft vehicle having a maximum speed of about 1 mph to about 60 mph. In some embodiments, the vehicle comprises an aircraft (e.g., an aerial drone or an aerial hovercraft) having a maximum speed of about 1 mph to about 90 mph.
In some embodiments, the vehicle is configured with a forward speed mode, a reverse mode, and a park mode. In some embodiments, the vehicle has a speed of about 13 mph to about 100 mph. Each land vehicle may also be configured to operate within a particular speed range to accommodate a particular surrounding environment. The particular surrounding environment may include, for example, slow-moving traffic, foot traffic, vehicle traction, automatic parking, reverse driving, weather conditions, bike lanes, city traffic, rural traffic, residential traffic, local road traffic, state road traffic, and interstate traffic. The surroundings of each vehicle can be determined by on-board or remote sensors and software. In some cases, for example, if the on-board navigation map and the sensors provide conflicting information, safety measures may be taken to further reduce the speed.
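The environment-dependent speed limiting and the conflict fallback described above could be sketched as follows. This is a minimal illustrative sketch, not the patent's implementation: the environment names, the specific caps, and the creep fallback value are all assumptions chosen for illustration.

```python
# Hypothetical sketch: select a speed cap from the detected surrounding
# environment, with a safety fallback when the on-board map and the
# sensors disagree. Names and limits are illustrative assumptions.

# Illustrative speed caps (mph) per detected environment.
ENVIRONMENT_SPEED_CAPS = {
    "foot_traffic": 3,
    "bike_lane": 13,
    "residential": 25,
    "city_traffic": 35,
    "rural": 55,
    "state_road": 65,
}

def select_speed_cap(environments, map_agrees_with_sensors=True):
    """Return the most restrictive cap among all detected environments.

    If the navigation map and the sensor suite report conflicting
    surroundings, reduce the cap further as a safety measure.
    """
    caps = [ENVIRONMENT_SPEED_CAPS[e] for e in environments]
    cap = min(caps) if caps else 13  # no detection: default to a conservative cap
    if not map_agrees_with_sensors:
        cap = min(cap, 3)  # conflicting information: slow to a creep
    return cap
```

The key design choice in this sketch is that the *most restrictive* detected environment always wins, so a vehicle on a state road that also detects foot traffic is capped at the foot-traffic speed.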
In some embodiments, the vehicle may respond to one or more ambient conditions by entering a "full stop," "creep," or "crawl" mode. These modes may be enabled for navigation in emergency situations, during automatic parking, in the presence of other vehicles, or when preparing to park.
In some embodiments, at least one of the "full stop," "creep," or "crawl" modes includes a speed of about 0.01 mph to about 13 mph. In some embodiments, at least one of the "full stop," "creep," or "crawl" modes includes a speed of at least about 0.01 mph. In some embodiments, at least one of the "full stop," "creep," or "crawl" modes includes a speed of at most about 13 mph. In some embodiments, at least one of the "full stop," "creep," or "crawl" modes includes a speed of about 0.01 mph to about 0.05 mph, about 0.01 mph to about 0.1 mph, about 0.01 mph to about 0.5 mph, about 0.01 mph to about 1 mph, about 0.01 mph to about 2 mph, about 0.01 mph to about 3 mph, about 0.01 mph to about 4 mph, about 0.01 mph to about 5 mph, about 0.01 mph to about 8 mph, about 0.01 mph to about 11 mph, about 0.01 mph to about 13 mph, about 0.05 mph to about 0.1 mph, about 0.05 mph to about 0.5 mph, about 0.05 mph to about 1 mph, about 0.05 mph to about 2 mph, about 0.05 mph to about 3 mph, about 0.05 mph to about 4 mph, about 0.05 mph to about 5 mph, about 0.05 mph to about 8 mph, about 0.05 mph to about 11 mph, about 0.05 mph to about 13 mph, about 0.1 mph to about 0.5 mph, about 0.1 mph to about 1 mph, about 0.1 mph to about 2 mph, about 0.1 mph to about 3 mph, about 0.1 mph to about 4 mph, about 0.1 mph to about 5 mph, about 0.1 mph to about 8 mph, about 0.1 mph to about 11 mph, about 0.1 mph to about 13 mph, about 0.5 mph to about 1 mph, about 0.5 mph to about 2 mph, about 0.5 mph to about 3 mph, about 0.5 mph to about 4 mph, about 0.5 mph to about 5 mph, about 0.5 mph to about 8 mph, about 0.5 mph to about 11 mph, about 0.5 mph to about 13 mph, about 1 mph to about 2 mph, about 1 mph to about 3 mph, about 1 mph to about 4 mph, about 1 mph to about 5 mph, about 1 mph to about 8 mph, about 1 mph to about 11 mph, about 1 mph to about 13 mph, about 2 mph to about 3 mph, about 2 mph to about 4 mph, about 2 mph to about 5 mph, about 2 mph to about 8 mph, about 2 mph to about 11 mph, about 2 mph to about 13 mph, about 3 mph to about 4 mph, about 3 mph to about 5 mph, about 3 mph to about 8 mph, about 3 mph to about 11 mph, about 3 mph to about 13 mph, about 4 mph to about 5 mph, about 4 mph to about 8 mph, about 4 mph to about 11 mph, about 4 mph to about 13 mph, about 5 mph to about 8 mph, about 5 mph to about 11 mph, about 5 mph to about 13 mph, about 8 mph to about 11 mph, about 8 mph to about 13 mph, or about 11 mph to about 13 mph. In some embodiments, at least one of the "full stop," "creep," or "crawl" modes includes a speed of about 0.01 mph, about 0.05 mph, about 0.1 mph, about 0.5 mph, about 1 mph, about 2 mph, about 3 mph, about 4 mph, about 5 mph, about 8 mph, about 11 mph, or about 13 mph.
In one exemplary embodiment, a land vehicle is configured with a conventional four-wheel vehicle configuration that includes a steering and braking system. The drive may be a two-wheel drive or a four-wheel all terrain traction drive, and the propulsion system may include a gas engine, a turbine engine, an electric motor, a hybrid gas/electric motor, or any combination thereof. The autonomous or semi-autonomous vehicle may additionally include an auxiliary solar power system for providing backup emergency power or power for a secondary low power subsystem.
In another exemplary embodiment, the watercraft is configured to monitor, collect and report data in public waterways, canals, dams and lakes. Thus, the watercraft can monitor and report the conditions of the flooded area, and/or collect water samples.
Alternatively, in some embodiments, when the autonomous or semi-autonomous vehicle operates in an unstructured open environment, a large storage compartment may house an automatically deployable remote autonomous or semi-autonomous vehicle.
Fleet management module
According to fig. 6, a system for fleet management is provided herein, which includes a fleet management module 601, a central server 602, vehicles 604, customers 603, and service providers 605. In some embodiments, the fleet management module 601 coordinates and dispatches tasks and monitors the position of each of the plurality of vehicles 604 in the fleet. The fleet management module 601 may coordinate the vehicles 604 in the fleet to monitor and collect data about the unstructured open or closed environment and report to the service provider 605. As shown, the fleet management module 601 may coordinate with a central server 602. The central server 602 may be located in a central operating facility owned or managed by the fleet owner. The service provider 605 may include a third-party provider of goods or services.
In one example, the customer's 603 order is sent to the central server 602, and the central server 602 then communicates with the fleet management module 601, and the fleet management module 601 forwards the order to the service provider 605 associated with the order and the vehicle 604. The fleet management module 601 may employ one or more vehicles 604 or sub-fleet vehicles proximate to the service provider 605, the customer 603, or both. The assigned service provider then interacts with the vehicle 604 through a service provider application to provide any goods, maps, or instructions associated with the order to the vehicle 604. The vehicle 604 then travels to the customer 603 and reports completion of the order to at least one of the customer 603, the service provider 605, the central server 602, and the fleet management module 601.
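The dispatch flow above (order arrives, a provider is assigned, and a nearby vehicle is employed) could be sketched as follows. This is a hedged illustration, not the patent's method: the data layout, the `dispatch_order` name, and the straight-line distance metric are assumptions chosen for clarity.

```python
import math

# Hypothetical sketch of the dispatch flow: an order is matched to its
# service provider, then to the idle vehicle closest to that provider.
# Field names and the distance metric are illustrative assumptions.

def _distance(a, b):
    """Straight-line distance between two (x, y) positions."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def dispatch_order(order, vehicles, providers):
    """Assign the idle vehicle nearest the order's service provider."""
    provider = providers[order["provider_id"]]
    idle = [v for v in vehicles if v["status"] == "idle"]
    if not idle:
        return None  # no vehicle available; the order waits
    vehicle = min(idle, key=lambda v: _distance(v["location"], provider["location"]))
    vehicle["status"] = "dispatched"
    return {"vehicle_id": vehicle["id"],
            "provider_id": provider["id"],
            "dropoff": order["customer_location"]}
```

In practice the module might also weigh proximity to the customer, battery level, or compartment capacity; the nearest-idle-vehicle rule here is only the simplest instance of the coordination described.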
In some embodiments, the vehicle 604 may operate on behalf of the service provider 605, with at least one of the central server 602 and the fleet management module 601 being operated by the service provider 605. In some embodiments, the vehicle 604 is controlled directly by the customer 603. In some embodiments, human interaction with the vehicle 604 may be required to address maintenance issues such as mechanical failures, electrical failures, or traffic accidents.
According to fig. 7, the fleet management module 701 instructs the processor 703 of an autonomous or semi-autonomous vehicle via the communication module 702. The processor 703 may be configured to send instructions and receive sensed data from the sensor system 706, and may further control at least one of the power system 707, the navigation module 705, and the transmission system 704. The processor 703 may also be configured to instruct the controller 708 to open the securable compartment 709 to release any content associated with the order.
In some embodiments, the processor 703 of the autonomous or semi-autonomous vehicle comprises at least one communication module 702, the communication module 702 being adapted to receive, store and transmit data to and from the user and fleet management module 701. In some embodiments, the data includes a schedule, a request or order, a current location, a delivery location, a service provider location, a route, an Estimated Time of Arrival (ETA), a relocation instruction, a vehicle condition, a vehicle speed, or any combination thereof.
In some embodiments, the communication module 702 is configured to receive, store, and transmit data to and from a user via a user application. In some embodiments, the user application includes a computer application, an internet application, a tablet application, a phone application, or any combination thereof. In some embodiments, the communication module 702 is configured to receive, store, and transmit data via wireless transmission (e.g., 4G, 5G, or satellite communication). In some embodiments, the wireless transmission occurs via: a central server, a fleet management module, a mesh network, or any combination thereof. In some embodiments, the user application is configured to send and receive data via electronic devices including phones, personal mobile devices, Personal Digital Assistants (PDAs), mainframe computers, desktop computers, laptop computers, tablet computers, and/or wearable computing devices including communication headsets, smart glasses, or combinations thereof.
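The kinds of data the communication module exchanges (current location, delivery location, ETA, vehicle condition, speed, and so on) could be carried in a status message like the sketch below. The field names and serialization are illustrative assumptions; the patent only lists the data categories, not a message format.

```python
from dataclasses import dataclass, asdict
from typing import Optional

# Illustrative sketch of a status message the communication module
# might send to the fleet management module over a wireless link
# (e.g., 4G/5G). Field names are assumptions for illustration.

@dataclass
class VehicleStatus:
    vehicle_id: str
    current_location: tuple          # (latitude, longitude)
    delivery_location: Optional[tuple] = None
    eta_minutes: Optional[float] = None
    speed_mph: float = 0.0
    condition: str = "ok"            # e.g., "ok", "maintenance_needed"

    def to_payload(self) -> dict:
        """Serialize the status for transmission as a plain dict."""
        return asdict(self)
```

A schema like this keeps the receiving side (central server, fleet management module, or user application) decoupled from the vehicle's internal representation.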
In some embodiments, the navigation module 705 controls the transmission system 704 to maneuver an autonomous or semi-autonomous vehicle through an unstructured open or closed environment. In some embodiments, the navigation module 705 includes a digital map, a street view photograph, a GPS spot, or any combination thereof. In some embodiments, the map is generated by a user, a consumer, a service provider, a fleet operator, an online repository, a public database, or any combination thereof. In some embodiments, the map is generated only for the expected operational geographic environment location. The map may be augmented by data obtained by the sensor system 706. The navigation module 705 may further implement data collected by the sensor system 706 to determine the location and/or surroundings of the autonomous or semi-autonomous vehicle. In some embodiments, the map further comprises navigation markers including lanes, landmarks, intersections, grades, or any combination thereof.
In some embodiments, the fleet management module 701 is configured to determine and predict geographic demand for autonomous or semi-autonomous vehicles, so that vehicles can be strategically deployed throughout a geographic area in anticipation of known demand. The fleet management module 701 may determine and predict geographic demand by storing data related to the locations, quantities, times, prices, items, item types, services, service types, and service providers of placed orders and requests, or any combination thereof. In addition, the service provider may supply independently measured trends to supplement or corroborate the measured trends. Thus, vehicles may be strategically deployed to reduce transit and idle time and to increase sales volume and efficiency.
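A minimal version of this demand prediction is to aggregate historical orders by zone and time of day and pre-position vehicles in the zones with the highest expected demand. The sketch below assumes a simple zone/hour labeling of past orders; the zoning scheme and function name are illustrative, not from the patent.

```python
from collections import Counter

# Hypothetical sketch: rank geographic zones by historical order volume
# for a given hour of day, so idle vehicles can be relocated ahead of
# expected demand. The zone/hour record format is an assumption.

def predict_hot_zones(order_history, hour, top_n=2):
    """Return up to top_n zones with the most past orders at this hour."""
    counts = Counter(o["zone"] for o in order_history if o["hour"] == hour)
    return [zone for zone, _ in counts.most_common(top_n)]
```

A production module would likely use time-dependent or machine-learning models over the same stored order data; this count-based ranking is only the simplest instance of the idea.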
Operating environment
In some embodiments, an unstructured open environment is an unrestricted geographic area reachable through navigable pathways, including: public roads, private roads, bicycle lanes, open land, open public land, open private land, sidewalks, lakes, rivers, or streams.
In some embodiments, the enclosed environment is a confined, enclosed or semi-enclosed structure accessible through a navigable pathway, including: a ground space within a commercial building, an airspace within a commercial building, an aisle, a corridor, a tunnel, a ramp, an elevator, a conveyor, or a sidewalk. The enclosed environment may or may not include internal structures or obstructions.
In some embodiments, the unstructured open environment is an unrestricted airspace or near-space environment in the earth's atmosphere, including the troposphere, the stratosphere, the mesosphere, the thermosphere, and the exosphere.
Primary autonomous or semi-autonomous vehicle and secondary autonomous or semi-autonomous vehicle
In some embodiments, a fleet of autonomous or semi-autonomous vehicles includes a plurality of primary vehicles and a plurality of secondary vehicles, wherein one or more secondary vehicles may be transported by or within the primary vehicles, and wherein the secondary vehicles may be separate from the primary vehicles.
In some embodiments, the secondary autonomous or semi-autonomous vehicle comprises a land-based autonomous or semi-autonomous vehicle, an aerial drone, or a boat. In some embodiments, the secondary autonomous or semi-autonomous vehicle is configured to be at most half the size of the primary autonomous or semi-autonomous vehicle. In some embodiments, the secondary vehicle is configured with the same travel speeds and modes as the primary autonomous or semi-autonomous vehicle. Alternatively, the secondary autonomous or semi-autonomous vehicle may be configured with some, if not all, of the same capabilities as the primary autonomous or semi-autonomous vehicle.
In some embodiments, the secondary autonomous or semi-autonomous vehicle is configured to perform secondary tasks, such as: obtaining a soil/water/air sample; acquiring a close-up photograph; accessing small or restricted areas (e.g., drain lines and tunnels); transporting an item from the location of the primary autonomous or semi-autonomous vehicle to a door, drop box, or other auxiliary location; transporting an item to the interior of a building; or any combination thereof.
In some embodiments, the secondary autonomous or semi-autonomous vehicle is transported in a compartment of, on top of, or on the back of the primary autonomous or semi-autonomous vehicle. In some embodiments, the secondary autonomous or semi-autonomous vehicle is tethered to the primary autonomous or semi-autonomous vehicle. In some embodiments, the secondary autonomous or semi-autonomous vehicle is configured for automatic self-extraction (auto-self-extraction) from the primary vehicle. In some embodiments, the primary vehicle includes a ramp, a platform, a lift, or any combination thereof to facilitate automatic self-extraction of the secondary autonomous or semi-autonomous vehicle.
Goods and services
In some embodiments, a fleet of autonomous or semi-autonomous vehicles herein is configured to receive and transport products and provide services to users. In some embodiments, the user comprises a fleet manager, a subcontracting provider, a service provider, a customer, a business entity, an individual, or a third party. In some embodiments, the autonomous or semi-autonomous vehicle fleet user is a city, county, state, or federal road authority. In some embodiments, the autonomous or semi-autonomous vehicle fleet user is a business entity that utilizes the fleet to survey and report on large amounts of property (indoor or outdoor). In some embodiments, a fleet of autonomous or semi-autonomous vehicles may be configured for safety monitoring.
In some embodiments, autonomous or semi-autonomous vehicles in a fleet may be configured to monitor and report weather and atmospheric conditions in any number of adverse environments. In some embodiments, the service includes routine inspection of roads, job sites, parking lots, and the like. In some embodiments, the service includes automatically generated, next-generation high-definition map information. In some embodiments, the service includes updating map information (number of lanes, lane boundary locations, crosswalk locations, curbs, and general road information useful for navigation) in real time. In some embodiments, the service may include routine updates on a schedule (e.g., multiple intervals per day, daily, weekly, monthly, or yearly). In some embodiments, the service includes updating map information (number of lanes, lane boundary locations, crosswalk locations, curbs, and general road information useful for navigation) at a frequency determined by the service contract. In some embodiments, the frequency is about once per week, about once per month, about once per quarter, about once per half year, about once per three quarters, about once per year, about once per 18 months, about once per two years, about once per three years, about once per four years, or about once per five years. In some embodiments, the service includes a land/terrain survey. In some embodiments, the service includes disaster area investigation and evaluation. In some embodiments, the service includes road condition surveys. In some embodiments, the service includes traffic surveys. In some embodiments, the service includes traffic light and signage surveys. In some embodiments, the service includes building or road infrastructure (e.g., bridge condition) measurements. In some embodiments, the service comprises an advertising service.
Controller and processor
In some embodiments, each autonomous or semi-autonomous vehicle in a fleet of autonomous or semi-autonomous vehicles is equipped with at least one processor configured with high level computing capabilities for processing and low level safety critical computing capabilities for controlling hardware. The at least one processor is configured to manage the transmission system, manage the powertrain system, manage the navigation module, manage various aspects of the sensor system, process and manage instructions from the fleet management module, and manage the at least one communication module.
In some embodiments, each autonomous or semi-autonomous vehicle in the fleet of autonomous or semi-autonomous vehicles is equipped with a software module executed by at least one processor to apply one or more algorithms to data collected from the plurality of sensors to identify, record, and store to a memory device one or more of: road and path conditions (damaged roads, potholes); road and path information, such as lane number, boundary position, curb position, road edge position, and pedestrian crossing position; traffic speed; traffic congestion; weather conditions; parking violations; utility problems; street lamp problems; traffic light problems; pedestrian density/traffic volume; animals; alternative vehicular traffic (e.g., motorcycles, mopeds, bicycles, wheelchairs, strollers); area surveillance; waterway conditions; bridge inspection; internal and external structural inspections; various types of field inspections; land survey results; and foliage inspection. In addition, the data collected from the plurality of sensors may include the current status of street lamps and traffic lights, including the color of traffic lights (to accumulate real-time data such as which lights are green) and confirmation of when street lamps are on.
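The algorithmic step above, sorting raw sensor detections into the record categories before they are stored to the memory device, could look like the following sketch. The category names and detection record format are illustrative assumptions; the patent lists the observation types but not a data model.

```python
# Hypothetical sketch: route time-stamped sensor detections into the
# report categories listed above (road condition, traffic-light state,
# street-lamp state, etc.) before storage. Names are assumptions.

CATEGORIES = {
    "pothole": "road_condition",
    "lane_boundary": "road_information",
    "traffic_light": "traffic_light_state",
    "street_lamp": "street_lamp_state",
}

def record_detections(detections):
    """Group detections by report category for storage to memory."""
    records = {}
    for d in detections:
        category = CATEGORIES.get(d["type"], "other")
        records.setdefault(category, []).append(
            {"timestamp": d["timestamp"], "details": d.get("details", {})})
    return records
```

Grouping at collection time keeps each category (e.g., real-time traffic-light colors) ready for upload as a coherent report rather than as an undifferentiated sensor stream.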
In some embodiments, the data stored to the memory device is uploadable, either wirelessly to a fleet administrator, or via a wired or wireless download when the autonomous or semi-autonomous vehicle returns to a fleet terminal. Once uploaded or downloaded to the fleet administrator, the data can be processed appropriately.
In some embodiments, the data stored to the memory device may be uploaded to a local or central server distinct from the fleet administrator (e.g., the autonomous or semi-autonomous vehicle returns to base, uploads its data to a processing server, the processing server processes the data, and then provides the resulting processed data directly to a customer, a business, or the fleet of autonomous or semi-autonomous vehicles, etc.).
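The two upload paths just described, wireless to the fleet administrator in the field versus wired to a processing server at the terminal, imply a simple routing decision. The sketch below is a hedged illustration; the route names and the payload-size threshold are assumptions, not values from the patent.

```python
# Hypothetical sketch: choose an upload route for collected sensor
# data based on where the vehicle is and how much data it holds.
# Endpoint names and the size threshold are illustrative assumptions.

def choose_upload_route(at_terminal, wired_available, payload_mb):
    """Pick an upload route for collected sensor data.

    At the terminal with a wired link, prefer the high-bandwidth wired
    download to the processing server. In the field, small payloads go
    out wirelessly; large ones wait for the next terminal visit.
    """
    if at_terminal and wired_available:
        return "wired_to_processing_server"
    if payload_mb <= 50:  # illustrative threshold for wireless upload
        return "wireless_to_fleet_administrator"
    return "defer_until_terminal"
```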
Digital processing device
In some embodiments, the platforms, systems, media, and methods described herein comprise a digital processing device or use thereof. In further embodiments, the digital processing device includes one or more hardware central processing units (CPUs) or general purpose graphics processing units (GPGPUs) that perform the device's functions. In still further embodiments, the digital processing device further comprises an operating system configured to execute executable instructions. In some embodiments, the digital processing device is optionally connected to a computer network. In further embodiments, the digital processing device is optionally connected to the internet such that it accesses the world wide web. In still further embodiments, the digital processing device is optionally connected to a cloud computing infrastructure. In other embodiments, the digital processing device is optionally connected to an intranet. In other embodiments, the digital processing device is optionally connected to a data storage device.
Suitable digital processing devices include, by way of non-limiting examples, server computers, desktop computers, laptop computers, notebook computers, sub-notebook computers, netbook computers, netpad computers, set-top box computers, media streaming devices, handheld computers, internet appliances, mobile smartphones, tablet computers, personal digital assistants, video game consoles, and vehicles, in accordance with the description herein. Those skilled in the art will recognize that many smartphones are suitable for use in the system described herein. Those skilled in the art will also recognize that select televisions, video players, and digital music players with optional computer network connectivity are suitable for use in the system described herein. Suitable tablet computers include those with booklet, slate, and convertible configurations known to those skilled in the art.
In some embodiments, the digital processing device includes an operating system configured to execute executable instructions. An operating system is, for example, software, including programs and data, that manages the device's hardware and provides services for the execution of applications. Those skilled in the art will recognize that suitable server operating systems include, by way of non-limiting examples, FreeBSD, OpenBSD, NetBSD, Linux, Apple Mac OS X Server, Oracle Solaris, Windows Server, and Novell NetWare. Those skilled in the art will recognize that suitable personal computer operating systems include, by way of non-limiting examples, Microsoft Windows, Apple Mac OS X, UNIX, and UNIX-like operating systems such as GNU/Linux. In some embodiments, the operating system is provided by cloud computing. Those skilled in the art will also recognize that suitable mobile smartphone operating systems include, by way of non-limiting examples, Nokia Symbian OS, Apple iOS, Research In Motion BlackBerry OS, Google Android, Microsoft Windows Phone OS, Microsoft Windows Mobile OS, Linux, and Palm WebOS. Those skilled in the art will also recognize that suitable media streaming device operating systems include, by way of non-limiting examples, Apple TV, Roku, Boxee, Google TV, Google Chromecast, Amazon Fire, and Samsung HomeSync. Those skilled in the art will also recognize that suitable video game console operating systems include, by way of non-limiting examples, Sony PS3, Sony PS4, Microsoft Xbox 360, Microsoft Xbox One, Nintendo Wii, Nintendo Wii U, and Ouya.
in some embodiments, the device includes a storage and/or memory device. A storage and/or memory device is one or more physical means for temporarily or permanently storing data or programs. In some embodiments, the device is a volatile memory and requires power to maintain the stored information. In some embodiments, the device is a non-volatile memory and retains stored information when the digital processing device is not powered. In a further embodiment, the non-volatile memory comprises flash memory. In some embodiments, the non-volatile memory comprises Dynamic Random Access Memory (DRAM). In some embodiments, the non-volatile memory comprises Ferroelectric Random Access Memory (FRAM). In some embodiments, the non-volatile memory includes phase change random access memory (PRAM). In other embodiments, the device is a storage device, including (as non-limiting examples) CD-ROMs, DVDs, flash memory devices, disk drives, tape drives, optical disk drives, and cloud-based storage. In further embodiments, the storage and/or memory devices are a combination of devices such as those disclosed herein.
In some embodiments, the digital processing device includes a display that sends visual information to the user. In some embodiments, the display is a liquid crystal display (LCD). In a further embodiment, the display is a thin film transistor liquid crystal display (TFT-LCD). In some embodiments, the display is an organic light emitting diode (OLED) display. In various further embodiments, the OLED display is a passive-matrix OLED (PMOLED) or active-matrix OLED (AMOLED) display. In some embodiments, the display is a plasma display. In other embodiments, the display is a video projector. In other embodiments, the display is a head-mounted display in communication with the digital processing device, such as a VR headset. In further embodiments, suitable VR headsets include, by way of non-limiting examples, HTC Vive, Oculus Rift, Samsung Gear VR, Microsoft HoloLens, Razer OSVR, FOVE VR, Zeiss VR One, Avegant Glyph, Freefly VR headsets, and the like. In still further embodiments, the display is a combination of devices such as those disclosed herein.
In some embodiments, the digital processing device includes an input device for receiving information from a user. In some embodiments, the input device is a keyboard. In some embodiments, the input device is a pointing device including, by way of non-limiting examples, a mouse, a trackball, a trackpad, a joystick, a game controller, or a stylus. In some embodiments, the input device is a touch screen or a multi-touch screen. In other embodiments, the input device is a microphone for capturing voice or other sound input. In other embodiments, the input device is a video camera or other sensor for capturing motion or visual input. In further embodiments, the input device is a Kinect, Leap Motion, or the like. In still further embodiments, the input device is a combination of devices such as those disclosed herein.
Referring to fig. 8, in certain embodiments, a digital processing device 801 is programmed or otherwise configured to manage an autonomous or semi-autonomous vehicle. In this embodiment, the digital processing device 801 includes a central processing unit (CPU, also referred to herein as a "processor" and "computer processor") 805, which may optionally be a single-core processor, a multi-core processor, or multiple processors for parallel processing. The digital processing device 801 also includes memory or a memory location 810 (e.g., random access memory, read-only memory, flash memory), an electronic storage unit 815 (e.g., hard disk), a communication interface 820 (e.g., network adapter) for communicating with one or more other systems, and peripheral devices 825 (e.g., cache, other memory, data storage, and/or electronic display adapter). The memory 810, the storage unit 815, the interface 820, and the peripheral devices 825 communicate with the CPU 805 through a communication bus (solid lines), such as a motherboard. The storage unit 815 comprises a data storage unit (or data repository) for storing data. The digital processing device 801 is optionally operatively coupled to a computer network ("network") 830 via the communication interface 820. In various instances, the network 830 is the internet, an extranet, and/or an intranet or extranet in communication with the internet. In some cases, the network 830 is a telecommunications and/or data network. The network 830 optionally includes one or more computer servers implementing distributed computing (e.g., cloud computing). In some cases, the network 830, with the aid of the device 801, implements a peer-to-peer network, which enables devices coupled to the device 801 to act as clients or servers.
With continued reference to fig. 8, CPU 805 is configured to execute sequences of machine-readable instructions embodied in programs, applications, and/or software. The instructions may optionally be stored in a memory location (e.g., memory 810). Instructions are directed to CPU 805 which then programs or otherwise configures CPU 805 to implement the methods of the present disclosure. Examples of operations performed by CPU 805 include fetch, decode, execute, and write back. In some cases, CPU 805 is part of a circuit (e.g., an integrated circuit). One or more other components of the device 801 may optionally be included in the circuitry. In some cases, the circuit is an Application Specific Integrated Circuit (ASIC) or a Field Programmable Gate Array (FPGA).
With continued reference to fig. 8, the storage unit 815 optionally stores files such as drivers, libraries, and saved programs. The storage unit 815 optionally stores user data such as user preferences and user programs. In some cases, digital processing device 801 includes one or more additional data storage units external, for example, on a remote server communicating over an intranet or the internet.
With continued reference to fig. 8, the digital processing device 801 optionally communicates with one or more remote computer systems via the network 830. For example, the device 801 optionally communicates with a remote computer system of the user. Examples of remote computer systems include personal computers (e.g., portable PCs), slate or tablet PCs (e.g., Apple iPad, Samsung Galaxy Tab, etc.), smartphones (e.g., Apple iPhone, Android-enabled devices, etc.), or personal digital assistants.
The methods described herein are optionally implemented by machine- (e.g., computer processor-) executable code stored on an electronic storage location (e.g., the memory 810 or the electronic storage unit 815) of the digital processing device 801. Alternatively, the machine-executable or machine-readable code is provided in the form of software. During use, the code is executed by the processor 805. In some cases, the code is retrieved from the storage unit 815 and stored on the memory 810 for ready access by the processor 805. In some cases, the electronic storage unit 815 is precluded, and the machine-executable instructions are stored on the memory 810.
Non-transitory computer readable storage medium
In some embodiments, the platforms, systems, media, and methods disclosed herein include one or more non-transitory computer-readable storage media encoded with a program comprising instructions executable by the operating system of an optionally networked digital processing device. In further embodiments, the computer-readable storage medium is a tangible component of the digital processing device. In still further embodiments, the computer-readable storage medium is optionally removable from the digital processing device. In some embodiments, computer-readable storage media include, by way of non-limiting examples, CD-ROMs, DVDs, flash memory devices, solid state memory, magnetic disk drives, magnetic tape drives, optical disk drives, cloud computing systems and services, and the like. In some cases, programs and instructions are permanently, substantially permanently, semi-permanently, or non-transitorily encoded on the media.
Computer program
In some embodiments, the platforms, systems, media and methods disclosed herein comprise at least one computer program or use thereof. The computer program includes a sequence of instructions executable in the CPU of the digital processing apparatus and written to perform specified tasks. Computer readable instructions may be implemented as program modules, e.g., functions, objects, Application Programming Interfaces (APIs), data structures, etc., that perform particular tasks or implement particular abstract data types. Based on the disclosure provided herein, one of ordinary skill in the art will recognize that the computer program may be written in various versions of various languages.
The functionality of the computer readable instructions may be combined or distributed as desired in various environments. In some embodiments, the computer program comprises a sequence of instructions. In some embodiments, a computer program comprises a plurality of sequences of instructions. In some embodiments, the computer program is provided from one location. In other embodiments, the computer program is provided from multiple locations. In various embodiments, the computer program includes one or more software modules. In various embodiments, the computer program comprises, in part or in whole, one or more web applications, one or more mobile applications, one or more standalone applications, one or more web browser plug-ins, extensions, extras, or additions, or a combination thereof.
Web application
In some embodiments, the computer program comprises a web application. In light of the disclosure provided herein, one of ordinary skill in the art will recognize that, in various embodiments, web applications utilize one or more software frameworks and one or more database systems. In some embodiments, the web application is created upon a software framework such as Microsoft .NET or Ruby on Rails (RoR). In some embodiments, the web application utilizes one or more database systems including, as non-limiting examples, relational, non-relational, object-oriented, associative, and XML database systems. In further embodiments, suitable relational database systems include, as non-limiting examples, Microsoft SQL Server, MySQL, and Oracle. Those skilled in the art will also recognize that, in various embodiments, web applications are written in one or more versions of one or more languages. A web application may be written in one or more markup languages, presentation definition languages, client-side scripting languages, server-side coding languages, database query languages, or a combination thereof. In some embodiments, the web application is written to some extent in a markup language such as Hypertext Markup Language (HTML), Extensible Hypertext Markup Language (XHTML), or Extensible Markup Language (XML). In some embodiments, the web application is written to some extent in a presentation definition language such as Cascading Style Sheets (CSS). In some embodiments, the web application is written to some extent in a client-side scripting language such as Asynchronous Javascript and XML (AJAX), Flash ActionScript, Javascript, or Silverlight. In some embodiments, the web application is written to some extent in a server-side coding language such as Active Server Pages (ASP), ColdFusion, Perl, Java, JavaServer Pages (JSP), Hypertext Preprocessor (PHP), Python, Ruby, Tcl, Smalltalk, WebDNA, or Groovy.
In some embodiments, the web application is written to some extent in a database query language such as Structured Query Language (SQL). In some embodiments, the web application integrates enterprise server products such as IBM Lotus Domino. In some embodiments, the web application includes a media player element. In various further embodiments, the media player element utilizes one or more of many suitable multimedia technologies including, as non-limiting examples, Adobe Flash, HTML 5, Apple QuickTime, Microsoft Silverlight, Java, and Unity.
Referring to FIG. 9, in certain embodiments, the application provision system includes one or more databases 900 accessed by a relational database management system (RDBMS). Suitable RDBMSs include Firebird, MySQL, PostgreSQL, SQLite, Oracle Database, Microsoft SQL Server, IBM DB2, IBM Informix, SAP Sybase, Teradata, and the like. In this embodiment, the application provision system further includes one or more application servers 920 (e.g., Java servers, .NET servers, PHP servers, etc.) and one or more web servers 930 (e.g., Apache, IIS, GWS, etc.). The web server(s) optionally expose one or more web services via an Application Programming Interface (API) 940. Via a network, such as the Internet, the system provides browser-based and/or mobile native user interfaces.
Referring to FIG. 10, in a particular embodiment, the application provision system alternatively has a distributed, cloud-based architecture 1000 and includes elastically load-balanced, auto-scaling web server resources 1010, application server resources 1020, and synchronously replicated databases 1030.
Mobile application
In some embodiments, the computer program comprises a mobile application provided to the mobile digital processing device. In some embodiments, the mobile application is provided to the mobile digital processing device at the time of manufacture. In other embodiments, the mobile application is provided to the mobile digital processing device via a computer network as described herein.
In light of the disclosure provided herein, mobile applications are created, by techniques known to those skilled in the art, using hardware, languages, and development environments known in the art. Those skilled in the art will recognize that mobile applications are written in several languages. Suitable programming languages include, as non-limiting examples, C, C++, C#, Objective-C, Java, Javascript, Pascal, Object Pascal, Python, Ruby, VB.NET, WML, and XHTML/HTML with or without CSS, or combinations thereof.
Suitable mobile application development environments are available from several sources. Commercially available development environments include, as non-limiting examples, AirplaySDK, alcheMo, Appcelerator, Celsius, Bedrock, Flash Lite, .NET Compact Framework, Rhomobile, and WorkLight Mobile Platform. Other development environments are available cost-free, including, as non-limiting examples, Lazarus, MobiFlex, MoSync, and Phonegap. In addition, mobile device manufacturers distribute software developer kits including, as non-limiting examples, the iPhone and iPad (iOS) SDK, Android SDK, BlackBerry SDK, BREW SDK, Palm OS SDK, Symbian SDK, webOS SDK, and Windows Mobile SDK.
Those skilled in the art will recognize that several commercial forums are available for the distribution of mobile applications, including, as non-limiting examples, the Apple App Store, Google Play, Chrome Web Store, BlackBerry App World, App Store for Palm devices, App Catalog for webOS, Windows Marketplace for Mobile, Ovi Store for Nokia devices, Samsung Apps, and Nintendo DSi Shop.
independent application
In some embodiments, the computer program comprises a standalone application, which is a program that runs as an independent computer process, not as an add-on to an existing process, e.g., not a plug-in. Those skilled in the art will recognize that standalone applications are often compiled. A compiler is a computer program that transforms source code written in a programming language into binary object code such as assembly language or machine code. Suitable compiled languages include, as non-limiting examples, C, C++, Objective-C, COBOL, Delphi, Eiffel, Java, Lisp, Python, Visual Basic, and VB.NET, or combinations thereof. Compilation is often performed, at least in part, to create an executable program. In some embodiments, the computer program includes one or more executable compiled applications.
Platform for distribution and navigation of autonomous or semi-autonomous fleets of vehicles
According to fig. 11, a platform for distributing and navigating an autonomous or semi-autonomous fleet of vehicles over multiple paths is provided herein, the platform including a fleet of vehicles and a server processor configured to provide a server application 1120.
The fleet may include a plurality of autonomous or semi-autonomous vehicles 1110. Each autonomous or semi-autonomous vehicle 1110 may include an autonomous or semi-autonomous propulsion system 1111, a position sensor 1112, a condition sensor 1113, and a communication device 1114.
The position sensor 1112 may be configured to measure a current vehicle position of the vehicle 1110. The current vehicle location may include a street address, GPS coordinates, proximity to a set location, or any combination thereof. The location sensor 1112 may include a camera, a camcorder, LiDAR, RADAR, microphone, GPS sensor, or any combination thereof. The condition sensors 1113 may be configured to measure a current vehicle state. The communication device 1114 may be configured to transmit the current vehicle location and the current vehicle state. In some embodiments, the current vehicle state comprises a vehicle power level, a vehicle reserve, a vehicle hardware state, or any combination thereof.
The server application 1120 may include a database 1121, a communication module 1122, a scheduling module 1123, and a navigation module 1124. The database 1121 may include a map of the plurality of paths. The map may include a plurality of GPS points, a plurality of addresses, a plurality of streets, a plurality of locations of interest, or any combination thereof. A path may include a road, an expressway, a highway, a sidewalk, a walkway, a bridge, a tunnel, a pedestrian area, a market, a yard, or any combination thereof. Each path may be associated with path parameters. The path parameters may include an autonomous driving safety factor and a speed coefficient. In some embodiments, the speed coefficient includes a speed limit, an average speed, a time-dependent average speed, a number of intersections, a number of turns, a type of turn, an accident indicator, a parked vehicle indicator, a number of lanes, a one-way street indicator, a cellular reception parameter, a road grade, a maximum road grade, an average pedestrian density, a maximum pedestrian density, a minimum pedestrian density, a time-dependent pedestrian density, an average rider density, an unprotected turn parameter, a road flatness parameter, a road visibility parameter, or any combination thereof.
In some embodiments, the autonomous driving safety factor includes a speed limit, an average speed, a time-dependent average speed, a number of intersections, a number of turns, a type of turn, an accident indicator, a parked vehicle indicator, a number of lanes, a one-way street indicator, a cellular reception parameter, a road grade, a maximum road grade, an average pedestrian density, a maximum pedestrian density, a minimum pedestrian density, a time-dependent pedestrian density, an average rider density, an unprotected turn parameter, a road flatness parameter, a road visibility parameter, or any combination thereof.
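For illustration only (the patent does not define a data model), the path parameters described above might be kept in a per-path record from which a speed coefficient is derived; the field names and scoring formula below are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class PathParameters:
    """Hypothetical per-path record holding a few of the parameters listed above."""
    speed_limit: float             # km/h
    average_speed: float           # km/h; could be made time-dependent
    num_intersections: int
    num_lanes: int
    unprotected_turns: int
    avg_pedestrian_density: float  # pedestrians per 100 m of path

def speed_coefficient(p: PathParameters) -> float:
    """Toy speed coefficient in (0, 1]: rewards a high average speed relative
    to the limit and penalizes intersections that force the vehicle to slow."""
    return (p.average_speed / p.speed_limit) / (1.0 + 0.1 * p.num_intersections)

path = PathParameters(speed_limit=50.0, average_speed=40.0, num_intersections=4,
                      num_lanes=2, unprotected_turns=1, avg_pedestrian_density=3.0)
coeff = speed_coefficient(path)  # (40/50) / 1.4 ≈ 0.571
```

In a real deployment each parameter enumerated in the claims would become a field, and the weighting would be fitted from fleet data rather than hard-coded.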
The communication module 1122 may receive the current vehicle position and the current vehicle state. The communication module 1122 may receive the current vehicle position and the current vehicle state from the communication device 1114.
The scheduling module 1123 may assign one or more of the plurality of autonomous or semi-autonomous vehicles 1110 to a task destination. The task destination may include a GPS point, an address, a street, or any combination thereof. The task destination may be an order shipping address associated with the customer, the order, or both. The task destination may be an order pickup address associated with the supplier, the service provider, the order, or both. The vehicle 1110 may pick up items from the consumer at the task location, bring items to the task location and drop them off, or both. The scheduling module 1123 may assign one or more of the plurality of autonomous or semi-autonomous vehicles 1110 to the task destination based at least on the current vehicle location and the current vehicle state. The scheduling module 1123 may dispatch the vehicle 1110 whose current location has the shortest point-to-point distance from the task destination, the shortest driving distance from the task destination, the shortest driving time, or any weighted combination thereof. The scheduling module 1123 may dispatch the vehicle 1110 whose current vehicle state includes the highest stored propulsion energy. The scheduling module 1123 may dispatch the vehicle 1110 whose current vehicle state includes the lowest power level. The scheduling module 1123 may dispatch the vehicle 1110 whose current vehicle state includes the highest power level. The scheduling module 1123 may dispatch a vehicle 1110 whose vehicle hardware status reports no problems. The scheduling module 1123 may dispatch a vehicle 1110 having a vehicle reserve associated with the order.
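The dispatch heuristics above can be sketched as a single weighted score; this is a hedged illustration, with hypothetical field names and weights, and a straight-line (haversine) distance standing in for driving distance:

```python
import math

def haversine(a, b):
    """Great-circle distance in km between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * math.asin(math.sqrt(h))

def dispatch(vehicles, destination, w_distance=1.0, w_power=0.05):
    """Pick the eligible vehicle minimizing weighted distance minus weighted
    power level; vehicles reporting hardware problems are excluded."""
    eligible = [v for v in vehicles if v["hardware_ok"]]
    return min(eligible,
               key=lambda v: w_distance * haversine(v["location"], destination)
                             - w_power * v["power_level"])

fleet = [
    {"id": "v1", "location": (37.78, -122.41), "power_level": 80, "hardware_ok": True},
    {"id": "v2", "location": (37.70, -122.48), "power_level": 95, "hardware_ok": True},
    {"id": "v3", "location": (37.79, -122.40), "power_level": 60, "hardware_ok": False},
]
chosen = dispatch(fleet, destination=(37.79, -122.40))  # v1: nearby and charged
```

Note that v3 sits at the destination itself but is skipped because its hardware status reports a problem, matching the eligibility rule in the text.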
The navigation module 1124 may apply a route calculation algorithm to determine a vehicle mission route from a current vehicle location to a mission destination. The navigation module 1124 may apply a route calculation algorithm to determine a vehicle mission route from a current vehicle location to a mission destination based at least on the path parameters and the current vehicle state. The navigation module 1124 may apply a route calculation algorithm to determine a vehicle mission route from a current vehicle location to a mission destination to maximize an autonomous driving safety factor, a speed coefficient, or both. In some embodiments, the route calculation algorithm includes a machine learning algorithm, a rule-based algorithm, or both. The vehicle mission route may include at least a portion of one of a plurality of paths.
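One way a rule-based route calculation algorithm could trade off the two path parameters (a sketch under stated assumptions, not the platform's actual algorithm): treat each path as a graph edge carrying a safety factor and a speed coefficient in [0, 1], turn the maximization into a shortest-path problem, and run Dijkstra's algorithm. The graph and weights below are invented for the example:

```python
import heapq

def route(graph, start, goal, w_safety=0.7, w_speed=0.3):
    """Dijkstra over path segments. Each edge is a (safety, speed) pair in
    [0, 1]; edge cost falls as safety or speed rises, so the cheapest route
    maximizes a weighted blend of the two."""
    def cost(edge):
        safety, speed = edge
        return w_safety * (1.0 - safety) + w_speed * (1.0 - speed)

    frontier = [(0.0, start, [start])]
    visited = set()
    while frontier:
        c, node, path = heapq.heappop(frontier)
        if node == goal:
            return path, c
        if node in visited:
            continue
        visited.add(node)
        for nxt, edge in graph.get(node, {}).items():
            if nxt not in visited:
                heapq.heappush(frontier, (c + cost(edge), nxt, path + [nxt]))
    return None, float("inf")

# A direct but less safe route via main_st vs. a safer detour via side_st.
graph = {
    "depot":   {"main_st": (0.4, 0.9), "side_st": (0.9, 0.6)},
    "main_st": {"customer": (0.4, 0.9)},
    "side_st": {"customer": (0.9, 0.6)},
}
best_path, best_cost = route(graph, "depot", "customer")
```

With the safety weight dominant (0.7 vs. 0.3), the algorithm prefers the slower but safer detour via side_st, mirroring the prioritization behavior the surrounding paragraphs describe.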
In some embodiments, the autonomous driving safety factor is associated with an autonomous driving task. In some embodiments, the autonomous driving safety factor comprises a semi-autonomous driving safety factor. In some embodiments, the autonomous driving safety factor comprises an unmanned driving safety factor. In some embodiments, the autonomous driving safety factor is unique to autonomous driving, semi-autonomous driving, unmanned driving, or any combination thereof. Each autonomous driving safety factor may be associated with a set autonomous driving safety factor weight, wherein the weight is based on the severity, the risk, or both, of the autonomous driving safety factor. In some embodiments, the autonomous driving safety factor weight is unique to autonomous driving, semi-autonomous driving, unmanned driving, or any combination thereof.
For example, the route calculation algorithm may select or prioritize routes along residential streets only, over routes that include a left turn on a two-way road without a dedicated left-turn signal, as such turns may be more difficult to maneuver autonomously. In another example, the autonomous driving safety factor weight is significant for roads associated with stadiums, school zones, bars or any other establishment that serves alcohol, accident events or rates, stopped-traffic events or rates, and other areas with less predictable pedestrian behavior, which present unique challenges for unmanned driving even where such challenges are easily addressed by human drivers. The route calculation algorithm may select or prioritize a safer route that is longer in distance or travel time over a less safe but more direct route. The route calculation algorithm may select or prioritize routes currently or previously taken by other autonomous or semi-autonomous vehicles. In some embodiments, the autonomous driving safety factor may be time-dependent. Multiple lanes or locations within a road may be associated with different autonomous driving safety factors, for example, based on lighting, debris, or other associated risks or advantages. The autonomous driving safety factor may be configured to compound the autonomous driving safety factor weights, so that the whole is greater than the sum of its parts. For example, a road with a high road grade parameter and a high ice parameter may be associated with an autonomous driving safety factor greater than the sum of the weights associated with each of them, since a road that is both steep and icy is particularly dangerous for unmanned vehicles.
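The superadditive compounding in the last example can be shown with a toy calculation in which hazardous combinations add an interaction penalty on top of the individual weights; all numbers are invented:

```python
def compound_safety_weight(weights, interactions):
    """Sum the individual safety factor weights, then add an extra penalty for
    each hazardous combination present, so the total exceeds the sum of parts."""
    total = sum(weights.values())
    for (a, b), penalty in interactions.items():
        if a in weights and b in weights:
            total += penalty
    return total

road = {"high_road_grade": 2.0, "high_ice": 3.0}
interactions = {("high_road_grade", "high_ice"): 4.0}  # steep AND icy compounds
weight = compound_safety_weight(road, interactions)    # 2 + 3 + 4 = 9, not 5
```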
In some embodiments, the autonomous driving safety factor may be distinct from the speed coefficient, whereby some conditions that increase the speed coefficient do not increase or decrease the autonomous driving safety factor. The speed coefficient and the autonomous driving safety factor may be considered separately. The speed coefficient and the autonomous driving safety factor may be considered in combination, or may influence each other. Some autonomous driving safety factors may also serve as speed coefficients. For example, a high average speed on a road may allow for faster transportation and shorter transportation times. However, a high average speed may present additional safety risks associated with the bandwidth and operational calculations required to operate at such speeds.
The communication device 1114 may also direct an autonomous or semi-autonomous propulsion system 1111 of an autonomous or semi-autonomous vehicle. The communication device 1114 may also direct the autonomous or semi-autonomous propulsion system 1111 of the assigned autonomous or semi-autonomous vehicle based on the vehicle mission route.
In some embodiments, autonomous or semi-autonomous vehicle 1110 further includes sensors configured to measure sensed data. The sensed data may include photographs, videos, three-dimensional images, sounds, light values, tactile values, chemical data, or any combination thereof. In some embodiments, the database 1121 also stores at least one of the current vehicle location, the current vehicle state, and the sensed data. In some embodiments, the path parameters are based at least on the sensed data. In some embodiments, at least one of the safety factor and the speed coefficient is based on the sensed data. In some embodiments, the sensed data enables crowdsourced determination of the safety factor and the speed coefficient. In some embodiments, the application 1120 further comprises a path parameter prediction module that predicts future path parameters based at least on the sensed data. In some embodiments, the route calculation algorithm also determines the mission route of the vehicle 1110 based on the predicted path parameters.
In some embodiments, the server application further includes a display module that displays at least one of a current vehicle location, a current vehicle status, and a task destination.
According to fig. 12, a platform for distributing and navigating an autonomous or semi-autonomous fleet of vehicles over multiple paths is provided herein, the platform including a fleet of vehicles and a server processor configured to provide a server application 1220.
The fleet may include a plurality of autonomous or semi-autonomous vehicles 1210. Each autonomous or semi-autonomous vehicle 1210 may include an autonomous or semi-autonomous propulsion system 1211, a location sensor 1212, a condition sensor 1213, and a communication device 1214.
The position sensor 1212 may be configured to measure a current vehicle position of the vehicle 1210. The current vehicle location may include a street address, GPS coordinates, proximity to a set location, or any combination thereof. The location sensor 1212 may include a camera, a camcorder, LiDAR, RADAR, microphone, GPS sensor, or any combination thereof. The condition sensor 1213 may be configured to measure the current vehicle state. The communication device 1214 may be configured to transmit the current vehicle position and the current vehicle state. In some embodiments, the current vehicle state comprises a vehicle power level, a vehicle reserve, a vehicle hardware state, or any combination thereof.
The server application 1220 may include a database 1221, a communication module 1222, a scheduling module 1223, and a navigation module 1224. The database 1221 may include a map of the plurality of paths. The map may include a plurality of GPS points, a plurality of addresses, a plurality of streets, a plurality of locations of interest, or any combination thereof. A path may include a road, an expressway, a highway, a sidewalk, a walkway, a bridge, a tunnel, a pedestrian area, a market, a yard, or any combination thereof. Each path may be associated with path parameters. The path parameters may include an autonomous driving safety factor and a speed coefficient. In some embodiments, the speed coefficient includes a speed limit, an average speed, a time-dependent average speed, a number of intersections, a number of turns, a type of turn, an accident indicator, a parked vehicle indicator, a number of lanes, a one-way street indicator, a cellular reception parameter, a road grade, a maximum road grade, an average pedestrian density, a maximum pedestrian density, a minimum pedestrian density, a time-dependent pedestrian density, an average rider density, an unprotected turn parameter, a road flatness parameter, a road visibility parameter, or any combination thereof. In some embodiments, the autonomous driving safety factor includes a speed limit, an average speed, a time-dependent average speed, a number of intersections, a number of turns, a type of turn, an accident indicator, a parked vehicle indicator, a number of lanes, a one-way street indicator, a cellular reception parameter, a road grade, a maximum road grade, an average pedestrian density, a maximum pedestrian density, a minimum pedestrian density, a time-dependent pedestrian density, an average rider density, an unprotected turn parameter, a road flatness parameter, a road visibility parameter, or any combination thereof.
The communication module 1222 may receive a current vehicle location and a current vehicle state. The communication module 1222 may receive the current vehicle location and the current vehicle state from the communication device 1214.
The scheduling module 1223 may assign one or more of the plurality of autonomous or semi-autonomous vehicles 1210 to a task destination. The task destination may include a GPS point, an address, a street, or any combination thereof. The task destination may be an order shipping address associated with the customer, the order, or both. The task destination may be an order pickup address associated with the supplier, the service provider, the order, or both. The vehicle 1210 may pick up items from the consumer at the task location, bring items to the task location and drop them off, or both. The scheduling module 1223 may assign one or more of the plurality of autonomous or semi-autonomous vehicles 1210 to the task destination based at least on the current vehicle location and the current vehicle state. The scheduling module 1223 may dispatch the vehicle 1210 whose current location has the shortest point-to-point distance from the task destination, the shortest driving distance from the task destination, the shortest driving time, or any weighted combination thereof. The scheduling module 1223 may dispatch the vehicle 1210 whose current vehicle state includes the highest stored propulsion energy. The scheduling module 1223 may dispatch the vehicle 1210 whose current vehicle state includes the lowest power level. The scheduling module 1223 may dispatch the vehicle 1210 whose current vehicle state includes the highest power level. The scheduling module 1223 may dispatch a vehicle 1210 whose vehicle hardware status reports no problems. The scheduling module 1223 may dispatch a vehicle 1210 having a vehicle reserve associated with the order.
The navigation module 1224 may apply a route calculation algorithm to determine a vehicle task route from a current vehicle location to a task destination. The navigation module 1224 may apply a route calculation algorithm to determine a vehicle mission route from a current vehicle location to a mission destination based at least on the path parameters and a current vehicle state. The navigation module 1224 may apply a route calculation algorithm to determine a vehicle mission route from a current vehicle location to a mission destination to maximize an autonomous driving safety factor, a speed coefficient, or both. In some embodiments, the route calculation algorithm includes a machine learning algorithm, a rule-based algorithm, or both. The vehicle mission route may include at least a portion of one of a plurality of paths. The communication device 1214 may also direct an autonomous or semi-autonomous propulsion system 1211 of the autonomous or semi-autonomous vehicle. The communication device 1214 may also direct the autonomous or semi-autonomous propulsion system 1211 of the assigned autonomous or semi-autonomous vehicle based on the vehicle mission route.
In some embodiments, the server application 1220 further includes a demand database 1225. The demand database 1225 may include historical demand data associated with a geographic area. The geographic area may include at least the task destination.
In some embodiments, the server application 1220 further includes a demand prediction module 1226. The demand prediction module 1226 may apply a prediction algorithm to determine a predicted demand schedule for each of the autonomous or semi-autonomous vehicles 1210. The demand prediction module 1226 may apply a prediction algorithm to determine a predicted demand schedule for each of the autonomous or semi-autonomous vehicles 1210 based at least on historical demand data. In some embodiments, the predictive algorithm comprises a machine learning algorithm, a rule-based algorithm, or both. The forecasted demand schedule may include forecasted demand task locations, forecasted demand time periods, or both, in the geographic area. The predicted demand task location may include a GPS point, an address, a location of interest, or any combination thereof. The predicted demand time period may include a peak demand time, a plurality of ordered peak demand times, a peak demand time period, a plurality of ordered demand time periods, or any combination thereof.
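A minimal rule-based stand-in for the demand prediction module (the platform may instead use machine learning, and the field names are hypothetical): bucket historical orders by hour and area, and surface the busiest buckets as the predicted demand schedule:

```python
from collections import Counter

def predicted_demand_schedule(historical_orders, top_n=2):
    """Count historical orders per (hour, area) bucket and return the busiest
    buckets as predicted demand task locations and demand time periods."""
    counts = Counter((o["hour"], o["area"]) for o in historical_orders)
    return [{"hour": h, "area": a, "orders": n}
            for (h, a), n in counts.most_common(top_n)]

history = [
    {"hour": 12, "area": "downtown"}, {"hour": 12, "area": "downtown"},
    {"hour": 12, "area": "downtown"}, {"hour": 18, "area": "campus"},
    {"hour": 18, "area": "campus"},   {"hour": 9,  "area": "suburb"},
]
schedule = predicted_demand_schedule(history)
# Peak demand: noon downtown (3 orders), then 6 pm on campus (2 orders).
```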
In some embodiments, server application 1220 further includes a transition relocation module 1227. The transition relocation module 1227 may assign a transition relocation mode to each of the plurality of autonomous or semi-autonomous vehicles 1210. The transition relocation module 1227 may assign the transition relocation mode to each of the plurality of autonomous or semi-autonomous vehicles 1210 based at least on one or more of the predicted demand task location, the predicted demand time period, the task destination, and the current vehicle state. In some embodiments, the transition relocation mode includes a replenishment station mode corresponding to a replenishment station location, a parking mode associated with a plurality of parking space locations, and a cruise mode associated with a set threshold cruise distance from the task destination or the predicted demand task location. In some embodiments, the database 1221 also includes a plurality of parking space locations in the geographic area.
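A toy policy showing how a transition relocation module might pick among the three modes; the thresholds and field names are assumptions, not the patent's logic:

```python
def assign_relocation_mode(vehicle, km_to_predicted_demand,
                           low_power=30.0, cruise_threshold_km=1.0):
    """Recharge when power is low, cruise when already within the threshold
    cruise distance of predicted demand, otherwise park nearby."""
    if vehicle["power_level"] < low_power:
        return "replenishment_station"
    if km_to_predicted_demand <= cruise_threshold_km:
        return "cruise"
    return "parking"

mode = assign_relocation_mode({"power_level": 15}, km_to_predicted_demand=0.5)
# A low-power vehicle is routed to the replenishment station before anything else.
```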
In some embodiments, the navigation module 1224 further applies the route calculation algorithm to determine a vehicle repositioning route from the task destination to: the replenishment station location, based on the replenishment station mode; a parking space location, based on the parking mode; or a vehicle cruise route, based on the cruise mode. In some embodiments, based on the replenishment station mode, the navigation module 1224 applies the route calculation algorithm to determine a vehicle repositioning route from the task destination to the replenishment station location. In some embodiments, based on the parking mode, the navigation module 1224 applies the route calculation algorithm to determine a vehicle repositioning route from the task destination to the parking space location. In some embodiments, based on the cruise mode, the navigation module 1224 determines a vehicle repositioning route from the task destination to the vehicle cruise route.
In some embodiments, the vehicle cruise route includes at least a portion of one of the plurality of paths within the set threshold cruise distance from the task destination or the predicted demand task location. In some embodiments, the communication device 1214 also directs the autonomous or semi-autonomous propulsion system 1211 of the autonomous or semi-autonomous vehicle 1210 to remain at the replenishment station location, at the parking space location, or within the vehicle cruise route for the predicted demand time period. In some embodiments, the communication device 1214 directs the autonomous or semi-autonomous propulsion system 1211 of the autonomous or semi-autonomous vehicle 1210 to remain at the replenishment station location for the predicted demand time period. In some embodiments, the communication device 1214 directs the autonomous or semi-autonomous propulsion system 1211 of the autonomous or semi-autonomous vehicle 1210 to remain at the parking space location for the predicted demand time period. In some embodiments, the communication device 1214 directs the autonomous or semi-autonomous propulsion system 1211 of the autonomous or semi-autonomous vehicle 1210 to remain within the vehicle cruise route for the predicted demand time period.
In some embodiments, autonomous or semi-autonomous vehicle 1210 further includes a sensor configured to measure sensed data. In some embodiments, the database 1221 also stores at least one of the current vehicle location, the current vehicle state, and the sensed data. In some embodiments, the path parameters are based at least on the sensed data. In some embodiments, at least one of the safety factor and the speed coefficient is based on the sensed data. In some embodiments, the sensed data enables crowdsourced determination of the safety factor and the speed coefficient.
In some embodiments, the application 1220 further includes a path parameter prediction module that predicts future path parameters based at least on the sensed data. In some embodiments, the route calculation algorithm also determines the mission route of the vehicle 1210 based on the predicted path parameters.
In some embodiments, the server application further comprises a display module that displays at least one of a current vehicle location, a current vehicle state, a task destination, a path parameter, a task route, a parking location, and a predicted required task location.
According to fig. 13, a platform for distributing and navigating an autonomous or semi-autonomous fleet of vehicles over multiple paths is provided herein, the platform including a fleet of vehicles and a server processor configured to provide a server application 1320.
The fleet may include a plurality of autonomous or semi-autonomous vehicles 1310. Each autonomous or semi-autonomous vehicle 1310 may include an autonomous or semi-autonomous propulsion system 1311, a position sensor 1312, a condition sensor 1313, and a communication device 1314.
The position sensor 1312 may be configured to measure a current vehicle position of the vehicle 1310. The current vehicle location may include a street address, GPS coordinates, proximity to a set location, or any combination thereof. The location sensor 1312 may include a camera, a camcorder, a LiDAR, a RADAR, a microphone, a GPS sensor, or any combination thereof. The condition sensor 1313 may be configured to measure a current vehicle state. The communication device 1314 may be configured to transmit the current vehicle location and the current vehicle state. In some embodiments, the current vehicle state comprises a vehicle power level, a vehicle reserve, a vehicle hardware state, or any combination thereof.
The server application 1320 may include a database 1321, a communication module 1322, a scheduling module 1323, and a navigation module 1324. The database 1321 may include a map of the plurality of paths. The map may include a plurality of GPS points, a plurality of addresses, a plurality of streets, a plurality of locations of interest, or any combination thereof. A path may include a road, an expressway, a highway, a sidewalk, a walkway, a bridge, a tunnel, a pedestrian area, a market, a yard, or any combination thereof. Each path may be associated with path parameters. The path parameters may include an autonomous driving safety factor and a speed coefficient. In some embodiments, the speed coefficient includes a speed limit, an average speed, a time-dependent average speed, a number of intersections, a number of turns, a type of turn, an accident indicator, a parked vehicle indicator, a number of lanes, a one-way street indicator, a cellular reception parameter, a road grade, a maximum road grade, an average pedestrian density, a maximum pedestrian density, a minimum pedestrian density, a time-dependent pedestrian density, an average rider density, an unprotected turn parameter, a road flatness parameter, a road visibility parameter, or any combination thereof. In some embodiments, the autonomous driving safety factor includes a speed limit, an average speed, a time-dependent average speed, a number of intersections, a number of turns, a type of turn, an accident indicator, a parked vehicle indicator, a number of lanes, a one-way street indicator, a cellular reception parameter, a road grade, a maximum road grade, an average pedestrian density, a maximum pedestrian density, a minimum pedestrian density, a time-dependent pedestrian density, an average rider density, an unprotected turn parameter, a road flatness parameter, a road visibility parameter, or any combination thereof.
The communication module 1322 may receive the current vehicle location and the current vehicle state from the communication device 1314.
The scheduling module 1323 may assign one or more of the plurality of autonomous or semi-autonomous vehicles 1310 to a task destination. The task destination may include a GPS point, an address, a street, or any combination thereof. The task destination may be an order delivery address associated with the customer, the order, or both. The task destination may alternatively be an order pickup address associated with the supplier, the service provider, the order, or both. The vehicle 1310 may pick up items from the consumer at the task location, drop off items at the task location, or both. The scheduling module 1323 may assign one or more of the plurality of autonomous or semi-autonomous vehicles 1310 to the task destination based at least on the current vehicle location and the current vehicle state. The scheduling module 1323 may dispatch the vehicle 1310 whose current location has the shortest point-to-point distance to the task destination, the shortest driving time relative to the driving distance, or any weighted combination thereof. The scheduling module 1323 may dispatch the vehicle 1310 whose current vehicle state includes the highest stored propulsion energy. The scheduling module 1323 may dispatch the vehicle 1310 whose current vehicle state includes the lowest power level. The scheduling module 1323 may dispatch the vehicle 1310 whose current vehicle state includes the highest power level. The scheduling module 1323 may dispatch a vehicle 1310 whose vehicle hardware status reports no problems. The scheduling module 1323 may dispatch a vehicle 1310 whose vehicle reserve is associated with the order.
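One way the scheduling module's dispatch rule could combine these criteria is a weighted score over distance and power level, excluding vehicles with hardware problems. This is a minimal sketch; the weights, field names, and straight-line-distance stand-in are assumptions, not values from the disclosure:

```python
# Hypothetical dispatch rule for a scheduling module: rank healthy vehicles
# by a weighted combination of distance to the task destination and power.
import math

def point_to_point(a, b):
    # straight-line distance as a stand-in for driving distance/time
    return math.dist(a, b)

def pick_vehicle(vehicles, task_destination, w_dist=0.7, w_power=0.3):
    candidates = [v for v in vehicles if v["hardware_ok"]]
    # lower score is better: near the destination, high power level
    return min(
        candidates,
        key=lambda v: w_dist * point_to_point(v["location"], task_destination)
                      - w_power * v["power_level"],
    )

fleet = [
    {"id": "AV-1", "location": (0.0, 0.0), "power_level": 0.9, "hardware_ok": True},
    {"id": "AV-2", "location": (0.5, 0.5), "power_level": 0.2, "hardware_ok": True},
    {"id": "AV-3", "location": (0.1, 0.1), "power_level": 0.8, "hardware_ok": False},
]
print(pick_vehicle(fleet, (0.2, 0.2))["id"])  # AV-1
```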
The navigation module 1324 may apply a route calculation algorithm to determine a vehicle task route from the current vehicle location to the task destination. The navigation module 1324 may determine the vehicle task route based at least on the path parameters and the current vehicle state, and may do so to maximize the autonomous driving safety factor, the speed coefficient, or both. In some embodiments, the route calculation algorithm includes a machine learning algorithm, a rule-based algorithm, or both. The vehicle task route may include at least a portion of one of the plurality of paths. The communication device 1314 may also direct the autonomous or semi-autonomous propulsion system 1311 of the assigned autonomous or semi-autonomous vehicle based on the vehicle task route.
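As an illustration of one possible rule-based route calculation algorithm: if each path segment carries a safety factor and a speed coefficient (assumed here to be normalized to (0, 1]), a shortest-path search over a penalty of 1/(safety × speed) maximizes both along the chosen route. The graph, values, and penalty function are assumptions for the sketch:

```python
# Minimal Dijkstra-style route search where each edge carries an
# autonomous-driving safety factor and a speed coefficient; lower safety or
# speed means a higher traversal penalty.
import heapq

def best_route(graph, start, goal):
    # graph: {node: [(neighbor, safety, speed), ...]}
    queue, seen = [(0.0, start, [start])], set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return path
        if node in seen:
            continue
        seen.add(node)
        for nxt, safety, speed in graph.get(node, []):
            if nxt not in seen:
                heapq.heappush(queue, (cost + 1.0 / (safety * speed), nxt, path + [nxt]))
    return None

graph = {
    "depot": [("A", 0.9, 0.8), ("B", 0.5, 0.9)],
    "A":     [("dest", 0.9, 0.7)],
    "B":     [("dest", 0.6, 0.9)],
}
print(best_route(graph, "depot", "dest"))  # ['depot', 'A', 'dest']
```

The safer, faster leg through A wins even though B is also connected, because the penalty of B's lower safety factor outweighs its speed.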
In some embodiments, the server application 1320 also includes a demand database 1325. The demand database 1325 may include historical demand data associated with a geographic area. The geographic area may include at least the task destination.
In some embodiments, the server application 1320 also includes a demand prediction module 1326. The demand prediction module 1326 may apply a prediction algorithm to determine a predicted demand schedule for each of the autonomous or semi-autonomous vehicles 1310 based at least on the historical demand data. In some embodiments, the prediction algorithm comprises a machine learning algorithm, a rule-based algorithm, or both. The predicted demand schedule may include a predicted demand task location, a predicted demand time period, or both, in the geographic area. The predicted demand task location may include a GPS point, an address, a location of interest, or any combination thereof. The predicted demand time period may include a peak demand time, a plurality of ranked peak demand times, a peak demand time period, a plurality of ranked demand time periods, or any combination thereof.
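A rule-based prediction algorithm of the kind mentioned above could be as simple as bucketing historical orders by location and hour and flagging the busiest bucket. The data shape and field names are invented for illustration:

```python
# Toy rule-based demand predictor: the busiest (location, hour) bucket in the
# historical demand data becomes the predicted demand task location and peak.
from collections import Counter

def predict_peak(historical_orders):
    # historical_orders: iterable of (location, hour) tuples
    buckets = Counter(historical_orders)
    (location, hour), _count = buckets.most_common(1)[0]
    return {"predicted_location": location, "peak_hour": hour}

history = [("123 Main St", 13), ("123 Main St", 13), ("444 Elm St", 17),
           ("123 Main St", 12), ("444 Elm St", 17), ("123 Main St", 13)]
print(predict_peak(history))  # {'predicted_location': '123 Main St', 'peak_hour': 13}
```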
In some embodiments, the server application 1320 also includes a transitional relocation module 1327. The transitional relocation module 1327 may assign a transitional repositioning mode to each of the plurality of autonomous or semi-autonomous vehicles 1310 based at least on one or more of the predicted demand task location, the predicted demand time period, the task destination, and the current vehicle state. In some embodiments, the transitional repositioning mode includes a replenishment station mode corresponding to a replenishment station location, a parking mode associated with a plurality of parking space locations, and a cruise mode associated with a set threshold cruise distance from the task destination or the predicted demand task location. In some embodiments, the database 1321 also includes the plurality of parking space locations in the geographic area.
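One plausible rule-based assignment of the three transitional repositioning modes is sketched below; the battery threshold and decision order are assumptions, not values from the disclosure:

```python
# Hypothetical mode assignment: recharge first, cruise near predicted demand
# when charged, otherwise park and wait.
def assign_mode(power_level, near_predicted_demand):
    if power_level < 0.25:
        return "replenishment_station"  # low charge: restock/recharge first
    if near_predicted_demand:
        return "cruise"                 # loiter within the threshold cruise distance
    return "parking"                    # wait at a selected parking space

print(assign_mode(0.1, True))   # replenishment_station
print(assign_mode(0.8, True))   # cruise
print(assign_mode(0.8, False))  # parking
```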
In some embodiments, the application 1320 also includes a parking allocation module 1328. The parking allocation module 1328 may determine a selected parking space location for one or more of the plurality of autonomous or semi-autonomous vehicles 1310 based on at least one of the task destination, the predicted demand task location, the parking mode, and the plurality of parking space locations. In some embodiments, based on the replenishment station mode, the navigation module 1324 also applies a route calculation algorithm to determine a vehicle relocation route from the task destination to the replenishment station location. In some embodiments, based on the parking mode, the navigation module 1324 also applies a route calculation algorithm to determine a vehicle relocation route from the task destination to the selected parking space location. In some embodiments, based on the cruise mode, the navigation module 1324 also determines a vehicle relocation route from the task destination to a vehicle cruise route.
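A parking allocation of this kind could, for instance, pick the parking space nearest the predicted demand task location while skipping spaces reported as occupied. This is an assumed sketch, not the disclosed implementation:

```python
# Hypothetical parking allocation: nearest free parking space to the
# predicted demand task location.
import math

def select_parking(spaces, demand_location, occupied):
    free = [s for s in spaces if s not in occupied]
    return min(free, key=lambda s: math.dist(s, demand_location))

spaces = [(0.0, 0.0), (1.0, 1.0), (0.3, 0.3)]
print(select_parking(spaces, (0.25, 0.25), occupied={(0.3, 0.3)}))  # (0.0, 0.0)
```

Note how the nearest space, (0.3, 0.3), is passed over because sensed data marked it occupied.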
In some embodiments, the navigation module 1324 also applies a route calculation algorithm to determine a vehicle relocation route from the task destination to: the replenishment station location, based on the replenishment station mode; the selected parking space location, based on the parking mode; or a vehicle cruise route, based on the cruise mode.
In some embodiments, the vehicle cruise route includes at least a portion of one of the plurality of paths within the set threshold cruise distance from the task destination or the predicted demand task location. In some embodiments, the communication device 1314 also directs the autonomous or semi-autonomous propulsion system 1311 of the autonomous or semi-autonomous vehicle 1310 to remain at the replenishment station location, at the selected parking space location, or within the vehicle cruise route for the predicted demand time period. In some embodiments, the communication device 1314 also directs the autonomous or semi-autonomous propulsion system 1311 of the autonomous or semi-autonomous vehicle 1310 to remain at the replenishment station location for the predicted demand time period. In some embodiments, the communication device 1314 also directs the autonomous or semi-autonomous propulsion system 1311 of the autonomous or semi-autonomous vehicle 1310 to remain at the selected parking space location for the predicted demand time period. In some embodiments, the communication device 1314 also directs the autonomous or semi-autonomous propulsion system 1311 of the autonomous or semi-autonomous vehicle 1310 to remain within the vehicle cruise route for the predicted demand time period.
In some embodiments, the autonomous or semi-autonomous vehicle 1310 also includes a sensor configured to measure sensed data. In some embodiments, the database 1321 also stores at least one of the current vehicle location, the current vehicle state, and the sensed data. In some embodiments, the path parameters are based at least on the sensed data. In some embodiments, at least one of the autonomous driving safety factor and the speed coefficient is based on the sensed data. In some embodiments, the sensed data enables crowdsourced determination of the autonomous driving safety factor and the speed coefficient.
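Crowdsourced determination of a path parameter could look like each vehicle's sensed traversal updating a running average for that path, which the route calculation algorithm then consumes. The data structure below is an assumption for illustration:

```python
# Toy crowdsourced path-parameter estimate: a per-path running average speed
# updated from each vehicle's sensed data.
def update_average_speed(path_stats, path_id, sensed_speed):
    n, avg = path_stats.get(path_id, (0, 0.0))
    path_stats[path_id] = (n + 1, (avg * n + sensed_speed) / (n + 1))
    return path_stats[path_id][1]

stats = {}
update_average_speed(stats, "elm-st", 20.0)
print(update_average_speed(stats, "elm-st", 30.0))  # 25.0
```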
In some embodiments, the application 1320 further includes a path parameter prediction module that predicts future path parameters based at least on the sensed data. In some embodiments, the route calculation algorithm also determines the vehicle task route based on the predicted path parameters. In some embodiments, the autonomous or semi-autonomous vehicle 1310 also includes a sensor configured to measure sensed data corresponding to a parking space status of one or more of the plurality of parking space locations in the geographic area. In some embodiments, the parking allocation module 1328 also determines the selected parking space location based on the parking space status.
In some embodiments, the server application further comprises a display module that displays at least one of the current vehicle location, the current vehicle state, the task destination, the path parameters, the task route, the selected parking space location, and the predicted demand task location.
Web browser plug-in
In some embodiments, the computer program includes a web browser plug-in (e.g., an extension). In computing, a plug-in is one or more software components that add specific functionality to a larger software application. Makers of software applications support plug-ins to enable third-party developers to create abilities that extend an application, to support easily adding new features, and to reduce the size of an application. When supported, plug-ins enable customizing the functionality of a software application. For example, plug-ins are commonly used in web browsers to play video, generate interactivity, scan for viruses, and display particular file types. Those of skill in the art will be familiar with several web browser plug-ins, including Adobe Flash Player, Microsoft Silverlight, and Apple QuickTime.
Based on the disclosure provided herein, those of skill in the art will recognize that several plug-in frameworks are available that enable development of plug-ins in various programming languages, including, by way of non-limiting examples, C++, Delphi, Java, PHP, Python, and VB .NET, or combinations thereof.
A web browser (also called an internet browser) is a software application designed for use with network-connected digital processing devices for retrieving, presenting, and traversing information resources on the World Wide Web. Suitable web browsers include, by way of non-limiting examples, Microsoft Internet Explorer, Mozilla Firefox, Google Chrome, Apple Safari, Opera Software Opera, and KDE Konqueror. In some embodiments, the web browser is a mobile web browser. Mobile web browsers (also called microbrowsers, mini-browsers, and wireless browsers) are designed for use on mobile digital processing devices including, by way of non-limiting examples, handheld computers, tablet computers, netbook computers, subnotebook computers, smartphones, music players, personal digital assistants (PDAs), and handheld video game systems. Suitable mobile web browsers include, by way of non-limiting examples, Google Android browser, RIM BlackBerry Browser, Apple Safari, Palm Blazer, Palm WebOS Browser, Mozilla Firefox for mobile, Microsoft Internet Explorer Mobile, Amazon Kindle Basic Web, Nokia Browser, Opera Software Opera Mobile, and Sony PSP browser.
Software module
In some embodiments, the platforms, systems, media, and methods disclosed herein include software, server, and/or database modules, or use of the same. In view of the disclosure provided herein, software modules are created by techniques known to those of skill in the art using machines, software, and languages known in the art. The software modules disclosed herein are implemented in a multitude of ways. In various embodiments, a software module comprises a file, a section of code, a programming object, a programming structure, or combinations thereof. In further various embodiments, a software module comprises a plurality of files, a plurality of sections of code, a plurality of programming objects, a plurality of programming structures, or combinations thereof. In various embodiments, the one or more software modules comprise, by way of non-limiting examples, a web application, a mobile application, and a standalone application. In some embodiments, software modules are in one computer program or application. In other embodiments, software modules are in more than one computer program or application. In some embodiments, software modules are hosted on one machine. In other embodiments, software modules are hosted on more than one machine. In further embodiments, software modules are hosted on a cloud computing platform. In some embodiments, software modules are hosted on one or more machines in one location. In other embodiments, software modules are hosted on one or more machines in more than one location.
Databases
In some embodiments, the platforms, systems, media, and methods disclosed herein include one or more databases, or use of the same. In view of the disclosure provided herein, those of skill in the art will recognize that many databases are suitable for storage and retrieval of information regarding autonomous or semi-autonomous vehicles. In various embodiments, suitable databases include, by way of non-limiting examples, relational databases, non-relational databases, object-oriented databases, object databases, entity-relationship model databases, associative databases, and XML databases. Further non-limiting examples include SQL, PostgreSQL, MySQL, Oracle, DB2, and Sybase. In some embodiments, a database is internet-based. In further embodiments, a database is web-based. In still further embodiments, a database is cloud-computing-based. In other embodiments, a database is based on one or more local computer storage devices.
Terms and definitions
As used herein, the phrases "at least one," "one or more," and "and/or" are open-ended expressions that are both conjunctive and disjunctive in operation. For example, each of the expressions "at least one of A, B, and C," "at least one of A, B, or C," "one or more of A, B, and C," "one or more of A, B, or C," and "A, B, and/or C" means A alone, B alone, C alone, A and B together, A and C together, B and C together, or A, B, and C together.
As used herein, the terms "fleet" and "sub-fleet" are used to refer to a plurality of land vehicle units, watercraft units, or aircraft units operating together or under the same ownership. In some embodiments, the fleet or sub-fleet is engaged in the same activity. In some embodiments, the fleet or sub-fleet is engaged in similar activities. In some embodiments, the fleet or sub-fleet is engaged in different activities.
As used herein, the terms "autonomous or semi-autonomous vehicle," "unit," "autonomous or semi-autonomous fleet," "vehicle," and "all-terrain vehicle" are used to indicate a mobile machine capable of transporting cargo. The vehicle may include cars, vans, unmanned motor vehicles (e.g., tricycles, trucks, trailers, buses, etc.), unmanned railed vehicles (e.g., trains, trams, etc.), unmanned watercraft (e.g., ships, boats, ferries, landing craft, barges, rafts, etc.), unmanned aerial vehicles, unmanned hovercraft (air, land, and water types), and even unmanned spacecraft.
As used herein, the terms "user," "users," "operator," and "fleet operator" are used to indicate an entity that owns or is responsible for managing and operating a fleet of autonomous or semi-autonomous vehicles. As used herein, the term "consumer" is used to indicate an entity requesting a service provided by a fleet of autonomous or semi-autonomous vehicles.
As used herein, the terms "provider," "business," "supplier," and "third-party provider" are used to indicate an entity that works in conjunction with the fleet owner or operator to use the services of the autonomous or semi-autonomous vehicle fleet to transport the provider's products from, or back to, the provider's place of business or staging location.
As used herein, the terms "white label," "white label product," "white label service," and "white label provider" shall refer to a product or service produced by one company (the producer) that other companies (the marketers) rebrand to make it appear as if they had made it.
As used herein, the terms "maximum speed" and "maximum speed range" shall refer to the maximum speed at which an autonomous or semi-autonomous vehicle is capable of and permitted to operate in its task environment (e.g., on open roads, in bicycle lanes, and in other environments suited to higher-speed travel).
As used herein, the term "operating speed" shall refer to the range of speeds at which an autonomous or semi-autonomous vehicle can operate (including a full stop, or zero speed), as determined by onboard sensors and software that can monitor environmental conditions, operating environment, etc., to determine an appropriate speed at a given time.
As used herein, the terms "inspection" and "monitoring" shall refer to the collection, and use, of data gathered from the environment by an autonomous or semi-autonomous vehicle, which data may be used to monitor, inspect, or evaluate any number of environmental elements.
As used herein, the term "environment" shall refer to the physical surroundings or conditions in which an autonomous or semi-autonomous vehicle operates: its functional habitat, geographic location, region, area, neighborhood, surroundings, or conditions, including atmospheric conditions such as rain, humidity, solar index, wind conditions, barometric pressure, and the like.
As used herein, unless otherwise indicated, the terms "about" and "approximately" mean an acceptable error for the particular value determined by one of ordinary skill in the art, which error depends in part on how the value is measured or determined. In certain embodiments, the term "about" or "approximately" means within 1, 2, 3, or 4 standard deviations. In certain embodiments, the term "about" or "approximately" means within 30%, 25%, 20%, 15%, 10%, 9%, 8%, 7%, 6%, 5%, 4%, 3%, 2%, 1%, 0.5%, 0.1%, or 0.05% of a given value or range. In certain embodiments, the term "about" or "approximately" means within 40.0mm, 30.0mm, 20.0mm, 10.0mm, 5.0mm, 1.0mm, 0.9mm, 0.8mm, 0.7mm, 0.6mm, 0.5mm, 0.4mm, 0.3mm, 0.2mm, or 0.1mm of a given value or range. In certain embodiments, the term "about" or "approximately" means within 20.0 degrees, 15.0 degrees, 10.0 degrees, 9.0 degrees, 8.0 degrees, 7.0 degrees, 6.0 degrees, 5.0 degrees, 4.0 degrees, 3.0 degrees, 2.0 degrees, 1.0 degrees, 0.9 degrees, 0.8 degrees, 0.7 degrees, 0.6 degrees, 0.5 degrees, 0.4 degrees, 0.3 degrees, 0.2 degrees, 0.1 degrees, 0.09 degrees, 0.08 degrees, 0.07 degrees, 0.06 degrees, 0.05 degrees, 0.04 degrees, 0.03 degrees, 0.02 degrees, or 0.01 degrees of a given value or range. In certain embodiments, the term "about" or "approximately" means within 0.1mph, 0.2mph, 0.3mph, 0.4mph, 0.5mph, 0.6mph, 0.7mph, 0.8mph, 0.9mph, 1.0mph, 1.1mph, 1.2mph, 1.3mph, 1.4mph, 1.5mph, 1.6mph, 1.7mph, 1.8mph, 1.9mph, 2.0mph, 3.0mph, 4.0mph, or 5.0mph of a given value or range.
As used herein, the terms "server," "computer server," "central server," "mobile server," and "client server" indicate devices on a network that manage fleet resources (i.e., autonomous or semi-autonomous vehicles).
As used herein, the term "controller" is used to indicate a device that controls the transfer of data from a computer to a peripheral device (and vice versa). For example, disk drives, display screens, keyboards, and printers all require controllers. In a personal computer, the controller is typically a single chip. As used herein, a controller is generally used to manage access to components (e.g., a safety compartment) of an autonomous or semi-autonomous vehicle.
As used herein, a "mesh network" is a network topology in which each node relays data for the network. All mesh nodes cooperate in distributing data in the network. It can be applied to both wired and wireless networks. A wireless mesh network may be considered a "wireless ad hoc" network. Therefore, wireless mesh networks are closely related to mobile ad hoc networks (MANETs). The MANET is not limited to a particular mesh network topology and the wireless ad hoc network or MANET may take any form of network topology. Mesh networks may use flooding or routing techniques to relay messages. By routing, a message propagates along a path by jumping from one node to another until it reaches its destination. To ensure that all of its paths are available, the network must allow continuous connections and must reconfigure itself around the broken path using a self-healing algorithm (e.g., shortest path bridging). Self-healing allows a route-based network to operate when a node fails or when a connection becomes unreliable. Thus, the network is reliable, as there is typically more than one path between a source and a destination in the network. This concept is also applicable to wired networks and software interaction. A mesh network whose nodes are all interconnected is a fully connected network.
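The self-healing behavior described above can be illustrated minimally: when a link fails, routing simply recomputes a path over the remaining links. Breadth-first search is used here as a simple stand-in for shortest path bridging:

```python
# Toy self-healing routing: recompute a hop-by-hop path after a link failure.
from collections import deque

def find_path(links, src, dst):
    # links: {node: [neighbor, ...]}; breadth-first search for a path
    queue, seen = deque([[src]]), {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nxt in links.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

links = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}
print(find_path(links, "A", "D"))  # ['A', 'B', 'D']
links["B"].remove("D")             # the B-D link fails
print(find_path(links, "A", "D"))  # ['A', 'C', 'D'] -- rerouted around the break
```

Because more than one path exists between source and destination, the network remains reliable after the failure, which is the property the passage attributes to mesh topologies.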
As used herein, the term "module" is used to indicate a self-contained hardware component of a central server, which in turn includes a software module. In software, a module is a part of a program. A program consists of one or more independently developed modules that are not combined until the program is linked. A single module may contain one or several routines, or program parts, which perform specific tasks. As used herein, a fleet management module includes software modules for managing various aspects and functions of a fleet of autonomous or semi-autonomous vehicles.
As used herein, the terms "processor" and "digital processing device" are used to refer to a microprocessor or one or more Central Processing Units (CPUs). A CPU is an electronic circuit within a computer that executes instructions of a computer program by performing basic arithmetic, logic, control, and input/output (I/O) operations specified by the instructions.
Suitable digital processing devices include, as non-limiting examples, server computers, desktop computers, laptop computers, notebook computers, sub-notebook computers, netbook computers, network tablet computers, set-top box computers, handheld computers, internet appliances, mobile smartphones, tablets, personal digital assistants, video game consoles, and vehicles, in accordance with the description herein. Those skilled in the art will recognize that many smart phones are suitable for use in the system described herein. Suitable tablet computers include those known to those skilled in the art having a manual, tablet and convertible configuration.
In some embodiments, the digital processing device includes an operating system configured to perform executable instructions. The operating system is, for example, software, including programs and data, that manages the device's hardware and provides services for execution of applications. Those of skill in the art will recognize that suitable server operating systems include, by way of non-limiting examples, FreeBSD, OpenBSD, NetBSD, Linux, Apple Mac OS X Server, Oracle Solaris, Windows Server, and Novell NetWare. Those of skill in the art will recognize that suitable personal computer operating systems include, by way of non-limiting examples, Microsoft Windows, Apple Mac OS X, UNIX, and UNIX-like operating systems (e.g., GNU/Linux). In some embodiments, the operating system is provided by cloud computing. Those of skill in the art will also recognize that suitable mobile smartphone operating systems include, by way of non-limiting examples, Nokia Symbian OS, Apple iOS, Research In Motion BlackBerry OS, Google Android, Microsoft Windows Phone OS, Microsoft Windows Mobile OS, Linux, and Palm WebOS.
In some embodiments, the digital processing device includes a storage and/or memory device. The storage and/or memory device is one or more physical apparatuses used to store data or programs on a temporary or permanent basis. In some embodiments, the device is volatile memory and requires power to maintain stored information. In some embodiments, the device is non-volatile memory and retains stored information when the digital processing device is not powered. In some embodiments, the non-volatile memory comprises flash memory. In some embodiments, the non-volatile memory comprises dynamic random-access memory (DRAM). In some embodiments, the non-volatile memory comprises ferroelectric random-access memory (FRAM). In some embodiments, the non-volatile memory comprises phase-change random-access memory (PRAM). In other embodiments, the device is a storage device including, by way of non-limiting examples, CD-ROMs, DVDs, flash memory devices, magnetic disk drives, magnetic tape drives, optical disk drives, and cloud-computing-based storage. In further embodiments, the storage and/or memory device is a combination of devices such as those disclosed herein.
In some embodiments, the digital processing device includes a display that sends visual information to the user or users. In some embodiments, the display is a Cathode Ray Tube (CRT). In some embodiments, the display is a Liquid Crystal Display (LCD). In some embodiments, the display is a thin film transistor liquid crystal display (TFT-LCD). In some embodiments, the display is an Organic Light Emitting Diode (OLED) display. In various embodiments, the OLED display is a passive matrix OLED (pmoled) or active matrix OLED (amoled) display. In some embodiments, the display is a plasma display. In some embodiments, the display is a video projector. In other embodiments, the display is a combination of devices such as those disclosed herein.
Examples of the invention
Example 1-platform for distribution and navigation
In one example, a platform for distributing and navigating an autonomous or semi-autonomous fleet of vehicles over multiple paths includes a plurality of autonomous or semi-autonomous vehicles and a server configured to provide a server application.
The communication module of the server application receives from the communication device a current vehicle location (determined by a location sensor on the vehicle) and a current vehicle state (measured by a state sensor), wherein the current vehicle state includes a vehicle power level, a vehicle reserve, and a vehicle hardware state.
The scheduling module then assigns one or more of the plurality of autonomous or semi-autonomous vehicles to a task destination associated with the order based on the current vehicle location and the current vehicle state.
The navigation module then receives path parameters from a database in the server application, the path parameters including an autonomous driving safety factor and a speed coefficient, wherein a plurality of paths are stored in a map. The navigation module then applies a route calculation algorithm to determine a vehicle mission route from the current vehicle location to the mission destination based on the path parameters and the current vehicle state, wherein the vehicle mission route includes at least a portion of one of the plurality of paths.
The communication device then directs the autonomous or semi-autonomous propulsion system of the autonomous or semi-autonomous vehicle based on the vehicle mission route.
Example 2-platform for transitional replenishment station relocation
In another example, once an autonomous or semi-autonomous vehicle traverses a vehicle mission route, an order is completed at a mission destination.
The transition relocation module assigns a transition relocation mode including a replenishment station mode to the autonomous or semi-autonomous vehicle when a current vehicle state of one of the autonomous or semi-autonomous vehicles includes a low battery indication, wherein the replenishment station mode corresponds to a replenishment station location. The navigation module then applies a route calculation algorithm to determine a vehicle relocation route from the mission destination to the replenishment station location, wherein the communication device then guides the autonomous or semi-autonomous propulsion system of the autonomous or semi-autonomous vehicle based on the vehicle relocation route.
Example 3-platform for transitional tour relocation
In another example, once an autonomous or semi-autonomous vehicle traverses a vehicle task route and completes an order at a task destination, the server application may employ the demand prediction module to determine, based on historical demand data associated with a geographic area, that high demand will occur within a set threshold cruise distance (2 miles) of a predicted demand task location (123 Main Street) during a predicted demand time period (1:00 pm to 2:00 pm).
When the current vehicle state of the autonomous or semi-autonomous vehicle includes a high-charge indication and a medium-reserve indication, the transitional relocation module assigns the vehicle a transitional repositioning mode comprising a cruise mode associated with the set threshold cruise distance from the predicted demand task location.
The navigation module then applies a route calculation algorithm to determine a vehicle cruise route, starting from the task destination, within the set threshold cruise distance of the predicted demand task location. The communication device then directs the autonomous or semi-autonomous propulsion system of the vehicle based on the vehicle cruise route so that the vehicle reaches, and remains within the set threshold cruise distance of, the predicted demand task location during the predicted demand time period.
Example 4-platform for transitional parking repositioning
In another example, once an autonomous or semi-autonomous vehicle traverses a vehicle task route and completes an order at a task destination, the server application may employ the demand prediction module to determine, based on historical demand data associated with a geographic area, that high demand will occur at a predicted demand task location (444 Elm Street) during a predicted demand time period (5:00 pm).
When the current vehicle state of the autonomous or semi-autonomous vehicle includes a medium-charge indication and a high-reserve indication, the transitional relocation module assigns the vehicle a transitional repositioning mode comprising a parking mode associated with the predicted demand task location.
The parking allocation module then determines a selected parking space position from the plurality of parking space positions in the database based on the predicted demanded task position.
The sensors on the autonomous or semi-autonomous vehicle then measure further sensed data, which is stored in the database. The sensed data, together with previously stored sensed data, is used to determine current path parameters and predict future path parameters for the roads. The navigation module then applies a route calculation algorithm to determine a vehicle relocation route from the task destination to the selected parking space location based on the current and predicted path parameters. The communication device then directs the autonomous or semi-autonomous propulsion system of the vehicle based on the vehicle relocation route.
The display module then displays to a user or fleet manager at least one of the current vehicle location, the current vehicle state, the mission destination, the path parameters, the mission route, the selected parking space location, and the predicted demand mission location.
Example 5 - Platform for automatic fleet allocation
In another example, the automatic fleet allocation module controls where every vehicle should be located at any point in time: traveling to a replenishment station, finding a legal parking space, circling the block, or being assigned to the next task/customer. The task request module receives a task request, with a particular location associated with the task, from an API, a web interface, a mobile interface, or any combination thereof. The scheduling module, working with the fleet-wide navigation module, assigns to a specific task the vehicle most likely to reach the location fastest, or within a predetermined time, based on the current status of all vehicles in the fleet.
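A dispatch step of this kind could be sketched as below, where `eta` stands in for the fleet-wide navigation module's arrival-time estimate; the field names are illustrative assumptions:

```python
def dispatch_vehicle(fleet, eta):
    """Pick the idle vehicle most likely to arrive at the task location
    fastest, given an ETA function supplied by the navigation module."""
    idle = [v for v in fleet if v["status"] == "idle"]
    return min(idle, key=eta) if idle else None
```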
The active-but-out-of-service vehicle allocation module determines what each vehicle should do when not serving a customer. The demand prediction module (which may be rule-based, statistical, machine-learned, or any combination thereof) predicts demand levels throughout the market at any point in time. The location optimization module receives each vehicle's current location, battery level or other state, and the demand forecast, and selects one of four possible tasks for the vehicle: stay in the nearby local area (find temporary parking or circle the block), move to another nearby area, find a supply station, or go to a particular location. A database of the status of all vehicles in the fleet stores each vehicle's location, battery level, current task, machine status/errors, current destination, etc. A communication module between the server and the vehicles is used to coordinate and execute tasks directly.
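The four-way choice described above could be realized by a small rule-based policy like the following; the battery cutoff, demand threshold, and all names are assumptions, not the patent's actual optimization:

```python
def choose_idle_task(vehicle, demand_by_area, local_area, high_demand_threshold=0.7):
    """Choose among the four idle tasks: find a supply station when the
    battery is low, move toward a high-demand area, stay local if local
    demand is high, otherwise head to the best forecast location."""
    if vehicle["battery"] < 0.25:
        return ("find_supply_station", None)
    # Area with the highest forecast demand.
    best_area, best_demand = max(demand_by_area.items(), key=lambda kv: kv[1])
    if best_demand >= high_demand_threshold and best_area != local_area:
        return ("move_to_area", best_area)
    if demand_by_area.get(local_area, 0.0) >= high_demand_threshold:
        return ("stay_local", local_area)   # temporary parking or circle the block
    return ("go_to_location", best_area)
```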
Example 6 - Parking and replenishment station module
In another example, the fleet allocation module or another system may flag an intent to find a parking space or replenishment station, whereupon the parking and replenishment station module finds the best parking space or replenishment station for the vehicle. A database of replenishment stations and parking spaces, created manually or automatically, is located on the server. The replenishment station database includes charger availability and, optionally, a number of available locations. A nearest parking/replenishment station optimization module receives the current location of the vehicle and cooperates with the navigation module to find the parking/replenishment station that the vehicle can reach fastest while meeting the vehicle's needs. A crowd-sourced parking location database may receive reports from each of the vehicles to form a parking database that includes parking availability throughout the city.
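The nearest-station optimization could be sketched as below, with `travel_time` standing in for the navigation module's estimate; the station fields and function name are assumptions:

```python
def best_replenishment_station(stations, travel_time):
    """Pick the replenishment station the vehicle can reach fastest among
    those with an available charger; travel_time maps a station location
    to an estimated travel time."""
    usable = [s for s in stations if s["chargers_available"] > 0]
    return min(usable, key=lambda s: travel_time(s["location"])) if usable else None
```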
Example 7 - Fleet-wide navigation module
In another example, the fleet-wide navigation module receives signals indicating where a vehicle is and where it needs to go, determines where the vehicle should go, and computes the best path. The optimized navigation path calculation module uses city map data and a cost algorithm to balance the fastest path against the safest path for the AV. The cost algorithm may weight conditions adverse to the AV, such as freeways, unprotected turns/U-turns, real-time/historical pedestrian density, real-time/historical cyclist density, road flatness, road grade, road visibility, etc. The fleet-wide navigation module then uses a combination of methods (e.g., a simple weighting between the fastest and safest paths, or a threshold such as the fastest path that achieves a given safety score) to determine the best path for the AV from point A to point B. The fleet-wide navigation module includes a map database with speed limits, types of intersections and turns, number of lanes, and cellular reception. As a crowd-sourced element, all vehicles in the fleet report real-time road conditions back to the server, including traffic volume/speed, temporary traffic pattern modifications (including accidents and construction), numbers of pedestrians and cyclists, road visibility, etc. A real-time/historical database records this crowd-sourced information from all vehicles in the fleet to help predict the fastest and safest paths.
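The "simple weight between the fastest and safest paths" described above can be sketched as Dijkstra's algorithm over a road graph whose edges carry both a travel time and a risk score; the graph shape, weights, and blending parameter are illustrative assumptions:

```python
import heapq
import math

def best_path(graph, start, goal, alpha=0.5):
    """Dijkstra over a road graph whose edges map neighbor -> (time, risk).
    Edge cost blends speed and safety: alpha*time + (1-alpha)*risk, so
    alpha=1.0 yields the fastest path and alpha=0.0 the safest."""
    queue = [(0.0, start, [start])]
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, (time_s, risk) in graph.get(node, {}).items():
            if nxt not in seen:
                edge = alpha * time_s + (1 - alpha) * risk
                heapq.heappush(queue, (cost + edge, nxt, path + [nxt]))
    return math.inf, []  # goal unreachable
```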
Claims (23)
1. A platform for distribution and navigation of a fleet of autonomous or semi-autonomous vehicles over a plurality of paths, the platform comprising:
a) the fleet of vehicles, comprising a plurality of autonomous or semi-autonomous vehicles, wherein each autonomous or semi-autonomous vehicle comprises:
(i) an autonomous or semi-autonomous propulsion system;
(ii) a position sensor configured to measure a current vehicle location of the vehicle;
(iii) a condition sensor configured to measure a current vehicle state; and
(iv) a communication device configured to transmit the current vehicle location and the current vehicle state;
b) a server processor configured to provide a server application, the server application comprising:
(i) a database comprising a map of the plurality of paths, wherein each path is associated with path parameters including an autonomous driving safety factor and a speed coefficient;
(ii) a communication module that receives the current vehicle location and the current vehicle state from the communication device;
(iii) a scheduling module that assigns one or more of the plurality of autonomous or semi-autonomous vehicles to a task destination based at least on the current vehicle location and the current vehicle state; and
(iv) a navigation module that applies a route calculation algorithm to determine a vehicle mission route from the current vehicle location to the task destination based at least on the path parameters and the current vehicle state, wherein the vehicle mission route includes at least a portion of one of the plurality of paths;
wherein the communication device further directs the autonomous or semi-autonomous propulsion system of the autonomous or semi-autonomous vehicle based on the vehicle mission route.
2. The platform of claim 1, wherein the server application further comprises a demand database comprising historical demand data associated with a geographic area, and wherein the geographic area includes at least the task destination.
3. The platform of claim 2, wherein the server application further comprises a demand prediction module that applies a prediction algorithm to determine a predicted demand schedule for each of the autonomous or semi-autonomous vehicles based at least on the historical demand data, wherein the predicted demand schedule includes a predicted demand time period and a predicted demand mission location in the geographic area.
4. The platform of claim 3, wherein the server application further comprises a transitional repositioning module that assigns a transitional repositioning mode to each of the plurality of autonomous or semi-autonomous vehicles based at least on one or more of the predicted demand mission location, the predicted demand time period, the task destination, and the current vehicle state.
5. The platform of claim 4, wherein the transitional repositioning mode includes a replenishment station mode corresponding to a replenishment station location, a parking mode associated with one of a plurality of parking space locations, and a tour mode associated with a set threshold tour distance from the task destination or the predicted demand mission location.
6. The platform of claim 5, wherein the database further comprises the plurality of parking spot locations in the geographic area.
7. The platform of claim 6, wherein the application further comprises a parking allocation module that determines a selected parking space location for one or more of the plurality of autonomous or semi-autonomous vehicles based on the plurality of parking space locations, the parking mode, and at least one of the task destination and the predicted demand mission location.
8. The platform of claim 7, wherein the navigation module further applies the route calculation algorithm to determine a vehicle repositioning route from the task destination to:
a) the replenishment station location, the determination based on the replenishment station mode;
b) the selected parking space location, the determination based on the parking mode; or
c) a vehicle tour route, the determination based on the tour mode.
9. The platform of claim 8, wherein the vehicle tour route includes at least a portion of one of the plurality of paths within the set threshold tour distance from the task destination or the predicted demand mission location.
10. The platform of claim 8, wherein the communication device further directs the autonomous or semi-autonomous propulsion system of the autonomous or semi-autonomous vehicle to remain at the replenishment station location, at the selected parking space location, or within the vehicle tour route for the predicted demand time period.
11. The platform of claim 1, wherein the route calculation algorithm comprises a machine learning algorithm, a rule-based algorithm, or both.
12. The platform of claim 3, wherein the predictive algorithm comprises a machine learning algorithm, a rule-based algorithm, or both.
13. The platform of claim 1, wherein the current vehicle state comprises a vehicle power level, a vehicle reserve, a vehicle hardware state, or any combination thereof.
14. The platform of claim 1, wherein at least one of the speed coefficient and the autonomous driving safety factor comprises: a speed limit, an average speed, a time-dependent average speed, a number of intersections, a number of turns, a type of turn, an accident indicator, a stopped vehicle indicator, a number of lanes, a one-way street indicator, a cellular reception parameter, a road grade, a maximum road grade, an average pedestrian density, a maximum pedestrian density, a minimum pedestrian density, a time-dependent pedestrian density, an average rider density, an unprotected turn parameter, a road flatness parameter, a road visibility parameter, or any combination thereof.
15. The platform of claim 1, wherein the autonomous or semi-autonomous vehicle further comprises a sensor configured to measure sensed data.
16. The platform of claim 15, wherein the database further stores at least one of the current vehicle location, the current vehicle state, and the sensed data.
17. The platform of claim 16, wherein at least one of the autonomous driving safety factor and the speed coefficient is based on the sensed data.
18. The platform of claim 15, wherein the sensed data enables crowd-sourced determination of the autonomous driving safety factor and the speed coefficient.
19. The platform of claim 18, wherein the application further comprises a path parameter prediction module that predicts future path parameters based at least on the sensed data.
20. The platform of claim 19, wherein the route calculation algorithm determines the vehicle mission route further based on the predicted future path parameters.
21. The platform of claim 5, wherein the autonomous or semi-autonomous vehicle further comprises a sensor configured to measure sensed data, and wherein the sensed data corresponds to a parking space status of one or more of the plurality of parking space locations in the geographic area.
22. The platform of claim 21, wherein the parking allocation module further determines the selected parking space location based on the parking space status.
23. The platform of claim 6, wherein the server application further comprises a display module that displays at least one of the current vehicle location, the current vehicle state, the task destination, the path parameters, the vehicle mission route, the selected parking space location, and the predicted demand mission location.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201762534929P | 2017-07-20 | 2017-07-20 | |
US62/534,929 | 2017-07-20 | ||
PCT/US2018/042967 WO2019018695A1 (en) | 2017-07-20 | 2018-07-19 | Autonomous vehicle repositioning |
Publications (1)
Publication Number | Publication Date |
---|---|
CN110914779A true CN110914779A (en) | 2020-03-24 |
Family
ID=65016424
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201880047506.4A Pending CN110914779A (en) | 2017-07-20 | 2018-07-19 | Autonomous vehicle repositioning |
Country Status (6)
Country | Link |
---|---|
US (3) | US11449050B2 (en) |
EP (1) | EP3655836A4 (en) |
JP (1) | JP2020527805A (en) |
CN (1) | CN110914779A (en) |
CA (1) | CA3070186A1 (en) |
WO (1) | WO2019018695A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112445923A (en) * | 2021-02-01 | 2021-03-05 | 智道网联科技(北京)有限公司 | Data processing method and device based on intelligent traffic and storage medium |
Families Citing this family (78)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2017091894A1 (en) * | 2015-12-01 | 2017-06-08 | Genetec Inc. | Peer-to-peer virtual chalking |
JP6864006B2 (en) * | 2015-12-21 | 2021-04-21 | バイエリシエ・モトーレンウエルケ・アクチエンゲゼルシヤフト | How to modify control equipment related to automobile safety and / or security and related equipment |
US11265284B2 (en) * | 2016-03-18 | 2022-03-01 | Westinghouse Air Brake Technologies Corporation | Communication status system and method |
US11176500B2 (en) | 2016-08-16 | 2021-11-16 | Teleport Mobility, Inc. | Interactive real time system and real time method of use thereof in conveyance industry segments |
US11087252B2 (en) | 2016-08-16 | 2021-08-10 | Teleport Mobility, Inc. | Interactive real time system and real time method of use thereof in conveyance industry segments |
US11182709B2 (en) | 2016-08-16 | 2021-11-23 | Teleport Mobility, Inc. | Interactive real time system and real time method of use thereof in conveyance industry segments |
GB2554875B (en) * | 2016-10-10 | 2018-11-07 | Ford Global Tech Llc | Improvements in or relating to manual transmission vehicles |
US10399106B2 (en) * | 2017-01-19 | 2019-09-03 | Ford Global Technologies, Llc | Camera and washer spray diagnostic |
CA3070186A1 (en) | 2017-07-20 | 2019-01-24 | Nuro, Inc. | Autonomous vehicle repositioning |
US11009868B2 (en) | 2017-07-20 | 2021-05-18 | Nuro, Inc. | Fleet of autonomous vehicles with lane positioning and platooning behaviors |
US10403133B1 (en) * | 2017-07-27 | 2019-09-03 | State Farm Mutual Automobile Insurance Company | Vehicle roadway traffic density management systems for optimizing vehicle spacing |
WO2019023521A1 (en) * | 2017-07-28 | 2019-01-31 | Nuro, Inc. | Automated retail store on autonomous or semi-autonomous vehicle |
US11126191B2 (en) | 2017-08-07 | 2021-09-21 | Panasonic Intellectual Property Corporation Of America | Control device and control method |
US10636298B2 (en) * | 2017-08-11 | 2020-04-28 | Cubic Corporation | Adaptive traffic control using object tracking and identity details |
WO2019036425A1 (en) * | 2017-08-17 | 2019-02-21 | Walmart Apollo, Llc | Systems and methods for delivery of commercial items |
US11300963B1 (en) | 2017-08-18 | 2022-04-12 | Amazon Technologies, Inc. | Robot movement constraint system |
US11422565B1 (en) * | 2017-08-18 | 2022-08-23 | Amazon Technologies, Inc. | System for robot movement that is informed by cultural conventions |
US10807591B1 (en) * | 2017-11-02 | 2020-10-20 | Zoox, Inc. | Vehicle disaster detection and response |
US10824862B2 (en) | 2017-11-14 | 2020-11-03 | Nuro, Inc. | Three-dimensional object detection for autonomous robotic systems using image proposals |
JP7040936B2 (en) * | 2017-12-26 | 2022-03-23 | 株式会社ゼンリンデータコム | Information gathering system and information gathering device |
US10821973B2 (en) * | 2018-01-05 | 2020-11-03 | Telenav, Inc. | Navigation system with parking facility navigation mechanism and method of operation thereof |
MX2020007114A (en) * | 2018-01-10 | 2020-12-09 | Simbe Robotics Inc | Method for detecting and responding to spills and hazards. |
US10836379B2 (en) * | 2018-03-23 | 2020-11-17 | Sf Motors, Inc. | Multi-network-based path generation for vehicle parking |
FR3082984B1 (en) * | 2018-06-26 | 2021-05-21 | Transdev Group | ELECTRONIC DEVICE AND METHOD FOR MONITORING A SET OF AUTONOMOUS MOTOR VEHICLES, ASSOCIATED TRANSPORT SYSTEM AND COMPUTER PROGRAM |
US10757596B2 (en) * | 2018-06-29 | 2020-08-25 | Intel Corporation | Methods and apparatus to collect data from user equipment outside a network |
US10778943B2 (en) | 2018-07-17 | 2020-09-15 | C-Tonomy, LLC | Autonomous surveillance duo |
US11537954B2 (en) * | 2018-09-04 | 2022-12-27 | Beijing Didi Infinity Technology And Development Co., Ltd. | System and method for ride order dispatching and vehicle repositioning |
US11312331B2 (en) * | 2018-11-13 | 2022-04-26 | Carrier Corporation | System and method for providing temporary access to a vehicle |
US11004449B2 (en) * | 2018-11-29 | 2021-05-11 | International Business Machines Corporation | Vocal utterance based item inventory actions |
US10991251B2 (en) * | 2019-01-29 | 2021-04-27 | Toyota Motor Engineering & Manufacturing North America, Inc. | Parking meter monitoring and payment system |
JP7304535B2 (en) * | 2019-02-14 | 2023-07-07 | パナソニックIpマネジメント株式会社 | VEHICLE CONTROL DEVICE, VEHICLE CONTROL METHOD AND PROGRAM |
US11899448B2 (en) * | 2019-02-21 | 2024-02-13 | GM Global Technology Operations LLC | Autonomous vehicle that is configured to identify a travel characteristic based upon a gesture |
US11084387B2 (en) * | 2019-02-25 | 2021-08-10 | Toyota Research Institute, Inc. | Systems, methods, and storage media for arranging a plurality of cells in a vehicle battery pack |
CN112997226A (en) * | 2019-03-28 | 2021-06-18 | 松下电器(美国)知识产权公司 | Information processing method and information processing system |
US11313688B2 (en) * | 2019-04-10 | 2022-04-26 | Waymo Llc | Advanced trip planning for autonomous vehicle services |
US10623734B1 (en) * | 2019-04-23 | 2020-04-14 | Nuro, Inc. | Systems and methods for adaptive mobile telecommunications for autonomous vehicles |
TWI684085B (en) * | 2019-04-24 | 2020-02-01 | 揚昇育樂事業股份有限公司 | Self-driving travel path central controlling device of self-driving car |
EP3736753A1 (en) * | 2019-05-08 | 2020-11-11 | Siemens Mobility GmbH | Method, fleet management device and system for determining a preferred location of a vehicle |
US11386778B2 (en) | 2019-05-17 | 2022-07-12 | sibrtech inc. | Road user detecting and communication device and method |
US10665109B1 (en) | 2019-05-17 | 2020-05-26 | sibrtech inc. | Construction zone apparatus and method |
US11275376B2 (en) | 2019-06-20 | 2022-03-15 | Florida Power & Light Company | Large scale unmanned monitoring device assessment of utility system components |
US11378949B2 (en) * | 2019-06-20 | 2022-07-05 | Omnitracs, Llc | Systems and methods for autonomously controlling movement of delivery vehicles |
CN110244742B (en) * | 2019-07-01 | 2023-06-09 | 阿波罗智能技术(北京)有限公司 | Method, apparatus and storage medium for unmanned vehicle tour |
US11095741B2 (en) * | 2019-07-11 | 2021-08-17 | Ghost Locomotion Inc. | Value-based transmission in an autonomous vehicle |
JP7445882B2 (en) * | 2019-08-06 | 2024-03-08 | パナソニックIpマネジメント株式会社 | Driving support method, road photographic image collection method, and roadside device |
EP3772729B1 (en) * | 2019-08-08 | 2022-08-31 | Ningbo Geely Automobile Research & Development Co. Ltd. | A method for preconditioning vehicles |
US11964627B2 (en) | 2019-09-30 | 2024-04-23 | Nuro, Inc. | Methods and apparatus for supporting compartment inserts in autonomous delivery vehicles |
KR20210043065A (en) * | 2019-10-10 | 2021-04-21 | 현대모비스 주식회사 | Apparatus and method For Warning a signal violation vehicle at intersection |
WO2021077205A1 (en) * | 2019-10-26 | 2021-04-29 | Genetec Inc. | Automated license plate recognition system and related method |
US11590862B2 (en) | 2019-12-12 | 2023-02-28 | Lear Corporation | Seating system |
US11587001B2 (en) * | 2020-01-15 | 2023-02-21 | International Business Machines Corporation | Rebalancing autonomous vehicles according to last-mile delivery demand |
WO2021194747A1 (en) | 2020-03-23 | 2021-09-30 | Nuro, Inc. | Methods and apparatus for automated deliveries |
US11904854B2 (en) * | 2020-03-30 | 2024-02-20 | Toyota Research Institute, Inc. | Systems and methods for modeling pedestrian activity |
US20210343091A1 (en) * | 2020-04-29 | 2021-11-04 | Teleo, Inc. | Deported compute for teleoperation and autonomous systems |
US20210358066A1 (en) * | 2020-05-17 | 2021-11-18 | Ahmad Abusaad | Intelligent Traffic Violation Detection System |
USD935350S1 (en) * | 2020-05-21 | 2021-11-09 | Nuro, Inc. | Autonomous vehicle |
JP7068386B2 (en) * | 2020-06-10 | 2022-05-16 | ソフトバンク株式会社 | Management equipment, programs, systems, and methods |
US11797019B2 (en) * | 2020-07-20 | 2023-10-24 | Ford Global Technologies, Llc | Rugged terrain vehicle design and route optimization |
US20220063661A1 (en) * | 2020-08-25 | 2022-03-03 | Gm Cruise Holdings Llc | Blockage routing and maneuver arbitration |
CL2021002230A1 (en) * | 2020-08-27 | 2022-04-18 | Tech Resources Pty Ltd | Method and Apparatus for Coordinating Multiple Cooperative Vehicle Paths on Shared Highway Networks |
JP2022053313A (en) * | 2020-09-24 | 2022-04-05 | いすゞ自動車株式会社 | Dispatch vehicle control device, and dispatch vehicle control method |
USD965471S1 (en) | 2020-10-01 | 2022-10-04 | Nuro, Inc. | Autonomous vehicle |
US11797896B2 (en) | 2020-11-30 | 2023-10-24 | At&T Intellectual Property I, L.P. | Autonomous aerial vehicle assisted viewing location selection for event venue |
US11443518B2 (en) | 2020-11-30 | 2022-09-13 | At&T Intellectual Property I, L.P. | Uncrewed aerial vehicle shared environment privacy and security |
US11726475B2 (en) | 2020-11-30 | 2023-08-15 | At&T Intellectual Property I, L.P. | Autonomous aerial vehicle airspace claiming and announcing |
CN112562351B (en) * | 2020-12-03 | 2022-07-19 | 江西台德智慧科技有限公司 | Intelligent glasses based on intelligent recognition |
US11537197B2 (en) | 2020-12-03 | 2022-12-27 | Lear Corporation | Vehicle system for providing access to removable vehicle components |
CN112612788B (en) * | 2020-12-11 | 2024-03-01 | 中国北方车辆研究所 | Autonomous positioning method under navigation-free satellite signal |
US20220188953A1 (en) | 2020-12-15 | 2022-06-16 | Selex Es Inc. | Sytems and methods for electronic signature tracking |
US11644335B2 (en) * | 2021-03-26 | 2023-05-09 | Gm Cruise Holdings Llc | Service area coverage for autonomous vehicle fleets |
US11892303B2 (en) * | 2021-05-26 | 2024-02-06 | Here Global B.V. | Apparatus and methods for predicting state of visibility for a road object |
USD985400S1 (en) | 2021-08-31 | 2023-05-09 | Nuro, Inc. | Sensor arch for vehicle |
USD1003194S1 (en) | 2021-08-31 | 2023-10-31 | Nuro, Inc. | Autonomous vehicle |
US20230194304A1 (en) * | 2021-12-21 | 2023-06-22 | Micron Technology, Inc. | Map update using images |
USD1014398S1 (en) * | 2021-12-23 | 2024-02-13 | Waymo Llc | Vehicle |
FR3132233A1 (en) * | 2022-02-03 | 2023-08-04 | Psa Automobiles Sa | Method and device for controlling mobile air purification devices |
GB2619044A (en) * | 2022-05-25 | 2023-11-29 | Bae Systems Plc | Controlling an aquatic vessel |
EP4312197A1 (en) * | 2022-07-27 | 2024-01-31 | Bayerische Motoren Werke Aktiengesellschaft | Vehicle, apparatus, computer program, and method for monitoring an environment of a vehicle |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000030195A (en) * | 1998-07-10 | 2000-01-28 | Honda Motor Co Ltd | Car allocation system |
CN1804551A (en) * | 2005-01-14 | 2006-07-19 | 阿尔卡特公司 | Navigation service |
WO2014024254A1 (en) * | 2012-08-07 | 2014-02-13 | 株式会社日立製作所 | Use-assisting tool for autonomous traveling device, operation management center, operation system, and autonomous traveling device |
WO2016209595A1 (en) * | 2015-06-22 | 2016-12-29 | Google Inc. | Determining pickup and destination locations for autonomous vehicles |
US20170116477A1 (en) * | 2015-10-23 | 2017-04-27 | Nokia Technologies Oy | Integration of positional data and overhead images for lane identification |
US20170124781A1 (en) * | 2015-11-04 | 2017-05-04 | Zoox, Inc. | Calibration for autonomous vehicle operation |
CN113253692A (en) * | 2021-06-21 | 2021-08-13 | 浙江华睿科技有限公司 | Tour method, tour device, tour equipment and readable storage medium for AGV |
Family Cites Families (191)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US1012038A (en) | 1909-12-16 | 1911-12-19 | Sears Roebuck & Co | Strip-serving device. |
US1002978A (en) | 1911-05-27 | 1911-09-12 | Theodore C Fedders | Radiator. |
US3064252A (en) | 1952-03-31 | 1962-11-13 | Arthur A Varela | Height finding radar system |
US4766548A (en) | 1987-01-02 | 1988-08-23 | Pepsico Inc. | Telelink monitoring and reporting system |
US4952911A (en) | 1988-05-18 | 1990-08-28 | Eastman Kodak Company | Scanning intrusion detection device |
JPH02155067A (en) | 1988-12-07 | 1990-06-14 | Hitachi Ltd | Method for warning inventory and system using such method |
US5207784A (en) | 1989-03-09 | 1993-05-04 | Wilbur Schwartzendruber | Vending machine with monitoring system |
JPH036407A (en) | 1989-06-03 | 1991-01-11 | Daido Steel Co Ltd | Measuring device for shape of outer periphery |
DE4025333A1 (en) | 1990-08-10 | 1992-02-13 | Cohausz Helge B | FRIDGE |
US5636122A (en) | 1992-10-16 | 1997-06-03 | Mobile Information Systems, Inc. | Method and apparatus for tracking vehicle location and computer aided dispatch |
NO941202L (en) | 1994-03-30 | 1995-10-02 | Oeystein Konsmo | Method of monitoring and generating messages as well as equipment using the method |
DE69633524T2 (en) | 1995-04-12 | 2005-03-03 | Matsushita Electric Industrial Co., Ltd., Kadoma | Method and device for object detection |
US5922040A (en) | 1995-05-17 | 1999-07-13 | Mobile Information System, Inc. | Method and apparatus for fleet management |
US6075441A (en) | 1996-09-05 | 2000-06-13 | Key-Trak, Inc. | Inventoriable-object control and tracking system |
WO1997024701A1 (en) | 1995-12-27 | 1997-07-10 | Sanyo Electric Co., Ltd. | Sales management method in automatic vending machine |
US6181981B1 (en) | 1996-05-15 | 2001-01-30 | Marconi Communications Limited | Apparatus and method for improved vending machine inventory maintenance |
US6034803A (en) | 1997-04-30 | 2000-03-07 | K2 T, Inc. | Method and apparatus for directing energy based range detection sensor |
US6230150B1 (en) | 1997-10-09 | 2001-05-08 | Walker Digital, Llc | Vending machine evaluation network |
USD411814S (en) | 1998-03-11 | 1999-07-06 | Honda Giken Kogyo Kabushiki Kaisha | Automobile |
US20030209375A1 (en) | 1999-01-25 | 2003-11-13 | Zip Charge Corporation | Electrical vehicle energy supply system, electrical vehicle battery, electrical vehicle battery charging apparatus, battery supply apparatus, and electrical vehicle battery management system |
WO2000058891A1 (en) | 1999-03-26 | 2000-10-05 | The Retail Pipeline Integration Group, Inc. | Method and system for determining time-phased sales forecasts and projected replenishment shipments in a supply chain |
US7177825B1 (en) | 1999-05-11 | 2007-02-13 | Borders Louis H | Integrated system for ordering, fulfillment, and delivery of consumer products using a data network |
US6323941B1 (en) | 1999-08-06 | 2001-11-27 | Lockheed Martin Corporation | Sensor assembly for imaging passive infrared and active LADAR and method for same |
US7783508B2 (en) | 1999-09-20 | 2010-08-24 | Numerex Corp. | Method and system for refining vending operations based on wireless data |
US6212473B1 (en) * | 1999-09-20 | 2001-04-03 | Ford Global Technologies, Inc. | Vehicle navigation system having inferred user preferences |
US6636598B1 (en) | 2000-01-24 | 2003-10-21 | Avaya Technology Corp. | Automated transaction distribution system and method implementing transaction distribution to unavailable agents |
JP4370660B2 (en) | 2000-03-09 | 2009-11-25 | 株式会社Ihi | Fire monitoring system |
JP2001344640A (en) | 2000-03-29 | 2001-12-14 | Sanyo Electric Co Ltd | Automatic vending machine managing method and automatic vending machine |
US20110130134A1 (en) | 2000-04-19 | 2011-06-02 | Van Rysselberghe Pierre C | Security systems |
AU2001282850A1 (en) | 2000-04-26 | 2001-11-07 | Arete Associates | Very fast time resolved imaging in multiparameter measurement space |
US7139721B2 (en) | 2000-05-10 | 2006-11-21 | Borders Louis H | Scheduling delivery of products via the internet |
US6490995B2 (en) | 2000-08-28 | 2002-12-10 | George Jefferson Greene, Jr. | Air-conditioned animal transporter |
FR2817339B1 (en) | 2000-11-24 | 2004-05-14 | Mensi | THREE-DIMENSIONAL LIFTING DEVICE OF A LASER EMISSION SCENE |
JP3748790B2 (en) | 2001-06-12 | 2006-02-22 | 喜久雄 金子 | Courier delivery system and method thereof |
US7190465B2 (en) | 2001-08-30 | 2007-03-13 | Z + F Zoller & Froehlich Gmbh | Laser measurement system |
US7436887B2 (en) * | 2002-02-06 | 2008-10-14 | Playtex Products, Inc. | Method and apparatus for video frame sequence-based object tracking |
US7051539B2 (en) | 2002-12-30 | 2006-05-30 | Whirlpool Corporation | Convertible refrigerator-freezer |
GB2398841A (en) | 2003-02-28 | 2004-09-01 | Qinetiq Ltd | Wind turbine control having a Lidar wind speed measurement apparatus |
CA2531849A1 (en) | 2003-07-11 | 2005-01-27 | Rf Code, Inc. | System, method and computer program product for monitoring inventory |
US20100013615A1 (en) * | 2004-03-31 | 2010-01-21 | Carnegie Mellon University | Obstacle detection having enhanced classification |
US7798885B2 (en) | 2004-08-04 | 2010-09-21 | Mattel, Inc. | Instant message toy phone |
US20080103851A1 (en) | 2004-09-27 | 2008-05-01 | Jay S Walker | Products and Processes for Determining Allocation of Inventory for a Vending Machine |
US20060106490A1 (en) | 2004-11-15 | 2006-05-18 | Admmicro, Llc | Vending machine monitoring system |
JP4171728B2 (en) | 2004-12-24 | 2008-10-29 | パルステック工業株式会社 | 3D shape measuring device |
CA2613906A1 (en) | 2004-12-29 | 2006-07-06 | Bernard Trest | Dynamic information system |
JP4631761B2 (en) | 2005-08-08 | 2011-02-16 | トヨタ自動車株式会社 | Battery life prediction device and battery life warning device for powertrain |
US9036028B2 (en) | 2005-09-02 | 2015-05-19 | Sensormatic Electronics, LLC | Object tracking and alerts |
CA2625895C (en) | 2005-10-14 | 2016-05-17 | Aethon, Inc. | Robotic ordering and delivery apparatuses, systems and methods |
US8191779B2 (en) | 2005-10-27 | 2012-06-05 | Avt, Inc. | Wireless management of remote vending machines |
US7944548B2 (en) | 2006-03-07 | 2011-05-17 | Leica Geosystems Ag | Increasing measurement rate in time of flight measurement apparatuses |
US9373149B2 (en) | 2006-03-17 | 2016-06-21 | Fatdoor, Inc. | Autonomous neighborhood vehicle commerce network and community |
US8072581B1 (en) | 2007-01-19 | 2011-12-06 | Rockwell Collins, Inc. | Laser range finding system using variable field of illumination flash lidar |
US20100228405A1 (en) | 2007-06-13 | 2010-09-09 | Intrago Corporation | Shared vehicle management system |
DE102008031682A1 (en) | 2008-07-04 | 2010-03-11 | Eads Deutschland Gmbh | Direct Receive Doppler LIDAR Method and Direct Receive Doppler LIDAR Device |
US9026315B2 (en) * | 2010-10-13 | 2015-05-05 | Deere & Company | Apparatus for machine coordination which maintains line-of-site contact |
USD615905S1 (en) | 2008-09-22 | 2010-05-18 | Chrysler Group Llc | Automobile body |
US9147192B2 (en) | 2009-01-14 | 2015-09-29 | International Business Machines Corporation | Targeted vehicle advertising and entertainment system method |
US8120488B2 (en) | 2009-02-27 | 2012-02-21 | Rf Controls, Llc | Radio frequency environment object monitoring system and methods of use |
US20100301056A1 (en) | 2009-05-27 | 2010-12-02 | Sharon Wolfe | Portable Compartmentalized Thermal Food Container |
JP5560788B2 (en) | 2009-06-26 | 2014-07-30 | 日産自動車株式会社 | Information provision device |
US8326707B2 (en) | 2009-10-08 | 2012-12-04 | At&T Intellectual Property I, Lp | Methods and systems for providing wireless enabled inventory peering |
US8788341B1 (en) | 2010-04-27 | 2014-07-22 | VendScreen, Inc. | Vending machine systems using standard inventory control system components |
US8636208B2 (en) | 2010-06-22 | 2014-01-28 | White Systems, Inc. | Mobile retail store structure with inventory system |
US8063797B1 (en) * | 2010-07-31 | 2011-11-22 | ParkMe LLC | Parking information collection system and method |
WO2012027730A1 (en) | 2010-08-26 | 2012-03-01 | Humberto Enrique Roa | Location aware mobile marketplace application and system |
US8831826B2 (en) | 2011-11-16 | 2014-09-09 | Flextronics Ap, Llc | Gesture recognition for on-board display |
US8799037B2 (en) * | 2010-10-14 | 2014-08-05 | Palo Alto Research Center Incorporated | Computer-implemented system and method for managing motor vehicle parking reservations |
US9542662B2 (en) | 2010-12-30 | 2017-01-10 | Sap Se | Lineage information for streaming event data and event lineage graph structures for visualization |
US9111440B2 (en) | 2011-01-06 | 2015-08-18 | Lg Electronics Inc. | Refrigerator and remote controller |
US8630897B1 (en) | 2011-01-11 | 2014-01-14 | Google Inc. | Transportation-aware physical advertising conversions |
US20120185130A1 (en) | 2011-01-18 | 2012-07-19 | Ekchian Gregory J | Vehicle lighting |
US10217160B2 (en) | 2012-04-22 | 2019-02-26 | Emerging Automotive, Llc | Methods and systems for processing charge availability and route paths for obtaining charge for electric vehicles |
US9123035B2 (en) | 2011-04-22 | 2015-09-01 | Angel A. Penilla | Electric vehicle (EV) range extending charge systems, distributed networks of charge kiosks, and charge locating mobile apps |
US9493130B2 (en) * | 2011-04-22 | 2016-11-15 | Angel A. Penilla | Methods and systems for communicating content to connected vehicle users based detected tone/mood in voice input |
US20120280836A1 (en) * | 2011-05-04 | 2012-11-08 | Federal Signal Corporation | Vehicle Detection System with RFID-Based Location Determination |
US20140209634A1 (en) | 2011-06-21 | 2014-07-31 | Smart Bar International LLC | Automatic beverage dispenser beverage cart |
TWI618020B (en) | 2011-07-26 | 2018-03-11 | 睿能創意公司 | Method,system and computer-readable medium for providing locations of power storage device collection,charging and distribution machines |
WO2013025803A1 (en) | 2011-08-17 | 2013-02-21 | Eyal Shlomot | Smart electronic roadside billboard |
US9037852B2 (en) * | 2011-09-02 | 2015-05-19 | Ivsc Ip Llc | System and method for independent control of for-hire vehicles |
WO2013072926A2 (en) | 2011-09-19 | 2013-05-23 | Tata Consultancy Services Limited | A computing platform for development and deployment of sensor-driven vehicle telemetry applications and services |
FR2984254B1 (en) * | 2011-12-16 | 2016-07-01 | Renault Sa | CONTROL OF AUTONOMOUS VEHICLES |
US9417199B2 (en) * | 2012-01-17 | 2016-08-16 | Triune Systems, LLC | Method and system of wireless power transfer foreign object detection |
US20140025230A1 (en) | 2012-07-17 | 2014-01-23 | Elwha LLC, a limited liability company of the State of Delaware | Unmanned device interaction methods and systems |
US9230236B2 (en) * | 2012-08-07 | 2016-01-05 | Daniel Judge Villamar | Automated delivery vehicle, systems and methods for automated delivery |
US9383753B1 (en) | 2012-09-26 | 2016-07-05 | Google Inc. | Wide-view LIDAR with areas of special attention |
US9086273B1 (en) | 2013-03-08 | 2015-07-21 | Google Inc. | Microrod compression of laser beam in combination with transmit lens |
US9349238B2 (en) | 2013-03-13 | 2016-05-24 | Pantry Retail, Inc. | Vending kit and method |
US9152870B2 (en) * | 2013-03-15 | 2015-10-06 | Sri International | Computer vision as a service |
WO2014147510A1 (en) | 2013-03-18 | 2014-09-25 | Koninklijke Philips N.V. | Methods and apparatus for information management and control of outdoor lighting networks |
US9489490B1 (en) | 2013-04-29 | 2016-11-08 | Daniel Theobald | Mobile robot for receiving, transporting, and/or delivering one or more pharmaceutical items |
US20140330738A1 (en) | 2013-05-01 | 2014-11-06 | Gruppo Due Mondi, Inc. | Optimizing Customer Delivery Services |
CN105103106B (en) | 2013-05-16 | 2019-08-20 | 英特尔公司 | Display area is automatically adjusted to reduce power consumption |
US9307383B1 (en) | 2013-06-12 | 2016-04-05 | Google Inc. | Request apparatus for delivery of medical support implement by UAV |
US9256852B1 (en) * | 2013-07-01 | 2016-02-09 | Google Inc. | Autonomous delivery platform |
US10551851B2 (en) * | 2013-07-01 | 2020-02-04 | Steven Sounyoung Yu | Autonomous unmanned road vehicle for making deliveries |
US8836922B1 (en) | 2013-08-20 | 2014-09-16 | Google Inc. | Devices and methods for a rotating LIDAR platform with a shared transmit/receive path |
US9464902B2 (en) | 2013-09-27 | 2016-10-11 | Regents Of The University Of Minnesota | Symbiotic unmanned aerial vehicle and unmanned surface vehicle system |
WO2015061008A1 (en) | 2013-10-26 | 2015-04-30 | Amazon Technologies, Inc. | Unmanned aerial vehicle delivery system |
CN111114377A (en) | 2013-11-28 | 2020-05-08 | 松下电器(美国)知识产权公司 | Information output method, information presentation device, and information output system |
US9234757B2 (en) * | 2013-11-29 | 2016-01-12 | Fedex Corporate Services, Inc. | Determining node location using a variable power characteristic of a node in a wireless node network |
US9741011B2 (en) | 2013-12-12 | 2017-08-22 | Main Grade Assets, Llc | System for improving efficiencies of on-demand delivery services |
US9684914B1 (en) | 2014-01-17 | 2017-06-20 | Amazon Technologies, Inc. | Techniques for real-time dynamic pricing |
CN106133781B (en) | 2014-02-07 | 2020-08-11 | 可口可乐公司 | System and method for selling goods or services or collecting recycle waste with a mechanized mobile vending machine |
CN106537430A (en) | 2014-04-14 | 2017-03-22 | 热布卡公司 | Systems and methods for vehicle fleet sharing |
US9984525B2 (en) | 2014-04-24 | 2018-05-29 | The Hillman Group, Inc. | Automated vending inventory management apparatuses and method |
US10402665B2 (en) * | 2014-05-14 | 2019-09-03 | Mobileye Vision Technologies, Ltd. | Systems and methods for detecting traffic signs |
USD734211S1 (en) | 2014-05-23 | 2015-07-14 | Google Inc. | Autonomous vehicle exterior |
WO2015191866A1 (en) | 2014-06-13 | 2015-12-17 | Kis-Benedek Pinero Iidiko' | Portable display device and system |
US10163177B2 (en) | 2014-07-31 | 2018-12-25 | Emmett Farris | System and method for controlling drone delivery or pick up during a delivery or pick up phase of drone operation |
US9542664B2 (en) | 2014-09-08 | 2017-01-10 | Inventor-E Limited | Stock monitoring |
US9508204B2 (en) | 2014-10-01 | 2016-11-29 | Continental Intelligent Transportation Systems, LLC | Package exchange and service system using a key fob simulator |
EP3230965A4 (en) | 2014-12-08 | 2018-09-05 | Vendwatch Telematics, LLC | Vending machine route management |
US9824394B1 (en) | 2015-02-06 | 2017-11-21 | Square, Inc. | Payment processor financing of customer purchases |
EP3845427A1 (en) * | 2015-02-10 | 2021-07-07 | Mobileye Vision Technologies Ltd. | Sparse map for autonomous vehicle navigation |
GB2535718A (en) * | 2015-02-24 | 2016-08-31 | Addison Lee Ltd | Resource management |
US9552564B1 (en) | 2015-03-19 | 2017-01-24 | Amazon Technologies, Inc. | Autonomous delivery transportation network |
US10891584B2 (en) * | 2015-04-10 | 2021-01-12 | Smiotex, Inc. | Devices, systems, and methods for storing items |
US10507807B2 (en) * | 2015-04-28 | 2019-12-17 | Mobileye Vision Technologies Ltd. | Systems and methods for causing a vehicle response based on traffic light detection |
US9547309B2 (en) | 2015-05-13 | 2017-01-17 | Uber Technologies, Inc. | Selecting vehicle type for providing transport |
US9711050B2 (en) * | 2015-06-05 | 2017-07-18 | Bao Tran | Smart vehicle |
US9836056B2 (en) | 2015-06-05 | 2017-12-05 | Bao Tran | Smart vehicle |
US20160357187A1 (en) | 2015-06-05 | 2016-12-08 | Arafat M.A. ANSARI | Smart vehicle |
US9786187B1 (en) | 2015-06-09 | 2017-10-10 | Amazon Technologies, Inc. | Transportation network utilizing autonomous vehicles for transporting items |
US10802575B2 (en) | 2015-06-11 | 2020-10-13 | Karma Automotive Llc | Smart external display for vehicles |
DE102015111033A1 (en) | 2015-07-08 | 2017-01-12 | Deutsche Post Ag | Device and method for flexible collection and / or delivery of a shipment |
KR20170010645A (en) | 2015-07-20 | 2017-02-01 | 엘지전자 주식회사 | Autonomous vehicle and autonomous vehicle system including the same |
US20150348112A1 (en) | 2015-08-12 | 2015-12-03 | Madhusoodhan Ramanujam | Providing Advertisements to Autonomous Vehicles |
US10139237B2 (en) | 2015-09-01 | 2018-11-27 | Chris Outwater | Method for remotely identifying one of a passenger and an assigned vehicle to the other |
US9731726B2 (en) | 2015-09-02 | 2017-08-15 | International Business Machines Corporation | Redirecting self-driving vehicles to a product provider based on physiological states of occupants of the self-driving vehicles |
US10082797B2 (en) | 2015-09-16 | 2018-09-25 | Ford Global Technologies, Llc | Vehicle radar perception and localization |
US10139828B2 (en) | 2015-09-24 | 2018-11-27 | Uber Technologies, Inc. | Autonomous vehicle operated with safety augmentation |
JP2019502975A (en) | 2015-10-13 | 2019-01-31 | スターシップ テクノロジーズ オサイヒング | Autonomous or semi-autonomous delivery method and system |
MX2018005330A (en) | 2015-10-30 | 2018-09-05 | Walmart Apollo Llc | Mobile retail systems and methods of distributing and stocking the mobile retail systems. |
WO2017076813A1 (en) | 2015-11-02 | 2017-05-11 | Starship Technologies Oü | System and method for traversing vertical obstacles |
CN108369420B (en) | 2015-11-02 | 2021-11-05 | 星船科技私人有限公司 | Apparatus and method for autonomous positioning |
US9606539B1 (en) * | 2015-11-04 | 2017-03-28 | Zoox, Inc. | Autonomous vehicle fleet service and system |
WO2017079341A2 (en) * | 2015-11-04 | 2017-05-11 | Zoox, Inc. | Automated extraction of semantic information to enhance incremental mapping modifications for robotic vehicles |
US9958864B2 (en) | 2015-11-04 | 2018-05-01 | Zoox, Inc. | Coordination of dispatching and maintaining fleet of autonomous vehicles |
US9632502B1 (en) * | 2015-11-04 | 2017-04-25 | Zoox, Inc. | Machine-learning systems and techniques to optimize teleoperation and/or planner decisions |
US9754490B2 (en) * | 2015-11-04 | 2017-09-05 | Zoox, Inc. | Software application to request and control an autonomous vehicle service |
EP3374947A4 (en) | 2015-11-09 | 2019-03-27 | Simbe Robotics, Inc. | Method for tracking stock level within a store |
US20180336512A1 (en) | 2015-11-20 | 2018-11-22 | Ocado Innovation Limited | Sensor system and method |
JP6540482B2 (en) * | 2015-12-04 | 2019-07-10 | 株式会社デンソー | INFORMATION PROCESSING SYSTEM, INFORMATION PROCESSING DEVICE, AND OUTPUT CONTROL METHOD |
US20170282859A1 (en) | 2015-12-22 | 2017-10-05 | GM Global Technology Operations LLC | On-sale vehicle sharing accessory device and system |
US20170227470A1 (en) | 2016-02-04 | 2017-08-10 | Proxy Technologies, Inc. | Autonomous vehicle, system and method for structural object assessment and manufacture thereof |
US10088846B2 (en) | 2016-03-03 | 2018-10-02 | GM Global Technology Operations LLC | System and method for intended passenger detection |
US10228694B2 (en) * | 2016-03-04 | 2019-03-12 | Animusoft Corporation | Drone and robot control systems and methods |
US9792575B2 (en) | 2016-03-11 | 2017-10-17 | Route4Me, Inc. | Complex dynamic route sequencing for multi-vehicle fleets using traffic and real-world constraints |
CA3017153A1 (en) | 2016-03-14 | 2017-10-19 | Walmart Apollo, Llc | Unmanned aircraft systems and methods to interact with specifically intended objects |
US9488984B1 (en) * | 2016-03-17 | 2016-11-08 | Jeff Williams | Method, device and system for navigation of an autonomous supply chain node vehicle in a storage center using virtual image-code tape |
CN109071015B (en) * | 2016-04-29 | 2021-11-30 | 美国联合包裹服务公司 | Unmanned aerial vehicle picks up and delivers system |
WO2017192868A1 (en) | 2016-05-04 | 2017-11-09 | Wal-Mart Stores, Inc. | Distributed autonomous robot systems and methods |
CN109416873B (en) * | 2016-06-24 | 2022-02-15 | 瑞士再保险有限公司 | Autonomous or partially autonomous motor vehicle with automated risk control system and corresponding method |
US10029787B1 (en) | 2016-06-30 | 2018-07-24 | X Development Llc | Interactive transport services provided by unmanned aerial vehicles |
US10166976B2 (en) | 2016-07-22 | 2019-01-01 | International Business Machines Corporation | Connection of an autonomous vehicle with a second vehicle to receive goods |
US10216188B2 (en) | 2016-07-25 | 2019-02-26 | Amazon Technologies, Inc. | Autonomous ground vehicles based at delivery locations |
US10062288B2 (en) | 2016-07-29 | 2018-08-28 | GM Global Technology Operations LLC | Systems and methods for autonomous driving merging management |
US11176500B2 (en) | 2016-08-16 | 2021-11-16 | Teleport Mobility, Inc. | Interactive real time system and real time method of use thereof in conveyance industry segments |
SG10201606948XA (en) | 2016-08-19 | 2018-03-28 | Mastercard Asia Pacific Pte Ltd | Item delivery management systems and methods |
US10275975B2 (en) | 2016-08-19 | 2019-04-30 | Walmart Apollo, Llc | Apparatus and method for mobile vending |
US11429917B2 (en) | 2016-09-02 | 2022-08-30 | Home Valet, Inc. | System and method for robotic delivery |
US20180101818A1 (en) | 2016-09-03 | 2018-04-12 | John Simms | System and Method for Delivery of Goods Incentivized by Rewards Program |
US10643256B2 (en) | 2016-09-16 | 2020-05-05 | International Business Machines Corporation | Configuring a self-driving vehicle for charitable donations pickup and delivery |
US9815633B1 (en) | 2016-09-19 | 2017-11-14 | Amazon Technologies, Inc. | Automated fulfillment of unmanned aerial vehicles |
US9905133B1 (en) * | 2016-09-30 | 2018-02-27 | Allstate Insurance Company | Controlling autonomous vehicles to provide automated emergency response functions |
US10046688B2 (en) | 2016-10-06 | 2018-08-14 | Ford Global Technologies, Llc | Vehicle containing sales bins |
CA3043473A1 (en) | 2016-11-16 | 2018-05-24 | Walmart Apollo, Llc | Climate controlled container for vehicle |
US10699305B2 (en) * | 2016-11-21 | 2020-06-30 | Nio Usa, Inc. | Smart refill assistant for electric vehicles |
US10762441B2 (en) | 2016-12-01 | 2020-09-01 | Uber Technologies, Inc. | Predicting user state using machine learning |
US10223829B2 (en) * | 2016-12-01 | 2019-03-05 | Here Global B.V. | Method and apparatus for generating a cleaned object model for an object in a mapping database |
US9741010B1 (en) | 2016-12-02 | 2017-08-22 | Starship Technologies Oü | System and method for securely delivering packages to different delivery recipients with a single vehicle |
US20180158018A1 (en) | 2016-12-05 | 2018-06-07 | United States Postal Service | Systems for autonomous item delivery |
US11068949B2 (en) | 2016-12-09 | 2021-07-20 | 365 Retail Markets, Llc | Distributed and automated transaction systems |
GB2572875A (en) | 2016-12-12 | 2019-10-16 | Walmart Apollo Llc | Systems and methods for delivering products via autonomous ground vehicles to vehicles designated by customers |
US10949792B2 (en) | 2016-12-30 | 2021-03-16 | United States Postal Service | System and method for delivering items using autonomous vehicles and receptacle targets |
US20180196417A1 (en) | 2017-01-09 | 2018-07-12 | nuTonomy Inc. | Location Signaling with Respect to an Autonomous Vehicle and a Rider |
US10740863B2 (en) | 2017-01-09 | 2020-08-11 | nuTonomy Inc. | Location signaling with respect to an autonomous vehicle and a rider |
US10168167B2 (en) | 2017-01-25 | 2019-01-01 | Via Transportation, Inc. | Purposefully selecting longer routes to improve user satisfaction |
US9919704B1 (en) * | 2017-01-27 | 2018-03-20 | International Business Machines Corporation | Parking for self-driving car |
WO2018156292A1 (en) | 2017-02-24 | 2018-08-30 | Walmart Apollo, Llc | Systems and methods for delivering products via unmanned mobile lockers |
US20180260780A1 (en) | 2017-03-08 | 2018-09-13 | Wheely's Café International AB | Movable hub distribution system |
US20180260778A1 (en) | 2017-03-08 | 2018-09-13 | Wheely's Café International AB | Self driving automated vending vehicle |
US11023803B2 (en) * | 2017-04-10 | 2021-06-01 | Intel Corporation | Abstraction library to enable scalable distributed machine learning |
US11094029B2 (en) * | 2017-04-10 | 2021-08-17 | Intel Corporation | Abstraction layers for scalable distributed machine learning |
US11227270B2 (en) | 2017-05-30 | 2022-01-18 | Robomart, Inc. | One tap/command grocery ordering via self-driving mini marts and seamless checkout-free technology |
US11244252B2 (en) | 2017-06-21 | 2022-02-08 | Chian Chiu Li | Autonomous driving under user instructions and hailing methods |
US10620629B2 (en) | 2017-06-22 | 2020-04-14 | The Boeing Company | Autonomous swarm for rapid vehicle turnaround |
US10296795B2 (en) * | 2017-06-26 | 2019-05-21 | Here Global B.V. | Method, apparatus, and system for estimating a quality of lane features of a roadway |
US20190023236A1 (en) | 2017-07-19 | 2019-01-24 | Ronald Michael Webb | Autonomous vehicle service station, systems and methods |
CA3070186A1 (en) | 2017-07-20 | 2019-01-24 | Nuro, Inc. | Autonomous vehicle repositioning |
WO2019023521A1 (en) | 2017-07-28 | 2019-01-31 | Nuro, Inc. | Automated retail store on autonomous or semi-autonomous vehicle |
2018
- 2018-07-19 CA CA3070186A patent/CA3070186A1/en active Pending
- 2018-07-19 US US16/040,437 patent/US11449050B2/en active Active
- 2018-07-19 JP JP2020502211A patent/JP2020527805A/en active Pending
- 2018-07-19 EP EP18835958.2A patent/EP3655836A4/en active Pending
- 2018-07-19 WO PCT/US2018/042967 patent/WO2019018695A1/en unknown
- 2018-07-19 US US16/040,432 patent/US11467574B2/en active Active
- 2018-07-19 US US16/040,446 patent/US10331124B2/en active Active
- 2018-07-19 CN CN201880047506.4A patent/CN110914779A/en active Pending
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000030195A (en) * | 1998-07-10 | 2000-01-28 | Honda Motor Co Ltd | Car allocation system |
CN1804551A (en) * | 2005-01-14 | 2006-07-19 | 阿尔卡特公司 | Navigation service |
WO2014024254A1 (en) * | 2012-08-07 | 2014-02-13 | 株式会社日立製作所 | Use-assisting tool for autonomous traveling device, operation management center, operation system, and autonomous traveling device |
WO2016209595A1 (en) * | 2015-06-22 | 2016-12-29 | Google Inc. | Determining pickup and destination locations for autonomous vehicles |
US20170116477A1 (en) * | 2015-10-23 | 2017-04-27 | Nokia Technologies Oy | Integration of positional data and overhead images for lane identification |
US20170124781A1 (en) * | 2015-11-04 | 2017-05-04 | Zoox, Inc. | Calibration for autonomous vehicle operation |
CN113253692A (en) * | 2021-06-21 | 2021-08-13 | 浙江华睿科技有限公司 | Tour method, tour device, tour equipment and readable storage medium for AGV |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112445923A (en) * | 2021-02-01 | 2021-03-05 | 智道网联科技(北京)有限公司 | Data processing method and device based on intelligent traffic and storage medium |
Also Published As
Publication number | Publication date |
---|---|
WO2019018695A1 (en) | 2019-01-24 |
EP3655836A4 (en) | 2021-04-21 |
US10331124B2 (en) | 2019-06-25 |
US20190025820A1 (en) | 2019-01-24 |
US11449050B2 (en) | 2022-09-20 |
US20190043355A1 (en) | 2019-02-07 |
EP3655836A1 (en) | 2020-05-27 |
US20190026886A1 (en) | 2019-01-24 |
CA3070186A1 (en) | 2019-01-24 |
JP2020527805A (en) | 2020-09-10 |
US11467574B2 (en) | 2022-10-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10331124B2 (en) | Autonomous vehicle repositioning | |
US20210356959A1 (en) | Fleet of autonomous vehicles with lane positioning and platooning behaviors | |
US11341560B2 (en) | Advertising on autonomous or semi-autonomous vehicle exterior | |
US10416677B2 (en) | Autonomous vehicle routing using annotated maps | |
US11340077B2 (en) | Driving condition specific sensor quality index | |
US20190146508A1 (en) | Dynamic vehicle routing using annotated maps and profiles | |
US11435193B2 (en) | Dynamic map rendering | |
US11804136B1 (en) | Managing and tracking scouting tasks using autonomous vehicles | |
US20210082291A1 (en) | Integrating air and ground data collection for improved drone operation | |
US11892307B2 (en) | Stranding and scoping analysis for autonomous vehicle services | |
US20190339694A1 (en) | Using environmental information to estimate sensor functionality for autonomous vehicles | |
US11788854B1 (en) | Assessing the impact of blockages on autonomous vehicle services | |
US20220222597A1 (en) | Timing of pickups for autonomous vehicles | |
US20220107650A1 (en) | Providing deliveries of goods using autonomous vehicles | |
US11914642B2 (en) | Difference merging for map portions | |
US20220410881A1 (en) | Apparatus and methods for predicting a state of visibility for a road object based on a light source associated with the road object | |
US20240174032A1 (en) | Apparatus and methods for predicting tire temperature levels | |
US20220172259A1 (en) | Smart destination suggestions for a transportation service | |
US20230146500A1 (en) | Systems and methods for determining an optimal placement of a package |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
WD01 | Invention patent application deemed withdrawn after publication | ||
Application publication date: 2020-03-24 |