WO2019151995A1 - Motion planning for autonomous point-of-sale vehicles - Google Patents

Motion planning for autonomous point-of-sale vehicles

Info

Publication number
WO2019151995A1
Authority
WO
WIPO (PCT)
Prior art keywords
potential customer
customer
autonomous vehicle
point
data
Prior art date
Application number
PCT/US2018/016020
Other languages
French (fr)
Inventor
David Michael Herman
Nunzio Decia
Nicholas Scheufler
David A. Herman
Stephen ORRIS
David Orris
Original Assignee
Ford Global Technologies, Llc
Priority date
Filing date
Publication date
Application filed by Ford Global Technologies, LLC
Priority to PCT/US2018/016020
Publication of WO2019151995A1

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00: Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/10: Path keeping
    • B60W30/14: Adaptive cruise control
    • B60W30/16: Control of distance between vehicles, e.g. keeping a distance to preceding vehicle
    • B60W60/00: Drive control systems specially adapted for autonomous road vehicles
    • B60W60/001: Planning or execution of driving tasks
    • B60W60/0025: Planning or execution of driving tasks specially adapted for specific operations
    • B60W2540/00: Input parameters relating to occupants
    • B60W2540/041: Potential occupants
    • B60W2554/00: Input parameters relating to objects
    • B60W2554/40: Dynamic objects, e.g. animals, windblown objects
    • B60W2554/402: Type
    • B60W2554/4029: Pedestrians
    • B60W2554/404: Characteristics
    • B60W2554/4046: Behavior, e.g. aggressive or erratic
    • B60W2556/00: Input parameters relating to data
    • B60W2556/10: Historical data
    • B60W2556/45: External transmission of data to or from the vehicle
    • B60W2556/50: External transmission of data to or from the vehicle of positioning data, e.g. GPS [Global Positioning System] data

Definitions

  • a probability framework may be used to determine the profit-making potential of a specific location, taking into account the identities of potential customers 302 in the location and their historical behavior. For example, in one embodiment, a point-of-sale autonomous vehicle 304 may scan a crowded street to identify known customers 302 in a crowd. Based on the composition of customers and the historical data corresponding thereto, the system 300 may predict potential sales revenue at that location. Potential sales revenue may also be predicted based on the vehicle's 304 present configuration or stock of products and/or services available. In one embodiment, the invention may implement cooperative motion planning.
  • another autonomous vehicle 304 may be dispatched to assist in meeting a predicted demand of a given product or service when that demand is greater than the current on-site vehicle's 304 inventory.
  • Such predicted demand may be calculated by adding together the predicted demand values corresponding to each new customer, as well as to each identified customer 302 with a prior order history.
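  • By way of illustration only, the following sketch shows how per-customer demand predictions might be aggregated and compared against onboard inventory to decide whether to dispatch a second vehicle. The function names, the default demand assumed for new customers, and the averaging of past orders are assumptions made for this example, not details taken from the disclosure.

```python
# Illustrative sketch (not from the patent text): aggregating per-customer
# demand predictions and dispatching a second vehicle when the total
# exceeds the on-site vehicle's inventory.
from dataclasses import dataclass

@dataclass
class Customer:
    customer_id: str
    order_history: list  # past order quantities for the product; may be empty

DEFAULT_NEW_CUSTOMER_DEMAND = 1.0  # assumed demand for customers with no history

def predicted_demand(customer: Customer) -> float:
    """Estimate demand for one customer from prior orders, if any."""
    if customer.order_history:
        return sum(customer.order_history) / len(customer.order_history)
    return DEFAULT_NEW_CUSTOMER_DEMAND

def plan_dispatch(customers: list, onboard_inventory: float) -> dict:
    total = sum(predicted_demand(c) for c in customers)
    return {
        "predicted_demand": total,
        "dispatch_second_vehicle": total > onboard_inventory,
    }

# Example: three known customers and two new faces against 6 units of stock.
crowd = [Customer("a", [2, 3]), Customer("b", [1]), Customer("c", []),
         Customer("d", []), Customer("e", [4, 2, 3])]
print(plan_dispatch(crowd, onboard_inventory=6.0))
```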
  • Facial recognition and/or gait analysis may also be performed to identify individuals that may pose a detriment to the profitability or security of the autonomous vehicle 304.
  • such individuals may include customers 302 that take a long time to order, order low-profit items, disrupt lines for service of the autonomous vehicle 304, damage or deface the autonomous vehicle 304, or pose a danger or detriment to other customers 302.
  • embodiments of the invention may display a warning on the autonomous vehicle 304 to direct the detrimental individual away from the autonomous vehicle 304 and/or to alert other customers to the potential problem.
  • the autonomous vehicle 304 may reject the detrimental individual as a customer 302 by avoiding the individual (leaving the location, changing lanes, continuing to drive, or the like), or by physically or electronically preventing the individual from transacting business with the autonomous vehicle 304. As a final escalation level, the autonomous vehicle 304 may alert police if warranted by the circumstances.
  • any action may incorporate additional business logic.
  • a customer designated to be avoided due to excessive ordering of low-cost items may nevertheless be approached if the area contains enough other customers determined to be interested in service. In this manner, the profitable customers may offset the deleterious effect of the detrimental customer, and the total time the vehicle 304 is stopped may remain optimally profitable.
  • the customer recognition performed by the system 300 may be used as a platform for police surveillance of individuals sought for a crime.
  • a system 300 in accordance with the invention may alert police to crimes in progress through monitoring the data captured by the sensors 306, and/or utilizing an LRCN for classification purposes.
  • the autonomous vehicle 304 may be equipped to apprehend suspects if approached.
  • Past interactions of the potential customer 302 with the autonomous vehicle 304, or with other autonomous vehicles 304 within the fleet of vehicles 304 associated with a particular point-of-sale business, may also be used to estimate a proclivity of the potential customer 302 to patronize the autonomous vehicle 304.
  • data gathered by the sensors 306 may be used to biometrically identify the potential customer 302. Such identification information may be compared against a database to determine whether the potential customer 302 has previously interacted with the point-of-sale autonomous vehicle 304, and whether those interactions were positive or negative. This information may form the basis of planning an optimal route for the vehicle to follow.
  • the combination of these factors may be used to produce a probability that the potential customer 302 is actually summoning or hailing the autonomous vehicle 304.
  • Probability data generated in this manner may include probability scores stored and accumulated to produce a probabilistic model indicating the potential customer’s 302 probability of hailing and/or patronizing the autonomous vehicle 304 as a function of time.
  • This probability score or determination may inform the motion planning algorithm used to direct the autonomous vehicle 304 based on current path and predicted customer 302 demand.
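  • As a hedged illustration of how such probability scores might be accumulated into a time-dependent model and mapped onto a motion-planning decision, the following sketch applies a simple log-odds update to per-frame hailing probabilities; the prior, the update rule, and the decision thresholds are illustrative assumptions rather than details of the disclosure.

```python
# Hypothetical sketch: accumulating per-frame hailing evidence into a
# time-varying probability using a simple log-odds update, then mapping
# that probability onto a coarse motion-planning decision.
import math

def logit(p: float) -> float:
    return math.log(p / (1.0 - p))

def accumulate_hail_probability(frame_probs, prior: float = 0.1):
    """Fuse a sequence of per-frame hailing probabilities into one estimate."""
    log_odds = logit(prior)
    history = []
    for p in frame_probs:
        p = min(max(p, 1e-3), 1 - 1e-3)        # avoid infinities
        log_odds += logit(p) - logit(prior)     # evidence relative to the prior
        history.append(1.0 / (1.0 + math.exp(-log_odds)))
    return history

def motion_decision(p_hail: float) -> str:
    if p_hail > 0.8:
        return "navigate toward customer"
    if p_hail > 0.4:
        return "slow down and keep observing"
    return "continue on current route"

probs = accumulate_hail_probability([0.3, 0.5, 0.7, 0.85])
print(round(probs[-1], 3), motion_decision(probs[-1]))
```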
  • the system 300 may slow down or temporarily stop the autonomous vehicle 304 based on current traffic conditions to test its predictions that the potential customer 302 is interested in patronizing the point-of-sale vehicle 304.
  • the motion planning algorithm may direct the autonomous vehicle 304 away from the area if the system 300 returns a low probability determination for the potential customer 302.
  • a potential customer 402 may summon an autonomous vehicle 404 by way of an application on a computing device 408, such as a wearable device, digital kiosk, cell phone, tablet, laptop computer, desktop computer, or the like.
  • the computing device 408 may communicate with the cloud 406 over any suitable network such as a cellular network, WiFi, or the like.
  • the cloud 406 may in turn communicate with the autonomous vehicle 404 over a cellular network, WiFi, or the like.
  • a remote server such as a cloud 406 server, may find a suitable autonomous vehicle 404 in the vicinity of the potential customer 402, and may instruct the autonomous vehicle 404 to travel to the location of the customer 402.
  • the cloud 406 server may calculate a travel time between available autonomous vehicles 404 and the location of the potential customer 402. The cloud 406 server may then select a particular autonomous vehicle 404 to provide goods or services to the customer 402 based on a shortest associated travel time.
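  • A minimal sketch of such a selection step is shown below, assuming straight-line travel-time estimates; a production service would presumably query a routing engine, and the coordinates, speed, and vehicle identifiers here are illustrative only.

```python
# Minimal sketch (assumed interfaces): a cloud service choosing the
# available point-of-sale vehicle with the shortest estimated travel time
# to the requesting customer's pickup location.
import math

def travel_time_minutes(vehicle_pos, pickup_pos, avg_speed_kmh: float = 30.0) -> float:
    """Rough straight-line estimate; a real service would use road routing."""
    lat1, lon1 = vehicle_pos
    lat2, lon2 = pickup_pos
    # Equirectangular approximation, adequate for short city distances.
    km_per_deg = 111.0
    dx = (lon2 - lon1) * km_per_deg * math.cos(math.radians((lat1 + lat2) / 2))
    dy = (lat2 - lat1) * km_per_deg
    distance_km = math.hypot(dx, dy)
    return 60.0 * distance_km / avg_speed_kmh

def select_vehicle(available_vehicles, pickup_pos):
    """available_vehicles: list of (vehicle_id, (lat, lon)) tuples."""
    return min(available_vehicles,
               key=lambda v: travel_time_minutes(v[1], pickup_pos))

fleet = [("AV-1", (42.331, -83.046)), ("AV-2", (42.355, -83.070))]
print(select_vehicle(fleet, pickup_pos=(42.340, -83.050))[0])
```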
  • the potential customer 402 may request a particular autonomous vehicle 404, or may request a specific location for the autonomous vehicle 404 to stop.
  • an application 502 installed on a mobile device 500 of the customer 402 may include symbols to designate autonomous vehicles 506 in the area available to provide various types of goods or services in accordance with embodiments of the invention.
  • the application 502 may include stickers depicting objects such as ice cream, tools, or groceries, overlaid onto a map of the vicinity.
  • the stickers may alert the customer 402 to the types of goods and services provided by autonomous vehicles 404 in the area, and in some embodiments, may allow the customer 402 to select a particular autonomous vehicle 404 to access more detailed information regarding the vehicle 506 or its goods and services, and to request service.
  • the customer 402 may utilize this application 502, for example, to summon a ride from an autonomous vehicle 404.
  • the user may specify a desired pickup location 504, or the pickup location 504 may be automatically determined based on a present location of the potential customer 402.
  • the mobile device 500 may include a global positioning system that may communicate the potential customer’s 402 present location to a cloud 406 server.
  • the cloud 406 server may utilize this information to locate available autonomous vehicles 404 in the area.
  • One autonomous vehicle 404 may be assigned to pick up the customer 402 based on the ride request.
  • the closest available autonomous vehicle 404 may be identified and assigned to the customer 402 based on an estimated travel time between a current location of the autonomous vehicle 404 and the pickup location 504, a distance between the autonomous vehicle 404 and the pickup location 504, or the like.
  • the customer 402 may be difficult for the autonomous vehicle 404 to accurately identify or locate due to user error, sensor error, movement of the customer 402 to a different nearby location, or a location populated by more than one individual.
  • a system 400 in accordance with the invention may address this problem by automatically accessing a customer profile associated with customer 402, and/or the application or mobile phone 500 registered to the customer 402, upon receiving a request for service. For example, a photo of the customer 402 may be automatically transmitted to the autonomous vehicle 404 in connection with the customer's 402 request. Additionally, in some embodiments, the system 400 may automatically send to the autonomous vehicle 404 location indicators (such as signs or nearby stores) to facilitate precise identification of a service or pickup location 504 by the autonomous vehicle 404.
  • a method 600 for detecting a potential customer may include determining 602 whether a point-of-sale autonomous vehicle is in a location that is favorable for business.
  • a point-of-sale autonomous vehicle located in an area surrounding a public place such as a park or shopping mall is more likely to receive business than one located in a remote residential area, for example. If the vehicle is not in a favorable location, the vehicle may move 614 to a more favorable location.
  • sensors onboard the autonomous vehicle may scan 604 a surrounding environment for potential customers.
  • Sensors may include, for example, lidar sensors, radar sensors, time-of-flight cameras, stereo cameras, monocular cameras, microphones, and the like.
  • Sensor data may be retrieved 606 from the various sensors and analyzed to determine the presence and attributes of potential customers in the vicinity. Data gathered from the sensors may be retrieved 606 and analyzed by an onboard or remote processor to determine 608 whether a potential customer is proximate the autonomous vehicle.
  • sensor data may be combined with stored historical data corresponding to the customer to determine 610 whether the customer is likely to patronize the point-of-sale business associated with the autonomous vehicle. If yes, the autonomous vehicle may navigate 612 toward the customer for parking and engaging in a point-of-sale transaction. If no, the autonomous vehicle may continue 614 to a new location.
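  • The following self-contained sketch condenses the Figure 6 flow into a single decision function; the boolean inputs and the likelihood threshold stand in for the sensing and learning components described above and are assumptions of this example.

```python
# Minimal, self-contained sketch of the Figure 6 decision flow. The inputs
# (location favorability, detection result, patronage likelihood) abstract
# away the onboard sensing and identification steps described above.
def detection_step(location_favorable: bool,
                   customer_detected: bool,
                   patronage_likelihood: float,
                   threshold: float = 0.6) -> str:
    if not location_favorable:
        return "move to a more favorable location"        # cf. step 614
    if not customer_detected:                              # cf. steps 604-608
        return "continue scanning and move on"             # cf. step 614
    if patronage_likelihood >= threshold:                   # cf. step 610
        return "navigate toward customer and park"          # cf. step 612
    return "continue to a new location"                     # cf. step 614

print(detection_step(True, True, 0.72))   # -> navigate toward customer and park
print(detection_step(True, True, 0.35))   # -> continue to a new location
```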
  • a method 700 for positively identifying a potential customer in accordance with the invention may include detecting 702 a customer by utilizing sensors onboard the autonomous vehicle, as set forth above. Upon detecting a customer, sensor data may be retrieved 704 and utilized to biometrically ascertain 706 the identity and/or intent of the customer in patronizing the point-of-sale vehicle.
  • certain embodiments of the invention may determine 708 whether a geographic history corresponding to the customer is commensurate with the customer’s current location.
  • geographic historical information may be included in a customer profile associated with the customer and stored in a database. The stored geographic historical information may be updated each time the customer makes a request for service or is otherwise identified by an autonomous vehicle in connection with embodiments of the invention.
  • Such information may include GPS coordinates associated with the user, cell phone application geo-location information, general location information, and the like.
  • Such information may inform a current identification of a customer by comparing a current location of the customer against the geographical historical information associated with the customer. Indeed, it is unlikely that a customer known to transact business with point-of-sale vehicles primarily in the Chicago area would be in Miami. Accordingly, a confidence score associated with identification of the customer in Miami may be reduced.
  • the combination of sensor fusion and historical geo-location of a customer may be used to develop accurate machine learning models, such as convolutional neural networks (CNN), support vector machines (SVM), or principal component analysis (PCA), or ensemble algorithms where different models vote on their prediction (e.g., gradient-boosted decision trees).
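  • One hedged way to express the geographic-history check described above is to scale an identification confidence score by the distance between the current sighting and the centroid of the customer's historical locations; the exponential falloff and the 50 km scale below are illustrative assumptions, not parameters from the disclosure.

```python
# Illustrative sketch: reducing an identification confidence score when the
# current sighting is far from the customer's historical activity centroid.
import math

def haversine_km(p1, p2):
    lat1, lon1, lat2, lon2 = map(math.radians, (*p1, *p2))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = math.sin(dlat / 2) ** 2 + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2
    return 2 * 6371.0 * math.asin(math.sqrt(a))

def adjusted_confidence(base_confidence, current_pos, historical_positions, scale_km=50.0):
    centroid = (sum(p[0] for p in historical_positions) / len(historical_positions),
                sum(p[1] for p in historical_positions) / len(historical_positions))
    distance = haversine_km(current_pos, centroid)
    return base_confidence * math.exp(-distance / scale_km)

chicago_history = [(41.88, -87.63), (41.90, -87.65), (41.85, -87.62)]
print(adjusted_confidence(0.9, (41.87, -87.64), chicago_history))  # near Chicago: ~0.9
print(adjusted_confidence(0.9, (25.76, -80.19), chicago_history))  # Miami: near zero
```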
  • if the current location does not match the geographic historical information, the method 700 may return to retrieve 704 additional sensor data to aid accurate customer identification. If the current location matches the geographical historical information or is within a specified deviation range, the method 700 may gather 710 additional data from the customer to positively identify or confirm identification of the customer.
  • additional information may include, for example, additional biometric data including voice data, facial data, fingerprint data, gait data, and 3D point cloud data.
  • additional data may include payment data such as bank card or credit card data, driver license, passport, other identification information or data such as phone identifiers, NFC payments, credit reports, and historical customer data.
  • some embodiments of the method 700 may determine 712 a class associated with the customer.
  • a class may include, for example, age, gender, familial role, and the like. This information, combined with the information and data previously discussed, may be used to authenticate the customer to enable automatic or biometric-based payment for services at the point-of-sale autonomous vehicle, and/or to customize goods or services to the known or predicted preferences of the customer.
  • where a customer uses a mobile device to pay for service or to request a point-of-sale autonomous vehicle, this information may be used to connect the customer's biometric signature with his or her identity and/or bank account. Assuming high confidence in such identification, such biometric information may also allow the customer to pay for service automatically by authenticating a deduction from his or her account.
  • a customized menu of goods or services may be presented to a customer depending on the class to which the customer is assigned and their past transaction history. For example, a food truck may customize its menu to remove alcoholic beverage options for customers who have been identified as being under the legal drinking age.
  • a mobile animal groomery may tailor its menu to reflect only dog-grooming services if the customer has been identified as a dog owner.
  • it may be determined 712 that the customer is female and a mother to the small child accompanying her. These class designations may inform the method 700 that the child is accompanied by a paying adult, and a menu of services may be customized to reflect family-friendly items.
  • a past ordering history of the customer may be used to tailor a current menu of goods or services for display to the customer.
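  • A small sketch of such menu customization is shown below; the class labels, item tags, and ordering rule are invented for illustration and are not part of the disclosure.

```python
# Hypothetical sketch: filtering and ordering a menu based on an inferred
# customer class and past orders.
def customize_menu(menu, customer_class, order_history):
    """menu: list of dicts with 'name' and 'tags'; order_history: list of item names."""
    items = list(menu)
    if customer_class.get("underage"):
        items = [m for m in items if "alcohol" not in m["tags"]]
    if customer_class.get("accompanied_by_child"):
        items = [m for m in items if "family_friendly" in m["tags"] or "neutral" in m["tags"]]
    # Show previously ordered items first.
    items.sort(key=lambda m: m["name"] not in order_history)
    return [m["name"] for m in items]

menu = [
    {"name": "craft beer", "tags": {"alcohol"}},
    {"name": "ice cream sundae", "tags": {"family_friendly"}},
    {"name": "espresso", "tags": {"neutral"}},
]
print(customize_menu(menu, {"underage": False, "accompanied_by_child": True}, ["espresso"]))
```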
  • the method 700 may then determine 714 whether the customer interacts with or transacts business with the autonomous vehicle.
  • the customer interaction may be positive or negative, including such things as perusing a menu, making a purchase, vandalizing the vehicle, or the like. If there is any customer interaction 714, a customer profile associated with the customer may be updated 716 to reflect the type and quality of interaction. If there is no interaction 714, the method 700 may return to detect 702 another customer.
  • Figure 8 depicts an algorithm or method 800 for determining whether a potential customer is likely to patronize a point-of-sale business associated with an autonomous vehicle.
  • the method 800 may include first detecting 802 a potential customer.
  • a potential customer may be detected by onboard sensors associated with the autonomous vehicle, or by a server or processor in communication with a mobile device of the potential customer.
  • the method 800 may use the information gathered by the sensors and/or server or processor to identify 804 the customer.
  • additional identification data may be collected 806. Additional identification data may include, for example, face, gait, 3D point cloud, voice, and/or fingerprint analysis.
  • additional identification data may further include historical geo-location and other data of the customer, payment data, driver license, passport, other identification data, and the like.
  • the method 800 may determine 808 whether the customer is detrimental to the point-of-sale business. Current sensor data may aid this determination, as well as historical behavioral information associated with the customer. If the customer is determined 808 to be detrimental, the autonomous vehicle may avoid the customer by navigating 818 to another location, preventing a transaction with the customer, displaying a warning for others, or calling police. If the customer is determined 808 not to be detrimental, the method 800 may determine 810 whether the customer directly requested the point-of-sale autonomous vehicle. If yes, the autonomous vehicle may navigate 812 to the customer to facilitate a point-of-sale transaction, slow its speed to capture additional data, or allow extended parking to allow the customer to approach the vehicle.
  • the method 800 may continue to determine 814 whether the customer is likely hailing the vehicle. This determination may be made by utilizing biometric data and algorithms to estimate the physical positioning and movements of a potential customer indicating that the customer is intentionally summoning or hailing the vehicle. In one embodiment, such analysis may be performed by long-term recurrent convolutional networks. If it is determined 814 that the customer is hailing the vehicle, the vehicle may navigate 812 to the customer and/or otherwise allow the customer to patronize the vehicle. If not, the method 800 may reference business logic provided by the owner or operator of the point-of-sale autonomous vehicle to determine 816 whether to delay servicing the potential customer. If the business logic authorizes a delay in its regular service to provide service to a new potential customer, the autonomous vehicle may navigate 812 to the customer. If not, the autonomous vehicle may navigate 818 to its original or other destination.
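  • The decision logic of Figure 8 might be summarized, purely as an illustrative sketch, by the following function; the boolean inputs abstract away the detection, identification, and business-logic components assumed to exist elsewhere in the system.

```python
# Self-contained sketch of the Figure 8 decision logic.
def service_decision(is_detrimental: bool,
                     directly_requested: bool,
                     likely_hailing: bool,
                     delay_allowed_by_business_logic: bool) -> str:
    if is_detrimental:
        return "avoid customer (relocate, block transaction, warn, or alert police)"  # cf. 818
    if directly_requested:
        return "navigate to customer"                                                  # cf. 812
    if likely_hailing:
        return "navigate to customer"                                                  # cf. 812
    if delay_allowed_by_business_logic:
        return "navigate to customer"                                                  # cf. 816/812
    return "continue to original or other destination"                                 # cf. 818

# A vehicle already on a delivery run encounters a new, unrequested customer:
print(service_decision(False, False, True, False))   # -> navigate to customer
print(service_decision(False, False, False, False))  # -> continue to destination
```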
  • an autonomous vehicle may be tasked with servicing a customer who has electronically requested pizza delivery by the autonomous vehicle. Time is of the essence to avoid delivering a cold pizza. If the autonomous vehicle encounters a new potential customer on its way to the requesting customer, it may be delayed by an additional 5-10 minutes, which may result in unsatisfactory service quality to the requesting customer.
  • the method 800 may avoid this result by referencing business logic that does not authorize a delay to the requesting customer.
  • the method 800 may dispatch an additional vehicle to service either the requesting customer or the potential customer to maximize business profits without sacrificing service quality.
  • Implementations of the systems, devices, and methods disclosed herein may comprise or utilize a special purpose or general-purpose computer including computer hardware, such as, for example, one or more processors and system memory, as discussed herein. Implementations within the scope of the present disclosure may also include physical and other computer-readable media for carrying or storing computer-executable instructions and/or data structures. Such computer-readable media can be any available media that can be accessed by a general purpose or special purpose computer system. Computer-readable media that store computer-executable instructions are computer storage media (devices). Computer-readable media that carry computer-executable instructions are transmission media. Thus, by way of example, and not limitation, implementations of the disclosure can comprise at least two distinctly different kinds of computer-readable media: computer storage media (devices) and transmission media.
  • Computer storage media includes RAM, ROM, EEPROM, CD-ROM, solid state drives (“SSDs”) (e.g., based on RAM), Flash memory, phase-change memory (“PCM”), other types of memory, other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer.
  • An implementation of the devices, systems, and methods disclosed herein may communicate over a computer network.
  • A "network" is defined as one or more data links that enable the transport of electronic data between computer systems and/or modules and/or other electronic devices.
  • Transmission media can include a network and/or data links, which can be used to carry desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer. Combinations of the above should also be included within the scope of computer-readable media.
  • Computer-executable instructions comprise, for example, instructions and data which, when executed at a processor, cause a general purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions.
  • the computer executable instructions may be, for example, binaries, intermediate format instructions such as assembly language, or even source code.
  • the disclosure may be practiced in network computing environments with many types of computer system configurations, including an in-dash vehicle computer, personal computers, desktop computers, laptop computers, message processors, hand-held devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, mobile telephones, PDAs, tablets, pagers, routers, switches, various storage devices, and the like.
  • the disclosure may also be practiced in distributed system environments where local and remote computer systems, which are linked (either by hardwired data links, wireless data links, or by a combination of hardwired and wireless data links) through a network, both perform tasks.
  • program modules may be located in both local and remote memory storage devices.
  • a sensor may include computer code configured to be executed in one or more processors, and may include hardware logic/electrical circuitry controlled by the computer code.
  • At least some embodiments of the disclosure have been directed to computer program products comprising such logic (e.g., in the form of software) stored on any computer useable medium.
  • Such software when executed in one or more data processing devices, causes a device to operate as described herein.

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Traffic Control Systems (AREA)

Abstract

A method for motion planning for a point-of-sale autonomous vehicle is disclosed. The method includes scanning, via sensors of an autonomous vehicle, an external environment for a potential customer of a point-of-sale business. The sensors may detect the potential customer. The method may then determine whether the potential customer is likely to patronize the point-of-sale business. If so, the method may navigate the autonomous vehicle to a location proximate the potential customer to facilitate a business opportunity. A corresponding system and computer program product are also disclosed and claimed herein.

Description

MOTION PLANNING FOR AUTONOMOUS POINT-OF-SALE VEHICLES
BACKGROUND
FIELD OF THE INVENTION
[0001] This invention relates to motion planning for autonomous vehicles.
BACKGROUND OF THE INVENTION
[0002] Many retail and hospitality-oriented businesses have realized new economic opportunity from incorporating vehicles as additional, and sometimes primary, points-of-sale. Though once reserved for ice cream and pizza delivery, food trucks have proliferated in modern cities and can be found on nearly every street corner, carrying everything from sushi to waffles. In addition, mobile pet grooming services, car detailing services, mobile hair and nail salons, and the like, are readily available to provide on-demand service at nearly any location. The proliferation of such mobile point-of-sale businesses evidences a prime opportunity for business owners.
[0003] Mobile point-of-sale technology, however, currently requires a human to provide motion planning for the point-of-sale vehicle and to identify potential customers. As a result, the potential to efficiently scale such a business is reduced. Although still under development, autonomous (e.g., driverless) vehicles may present additional revenue and business opportunities for point-of-sale businesses. By obviating the need for a human driver, autonomous vehicles may provide a unique opportunity for point-of-sale businesses to reduce costs while increasing their efficiency and business capacity.
[0004] In view of the foregoing, what are needed are systems and methods to enable the efficient utilization of autonomous vehicles as mobile points-of-sale. Ideally, such systems and methods would enable the autonomous vehicle to recognize a potential customer’s presence, identity, location, and desire for service, and to further assess the desirability of providing service to that customer. Such systems and methods would also integrate motion planning to enable the autonomous vehicle to prioritize requests for service and to navigate to an appropriate service location.
BRIEF DESCRIPTION OF THE DRAWINGS
[0005] In order that the advantages of the invention will be readily understood, a more particular description of the invention briefly described above will be rendered by reference to specific embodiments illustrated in the appended drawings. Understanding that these drawings depict only typical embodiments of the invention and are not therefore to be considered limiting of its scope, the invention will be described and explained with additional specificity and detail through use of the accompanying drawings, in which:
[0006] Figure 1 is a high-level block diagram showing one example of a computing system in which a system and method in accordance with the invention may be implemented;
[0007] Figure 2 is a high level schematic diagram showing a customer hailing an autonomous vehicle in accordance with certain embodiments of the invention;
[0008] Figure 3 is a perspective view of a customer detected and analyzed in accordance with certain embodiments of the invention;
[0009] Figure 4 is a high level schematic diagram showing a customer requesting an autonomous vehicle via a cell phone application in accordance with certain embodiments of the invention;
[0010] Figure 5 is a front view of a cellular phone having an application for requesting an autonomous vehicle in accordance with one embodiment of the invention;
[0011] Figure 6 is a flow chart showing a process for detecting a potential customer in accordance with certain embodiments of the invention;
[0012] Figure 7 is a flow chart showing a process for positively identifying a potential customer in accordance with embodiments of the invention; and
[0013] Figure 8 is a flow chart depicting an algorithm for determining whether a potential customer is likely to patronize the business associated with the autonomous vehicle in accordance with embodiments of the invention.
DETAILED DESCRIPTION
[0014] Referring to Figure 1, one example of a computing system 100 is illustrated. The computing system 100 is presented to show one example of an environment where a system and method in accordance with the invention may be implemented. The computing system 100 may be embodied as a mobile device 100 such as a smart phone or tablet, a desktop computer, a workstation, a server, or the like. The computing system 100 is presented by way of example and is not intended to be limiting. Indeed, the systems and methods disclosed herein may be applicable to a wide variety of different computing systems in addition to the computing system 100 shown. The systems and methods disclosed herein may also potentially be distributed across multiple computing systems 100.
[0015] As shown, the computing system 100 includes at least one processor 102 and may include more than one processor 102. The processor 102 may be operably connected to a memory 104. The memory 104 may include one or more non-volatile storage devices such as hard drives 104a, solid state drives 104a, CD-ROM drives 104a, DVD-ROM drives 104a, tape drives 104a, or the like. The memory 104 may also include non-volatile memory such as a read-only memory 104b (e.g., ROM, EPROM, EEPROM, and/or Flash ROM) or volatile memory such as a random access memory 104c (RAM or operational memory). A bus 106, or plurality of buses 106, may interconnect the processor 102, memory devices 104, and other devices to enable data and/or instructions to pass therebetween.
[0016] To enable communication with external systems or devices, the computing system 100 may include one or more ports 108. Such ports 108 may be embodied as wired ports 108 (e.g., USB ports, serial ports, Firewire ports, SCSI ports, parallel ports, etc.) or wireless ports 108 (e.g., Bluetooth, IrDA, etc.). The ports 108 may enable communication with one or more input devices 110 (e.g., keyboards, mice, touchscreens, cameras, microphones, scanners, storage devices, etc.) and output devices 112 (e.g., displays, monitors, speakers, printers, storage devices, etc.). The ports 108 may also enable communication with other computing systems 100.
[0017] In certain embodiments, the computing system 100 includes a wired or wireless network adapter 114 to connect the computing system 100 to a network 116, such as a LAN, WAN, or the Internet. Such a network 116 may enable the computing system 100 to connect to one or more servers 118, workstations 120, personal computers 120, mobile computing devices, or other devices. The network 116 may also enable the computing system 100 to connect to another network by way of a router 122 or other device 122. Such a router 122 may allow the computing system 100 to communicate with servers, workstations, personal computers, or other devices located on different networks.
[0018] As previously mentioned, many point-of-sale businesses have realized new economic opportunity by utilizing vehicles to deliver their goods and/or services. While autonomous vehicles present additional possibilities for increased revenue and point-of-sale business opportunities, current point-of-sale technology requires human interaction and judgment to find and interact with potential customers. Autonomous vehicles are thus currently ill-equipped for widespread, cost-effective use in connection with point-of-sale business. Embodiments of the invention address this issue by providing autonomous vehicles that utilize an array of on-board sensors and algorithms to accurately and reliably detect and identify potential customers.
[0019] Referring now to Figure 2, a system 200 for motion planning for autonomous point-of-sale vehicles in accordance with the invention may include an autonomous vehicle 204 providing goods and/or services in connection with a point-of-sale business. The autonomous vehicle 204 may travel on a road network incorporating a point-of-sale capability, and may be equipped with an array of onboard sensors 206 to detect one or more potential customers 202 in the vicinity. In some embodiments, the sensors 206 for autonomous navigation may also be used to detect potential customers 202 physically hailing the autonomous vehicle 204 and/or to correctly identify a potential customer 202 requesting service via a cell phone application. Such sensors 206 may include, for example, lidar sensors, radar sensors, GPS, inertial measurement units, time-of-flight cameras, stereo cameras, monocular cameras, microphones, and the like.
[0020] In certain embodiments, a potential customer 202 may hail the autonomous vehicle 204 spontaneously after seeing the autonomous vehicle 204 nearby. One or more sensors 206 may detect the presence of the potential customer 202 from a distance, and embodiments of the invention may utilize sensor 206 fusion for biometric analysis combining, for example, face, gait, 3D point cloud, voice, and/or fingerprint analysis to identify the potential customer 202. Such identification may be incorporated into the determination of a customer’s intent or probability of making a present purchase based on historical data. The sensor fusion may occur with a single algorithm or via an ensemble method where several individual algorithms may be evaluated simultaneously to make a determination.
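As a hedged illustration of the ensemble-style fusion described above, the following sketch combines independent per-modality match scores into a single identification confidence with a weighted vote; the modalities, scores, and weights are assumptions made for this example rather than details of the disclosed algorithm.

```python
# Illustrative ensemble-style fusion (not the patent's specific algorithm):
# each biometric modality contributes an independent match score, and a
# weighted vote produces one identification confidence.
def fuse_identification_scores(modality_scores, weights=None):
    """modality_scores: e.g. {'face': 0.92, 'gait': 0.75, 'voice': 0.60}."""
    if weights is None:
        weights = {m: 1.0 for m in modality_scores}
    total_weight = sum(weights[m] for m in modality_scores)
    return sum(weights[m] * s for m, s in modality_scores.items()) / total_weight

scores = {"face": 0.92, "gait": 0.75, "point_cloud": 0.80, "voice": 0.60}
confidence = fuse_identification_scores(scores, weights={"face": 2.0, "gait": 1.0,
                                                         "point_cloud": 1.0, "voice": 0.5})
print(round(confidence, 3))  # weighted toward the face score
```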
[0021] In other embodiments, the sensors 206 of the autonomous vehicle 204 may intermittently or continuously scan the external environment to detect the presence of a potential customer 202, and to determine whether the potential customer 202 is likely to patronize the point-of-sale business associated with the autonomous vehicle 204.
[0022] Data from the sensors 206 may be gathered and analyzed by an onboard or remote processor, or by a combination of onboard and remote processing. The processor may utilize various algorithms to analyze this information to determine the potential customer's 202 intent. Such algorithms may also be capable of tracking individuals, and/or estimating the number of people within a partially occluded crowd to predict the potential profitability of providing service.
[0023] In certain embodiments, for example, a determination of customer intent may be made by analyzing the sensor 206 data using known methods such as Haar cascades, histogram of oriented gradients features with a support vector machine (SVM), or deep neural networks. Such systems may also incorporate a Bayesian framework to provide information on the confidence of the system's predictions. Such a system may incorporate convolutional layers in the neural network, a common neural network architecture for image-related data. The algorithm may incorporate sensor fusion as raw data input into a single algorithm, or via several individual algorithms incorporating an ensemble method to combine each of the individual algorithms' outputs.
[0024] In some embodiments, the algorithm may incorporate memory units, such as long short-term memory (LSTM) or gated recurrent units (GRU), to make predictions based on a sequential time slice of data. When a potential customer is identified, such an algorithm may incorporate that individual's past behavior when hailing the vehicle. For example, a user intent vector may be created based upon the user's past hailing behavior. This intent vector may be incorporated into the algorithm along with the sensor 206 data to calculate the probability of that individual customer's desire for goods or services. This method may improve the correct identification of a user's intent where different individuals may display their desire for service in different ways.
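A minimal sketch of such a sequence model is given below, assuming PyTorch is available; it feeds a sequence of fused sensor feature vectors through a GRU, concatenates a per-user intent vector, and outputs a hailing probability. The dimensions, architecture, and variable names are illustrative assumptions, not the patented model.

```python
# Hedged sketch (assumes PyTorch): a small GRU over per-frame sensor features,
# combined with a per-user intent vector summarizing past hailing behavior.
import torch
import torch.nn as nn

class HailingIntentModel(nn.Module):
    def __init__(self, sensor_dim=32, intent_dim=8, hidden_dim=64):
        super().__init__()
        self.gru = nn.GRU(sensor_dim, hidden_dim, batch_first=True)
        self.head = nn.Sequential(
            nn.Linear(hidden_dim + intent_dim, 32),
            nn.ReLU(),
            nn.Linear(32, 1),
        )

    def forward(self, sensor_seq, intent_vec):
        # sensor_seq: (batch, time, sensor_dim); intent_vec: (batch, intent_dim)
        _, h_n = self.gru(sensor_seq)              # h_n: (1, batch, hidden_dim)
        features = torch.cat([h_n.squeeze(0), intent_vec], dim=1)
        return torch.sigmoid(self.head(features))  # probability of hailing

model = HailingIntentModel()
sensor_seq = torch.randn(1, 20, 32)   # 20 time slices of fused sensor features
intent_vec = torch.randn(1, 8)        # summary of this user's past hailing behavior
print(model(sensor_seq, intent_vec).item())
```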
[0025] The machine learning algorithms may also be trained to differentiate between different classes of customers, or to sub-classify potential customers 202. For example, in one embodiment, a group of children after a baseball game may be distinguished from a gathering of adults at an outdoor party. The system 200 may inform a motion planning algorithm to alter the behavior of a point-of-sale autonomous vehicle 204 based on the class determination. For example, the motion planning algorithm may be used to adjust the autonomous vehicle's 204 driving patterns or to decrease its speed to compensate for an increased likelihood of children in close proximity to the autonomous vehicle 204. In other embodiments, the motion planning algorithm may be used to navigate the vehicle to a location that improves visibility of potential customers. This may be accomplished by changing lanes or temporarily parking the vehicle, for example.
[0026] In addition, based on the determination of customer intent, embodiments of the invention may cause the autonomous vehicle 204 to adjust its behavior to either accommodate or avoid the potential customer 202. For example, upon identifying a potential customer 202 as a likely patron of the point-of-sale business, the system 200 may identify a nearest available parking space and cause the autonomous vehicle 204 to move toward it, given present traffic conditions.
[0027] Referring now to Figure 3, in one embodiment, the system 300 may gather images for purposes of image segmentation and/or posture detection to estimate the physical positioning 310 of a potential customer 302. Such positioning 310 may indicate that the customer 302 is summoning or hailing the autonomous vehicle 304 for service. The captured image sequence may be analyzed by an algorithm capable of understanding an image scene and sequence of images. In one embodiment, such analysis may be performed by long-term recurrent convolutional networks (LRCN).
[0028] Certain combinations of ranges of motion of the potential customer 302, such as waving, facial expressions, and other gestures, indicate a strong likelihood that the potential customer 302 desires to patronize the autonomous vehicle 304. Head pose and eye gaze vectors 308 may also be calculated from sensor 306 data. This evidence may be used to confirm the potential customer's 302 intent by verifying that the potential customer 302 is facing the autonomous vehicle 304 and tracking the autonomous vehicle 304 as it drives toward the potential customer 302.
[0029] In some embodiments, the system 300 may estimate the position of a potential customer 302 relative to the autonomous vehicle 304. For example, the system 300 may utilize image data from camera sensors 306 and/or lidar sensors 306 to determine the position and angle of the potential customer 302 relative to the autonomous vehicle 304. The system 300 may also track head position and/or eye gaze vectors 308 relative to the autonomous vehicle 304 to determine whether the potential customer 302 is focused on the autonomous vehicle 304, regardless of whether he or she is also using physical motions to summon or hail the autonomous vehicle 304.
[0030] In certain embodiments, the trajectory of the potential customer 302 may be included in the determination of whether the potential customer 302 likely desires to patronize the autonomous vehicle 304. For example, a potential customer 302 who is walking toward the autonomous vehicle 304, or along a path that may intersect the vehicle's current motion plan trajectory, is more likely to be interested in patronizing the autonomous vehicle 304 than a person who is walking away from the autonomous vehicle 304. In some embodiments, the customer trajectory information may be used to infer, based upon real-time sensor 306 data and mapping of the local area, that the customer is traveling to a location where the vehicle 304 may be able to stop and deliver services.
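One simple way to express the trajectory cue is the closing speed between the potential customer and the vehicle (or its planned stop); a positive value means the gap is shrinking. The positions and velocities below are hypothetical:

    import math

    def closing_speed(customer_pos, customer_vel, vehicle_pos, vehicle_vel):
        # Rate at which the customer and the vehicle are approaching each other;
        # positive means the distance between them is decreasing.
        rx, ry = vehicle_pos[0] - customer_pos[0], vehicle_pos[1] - customer_pos[1]
        vx, vy = vehicle_vel[0] - customer_vel[0], vehicle_vel[1] - customer_vel[1]
        dist = math.hypot(rx, ry)
        if dist == 0:
            return 0.0
        return -((rx * vx + ry * vy) / dist)

    # Hypothetical: customer walking toward the vehicle's planned stop.
    print(closing_speed((0, 0), (1.2, 0.0), (30, 0), (-3.0, 0.0)) > 0)   # True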
[0031] At close distances, the system 300 may use images captured by the sensors 306 to perform facial recognition and/or gait analysis to recognize past customers (including customers 302 that have electronically requested an autonomous vehicle 304 in the past), as well as new customers 302. In certain embodiments, the system 300 may implement deep and convolutional neural networks to facilitate facial recognition, gait detection, and speaker recognition. A probability framework, such as a Bayesian probabilistic framework combined with neural networks, may be utilized to determine a likelihood that the potential customer 302 desires to patronize the point-of-sale business associated with the autonomous vehicle 304. Three-dimensional data obtained from lidar sensors 306, time-of-flight cameras, or other such sensors 306 may also be used to identify the customer. Such data may be general (such as physical dimensions) or detailed (such as a 3D facial profile).
[0032] At long distances, data may be limited by the resolution and field of view of the camera sensors 306 on the vehicle. This may limit the ability to correctly identify a potential customer's intent and identity. As a result, the prediction confidence may be lower than at shorter distances. In such circumstances, the system 300 may alter the lidar sensor's 306 data collection behavior. For example, for a mechanical lidar sensor 306, the rotation speed of the sensor 306 may be decreased to create more tightly grouped points along the rotational direction of the lidar sensor 306 unit. In the case of a solid state lidar unit, the lidar may be "steered" toward the potential customer to obtain a tighter grouping of 3D data points. This denser lidar data may assist in correct identification of customer identity or customer intent.
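The effect of slowing a mechanical lidar's rotation can be seen in the per-channel azimuth spacing between successive returns; the firing rate below is illustrative rather than taken from any particular sensor:

    def lidar_azimuth_spacing_deg(firing_rate_hz, rotation_rate_hz):
        # Angular spacing between successive returns from one laser channel of a
        # mechanical lidar; slowing the rotation tightens the spacing.
        points_per_rev = firing_rate_hz / rotation_rate_hz
        return 360.0 / points_per_rev

    # Illustrative numbers only: 18 kHz per-channel firing rate.
    print(lidar_azimuth_spacing_deg(18_000, 10))   # 0.2 degrees at 10 Hz
    print(lidar_azimuth_spacing_deg(18_000, 5))    # 0.1 degrees at 5 Hz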
[0033] A response of the autonomous vehicle 304 may vary based on real-time conditions as well as a degree of confidence in the prediction of the potential customer's 302 identity and intent. In certain embodiments, prediction confidence may be reflected in terms of a confidence score assigned to the customer identification, which may be interpreted in terms of business logic specific to the objectives of the business, as described in more detail with reference to Figure 8 below.
[0034] In some embodiments, a probability framework may be used to determine a profit-making potential of a specific location, taking into account the identities of potential customers 302 in the location and their historical behavior. For example, in one embodiment, a point-of-sale autonomous vehicle 304 may scan a crowded street to identify known customers 302 in a crowd. Based on the composition of customers and historical data corresponding thereto, the system 300 may predict potential sales revenue at that location. Potential sales revenue may also be predicted based on the vehicle's 304 present configuration or stock of products and/or services available.
[0035] In one embodiment, the invention may implement cooperative motion planning. In this case, another autonomous vehicle 304 may be dispatched to assist in meeting a predicted demand for a given product or service when that demand is greater than the current on-site vehicle's 304 inventory. Such predicted demand may be calculated by adding together the predicted demand values corresponding to each new customer, as well as to each identified customer 302 with a prior order history.
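A rough sketch of the cooperative dispatch decision in paragraph [0035], assuming hypothetical per-customer demand estimates and an arbitrary inventory figure:

    def predicted_demand(identified_customers, new_customer_count,
                         default_order_size=1.0):
        # Sum expected order sizes: identified customers contribute their
        # historical averages, unknown customers a default estimate.
        known = sum(c["avg_order_size"] * c["purchase_probability"]
                    for c in identified_customers)
        return known + new_customer_count * default_order_size

    def needs_backup_vehicle(demand, onboard_inventory):
        return demand > onboard_inventory

    # Hypothetical crowd: two returning customers plus three unknowns.
    crowd = [{"avg_order_size": 2.0, "purchase_probability": 0.9},
             {"avg_order_size": 1.5, "purchase_probability": 0.6}]
    demand = predicted_demand(crowd, new_customer_count=3)
    print(demand, needs_backup_vehicle(demand, onboard_inventory=5))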
[0036] Facial recognition and/or gait analysis may also be performed to identify individuals that may pose a detriment to the profitability or security of the autonomous vehicle 304. For example, such individuals may include customers 302 that take a long time to order, order low-profit items, disrupt lines for service of the autonomous vehicle 304, damage or deface the autonomous vehicle 304, or pose a danger or detriment to other customers 302. In such cases, embodiments of the invention may display a warning on the autonomous vehicle 304 to direct the detrimental individual away from the autonomous vehicle 304 and/or to alert other customers to the potential problem. In addition, the autonomous vehicle 304 may reject the detrimental individual as a customer 302 by avoiding the individual (leaving the location, changing lanes, continuing to drive, or the like), or by physically or electronically preventing the individual from transacting business with the autonomous vehicle 304. As a final escalation level, the autonomous vehicle 304 may alert police if warranted by the circumstances.
[0037] In addition, any action may incorporate additional business logic. For example, in one embodiment, a customer designated to be avoided due to excessive ordering of low-cost items may nevertheless be approached if the area contains enough other customers determined to be interested in service. In this manner, the profitable customers may offset the deleterious effect of the otherwise avoidable customer, and the total time the vehicle 304 is stopped may be optimally profitable.
[0038] In some embodiments, the customer recognition performed by the system 300 may be used as a platform for police surveillance of individuals sought for a crime. Similarly, in other embodiments, a system 300 in accordance with the invention may alert police to crimes in progress by monitoring the data captured by the sensors 306 and/or utilizing an LRCN for classification purposes. The autonomous vehicle 304 may also be equipped to apprehend suspects if approached.
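The offsetting business logic of paragraph [0037] can be illustrated as an expected-value calculation over the customers at a stop; the margins, probabilities, service times, and cost rate below are hypothetical:

    def expected_stop_profit(customers, cost_per_minute=0.50):
        # Net expected value of a stop: each customer's expected margin minus the
        # cost of the time they are expected to occupy the vehicle. Detrimental
        # customers show up naturally as negative contributions.
        total = 0.0
        for c in customers:
            total += c["purchase_probability"] * c["expected_margin"]
            total -= c["expected_service_minutes"] * cost_per_minute
        return total

    # One slow, low-margin customer offset by two profitable ones.
    customers = [
        {"purchase_probability": 0.9, "expected_margin": 0.40, "expected_service_minutes": 6},
        {"purchase_probability": 0.8, "expected_margin": 4.00, "expected_service_minutes": 2},
        {"purchase_probability": 0.7, "expected_margin": 5.00, "expected_service_minutes": 2},
    ]
    print(expected_stop_profit(customers) > 0)   # True: stop anyway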
[0039] Past interactions of the potential customer 302 with the autonomous vehicle 304, or with other autonomous vehicles 304 within the fleet of vehicles 304 associated with a particular point-of-sale business, may also be used to estimate a proclivity of the potential customer 302 to patronize the autonomous vehicle 304. In some embodiments, data gathered by the sensors 306 may be used to biometrically identify the potential customer 302. Such identification information may be compared against a database to determine whether the potential customer 302 has previously interacted with the point-of-sale autonomous vehicle 304, and whether those interactions were positive or negative. This information may form the basis of planning an optimal route for the vehicle to follow.
[0040] The combination of these factors (physical motions of the potential customer 302, head position and/or eye gaze vectors 308, trajectory, facial expression, gestures, and historical customer 302 interactions) may be used to produce a probability that the potential customer 302 is actually summoning or hailing the autonomous vehicle 304. Probability data generated in this manner may include probability scores stored and accumulated to produce a probabilistic model indicating the potential customer's 302 probability of hailing and/or patronizing the autonomous vehicle 304 as a function of time.
[0041] This probability score or determination may inform the motion planning algorithm used to direct the autonomous vehicle 304 based on its current path and predicted customer 302 demand. For example, the system 300 may slow down or temporarily stop the autonomous vehicle 304, based on current traffic conditions, to test its prediction that the potential customer 302 is interested in patronizing the point-of-sale vehicle 304. Conversely, the motion planning algorithm may direct the autonomous vehicle 304 away from the area if the system 300 returns a low probability determination for the potential customer 302.
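One possible realization of the time-accumulated probability model in paragraph [0040] is an exponential moving average of per-frame hailing scores mapped to motion decisions through thresholds; the smoothing factor, thresholds, and frame scores below are illustrative only:

    class HailingBelief:
        # Accumulate per-frame hailing probabilities into a running belief using
        # an exponential moving average, then map it to a motion decision.

        def __init__(self, smoothing=0.3):
            self.smoothing = smoothing
            self.belief = 0.0

        def update(self, frame_probability):
            self.belief = (self.smoothing * frame_probability
                           + (1.0 - self.smoothing) * self.belief)
            return self.belief

        def decision(self, stop_threshold=0.7, slow_threshold=0.4):
            if self.belief >= stop_threshold:
                return "pull over"
            if self.belief >= slow_threshold:
                return "slow down and keep observing"
            return "continue on planned route"

    belief = HailingBelief()
    for p in [0.3, 0.5, 0.8, 0.9, 0.9]:   # hypothetical per-frame scores
        belief.update(p)
    print(round(belief.belief, 2), belief.decision())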
[0042] Referring now to Figures 4 and 5, in certain embodiments, a potential customer 402 may summon an autonomous vehicle 404 by way of an application on a computing device 408, such as a wearable device, digital kiosk, cell phone, tablet, laptop computer, desktop computer, or the like. The computing device 408 may communicate with the cloud 406 over any suitable network such as a cellular network, WiFi, or the like. The cloud 406 may in turn communicate with the autonomous vehicle 404 over a cellular network, WiFi, or the like.
[0043] In certain embodiments, a remote server, such as a cloud 406 server, may find a suitable autonomous vehicle 404 in the vicinity of the potential customer 402, and may instruct the autonomous vehicle 404 to travel to the location of the customer 402. In some embodiments, the cloud 406 server may calculate a travel time between available autonomous vehicles 404 and the location of the potential customer 402. The cloud 406 server may then select a particular autonomous vehicle 404 to provide goods or services to the customer 402 based on a shortest associated travel time.
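A minimal sketch of the shortest-travel-time selection, using a straight-line distance at an assumed speed in place of a real routing engine; vehicle positions and identifiers are hypothetical:

    import math

    def select_vehicle(available_vehicles, customer_location, speed_mps=8.0):
        # Pick the available vehicle with the shortest estimated travel time to
        # the customer; a routing engine would replace the distance estimate.
        def eta_seconds(vehicle):
            dx = vehicle["position"][0] - customer_location[0]
            dy = vehicle["position"][1] - customer_location[1]
            return math.hypot(dx, dy) / speed_mps
        return min(available_vehicles, key=eta_seconds)

    fleet = [{"id": "AV-1", "position": (120.0, 40.0)},
             {"id": "AV-2", "position": (300.0, -50.0)}]
    print(select_vehicle(fleet, customer_location=(100.0, 0.0))["id"])   # AV-1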
[0044] In other embodiments, the potential customer 402 may request a particular autonomous vehicle 404, or may request a specific location for the autonomous vehicle 404 to stop. As shown in Figure 5, for example, an application 502 installed on a mobile device 500 of the customer 402 may include symbols designating autonomous vehicles 506 in the area that are available to provide various types of goods or services in accordance with embodiments of the invention. As shown, the application 502 may include stickers depicting objects such as ice cream, tools, or groceries, overlaid onto a map of the vicinity. The stickers may alert the customer 402 to the types of goods and services provided by autonomous vehicles 404 in the area and, in some embodiments, may allow the customer 402 to select a particular autonomous vehicle 404 to access more detailed information regarding the vehicle 506 or its goods and services, and to request service.
[0045] The customer 402 may utilize this application 502, for example, to summon a ride from an autonomous vehicle 404. The user may specify a desired pickup location 504, or the pickup location 504 may be automatically determined based on a present location of the potential customer 402. In one embodiment, the mobile device 500 may include a global positioning system that may communicate the potential customer’s 402 present location to a cloud 406 server. The cloud 406 server may utilize this information to locate available autonomous vehicles 404 in the area. One autonomous vehicle 404 may be assigned to pick up the customer 402 based on the ride request. In some embodiments, the closest available autonomous vehicle 404 may be identified and assigned to the customer 402 based on an estimated travel time between a current location of the autonomous vehicle 404 and the pickup location 504, a distance between the autonomous vehicle 404 and the pickup location 504, or the like.
[0046] In some instances, the customer 402 may be difficult for the autonomous vehicle 404 to accurately identify or locate due to user error, sensor error, movement of the customer 402 to a different nearby location, or a location populated by more than one individual. A system 400 in accordance with the invention may address this problem by automatically accessing a customer profile associated with the customer 402, and/or the application or mobile phone 500 registered to the customer 402, upon receiving a request for service. For example, a photo of the customer 402 may be automatically transmitted to the autonomous vehicle 404 in connection with the customer's 402 request. Additionally, in some embodiments, the system 400 may automatically send location indicators (such as signs or nearby stores) to the autonomous vehicle 404 to facilitate precise identification of a service or pickup location 504 by the autonomous vehicle 404.
[0047] Referring now to Figure 6, a method 600 for detecting a potential customer in accordance with embodiments of the invention may include determining 602 whether a point-of-sale autonomous vehicle is in a location that is favorable for business. A point-of-sale autonomous vehicle located in an area surrounding a public place such as a park or shopping mall is more likely to receive business than one located in a remote residential area, for example. If the vehicle is not in a favorable location, the vehicle may move 614 to a more favorable location.
[0048] If, however, it is determined 602 that the vehicle is located in a favorable location, sensors onboard the autonomous vehicle may scan 604 a surrounding environment for potential customers. Sensors may include, for example, lidar sensors, radar sensors, time-of-flight cameras, stereo cameras, monocular cameras, microphones, and the like. Data gathered by these sensors may be retrieved 606 and analyzed by an onboard or remote processor to determine the presence and attributes of potential customers in the vicinity, and to determine 608 whether a potential customer is proximate the autonomous vehicle.
[0049] In some embodiments, sensor data may be combined with stored historical data corresponding to the customer to determine 610 whether the customer is likely to patronize the point-of-sale business associated with the autonomous vehicle. If yes, the autonomous vehicle may navigate 612 toward the customer for parking and engaging in a point-of-sale transaction. If no, the autonomous vehicle may continue 614 to a new location.
[0050] Referring now to Figure 7, a method 700 for positively identifying a potential customer in accordance with the invention may include detecting 702 a customer by utilizing sensors onboard the autonomous vehicle, as set forth above. Upon detecting a customer, sensor data may be retrieved 704 and utilized to biometrically ascertain 706 the identity and/or intent of the customer in patronizing the point-of-sale vehicle.
[0051] To increase prediction confidence in the identification 706 of the customer, certain embodiments of the invention may determine 708 whether a geographic history corresponding to the customer is commensurate with the customer’s current location. Such geographic historical information may be included in a customer profile associated with the customer and stored in a database. The stored geographic historical information may be updated each time the customer makes a request for service or is otherwise identified by an autonomous vehicle in connection with embodiments of the invention. Such information may include GPS coordinates associated with the user, cell phone application geo-location information, general location information, and the like.
[0052] Such information may inform a current identification of a customer by comparing a current location of the customer against the geographical historical information associated with the customer. Indeed, it is unlikely that a customer known to transact business with point-of-sale vehicles primarily in the Chicago area would be in Miami. Accordingly, a confidence score associated with identification of the customer in Miami may be reduced.
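The geographic commensurability check might, for example, scale the identification confidence score down with the distance from the nearest historically observed location; the distance thresholds and penalty below are illustrative, not part of the disclosure:

    import math

    EARTH_RADIUS_KM = 6371.0

    def haversine_km(lat1, lon1, lat2, lon2):
        # Great-circle distance between two latitude/longitude points.
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlmb = math.radians(lon2 - lon1)
        a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
        return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

    def adjusted_confidence(base_confidence, current_loc, historical_locs,
                            near_km=50.0, far_km=500.0, max_penalty=0.5):
        # Scale an identification confidence score down as the current sighting
        # moves farther from everywhere the customer has previously been seen.
        nearest = min(haversine_km(*current_loc, *h) for h in historical_locs)
        if nearest <= near_km:
            return base_confidence
        if nearest >= far_km:
            return base_confidence * (1.0 - max_penalty)
        frac = (nearest - near_km) / (far_km - near_km)
        return base_confidence * (1.0 - max_penalty * frac)

    # A customer seen only around Chicago, now apparently matched in Miami.
    chicago_history = [(41.88, -87.63), (41.79, -87.60)]
    print(round(adjusted_confidence(0.92, (25.76, -80.19), chicago_history), 2))   # 0.46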
[0053] In addition, the combination of sensor fusion and historical geo-location of a customer may be used to develop accurate machine learning models, such as convolutional neural networks (CNN), support vector machines (SVM), or principal component analysis (PCA), or ensemble algorithms in which different models vote on their predictions (e.g., gradient boosted decision trees).
[0054] If the current location of the customer is not commensurate with the geographical historical information for the customer, the method 700 may return to retrieve 704 additional sensor data to aid accurate customer identification. If the current location matches the geographical historical information or is within a specified deviation range, the method 700 may gather 710 additional data from the customer to positively identify or confirm identification of the customer. Such additional information may include, for example, additional biometric data including voice data, facial data, fingerprint data, gait data, and 3D point cloud data. In some embodiments, such additional data may include payment data such as bank card or credit card data, driver license, passport, other identification information or data such as phone identifiers, NFC payments, credit reports, and historical customer data.
[0055] Based on the sensor data and/or the additional data, some embodiments of the method 700 may determine 712 a class associated with the customer. A class may include, for example, age, gender, familial role, and the like. This information, combined with the information and data previously discussed, may be used to authenticate the customer to enable automatic or biometric-based payment for services at the point-of-sale autonomous vehicle, and/or to customize goods or services to the known or predicted preferences of the customer.
[0056] For example, when a customer uses a mobile device to pay for service or request a point-of-sale autonomous vehicle, this information may be used to connect the customer's biometric signature with his or her identity and/or bank account. Assuming high confidence in such identification, such biometric information may also allow the customer to pay for service automatically by authenticating a deduction from his or her account.
[0057] In certain embodiments, as mentioned previously, a customized menu of goods or services may be presented to a customer depending on the class to which the customer is assigned and the customer's past transaction history. For example, a food truck may customize its menu to remove alcoholic beverage options for customers who have been identified as being under the legal drinking age. Likewise, a mobile animal groomery may tailor its menu to reflect only dog-grooming services if the customer has been identified as a dog owner. In one embodiment, it may be determined 712 that the customer is female and a mother to the small child accompanying her. These class designations may inform the method 700 that the child is accompanied by a paying adult, and a menu of services may be customized to reflect family-friendly items. In other embodiments, a past ordering history of the customer may be used to tailor a current menu of goods or services for display to the customer.
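A small sketch of class-based menu customization as described in paragraph [0057]; all field names (age verification flags, class labels, menu attributes) are hypothetical:

    def customize_menu(menu, customer):
        # Filter a menu of goods according to class designations attached to an
        # identified customer; the specific rules below are illustrative.
        items = list(menu)
        if customer.get("age_verified_21_plus") is not True:
            items = [i for i in items if not i.get("alcoholic")]
        if "dog_owner" in customer.get("classes", ()):
            items = [i for i in items if i.get("species", "dog") == "dog"]
        if "accompanied_child" in customer.get("classes", ()):
            items = [i for i in items if i.get("family_friendly", True)]
        return items

    menu = [{"name": "craft beer", "alcoholic": True},
            {"name": "ice cream", "alcoholic": False, "family_friendly": True}]
    print([i["name"] for i in customize_menu(menu, {"classes": ["accompanied_child"]})])
    # ['ice cream']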
[0058] The method 700 may then determine 714 whether the customer interacts with or transacts business with the autonomous vehicle. The customer interaction may be positive or negative, including such things as perusing a menu, making a purchase, vandalizing the vehicle, or the like. If there is any customer interaction 714, a customer profile associated with the customer may be updated 716 to reflect the type and quality of interaction. If there is no interaction 714, the method 700 may return to detect 702 another customer.
[0059] Figure 8 depicts an algorithm or method 800 for determining whether a potential customer is likely to patronize a point-of-sale business associated with an autonomous vehicle. The method 800 may include first detecting 802 a potential customer. A potential customer may be detected by onboard sensors associated with the autonomous vehicle, or by a server or processor in communication with a mobile device of the potential customer. The method 800 may use the information gathered by the sensors and/or server or processor to identify 804 the customer.
[0060] If the customer cannot be identified, additional identification data may be collected 806. Additional identification data may include, for example, face, gait, 3D point cloud, voice, and/or fingerprint analysis. In some embodiments, additional identification data may further include historical geo-location and other data of the customer, payment data, driver license, passport, other identification data, and the like.
[0061] If the customer is positively identified 804, the method 800 may determine 808 whether the customer is detrimental to the point-of-sale business. Current sensor data may aid this determination, as may historical behavioral information associated with the customer. If the customer is determined 808 to be detrimental, the autonomous vehicle may avoid the customer by navigating 818 to another location, preventing a transaction with the customer, displaying a warning for others, or calling police. If the customer is determined 808 not to be detrimental, the method 800 may determine 810 whether the customer directly requested the point-of-sale autonomous vehicle. If yes, the autonomous vehicle may navigate 812 to the customer to facilitate a point-of-sale transaction, slow its speed to capture additional data, or park for an extended period to allow the customer to approach the vehicle.
[0062] If the customer did not request 810 the autonomous vehicle, the method 800 may continue to determine 814 whether the customer is likely hailing the vehicle. This determination may be made by utilizing biometric data and algorithms to estimate the physical positioning and movements of a potential customer indicating that the customer is intentionally summoning or hailing the vehicle. In one embodiment, such analysis may be performed by long-term recurrent convolutional networks. If it is determined 814 that the customer is hailing the vehicle, the vehicle may navigate 812 to the customer and/or otherwise allow the customer to patronize the vehicle.
[0063] If not, the method 800 may reference business logic provided by the owner or operator of the point-of-sale autonomous vehicle to determine 816 whether to delay servicing the potential customer. If the business logic authorizes a delay in its regular service to provide service to a new potential customer, the autonomous vehicle may navigate 812 to the customer. If not, the autonomous vehicle may navigate 818 to its original or other destination.
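The overall decision flow of method 800 can be summarized as a chain of checks; every predicate below is a placeholder for the sensor- and database-driven determinations described above, and the field names are hypothetical:

    def method_800_step(customer, business_logic, vehicle):
        # One pass through the decision flow of Figure 8.
        if not customer["identified"]:
            return "collect additional identification data"
        if customer["detrimental"]:
            return "avoid: navigate away, block transaction, or alert police"
        if customer["directly_requested_vehicle"]:
            return "navigate to customer"
        if customer["hailing_probability"] >= vehicle["hailing_threshold"]:
            return "navigate to customer"
        if business_logic["allow_service_delay"]:
            return "navigate to customer"
        return "continue to original destination"

    print(method_800_step(
        {"identified": True, "detrimental": False,
         "directly_requested_vehicle": False, "hailing_probability": 0.8},
        {"allow_service_delay": False},
        {"hailing_threshold": 0.7}))   # navigate to customer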
[0064] In operation, for example, an autonomous vehicle may be tasked with servicing a customer who has electronically requested pizza delivery by the autonomous vehicle. Time is of the essence to avoid delivering a cold pizza. If the autonomous vehicle encounters a new potential customer on its way to the requesting customer, it may be delayed by an additional 5-10 minutes, which may result in unsatisfactory service quality to the requesting customer. The method 800 may avoid this result by referencing business logic that does not authorize a delay to the requesting customer. In some embodiments, the method 800 may dispatch an additional vehicle to service either the requesting customer or the potential customer to maximize business profits without sacrificing service quality.
[0065] In the above disclosure, reference has been made to the accompanying drawings, which form a part hereof, and in which are shown by way of illustration specific implementations in which the disclosure may be practiced. It is understood that other implementations may be utilized and structural changes may be made without departing from the scope of the present disclosure. References in the specification to "one embodiment," "an embodiment," "an example embodiment," etc., indicate that the embodiment described may include a particular feature, structure, or characteristic, but every embodiment may not necessarily include the particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment. Further, when a particular feature, structure, or characteristic is described in connection with an embodiment, it is submitted that it is within the knowledge of one skilled in the art to effect such feature, structure, or characteristic in connection with other embodiments whether or not explicitly described.
[0066] Implementations of the systems, devices, and methods disclosed herein may comprise or utilize a special purpose or general-purpose computer including computer hardware, such as, for example, one or more processors and system memory, as discussed herein. Implementations within the scope of the present disclosure may also include physical and other computer-readable media for carrying or storing computer-executable instructions and/or data structures. Such computer-readable media can be any available media that can be accessed by a general purpose or special purpose computer system. Computer-readable media that store computer-executable instructions are computer storage media (devices). Computer-readable media that carry computer-executable instructions are transmission media. Thus, by way of example, and not limitation, implementations of the disclosure can comprise at least two distinctly different kinds of computer-readable media: computer storage media (devices) and transmission media.
[0067] Computer storage media (devices) includes RAM, ROM, EEPROM, CD-ROM, solid state drives (“SSDs”) (e.g., based on RAM), Flash memory, phase-change memory (“PCM”), other types of memory, other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer.
[0068] An implementation of the devices, systems, and methods disclosed herein may communicate over a computer network. A "network" is defined as one or more data links that enable the transport of electronic data between computer systems and/or modules and/or other electronic devices. When information is transferred or provided over a network or another communications connection (either hardwired, wireless, or a combination of hardwired and wireless) to a computer, the computer properly views the connection as a transmission medium. Transmission media can include a network and/or data links which can be used to carry desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer. Combinations of the above should also be included within the scope of computer-readable media.
[0069] Computer-executable instructions comprise, for example, instructions and data which, when executed at a processor, cause a general purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions. The computer-executable instructions may be, for example, binaries, intermediate format instructions such as assembly language, or even source code. Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the features or acts described above. Rather, the described features and acts are disclosed as example forms of implementing the claims.
[0070] Those skilled in the art will appreciate that the disclosure may be practiced in network computing environments with many types of computer system configurations, including, an in-dash vehicle computer, personal computers, desktop computers, laptop computers, message processors, hand-held devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, mobile telephones, PDAs, tablets, pagers, routers, switches, various storage devices, and the like. The disclosure may also be practiced in distributed system environments where local and remote computer systems, which are linked (either by hardwired data links, wireless data links, or by a combination of hardwired and wireless data links) through a network, both perform tasks. In a distributed system environment, program modules may be located in both local and remote memory storage devices.
[0071] Further, where appropriate, functions described herein can be performed in one or more of: hardware, software, firmware, digital components, or analog components. For example, one or more application specific integrated circuits (ASICs) can be programmed to carry out one or more of the systems and procedures described herein. Certain terms are used throughout the description and claims to refer to particular system components. As one skilled in the art will appreciate, components may be referred to by different names. This document does not intend to distinguish between components that differ in name, but not function.
[0072] It should be noted that the sensor embodiments discussed above may comprise computer hardware, software, firmware, or any combination thereof to perform at least a portion of their functions. For example, a sensor may include computer code configured to be executed in one or more processors, and may include hardware logic/electrical circuitry controlled by the computer code. These example devices are provided herein for purposes of illustration, and are not intended to be limiting. Embodiments of the present disclosure may be implemented in further types of devices, as would be known to persons skilled in the relevant art(s).
[0073] At least some embodiments of the disclosure have been directed to computer program products comprising such logic (e.g., in the form of software) stored on any computer useable medium. Such software, when executed in one or more data processing devices, causes a device to operate as described herein.
[0074] While various embodiments of the present disclosure have been described above, it should be understood that they have been presented by way of example only, and not limitation. It will be apparent to persons skilled in the relevant art that various changes in form and detail can be made therein without departing from the spirit and scope of the disclosure. Thus, the breadth and scope of the present disclosure should not be limited by any of the above-described exemplary embodiments, but should be defined only in accordance with the following claims and their equivalents. The foregoing description has been presented for the purposes of illustration and description. It is not intended to be exhaustive or to limit the disclosure to the precise form disclosed. Many modifications and variations are possible in light of the above teaching. Further, it should be noted that any or all of the aforementioned alternate implementations may be used in any combination desired to form additional hybrid implementations of the disclosure.

Claims

1. A method comprising:
scanning, with at least one sensor of an autonomous vehicle, an external environment for a potential customer of a point-of-sale business;
detecting, with the at least one sensor, the potential customer;
determining whether the potential customer is likely to patronize the point-of-sale business; and
navigating the autonomous vehicle to a location proximate the potential customer in the event the potential customer is likely to patronize the point-of-sale business.
2. The method of claim 1, further comprising gathering and analyzing data associated with the potential customer to positively identify the potential customer.
3. The method of claim 2, wherein the data comprises at least one of biometric data, GPS data, historical geo-location of the potential customer, payment data, and identification data.
4. The method of claim 3, wherein the biometric data comprises at least one of voice data, face data, fingerprint data, gait data, and 3D point cloud data.
5. The method of claim 1, wherein the at least one sensor is selected from the group consisting of a lidar sensor, a time-of-flight camera, a stereo camera, a monocular camera, and a microphone.
6. The method of claim 2, further comprising customizing at least one of goods and services of the point-of-sale business according to preferences associated with the identified customer.
7. The method of claim 1, wherein determining whether the potential customer is likely to patronize the point-of-sale business comprises analyzing at least one of historical interactions with the potential customer, a head pose, an eye gaze, and a physical motion of the potential customer relative to the autonomous vehicle.
8. The method of claim 1, wherein determining whether the potential customer is likely to patronize the point-of-sale business comprises receiving a request from the potential customer.
9. The method of claim 8, wherein navigating the autonomous vehicle to a location proximate the potential customer comprises navigating the autonomous vehicle to a location associated with the request.
10. The method of claim 2, further comprising enabling payment from the potential customer to the point-of-sale business via the data.
11. A system comprising:
an autonomous vehicle associated with a point-of-sale business, the autonomous vehicle having at least one sensor;
at least one processor; and
at least one memory device operably coupled to the at least one processor and storing instructions for execution on the at least one processor, the instructions causing the at least one processor to:
scan, with the at least one sensor, an external environment for a potential customer of the point-of-sale business;
detect, with the at least one sensor, the potential customer;
determine whether the potential customer is likely to patronize the point-of-sale business; and
navigate the autonomous vehicle to a location proximate the potential customer in the event the potential customer is likely to patronize the point-of-sale business.
12. The system of claim 11, wherein the instructions further cause the at least one processor to gather and analyze data associated with the potential customer to positively identify the potential customer.
13. The system of claim 12, wherein the data comprises at least one of biometric data, historical geo-location of the potential customer, payment data, and identification data.
14. The system of claim 12, wherein the instructions further cause the at least one processor to customize at least one of goods and services of the point-of-sale business according to preferences associated with the identified customer.
15. The system of claim 11, wherein determining whether the potential customer is likely to patronize the point-of-sale business comprises analyzing at least one of historical interactions with the potential customer, a trajectory, a head pose, an eye gaze, and a physical motion of the potential customer relative to the autonomous vehicle.
16. The system of claim 11, wherein determining whether the potential customer is likely to patronize the point-of-sale business comprises receiving a request from the potential customer.
17. The system of claim 16, wherein navigating the autonomous vehicle to a location proximate the potential customer comprises navigating the autonomous vehicle to a location associated with the request.
18. The system of claim 12, wherein the instructions further cause the at least one processor to enable payment from the potential customer to the point-of-sale business via the data.
19. A computer program product comprising a computer-readable storage medium having computer-usable program code embodied therein, the computer-usable program code configured to perform the following when executed by at least one processor:
scan, with at least one sensor of an autonomous vehicle, an external environment for a potential customer of a point-of-sale business;
detect, with the at least one sensor, the potential customer;
determine whether the potential customer is likely to patronize the point-of-sale business; and
navigate the autonomous vehicle to a location proximate the potential customer in the event the potential customer is likely to patronize the point-of-sale business.
20. The computer program product of claim 19, wherein the computer-usable program code is further configured to gather and analyze data associated with the potential customer to positively identify the potential customer.