US20210094508A1 - Pick-up authentication via audible signals - Google Patents
- Publication number
- US20210094508A1 (application US16/586,645)
- Authority
- US
- United States
- Prior art keywords
- autonomous vehicle
- user
- audible
- authentication code
- server
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- H04W4/40—Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P]
- B60Q5/005—Arrangement or adaptation of acoustic signal devices automatically actuated
- B60Q9/00—Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00-B60Q7/00, e.g. haptic signalling
- B60R25/1003—Alarm systems characterised by arm or disarm features
- G05D1/0276—Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
- G06Q10/02—Reservations, e.g. for tickets, services or events
- G06Q50/40—Business processes related to the transportation industry
- H04L67/306—User profiles
- H04W12/06—Authentication
- H04W12/65—Context-dependent security, environment-dependent, e.g. using captured environmental data
- H04W4/024—Guidance services
- H04W4/44—Services specially adapted for vehicles, for communication between vehicles and infrastructures, e.g. vehicle-to-cloud [V2C] or vehicle-to-home [V2H]
- B60R25/24—Means to switch the anti-theft system on or off using electronic identifiers containing a code not memorised by the user
- G05D2201/0213—
Definitions
- An autonomous vehicle is a motorized vehicle that can operate without a human driver.
- Such autonomous vehicles may include physical human-machine interfaces (HMIs); exemplary HMIs may include a mechanical push-button, a touch-sensitive display, or the like. While such HMIs are well-suited for a large portion of the population, these HMIs may be sub-optimal for those with vision impairments for a variety of reasons. For example, a person with a visual impairment may have difficulty locating the autonomous vehicle and verifying that the particular vehicle is the correct and/or intended vehicle that the person is looking for.
- a visually impaired passenger may have difficulty or concerns during the ride as to the route and direction of the vehicle, in which case such HMIs may lack the ability to provide the visually impaired passenger with desired support or information.
- HMIs upon disembarking from the autonomous vehicle may also prove to be difficult. Without the assistance of a driver to guide the passenger safely out of the autonomous vehicle, such HMIs may be sub-optimal for visually impaired passengers.
- Described herein are various technologies pertaining to enabling ridesharing and usage of an autonomous vehicle by a visually impaired passenger.
- the various technologies present a notification to the passenger based on a profile of the passenger, wherein the profile of the passenger specifies that the notification is to be provided audibly to the passenger (e.g., the passenger may have a visual impairment).
- the various technologies present the notification when a successful authentication between the passenger and an autonomous vehicle occurs. Content of the notification is based on occurrence of the successful authentication. Because the content of the notification is based on the successful authentication, the notification can inform the visually impaired passenger that the autonomous vehicle is the intended autonomous vehicle that the passenger should be looking for.
- the notification can include “the car is parked 100 feet to your right.”
- the notification can include “this is the correct vehicle, the doors have been unlocked so that you can get in.”
- an autonomous vehicle is configured to output a first authentication signal, while a user receives a second authentication signal from a server. The user may then verify that the two authentication signals match and notify the autonomous vehicle of the match. The autonomous vehicle may then confirm the successful authentication and output the notification.
- a server is in communication with the autonomous vehicle and the mobile computing device of the passenger.
- the autonomous vehicle may output the first authentication signal and the user may output the second authentication signal.
- the autonomous vehicle and/or the server may then confirm that the authentication signals match and accordingly output the above notification.
- FIG. 1 illustrates an environment having a user and an autonomous vehicle.
- FIG. 2A illustrates an exemplary method of audible authentication between a user and an autonomous vehicle.
- FIG. 2B illustrates another exemplary method of audible authentication between a user and an autonomous vehicle.
- FIG. 3 illustrates an autonomous vehicle during a journey.
- FIG. 4 illustrates a user and an autonomous vehicle arriving at a destination.
- FIG. 5 illustrates an autonomous vehicle in accordance with this subject disclosure.
- FIG. 6 illustrates a mobile computing device in accordance with this subject disclosure.
- FIG. 7 illustrates a server in accordance with this subject disclosure.
- FIG. 8 is a flow diagram that illustrates various processes that occur when a user requests a ridesharing service.
- FIG. 9A is a flow diagram that illustrates a pickup process in FIG. 8 .
- FIG. 9B is a flow diagram that illustrates a second safe boarding process.
- FIG. 10 is a timeline flow diagram that illustrates a successful safe boarding process.
- FIG. 11 is a schematic block diagram that illustrates a pickup process.
- FIG. 12 is a flow diagram that illustrates a journey verification process in FIG. 8 .
- FIG. 13 is a schematic block diagram that illustrates an enroute process.
- FIG. 14 is a flow diagram that illustrates a safe disembarking process in FIG. 8 .
- FIG. 15 is a schematic block diagram that illustrates a disembarking process.
- the term “or” is intended to mean an inclusive “or” rather than an exclusive “or.” That is, unless specified otherwise, or clear from the context, the phrase “X employs A or B” is intended to mean any of the natural inclusive permutations. That is, the phrase “X employs A or B” is satisfied by any of the following instances: X employs A; X employs B; or X employs both A and B.
- the articles “a” and “an” as used in this application and the appended claims should generally be construed to mean “one or more” unless specified otherwise or clear from the context to be directed to a singular form.
- the terms “component” and “system” are intended to encompass computer-readable data storage that is configured with computer-executable instructions that cause certain functionality to be performed when executed by a processor.
- the computer-executable instructions may include a routine, a function, or the like. It is also to be understood that a component or system may be localized on a single device or distributed across several devices.
- the term “exemplary” is intended to mean serving as an illustration or example of something and is not intended to indicate a preference.
- a computer-readable profile of a passenger can indicate that the passenger prefers to receive information audibly.
- the autonomous vehicle can cause an audible message to be presented to the passenger.
- the passenger (who may be visually impaired) need not attempt to request information by interacting with a physical human-machine interface (HMI).
- sensed events on a trip in an autonomous vehicle can trigger audible support for a passenger who has a visual impairment. While the techniques set forth herein are described for use with passengers having visual impairments, it is to be appreciated that these techniques can be utilized to assist passengers having other types of disabilities and/or even without disabilities; thus, as noted herein, personalization settings in a profile of a passenger can indicate a preference for audible presentation of information when the passenger is in the autonomous vehicle.
- an autonomous vehicle includes a display screen located in an interior of the autonomous vehicle that can be configured to receive typed-out support requests and provide support information.
- it can be difficult for the passenger to operate the display screen to request support, much less read the support information, while the autonomous vehicle is moving.
- the disclosed methods and systems can be integrated with an autonomous vehicle to provide contextual audible support messages.
- FIGS. 1-4 collectively show a high-level overview of different processes that may occur when a user uses an autonomous vehicle.
- a user or passenger 102 may request use of an autonomous vehicle 104 .
- the user or passenger 102 may have a vision impairment.
- typical use of and navigation to/from the autonomous vehicle 104 may be more difficult for the user 102 .
- the user 102 may try to follow a path 106 to a pickup location 108 to board the autonomous vehicle 104 .
- Due to the vision impairment, the user 102 may have difficulty navigating to the autonomous vehicle 104 .
- audible guidance 110 may be provided to the user to follow the path 106 to the pickup location 108 .
- the user 102 will need to identify and authenticate the autonomous vehicle 104 to determine that the particular autonomous vehicle 104 is the intended autonomous vehicle to board. Due to the vision impairment, the user 102 may have difficulty identifying the particular autonomous vehicle 104 by traditional authentication methods, such as examining the license plate or the make, model, and/or color of the vehicle. Thus, the user may audibly authenticate the autonomous vehicle 104 .
- FIGS. 2A-2B show two different embodiments for audibly authenticating the autonomous vehicle 104 .
- FIG. 2A demonstrates a server 202 sending a first audible authentication code or signal 204 a over a first signal 206 a to the autonomous vehicle 104 and a second audible authentication code or signal 204 b over a second signal 206 b to a mobile computing device 200 of the user 102 .
- the mobile computing device 200 of the user 102 may output the second audible authentication code 204 b so that the user 102 is audibly aware of and/or may audibly receive the second audible authentication signal 204 b .
- the autonomous vehicle 104 then outputs the first audible authentication code 204 a so that the user 102 can audibly receive the first audible authentication code 204 a .
- the user 102 may then output the second audible authentication code 204 b so that the autonomous vehicle 104 can audibly receive the second audible authentication code 204 b .
- the autonomous vehicle 104 can determine whether the audible authentication codes 204 a , 204 b match to form a successful authentication and accordingly unlock the autonomous vehicle.
- the autonomous vehicle 104 may also then notify the server 202 of the successful authentication over a third signal 208 a .
- the server 202 may receive and confirm that the audible authentication codes 204 a , 204 b match and accordingly unlock the autonomous vehicle 104 over the third signal 208 a .
- the mobile computing device 200 of the user 102 may be configured to output the second audible authentication code 204 b so that the autonomous vehicle 104 may receive the second audible authentication code 204 b , thus relieving the user 102 from receiving and outputting the second audible authentication code 204 b.
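The FIG. 2A exchange, in which the autonomous vehicle 104 itself checks whether the code it received from the server matches the code it heard from the user, can be sketched as follows. This is an illustrative Python sketch, not the patent's implementation; the word-list code format and the function names are assumptions.

```python
import secrets

# A small vocabulary of distinct, easy-to-hear words (an assumed code format).
WORDS = ["amber", "birch", "cedar", "delta", "ember", "flint", "grove", "harbor"]

def generate_audible_code(n_words: int = 3) -> str:
    """Server side: build a short phrase that is easy to speak and hear."""
    return " ".join(secrets.choice(WORDS) for _ in range(n_words))

def normalize(code: str) -> str:
    """Tolerate case differences and extra whitespace in a spoken/recognized code."""
    return " ".join(code.lower().split())

def vehicle_verifies(vehicle_code: str, heard_code: str) -> bool:
    """Vehicle side (FIG. 2A): compare the code received from the server
    with the code heard from the user; unlock only on a match."""
    return normalize(vehicle_code) == normalize(heard_code)
```

On a match, the vehicle would unlock its doors and report the successful authentication back to the server over the third signal 208 a.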
- FIG. 2B demonstrates the server 202 sending the first audible authentication code 204 a over the first signal 206 a to the autonomous vehicle 104 and the second audible authentication code 204 b over the second signal 206 b to the mobile computing device 200 of the user 102 .
- the mobile computing device 200 of the user 102 may output the second audible authentication code 204 b so that the user 102 is audibly aware of and/or may audibly receive the second audible authentication signal 204 b .
- the autonomous vehicle 104 then outputs the first audible authentication code 204 a so that the user 102 can audibly receive the first audible authentication code 204 a .
- the user 102 then verifies the first audible authentication code 204 a against the second audible authentication code 204 b and notifies the server 202 and/or the autonomous vehicle 104 over the third signal 208 b that the audible authentication codes 204 a , 204 b match.
- the server 202 may then notify the autonomous vehicle 104 , so that the autonomous vehicle 104 would then allow entry thereto.
- the authentication codes may be communicated in other ways as well (e.g., non-audible means).
- the server 202 may securely send authentication codes to the autonomous vehicle 104 and mobile computing device 200 of the user 102 .
- the mobile computing device 200 may be in communication with the autonomous vehicle 104 through an interface, such as Bluetooth. Once the mobile computing device 200 is authenticated, connected to and/or in communication with the autonomous vehicle 104 , then the mobile computing device 200 and the autonomous vehicle 104 communicate to ensure that the user 102 is near the autonomous vehicle 104 . When the user 102 is near the autonomous vehicle 104 , the mobile computing device 200 may signal to the autonomous vehicle 104 to open the doors.
- the authentication codes are communicated among the autonomous vehicle 104 , the mobile computing device 200 , and the server 202 non-audibly. More specifically, the authentication codes are sent from the server 202 to the autonomous vehicle 104 and the mobile computing device 200 . Then, the autonomous vehicle 104 and the mobile computing device 200 communicate directly, so that the autonomous vehicle 104 and/or the mobile computing device 200 may authenticate the authentication codes non-audibly. The autonomous vehicle 104 and/or the mobile computing device 200 may then notify the other and/or the server 202 , which may then send notification of the successful authentication to one or both of the autonomous vehicle 104 and the mobile computing device 200 .
- the autonomous vehicle 104 and the mobile computing device 200 may communicate directly, so that the autonomous vehicle 104 and/or the mobile computing device 200 may receive the other authentication code to send both authentication codes back to the server 202 for verification.
- the server 202 would then notify the autonomous vehicle 104 and/or the mobile computing device 200 of the authentication.
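The server-side variant, in which both codes are echoed back to the server 202 for verification, might look like the following sketch. The `PairingServer` class, its method names, and the hex code format are hypothetical; the patent does not prescribe an API.

```python
import hmac
import secrets

class PairingServer:
    """Issues a code pair per ride and verifies codes echoed back to it."""

    def __init__(self):
        self._codes = {}  # ride_id -> (vehicle_code, user_code)

    def issue(self, ride_id: str):
        # In this sketch the same secret is issued to both endpoints;
        # the patent also allows two distinct signals.
        code = secrets.token_hex(4)
        self._codes[ride_id] = (code, code)
        return self._codes[ride_id]

    def verify(self, ride_id: str, vehicle_code: str, user_code: str) -> bool:
        issued = self._codes.get(ride_id)
        if issued is None:
            return False
        # Constant-time comparison avoids leaking how many characters matched.
        return (hmac.compare_digest(issued[0], vehicle_code)
                and hmac.compare_digest(issued[1], user_code))
```

After `verify` succeeds, the server would notify the autonomous vehicle 104 and/or the mobile computing device 200, which then allows entry.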
- an unintended third party may also enter the autonomous vehicle 104 while the door is open. This may occur when the unintended third party enters before the user 102 , after the user 102 , and/or enters through a different door than the user 102 .
- the autonomous vehicle 104 may also track and verify a number of people boarding the autonomous vehicle 104 against an intended number of users 102 .
- the autonomous vehicle 104 may also verify that the doors to the autonomous vehicle 104 are properly closed. If the doors are not properly closed, the autonomous vehicle 104 may warn and/or notify the user 102 . For example, a user may have a long skirt, which gets stuck in the door when the door closes. In some embodiments, the autonomous vehicle 104 may detect this and alert the user 102 .
- the autonomous vehicle 104 may detect that the user 102 is accompanied by young children. Thus, the autonomous vehicle 104 may alert the user 102 to lock the doors near where the young children are sitting, so that the young children don't accidentally open the doors. Moreover, the autonomous vehicle 104 may also detect that the young children are wearing seatbelts, such that if the autonomous vehicle 104 fails to detect the seatbelts being fastened, the autonomous vehicle 104 may alert the user 102 .
- the user 102 may also give commands regarding these and other issues, while in the autonomous vehicle 104 or through their mobile computing device 200 , so that the autonomous vehicle 104 may initiate the above.
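The boarding-safety checks described above (counting occupants against the intended number, door closure, and seatbelts) can be collapsed into a single rule pass, sketched below; the `CabinState` fields are hypothetical sensor outputs, not names from the patent.

```python
from dataclasses import dataclass, field

@dataclass
class CabinState:
    occupants: int
    expected_occupants: int
    open_doors: list = field(default_factory=list)      # e.g. ["rear-left"]
    unbelted_seats: list = field(default_factory=list)  # occupied but not buckled

def boarding_alerts(state: CabinState) -> list:
    """Return the audible alerts the vehicle should play before departing."""
    alerts = []
    if state.occupants > state.expected_occupants:
        alerts.append("An unexpected additional passenger has boarded.")
    for door in state.open_doors:
        alerts.append(f"The {door} door is not properly closed.")
    for seat in state.unbelted_seats:
        alerts.append(f"The seatbelt at the {seat} seat is not fastened.")
    return alerts
```

An empty list would mean the vehicle may depart; otherwise each alert is presented to the user 102 audibly.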
- FIG. 3 demonstrates that throughout the journey, the autonomous vehicle 104 will have an actual journey route or path 304 that may have a divergence 306 from a predetermined journey route or path 302 .
- the user 102 may be curious of the predetermined journey route or path 302 that the autonomous vehicle 104 should follow.
- a support message 308 may be output so that the user 102 may be aware of the divergence 306 and the cause of the divergence 306 .
- the support message 308 may be output audibly, haptically, visibly, and/or a variety of other methods.
- the autonomous vehicle 104 may provide the support message 308 for some and/or all maneuvers that the autonomous vehicle 104 takes.
- the support message 308 may additionally be used to let the user 102 know that there has not been a divergence 306 .
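Detecting the divergence 306 reduces to measuring how far the vehicle's actual position has drifted from the predetermined journey route 302. A minimal planar sketch follows; the 30-unit threshold and the local x/y coordinate frame are assumptions for illustration.

```python
import math

def _seg_dist(p, a, b):
    """Distance from point p to the line segment a-b."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    # Project p onto the segment, clamped to its endpoints.
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def divergence_from_route(position, route, threshold=30.0):
    """Flag a divergence 306 when the vehicle is farther than `threshold`
    from every segment of the predetermined route 302."""
    gap = min(_seg_dist(position, a, b) for a, b in zip(route, route[1:]))
    return gap > threshold, gap
```

When the first return value is true, the support system would emit a support message 308 explaining the divergence and its cause.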
- FIG. 4 demonstrates the autonomous vehicle 104 arriving at a destination 402 .
- the destination 402 will be the intended destination that the user 102 requested.
- the autonomous vehicle 104 may implement procedures to ensure that the user 102 disembarks safely. For users 102 with vision impairments, disembarking safely may be difficult due to a lesser ability to see potential hazards or dangers outside of the autonomous vehicle 104 .
- a pedestrian 404 may be walking down the street towards the autonomous vehicle 104 .
- the user 102 may be unaware of which side of the autonomous vehicle 104 he/she should disembark on.
- the autonomous vehicle 104 may pull over on the left side of a one-way street.
- the autonomous vehicle 104 may provide an audible message 406 from the side that the user 102 should disembark on.
- the audible message 406 may provide directions relative to the user 102 .
- the audible message 406 may state “Please exit to your right.”
- the autonomous vehicle 104 may identify each user 102 and localize the audible message 406 , such that the audible message 406 is sent to the specific user 102 .
- the autonomous vehicle 104 may instead output the audible message 406 having more generic instructions so that each user 102 is notified of which side to exit from.
- the autonomous vehicle 104 may output the audible message 406 to an exterior of the autonomous vehicle 104 .
- the audible message 406 may indicate that a person (e.g., a person that has a disability) is disembarking from the autonomous vehicle 104 to notify and protect potential pedestrians 404 .
- the audible message 406 may be communicated in a variety of other methods, such as haptically, visibly, etc.
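Selecting which side the audible message 406 should direct the user to can be framed as a per-side hazard check. The sketch below assumes a boolean hazard flag per side and a known curb side, which are simplifications of the vehicle's actual perception output.

```python
def choose_exit_side(hazards_left: bool, hazards_right: bool, curb_side: str = "right"):
    """Pick the disembark side: prefer the curb side unless it is hazardous.
    Returns (side, message); returns (None, warning) if both sides are blocked."""
    other = "left" if curb_side == "right" else "right"
    blocked = {"left": hazards_left, "right": hazards_right}
    if not blocked[curb_side]:
        return curb_side, f"Please exit to your {curb_side}."
    if not blocked[other]:
        return other, f"Please exit to your {other}."
    return None, "Please wait; both sides are currently blocked."
```

A spatial audio system could then play the chosen message from the corresponding side of the cabin so that the direction itself is a cue.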
- FIG. 5 is a schematic block diagram of the autonomous vehicle 104 .
- the autonomous vehicle 104 may have an autonomous vehicle computing system 502 , an autonomous vehicle transceiver 512 , an autonomous vehicle location detection system 514 , an autonomous vehicle audio system 516 , a microphone 518 , and an autonomous vehicle sensor system 520 .
- the autonomous vehicle computing system 502 may have an autonomous vehicle processor 504 and autonomous vehicle memory 506 , where the autonomous vehicle memory 506 stores computer-executable instructions that are executed by the autonomous vehicle processor 504 .
- the autonomous vehicle processor 504 may be or may include a central processing unit (CPU), a graphics processing unit (GPU), a plurality of GPUs and/or CPUs, an application-specific integrated circuit (ASIC), or the like.
- the support system 508 is configured to receive and/or store preference data for the user 102 in a computer-readable profile of the user.
- the preference data may include or show that: 1) the user is differently abled (e.g., a visual impairment); 2) information should be delivered in a certain way (e.g., audibly); and 3) the frequency of update information (i.e., how often the autonomous vehicle 104 should tell the user 102 of turns, stoplights, etc.).
- the user 102 may have a visual impairment, so the user 102 would define in his profile that he would like information to be delivered audibly at every turn.
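The computer-readable profile might be pictured as a small record like the following; the field names and values are illustrative assumptions, not terminology from the patent.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PassengerProfile:
    user_id: str
    impairment: Optional[str] = None   # e.g. "visual"
    delivery_mode: str = "visual"      # how information is delivered: "audible", "haptic", ...
    update_frequency: str = "major"    # "every_turn", "major", or "arrival_only"

def wants_turn_updates(profile: PassengerProfile) -> bool:
    """True when the support system should announce every turn audibly."""
    return profile.delivery_mode == "audible" and profile.update_frequency == "every_turn"
```

The support system 508 would consult such a record to decide both the channel and the cadence of its notifications.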
- the support system 508 may be configured to continually track the movement of the autonomous vehicle 104 and provide updates of the movements of the autonomous vehicle 104 to the user 102 .
- the autonomous vehicle 104 may audibly notify the visually impaired user 102 that the autonomous vehicle 104 is making a turn along the predetermined journey route 302 or is making a turn, resulting in the divergence 306 from the predetermined journey route 302 .
- the autonomous vehicle memory 506 also stores computer-executable instructions that run the control system 510 .
- the control system 510 is configured to control a propulsion system 522 , a braking system 524 , and a steering system 526 of the autonomous vehicle 104 .
- the control system 510 may also control other systems in the autonomous vehicle, such as locks on doors of the autonomous vehicle 104 .
- the autonomous vehicle 104 further has an autonomous vehicle transceiver 512 that is configured to communicate with other systems.
- the autonomous vehicle transceiver 512 may send to and/or receive signals from a mobile device of the user and/or with the server 202 .
- Onboard the autonomous vehicle 104 may also be an autonomous vehicle location detection system or position sensor system 514 .
- This may be any sensor that provides detection of the location of the autonomous vehicle 104 , such as a global positioning system (GPS) or a combination of other systems.
- the autonomous vehicle 104 may further have an audio system 516 that may output information and other audible signals, such as music.
- the audio system 516 may also be a spatial audio system that provides audible signals from a variety of different directions, such that the user 102 can understand directional cues from the audio system 516 .
- the audio system 516 may audibly provide directions from a first side of the autonomous vehicle 104 to exit the autonomous vehicle 104 from the first side.
- the autonomous vehicle 104 may have a microphone 518 and/or autonomous vehicle sensor system 520 to receive audible and other types of signals.
- the microphone 518 may receive the aforementioned second audible authentication code 204 b from the user.
- the autonomous vehicle sensor system 520 may be composed of a variety of different observational sensors, such as light detection and ranging (LIDAR) sensors, cameras, suspension monitors, etc. These various sensors may provide a wide variety of signals that indicate different types of movement of the autonomous vehicle 104 , as well as information about the environment around the autonomous vehicle 104 .
- the suspension monitors may detect a turn based on different loads on each wheel.
- cameras and LIDAR are able to determine whether obstacles may be near a door of the autonomous vehicle 104 or whether other hazards are near or in the surroundings of the autonomous vehicle 104 .
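The suspension-monitor idea, inferring a turn from left/right load transfer, can be sketched as below. The 15% imbalance threshold is an assumed calibration value, and real wheel-load signals would need filtering before a check like this.

```python
def detect_turn(wheel_loads: dict, threshold: float = 0.15):
    """Infer a turn from the left/right load imbalance.
    `wheel_loads` maps wheel names ("front_left", ...) to normal loads.
    Body roll shifts load toward the outside of a turn, so a rightward
    load shift suggests a left turn and vice versa."""
    left = wheel_loads["front_left"] + wheel_loads["rear_left"]
    right = wheel_loads["front_right"] + wheel_loads["rear_right"]
    imbalance = (right - left) / (left + right)
    if imbalance > threshold:
        return "left"   # load shifted right -> turning left
    if imbalance < -threshold:
        return "right"
    return None
```

The support system could pair such a detection with the route plan to announce "turning left now" at the frequency the profile requests.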
- FIG. 6 illustrates a mobile computing device 200 that the user 102 may have.
- the mobile computing device 200 may take form in a wide variety of different embodiments, including, but not limited to smart phones, smart watches, tablets, hearing aids, etc.
- the mobile computing device 200 may have a mobile device processor 602 , mobile device memory 604 , a mobile device transceiver 608 , a mobile device location detection system 610 , a mobile device audio system 612 , and a mobile device microphone 614 .
- the mobile device processor 602 may be or may include a central processing unit (CPU), a graphics processing unit (GPU), a plurality of GPUs and/or CPUs, an application-specific integrated circuit (ASIC), or the like.
- the mobile device memory 604 stores computer-executable instructions that are executed by the mobile device processor 602 .
- a support application 606 may be stored within the mobile device memory 604 .
- the support application 606 may be configured to interface with the user 102 to receive and/or store the preference data for the user 102 in a computer-readable profile of the user.
- the preference data may include or show that: 1) the user is differently abled (e.g., a visual impairment); 2) information should be delivered in a certain way (e.g., audibly); and 3) the type of information to be given (i.e., guidance to and from the autonomous vehicle 104 ).
- the user 102 may have a visual impairment, so the user 102 would define in his profile that he would like guidance to and from the autonomous vehicle 104 to be delivered audibly.
- the mobile computing device 200 may have a mobile device transceiver 608 configured to communicate with other systems.
- the mobile device transceiver 608 may send to and/or receive signals from the autonomous vehicle 104 and/or with the server 202 .
- the mobile computing device 200 may also have a mobile device location detection system 610 .
- This may be any sensor that provides a location signal of the mobile computing device 200 , such as a global positioning system (GPS) or a combination of other systems.
- the mobile device audio system 612 of the mobile computing device 200 is adapted to output audible signals, such as audible notifications, music, audible guidance, etc.
- the mobile device audio system 612 may be a speaker, speakers, a headset, and/or other similar component or devices that provide audible signals.
- the mobile device audio system 612 may be a component within the mobile computing device 200 or may also be an attachment or peripheral that connects to the mobile computing device 200 .
- the mobile computing device 200 may also have a mobile device microphone 614 .
- the mobile device microphone 614 is adapted to receive audible signals, such as the first audible authentication code 204 a from the autonomous vehicle.
- FIG. 7 illustrates the server 202 that may communicate with the autonomous vehicle 104 and the mobile computing device 200 .
- the server 202 may have a server processor 702 , server memory 704 , a server communication system 712 , and a data store 714 .
- the server processor 702 may be or may include a central processing unit (CPU), a graphics processing unit (GPU), a plurality of GPUs and/or CPUs, an application-specific integrated circuit (ASIC), or the like.
- the server memory 704 stores computer-executable instructions that are executed by the server processor 702 .
- Within the server memory 704 may be a pairing system 706 , a route planning system 708 , and a verification system 710 .
- the pairing system 706 is configured to connect the user 102 with the autonomous vehicle 104 .
- the connection may include both the designation or “pairing” of the user 102 with the specific autonomous vehicle 104 and/or the authentication process detailed further below between the user 102 and the specific autonomous vehicle 104 .
- the route planning system 708 is configured to determine the predetermined journey route 302 based on a location of the user 102 at the time of the user's request for ridesharing service and a location of the destination 402 requested.
- the route planning system 708 may take into account a wide variety of factors to determine the predetermined journey route 302 , such as traffic, weather conditions, etc.
- the route planning system 708 may also be configured to determine the path 106 to assist the user 102 in finding the autonomous vehicle 104 , as shown in FIG. 1 .
- the route planning system 708 may also be configured to determine a final path from a drop-off location to the destination 402 , as shown in FIG. 4 .
- the verification system 710 is configured to track a location of the user 102 through the mobile device location detection system 610 of the user's mobile computing device 200 and a location of the autonomous vehicle 104 through the autonomous vehicle location detection system 514 . As the location of the user 102 changes, the verification system 710 tracks the changes in real-time to form and update the actual journey route 304 . In addition, the verification system 710 is configured to compare the location of the user 102 and the actual journey route 304 against the predetermined journey route 302 for potential divergences 306 .
- the server communication system 712 is configured to communicate with the autonomous vehicle 104 and/or the mobile computing device 200 .
- the server communication system 712 may also be configured to communicate with additional autonomous vehicles 104 a and other mobile computing devices 200 .
- FIG. 8 is a flowchart diagram that provides an overview of a process the user 102 may encounter when requesting and using a ridesharing service.
- Subprocesses within the process of requesting and using a ridesharing service may consist of a pick-up process 806 , an en route process 808 , and a disembarking process 810 .
- the process ends at step 812 if the user 102 does not need the ridesharing service or does not indicate the need and/or preferences for audible support.
- the user 102 requests the autonomous vehicle 104 .
- the user 102 may have set up a user profile, having preference and/or disability data therein.
- step 804 shows that upon receiving the request, the server 202 may access the user profile and determine whether the user is differently abled, requires special actions, and/or prefers special actions, such as providing information audibly.
- FIGS. 9A and 9B each illustrate an exemplary pick-up process 806 a , 806 b , respectively, that may be used for the pick-up process 806 .
- FIG. 9A illustrates what occurs after the user 102 requests use of the autonomous vehicle 104 .
- the server 202 and/or the mobile computing device 200 notifies the autonomous vehicle 104 of user profile information, which may include user preferences, special actions, and/or the user's 102 disabilities.
- step 904 a shows the server 202 sends the first audible authentication code 204 a to the autonomous vehicle 104 over the first signal 206 a and the second audible authentication code 204 b to the user 102 over the second signal 206 b .
- steps 902 a and 904 a may occur in any combination of points in time.
- the two steps 902 a , 904 a may occur concurrently, or step 904 a may occur before step 902 a .
- the server 202 may send the audible authentication codes 204 a , 204 b in any order to either party.
- the first audible authentication code 204 a may be sent to the user 102 after the second audible authentication code 204 b is sent to the autonomous vehicle 104 .
- step 906 a when the autonomous vehicle 104 arrives at pickup location 108 , the autonomous vehicle 104 notifies the server 202 and/or the mobile computing device 200 of the user 102 of the arrival of the autonomous vehicle 104 . In some instances, the autonomous vehicle 104 may be unable to arrive at the exact location of the user 102 , which would require the user 102 to travel to the location of the autonomous vehicle 104 for pick-up and/or boarding.
- step 908 a if the user 102 has noted in a user profile that the user 102 is visually impaired and/or would prefer to receive audible guidance to the pick-up location 108 , the server 202 and/or the mobile computing device 200 would provide audible guidance 110 for the path 106 from the user's current position to pick-up location 108 , where the autonomous vehicle 104 is situated.
- FIGS. 2A and 9A show in step 910 a , when the mobile device location detection system 610 outputs a position signal similar to and/or within a threshold distance from the pick-up location 108 , the autonomous vehicle 104 audibly outputs the first audible authentication signal 204 a .
- the user 102 then hears the first audible authentication signal 204 a and verifies the first audible authentication signal 204 a against the second audible authentication signal 204 b that the server 202 sent to the user's mobile computing device 200 in step 904 a .
- the mobile computing device 200 may output the second audible signal 204 b again so that the user 102 may correctly verify the audible authentication codes 204 a , 204 b .
- the user 102 may then notify the server 202 whether the audible authentication codes 204 a , 204 b match.
- the user 102 may select an option on the mobile computing device 200 indicating whether the audible authentication codes 204 a , 204 b match. If the user 102 verifies that the audible authentication codes 204 a , 204 b match, then the process continues forward to step 918 a.
- Step 916 a shows that if the audible authentication codes 204 a , 204 b do not match, the autonomous vehicle 104 a is not the autonomous vehicle 104 that the user 102 is searching for. Thus, the user 102 does not board the autonomous vehicle 104 a and is prompted to search nearby for the intended autonomous vehicle 104 . More specifically, the server 202 receives notice that the audible authentication codes 204 a , 204 b do not match. Thus, the server 202 sends instructions to the autonomous vehicle 104 and/or the mobile computing device 200 to notify the user that the autonomous vehicle 104 is not the intended autonomous vehicle 104 and to search nearby for the intended autonomous vehicle 104 .
- the mobile computing device 200 may communicate directly with the autonomous vehicle 104 that the autonomous vehicle 104 is not the intended autonomous vehicle 104 .
- the autonomous vehicle 104 may issue a signal, such as a honk, to indicate that the autonomous vehicle 104 is incorrect.
- the intended autonomous vehicle 104 may issue a signal, such as a honk, to provide the user 102 notice of where the autonomous vehicle 104 is located. Again, audible guidance 110 may be provided to the user 102 .
- the mobile computing device 200 of the user 102 could notify the user 102 in some way that is consistent with the user's understanding that there is a mismatch with the autonomous vehicle 104 .
- the mismatch may also be verified by the server 202 and sent to the mobile computing device 200 .
- the mobile computing device 200 may also send, in combination with whether the audible authentication codes 204 a , 204 b match, the audible authentication codes 204 a , 204 b received to the server 202 .
- the server 202 may check and/or verify whether the audible authentication codes 204 a , 204 b match.
- the mobile computing device 200 may be a smart braille device, such that the smart braille device can communicate to the user 102 the match or mismatch with the autonomous vehicle 104 .
- Matching of the audible authentication codes 204 a , 204 b may have a variety of different implementations. For example, a sequence of random or specific letters, words, and/or numbers may be used (e.g. a license plate of the autonomous vehicle 104 , 1234, 1493, ABC, ASDF, etc.). For example, in some embodiments, the server 202 may send the user 102 the license plate number as one of the authentication codes, such that if the user gets the same license plate number as one of the authentication codes from the autonomous vehicle 104 , then the autonomous vehicle 104 and the user 102 are matched. As another example, a melody of a song may also be used.
- a successful authentication need not be exact copies of the audible authentication codes 204 a , 204 b (e.g., the first audible authentication code 204 a may be 1234, while the second audible authentication code 204 b may be 5678).
- a first part of a melody of a song may be used as the first audible authentication code 204 a and a second part of the melody of the song may be used as the second audible authentication code 204 b .
- a “matching” may be a relational connection between the first audible authentication code 204 a and the second audible authentication code 204 b instead of a mirror image or identical copy of the other.
- matching of the audible authentication codes 204 a , 204 b to form the successful authentication may be implemented in a variety of different ways.
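The "relational" notion of matching described above can be sketched in a few lines: codes authenticate if they are identical (e.g. the license plate example) or if they form a known complementary pair (e.g. two halves of one melody). The pairing table and function names are illustrative assumptions, not an implementation from the disclosure.

```python
# Hypothetical table of pre-agreed complementary code pairs; in practice the
# server 202 would generate these when pairing the user with the vehicle.
PAIRED_CODES = {
    ("1234", "5678"),                     # complementary number codes
    ("melody-part-1", "melody-part-2"),   # two halves of one song's melody
}

def codes_match(first_code: str, second_code: str) -> bool:
    """Successful authentication: an exact copy OR a known relational pairing."""
    if first_code == second_code:         # identical codes, e.g. license plate
        return True
    return (first_code, second_code) in PAIRED_CODES or \
           (second_code, first_code) in PAIRED_CODES

print(codes_match("ABC-1234", "ABC-1234"))  # True: identical
print(codes_match("1234", "5678"))          # True: relational pair
print(codes_match("1234", "9999"))          # False: no relation
```

Either the autonomous vehicle onboard or the server's pairing system could run such a check, per the two verification paths described in FIG. 9B.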
- the autonomous vehicle 104 may unlock the doors, play a welcome message, welcome the customer with a personalized message verifying the name of the customer and/or the destination 402 etc. In some embodiments, the autonomous vehicle 104 may recognize that the user 102 has come with additional passengers and welcomes the additional passengers as well. Step 920 a shows the user 102 boarding the autonomous vehicle 104 .
- FIG. 9B also illustrates what may occur after the user 102 requests use of the autonomous vehicle 104 . More specifically, FIG. 9B shows a similar pick-up process with a different method of authentication. As shown in step 902 b , the server 202 and/or the mobile computing device 200 notifies the autonomous vehicle 104 of user profile information, which may include user preferences, special actions and/or the user's 102 disabilities. Next step 904 b shows the server 202 sends the first audible authentication code 204 a to the autonomous vehicle 104 over the first signal 206 a and the second audible authentication code 204 b to the user 102 over the second signal 206 b .
- steps 902 b and 904 b may occur in any combination of points in time.
- the two steps 902 b , 904 b may occur concurrently, or step 904 b may occur before step 902 b .
- the server 202 may send the audible authentication codes 204 a , 204 b in any order to either party.
- the first audible authentication code 204 a may be sent to the user 102 after the second audible authentication code 204 b is sent to the autonomous vehicle 104 .
- step 906 b when the autonomous vehicle 104 arrives at pickup location 108 , the autonomous vehicle 104 notifies the server 202 and/or the mobile computing device 200 of the user 102 of the arrival of the autonomous vehicle 104 . In some instances, the autonomous vehicle 104 may be unable to arrive at the exact location of the user 102 , which would require the user 102 to travel to the location of the autonomous vehicle 104 for pick-up and/or boarding.
- step 908 b if the user 102 has noted in a user profile that the user 102 is visually impaired and/or would prefer to receive audible guidance to the pick-up location 108 , the server 202 and/or the mobile computing device 200 would provide audible guidance 110 for the path 106 from the user's current position to pick-up location 108 , where the autonomous vehicle 104 is situated.
- FIGS. 2B and 9B show in step 910 b , when the mobile device location detection system 610 outputs a position signal similar to and/or within a threshold distance from the pick-up location 108 , the autonomous vehicle 104 audibly outputs the first audible authentication signal 204 a.
- step 912 b after the autonomous vehicle 104 outputs the first audible authentication code 204 a , the user 102 is prompted to respond with the second audible authentication code 204 b.
- the microphone 518 may then receive the audibly output second audible authentication code 204 b and either process the matching authentication onboard the autonomous vehicle 104 or send the audible authentication signals 204 a , 204 b from the autonomous vehicle transceiver 512 to the server communication system 712 , where the server 202 uses the pairing system 706 to process and identify whether the audible authentication signals 204 a , 204 b match. If the latter, the server communication system 712 then notifies the autonomous vehicle 104 whether the audible authentication signals 204 a , 204 b match.
- Step 916 b shows that if the audible authentication codes 204 a , 204 b do not match, the autonomous vehicle 104 a is not the autonomous vehicle 104 that the user 102 is searching for. Thus, the user 102 does not board the autonomous vehicle 104 a and is prompted to search nearby for the intended autonomous vehicle 104 .
- the autonomous vehicle 104 may issue a signal, such as a honk, to indicate that the autonomous vehicle 104 is incorrect.
- the intended autonomous vehicle 104 may issue a signal, such as a honk, to provide the user 102 some notice of where the autonomous vehicle 104 is located.
- audible guidance 110 may be provided to the user 102 .
- the mobile computing device 200 of the user 102 could notify the user 102 in some way that is consistent with the user's understanding that there is a mismatch with the autonomous vehicle 104 .
- the mismatch may also be verified by the server 202 and sent to the mobile computing device 200 .
- the mobile computing device 200 may be a smart braille device, such that the smart braille device can communicate to the user 102 the match or mismatch with the autonomous vehicle 104 .
- Step 918 b shows that if the audible authentication signals 204 a , 204 b match, then the user 102 is allowed to board the autonomous vehicle 104 .
- the mobile computing device 200 of the user 102 may communicate securely with the autonomous vehicle 104 either directly and/or through the server 202 .
- the autonomous vehicle 104 may notify the user 102 .
- the autonomous vehicle 104 may unlock the doors, play a welcome message, welcome the customer with a personalized message verifying the name of the customer and/or the destination 402 etc.
- the autonomous vehicle 104 may recognize that the user 102 has come with additional passengers and welcomes the additional passengers as well.
- Step 920 b shows the user 102 boarding the autonomous vehicle 104 .
- the audible authentication codes 204 a , 204 b need not both be audible.
- the autonomous vehicle 104 may output the first audible authentication code 204 a so that the user 102 may authenticate the first audible authentication code 204 a against a non-audible authentication code (e.g. haptic-based code that matches a pattern of the first audible authentication code, or a visible authentication code for non-visually impaired users, etc.).
- the mobile computing device 200 would then send the signal to the server 202 informing the server 202 of the successful authentication.
- the server 202 may then send instructions to the autonomous vehicle 104 to unlock.
- the autonomous vehicle 104 may output the first audible authentication signal 204 a , while the mobile computing device 200 is configured to receive, and does receive, the first audible authentication signal 204 a .
- the mobile computing device 200 may then encode the first audible authentication signal 204 a and transmit the first audible authentication signal 204 a to the server 202 , where the server 202 can authenticate and notify both the autonomous vehicle 104 and the mobile computing device 200 if there is a successful authentication.
- the mobile computing device 200 may instead output the first audible authentication signal 204 a , while the autonomous vehicle 104 is configured to receive, and does receive, the first audible authentication signal 204 a .
- the autonomous vehicle 104 may then encode and transmit the first audible authentication signal 204 a to the server 202 , where the server 202 can authenticate and notify both the autonomous vehicle 104 and the mobile computing device 200 if there is a successful authentication. In these instances, only the first audible authentication code 204 a is necessary for a successful authentication.
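The single-code variant just described — one party outputs the audible code, the other records it, encodes it, and uploads it for the server to authenticate — can be sketched as below. The use of an HMAC digest as the "encoding," the shared session key, and all names are assumptions for illustration; the disclosure does not specify the encoding.

```python
import hashlib
import hmac

# Hypothetical session key the server 202 shares with the receiving party.
SHARED_KEY = b"server-issued-session-key"

def encode_for_upload(heard_code: str) -> str:
    """Receiver side: encode the audible code heard before transmitting it."""
    return hmac.new(SHARED_KEY, heard_code.encode(), hashlib.sha256).hexdigest()

def server_authenticate(issued_code: str, uploaded_digest: str) -> bool:
    """Server side: recompute the digest for the code it issued and compare;
    on success it would notify both the vehicle and the mobile device."""
    expected = hmac.new(SHARED_KEY, issued_code.encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, uploaded_digest)

issued = "1493"                              # code the server gave the outputting party
heard = encode_for_upload("1493")            # what the receiver's microphone heard
print(server_authenticate(issued, heard))    # True: successful authentication
```

Note this variant needs only the first audible authentication code, matching the observation above that only one code is necessary.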
- FIG. 10 provides an exemplary timeline of different processes, while also providing insight into which component may handle the process.
- the timeline and categorization of processes by each component is merely provided as an example, and it is to be understood that the order and handling of each process by different components may be varied in a wide variety of different ways.
- the user 102 may request a ride in step 1002 . More specifically, the user 102 may use the support application 606 on his or her mobile computing device 200 to request a ride on the autonomous vehicle 104 . The request is sent through the mobile device transceiver 608 to the server communication system 712 .
- the server 202 then accesses the user profile in the data store 714 to determine user profile information, such as if the user 102 is differently abled and/or if the user 102 has requested that information be provided audibly.
- the request is then handled by the pairing system 706 to pair the user 102 with the autonomous vehicle 104 .
- the pairing system 706 then creates and sends authentication signals to both autonomous vehicle transceiver 512 and the mobile device transceiver 608 of the mobile computing device 200 of the user 102 . If the user 102 is visually impaired or has noted in the profile that information should be provided audibly, then the authentication signals will be the audible authentication signals 204 a , 204 b.
- step 1006 a the autonomous vehicle 104 receives the first audible authentication signal 204 a .
- step 1006 b the user 102 and/or the mobile computing device 200 of the user 102 receives the second audible authentication signal 204 b to verify when authenticating the autonomous vehicle 104 .
- Step 1008 shows the autonomous vehicle arriving at the pickup location 108 .
- Step 1010 shows that, in response to the autonomous vehicle 104 arriving at the pick-up location 108 , the server 202 is notified of the arrival of the autonomous vehicle 104 by either a notification sent from the autonomous vehicle 104 to the server communication system 712 and/or a matching of the location signal from the autonomous vehicle location detection system 514 with the pickup location 108 .
- the server 202 then notifies the user 102 of the arrival of the autonomous vehicle 104 and, in accordance with the user's profile, causes the mobile device audio system 612 to output and provide audible guidance 110 to the pickup location 108 , where the autonomous vehicle 104 is also located.
- the autonomous vehicle 104 may directly communicate with the mobile computing device 200 .
- the user's 102 location may be determined through the mobile device location detection system 610 .
- the mobile device location detection system 610 outputs a position signal that identifies where the user 102 is located.
- the mobile computing device 200 may be configured to send via the mobile device transceiver 608 the position signal output from the mobile device location detection system 610 to the server 202 .
- steps 1012 and 1014 show that as the user 102 arrives at the pickup location 108 , where the autonomous vehicle 104 is also located, the server 202 will detect that the position signal of the mobile device location detection system 610 is similar to and/or within a threshold distance from the pickup location 108 and the position signal of the autonomous vehicle location detection system 514 . The server 202 may then notify the autonomous vehicle 104 that the user 102 is near the autonomous vehicle 104 .
- the mobile computing device 200 may communicate directly with the autonomous vehicle 104 .
- a near-field communication (NFC) system or other wireless technology, such as Bluetooth may be used to provide direct communication between the mobile computing device 200 and the autonomous vehicle 104 .
- step 1016 when the autonomous vehicle 104 is aware that the user 102 is nearby, either by a notification from the server 202 , by direct communication with the mobile computing device 200 of the user 102 , or by any other method of communication, the autonomous vehicle computing system 502 will cause the autonomous vehicle audio system 516 to output the first audible authentication signal 204 a.
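The trigger in step 1016 admits several notification paths: a server notification, direct communication from the mobile computing device, or a proximity check. A minimal sketch of that decision, assuming an illustrative 25-meter threshold and hypothetical parameter names:

```python
from typing import Optional

# Illustrative threshold; the disclosure speaks only of "within a threshold
# distance" without fixing a value.
THRESHOLD_M = 25.0

def should_output_code(server_notified: bool,
                       device_notified: bool,
                       user_distance_m: Optional[float]) -> bool:
    """Any one signal that the user 102 is nearby suffices to trigger the
    audible output of the first authentication code 204a."""
    if server_notified or device_notified:
        return True
    return user_distance_m is not None and user_distance_m <= THRESHOLD_M

print(should_output_code(False, False, 12.0))  # True: within threshold
print(should_output_code(True, False, None))   # True: server notified
print(should_output_code(False, False, 80.0))  # False: still far away
```

Direct device-to-vehicle notification would arrive over NFC or Bluetooth, per the communication options noted above.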
- step 1018 the user 102 verifies the first audible authentication signal 204 a against the earlier received second audible authentication signal 204 b .
- step 1020 the user 102 may then use the mobile computing device 200 to notify the server 202 and/or the autonomous vehicle 104 directly through the support application 606 and/or by sending a notification or signal through the mobile device transceiver 608 .
- the user 102 and/or the mobile computing device 200 of the user 102 may instead output the second audible authentication signal 204 b so that the autonomous vehicle 104 receives the second audible authentication signal 204 b .
- the autonomous vehicle 104 may then either verify onboard the first audible authentication signal 204 a against the second audible authentication signal 204 b or send the first and second audible authentication signals 204 a , 204 b to the server 202 , where the pairing system 706 determines whether the signals 204 a , 204 b match.
- the server communication system 712 then notifies the autonomous vehicle 104 of a successful authentication.
- Step 1022 illustrates that in response to the successful authentication of the first and second audible authentication signals 204 a , 204 b , the autonomous vehicle control system 510 allows entry to the autonomous vehicle 104 and the user 102 accordingly boards the autonomous vehicle 104 . It is further contemplated that the autonomous vehicle 104 may previously have had doors locked to prevent entry thereto; thus, when the autonomous vehicle 104 allows entry thereto, the doors may be unlocked.
- FIG. 10 is simply one example in a wide variety of different methods that fall within the scope of this disclosure.
- One of ordinary skill in the art would understand that modifications to ordering, timing, components, and/or other processes would fall within the scope of this disclosure.
- FIG. 11 illustrates a schematic block diagram of a procedure 1100 for boarding the autonomous vehicle 104 .
- Procedure 1100 begins with step 1105 , and continues to step 1110 where the autonomous vehicle 104 determines that information is to be provided audibly to the user 102 when information is provided to the user 102 .
- Procedure 1100 continues to step 1115 where the user 102 and the autonomous vehicle 104 create a successful authentication using the first audible authentication code 204 a and the second audible authentication code 204 b .
- the successful authentication may be created in a wide variety of methods, as discussed above.
- the user 102 is audibly notified of the successful authentication. It is to be clear that any of the above-identified audio systems or communication systems may be used to audibly notify the user 102 , including but not limited to the autonomous vehicle audio system 516 , the mobile device audio system 612 , and/or the server communication system 712 .
- the user 102 is then allowed entry into the autonomous vehicle at step 1125 .
- the process subsequently ends at step 1130 .
- the autonomous vehicle 104 and the user 102 may begin the journey.
- FIG. 12 illustrates the en route process 808 .
- the server 202 may use the route planning system 708 to create the predetermined journey route 302 .
- the server 202 communicates the predetermined journey route 302 to the autonomous vehicle 104 .
- the autonomous vehicle 104 may have a locally stored route planning system to create the predetermined journey route 302 and the locally generated predetermined journey route 302 may be transmitted to the server 202 (e.g., the verification system 710 ).
- the autonomous vehicle 104 begins moving along the predetermined journey route 302 .
- the autonomous vehicle location detection system 514 outputs the position signal of the autonomous vehicle 104 .
- the server communication system 712 may receive the position signal output, store the position signal output in the data store 714 , and use the position signal output in the verification system 710 .
- the mobile device location detection system 610 may also output the position signal output of the mobile device 200 , which allows the server communication system 712 to similarly receive the position signal output, store the position signal output in the data store 714 , and use the position signal output in the verification system 710 .
- Steps 1206 and step 1208 show the verification system detecting divergences 306 from the predetermined journey route 302 . More specifically, the verification system 710 compares the position signal output of the autonomous vehicle 104 and/or the mobile computing device 200 against the predetermined journey route 302 . The position signal output of the autonomous vehicle 104 creates the actual journey route 304 , which may or may not be substantially similar to the predetermined journey route 302 .
- Step 1210 shows that if the actual journey route 304 is substantially similar to the predetermined journey route 302 , then the autonomous vehicle 104 will arrive at the destination 402 as planned.
- if the actual journey route 304 is not substantially similar to and/or exceeds a threshold distance away from the predetermined journey route 302 , then there is the divergence 306 .
- Step 1212 then shows that the user 102 is notified of the divergence 306 .
- the verification system 710 of the server 202 may communicate with the server communication system 712 to send a signal to the mobile device transceiver 608 .
- the mobile computing device 200 processes the signal from the mobile device transceiver 608 and may notify the user 102 through the support application 606 or any other suitable method.
- the autonomous vehicle 104 may announce the divergence through the autonomous vehicle audio system 516 . In either instance, the audio systems 516 , 612 output the support message 308 , which explains that the divergence 306 has occurred.
- Step 1214 shows that when the divergence 306 is detected, the server 202 may further determine and/or authenticate the cause of the divergence 306 . This determination and/or authentication may be achieved through a wide variety of different methods according to this subject disclosure.
- One such method is comparing the position signal outputs of the autonomous vehicle 104 and the position signal outputs of the mobile computing device 200 of the user 102 .
- if the position signal outputs differ significantly, there is a possibility that the autonomous vehicle 104 and/or the mobile computing device 200 has been maliciously attacked. In other words, the autonomous vehicle and/or the mobile computing device 200 may be compromised to cause the location detection systems 508 , 610 to output inaccurate position signals (i.e. spoofing).
- Another method is comparing the actual journey route 304 against journey routes or paths of other nearby autonomous vehicles 104 a . If the actual journey route 304 differs significantly from other journey routes of other autonomous vehicles 104 a , there is a possibility that the autonomous vehicle 104 has been maliciously attacked. In other words, when the autonomous vehicle 104 makes a divergence 306 from the predetermined journey route 302 that other autonomous vehicles 104 a do not also make or follow, the autonomous vehicle 104 may be compromised such that the propulsion system 522 , braking system 524 , and/or steering system 526 no longer follow routing instructions from the server 202 .
- a third party may have control of the autonomous vehicle 104 , resulting in actions not requested and/or implemented by the user 102 and/or the server 202 .
- the server 202 and/or the autonomous vehicle 104 may communicate with other autonomous vehicles 104 and/or other sensors nearby to determine whether unusual changes have occurred. These sensors may include, but are not limited to, other trusted autonomous vehicles 104 , cameras as part of infrastructure, other government and non-government infrastructure, etc.
- the server 202 and/or the autonomous vehicle 104 may communicate with the city to determine whether other autonomous vehicles 104 a are making similar pathing decisions or if there are new traffic developments, such as construction, flooding, etc.
- Step 1216 shows that if the verification system 710 determines that the divergence 306 is not caused by a malicious attack, the verification system 710 communicates with the route planning system 708 to create a new journey route, which becomes the predetermined journey route 302 .
- the route planning system 708 reroutes the autonomous vehicle 104 to the destination 402 , despite the divergence 306 .
- the autonomous vehicle 104 may detect the divergence 306 and request a new journey route from the server 202 , which will respond by having the route planning system 708 create a new journey route that becomes the new predetermined journey route 302 . It is further contemplated that the verification system 710 may further determine the cause of the divergence 306 and, in accordance with the profile of the user, cause the autonomous vehicle audio system 516 or the mobile device audio system 612 to audibly notify the user 102 of the cause of the divergence 306 .
- the verification system 710 may determine that construction has caused a road in the predetermined journey route to be closed; thus the verification system 710 communicates with the mobile device audio system 612 via the support application 606 to output an audible message stating that the divergence 306 (or turn away from the street) is due to a road closure caused by construction.
- After receiving the new predetermined journey route 302, the verification system 710 again tracks the progress of the autonomous vehicle 104 and/or the mobile computing device 200 along the predetermined journey route 302 for divergences 306 from the predetermined journey route 302. If the actual journey route 304 is identical to the predetermined journey route 302, then the autonomous vehicle 104 and the user 102 will arrive at the destination 402.
- the server communication system 712 sends instructions to the autonomous vehicle 104 to stop and/or pull over.
- the server 202 may also be configured to send a notification or otherwise inform authorities and/or other emergency services of the malicious attack and the positions of the user 102 and the autonomous vehicle 104.
- the mobile computing device 200 may be configured to also compare the progress of the autonomous vehicle 104 and/or the mobile computing device 200 along the predetermined journey route 302 . If the divergence 306 occurs, the mobile computing device 200 may be configured to request information regarding the divergence from the server 202 and/or the autonomous vehicle 104 . The verification system 710 may then respond with the information. In the event of a malicious attack, the user 102 may then contact authorities and/or other people on their own or may request through the support application 606 to have the server 202 and/or the autonomous vehicle 104 contact authorities and/or other resources.
- verification system 710 may also be configured to cause the autonomous vehicle audio system 516 and/or the mobile device audio system 612 to, in accordance with the user's profile, audibly output each turn along the predetermined journey path 302 , irrespective of whether there is any divergence 306 .
- the user 102 may want to change the destination 402 while enroute.
- the user 102 may speak to the microphone 518 of the autonomous vehicle 104 and/or the mobile computing device 200 to ask for the change.
- the autonomous vehicle 104 may then send the request to the route planning system 708 , which will create a new journey path and update the predetermined journey path 302 with the new journey path.
- the user 102 may be notified audibly of the change.
- the user 102 may also want to add stops and/or quick destinations along the journey to the destination 402 .
- the user 102 may speak to the microphone 518 of the autonomous vehicle 104 and/or the mobile computing device 200 to ask for the addition.
- the autonomous vehicle 104 may then send the request to the route planning system 708 , which will create a new journey path with the additional stops and update the predetermined journey path 302 with the new journey path.
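Purely as an illustration of how a requested stop might be spliced into an existing journey path, a route planner could insert the stop between the pair of consecutive waypoints where the detour adds the least extra distance. The function names and planar-coordinate simplification below are assumptions, not the patented route-planning method.

```python
import math

def dist(a, b):
    """Straight-line distance between two planar waypoints."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def insert_stop(waypoints, stop):
    """Insert the requested stop where the detour adds the least extra
    distance, returning the path that would replace the predetermined
    journey path."""
    best_i, best_extra = 1, float("inf")
    for i in range(1, len(waypoints)):
        a, b = waypoints[i - 1], waypoints[i]
        extra = dist(a, stop) + dist(stop, b) - dist(a, b)
        if extra < best_extra:
            best_i, best_extra = i, extra
    return waypoints[:best_i] + [stop] + waypoints[best_i:]
```

A real planner would route over the road network rather than straight lines, but the update pattern — build a new path, then promote it to the predetermined journey path — is the same.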
- FIG. 13 is a schematic block diagram of a procedure 1300 for boarding the autonomous vehicle 104 .
- Procedure 1300 begins with step 1305, and continues to step 1310 where it is determined that information provided to the user 102 is to be provided audibly.
- Procedure 1300 continues to step 1315 where a location of the passenger 102 is compared against the predetermined journey path 302 .
- the comparison may detect the divergence 306 from the predetermined journey path based upon the position signal output by the autonomous vehicle 104 and/or the mobile computing device 200 .
- the divergence 306 is authenticated.
- the divergence 306 may be authenticated in a wide variety of different methods to determine whether there is actually a divergence 306 from the predetermined journey path 302 and the cause of the divergence 306 .
- the divergence 306 may be authenticated by comparing a location of the autonomous vehicle against a location of a second autonomous vehicle that is trusted.
- the second autonomous vehicle acts just as a sensor, in that the second autonomous vehicle is able to determine the location of the autonomous vehicle 104 .
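One hedged sketch of this cross-check: treat each trusted nearby vehicle as a sensor reporting where it observes the autonomous vehicle 104, and treat the divergence 306 as genuine (e.g., a legitimate reroute) only when the vehicle's self-reported position agrees with every trusted observation. The names, the return labels, and the 25-meter tolerance are illustrative assumptions.

```python
import math

def positions_consistent(self_reported, observed_by_trusted, tolerance_m=25.0):
    """Compare the vehicle's self-reported position against a position
    independently measured by a trusted nearby vehicle's sensors.
    Large disagreement suggests the self-report may be spoofed."""
    dx = self_reported[0] - observed_by_trusted[0]
    dy = self_reported[1] - observed_by_trusted[1]
    return math.hypot(dx, dy) <= tolerance_m

def authenticate_divergence(self_reported, trusted_observations, tolerance_m=25.0):
    """Accept the divergence as genuine only if every trusted observer
    agrees with the self-reported track; otherwise escalate."""
    if all(positions_consistent(self_reported, obs, tolerance_m)
           for obs in trusted_observations):
        return "genuine-divergence"
    return "possible-attack"
```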
- Procedure 1300 continues to step 1325 , where the cause of the divergence 306 from the predetermined journey path 302 is determined to be from a malicious attack. As stated above, there are a wide variety of different methods to determine whether the divergence 306 is caused by a malicious attack.
- the autonomous vehicle audio system 516 and/or the mobile device audio system 612 outputs the support message 308 to notify the passenger of the divergence and the cause of the divergence.
- a distress signal is sent to inform authorities of the malicious attack and the divergence 306 from the predetermined journey path 302 .
- the distress signal may be sent from the mobile computing device 200 of the user, the server 202 , and/or the autonomous vehicle 104 .
- the procedure 1300 subsequently ends at step 1340 .
- FIG. 14 demonstrates one such disembarking process 810 .
- the autonomous vehicle 104 , the server 202 and/or the mobile computing device 200 may be tracking the location of the autonomous vehicle 104 and/or the mobile computing device 200 .
- upon arrival near the destination 402, the autonomous vehicle 104 will find a safe location to stop and come to a stop.
- the location may be considered safe based on a wide variety of different factors, such as vehicular traffic, foot traffic, speed of traffic, weather conditions, etc.
- the server 202 will compare the stop location against the location of the destination 402 .
- the server communication system 712 sends a notification to the user 102.
- the notification may be audible and may be communicated through the autonomous vehicle audio system 516 and/or the mobile device audio system 612 .
- the notification may further include details of the location of the autonomous vehicle 104 in relation to the destination 402 , which may also allow the user 102 to verify that the destination 402 is the intended destination. For example, the notification may state “We have arrived at a safe location. We are parked 100 feet in front of the post office.”
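For illustration only, the distance detail in such a notification could be derived from the stop and destination coordinates with a great-circle formula; the helper names, the Earth-radius constant, and the message wording below are assumptions rather than the claimed implementation.

```python
import math

def haversine_feet(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in feet."""
    r_feet = 20_902_231  # approximate mean Earth radius in feet
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r_feet * math.asin(math.sqrt(a))

def arrival_notification(stop, destination, landmark="the destination"):
    """Compose the audible arrival message from the distance between the
    stop location and the destination."""
    distance = round(haversine_feet(*stop, *destination))
    return (f"We have arrived at a safe location. "
            f"We are parked about {distance} feet from {landmark}.")
```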
- Step 1406 shows that the autonomous vehicle 104 then scans an area around the vehicle and an intended direction for the user 102 to disembark towards.
- the autonomous vehicle 104 may scan the surrounding area with the autonomous vehicle sensor system 520 .
- an optical camera and/or lidar may be configured to observe obstacles and movement along a sidewalk nearby.
- Step 1408 shows the autonomous vehicle 104 determining whether the surrounding area is safe.
- the autonomous vehicle 104 may determine the surrounding area is not safe because the pedestrian 404 is running in the direction of the autonomous vehicle 104, which would create a possibility of the pedestrian hitting the door of the autonomous vehicle 104 when the user 102 opens the door. If the autonomous vehicle 104 determines that the surrounding area is not safe, it continues to scan the surrounding area until there is a safe opportunity for the user 102 to disembark from the autonomous vehicle 104. Many factors may be used to determine whether the surrounding area is safe and/or whether there is a safe opportunity for the user 102 to disembark.
- the autonomous vehicle 104 may create a prediction of where the objects and/or obstacles will be based on their current positions and velocities, and accordingly determine the safe opportunity, such that the safe opportunity accounts for no other objects entering that area or location when the user 102 may disembark.
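A minimal sketch of such a prediction, assuming constant-velocity extrapolation of tracked objects and a rectangular "door zone" beside the vehicle — both simplifying assumptions for illustration, not the claimed detection method:

```python
def door_zone_clear(objects, door_zone, horizon_s=5.0, step_s=0.5):
    """Predict each tracked object's position under constant velocity and
    check that none enters the door zone within the prediction horizon.

    objects: list of (x, y, vx, vy) in meters / meters-per-second.
    door_zone: axis-aligned box (xmin, ymin, xmax, ymax) beside the door.
    """
    xmin, ymin, xmax, ymax = door_zone
    steps = int(horizon_s / step_s) + 1
    for x, y, vx, vy in objects:
        for i in range(steps):
            t = i * step_s
            px, py = x + vx * t, y + vy * t
            if xmin <= px <= xmax and ymin <= py <= ymax:
                return False  # an object is predicted to enter the zone
    return True  # safe opportunity: zone stays clear for the whole horizon
```

A production system would use richer motion models and sensor uncertainty, but the decision shape — predict, check the disembarking area, wait if occupied — matches the step described above.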
- Step 1410 shows that when the autonomous vehicle 104 has detected a safe opportunity to disembark, the autonomous vehicle 104 notifies the user that there is a safe opportunity to disembark. Again, this may be accomplished through the autonomous vehicle audio system 516 and/or the mobile device audio system 612 .
- Step 1412 shows that the autonomous vehicle 104 may guide the user 102 to exit the vehicle during the safe opportunity to disembark. More specifically, the autonomous vehicle 104 determines directional information to audibly guide disembarking. This directional information may be determined from a wide variety of factors, such as the location of the autonomous vehicle 104 , the orientation of the autonomous vehicle 104 , a direction of the destination 402 in relation to the autonomous vehicle 104 , etc. Again, this information may be audibly provided to the user 102 through the autonomous vehicle audio system 516 and/or the mobile device audio system 612 . Furthermore, the autonomous vehicle audio system 516 may be configured to provide spatial audio.
- the autonomous vehicle audio system 516 may be configured to output audio from specified directions, such that the user 102 will be able to easily identify the specified direction to disembark from the autonomous vehicle 104 . While this may be helpful to users 102 with visual impairments, this spatial audio guidance may also be helpful for all users 102 in general. For example, in camp-style seating arrangements (e.g. seats all facing inwards, such that passengers may easily talk to each other, like around a camp fire), the users 102 may easily forget and/or lose track of which direction is “left” and/or “right.” Thus, by spatially providing audible guidance through the autonomous vehicle audio system 516 , all users 102 in general would easily understand the correct direction to disembark the autonomous vehicle 104 .
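As a hedged sketch of how such guidance might be phrased, the bearing toward the destination 402 can be expressed relative to the vehicle's heading and mapped to a passenger-relative phrase; the function names and the four-sector mapping are illustrative assumptions.

```python
def relative_direction(vehicle_heading_deg, bearing_to_target_deg):
    """Map the bearing toward a target into a passenger-relative phrase,
    independent of which way the passenger's seat happens to face."""
    rel = (bearing_to_target_deg - vehicle_heading_deg) % 360
    if rel < 45 or rel >= 315:
        return "ahead of you"
    if rel < 135:
        return "to your right"
    if rel < 225:
        return "behind you"
    return "to your left"

def disembark_guidance(vehicle_heading_deg, bearing_to_destination_deg):
    """Compose the audible disembarking message."""
    direction = relative_direction(vehicle_heading_deg, bearing_to_destination_deg)
    return f"Please exit on the curb side. The destination is {direction}."
```

With spatial audio, the same relative angle could instead steer which speakers render the message, so the sound itself arrives from the disembarking direction.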
- step 1412 further shows that the autonomous vehicle 104 may instruct or direct the user 102 to perform a procedure prior to exiting the autonomous vehicle.
- the autonomous vehicle 104 may require that the user 102 use the arm distal from the door of the autonomous vehicle 104 to open the door (i.e., performing a Dutch reach).
- Optional step 1414 shows that the autonomous vehicle 104 observes and determines whether the user 102 has conducted the requested procedure.
- the autonomous vehicle 104 may observe through an in-cabin camera of the autonomous vehicle sensor system 520 to determine whether the user has performed a Dutch reach to open the door.
- the autonomous vehicle 104 may again instruct or direct the user 102 to perform the procedure. For example, when a sighted user 102 has not performed a Dutch reach, the autonomous vehicle 104 may remain locked and notify the user 102 again to perform a Dutch reach.
- the autonomous vehicle 104 may then allow the user 102 to disembark from the autonomous vehicle 104 during the safe opportunity. If the safe opportunity is no longer present, the autonomous vehicle 104 may notify the user 102 that the safe opportunity has ended.
- the autonomous vehicle 104 may, in accordance with the profile of the user 102 , output the audible message 406 exterior to the autonomous vehicle 104 , the audible message 406 indicating that the user 102 is disembarking from the autonomous vehicle 104 .
- This audible message 406 may be helpful in alerting nearby pedestrians 404 that the disembarking of the user 102 may cause an obstruction in their path.
- the server 202 may cause the mobile device audio system 612 to output audible guidance from the user's location to the destination 402.
- FIG. 15 is a schematic block diagram of a procedure 1500 for disembarking from the autonomous vehicle 104.
- Procedure 1500 begins with step 1505, and continues to step 1510 where it is determined that information provided to the user 102 is to be provided audibly.
- the autonomous vehicle 104 may cause the autonomous vehicle audio system 516 to audibly output that the autonomous vehicle has found the safe location to stop.
- Procedure 1500 continues to step 1525, where the autonomous vehicle 104 searches for and attempts to detect a safe opportunity for the passenger to disembark or exit the autonomous vehicle 104.
- the safe opportunity may be determined based on a wide variety of different factors.
- the autonomous vehicle audio system 516 audibly notifies and guides the passenger 102 to exit the autonomous vehicle 104 safely during the safe opportunity.
- the autonomous vehicle audio system 516 may spatially guide the passenger 102 through directional audio.
- Computer-readable media includes computer-readable storage media.
- a computer-readable storage media can be any available storage media that can be accessed by a computer.
- such computer-readable storage media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer.
- Disk and disc include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc (BD), where disks usually reproduce data magnetically and discs usually reproduce data optically with lasers. Further, a propagated signal is not included within the scope of computer-readable storage media.
- Computer-readable media also includes communication media including any medium that facilitates transfer of a computer program from one place to another. A connection, for instance, can be a communication medium.
- If the software is transmitted from a website, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, digital subscriber line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of communication medium.
- the functionality described herein can be performed, at least in part, by one or more hardware logic components.
- illustrative types of hardware logic components include Field-programmable Gate Arrays (FPGAs), Application-specific Integrated Circuits (ASICs), Application-specific Standard Products (ASSPs), System-on-a-chip systems (SOCs), Complex Programmable Logic Devices (CPLDs), etc.
- functionality described herein can be performed on different components. For example, and without limitation, determinations by the verification system 710 may be performed onboard the autonomous vehicle 104 and vice versa.
Description
- An autonomous vehicle is a motorized vehicle that can operate without a human driver. Conventionally, since the autonomous vehicle lacks a driver, physical human-machine interfaces (HMIs) may be provided to assist a passenger in finding the autonomous vehicle, requesting support or information during a ride, and disembarking the autonomous vehicle safely. Exemplary HMIs may include a mechanical push-button, a touch-sensitive display, or the like. While such HMIs are well-suited for a large portion of the population, these HMIs may be sub-optimal for those with vision impairments for a variety of reasons. For example, a person with a visual impairment may have difficulty trying to locate the autonomous vehicle and verifying that the particular vehicle is the correct and/or intended vehicle that the person is looking for. As another example, a visually impaired passenger may have difficulty or concerns during the ride as to the route and direction of the vehicle, in which case such HMIs may lack the ability to provide the visually impaired passenger with desired support or information. Furthermore, disembarking from the autonomous vehicle may also prove to be difficult. Without the assistance of a driver to guide the passenger safely out of the autonomous vehicle, such HMIs may be sub-optimal for visually impaired passengers.
- The following is a brief summary of subject matter that is described in greater detail herein. This summary is not intended to be limiting as to scope of the claims.
- Described herein are various technologies pertaining to enabling ridesharing and usage of an autonomous vehicle by a visually impaired passenger.
- More specifically, the various technologies present a notification to the passenger based on a profile of the passenger, wherein the profile of the passenger specifies that the notification is to be provided audibly to the passenger (e.g., the passenger may have a visual impairment). Even more specifically, the various technologies present the notification when a successful authentication between the passenger and an autonomous vehicle occurs. Content of the notification is based on occurrence of the successful authentication. Because the content of the notification is based on the occurrence of the successful authentication, the notification can inform the visually impaired passenger that the autonomous vehicle is the intended autonomous vehicle that the passenger should be looking for. For instance, where the autonomous vehicle comes to a pickup location having multiple autonomous vehicles, the notification can include "the car is parked 100 feet to your right." In another instance, where the successful authentication has occurred, the notification can include "this is the correct vehicle; the doors have been unlocked so that you can get in."
- In one example, an autonomous vehicle is configured to output a first authentication signal, while a user receives a second authentication signal from a server. The user may then verify that the two authentication signals match and notify the autonomous vehicle of the match. The autonomous vehicle may then confirm that there is a successful authentication and output the notification.
- In another example, a server is in communication with the autonomous vehicle and the mobile computing device of the passenger. The autonomous vehicle may output the first authentication signal and the user may output the second authentication signal. The autonomous vehicle and/or the server may then confirm that the authentication signals match and accordingly output the above notification.
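The issuance and matching of authentication codes in the examples above might be sketched as follows. This is a hedged illustration: the class and function names, the code length, and the use of Python's `secrets` and `hmac` modules are assumptions, and the audible transport (text-to-speech output and speech recognition) is abstracted away entirely.

```python
import hmac
import secrets

class AuthenticationServer:
    """Hypothetical sketch of a server issuing a pair of one-time codes:
    one to be output by the vehicle, one by the rider's device."""

    def issue_codes(self):
        vehicle_code = secrets.token_hex(3)  # first authentication code
        rider_code = secrets.token_hex(3)    # second authentication code
        return vehicle_code, rider_code

def codes_match(expected, heard):
    """Constant-time comparison of the expected code against the code
    recognized from speech, to avoid leaking prefixes via timing."""
    return hmac.compare_digest(expected, heard)
```

In use, the vehicle would compare the code it "hears" against the code the server issued to the rider (or vice versa) and, on a match, report the successful authentication back to the server.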
- The above summary presents a simplified summary in order to provide a basic understanding of some aspects of the systems and/or methods discussed herein. This summary is not an extensive overview of the systems and/or methods discussed herein. It is not intended to identify key/critical elements or to delineate the scope of such systems and/or methods. Its sole purpose is to present some concepts in a simplified form as a prelude to the more detailed description that is presented later.
- FIG. 1 illustrates an environment having a user and an autonomous vehicle.
- FIG. 2A illustrates an exemplary method of audible authentication between a user and an autonomous vehicle.
- FIG. 2B illustrates another exemplary method of audible authentication between a user and an autonomous vehicle.
- FIG. 3 illustrates an autonomous vehicle during a journey.
- FIG. 4 illustrates a user and an autonomous vehicle arriving at a destination.
- FIG. 5 illustrates an autonomous vehicle in accordance with this subject disclosure.
- FIG. 6 illustrates a mobile computing device in accordance with this subject disclosure.
- FIG. 7 illustrates a server in accordance with this subject disclosure.
- FIG. 8 is a flow diagram that illustrates various processes that occur when a user requests a ridesharing service.
- FIG. 9A is a flow diagram that illustrates a pickup process in FIG. 8.
- FIG. 9B is a flow diagram that illustrates a second safe boarding process.
- FIG. 10 is a timeline flow diagram that illustrates a successful safe boarding process.
- FIG. 11 is a schematic block diagram that illustrates a pickup process.
- FIG. 12 is a flow diagram that illustrates a journey verification process in FIG. 8.
- FIG. 13 is a schematic block diagram that illustrates an enroute process.
- FIG. 14 is a flow diagram that illustrates a safe disembarking process in FIG. 8.
- FIG. 15 is a schematic block diagram that illustrates a disembarking process.
- Various technologies pertaining to an autonomous vehicle are now described with reference to the drawings, wherein like reference numerals are used to refer to like elements throughout. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of one or more aspects. It may be evident, however, that such aspect(s) may be practiced without these specific details. In other instances, well-known structures and devices are shown in block diagram form in order to facilitate describing one or more aspects. Further, it is to be understood that functionality that is described as being carried out by certain system components may be performed by multiple components. Similarly, a component may be configured to perform functionality that is described as being carried out by multiple components.
- Moreover, the term “or” is intended to mean an inclusive “or” rather than an exclusive “or.” That is, unless specified otherwise, or clear from the context, the phrase “X employs A or B” is intended to mean any of the natural inclusive permutations. That is, the phrase “X employs A or B” is satisfied by any of the following instances: X employs A; X employs B; or X employs both A and B. In addition, the articles “a” and “an” as used in this application and the appended claims should generally be construed to mean “one or more” unless specified otherwise or clear from the context to be directed to a singular form.
- Further, as used herein, the terms “component” and “system” are intended to encompass computer-readable data storage that is configured with computer-executable instructions that cause certain functionality to be performed when executed by a processor. The computer-executable instructions may include a routine, a function, or the like. It is also to be understood that a component or system may be localized on a single device or distributed across several devices. Further, as used herein, the term “exemplary” is intended to mean serving as an illustration or example of something and is not intended to indicate a preference.
- Disclosed are various technologies that are particularly well-suited for use when a passenger of an autonomous vehicle has a vision impairment. More specifically, a computer-readable profile of a passenger can indicate that the passenger prefers to receive information audibly. When an event is detected that may be of interest to the passenger, the autonomous vehicle can cause an audible message to be presented to the passenger. Hence, the passenger (who may be visually impaired) need not attempt to request information by interacting with a physical human-machine interface (HMI).
- Accordingly, sensed events on a trip in an autonomous vehicle can trigger audible support for a passenger who has a visual impairment. While the techniques set forth herein are described for use with passengers having visual impairments, it is to be appreciated that these techniques can be utilized to assist passengers having other types of disabilities and/or even without disabilities; thus, as noted herein, personalization settings in a profile of a passenger can indicate a preference for audible presentation of information when the passenger is in the autonomous vehicle.
- Generally, an autonomous vehicle includes a display screen located in an interior of the autonomous vehicle that can be configured to receive typed-out support requests and provide support information. However, where the passenger is visually-impaired it can be difficult for the passenger to operate the display screen to request support, much less read the support information while the autonomous vehicle is moving. Thus, it may be preferable for the passenger to receive audible support. The disclosed methods and systems can be integrated with an autonomous vehicle to provide contextual audible support messages.
-
FIGS. 1-4 collectively show a high level overview of different processes that may occur when a user uses an autonomous vehicle. - With reference now to
FIG. 1 , a user orpassenger 102 may request use of anautonomous vehicle 104. The user orpassenger 102 may have a vision impairment. Thus, typical use of and navigation to/from theautonomous vehicle 104 may be more difficult for theuser 102. For instance, theuser 102 may try to follow apath 106 to apickup location 108 to board theautonomous vehicle 104. However, due to the vision impairment, theuser 102 may have difficulty navigating to theautonomous vehicle 104. Thus,audible guidance 110 may be provided to the user to follow thepath 106 to thepickup location 108. - As the
user 102 approaches theautonomous vehicle 104, theuser 102 will need to identify and authenticate theautonomous vehicle 104 to determine that the particularautonomous vehicle 104 is the intended autonomous vehicle to board. Due to the vision impairment theuser 102 may have difficulty with identifying the particularautonomous vehicle 104 by traditional authentication methods, such as examining the license plate or make, model and/or color of the vehicle. Thus, the user may audibly authenticate theautonomous vehicle 104. -
FIGS. 2A-2B show two different embodiments for audibly authenticating theautonomous vehicle 104. -
FIG. 2A demonstrates aserver 202 sending a first audible authentication code or signal 204 a over afirst signal 206 a to theautonomous vehicle 104 and a second audible authentication code or signal 204 b over asecond signal 206 b to amobile computing device 200 of theuser 102. Themobile computing device 200 of theuser 102 may output the secondaudible authentication code 204 b so that theuser 102 is audibly aware of and/or may audibly receive the secondaudible authentication signal 204 b. Theautonomous vehicle 104 then outputs the firstaudible authentication code 204 a so that theuser 102 can audibly receive the firstaudible authentication code 204 a. Theuser 102 may then output the secondaudible authentication code 204 b so that theautonomous vehicle 104 can audibly receive the secondaudible authentication code 204 b. Theautonomous vehicle 104 can determine whether theaudible authentication codes autonomous vehicle 104 may also then notify theserver 202 of the successful authentication over athird signal 208 a. Alternatively, theserver 202 may receive and confirm that theaudible authentication codes autonomous vehicle 104 over thethird signal 208 a. Similarly, themobile computing device 200 of theuser 102 may be configured to output the secondaudible authentication code 204 b so that theautonomous vehicle 104 may receive the secondaudible authentication code 204 b, thus relieving theuser 102 from receiving and outputting the secondaudible authentication code 204 b. -
FIG. 2B demonstrates theserver 202 sending the firstaudible authentication code 204 a over thefirst signal 206 a to theautonomous vehicle 104 and the secondaudible authentication code 204 b over thesecond signal 206 b to themobile computing device 200 of theuser 102. Themobile computing device 200 of theuser 102 may output the secondaudible authentication code 204 b so that theuser 102 is audibly aware of and/or may audibly receive the secondaudible authentication signal 204 b. Theautonomous vehicle 104 then outputs the firstaudible authentication code 204 a so that theuser 102 can audibly receive the firstaudible authentication code 204 a. Theuser 102 then verifies the firstaudible authentication code 204 a against the secondaudible authentication code 204 b and notifies theserver 202 and/or theautonomous vehicle 104 over thethird signal 208 b that theaudible authentication codes server 202 may then notify theautonomous vehicle 104, so that theautonomous vehicle 104 would then allow entry thereto. - According to some embodiments, the authentication codes may be communicated in other ways as well (e.g., non-audible means). For example, in some embodiments, the
server 202 may securely send authentication codes to theautonomous vehicle 104 andmobile computing device 200 of theuser 102. Themobile computing device 200 may be in communication with theautonomous vehicle 104 through an interface, such as Bluetooth. Once themobile computing device 200 is authenticated, connected to and/or in communication with theautonomous vehicle 104, then themobile computing device 200 and theautonomous vehicle 104 communicate to ensure that theuser 102 is near theautonomous vehicle 104. When theuser 102 is near theautonomous vehicle 104, themobile computing device 200 may signal to theautonomous vehicle 104 to open the doors. In other words, the authentication codes are communicated among theautonomous vehicle 104, themobile computing device 200, and theserver 202 non-audibly. More specifically, the authentication codes are sent from theserver 202 to theautonomous vehicle 104 and themobile computing device 200. Then, theautonomous vehicle 104 and themobile computing device 200 communicate directly, so that theautonomous vehicle 104 and/or themobile computing device 200 may authenticate the authentication codes non-audibly. Theautonomous vehicle 104 and/or themobile computing device 200 may then notify the other and/or theserver 202, which may then send notification of the successful authentication to one or both of theautonomous vehicle 104 and themobile computing device 200. In some embodiments, theautonomous vehicle 104 and themobile computing device 200 may communicate directly, so that theautonomous vehicle 104 and/or themobile computing device 200 may receive the other authentication code to send both authentication codes back to theserver 202 for verification. Theserver 202 would then notify theautonomous vehicle 104 and/or themobile computing device 200 of the authentication. - In some scenarios, an unintended third party may also enter the
autonomous vehicle 104 while the door is open. This may occur when the unintended third party enters before theuser 102, after theuser 102, and/or enters through a different door than theuser 102. Thus, in some embodiments, theautonomous vehicle 104 may also track and verify a number of people boarding theautonomous vehicle 104 against an intended number ofusers 102. - In some embodiments, the
autonomous vehicle 104 may also verify that the doors to the autonomous vehicle 104 are properly closed. If the doors are not properly closed, the autonomous vehicle 104 may warn and/or notify the user 102. For example, a user may have a long skirt, which gets stuck in the door when the door closes. In some embodiments, the autonomous vehicle 104 may detect this and alert the user 102. - In some embodiments, the
autonomous vehicle 104 may detect that the user 102 is accompanied by young children. Thus, the autonomous vehicle 104 may alert the user 102 to lock the doors near where the young children are sitting, so that the young children do not accidentally open the doors. Moreover, the autonomous vehicle 104 may also detect whether the young children are wearing seatbelts, such that if the autonomous vehicle 104 fails to detect the seatbelts being fastened, the autonomous vehicle 104 may alert the user 102. - Similarly, in some embodiments, the
user 102 may also give commands regarding these and other issues, while in the autonomous vehicle 104 or through their mobile computing device 200, so that the autonomous vehicle 104 may initiate the above. -
FIG. 3 demonstrates that throughout the journey, the autonomous vehicle 104 will have an actual journey route or path 304 that may have a divergence 306 from a predetermined journey route or path 302. Thus, after the user 102 boards the autonomous vehicle 104 and begins a journey to a destination, the user 102 may be curious about the predetermined journey route or path 302 that the autonomous vehicle 104 should follow. - In some cases, the
user 102 may also be curious about the reason for the divergence 306 from the predetermined journey route 302. Similarly, it is important to know whether the divergence 306 from the predetermined journey route 302 is the result of a malicious attack from a third party. Thus, as will be discussed in further detail below, a support message 308 may be output so that the user 102 may be aware of the divergence 306 and the cause of the divergence 306. The support message 308 may be output audibly, haptically, visibly, and/or through a variety of other methods. Similarly, the autonomous vehicle 104 may provide the support message 308 for some and/or all maneuvers that the autonomous vehicle 104 takes. For example, "We are making a left turn at Bush Street; there is a double-parked vehicle in front of us. I will wait for a safe opportunity and pass." Thus, the support message 308 may additionally be used to let the user 102 know that there is not a divergence 306. -
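A divergence check like the one behind the support message 308 can be sketched as follows. This is an illustrative simplification, not the claimed implementation: it uses planar coordinates and a fixed threshold, and all function names are hypothetical.

```python
import math

def distance(p, q):
    # Euclidean distance between two (x, y) points; a deployed system would
    # use geodesic distance on latitude/longitude instead.
    return math.hypot(p[0] - q[0], p[1] - q[1])

def detect_divergence(actual_point, predetermined_route, threshold=50.0):
    # Flag a divergence 306 when the vehicle's current position is farther
    # than `threshold` (e.g., meters) from every sampled point on the
    # predetermined journey route 302.
    nearest = min(distance(actual_point, p) for p in predetermined_route)
    return nearest > threshold

# A straight predetermined route sampled every 100 units.
route = [(0.0, 0.0), (100.0, 0.0), (200.0, 0.0), (300.0, 0.0)]

print(detect_divergence((120.0, 10.0), route))   # near the route: False
print(detect_divergence((150.0, 400.0), route))  # far off the route: True
```

When a divergence is flagged, the support message 308 would be generated and output in whatever modality the user's profile prefers.
-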
FIG. 4 demonstrates the autonomous vehicle 104 arriving at a destination 402. In the absence of the divergence 306, the destination 402 will be the intended destination that the user 102 requested. As the user 102 prepares to disembark, the autonomous vehicle 104 may implement procedures to ensure that the user 102 disembarks safely. For users 102 with vision impairments, disembarking safely may be difficult due to a lesser ability to see potential hazards or dangers outside of the autonomous vehicle 104. For example, a pedestrian 404 may be walking down the street towards the autonomous vehicle 104. Furthermore, the user 102 may be unaware of which side of the autonomous vehicle 104 he/she should disembark on. For example, in an unusual situation, the autonomous vehicle 104 may pull over on the left side of a one-way street. Thus, the autonomous vehicle 104 may provide an audible message 406 from the side that the user 102 should disembark on. In some embodiments, the audible message 406 may provide directions relative to the user 102. For example, the audible message 406 may state "Please exit to your right." In some scenarios, there may be multiple users 102 in the autonomous vehicle 104. Thus, the autonomous vehicle 104 may identify each user 102 and localize the audible message 406, such that the audible message 406 is sent to the specific user 102. In some embodiments, the autonomous vehicle 104 may instead output the audible message 406 having more generic instructions so that each user 102 is notified of which side to exit from. - Similarly, the
autonomous vehicle 104 may output the audible message 406 to an exterior of the autonomous vehicle 104. The audible message 406 may indicate that a person (e.g., a person that has a disability) is disembarking from the autonomous vehicle 104 to notify and protect potential pedestrians 404. Although described as audible, the audible message 406 may be communicated in a variety of other methods, such as haptically, visibly, etc. - Although various embodiments disclosed here relate to helping persons that have disabilities, this and/or other embodiments may be used to assist any
user 102. In other words, many of these improvements are still helpful to people irrespective of disabilities. -
FIG. 5 is a schematic block diagram of the autonomous vehicle 104. The autonomous vehicle 104 may have an autonomous vehicle computing system 502, an autonomous vehicle transceiver 512, an autonomous vehicle location detection system 514, an autonomous vehicle audio system 516, a microphone 518, and an autonomous vehicle sensor system 520. - The autonomous
vehicle computing system 502 may have an autonomous vehicle processor 504 and autonomous vehicle memory 506, where the autonomous vehicle memory 506 stores computer-executable instructions that are executed by the autonomous vehicle processor 504. As an example, the autonomous vehicle processor 504 may be or may include a central processing unit (CPU), a graphics processing unit (GPU), a plurality of GPUs and/or CPUs, an application-specific integrated circuit (ASIC), or the like. - Within the
autonomous vehicle memory 506 may be a support system 508 and control system 510. The support system 508 is configured to receive and/or store preference data for the user 102 in a computer-readable profile of the user. The preference data may include or show: 1) that the user is differently abled (e.g., visual impairment); 2) that information should be delivered in a certain way (e.g., audibly); and 3) the frequency of update information (i.e., how often the autonomous vehicle 104 should tell the user 102 of turns, stoplights, etc.). For example, the user 102 may have a visual impairment, so the user 102 would define in his profile that he would like information to be delivered audibly at every turn. - Furthermore, the
support system 508 may be configured to continually track the movement of the autonomous vehicle 104 and provide updates of the movements of the autonomous vehicle 104 to the user 102. For example, the autonomous vehicle 104 may audibly notify the visually impaired user 102 that the autonomous vehicle 104 is making a turn along the predetermined journey route 302 or is making a turn resulting in the divergence 306 from the predetermined journey route 302. - The
autonomous vehicle memory 506 also stores computer-executable instructions that run the control system 510. The control system 510 is configured to control a propulsion system 522, a braking system 524, and a steering system 526 of the autonomous vehicle 104. Similarly, the control system 510 may also control other systems in the autonomous vehicle, such as locks on doors of the autonomous vehicle 104. - The
autonomous vehicle 104 further has an autonomous vehicle transceiver 512 that is configured to communicate with other systems. For example, the autonomous vehicle transceiver 512 may send signals to and/or receive signals from a mobile device of the user and/or the server 202. - Onboard the
autonomous vehicle 104 may also be an autonomous vehicle location detection system or position sensor system 514. This may be any sensor that provides detection of the location of the autonomous vehicle 104, such as a global positioning system (GPS) or a combination of other systems. - The
autonomous vehicle 104 may further have an audio system 516 that may output information and other audible signals, such as music. The audio system 516 may also be a spatial audio system that provides audible signals from a variety of different directions, such that the user 102 can understand directional cues from the audio system 516. For example, the audio system 516 may audibly provide directions from a first side of the autonomous vehicle 104 to exit the autonomous vehicle 104 from the first side. - Similarly, the
autonomous vehicle 104 may have a microphone 518 and/or autonomous vehicle sensor system 520 to receive audible and other types of signals. For example, the microphone 518 may receive the aforementioned second audible authentication code 204b from the user. The autonomous vehicle sensor system 520 may be composed of a variety of different observational sensors, such as light detection and ranging (LIDAR) sensors, cameras, suspension monitors, etc. These various sensors may provide a wide variety of signals that indicate different types of movement of the autonomous vehicle 104 as well as information about the environment around the autonomous vehicle 104. For example, the suspension monitors may detect a turn based on different loads on each wheel. As another example, cameras and LIDAR are able to determine whether obstacles may be near a door of the autonomous vehicle 104 or whether other hazards are near or in the surroundings of the autonomous vehicle 104. -
FIG. 6 illustrates a mobile computing device 200 that the user 102 may have. The mobile computing device 200 may take form in a wide variety of different embodiments, including, but not limited to, smart phones, smart watches, tablets, hearing aids, etc. The mobile computing device 200 may have a mobile device processor 602, mobile device memory 604, a mobile device transceiver 608, a mobile device location detection system 610, a mobile device audio system 612, and a mobile device microphone 614. - The
mobile device processor 602 may be or may include a central processing unit (CPU), a graphics processing unit (GPU), a plurality of GPUs and/or CPUs, an application-specific integrated circuit (ASIC), or the like. - The mobile device memory 604 stores computer-executable instructions that are executed by the
mobile device processor 602. A support application 606 may be stored within the mobile device memory 604. The support application 606 may be configured to interface with the user 102 to receive and/or store the preference data for the user 102 in a computer-readable profile of the user. The preference data may include or show: 1) that the user is differently abled (e.g., visual impairment); 2) that information should be delivered in a certain way (e.g., audibly); and 3) the type of information to be given (i.e., guidance to and from the autonomous vehicle 104). For example, the user 102 may have a visual impairment, so the user 102 would define in his profile that he would like guidance to and from the autonomous vehicle 104 to be delivered audibly. - The
mobile computing device 200 may have a mobile device transceiver 608 configured to communicate with other systems. For example, the mobile device transceiver 608 may send signals to and/or receive signals from the autonomous vehicle 104 and/or the server 202. - The
mobile computing device 200 may also have a mobile device location detection system 610. This may be any sensor that provides a location signal of the mobile computing device 200, such as a global positioning system (GPS) or a combination of other systems. - The mobile
device audio system 612 of the mobile computing device 200 is adapted to output audible signals, such as audible notifications, music, audible guidance, etc. The mobile device audio system 612 may be a speaker, speakers, a headset, and/or other similar components or devices that provide audible signals. The mobile device audio system 612 may be a component within the mobile computing device 200 or may also be an attachment or peripheral that connects to the mobile computing device 200. - The
mobile computing device 200 may also have a mobile device microphone 614. The mobile device microphone 614 is adapted to receive audible signals, such as the first audible authentication code 204a from the autonomous vehicle. -
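The preference data that the support system 508 and support application 606 store might be modeled as a small record. The following is a minimal sketch with hypothetical field names; the disclosure does not specify an actual profile schema.

```python
from dataclasses import dataclass

@dataclass
class UserProfile:
    # Computer-readable profile holding the preference data described above.
    user_id: str
    differently_abled: bool    # e.g., a visual impairment
    delivery_mode: str         # how information should be delivered, e.g., "audible"
    update_frequency: str      # how often to announce turns, stoplights, etc.
    guidance_to_vehicle: bool  # whether to guide the user to/from the vehicle

profile = UserProfile(
    user_id="user-102",
    differently_abled=True,
    delivery_mode="audible",
    update_frequency="every_turn",
    guidance_to_vehicle=True,
)

# The support system would consult the profile before choosing an output mode.
print(profile.delivery_mode)  # audible
```
-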
FIG. 7 illustrates the server 202 that may communicate with the autonomous vehicle 104 and the mobile computing device 200. The server 202 may have a server processor 702, server memory 704, a server communication system 712, and a data store 714. - The
server processor 702 may be or may include a central processing unit (CPU), a graphics processing unit (GPU), a plurality of GPUs and/or CPUs, an application-specific integrated circuit (ASIC), or the like. - The
server memory 704 stores computer-executable instructions that are executed by the server processor 702. Within the server memory 704 may be a pairing system 706, a route planning system 708, and a verification system 710. - The
pairing system 706 is configured to connect the user 102 with the autonomous vehicle 104. The connection may include both the designation or "pairing" of the user 102 with the specific autonomous vehicle 104 and/or the authentication process detailed further below between the user 102 and the specific autonomous vehicle 104. - The route planning system 708 is configured to determine the
predetermined journey route 302 based on a location of the user 102 at the time of the user's request for ridesharing service and a location of the destination 402 requested. The route planning system 708 may take into account a wide variety of factors to determine the predetermined journey route 302, such as traffic, weather conditions, etc. The route planning system 708 may also be configured to determine the path 106 to assist the user 102 in finding the autonomous vehicle 104, as shown in FIG. 1. Similarly, the route planning system 708 may also be configured to determine a final path from a drop-off location to the destination 402, as shown in FIG. 4. - The
verification system 710 is configured to track a location of the user 102 through the mobile device location detection system 610 of the user's mobile computing device 200 and a location of the autonomous vehicle 104 through the autonomous vehicle location detection system 514. As the location of the user 102 changes, the verification system 710 tracks the changes in real-time to form and update the actual journey route 304. In addition, the verification system 710 is configured to compare the location of the user 102 and the actual journey route 304 against the predetermined journey route 302 for potential divergences 306. - The server communication system 712 is configured to communicate with the
autonomous vehicle 104 and/or the mobile computing device 200. The server communication system 712 may also be configured to communicate with additional autonomous vehicles 104a and other mobile computing devices 200. - The
server 202 may also have a data store 714, in which data for various systems may be stored. For example, the data store 714 may store profiles of users 102 and preference data for each user 102. Furthermore, data from autonomous vehicle sensor systems 520 may be sent to the data store 714 to be stored. -
FIG. 8 is a flowchart diagram that provides an overview of a process the user 102 may encounter when requesting and using a ridesharing service. Subprocesses within the process of requesting and using a ridesharing service may consist of a pick-up process 806, an en route process 808, and a disembarking process 810. The process ends at step 812 if the user 102 does not need the ridesharing service or does not indicate the need and/or preferences for audible support. - As mentioned above,
FIG. 8 is a flowchart diagram that provides an overview of a process the user 102 may encounter when requesting and using a ridesharing service. At step 802, in various embodiments, the user 102 requests the autonomous vehicle 104. At some earlier point in time, the user 102 may have set up a user profile, having preference and/or disability data therein. Thus, step 804 shows that upon receiving the request, the server 202 may access the user profile and determine whether the user is differently abled, requires special actions, and/or prefers special actions, such as providing information audibly. -
FIGS. 9A and 9B each illustrate an exemplary pick-up process 806. -
FIG. 9A illustrates what occurs after the user 102 requests use of the autonomous vehicle 104. As shown in step 902a, the server 202 and/or the mobile computing device 200 notifies the autonomous vehicle 104 of user profile information, which may include user preferences, special actions, and/or the user's 102 disabilities. Next, step 904a shows the server 202 sending the first audible authentication code 204a to the autonomous vehicle 104 over the first signal 206a and the second audible authentication code 204b to the user 102 over the second signal 206b. Although shown as occurring at different points in time, steps 902a and 904a may occur in any combination of points in time. For example, the two steps 902a, 904a may occur simultaneously, or step 904a may occur before step 902a. Similarly, the server 202 may send the audible authentication codes 204a, 204b at different points in time (e.g., the first audible authentication code 204a may be sent to the user 102 after the second audible authentication code 204b is sent to the autonomous vehicle 104). - In
step 906a, when the autonomous vehicle 104 arrives at the pick-up location 108, the autonomous vehicle 104 notifies the server 202 and/or the mobile computing device 200 of the user 102 of the arrival of the autonomous vehicle 104. In some instances, the autonomous vehicle 104 may be unable to arrive at the exact location of the user 102, which would require the user 102 to travel to the location of the autonomous vehicle 104 for pick-up and/or boarding. - For
users 102 with visual impairments, finding the location of the autonomous vehicle 104 may be difficult without assistance. Thus, in step 908a, if the user 102 has noted in a user profile that the user 102 is visually impaired and/or would prefer to receive audible guidance to the pick-up location 108, the server 202 and/or the mobile computing device 200 would provide audible guidance 110 for the path 106 from the user's current position to the pick-up location 108, where the autonomous vehicle 104 is situated. -
FIGS. 2A and 9A show in step 910a that, when the mobile device location detection system 610 outputs a position signal similar to and/or within a threshold distance from the pick-up location 108, the autonomous vehicle 104 audibly outputs the first audible authentication signal 204a. At step 912a, the user 102 then hears the first audible authentication signal 204a and verifies the first audible authentication signal 204a against the second audible authentication signal 204b that the server 202 sent to the user's mobile computing device 200 in step 904a. In some embodiments, the mobile computing device 200 may output the second audible signal 204b again so that the user 102 may correctly verify the audible authentication codes 204a, 204b. The user 102 may then notify the server 202 whether the audible authentication codes 204a, 204b match. For example, the user 102 may select an option on the mobile computing device 200 indicating whether the audible authentication codes 204a, 204b match. At step 914a, the user 102 verifies that the audible authentication codes 204a, 204b match. - Step 916a shows that if the
audible authentication codes 204a, 204b do not match, then the autonomous vehicle 104a is not the autonomous vehicle 104 that the user 102 is searching for. Thus, the user 102 does not board the autonomous vehicle 104a and is prompted to search nearby for the intended autonomous vehicle 104. More specifically, the server 202 receives notice that the audible authentication codes 204a, 204b do not match, and the server 202 sends instructions to the autonomous vehicle 104 and/or the mobile computing device 200 to notify the user that the autonomous vehicle 104 is not the intended autonomous vehicle 104 and to search nearby for the intended autonomous vehicle 104. In some embodiments, the mobile computing device 200 may communicate directly with the autonomous vehicle 104 that the autonomous vehicle 104 is not the intended autonomous vehicle 104. In some embodiments, to notify the user 102 that the autonomous vehicle 104 is not the intended autonomous vehicle 104, the autonomous vehicle 104 may issue a signal, such as a honk, to indicate that the autonomous vehicle 104 is incorrect. In some embodiments, the intended autonomous vehicle 104 may issue a signal, such as a honk, to provide the user 102 notice of where the autonomous vehicle 104 is located. Again, audible guidance 110 may be provided to the user 102. In some embodiments, the mobile computing device 200 of the user 102 could notify the user 102 in some way that is consistent with the user's understanding that there is a mismatch with the autonomous vehicle 104. The mismatch may also be verified by the server 202 and sent to the mobile computing device 200. For example, the mobile computing device 200 may also send, in combination with whether the audible authentication codes 204a, 204b match, the audible authentication codes 204a, 204b themselves to the server 202. The server 202 may check and/or verify whether the audible authentication codes 204a, 204b match. In some embodiments, the mobile computing device 200 may be a smart braille device, such that the smart braille device can communicate to the user 102 the match or mismatch with the autonomous vehicle 104. - Matching of the
audible authentication codes autonomous vehicle 104, 1234, 1493, ABC, ASDF, etc.). For example, in some embodiments, theserver 202 may send theuser 102 the license plate number as one of the authentication codes, such that if the user gets the same license plate number as one of the authentication codes from theautonomous vehicle 104, then theautonomous vehicle 104 and theuser 102 are matched. As another example, a melody of a song may also be used. Similarly, a successful authentication need not be exact copies of theaudible authentication codes audible authentication code 204 a may be 1234, while the secondaudible authentication code 204 b may be 5678). As another example, a first part of a melody of a song may be used as the firstaudible authentication code 204 a and a second part of the melody of the song may be used as the secondaudible authentication code 204 b. In other words, a “matching” may be a relational connection between the firstaudible authentication code 204 a and the secondaudible authentication code 204 b instead of a mirror image or identical copy of the other. Thus, matching of theaudible authentication codes - At
step 918a, when the user 102 verifies that the audible authentication codes 204a, 204b match, the user 102 is allowed to board the autonomous vehicle 104. More specifically, in response to receiving notice that the audible authentication codes 204a, 204b match, the server 202 sends instructions to the autonomous vehicle 104 and/or the mobile computing device 200 to notify the user 102 that the autonomous vehicle 104 is the intended autonomous vehicle. In some embodiments, the mobile computing device 200 of the user 102 may communicate securely with the autonomous vehicle 104 either directly and/or through the server 202. Furthermore, when the user 102 is allowed to board the autonomous vehicle 104, the autonomous vehicle 104 may notify the user 102. In some embodiments, the autonomous vehicle 104 may unlock the doors, play a welcome message, welcome the customer with a personalized message verifying the name of the customer and/or the destination 402, etc. In some embodiments, the autonomous vehicle 104 may recognize that the user 102 has come with additional passengers and welcomes the additional passengers as well. Step 920a shows the user 102 boarding the autonomous vehicle 104. -
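The "relational" matching described above — where the second code is a registered counterpart of the first rather than an identical copy — can be sketched as follows. This is a minimal illustration with hypothetical names, not the claimed implementation; the example pairs (1234/5678, two halves of one melody) come from the discussion above.

```python
# The server would store which second code corresponds to each first code,
# so a successful match is a relational connection, not string equality.
PAIRINGS = {
    "1234": "5678",                    # paired numeric codes
    "melody-part-1": "melody-part-2",  # two halves of one song's melody
}

def codes_match(first_code: str, second_code: str) -> bool:
    # True when the second code is the registered counterpart of the first.
    return PAIRINGS.get(first_code) == second_code

print(codes_match("1234", "5678"))  # True: relational match
print(codes_match("1234", "1234"))  # False: identical copies are not paired
```

An identical-copy scheme is simply the special case where each code is registered as its own counterpart.
-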
FIG. 9B also illustrates what may occur after the user 102 requests use of the autonomous vehicle 104. More specifically, FIG. 9B shows a similar pick-up process with a different method of authentication. As shown in step 902b, the server 202 and/or the mobile computing device 200 notifies the autonomous vehicle 104 of user profile information, which may include user preferences, special actions, and/or the user's 102 disabilities. Next, step 904b shows the server 202 sending the first audible authentication code 204a to the autonomous vehicle 104 over the first signal 206a and the second audible authentication code 204b to the user 102 over the second signal 206b. Although shown as occurring at different points in time, steps 902b and 904b may occur in any combination of points in time. For example, the two steps 902b, 904b may occur simultaneously, or step 904b may occur before step 902b. Similarly, the server 202 may send the audible authentication codes 204a, 204b at different points in time (e.g., the first audible authentication code 204a may be sent to the user 102 after the second audible authentication code 204b is sent to the autonomous vehicle 104). - In
step 906b, when the autonomous vehicle 104 arrives at the pick-up location 108, the autonomous vehicle 104 notifies the server 202 and/or the mobile computing device 200 of the user 102 of the arrival of the autonomous vehicle 104. In some instances, the autonomous vehicle 104 may be unable to arrive at the exact location of the user 102, which would require the user 102 to travel to the location of the autonomous vehicle 104 for pick-up and/or boarding. - For
users 102 with visual impairments, finding the location of the autonomous vehicle 104 may be difficult without assistance. Thus, in step 908b, if the user 102 has noted in a user profile that the user 102 is visually impaired and/or would prefer to receive audible guidance to the pick-up location 108, the server 202 and/or the mobile computing device 200 would provide audible guidance 110 for the path 106 from the user's current position to the pick-up location 108, where the autonomous vehicle 104 is situated. -
FIGS. 2B and 9B show in step 910b that, when the mobile device location detection system 610 outputs a position signal similar to and/or within a threshold distance from the pick-up location 108, the autonomous vehicle 104 audibly outputs the first audible authentication signal 204a. - At
step 912b, after the autonomous vehicle 104 outputs the first audible authentication code 204a, the user 102 is prompted to respond with the second audible authentication code 204b. - Next, at
step 914b, the microphone 518 may then receive the audibly output second audible authentication code 204b and either process the matching authentication onboard the autonomous vehicle 104 or send the audible authentication signals 204a, 204b from the autonomous vehicle transceiver 512 to the server communication system 712, where the server 202 uses the pairing system 706 to process and identify whether the audible authentication signals 204a, 204b match. If the latter, the server communication system 712 then notifies the autonomous vehicle 104 whether the audible authentication signals 204a, 204b match. - Step 916b shows that if the
audible authentication codes 204a, 204b do not match, then the autonomous vehicle 104a is not the autonomous vehicle 104 that the user 102 is searching for. Thus, the user 102 does not board the autonomous vehicle 104a and is prompted to search nearby for the intended autonomous vehicle 104. For example, the autonomous vehicle 104 may issue a signal, such as a honk, to indicate that the autonomous vehicle 104 is incorrect. In some embodiments, the intended autonomous vehicle 104 may issue a signal, such as a honk, to provide the user 102 some notice of where the autonomous vehicle 104 is located. Again, audible guidance 110 may be provided to the user 102. In some embodiments, the mobile computing device 200 of the user 102 could notify the user 102 in some way that is consistent with the user's understanding that there is a mismatch with the autonomous vehicle 104. The mismatch may also be verified by the server 202 and sent to the mobile computing device 200. In some embodiments, the mobile computing device 200 may be a smart braille device, such that the smart braille device can communicate to the user 102 the match or mismatch with the autonomous vehicle 104. - Step 918b shows that if the audible authentication signals 204a, 204b match, then the
user 102 is allowed to board the autonomous vehicle 104. In some embodiments, the mobile computing device 200 of the user 102 may communicate securely with the autonomous vehicle 104 either directly and/or through the server 202. Furthermore, when the user 102 is allowed to board the autonomous vehicle 104, the autonomous vehicle 104 may notify the user 102. In some embodiments, the autonomous vehicle 104 may unlock the doors, play a welcome message, welcome the customer with a personalized message verifying the name of the customer and/or the destination 402, etc. In some embodiments, the autonomous vehicle 104 may recognize that the user 102 has come with additional passengers and welcomes the additional passengers as well. Step 920b shows the user 102 boarding the autonomous vehicle 104. - It is further contemplated that the
audible authentication codes 204a, 204b need not both be audible. For example, the autonomous vehicle 104 may output the first audible authentication code 204a so that the user 102 may authenticate the first audible authentication code 204a against a non-audible authentication code (e.g., a haptic-based code that matches a pattern of the first audible authentication code, or a visible authentication code for non-visually-impaired users, etc.). The mobile computing device 200 would then send the signal to the server 202 informing the server 202 of the successful authentication. The server 202 may then send instructions to the autonomous vehicle 104 to unlock. - Many other variations of authentication are further within the scope of this disclosure. For example, the
autonomous vehicle 104 may output the first audible authentication signal 204a, while the mobile computing device 200 is configured to receive, and receives, the first audible authentication signal 204a. The mobile computing device 200 may then encode the first audible authentication signal 204a and transmit the first audible authentication signal 204a to the server 202, where the server 202 can authenticate and notify both the autonomous vehicle 104 and the mobile computing device 200 if there is a successful authentication. Similarly, the mobile computing device 200 may instead output the first audible authentication signal 204a, while the autonomous vehicle 104 is configured to receive, and receives, the first audible authentication signal 204a. The autonomous vehicle 104 may then encode and transmit the first audible authentication signal 204a to the server 202, where the server 202 can authenticate and notify both the autonomous vehicle 104 and the mobile computing device 200 if there is a successful authentication. In these instances, only the first audible authentication code 204a is necessary for a successful authentication. -
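The single-code variation just described — one party outputs the code, the other receives, encodes, and transmits it to the server for verification — can be sketched as follows. This is an illustrative sketch with hypothetical names, not the claimed implementation; a real system would carry these messages over an authenticated, encrypted channel.

```python
import secrets

class Server:
    # Hypothetical sketch of the server 202: it issues the first code and
    # later verifies the copy that was heard, encoded, and transmitted back.
    def __init__(self):
        self.issued = {}  # ride_id -> first authentication code

    def issue_code(self, ride_id):
        code = secrets.token_hex(4)
        self.issued[ride_id] = code
        return code  # sent to the party that will output it audibly

    def authenticate(self, ride_id, received_code):
        # True when the transmitted code matches the issued one; the server
        # would then notify both the vehicle and the mobile device.
        return self.issued.get(ride_id) == received_code

server = Server()
code = server.issue_code("ride-42")  # e.g., output audibly by the vehicle

# The mobile device "hears" the code, encodes it, and transmits it back.
heard_and_encoded = code

print(server.authenticate("ride-42", heard_and_encoded))  # True
print(server.authenticate("ride-42", "00000000"))         # False: wrong code
```

The same sketch covers the mirrored case where the mobile device outputs the code and the vehicle receives and transmits it.
-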
FIG. 10 provides an exemplary timeline of different processes, while also providing insight into which component may handle each process. However, the timeline and categorization of processes by each component are merely provided as an example, and it is to be understood that the order and handling of each process by different components may be varied in a wide variety of different ways. - As discussed above, the
user 102 may request a ride in step 1002. More specifically, the user 102 may use the support application 606 on his or her mobile computing device 200 to request a ride on the autonomous vehicle 104. The request is sent through the mobile device transceiver 608 to the server communication system 712. - In
step 1004, the server 202 then accesses the user profile in the data store 714 to determine user profile information, such as if the user 102 is differently abled and/or if the user 102 has requested that information be provided audibly. The request is then handled by the pairing system 706 to pair the user 102 with the autonomous vehicle 104. The pairing system 706 then creates and sends authentication signals to both the autonomous vehicle transceiver 512 and the mobile device transceiver 608 of the mobile computing device 200 of the user 102. If the user 102 is visually impaired or has noted in the profile that information should be provided audibly, then the authentication signals will be the audible authentication signals 204a, 204b. - In
step 1006a, the autonomous vehicle 104 receives the first audible authentication signal 204a. In step 1006b, the user 102 and/or the mobile computing device 200 of the user 102 receives the second audible authentication signal 204b to verify when authenticating the autonomous vehicle 104. -
Step 1008 shows the autonomous vehicle 104 arriving at the pickup location 108. -
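Both the arrival check of step 1010 and the rider-proximity check of steps 1012 and 1014 reduce to comparing two position signals against a threshold distance. A minimal sketch, assuming WGS-84 latitude/longitude position signals and an arbitrary 25-meter threshold (the disclosure does not fix a value):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6371000.0  # mean Earth radius, meters
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def within_threshold(pos_a, pos_b, threshold_m=25.0):
    """Server-side test: are two position signals close enough to count
    as 'arrived at the pickup location' or 'user near the vehicle'?"""
    return haversine_m(*pos_a, *pos_b) <= threshold_m

vehicle = (37.7749, -122.4194)
pickup = (37.7750, -122.4194)              # roughly 11 m north of the vehicle
print(within_threshold(vehicle, pickup))   # True: treat as arrived
```

The same comparison serves the server whether the signals come from the autonomous vehicle location detection system, the mobile device location detection system, or a fixed pickup coordinate.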
Step 1010 shows that, in response to the autonomous vehicle 104 arriving at the pickup location 108, the server 202 is notified of the arrival of the autonomous vehicle 104 by either a notification sent from the autonomous vehicle 104 to the server communication system 712 and/or a matching of the location signal from the autonomous vehicle location detection system 514 with the pickup location 108. The server 202 then notifies the user 102 of the arrival of the autonomous vehicle 104 and, in accordance with the user's profile, causes the mobile device audio system 612 to output audible guidance 110 to the pickup location 108, where the autonomous vehicle 104 is also located. Alternatively or additionally, the autonomous vehicle 104 may directly communicate with the mobile computing device 200. - As stated above, the location of the user 102 may be determined through the mobile device location detection system 610. The mobile device location detection system 610 outputs a position signal that identifies where the
user 102 is located. The mobile computing device 200 may be configured to send, via the mobile device transceiver 608, the position signal output from the mobile device location detection system 610 to the server 202. - Thus, steps 1012 and 1014 show that as the
user 102 arrives at the pickup location 108, where the autonomous vehicle 104 is also located, the server 202 will detect that the position signal of the mobile device location detection system 610 is similar to and/or within a threshold distance from the pickup location 108 and the position signal of the autonomous vehicle location detection system 514. The server 202 may then notify the autonomous vehicle 104 that the user 102 is near the autonomous vehicle 104. - Alternatively, the
mobile computing device 200 may communicate directly with the autonomous vehicle 104. For example, a near-field communication (NFC) system or other wireless technology, such as Bluetooth, may be used to provide direct communication between the mobile computing device 200 and the autonomous vehicle 104. - In
step 1016, when the autonomous vehicle 104 is aware that the user 102 is nearby, either by a notification from the server 202, by direct communication with the mobile computing device 200 of the user 102, or by any other method of communication, the autonomous vehicle computing system 502 will cause the autonomous vehicle audio system 516 to output the first audible authentication signal 204a. - In
step 1018, the user 102 verifies the first audible authentication signal 204a against the earlier received second audible authentication signal 204b. In step 1020, the user 102 may then use the mobile computing device 200 to notify the server 202 and/or the autonomous vehicle 104 directly through the support application 606 and/or by sending a notification or signal through the mobile device transceiver 608. Alternatively, the user 102 and/or the mobile computing device 200 of the user 102 may instead output the second audible authentication signal 204b so that the autonomous vehicle 104 receives the second audible authentication signal 204b. The autonomous vehicle 104 may then either verify onboard the first audible authentication signal 204a against the second audible authentication signal 204b, or send the first and second audible authentication signals 204a, 204b to the server 202, where the pairing system 706 determines whether the signals 204a, 204b match and notifies the autonomous vehicle 104 of a successful authentication. -
Step 1022 illustrates that, in response to the successful authentication of the first and second audible authentication signals 204a, 204b, the autonomous vehicle control system 510 allows entry to the autonomous vehicle 104 and the user 102 accordingly boards the autonomous vehicle 104. It is further contemplated that the autonomous vehicle 104 may previously have had its doors locked to prevent entry thereto; thus, when the autonomous vehicle 104 allows entry thereto, the doors may be unlocked. -
FIG. 10 is simply one example of a wide variety of different methods that fall within the scope of this disclosure. One of ordinary skill in the art would understand that modifications to ordering, timing, components, and/or other processes would fall within the scope of this disclosure. -
FIG. 11 illustrates a schematic block diagram of a procedure 1100 for boarding the autonomous vehicle 104. Procedure 1100 begins with step 1105, and continues to step 1110, where the autonomous vehicle 104 determines that information is to be provided audibly to the user 102 when information is provided to the user 102. -
Procedure 1100 continues to step 1115, where the user 102 and the autonomous vehicle 104 create a successful authentication using the first audible authentication code 204a and the second audible authentication code 204b. The successful authentication may be created in a wide variety of methods, as discussed above. - At
step 1120, responsive to the successful authentication, and in accordance with the profile of the user 102, the user 102 is audibly notified of the successful authentication. It is to be clear that any of the above-identified audio systems or communication systems may be used to audibly notify the user 102, including but not limited to the autonomous vehicle audio system 516, the mobile device audio system 612, and/or the server communication system 712. - The
user 102 is then allowed entry into the autonomous vehicle at step 1125. The process subsequently ends at step 1130. - After the
user 102 has boarded the autonomous vehicle 104, the autonomous vehicle 104 and the user 102 may begin the journey. -
FIG. 12 illustrates the en route process 808. - As shown in
step 1202, to ensure that the user 102 arrives at the intended destination 402, the server 202 may use the route planning system 708 to create the predetermined journey route 302. The server 202 communicates the predetermined journey route 302 to the autonomous vehicle 104. Alternatively or additionally, the autonomous vehicle 104 may have a locally stored route planning system to create the predetermined journey route 302, and the locally generated predetermined journey route 302 may be transmitted to the server 202 (e.g., the verification system 710). - In
step 1204, the autonomous vehicle 104 begins moving along the predetermined journey route 302. During the journey, the autonomous vehicle location detection system 514 outputs the position signal of the autonomous vehicle 104. The server communication system 712 may receive the position signal output, store the position signal output in the data store 714, and use the position signal output in the verification system 710. Alternatively or additionally, the mobile device location detection system 610 may also output the position signal of the mobile device 200, which allows the server communication system 712 to similarly receive the position signal output, store the position signal output in the data store 714, and use the position signal output in the verification system 710. -
Steps 1206 and 1208 show the verification system 710 detecting divergences 306 from the predetermined journey route 302. More specifically, the verification system 710 compares the position signal output of the autonomous vehicle 104 and/or the mobile computing device 200 against the predetermined journey route 302. The position signal output of the autonomous vehicle 104 creates the actual journey route 304, which may or may not be substantially similar to the predetermined journey route 302. -
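One way to implement that comparison is to measure how far the latest position sample lies from the nearest leg of the predetermined journey route 302 and flag a divergence 306 when the distance exceeds a threshold. A hedged sketch in planar coordinates (real position signals would be geographic, and the 20-unit threshold is arbitrary):

```python
import math

def point_seg_dist(p, a, b):
    """Distance from point p to segment a-b (planar approximation,
    adequate for short, city-scale route legs)."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def diverged(position, route, threshold):
    """Flag a divergence when the actual position is farther than
    `threshold` from every leg of the predetermined route."""
    return min(point_seg_dist(position, a, b)
               for a, b in zip(route, route[1:])) > threshold

route = [(0, 0), (0, 100), (100, 100)]    # planned journey, arbitrary units
print(diverged((5, 50), route, 20))       # on-route sample: False
print(diverged((60, 30), route, 20))      # off-route sample: True
```

Applied to a stream of samples, this yields the actual journey route 304 annotated with where, if anywhere, it departs from the plan.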
Step 1210 shows that if the actual journey route 304 is substantially similar to the predetermined journey route 302, then the autonomous vehicle 104 will arrive at the destination 402 as planned. - On the other hand, if the actual journey route 304 is not substantially similar to and/or exceeds a threshold distance away from the
predetermined journey route 302, then there is the divergence 306. -
Step 1212 then shows that the user 102 is notified of the divergence 306. More specifically, the verification system 710 of the server 202 may communicate with the server communication system 712 to send a signal to the mobile device transceiver 608. The mobile computing device 200 processes the signal from the mobile device transceiver 608 and may notify the user 102 through the support application 606 or any other suitable method. Alternatively or additionally, the autonomous vehicle 104 may announce the divergence through the autonomous vehicle audio system 516. In either instance, the audio systems 516, 612 output the support message 308, which explains that the divergence 306 has occurred. -
Step 1214 shows that when the divergence 306 is detected, the server 202 may further determine and/or authenticate the cause of the divergence 306. This determination and/or authentication may be achieved through a wide variety of different methods according to this subject disclosure. - One such method is comparing the position signal outputs of the
autonomous vehicle 104 and the position signal outputs of the mobile computing device 200 of the user 102. When the position signal outputs differ significantly, there is a possibility that the autonomous vehicle 104 and/or the mobile computing device 200 has been maliciously attacked. In other words, the autonomous vehicle 104 and/or the mobile computing device 200 may be compromised to cause the location detection systems 514, 610 to output inaccurate position signals (i.e., spoofing). - Another method is comparing the actual journey route 304 against journey routes or paths of other nearby
autonomous vehicles 104a. If the actual journey route 304 differs significantly from other journey routes of other autonomous vehicles 104a, there is a possibility that the autonomous vehicle 104 has been maliciously attacked. In other words, when the autonomous vehicle 104 makes a divergence 306 from the predetermined journey route 302 that other autonomous vehicles 104a do not also make or follow, the autonomous vehicle 104 may be compromised such that the propulsion system 522, braking system 524, and/or steering system 526 no longer follow routing instructions from the server 202. Thus, a third party may have control of the autonomous vehicle 104, resulting in actions not requested and/or implemented by the user 102 and/or the server 202. In some scenarios, the server 202 and/or the autonomous vehicle 104 may communicate with other autonomous vehicles 104a and/or other sensors nearby to determine whether unusual changes have occurred. These sensors may include, but are not limited to, other trusted autonomous vehicles 104a, cameras as part of infrastructure, other government and non-government infrastructure, etc. - While both methods may provide information regarding the possibility of a malicious attack, there may be situations in which a malicious attack is determined when in reality there is no malicious attack (i.e., a false positive). To remedy this, a combination of the above-identified methods and/or any other different method may be useful to authenticate the presence of a malicious attack.
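The two heuristics above, a large gap between the vehicle's and rider's position signals (possible spoofing) and a divergence that no nearby trusted vehicle shares (possible hijacked controls), can be combined to reduce false positives. A sketch only; the `attack_suspected` interface and every threshold are assumptions:

```python
def attack_suspected(vehicle_phone_gap_m: float,
                     peers_sharing_divergence: int,
                     peers_total: int,
                     gap_threshold_m: float = 50.0,
                     peer_ratio: float = 0.5) -> bool:
    """Flag a likely malicious attack only when BOTH heuristics agree:
    the vehicle and the rider's phone report positions far apart, AND
    fewer than `peer_ratio` of nearby trusted vehicles made the same
    divergence (a widely shared divergence suggests traffic or a road
    closure rather than an attack)."""
    spoofing = vehicle_phone_gap_m > gap_threshold_m
    lone_divergence = (peers_total > 0
                       and peers_sharing_divergence / peers_total < peer_ratio)
    return spoofing and lone_divergence

print(attack_suspected(120.0, 0, 6))   # both heuristics fire: True
print(attack_suspected(120.0, 5, 6))   # peers diverged too: False
print(attack_suspected(10.0, 0, 6))    # position signals agree: False
```

Requiring agreement between independent signals is exactly the remedy the paragraph above describes for false-positive attack determinations.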
- While the above methods are specifically identified, other methods are also contemplated and within this subject disclosure. For example, in areas having an infrastructure designed with transceivers or other wireless capabilities (i.e., a “smart city”), the
server 202 and/or the autonomous vehicle 104 may communicate with the city to determine whether other autonomous vehicles 104a are making similar pathing decisions or whether there are new traffic developments, such as construction, flooding, etc. -
Step 1216 shows that if the verification system 710 determines that the divergence 306 is not caused by a malicious attack, the verification system 710 communicates with the route planning system 708 to create a new journey route, which becomes the predetermined journey route 302. In other words, if the verification system 710 determines that the autonomous vehicle 104 diverges from the predetermined journey path 302 due to an innocent cause (e.g., it missed a turn, could not switch lanes, etc.), then the route planning system 708 reroutes the autonomous vehicle 104 to the destination 402, despite the divergence 306. In some embodiments, the autonomous vehicle 104 may detect the divergence 306 and request a new journey route from the server 202, which will respond by having the route planning system 708 create a new journey route that becomes the new predetermined journey route 302. It is further contemplated that the verification system 710 may further determine the cause of the divergence 306 and, in accordance with the profile of the user, cause the autonomous vehicle audio system 516 or the mobile device audio system 612 to audibly notify the user 102 of the cause of the divergence 306. For example, the verification system 710 may determine that construction has caused a road in the predetermined journey route to be closed; thus, the verification system 710 communicates with the mobile device audio system 612 via the support application 606 to output an audible message stating that the divergence 306 (or turn away from the street) is due to a road closure caused by construction. - After receiving the new
predetermined journey route 302, the verification system 710 again tracks the progress of the autonomous vehicle 104 and/or the mobile computing device 200 along the predetermined journey route 302 for divergences 306 from the predetermined journey route 302. If the actual journey route 304 remains substantially similar to the predetermined journey route 302, then the autonomous vehicle 104 and the user 102 will arrive at the destination 402. - Otherwise, as shown in
step 1218, if the verification system 710 has determined that there is a malicious attack, the server communication system 712 sends instructions to the autonomous vehicle 104 to stop and/or pull over. The server 202 may also be configured to send a notification or otherwise inform authorities and/or other emergency services of the malicious attack and of the position of the user 102 and the autonomous vehicle 104. - Alternatively or additionally, the
mobile computing device 200 may be configured to also compare the progress of the autonomous vehicle 104 and/or the mobile computing device 200 along the predetermined journey route 302. If the divergence 306 occurs, the mobile computing device 200 may be configured to request information regarding the divergence from the server 202 and/or the autonomous vehicle 104. The verification system 710 may then respond with the information. In the event of a malicious attack, the user 102 may then contact authorities and/or other people on their own, or may request through the support application 606 to have the server 202 and/or the autonomous vehicle 104 contact authorities and/or other resources. - It is further contemplated that the
verification system 710 may also be configured to cause the autonomous vehicle audio system 516 and/or the mobile device audio system 612 to, in accordance with the user's profile, audibly output each turn along the predetermined journey path 302, irrespective of whether there is any divergence 306. - It is further contemplated that, in some embodiments, the
user 102 may want to change the destination 402 while en route. Thus, the user 102 may speak into the microphone 518 of the autonomous vehicle 104 and/or the mobile computing device 200 to ask for the change. The autonomous vehicle 104 may then send the request to the route planning system 708, which will create a new journey path and update the predetermined journey path 302 with the new journey path. Again, the user 102 may be notified audibly of the change. In some embodiments, the user 102 may also want to add stops and/or quick destinations along the journey to the destination 402. Again, the user 102 may speak into the microphone 518 of the autonomous vehicle 104 and/or the mobile computing device 200 to ask for the addition. The autonomous vehicle 104 may then send the request to the route planning system 708, which will create a new journey path with the additional stops and update the predetermined journey path 302 with the new journey path. -
FIG. 13 is a schematic block diagram of a procedure 1300 for detecting and authenticating a divergence from the predetermined journey path 302. Procedure 1300 begins with step 1305, and continues to step 1310, where it is determined that information is to be provided audibly to the user 102 when information is provided to the user 102. -
Procedure 1300 continues to step 1315, where a location of the passenger 102 is compared against the predetermined journey path 302. The comparison may detect the divergence 306 from the predetermined journey path 302 based upon the position signal output by the autonomous vehicle 104 and/or the mobile computing device 200. - At
step 1320, responsive to detecting the divergence 306 from the predetermined journey path 302, the divergence 306 is authenticated. As stated above, the divergence 306 may be authenticated in a wide variety of different methods to determine whether there is actually a divergence 306 from the predetermined journey path 302 and the cause of the divergence 306. For example, the divergence 306 may be authenticated by comparing a location of the autonomous vehicle against a location of a second autonomous vehicle that is trusted. In some embodiments, the second autonomous vehicle acts simply as a sensor, in that the second autonomous vehicle is able to determine the location of the autonomous vehicle 104. In some embodiments, there may be multiple additional autonomous vehicles, all of which may be able to determine the behavior and location of the autonomous vehicle 104. -
Procedure 1300 continues to step 1325, where the cause of the divergence 306 from the predetermined journey path 302 is determined to be a malicious attack. As stated above, there are a wide variety of different methods to determine whether the divergence 306 is caused by a malicious attack. - At
step 1330, in accordance with the profile of the user 102, the autonomous vehicle audio system 516 and/or the mobile device audio system 612 outputs the support message 308 to notify the passenger of the divergence and the cause of the divergence. - Then at
step 1335, responsive to the malicious attack, a distress signal is sent to inform authorities of the malicious attack and the divergence 306 from the predetermined journey path 302. Again, the distress signal may be sent from the mobile computing device 200 of the user, the server 202, and/or the autonomous vehicle 104. The procedure 1300 subsequently ends at step 1340. - Assuming the
user 102 and the autonomous vehicle 104 arrive at the intended destination 402, there may be a disembarking process 810. -
FIG. 14 demonstrates one such disembarking process 810. - As discussed above, the
autonomous vehicle 104, the server 202, and/or the mobile computing device 200 may be tracking the location of the autonomous vehicle 104 and/or the mobile computing device 200. Thus, as shown in step 1402, upon arrival near the destination 402, the autonomous vehicle 104 will find a safe location to stop the autonomous vehicle 104 and come to a stop. The location may be considered safe based on a wide variety of different factors, such as vehicular traffic, foot traffic, speed of traffic, weather conditions, etc. When the position signal of the autonomous vehicle 104 and/or the position signal of the mobile computing device 200 comes to a stop, the server 202 will compare the stop location against the location of the destination 402. - At
step 1404, if the location is substantially similar and/or within a threshold distance, the server communication system 712 sends a notification to the user 102. The notification may be audible and may be communicated through the autonomous vehicle audio system 516 and/or the mobile device audio system 612. The notification may further include details of the location of the autonomous vehicle 104 in relation to the destination 402, which may also allow the user 102 to verify that the destination 402 is the intended destination. For example, the notification may state, “We have arrived at a safe location. We are parked 100 feet in front of the post office.” -
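Composing that notification from the stop location's relation to the destination 402 can be sketched trivially; the helper name and phrasing are illustrative, not from the disclosure:

```python
def arrival_message(distance_ft: int, relation: str, landmark: str) -> str:
    """Compose the audible arrival notification from the stop location's
    relation to the destination: a distance, a spatial relation, and a
    named landmark the rider can verify."""
    return ("We have arrived at a safe location. "
            f"We are parked {distance_ft} feet {relation} the {landmark}.")

print(arrival_message(100, "in front of", "post office"))
```

The resulting string matches the example notification quoted above and would be handed to whichever audio system the user's profile selects.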
Step 1406 shows that the autonomous vehicle 104 then scans an area around the vehicle and an intended direction for the user 102 to disembark towards. The autonomous vehicle 104 may scan the surrounding area with the autonomous vehicle sensor system 520. For example, an optical camera and/or lidar may be configured to observe obstacles and movement along a sidewalk nearby. -
Step 1408 shows the autonomous vehicle 104 determining whether the surrounding area is safe. For example, the autonomous vehicle 104 may determine the surrounding area is not safe because the pedestrian 404 is running in the direction of the autonomous vehicle 104, which would create a possibility of the pedestrian hitting the door of the autonomous vehicle 104 when the user 102 opens the door. If the autonomous vehicle 104 determines that the surrounding area is not safe, it continues to scan the surrounding area until there is a safe opportunity for the user 102 to disembark from the autonomous vehicle 104. Many factors may be used to determine whether the surrounding area is safe and/or whether there is a safe opportunity for the user 102 to disembark. These factors may include, but are not limited to, the presence and/or absence of pedestrians, animals, high curbs, skateboarders, flooding, and/or other obstacles; the speed at which objects are moving; the speed at which the user 102 may be able to disembark (i.e., the amount of time necessary for the user 102 to safely disembark); etc. In some embodiments, the autonomous vehicle 104 may create a prediction of where the objects and/or obstacles will be based on their current positions and velocities, and accordingly further determine the safe opportunity, such that the safe opportunity takes into account that no other objects will be entering that area or location while the user 102 disembarks. -
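The prediction described at the end of step 1408 can be sketched as a constant-velocity forecast: project each tracked object through the disembark window and reject the opportunity if any projection enters the door-side exit zone. All units, the 0.5-second sample step, and the box-shaped zone are assumptions; a real perception stack would use richer motion models:

```python
def zone_clear(objects, exit_zone, t_disembark):
    """Each object is (x, y, vx, vy) in meters and m/s; the exit zone is
    an axis-aligned box (xmin, ymin, xmax, ymax) beside the door.  The
    opportunity is safe only if no object's projected position falls in
    the zone at any sampled time during the disembark window.  Sampling
    at 0.5 s could miss very fast crossers; a production check would
    test the whole swept path."""
    xmin, ymin, xmax, ymax = exit_zone
    for x, y, vx, vy in objects:
        for i in range(int(t_disembark / 0.5) + 1):
            t = i * 0.5
            px, py = x + vx * t, y + vy * t
            if xmin <= px <= xmax and ymin <= py <= ymax:
                return False
    return True

zone = (0.0, 0.0, 2.0, 4.0)             # kerbside area beside the door
runner = (10.0, 2.0, -2.0, 0.0)         # pedestrian approaching at 2 m/s
print(zone_clear([runner], zone, 6.0))  # reaches the zone in ~4 s: False
print(zone_clear([runner], zone, 3.0))  # window ends before arrival: True
```

This captures the idea that "safe" depends not just on current positions but on how long the user 102 needs to disembark.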
Step 1410 shows that when the autonomous vehicle 104 has detected a safe opportunity to disembark, the autonomous vehicle 104 notifies the user that there is a safe opportunity to disembark. Again, this may be accomplished through the autonomous vehicle audio system 516 and/or the mobile device audio system 612. -
Step 1412 then shows that the autonomous vehicle 104 may guide the user 102 to exit the vehicle during the safe opportunity to disembark. More specifically, the autonomous vehicle 104 determines directional information to audibly guide disembarking. This directional information may be determined from a wide variety of factors, such as the location of the autonomous vehicle 104, the orientation of the autonomous vehicle 104, a direction of the destination 402 in relation to the autonomous vehicle 104, etc. Again, this information may be audibly provided to the user 102 through the autonomous vehicle audio system 516 and/or the mobile device audio system 612. Furthermore, the autonomous vehicle audio system 516 may be configured to provide spatial audio. In other words, the autonomous vehicle audio system 516 may be configured to output audio from specified directions, such that the user 102 will be able to easily identify the specified direction in which to disembark from the autonomous vehicle 104. While this may be helpful to users 102 with visual impairments, this spatial audio guidance may also be helpful for all users 102 in general. For example, in camp-style seating arrangements (e.g., seats all facing inwards, such that passengers may easily talk to each other, as around a camp fire), the users 102 may easily forget and/or lose track of which direction is “left” and/or “right.” Thus, by spatially providing audible guidance through the autonomous vehicle audio system 516, all users 102 in general would easily understand the correct direction in which to disembark the autonomous vehicle 104. - In some embodiments,
step 1412 further shows that the autonomous vehicle 104 may instruct or direct the user 102 to perform a procedure prior to exiting the autonomous vehicle. For example, the autonomous vehicle 104 may require that the user 102 use the arm distal from the autonomous vehicle 104 door to open the door (i.e., performing a Dutch reach). -
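The spatial audio cue of step 1412 ultimately needs the destination's bearing expressed in the vehicle's own frame, so the sound can come from the correct side. A minimal sketch in compass degrees; the 90-degree sectors and the label set are assumptions:

```python
def cue_direction(vehicle_heading_deg: float, bearing_to_dest_deg: float) -> str:
    """Convert a compass bearing toward the destination into the
    vehicle's frame and pick the side from which the spatial audio
    cue should be played."""
    rel = (bearing_to_dest_deg - vehicle_heading_deg) % 360.0
    if rel < 45 or rel >= 315:
        return "front"
    if rel < 135:
        return "right"
    if rel < 225:
        return "rear"
    return "left"

# Vehicle faces east (90 deg); the destination lies due south (180 deg),
# so the cue should come from the passenger's right-hand side.
print(cue_direction(90, 180))   # right
```

In a camp-style cabin the same relative bearing works for every seat, which is why a directional sound beats the words "left" or "right."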
Optional step 1414 shows that the autonomous vehicle 104 observes and determines whether the user 102 has conducted the requested procedure. Continuing the earlier example, the autonomous vehicle 104 may observe through an in-cabin camera of the autonomous vehicle sensor system 520 to determine whether the user has performed a Dutch reach to open the door. - If the
autonomous vehicle 104 does not observe and/or determines that the user 102 has not conducted the requested procedure, the autonomous vehicle 104 may again instruct or direct the user 102 to perform the procedure. For example, when a sighted user 102 has not performed a Dutch reach, the autonomous vehicle 104 may remain locked and notify the user 102 again to perform a Dutch reach. - At
step 1416, if the autonomous vehicle 104 observes and determines that the user 102 has conducted the requested procedure, the autonomous vehicle 104 may then allow the user 102 to disembark from the autonomous vehicle 104 during the safe opportunity. If the safe opportunity is no longer present, the autonomous vehicle 104 may notify the user 102 that the safe opportunity has ended. - At
step 1418, as the user 102 disembarks from the autonomous vehicle 104, the autonomous vehicle 104 may, in accordance with the profile of the user 102, output the audible message 406 exterior to the autonomous vehicle 104, the audible message 406 indicating that the user 102 is disembarking from the autonomous vehicle 104. This audible message 406 may be helpful in alerting nearby pedestrians 404 that the disembarking of the user 102 may cause an obstruction in their path. - After the
user 102 has disembarked from the autonomous vehicle 104, the server 202 may cause the mobile device audio system 612 to output audible guidance from the user's location to the destination 402. -
FIG. 15 is a schematic block diagram of a procedure 1500 for disembarking from the autonomous vehicle 104. Procedure 1500 begins with step 1505, and continues to step 1510, where it is determined that information is to be provided audibly to the user 102 when information is provided to the user 102. -
Procedure 1500 continues to step 1515, where the autonomous vehicle 104 searches for a safe location to stop the autonomous vehicle 104. The safe location may be directly in front of the destination 402. - At
step 1520, in accordance with the profile of the passenger 102, the autonomous vehicle 104 may cause the autonomous vehicle audio system 516 to audibly output that the autonomous vehicle has found the safe location to stop. -
Procedure 1500 continues to step 1525, where the autonomous vehicle 104 then searches for and attempts to detect a safe opportunity for the passenger to disembark or exit the autonomous vehicle 104. As stated above, the safe opportunity may be determined based on a wide variety of different factors. - At
step 1530, responsive to detecting the safe opportunity, and in accordance with the profile of the passenger 102, the autonomous vehicle audio system 516 audibly notifies and guides the passenger 102 to exit the autonomous vehicle 104 safely during the safe opportunity. As stated above, the autonomous vehicle audio system 516 may spatially guide the passenger 102 through directional audio. - Then at
step 1535, in accordance with the profile of the passenger 102, the audible message 406 is output, wherein the content of the audible message 406 is based on the fact that the passenger 102 exiting the autonomous vehicle 104 is disabled. The procedure subsequently ends at step 1540. - Various functions described herein can be implemented in hardware, software, or any combination thereof. If implemented in software, the functions can be stored on or transmitted over as one or more instructions or code on a computer-readable medium. Computer-readable media includes computer-readable storage media. Computer-readable storage media can be any available storage media that can be accessed by a computer. By way of example, and not limitation, such computer-readable storage media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer. Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc (BD), where disks usually reproduce data magnetically and discs usually reproduce data optically with lasers. Further, a propagated signal is not included within the scope of computer-readable storage media. Computer-readable media also includes communication media, including any medium that facilitates transfer of a computer program from one place to another. A connection, for instance, can be a communication medium. 
For example, if the software is transmitted from a website, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, digital subscriber line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of communication medium. Combinations of the above should also be included within the scope of computer-readable media.
- Alternatively, or in addition, the functionality described herein can be performed, at least in part, by one or more hardware logic components. For example, and without limitation, illustrative types of hardware logic components that can be used include Field-programmable Gate Arrays (FPGAs), Application-specific Integrated Circuits (ASICs), Application-specific Standard Products (ASSPs), System-on-a-chip systems (SOCs), Complex Programmable Logic Devices (CPLDs), etc. Similarly, functionality described herein can be performed on different components. For example, and without limitation, determinations by the
verification system 710 may be performed onboard the autonomous vehicle 104, and vice versa. - What has been described above includes examples of one or more embodiments. It is, of course, not possible to describe every conceivable modification and alteration of the above devices or methodologies for purposes of describing the aforementioned aspects, but one of ordinary skill in the art can recognize that many further modifications and permutations of various aspects are possible. Accordingly, the described aspects are intended to embrace all such alterations, modifications, and variations that fall within the spirit and scope of the appended claims. Furthermore, to the extent that the term “includes” is used in either the detailed description or the claims, such term is intended to be inclusive in a manner similar to the term “comprising,” as “comprising” is interpreted when employed as a transitional word in a claim.
Claims (20)
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/586,645 (US10953852B1) | 2019-09-27 | 2019-09-27 | Pick-up authentication via audible signals |
US17/204,474 (US11220238B2) | 2019-09-27 | 2021-03-17 | Pick-up authentication via audible signals |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/204,474 Continuation US11220238B2 (en) | 2019-09-27 | 2021-03-17 | Pick-up authentication via audible signals |
Publications (2)
Publication Number | Publication Date |
---|---|
US10953852B1 | 2021-03-23 |
US20210094508A1 | 2021-04-01 |
Family
ID=74882673
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/586,645 Active US10953852B1 (en) | 2019-09-27 | 2019-09-27 | Pick-up authentication via audible signals |
US17/204,474 Active US11220238B2 (en) | 2019-09-27 | 2021-03-17 | Pick-up authentication via audible signals |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/204,474 Active US11220238B2 (en) | 2019-09-27 | 2021-03-17 | Pick-up authentication via audible signals |
Country Status (1)
Country | Link |
---|---|
US (2) | US10953852B1 (en) |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6729518B2 (en) * | 2017-08-25 | 2020-07-22 | トヨタ自動車株式会社 | Self-driving vehicle and driverless transportation system |
US10696222B1 (en) * | 2019-03-12 | 2020-06-30 | Waymo Llc | Communications for autonomous vehicles |
US11856480B2 (en) * | 2019-09-27 | 2023-12-26 | Ford Global Technologies, Llc | Haptic guidance and navigation to mobile points of entry |
US20220107650A1 (en) * | 2020-10-06 | 2022-04-07 | Waymo Llc | Providing deliveries of goods using autonomous vehicles |
US11772520B2 (en) * | 2020-11-09 | 2023-10-03 | Ford Global Technologies, Llc | Remote notification and adjustment of a passenger compartment arrangement |
US20220144141A1 (en) * | 2020-11-09 | 2022-05-12 | Ford Global Technologies, Llc | Vehicular system capable of adjusting a passenger compartment arrangement |
US11884238B2 (en) * | 2021-10-21 | 2024-01-30 | Zoox, Inc. | Vehicle door interface interactions |
Family Cites Families (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8595302B2 (en) | 2008-02-22 | 2013-11-26 | Qualcomm Incorporated | Method and apparatus for monitoring message status in an asynchronous mediated communication system |
US8149850B2 (en) | 2008-02-22 | 2012-04-03 | Qualcomm Incorporated | Method and apparatus for asynchronous mediated communication |
US9008628B2 (en) * | 2008-05-19 | 2015-04-14 | Tbm, Llc | Interactive voice access and notification system |
US9659164B2 (en) | 2011-08-02 | 2017-05-23 | Qualcomm Incorporated | Method and apparatus for using a multi-factor password or a dynamic password for enhanced security on a device |
US9672049B2 (en) | 2011-09-22 | 2017-06-06 | Qualcomm Incorporated | Dynamic and configurable user interface |
US20130159939A1 (en) | 2011-10-12 | 2013-06-20 | Qualcomm Incorporated | Authenticated gesture recognition |
US10212207B2 (en) * | 2013-08-21 | 2019-02-19 | At&T Intellectual Property I, L.P. | Method and apparatus for accessing devices and services |
US10369967B2 (en) * | 2014-04-01 | 2019-08-06 | Mico Latta Inc. | Vehicle and program for vehicle |
US9369462B2 (en) * | 2014-08-05 | 2016-06-14 | Dell Products L.P. | Secure data entry via audio tones |
US10536867B2 (en) | 2015-02-12 | 2020-01-14 | Qualcomm Incorporated | On-device behavioral analysis to detect malfunction due to RF interference |
US9979606B2 (en) | 2015-03-04 | 2018-05-22 | Qualcomm Incorporated | Behavioral analysis to automate direct and indirect local monitoring of internet of things device health |
US20160327596A1 (en) | 2015-05-06 | 2016-11-10 | Qualcomm Incorporated | Behavioral Analysis To Detect Anomalous Electromagnetic Emissions |
US10025938B2 (en) | 2016-03-02 | 2018-07-17 | Qualcomm Incorporated | User-controllable screen privacy software |
US10154048B2 (en) | 2016-03-18 | 2018-12-11 | Qualcomm Incorporated | Methods and systems for location-based authentication using neighboring sensors |
US10275955B2 (en) | 2016-03-25 | 2019-04-30 | Qualcomm Incorporated | Methods and systems for utilizing information collected from multiple sensors to protect a vehicle from malware and attacks |
US9961496B2 (en) | 2016-06-17 | 2018-05-01 | Qualcomm Incorporated | Methods and systems for context based anomaly monitoring |
WO2018044285A1 (en) * | 2016-08-31 | 2018-03-08 | Ford Global Technologies, Llc | Vehicle movement authorization |
US10515289B2 (en) | 2017-01-09 | 2019-12-24 | Qualcomm Incorporated | System and method of generating a semantic representation of a target image for an image processing operation |
US10464530B2 (en) * | 2017-01-17 | 2019-11-05 | Nio Usa, Inc. | Voice biometric pre-purchase enrollment for autonomous vehicles |
JP6729518B2 (en) * | 2017-08-25 | 2020-07-22 | トヨタ自動車株式会社 | Self-driving vehicle and driverless transportation system |
US10725483B2 (en) | 2017-09-01 | 2020-07-28 | Qualcomm Incorporated | Personal security robotic vehicle |
- 2019
  - 2019-09-27 US US16/586,645 patent/US10953852B1/en active Active
- 2021
  - 2021-03-17 US US17/204,474 patent/US11220238B2/en active Active
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20210156700A1 (en) * | 2019-11-22 | 2021-05-27 | Lyft, Inc. | Determining ridership errors by analyzing provider-requestor consistency signals across ride stages |
US11761770B2 (en) * | 2019-11-22 | 2023-09-19 | Lyft, Inc. | Determining ridership errors by analyzing provider-requestor consistency signals across ride stages |
WO2024121047A1 (en) * | 2022-12-08 | 2024-06-13 | Volkswagen Aktiengesellschaft | Method for operating a system comprising a self-driving vehicle |
Also Published As
Publication number | Publication date |
---|---|
US20210197765A1 (en) | 2021-07-01 |
US11220238B2 (en) | 2022-01-11 |
US10953852B1 (en) | 2021-03-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11220238B2 (en) | Pick-up authentication via audible signals | |
US11267401B2 (en) | Safe passenger disembarking for autonomous vehicles via audible signals | |
US9953283B2 (en) | Controlling autonomous vehicles in connection with transport services | |
EP3657465A1 (en) | Vehicle control device and vehicle control method | |
JP2019031284A (en) | Autonomous vehicles | |
JP2020532452A (en) | Identification of unassigned passengers in autonomous vehicles | |
JP2006189394A (en) | Vehicle agent device | |
CN112601689B (en) | Vehicle travel control method and travel control device | |
JP2015032312A (en) | Vehicle-to-pedestrian communication system and method | |
US10562541B1 (en) | Contextual autonomous vehicle support through speech interaction | |
JP2006199264A (en) | Junction supporting device and junction supporting method | |
US10919439B2 (en) | Virtual vehicle protective fence | |
WO2020157531A1 (en) | Boarding permission determination device and boarding permission determination method | |
JPWO2020031695A1 (en) | Information processing equipment, mobiles, information processing methods and programs | |
US20210354660A1 (en) | Journey Verification for Ridesharing Via Audible Signals | |
JP7119846B2 (en) | VEHICLE TRIP CONTROL METHOD AND TRIP CONTROL DEVICE | |
CN111757300A (en) | Agent device, control method for agent device, and storage medium | |
Fink et al. | The Autonomous Vehicle Assistant (AVA): Emerging technology design supporting blind and visually impaired travelers in autonomous transportation | |
JP4498911B2 (en) | Alarm generation system and communication equipment | |
KR102428079B1 (en) | School vehicle passenger management system and control method thereof | |
JP2009252022A (en) | Hazardous condition determination device | |
JP2020166715A (en) | Information processing system, mobile body, information processing method, and program | |
JP7469358B2 (en) | Traffic Safety Support System | |
JP7372381B2 (en) | Traffic safety support system | |
US20240173179A1 (en) | Apparatus and method for guiding the transportation vulnerable |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| FEPP | Fee payment procedure | Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
| STCF | Information on status: patent grant | Free format text: PATENTED CASE |
| AS | Assignment | Owner name: GM CRUISE HOLDINGS LLC, CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:KRISHNAMURTHI, GOVIND;REEL/FRAME:055888/0230 Effective date: 20190924 |
| MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |