US20200247357A1 - Self-driving vehicle systems and methods - Google Patents

Self-driving vehicle systems and methods

Info

Publication number
US20200247357A1
US20200247357A1 (application US16/528,100)
Authority
US
United States
Prior art keywords
seat
self
rider
driving vehicle
program instructions
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US16/528,100
Other versions
US10744976B1
Inventor
Eric John Wengreen
Wesley Edward Schwie
Current Assignee
Drivent LLC
Original Assignee
Drivent LLC
Priority date
Priority claimed from U.S. application Ser. No. 16/266,698 (external-priority patent US10377342B1)
Application filed by Drivent LLC
Priority to US16/528,100 (patent US10744976B1)
Assigned to DRIVENT LLC. Assignment of assignors' interest (see document for details). Assignors: SCHWIE, WESLEY EDWARD; WENGREEN, ERIC JOHN
Priority to US16/678,751 (patent US11221622B2)
Priority to US16/678,660 (patent US11221621B2)
Publication of US20200247357A1
Application granted
Publication of US10744976B1
Legal status: Active


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q50/10Services
    • G06Q50/26Government or public services
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60NSEATS SPECIALLY ADAPTED FOR VEHICLES; VEHICLE PASSENGER ACCOMMODATION NOT OTHERWISE PROVIDED FOR
    • B60N2/00Seats specially adapted for vehicles; Arrangement or mounting of seats in vehicles
    • B60N2/002Seats provided with an occupancy detection means mounted therein or thereon
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R22/00Safety belts or body harnesses in vehicles
    • B60R22/48Control systems, alarms, or interlock systems, for the correct application of the belt or harness
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W60/00Drive control systems specially adapted for autonomous road vehicles
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0276Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
    • G06K9/00255
    • G06K9/00288
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q20/00Payment architectures, schemes or protocols
    • G06Q20/04Payment circuits
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q20/00Payment architectures, schemes or protocols
    • G06Q20/08Payment architectures
    • G06Q20/12Payment architectures specially adapted for electronic shopping systems
    • G06Q20/127Shopping or accessing services according to a time-limitation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q20/00Payment architectures, schemes or protocols
    • G06Q20/08Payment architectures
    • G06Q20/14Payment architectures specially adapted for billing systems
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q20/00Payment architectures, schemes or protocols
    • G06Q20/08Payment architectures
    • G06Q20/14Payment architectures specially adapted for billing systems
    • G06Q20/145Payments according to the detected use or quantity
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q20/00Payment architectures, schemes or protocols
    • G06Q20/30Payment architectures, schemes or protocols characterised by the use of specific devices or networks
    • G06Q20/32Payment architectures, schemes or protocols characterised by the use of specific devices or networks using wireless devices
    • G06Q20/322Aspects of commerce using mobile devices [M-devices]
    • G06Q20/3224Transactions dependent on location of M-devices
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q20/00Payment architectures, schemes or protocols
    • G06Q20/38Payment protocols; Details thereof
    • G06Q20/40Authorisation, e.g. identification of payer or payee, verification of customer or shop credentials; Review and approval of payers, e.g. check credit lines or negative lists
    • G06Q20/401Transaction verification
    • G06Q20/4015Transaction verification using location information
    • G06Q20/40155Transaction verification using location information for triggering transactions
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/59Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
    • G06V20/593Recognising seat occupancy
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161Detection; Localisation; Normalisation
    • G06V40/166Detection; Localisation; Normalisation using acquisition arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/172Classification, e.g. identification
    • GPHYSICS
    • G07CHECKING-DEVICES
    • G07FCOIN-FREED OR LIKE APPARATUS
    • G07F17/00Coin-freed apparatus for hiring articles; Coin-freed facilities or services
    • G07F17/0042Coin-freed apparatus for hiring articles; Coin-freed facilities or services for hiring of objects
    • G07F17/0057Coin-freed apparatus for hiring articles; Coin-freed facilities or services for hiring of objects for the hiring or rent of vehicles, e.g. cars, bicycles or wheelchairs
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R22/00Safety belts or body harnesses in vehicles
    • B60R22/48Control systems, alarms, or interlock systems, for the correct application of the belt or harness
    • B60R2022/4808Sensing means arrangements therefor
    • B60R2022/4816Sensing means arrangements therefor for sensing locking of buckle
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R22/00Safety belts or body harnesses in vehicles
    • B60R22/48Control systems, alarms, or interlock systems, for the correct application of the belt or harness
    • B60R2022/4866Displaying or indicating arrangements thereof
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D2201/00Application
    • G05D2201/02Control of position of land vehicles
    • G05D2201/0213Road vehicle, e.g. car or truck

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Accounting & Taxation (AREA)
  • Theoretical Computer Science (AREA)
  • Strategic Management (AREA)
  • General Business, Economics & Management (AREA)
  • Finance (AREA)
  • Mechanical Engineering (AREA)
  • Economics (AREA)
  • Development Economics (AREA)
  • Automation & Control Theory (AREA)
  • Tourism & Hospitality (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Transportation (AREA)
  • Human Computer Interaction (AREA)
  • Educational Administration (AREA)
  • Human Resources & Organizations (AREA)
  • Marketing (AREA)
  • Primary Health Care (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Computer Security & Cryptography (AREA)
  • Traffic Control Systems (AREA)

Abstract

A seat-belt monitoring system can include a first self-driving vehicle having a first seat, a first seat belt, and a first seat belt sensor; and a second self-driving vehicle having a second seat, a second seat belt, and a second seat belt sensor. The first seat belt sensor may be configured to detect a buckled state of the first seat belt and an unbuckled state of the first seat belt. The second seat belt sensor may be configured to detect a buckled state of the second seat belt and an unbuckled state of the second seat belt.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • The entire contents of the following application are incorporated by reference herein: U.S. patent application Ser. No. 16/266,698; filed Feb. 4, 2019; and entitled SELF-DRIVING VEHICLE SYSTEMS AND METHODS.
  • The entire contents of the following application are incorporated by reference herein: U.S. Patent Application No. 62/823,552; filed Mar. 25, 2019; and entitled SELF-DRIVING VEHICLE SYSTEMS AND METHODS.
  • The entire contents of the following application are incorporated by reference herein: U.S. Patent Application No. 62/824,941; filed Mar. 27, 2019; and entitled SELF-DRIVING VEHICLE SYSTEMS AND METHODS.
  • The entire contents of the following application are incorporated by reference herein: U.S. Patent Application No. 62/836,525; filed Apr. 19, 2019; and entitled SELF-DRIVING VEHICLE SYSTEMS AND METHODS.
  • The entire contents of the following application are incorporated by reference herein: U.S. Patent Application No. 62/841,785; filed May 1, 2019; and entitled SELF-DRIVING VEHICLE SYSTEMS AND METHODS.
  • BACKGROUND
  • Field
  • Various embodiments disclosed herein relate to vehicles. Certain embodiments relate to self-driving vehicles and seat belts.
  • Description of Related Art
  • Seat belts save thousands of lives every year. A seat belt, however, cannot protect a passenger if the passenger is not wearing the seat belt. Thus, there is a need for systems and methods that encourage seat belt use.
  • SUMMARY
  • The ability of self-driving vehicles to save lives is so impressive that society has a moral imperative to develop self-driving technology such that it can be widely adopted. Self-driving vehicles will save tens of thousands of lives per year. The majority of vehicle-related deaths are caused by driver error. Tests have shown that self-driving vehicles nearly eliminate self-inflicted accidents (although they are not immune to accidents caused by human drivers of other vehicles). Self-driving vehicles can have unlimited attention spans and can process complex sensor data nearly instantaneously.
  • Seat belts have been proven to save lives. One problem, however, is that people sometimes do not wear seat belts. Failing to wear a seat belt puts the rider at risk of death and puts the self-driving vehicle owner at risk of devastating financial loss. Various embodiments disclosed herein increase seat belt use.
  • In some embodiments, a seat-belt monitoring system comprises a first self-driving vehicle configured to transport a rider; a first seat coupled to the first self-driving vehicle; a first seat belt configured to secure the rider in the first seat; and/or a first seat belt sensor configured to detect at least one of a buckled state of the first seat belt and an unbuckled state of the first seat belt.
  • In some embodiments, a seat-belt monitoring system comprises a first occupancy sensor configured to detect the rider sitting in the first seat; and/or a computer system comprising at least one computer and located remotely relative to the first self-driving vehicle.
  • In some embodiments, a computer system comprises a processor and a memory having program instructions that when executed by the processor are configured to execute any of the steps described herein.
  • In some embodiments, a first self-driving vehicle comprises a computer system having a processor and a memory. The memory can comprise program instructions that when executed by the processor are configured to execute any of the steps described herein.
  • In some embodiments, a second self-driving vehicle comprises a computer system having a processor and a memory. The memory can comprise program instructions that when executed by the processor are configured to execute any of the steps described herein.
  • In some embodiments, a first self-driving vehicle comprises a computer system having a processor and a memory. The memory can comprise program instructions (e.g., of the first self-driving vehicle) that when executed by the processor are configured to send a first wireless communication to another computer system located remotely relative to the first self-driving vehicle.
  • In some embodiments, the first self-driving vehicle comprises first program instructions configured to send a first wireless communication to a remotely located computer system in response to the first occupancy sensor detecting the rider sitting in the first seat and in response to the first seat belt sensor detecting the unbuckled state.
  • In some embodiments, a computer system comprises second program instructions configured to send a second wireless communication to a remote computing device of the rider in response to the computer system receiving the first wireless communication. The second wireless communication can be configured to prompt the rider to place the first seat belt in the buckled state. The second wireless communication can be configured to warn the rider of a fine for the unbuckled state of the first seat belt.
  • In some embodiments, a computer system comprises a memory having a buckling history of the rider. The buckling history can comprise data indicative of past buckling behavior of the rider prior to the rider entering the first self-driving vehicle. The computer system can comprise second program instructions configured to update the buckling history in response to receiving the first wireless communication.
  • In some embodiments, the computer system comprises second program instructions configured to deny a future ride request from the rider in response to the computer system receiving the first wireless communication.
  • In some embodiments, the computer system comprises a memory having a buckling history of the rider. The computer system can comprise second program instructions configured to update the buckling history of the rider in response to receiving the first wireless communication. The second program instructions can be configured to deny a future ride request of the rider in response to the computer system analyzing the buckling history.
  • In some embodiments, the computer system comprises a memory having a buckling history of the rider. The buckling history can be at least partially based on the first wireless communication. The computer system can comprise second program instructions configured to determine a future ride price for the rider at least partially based on the buckling history.
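The buckling-history embodiments above can be sketched as a small policy module; the fare, surcharge, and denial threshold are assumed example values, not figures from the patent:

```python
BASE_FARE = 10.00
SURCHARGE_PER_VIOLATION = 1.50

# rider id -> count of unbuckled events (the "buckling history")
buckling_history: dict[str, int] = {}

def record_unbuckled_event(rider_id: str) -> None:
    # Called when the computer system receives the first wireless communication.
    buckling_history[rider_id] = buckling_history.get(rider_id, 0) + 1

def future_ride_price(rider_id: str) -> float:
    # The future ride price is at least partially based on the buckling history.
    return BASE_FARE + SURCHARGE_PER_VIOLATION * buckling_history.get(rider_id, 0)

def should_deny_ride(rider_id: str, threshold: int = 3) -> bool:
    # One possible policy: deny future ride requests after repeated violations.
    return buckling_history.get(rider_id, 0) >= threshold

record_unbuckled_event("r1")
record_unbuckled_event("r1")
```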
  • In some embodiments, the seat-belt monitoring system comprises a computer system having at least one computer. The computer system can be located remotely relative to the first self-driving vehicle (e.g., the computer system is not mechanically coupled to the first self-driving vehicle). Even though the computer system is not mechanically coupled to the first self-driving vehicle, the computer system may be communicatively coupled to the first self-driving vehicle.
  • In some embodiments, the first self-driving vehicle comprises first program instructions configured to send a first wireless communication to the computer system in response to the first seat belt sensor detecting the unbuckled state.
  • In some embodiments, the computer system comprises second program instructions configured to fine an account of the rider in response to the computer system receiving the first wireless communication.
  • In some embodiments, the first wireless communication comprises an identification of the rider. The identification can be configured to enable the second program instructions to fine the account.
  • In some embodiments, the first wireless communication comprises data configured to enable the computer system to identify an account of the rider. Identifying the account of the rider can enable the system to fine the account of the rider instead of mistakenly fining the account of another person.
  • In some embodiments, the seat-belt monitoring system comprises many self-driving vehicles. In some embodiments, the seat-belt monitoring system comprises a second self-driving vehicle having a second seat belt and a second seat belt sensor. The second seat belt sensor can be configured to detect at least one of a buckled state of the second seat belt and an unbuckled state of the second seat belt.
  • In some embodiments, the computer system comprises second program instructions. The second self-driving vehicle can comprise third program instructions configured to send a second wireless communication to the computer system in response to the second seat belt sensor detecting the unbuckled state of the second seat belt. The second self-driving vehicle can comprise third program instructions configured to send a second wireless communication to the computer system in response to both the second seat belt sensor detecting the unbuckled state of the second seat belt and a second occupancy sensor detecting the rider sitting in the second seat.
  • In some embodiments, the computer system comprises a memory having a buckling history of the rider. The buckling history can comprise data indicative of buckling behavior of the rider.
  • The buckling history data can be at least partially based on the first wireless communication and the second wireless communication.
  • In some embodiments, the second program instructions of the computer system are configured to fine an account of the rider a first amount in response to the computer system receiving the first wireless communication, and the second program instructions are configured to fine the account a second amount in response to the computer system receiving the second wireless communication.
  • In some embodiments, the second program instructions are configured to make the second amount larger than the first amount in response to the computer system receiving the second wireless communication after receiving the first wireless communication.
  • In some embodiments, the computer system is configured to determine a future ride price at least partially based on the first wireless communication and the second wireless communication such that the future ride price is at least partially affected by a buckling history of the rider.
  • In some embodiments, the second program instructions are configured to increase the future ride price in response to the buckling history indicating the unbuckled state of the first seat belt and the unbuckled state of the second seat belt.
  • In some embodiments, a seat-belt monitoring system comprises a first occupancy sensor configured to detect the rider sitting in the first seat. The first self-driving vehicle can comprise first program instructions configured to exit a first driving mode and enter a second driving mode in response to the first occupancy sensor detecting the rider sitting in the first seat and in response to the first seat belt sensor detecting the unbuckled state.
  • In some embodiments, just detecting the unbuckled state is insufficient to trigger a reaction (because an unbuckled state could simply be due to the seat being unoccupied). Detecting the unbuckled state and detecting the rider sitting in the seat, however, can trigger a reaction, according to some embodiments.
  • In some embodiments, the first self-driving vehicle comprises an object detection system having at least one of a camera, a radar, and a lidar. The object detection system can be configured to detect a second vehicle on a road to enable the first self-driving vehicle to avoid colliding with the second vehicle. First program instructions can be configured to maintain a greater minimum distance (and/or a greater average distance) from the first self-driving vehicle to the second vehicle in the second mode than in the first mode. The second mode can be configured to keep the first self-driving vehicle farther away from other vehicles than the first mode. In some embodiments, the minimum distance of the second mode is less than 10 meters and the minimum distance of the first mode is less than 5 meters. In some embodiments, the minimum distance of the second mode is less than 15 meters and the minimum distance of the first mode is less than 10 meters. In some embodiments, the minimum distance of the second mode is at least two meters greater than the minimum distance of the first mode.
  • In some embodiments, the first self-driving vehicle comprises an object detection system having at least one of a camera, a radar, and a lidar. The object detection system is configured to detect objects on roads to enable the first self-driving vehicle to avoid colliding with the objects.
  • In some embodiments, in the first driving mode the first program instructions are configured to exceed a first safety threshold while using the object detection system to avoid colliding with the objects while driving on the roads. In some embodiments, in the second driving mode the first program instructions are configured to exceed a second safety threshold while using the object detection system to avoid colliding with the objects while driving on the roads. The second safety threshold corresponds to a lower probability of colliding with objects than the first safety threshold.
  • In some embodiments of the first driving mode, the first program instructions are configured to cause the first self-driving vehicle to transport the rider. In some embodiments of the second driving mode, the first program instructions are configured to cause the first self-driving vehicle to find a location to park.
  • In some embodiments, the seat-belt monitoring system comprises a first occupancy sensor configured to detect the rider sitting in the first seat. The first self-driving vehicle can comprise a camera and first program instructions. The first program instructions can be configured to cause the camera to take a picture of the rider in the first seat in response to the first occupancy sensor detecting the rider sitting in the first seat and in response to the first seat belt sensor detecting the unbuckled state.
  • In some embodiments, the seat-belt monitoring system comprises a computer system having at least one computer. The computer system can be located remotely relative to the first self-driving vehicle (e.g., such that the computer system is not mechanically coupled to the self-driving vehicle). The first program instructions can be configured to send a first wireless communication comprising the picture to the computer system in response to the first occupancy sensor detecting the rider sitting in the first seat and in response to the first seat belt sensor detecting the unbuckled state.
  • In some embodiments, the seat-belt monitoring system comprises a first occupancy sensor configured to detect the rider sitting in the first seat. The seat-belt monitoring system can comprise a computer system comprising at least one computer. The computer system can be located remotely relative to the first self-driving vehicle (e.g., such that the computer system is not mechanically coupled to the self-driving vehicle). The computer system can comprise a memory having a buckling history of the rider. The first self-driving vehicle can comprise a camera configured to take a picture of the rider in the first seat during the unbuckled state.
  • In some embodiments, the first self-driving vehicle comprises first program instructions configured to send a first wireless communication to the computer system in response to the first occupancy sensor detecting the rider sitting in the first seat and in response to the first seat belt sensor detecting the unbuckled state. The computer system can comprise second program instructions configured to save the picture to the memory such that the buckling history of the rider comprises the picture as evidence of the unbuckled state. Saving the picture as evidence can enable sending the picture to the rider (e.g., to show the rider that she was the one who was unbuckled as the vehicle was moving). Some embodiments comprise sending the picture to a remote computing device of the rider (e.g., in response to the computer system receiving the first wireless communication).
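The evidence flow above (picture taken during the unbuckled state, sent with the first wireless communication, saved into the buckling history) can be sketched as follows; the record structure is an illustrative assumption:

```python
from dataclasses import dataclass, field

@dataclass
class BucklingRecord:
    pictures: list = field(default_factory=list)

# rider id -> buckling history held in the computer system's memory
history: dict[str, BucklingRecord] = {}

def receive_first_communication(rider_id: str, picture: bytes) -> None:
    # Save the picture so the buckling history includes it as evidence of
    # the unbuckled state (e.g., to later show the rider who was unbuckled).
    record = history.setdefault(rider_id, BucklingRecord())
    record.pictures.append(picture)

receive_first_communication("rider-1", b"jpeg-bytes")
```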
  • In some embodiments, the seat-belt monitoring system comprises a computer system having at least one computer. The computer system can comprise second program instructions configured to notify the rider, at least partially in response to the second program instructions analyzing at least one of road conditions, a travel route, and traffic conditions, that the rider is permitted to unbuckle the first seat belt. The second program instructions can analyze at least one of road conditions, a travel route, and traffic conditions, and then in response to the analysis, the second program instructions can notify the rider that the rider is permitted to unbuckle the first seat belt.
  • In some embodiments, the seat-belt monitoring system comprises a computer system having at least one computer. The first self-driving vehicle can comprise an object detection system having at least one of a camera, a radar, and a lidar. The object detection system can be configured to detect a second vehicle. The computer system can comprise second program instructions configured to notify the rider, at least partially in response to analyzing a distance from the first self-driving vehicle to the second vehicle, that the rider is permitted to unbuckle the first seat belt.
  • In some embodiments, a computer system can comprise second program instructions configured to notify the rider to buckle the first seat belt, wherein the notifying is at least partially in response to the second program instructions analyzing at least one of road conditions, a travel route, and traffic conditions. The second program instructions can analyze at least one of road conditions, a travel route, and traffic conditions, and then in response to the analysis, the second program instructions can notify the rider to buckle the first seat belt.
  • In some embodiments, the seat-belt monitoring system comprises a computer system having at least one computer. The computer system comprises second program instructions configured to instruct the rider to buckle the first seat belt in response to the first seat belt sensor detecting the unbuckled state and in response to the second program instructions analyzing at least one of road conditions and traffic conditions.
  • In some embodiments, the seat-belt monitoring system comprises a computer system having at least one computer. The first self-driving vehicle can comprise an object detection system having at least one of a camera, a radar, and a lidar. The object detection system can be configured to detect a second vehicle. The computer system can comprise second program instructions configured to instruct the rider to buckle the first seat belt in response to the first seat belt sensor detecting the unbuckled state and in response to the object detection system analyzing a distance from the first self-driving vehicle to the second vehicle.
  • In some embodiments, the seat-belt monitoring system comprises a first weight sensor configured to detect first data indicative of a first weight of the rider sitting in the first seat coupled to the first self-driving vehicle. The seat-belt monitoring system can comprise a computer system comprising at least one computer. The first self-driving vehicle can be configured to send a first wireless communication comprising the first data to the computer system. The seat-belt monitoring system can comprise a second self-driving vehicle having a second seat and a second weight sensor configured to detect second data indicative of a second weight of an unknown rider sitting in the second seat. The computer system can comprise second program instructions configured to determine that the unknown rider is the rider in response to comparing the second data to the first data.
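As a non-limiting illustration of the weight-comparison step described above, one simple matching rule is to treat the unknown rider as the known rider when the two seat-sensor weights agree to within a small tolerance. The function name and tolerance value below are hypothetical and not drawn from the disclosure:

```python
def same_rider(first_weight_kg, second_weight_kg, tolerance_kg=2.0):
    """Hypothetical matching rule: report that the unknown rider sitting in
    the second seat is the known rider when the two weight-sensor readings
    agree to within tolerance_kg."""
    return abs(first_weight_kg - second_weight_kg) <= tolerance_kg

# Readings close together match; readings far apart do not.
match = same_rider(70.4, 71.1)      # True
no_match = same_rider(70.4, 90.0)   # False
```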
  • In some embodiments, the seat-belt monitoring system comprises a first occupancy sensor configured to detect the rider sitting in the first seat. The seat-belt monitoring system can comprise a computer system comprising at least one computer. The computer system can comprise second program instructions configured to receive an identification of the rider. The second program instructions can be configured to receive from the first occupancy sensor an indication of the rider sitting in the first seat. The second program instructions can be configured to record that the rider is sitting in the first seat in response to receiving the identification and the indication.
  • In some embodiments, the first self-driving vehicle comprises an audio speaker. The first program instructions can be configured to cause the audio speaker to emit words instructing the rider to put the first seat belt in the buckled state in response to the first seat belt sensor detecting the unbuckled state. The words can warn the rider that the rider will receive a fine if the rider does not put the first seat belt in the buckled state.
  • In some embodiments, the first self-driving vehicle comprises at least one of an audio speaker and a display screen. The first program instructions can be configured to at least one of cause the audio speaker to emit words and cause the display screen to display the words. The words can be configured to warn the rider to put the first seat belt in the buckled state to avoid a fine.
  • Additionally, the disclosure includes a self-driving vehicle comprising a first seat coupled to the vehicle; a crash detection system coupled to the vehicle, the crash detection system comprising at least one of a camera, a radar, a lidar, and an accelerometer; and at least one motor coupled to the first seat and communicatively coupled to the crash detection system. The at least one motor may be arranged and configured to move the first seat in response to the crash detection system detecting an indication of an imminent crash.
  • In some embodiments, the at least one motor is arranged and configured to rotate the first seat in response to the crash detection system detecting the indication of the imminent crash. In some embodiments, the at least one motor is arranged and configured to rotate the first seat so that the first seat faces a center of the self-driving vehicle. In some embodiments, the at least one motor is arranged and configured to rotate the first seat so that the first seat faces away from a predicted location of the imminent crash.
  • Furthermore, in some embodiments, the at least one motor is arranged and configured to raise the first seat in response to the crash detection system detecting the indication of the imminent crash. In some embodiments, the at least one motor is arranged and configured to lower the first seat in response to the crash detection system detecting the indication of the imminent crash.
  • Even still, in some embodiments, the at least one motor is arranged and configured to move at least a portion of a head rest of the first seat in response to the crash detection system detecting the indication of the imminent crash. In some embodiments, the at least one motor is arranged and configured to raise at least the portion of the head rest of the first seat. In some embodiments, the at least one motor is arranged and configured to lower at least the portion of the head rest of the first seat.
  • Additionally, in some embodiments, the at least one motor is arranged and configured to adjust a firmness setting of at least a portion of the first seat in response to the crash detection system detecting the indication of the imminent crash. In some embodiments, the at least one motor is arranged and configured to adjust the firmness setting to be more firm than prior to detecting the indication of the imminent crash. In some embodiments, the at least one motor is arranged and configured to adjust the firmness setting to be less firm than prior to detecting the indication of the imminent crash.
  • In some embodiments, the vehicle also includes a first occupancy sensor coupled to the first seat, the first occupancy sensor configured to detect the rider sitting in the first seat. In some embodiments, the motor is arranged and configured to disable movement of the first seat in response to the first occupancy sensor detecting that no person is sitting in the first seat.
  • In some embodiments, the self-driving vehicle also includes a first seat belt sensor coupled to a first seat belt of the first seat, the first seat belt sensor configured to detect at least one of a buckled state of the first seat belt and an unbuckled state of the first seat belt. In some embodiments, the motor is arranged and configured to disable movement of the first seat in response to the first seat belt sensor detecting the unbuckled state of the first seat belt.
  • The first occupancy sensor may include at least one of a camera, an infrared camera, and a weight sensor. In some embodiments, the self-driving vehicle further includes a second seat coupled to the vehicle; and at least one motor coupled to the second seat and communicatively coupled to the crash detection system. The at least one motor of the second seat may be arranged and configured to move the second seat in response to the crash detection system detecting the indication of the imminent crash.
  • The disclosure also includes a self-driving vehicle including a first seat coupled to the vehicle; a crash detection system coupled to the vehicle; and at least one motor coupled to the first seat and communicatively coupled to the crash detection system. In some embodiments, the crash detection system comprises at least one of a camera, a radar, a lidar, and an accelerometer. The at least one motor may be arranged and configured to move the first seat in response to the crash detection system detecting an indication of an imminent crash.
  • In some embodiments, the at least one motor is arranged and configured to rotate the first seat in response to the crash detection system detecting the indication of an imminent crash. In some embodiments, the at least one motor is arranged and configured to rotate the first seat so that the first seat faces a center of the self-driving vehicle. In some embodiments, the at least one motor is arranged and configured to rotate the first seat so that the first seat faces away from a predicted location of the imminent crash. In some embodiments, the at least one motor is arranged and configured to raise the first seat in response to the crash detection system detecting the indication of the imminent crash. In some embodiments, the at least one motor is arranged and configured to lower the first seat in response to the crash detection system detecting the indication of the imminent crash. In some embodiments, the at least one motor is arranged and configured to move at least a portion of a head rest of the first seat in response to the crash detection system detecting the indication of the imminent crash. In some embodiments, the at least one motor is arranged and configured to raise at least the portion of the head rest of the first seat. In some embodiments, the at least one motor is arranged and configured to lower at least the portion of the head rest of the first seat.
  • In some embodiments, the at least one motor is arranged and configured to adjust a firmness setting of at least a portion of the first seat in response to the crash detection system detecting the indication of the imminent crash. In some embodiments, the at least one motor is arranged and configured to adjust the firmness setting to be more firm than prior to detecting the indication of the imminent crash. In some embodiments, the at least one motor is arranged and configured to adjust the firmness setting to be less firm than prior to detecting the indication of the imminent crash.
  • In some embodiments, the self-driving vehicle includes a first occupancy sensor coupled to the first seat, the first occupancy sensor configured to detect the rider sitting in the first seat.
  • Accordingly, in some embodiments, the motor is arranged and configured to disable movement of the first seat in response to the first occupancy sensor detecting that no rider is sitting in the first seat. The first occupancy sensor may comprise at least one of a camera, an infrared camera, and a weight sensor.
  • Even still, in some embodiments, the self-driving vehicle includes a first seat belt sensor coupled to a first seat belt of the first seat. The first seat belt sensor may be configured to detect at least one of a buckled state of the first seat belt and an unbuckled state of the first seat belt. In some embodiments, the motor is arranged and configured to disable movement of the first seat in response to the first seat belt sensor detecting the unbuckled state of the first seat belt.
  • In some embodiments, the self-driving vehicle further includes a second seat coupled to the vehicle; and at least one motor coupled to the second seat and communicatively coupled to the crash detection system. In some embodiments, the at least one motor of the second seat is arranged and configured to move the second seat in response to the crash detection system detecting the indication of the imminent crash.
  • The disclosure also includes a method of using a self-driving vehicle comprising detecting an indication of an imminent crash, by a crash detection system coupled to the vehicle. In some embodiments, the crash detection system comprises at least one of a camera, a radar, a lidar, and an accelerometer. In some embodiments, the method comprises, in response to detecting the indication of the imminent crash, moving a first seat coupled to the vehicle, by at least one motor coupled to the first seat and communicatively coupled to the crash detection system.
  • In some embodiments, the method includes rotating, by the at least one motor, the first seat in response to detecting the indication of the imminent crash. In some embodiments, the method includes determining, by the crash detection system, a predicted location of the imminent crash with respect to the vehicle. Additionally, in some embodiments, the method includes rotating, by the at least one motor, the first seat so that the first seat faces away from the predicted location of the imminent crash in response to detecting the indication of the imminent crash.
  • In some embodiments, the method includes raising, by the at least one motor, the first seat in response to the crash detection system detecting the indication of the imminent crash. Likewise, in some embodiments, the method includes lowering, by the at least one motor, the first seat in response to the crash detection system detecting the indication of the imminent crash.
  • In order to provide additional support to the rider, in some embodiments, the method includes moving at least a portion of a head rest of the first seat, by the at least one motor, in response to the crash detection system detecting the indication of the imminent crash. In some embodiments, the method includes raising at least the portion of the head rest, by the at least one motor, in response to the crash detection system detecting the indication of the imminent crash. In some embodiments, the method includes lowering at least the portion of the head rest, by the at least one motor, in response to the crash detection system detecting the indication of the imminent crash.
  • Furthermore, in some embodiments, the method includes adjusting, by the at least one motor, a firmness of at least a portion of the first seat in response to the crash detection system detecting the indication of the imminent crash. In some embodiments, the method includes increasing, by the at least one motor, the firmness of at least the portion of the first seat in response to the crash detection system detecting the indication of the imminent crash. In some embodiments, the method includes decreasing, by the at least one motor, the firmness of at least the portion of the first seat in response to the crash detection system detecting the indication of the imminent crash.
  • In some embodiments, the method includes detecting, by a first occupancy sensor coupled to the first seat, whether a rider is sitting in the first seat. In some embodiments, the method includes disabling, by the crash detection system, movement of the first seat in response to detecting that no rider is sitting in the first seat. In some embodiments, the method includes enabling, by the crash detection system, movement of the first seat in response to detecting that the rider is sitting in the first seat. In some embodiments, the method includes moving the first seat in response to detecting that the rider is sitting in the first seat.
  • Additionally, in some embodiments, the method includes detecting, by a first seat belt sensor coupled to a first seat belt, at least one of a buckled state of the first seat belt and an unbuckled state of the first seat belt. In some embodiments, the method includes disabling, by the crash detection system, movement of the first seat in response to detecting the unbuckled state of the first seat belt. In some embodiments, the method includes enabling, by the crash detection system, movement of the first seat in response to detecting the buckled state of the first seat belt.
  • In some embodiments, the method includes moving the first seat in response to detecting the buckled state of the first seat belt.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • These and other features, aspects, and advantages are described below with reference to the drawings, which are intended to illustrate, but not to limit, the invention. In the drawings, like reference characters denote corresponding features consistently throughout similar embodiments.
  • FIG. 1 illustrates a perspective view of a self-driving vehicle, according to some embodiments.
  • FIG. 2 illustrates a perspective view of a top side, a front side and a passenger side of a detection system, according to some embodiments.
  • FIG. 3 illustrates a perspective view of the top side, a back side and a driver side of the detection system, according to some embodiments.
  • FIG. 4 illustrates a diagrammatic view of portions of a self-driving vehicle, according to some embodiments.
  • FIG. 5 illustrates a diagrammatic view of portions of a system, according to some embodiments.
  • FIG. 6 illustrates a diagrammatic view of a remote computing device, according to some embodiments.
  • FIG. 7 illustrates a side view of a seat, according to some embodiments.
  • FIG. 8 illustrates a perspective view of a camera, according to some embodiments.
  • FIG. 9 illustrates a perspective view of a seat belt, according to some embodiments.
  • FIG. 10 illustrates a perspective view of portions of a seat belt, according to some embodiments.
  • FIG. 11 illustrates a perspective view of a self-driving vehicle driving on a road, according to some embodiments.
  • FIG. 12 illustrates a diagrammatic view of a system, according to some embodiments.
  • FIGS. 13a-13c illustrate diagrammatic views of a system, according to some embodiments.
  • FIG. 14 illustrates a side view of a seat, according to some embodiments.
  • DETAILED DESCRIPTION
  • Although certain embodiments and examples are disclosed below, the inventive subject matter extends beyond the specifically disclosed embodiments to other alternative embodiments and/or uses, and to modifications and equivalents thereof. Thus, the scope of the claims appended hereto is not limited by any of the particular embodiments described below. For example, in any method or process disclosed herein, the acts or operations of the method or process may be performed in any suitable sequence and are not necessarily limited to any particular disclosed sequence. Various operations may be described as multiple discrete operations in turn, in a manner that may be helpful in understanding certain embodiments; however, the order of description should not be construed to imply that these operations are order dependent. Additionally, the structures, systems, and/or devices described herein may be embodied as integrated components or as separate components.
  • For purposes of comparing various embodiments, certain aspects and advantages of these embodiments are described. Not necessarily all such aspects or advantages are achieved by any particular embodiment. Thus, for example, various embodiments may be carried out in a manner that achieves or optimizes one advantage or group of advantages as taught herein without necessarily achieving other aspects or advantages as may also be taught or suggested herein.
  • Self-driving vehicles will save tens of thousands of lives per year. The majority of vehicle-related deaths are caused by driver errors. Tests have shown that self-driving vehicles nearly eliminate self-inflicted accidents (although they are not immune to accidents caused by human drivers of other vehicles).
  • Self-driving vehicles typically have unlimited attention spans and can process complex sensor data nearly instantaneously. (Waymo LLC and Tesla Motors Inc. have built self-driving vehicles.) The ability of self-driving vehicles to save lives is so impressive that society has a moral imperative to develop self-driving technology such that it can be widely adopted.
  • Although self-driving vehicles will unlock many safety benefits, there are several barriers to rapid adoption of self-driving vehicles. Some of the embodiments described herein overcome several of these barriers.
  • Self-driving vehicles are sometimes referred to as autonomous cars, autonomous vehicles, driverless cars, and driverless vehicles. Various levels of “self-driving” behaviors are available to sense surrounding environments and navigate appropriately (e.g., without hitting objects, in a time-efficient manner). Levels of self-driving vehicles comprise Level 1 (Driver Assistance), Level 2 (Partial Automation), Level 3 (Conditional Automation), Level 4 (High Automation), and Level 5 (Full Automation). Of course, other levels and distinctions are possible. The National Highway Traffic Safety Administration has outlined various levels of self-driving vehicle automation based on information from the Society of Automotive Engineers.
  • Some embodiments can be used with self-driving vehicles. Embodiments, however, are not limited to self-driving vehicles and can be used with non-self-driving vehicles.
  • As used herein, “location” is used broadly and is not limited to a street address. A location can be a Global Positioning System (“GPS”) location and can be any other location indicator. A location can be an outdoor location. A location can be an indoor location (e.g., a location inside a large shopping center, an apartment complex or other building).
  • Some embodiments use iBeacon hardware to enable tracking remote computing devices indoors. iBeacon is a protocol developed by Apple Inc. Several embodiments use radio transceivers (such as Bluetooth transceivers) to enable tracking remote computing devices indoors.
  • Some embodiments use Global Positioning System (“GPS”) hardware to determine an outdoor location of a remote computing device or vehicle. GPS can include the system of satellites put into orbit and maintained by the U.S. Department of Defense, Russia's GLONASS satellite system, assisted GPS systems, and/or any satellite system used to provide location data.
  • In some embodiments, each system comprises at least one processor and a memory comprising program instructions that when executed by the at least one processor cause the system to perform any of the method steps described herein and/or incorporated by reference.
  • FIG. 1 illustrates a perspective view of a self-driving vehicle 5. The self-driving vehicle 5 can include a detection system 7 configured to detect objects (e.g., cars, pedestrians, other vehicles, buildings, fire hydrants, trees, lane markers, guardrails, roadway barriers, sidewalks, roadway signs, traffic lights) located around the self-driving vehicle 5. Various sensors of the detection system 7 can sense objects even closer than an inch away (e.g., by using ultrasonic sensors 73) and even farther away than 100 yards (e.g., using long-range radar).
  • FIG. 2 illustrates a perspective view of the top side, the front side and the passenger side of the detection system 7. FIG. 3 illustrates a perspective view of the top side, the back side and the driver side of the detection system 7. FIG. 4 illustrates a diagrammatic view of portions of a self-driving vehicle 5, according to some embodiments.
  • The detection system 7 can comprise radar 8, lidar 9, ultrasonic sensors 73, cameras 11, and any other sensing devices configured to enable the vehicle 5 to detect objects.
  • The self-driving vehicle 5 illustrated in FIGS. 1-4 includes a detection system 7 mounted to the roof of the self-driving vehicle 5. In some embodiments, however, the components of the detection system 7 are mounted on different areas of the self-driving vehicle 5. For example, the ultrasonic sensors 73 can be mounted on the bumpers of the self-driving vehicle 5. The short range of the ultrasonic sensors 73 can make bumper mounting helpful (because the bumper is often closer to the objects being sensed). The cameras 11 can be mounted just behind the windshield (e.g., in the rearview mirror) and just behind other windows. The radars 8 can be mounted near each of the four corners of the self-driving vehicle 5. In the illustrated embodiment, however, the detection system 7 can be contained in one assembly to simplify the integration of the detection system 7 into a vehicle.
  • The detection system 7 can use cameras 11 mounted around a perimeter (e.g., around a perimeter of the vehicle 5 or around a perimeter of a housing of the detection system 7). As illustrated in FIGS. 1-4, the cameras 11 face forward, backward, left, and right to provide (collectively) a 360-degree view around the vehicle 5. The cameras 11 can be high-resolution cameras covered by a glass window to protect each camera 11 from water and dirt.
  • Cameras 11 can be configured to see lane markers on a road. Using cameras 11 to see painted lane markers can be helpful (because painted lane markers sometimes lack enough three dimensional nature to be detected by some other sensors). In addition, cameras 11 can see color differences (e.g., the difference between the color of the asphalt and the color of yellow or white paint of a lane marker). Cameras 11 can see the color of traffic lights (e.g., red, yellow, green).
  • Cameras 11 sometimes have trouble seeing in situations where the human eye would have trouble seeing (e.g., in fog or rain).
  • Radars 8 can be very helpful in fog and rain. An object that is not detected by cameras 11 (e.g., due to fog or rain) can be detected by radar 8. Radars 8 can detect the speed of other vehicles and the distance to other vehicles. Radars 8 can also detect objects that are far away.
  • Radar is an object-detection system that uses radio waves to determine the range, angle, or velocity of objects. A radar can comprise a transmitter producing electromagnetic waves in the radio or microwave domain, a transmitting antenna, a receiving antenna (which can be the same antenna as the transmitting antenna), a receiver, and/or a processor to determine properties of the objects detected by the radar.
  • Lidar uses light to detect objects. A lidar 9 can be located on a top portion of the detection system 7 to provide a 360-degree view of the area around the self-driving vehicle 5. The lidar 9 can tell the difference between an actual person and a billboard that includes a picture of a person (due to the three dimensional nature of the actual person and the two dimensional nature of the picture of a person).
  • The lidar 9 can accurately sense the three dimensional nature of the world around the self-driving vehicle 5. The lidar 9 can also measure the distance to objects. Measuring distance can enable the self-driving vehicle 5 to know, for example, if an approaching car is 5 meters away (so there is not enough time to turn in front of the car) or 25 meters away (so there may be enough time to turn in front of the car).
  • In some embodiments, the lidar 9 is a Velodyne VLS-128 made by Velodyne LiDAR, Inc. having an office in San Jose, Calif. The Velodyne VLS-128 can provide real-time, three-dimensional data with up to 0.1 degree vertical and horizontal resolution, a range of up to 300 meters, and a 360-degree surround view. The VLS-128 can provide the range, resolution and accuracy required by some of the most advanced autonomous vehicle programs in the world.
  • Many types of lidars can be used. Some embodiments use “incoherent” or direct energy detection (which principally measures amplitude changes of the reflected light). Some embodiments use coherent detection (which in some cases can be well suited for measuring Doppler shifts, or changes in phase of the reflected light). Coherent systems can use optical heterodyne detection.
  • Lidar can use pulse models. Some lidar embodiments use micropulse or high energy systems. Micropulse systems can use intermittent bursts of energy. Some lidar embodiments use high-power systems.
  • Lidar can comprise lasers. Some embodiments include solid-state lasers. Some embodiments include flash lidar. Some embodiments include electromechanical lidar. Some embodiments include phased arrays to illuminate any direction by using a microscopic array of individual antennas. Some embodiments include mirrors (e.g., micro electromechanical mirrors).
  • Some embodiments include dual oscillating plane mirrors, a polygon mirror and/or a scanner (e.g., a dual-axis scanner).
  • Lidar embodiments can include photodetector and receiver electronics. Any suitable type of photodetector can be used. Some embodiments include solid-state photodetectors (e.g., silicon avalanche photodiodes) and/or photomultipliers.
  • The motion of the vehicle 5 can be compensated for to accurately determine the location, speed, and direction of objects (such as other vehicles) located outside the vehicle 5. For example, if a vehicle 5a is heading west at 35 miles per hour and a second vehicle is heading east at an unknown speed, a detection system 7a of the vehicle 5a can remove the contribution of the 35 miles per hour when determining the speed of the second vehicle.
  • In some embodiments, motion of the vehicle 5 is compensated for by using position and navigation systems to determine the absolute position, speed, and orientation of the lidar, camera, radar, or other object sensing system. A Global Positioning System (“GPS”) receiver and/or an Inertial Measurement Unit (“IMU”) can be used to determine the absolute position and orientation of the object sensing system.
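The ego-motion compensation described above can be sketched as adding the sensing vehicle's own velocity back onto a sensor-frame (relative) measurement. This is a minimal planar illustration, not the disclosed implementation; the function name and east/north convention are assumptions:

```python
def compensate_ego_motion(relative_velocity, ego_velocity):
    """Recover a target's ground-frame velocity from a sensor-frame
    (relative) measurement by adding back the ego vehicle's velocity.

    Velocities are (east, north) component pairs in miles per hour.
    """
    return (relative_velocity[0] + ego_velocity[0],
            relative_velocity[1] + ego_velocity[1])

# Ego vehicle heading west at 35 mph (east-positive convention).
ego = (-35.0, 0.0)
# The sensor measures the second vehicle closing eastward at 95 mph
# relative to the ego vehicle.
relative = (95.0, 0.0)
# Ground-frame result: the second vehicle is actually doing 60 mph east.
target = compensate_ego_motion(relative, ego)
```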
  • Lidar can use active sensors that supply their own illumination source. The energy can hit objects. The reflected energy can be detected and measured by sensors. Distance to the object can be determined by recording the time between transmitted and backscattered pulses and by using the speed of light to calculate the distance traveled. Scanning can be used to create a three dimensional image or map of the area around the vehicle 5.
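The time-of-flight ranging principle in the paragraph above reduces to a one-line formula: the pulse travels to the object and back, so the one-way distance is the speed of light times the round-trip time, divided by two. A minimal sketch (function name assumed, not from the disclosure):

```python
SPEED_OF_LIGHT_M_PER_S = 299_792_458

def lidar_range_m(round_trip_time_s):
    """Distance to a reflecting object from a lidar pulse's round-trip
    time: one-way distance is half the total path the light traveled."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_time_s / 2.0

# A pulse returning after roughly 667 nanoseconds corresponds to an
# object about 100 meters away.
d = lidar_range_m(667e-9)
```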
  • Embodiments can use a short-range lidar to give the self-driving vehicle 5 a surround view near the self-driving vehicle 5 (to see objects close to the self-driving vehicle 5) and can use a long-range lidar configured to not only detect objects located far from the self-driving vehicle 5, but also to enable zooming into objects that are over 200 meters away. The long-range lidar can be very helpful in high-speed highway situations.
  • Lidar uses light to detect a distance to an object, a direction to the object, and/or a location of an object. Lidar can use pulsed laser light emitted by a laser.
  • The light can reflect off objects around the vehicle. These reflections can be detected by a sensor of the lidar. Measuring how long the light takes to return to the sensor and measuring the wavelengths of the reflected light can enable making a three-dimensional model of the object being sensed and of the entire area around the vehicle 5.
  • FIG. 4 illustrates a diagrammatic view of portions of a self-driving vehicle 5, according to some embodiments. The self-driving vehicle 5 can include a vehicle navigation system 14, a communication system 16 that has a transmitter 18 and a receiver 17, a computer system 19 that has a processor 26, a memory 20 that has program instructions 27 and map information 28, a traffic monitor 23, and a drive-by-wire system 24. In some embodiments, at least some of these items are part of the detection system 7.
  • The vehicle navigation system 14 can be configured to enable the vehicle 5 to follow a driving route. The vehicle navigation system 14 can direct the vehicle toward a pick-up location.
  • The communication system 16 can be configured to communicate with a vehicle management system. The communication system 16 can be configured to communicate with a remote computing device 12 of a rider. The communication system 16 can use an antenna 13 to communicate with other vehicles and other devices (such as a vehicle management system and remote computing devices) via intermediary communication systems 15.
  • Intermediary communication systems 15 can comprise wireless networks, Wi-Fi routers, Bluetooth systems, cellular networks, telephone networks, Internet systems, servers, cloud computing, remotely located computers, satellite systems, communication systems, and any other suitable means of enabling communication between the various components of embodiments described herein and/or incorporated by reference.
  • The drive-by-wire system 24 can be a computer-regulated system for controlling the engine, accelerating, braking, steering, signaling, handling, suspension, and/or other functions related to autonomously driving the vehicle 5.
  • In some embodiments, at least portions of a vehicle management system are located far away from vehicles 5, 5a, 5b, 5c. The vehicle management system can include software that is run on servers. The servers can communicate with vehicles 5, 5a, 5b, 5c via intermediary communication systems 15.
  • In some embodiments, portions of the vehicle management system are located in one or more vehicles 5, 5a, 5b, 5c and portions of the vehicle management system are located far away from the one or more vehicles 5, 5a, 5b, 5c.
  • FIG. 5 illustrates a diagrammatic view of portions of a vehicle management system, according to some embodiments. FIG. 5 illustrates many optional items. Not all the items illustrated in FIG. 5 are necessarily part of each vehicle management system.
  • A vehicle management system can comprise a location tracking system 30 configured to track locations of vehicles 5, 5a, 5b, 5c and also configured to track locations of vehicles 48 that have been identified as potentially impaired (according to indications collected by the vehicles 5, 5a, 5b, 5c).
  • The location tracking system 30 can receive GPS location data from the vehicles 5, 5 a, 5 b, 5 c, which send their GPS location data to the location tracking system 30 via intermediary communication systems 15.
  • The vehicles 5, 5 a, 5 b, 5 c can receive radio communications from GPS satellites. These radio communications can include information configured to enable the vehicles 5, 5 a, 5 b, 5 c to calculate their position at any time.
  • Receiving radio communications (with position data) from three or more GPS satellites can provide data to enable each vehicle and each remote computing device to calculate its own position. Then each vehicle and each remote computing device can send its position data to a vehicle management system (e.g., via intermediary communication systems 15).
  • Each device can receive radio signals broadcast from GPS satellites. Then, the device can calculate how far away it is from the broadcasting satellite by determining how long the radio signal (traveling at light speed) took to arrive at the device. Trilateration (based on data from at least three GPS satellites) enables the device to know where it is located. The device can then send its location to the vehicle management system. A location tracking system can receive the location data from the vehicle management system, from the device, and/or from any other system.
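The distance and trilateration steps described above can be sketched in Python. This is a simplified two-dimensional illustration (a real GPS receiver solves in three dimensions and uses a fourth satellite to correct its own clock error); the function names are illustrative, not taken from the patent.

```python
# Simplified 2-D trilateration sketch. Real GPS receivers solve in 3-D
# and use a fourth satellite to correct the receiver's clock error.

SPEED_OF_LIGHT_M_S = 299_792_458.0

def distance_from_travel_time(seconds: float) -> float:
    # The radio signal travels at light speed, so distance = c * time.
    return SPEED_OF_LIGHT_M_S * seconds

def trilaterate(p1, r1, p2, r2, p3, r3):
    """Locate a receiver from three known points and measured distances."""
    x1, y1 = p1
    x2, y2 = p2
    x3, y3 = p3
    # Subtracting the circle equations pairwise yields two linear
    # equations in the unknown receiver position (x, y).
    a = 2 * (x2 - x1)
    b = 2 * (y2 - y1)
    c = r1**2 - r2**2 - x1**2 + x2**2 - y1**2 + y2**2
    d = 2 * (x3 - x2)
    e = 2 * (y3 - y2)
    f = r2**2 - r3**2 - x2**2 + x3**2 - y2**2 + y3**2
    x = (c * e - f * b) / (e * a - b * d)
    y = (c * d - a * f) / (b * d - a * e)
    return x, y
```

For a receiver at (1, 1) and satellites at (0, 0), (4, 0), and (0, 4), the measured distances recover the receiver's position exactly.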
  • The location tracking system 30 can comprise a computer configured to receive locations of vehicles and remote computing devices. The location tracking system 30 can comprise a processor 35 and a memory 31 comprising program instructions 32 configured such that when executed by the processor 35 the program instructions 32 cause the location tracking system 30 to monitor locations of vehicles and remote computing devices.
  • A vehicle management system can comprise a computer system 34 that includes one or more computers of any suitable type. Each computer can include a processor 35 and a memory 31 comprising program instructions 32 configured such that when executed by the processor 35 the program instructions 32 cause the vehicle management system to perform the methods described herein. The computer system 34 can comprise a database 64 having information.
  • The vehicle management system can comprise map information 37 (including street information, preferred pick-up locations, and preferred drop-off locations) and a traffic monitor 38 configured to receive traffic information from third parties (e.g., Google Maps).
  • The vehicle management system can comprise a communication system 39 having a transmitter 40, a receiver 41, and an antenna 42. The communication system 39 can be configured to communicate with the vehicles 5, 5 a, 5 b, 5 c. In some embodiments, the communication system 39 communicates with the vehicles 5, 5 a, 5 b, 5 c via intermediary communication systems 15. The antenna 42 can be communicatively coupled to the antenna 13 shown in FIG. 4.
  • The antenna 42 can send wireless communications to vehicles 5 a, 5 b, 5 c via intermediary communication systems 15 (e.g., as indicated by arrows 44, 45, 46, 47). The antenna 42 can receive wireless communications from vehicles 5 a, 5 b, 5 c via intermediary communication systems 15 (e.g., as indicated by arrows 44, 45, 46, 47).
  • The antenna 42 can be communicatively coupled (e.g., via intermediary communication systems 15) with self-driving vehicles 5, 5 a, 5 b, 5 c that can include a vehicle navigation system 14, a communication system 16 that has a transmitter 18 and a receiver 17, a computer system 19 that has a processor 26, a memory 20 that has program instructions 27 and map information 28, a traffic monitor 23, and a drive-by-wire system 24 (as illustrated in FIG. 4).
  • Communicative coupling may be via continuous communications or intermittent communications. Intermittent communications can be via periodic communications (e.g., every 1 second, every 60 seconds, every 10 minutes). As used herein, “periodically” does not imply that every period has the same duration. In some embodiments, the communicative coupling is via intermediary communication systems 15.
  • Each self-driving vehicle 5 a, 5 b, 5 c can include all of the items described in the context of vehicle 5.
  • Vehicle 5 a includes a detection system 7 a that can include all of the items described in the context of detection system 7. Vehicle 5 b includes a detection system 7 b that can include all of the items described in the context of detection system 7. Vehicle 5 c includes a detection system 7 c that can include all of the items described in the context of detection system 7.
  • FIG. 6 illustrates a diagrammatic view of a remote computing device 12. Many different types of remote computing devices can be used with the embodiments described herein and/or incorporated by reference. Some remote computing devices do not include all the parts illustrated in FIG. 6. Some remote computing devices include parts not illustrated in FIG. 6.
  • A remote computing device can be a smartphone, a smartwatch, a tablet computer, a laptop computer, a desktop computer, a server, augmented reality glasses, an implanted computer, and/or any type of computer. A rider can bring her remote computing device into the self-driving vehicle, use her remote computing device in the self-driving vehicle, and leave the self-driving vehicle with her remote computing device. In some embodiments, the rider requests a ride at her home with a remote computing device, but then leaves the remote computing device at home when she goes to get a ride from the self-driving vehicle.
  • The remote computing device 12 can comprise an accelerometer 74, a barometer 75 (which can include an altimeter), a gyroscope 76, a WiFi tracker 77, a compass 78, a location tracking system 79, a memory 80, a computer system 82 having a processor 83, a database 84 and/or a communication system 86. The communication system can include a transmitter 87, a receiver 88, and/or an antenna 89. The remote computing device 12 can comprise a display screen 90 configured to display images to a rider. The remote computing device 12 can comprise a speaker configured to emit sounds to the rider. The remote computing device 12 can comprise a microphone configured to record sounds from the rider.
  • The remote computing device 12 can comprise map information 85 (including street information, preferred pick-up locations, and preferred drop-off locations). The map information 85 can be based on information from third parties (e.g., Google Maps).
  • The remote computing device 12 can comprise program instructions 81 configured to be executed by the processor 83.
  • Self-driving vehicles can include cars, vans, trucks, trains, boats, buses, scooters, motorcycles, helicopters, quadcopters, flying machines, air taxis, planes, and any motorized vehicle configured to transport a person.
  • A person (i.e., a rider) can enter (and/or ride on) a first self-driving vehicle 5 a or a second self-driving vehicle 5 b. Whichever vehicle the rider enters can transport the rider to a drop-off location.
  • In some cases, the first self-driving vehicle 5 a transports the rider to the drop-off location and the second self-driving vehicle 5 b picks up the rider. A self-driving vehicle fleet can comprise one vehicle, two vehicles, three vehicles or more vehicles. In some embodiments, a self-driving vehicle fleet comprises hundreds or even thousands of vehicles. The vehicles can coordinate to provide efficient transportation to the rider (and/or to many riders).
  • The first self-driving vehicle 5 a can receive communications (e.g., radio signals) from positioning systems (which in some embodiments are GPS satellites). The second self-driving vehicle 5 b can receive communications (e.g., radio signals) from positioning systems (which in some embodiments are GPS satellites).
  • Positioning systems (e.g., GPS satellites) can send communications (e.g., radio signals) to the first self-driving vehicle 5 a. The first self-driving vehicle 5 a can use these communications to determine positions of the first self-driving vehicle 5 a at various times (e.g., when the first self-driving vehicle 5 a drops off the rider at a drop-off location).
  • The first self-driving vehicle 5 a can send communications (which can include GPS coordinates of the first self-driving vehicle 5 a) to an antenna 42 via intermediary communication systems 15. Intermediary communication systems 15 can send communications (which can include GPS coordinates of the first self-driving vehicle 5 a) to the antenna 42.
  • Receiving radio communications (with position data) from three or more GPS satellites can provide data to enable each vehicle 5 a, 5 b, 5 c and each remote computing device 12 (shown in FIG. 6) to calculate its own position. Then each vehicle 5 a, 5 b, 5 c and each remote computing device 12 can send its position data to a vehicle management system (e.g., via intermediary communication systems).
  • In some cases, a rider uses a remote computing device 12 to request a ride from the vehicle management system. Then, the vehicle management system prompts a self-driving vehicle 5 to drive to a pick-up location selected by the rider (e.g., via an “app” on the remote computing device 12).
  • A seat belt is a restraining device configured to hold a rider in a seat of a vehicle during a collision. Many different types of seat belts can be used with the embodiments described herein and/or incorporated by reference.
  • Seat belt embodiments can use two-point seat belts, lap seat belts, shoulder seat belts, sash seat belts, three-point seat belts, four-point seat belts, five-point seat belts, six-point seat belts, eight-point seat belts, and any other type of restraining device configured to hold a rider in a seat of a vehicle during a collision.
  • FIG. 7 illustrates a side view of a first seat 51. A camera device 10 is also shown in FIG. 7. The camera device 10 can be configured to detect if a rider is sitting in the first seat 51. An optional shoulder strap is hidden in FIG. 7.
  • FIG. 9 illustrates a seat belt 53 having anchor points 100 a, 100 b, 100 c, 100 d that can be bolted to a frame of a self-driving vehicle 5. The seat belt 53 can comprise a strap 105. The strap 105 can include webbing. The strap 105 can be flexible. The strap 105 can be rigid. Many other types of seat belts can be used.
  • The seat belt 53 can include a retractor 107 that is spring-loaded to apply a force to the strap 105 that wraps at least a portion of the seat belt 53 around a rotating portion of the retractor 107. Pulling the seat belt tongue 106 toward the buckle 108 can create a triangular shape that forms a shoulder belt and a lap belt as the strap 105 slides through the pillar loop 104.
  • FIG. 10 illustrates portions of the seat belt 53. Inserting the tongue 106 (which can be metal) into an opening 109 of the buckle 108 can “buckle” the seat belt 53.
  • A seat belt sensor 55 is configured to detect at least one of a buckled state of the first seat belt 53 and an unbuckled state of the first seat belt 53. In some embodiments, a seat belt sensor detects if a strap 105 is coupled to an anchor such that the seat belt is arranged to restrain a rider in the event of a collision. In some embodiments, a seat belt sensor detects if there is strain on the strap 105 (e.g., due to one end of the strap 105 being coupled to an anchor and another end of the strap 105 being pulled by a retractor 107) such that the seat belt is arranged to restrain a rider in the event of a collision. Many types of seat belt sensors can be used with the embodiments described herein and/or incorporated by reference.
  • U.S. Pat. No. 7,093,515 teaches a seat belt sensor to detect whether a seat belt is in a buckled state: “The buckle switch 17 is turned off when the seat belt 14 is fastened by inserting and thereby hooking a tongue 15 to a buckle 16. The buckle switch 17 is turned on when the tongue 15 is not inserted into and hooked on the buckle 16 and thus the seat belt 14 is unfastened.” See FIG. 4 of U.S. Pat. No. 7,093,515. The entire contents of U.S. Pat. No. 7,093,515 are incorporated by reference herein.
  • U.S. Pat. Nos. 5,960,523; 6,554,318; and 5,871,063 teach about seat belt sensors. The entire contents of U.S. Pat. Nos. 5,960,523; 6,554,318; and 5,871,063 are incorporated by reference herein.
  • U.S. Pat. Nos. 5,965,827; 5,996,421; 5,765,774; 6,205,868; and 6,357,091 teach about seat belt sensors. The entire contents of U.S. Pat. Nos. 5,965,827; 5,996,421; 5,765,774; 6,205,868; and 6,357,091 are incorporated by reference herein.
  • In some embodiments, a seat belt sensor 55 comprises a switch that is triggered by inserting the tongue 106 into the opening 109 of the buckle 108. The switch can be a mechanical switch.
  • The switch can be an electrical component configured to, for example, complete an electrical current in response to inserting the tongue 106 into the opening 109 of the buckle 108. Many different types of electrical switches can be used. Some embodiments use a reed switch.
  • Some embodiments use a light switch configured such that inserting the tongue 106 into the opening 109 of the buckle 108 blocks light emitted from a light source (located inside the buckle 108) from reaching a light sensor (located inside the buckle 108). If the buckle sensor does not detect the light from the light source, then the system determines that the seat belt 53 is “buckled.”
  • In some embodiments, the tongue 106 comprises a signal emitting portion and the buckle 108 comprises a signal receiving portion. If the buckle sensor does not detect the signal, then the system determines that the seat belt 53 is “unbuckled.”
  • In some embodiments, the tongue 106 comprises a signal receiving portion and the buckle 108 comprises a signal emitting portion. If the buckle sensor does not detect the signal, then the system determines that the seat belt 53 is “unbuckled.”
  • In some embodiments, the buckle sensor 55 is a strain gauge coupled to the strap 105 such that when the seat belt 53 is “buckled,” the resulting strain is sensed by the buckle sensor 55 (such that the system can determine that the seat belt 53 is “buckled”).
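The sensor variants above all reduce to the same decision: inferring "buckled" from a physical signal. A minimal sketch of the light-switch and strain-gauge cases follows; the strain threshold value is an illustrative assumption, not from the patent.

```python
def buckled_from_light_sensor(light_detected: bool) -> bool:
    # The inserted tongue blocks light from the source inside the buckle,
    # so NOT detecting light means the seat belt is buckled.
    return not light_detected

def buckled_from_strain_gauge(strain: float, threshold: float = 0.5) -> bool:
    # Hypothetical threshold: strain on the strap above this value
    # indicates the belt is anchored and under retractor tension.
    return strain >= threshold
```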
  • A seat belt sensor can be configured to detect at least one of a buckled state of a seat belt and an unbuckled state of the seat belt. As used herein, "buckled state" is used broadly to mean that the seat belt is arranged to help secure a rider if a collision occurs.
  • A seat belt may be extendable (e.g., the retractor 107 may allow additional strap length to unroll) but still may be arranged to help secure the rider if a collision occurs. For example, a seat belt arranged across a lap of a rider, across a chest and shoulder area of a rider, and/or across a frontside (e.g., of an upper body) of the rider is arranged to help secure a rider if a collision occurs.
  • As used herein, “unbuckled state” is used broadly to mean that the seat belt is not arranged to help secure a rider if a collision occurs. For example, if the seat belt is simply located at the side of the rider and the tongue 106 is not coupled to the buckle 108, then the seat belt is not arranged to help secure the rider if a collision occurs.
  • Some seat belts do not have a buckle but can still be placed in a “buckled state” if the seat belt is arranged to help secure a rider if a collision occurs. A seat belt (even without a buckle) can be positioned across a lap of a rider such that the seat belt is arranged to help secure a rider if a collision occurs.
  • Embodiments can use the terminology “uncoupled state” for a seat belt that is not arranged to help secure a rider if a collision occurs and “coupled state” for a seat belt that is arranged to help secure a rider if a collision occurs.
  • Embodiments can use the terminology “unfastened state” for a seat belt that is not arranged to help secure a rider if a collision occurs. Embodiments can use the terminology “fastened state” for a seat belt that is arranged to help secure a rider if a collision occurs.
  • Motorized seat belts can be placed in a buckled state via a motor that moves the seat belt into a position to help secure a rider if a collision occurs. The seat belt can begin in an unbuckled state that enables the rider to enter the vehicle and sit in the seat. Then, the motor can move the seat belt (e.g., along a track coupled to a frame of the vehicle) into a buckled state such that the seat belt is in a position to help secure the rider if a collision occurs. In some motorized seat belts, actuating a buckle is not necessary to transition from an unbuckled state to a buckled state because movement of an end of the seat belt along the track (rather than actuating the buckle) places the seat belt in a position to help secure a rider if a collision occurs. The entire contents of U.S. Pat. No. 4,995,640 are incorporated by reference herein.
  • In some embodiments, a seat-belt monitoring system comprises a first occupancy sensor 57 configured to detect the rider sitting in the first seat 51. Occupancy sensor embodiments can use any type of sensor that enables the seat-belt monitoring system to detect whether a rider is located in a seat. Many types of sensors can be used.
  • In some embodiments, an occupancy sensor comprises a camera (e.g., 10, 10 a, 10 b, 11 a, 11 b, 24 a, 24 b, 24 c) configured to take a picture of the rider. A computer system (e.g., 34, 19, 19 a, 82) can comprise program instructions (e.g., 32, 27, 27 a, 81) configured to visually analyze the picture to determine if the picture shows a rider (a person) sitting in a seat (e.g., 51, 65, 52, 67). For example, Amazon Web Services, Inc. provides an application programming interface (“API”) called “Amazon Rekognition” to automatically recognize people and objects in pictures. A communication system 16 of a self-driving vehicle 5 a can send the picture to the API for analysis. The API can then tell a computer system (e.g., 34, 19, 82) if the picture shows a rider located in the seat of the vehicle. The API can also tell a computer system (e.g., 34, 19, 19 a, 82) if the picture shows the seat belt in a buckled state or in an unbuckled state. Thus, the API can enable a camera-based system to serve as both an occupancy sensor (to determine if a person is sitting in the seat) and as a seat belt sensor configured to detect a buckled state of the seat belt and an unbuckled state of the seat belt.
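As a sketch of how such an API response could drive the occupancy decision: Amazon Rekognition's `detect_labels` call returns a list of labels, each with a `Name` and a `Confidence` score. The helper below and its confidence threshold are illustrative assumptions, not part of the patent.

```python
def seat_occupied_from_labels(labels, min_confidence=80.0):
    # labels: entries shaped like Rekognition's detect_labels response,
    # e.g. {"Name": "Person", "Confidence": 99.1}.
    # Returns True if a person is detected with sufficient confidence.
    return any(
        label["Name"] == "Person" and label["Confidence"] >= min_confidence
        for label in labels
    )
```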
  • In some embodiments, an occupancy sensor comprises a camera. U.S. Pat. No. 7,505,841 includes an occupancy sensor that comprises a camera. The entire contents of U.S. Pat. No. 7,505,841 are incorporated by reference herein. U.S. Pat. No. 7,415,126 includes an occupancy sensor that comprises a camera. The entire contents of U.S. Pat. No. 7,415,126 are incorporated by reference herein.
  • In some embodiments, an occupancy sensor comprises a pressure sensor configured to detect whether a rider is sitting in a seat of a vehicle. The pressure sensor can collect data indicative of a rider's weight. The rider likely has her feet on the floor, so the weight data may not reflect the rider's entire weight. Even so, the weight data can help identify the rider. For example, a first rider may have a weight of 120 pounds with 30 pounds resting on the floor (due to the first rider's feet being on the floor) and 90 pounds resting on a first seat. The distribution between the first rider's weight on the floor and weight on the first seat may vary (e.g., by plus or minus fifteen percent) depending on how the first rider is sitting on the first seat, but generally is fairly consistent.
  • A second rider may have a weight of 200 pounds with 50 pounds resting on the floor (due to the second rider's feet being on the floor) and 150 pounds resting on a second seat. The distribution between the second rider's weight on the floor and weight on the second seat may vary (e.g., by plus or minus fifteen percent) depending on how the second rider is sitting on the second seat, but generally is fairly consistent.
  • Program instructions can be configured to use the weight on the seat (and/or on the floor) to help identify which rider is sitting in a particular seat. For example, the first rider may travel in four vehicles on four different days with recorded weights (e.g., in seats of each vehicle) of 90+/−8 pounds, 92+/−7 pounds, 91+/−8 pounds, and 89+/−7 pounds. The second rider may travel in four vehicles on four different days with recorded weights (e.g., in seats of each vehicle) of 150+/−9 pounds, 148+/−7 pounds, 148+/−5 pounds, and 151+/−8 pounds.
  • If the first and second riders are located in a vehicle, the program instructions can be configured to use the weight history data of the first and second riders to determine which rider is located in a particular seat. For example, “seat A” may detect a weight of 92 pounds and “seat B” may detect a weight of 148 pounds. The computer system can then compare the detected weights to a weight history in a profile of each rider to determine that the first rider is located in “seat A” and the second rider is located in “seat B.” If a seat belt sensor of “seat B” detects an unbuckled state, the program instructions can be configured to fine an account of the second rider (rather than mistakenly fining an account of the first rider).
  • A weight (e.g., detected by an occupancy sensor) over a predetermined threshold can be used to enable the system to determine that the weight is due to a person rather than another object such as a laptop, a backpack, or a grocery bag. In some embodiments, the predetermined threshold is greater than 20 pounds, greater than 30 pounds, less than 50 pounds, and/or less than 70 pounds.
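The weight-matching logic described above can be sketched as follows. The ±15 percent tolerance and the 30-pound occupancy threshold are drawn from the ranges mentioned in the text; the function names and profile layout are illustrative assumptions.

```python
def seat_occupied(weight_lbs: float, threshold: float = 30.0) -> bool:
    # Weights at or below the threshold are assumed to be objects
    # (a laptop, backpack, or grocery bag) rather than a person.
    return weight_lbs > threshold

def identify_rider(detected_weight, weight_histories, tolerance=0.15):
    """Match a detected seat weight against each rider's weight history."""
    best_id, best_diff = None, None
    for rider_id, history in weight_histories.items():
        average = sum(history) / len(history)
        diff = abs(detected_weight - average)
        # Accept only matches within the rider's typical variation,
        # keeping the closest match if several riders qualify.
        if diff <= tolerance * average and (best_diff is None or diff < best_diff):
            best_id, best_diff = rider_id, diff
    return best_id
```

With histories like the example above (about 90 pounds for the first rider and about 150 for the second), a detected weight of 92 pounds in "seat A" resolves to the first rider and 148 pounds in "seat B" to the second, so a fine is charged to the correct account.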
  • U.S. Pat. No. 6,609,054 includes occupancy sensor information. The entire contents of U.S. Pat. No. 6,609,054 are incorporated by reference herein.
  • U.S. Pat. No. 6,918,612 includes occupancy sensor information. The entire contents of U.S. Pat. No. 6,918,612 are incorporated by reference herein.
  • U.S. Pat. No. 6,927,678 includes occupancy sensor information. The entire contents of U.S. Pat. No. 6,927,678 are incorporated by reference herein.
  • U.S. Pat. No. 6,920,256 includes occupancy sensor information. The entire contents of U.S. Pat. No. 6,920,256 are incorporated by reference herein. Occupancy sensor systems can use effects of a presence of a rider sitting in a seat to detect whether a rider is sitting in a seat. In some embodiments, light from a light source is blocked by the rider sitting on a seat. The blocked light is not detected by a light sensor. Failing to detect the light is a signal that a rider is sitting in the seat.
  • In some embodiments, an occupancy sensor comprises a weight sensor 72. U.S. Pat. No. 6,636,792 includes weight sensor information. The entire contents of U.S. Pat. No. 6,636,792 are incorporated by reference herein.
  • Referring now primarily to FIG. 12, a seat-belt monitoring system can comprise a first self-driving vehicle 5 a configured to transport a rider (a person); a first seat 51 coupled to the first self-driving vehicle 5 a ; a first seat belt 53 configured to secure the rider in the first seat 51; and/or a first seat belt sensor 55 configured to detect at least one of a buckled state of the first seat belt 53 and an unbuckled state of the first seat belt 53.
  • The first self-driving vehicle 5 a can comprise a communication system 16, a computer system 19, a processor 26, a memory 20, program instructions 27, and seats 51, 65. A seat-belt monitoring system can comprise a second self-driving vehicle 5 b configured to transport a rider. The second self-driving vehicle 5 b can comprise a communication system 16 a, a computer system 19 a, a processor 26 a, a memory 20 a, program instructions 27 a, and seats 52, 67.
  • In some embodiments, a seat-belt monitoring system comprises a first occupancy sensor 57 configured to detect the rider sitting in the first seat 51; and/or a computer system 34 comprising at least one computer and located remotely relative to the first self-driving vehicle 5 a.
  • In some embodiments, a computer system (e.g., 34, 19, 19 a, 82, other computer systems) comprises a processor and a memory having program instructions that when executed by the processor are configured to execute any of the steps described herein.
  • In some embodiments, a computer system 34 comprises a processor 35 and a memory 31 having program instructions 32 that when executed by the processor 35 are configured to execute any of the steps described herein.
  • In some embodiments, a first self-driving vehicle 5 a comprises a computer system 19 having a processor 26 and a memory 20. The memory 20 can comprise program instructions 27 that when executed by the processor 26 are configured to execute any of the steps described herein.
  • In some embodiments, a second self-driving vehicle 5 b comprises a computer system 19 a having a processor 26 a and a memory 20 a. The memory 20 a can comprise program instructions 27 a that when executed by the processor 26 a are configured to execute any of the steps described herein.
  • In some embodiments, a first self-driving vehicle 5 a comprises a computer system 19 having a processor 26 and a memory 20. The memory 20 can comprise program instructions 27 (e.g., of the first self-driving vehicle 5 a) that when executed by the processor 26 are configured to send a first wireless communication 61 to another computer system (e.g., 34) located remotely relative to the first self-driving vehicle 5 a.
  • A communication system 16 of the first self-driving vehicle 5 a can send the first wireless communication 61 via an antenna 13, intermediary communication systems 15, an antenna 42, and/or a communication system 39 to the remotely located computer system 34. The computer system 34 can be located remotely relative to the first self-driving vehicle 5 a by not being mechanically coupled to the first self-driving vehicle 5 a.
  • In some embodiments, the computer system 34 is located in the “cloud.” The computer system 34 can comprise servers configured to operate the program instructions 32 to control various aspects of a fleet of vehicles (e.g., 5 a, 5 b, 5 c). The fleet of vehicles can include one vehicle, hundreds of vehicles, thousands of vehicles, and/or even more vehicles.
  • The vehicles 5 a, 5 b, 5 c can include any of the features described in the context of vehicle 5. To reduce unnecessary redundancy, some of the features of vehicle 5 are not described again in the context of vehicles 5 a, 5 b, 5 c.
  • In some embodiments, the first self-driving vehicle 5 a comprises first program instructions 27 configured to send a first wireless communication 61 to a remotely located computer system 34 in response to the first occupancy sensor 57 detecting the rider sitting in the first seat 51 and in response to the first seat belt sensor 55 detecting the unbuckled state.
  • Sending the first wireless communication 61 in response to both the first occupancy sensor 57 detecting the rider sitting in the first seat 51 and the first seat belt sensor 55 detecting the unbuckled state can reduce the risk of falsely reporting that a rider is unbuckled when really the first seat belt sensor 55 detected an unbuckled state simply because no one is sitting in the first seat 51.
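The two-condition check above can be expressed directly; requiring both signals suppresses the empty-seat false positive. The function name is illustrative.

```python
def should_send_unbuckled_report(occupant_detected: bool,
                                 belt_buckled: bool) -> bool:
    # Report only when a rider is actually in the seat AND the belt is
    # unbuckled; an unbuckled belt on an empty seat is not reported.
    return occupant_detected and not belt_buckled
```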
  • The first wireless communication 61 can comprise several communication sessions. In other words, the data communicated by the first wireless communication 61 does not necessarily have to be sent in a single communication. In some embodiments, the data communicated by the first wireless communication 61 is split into several communication sessions, which together form the first wireless communication 61.
  • The remote computing device 12 of the rider can be used to notify the rider about seat belt related information. For example, a display screen 90 of the remote computing device 12 can display a message to prompt the rider to place the first seat belt 53 in a buckled state. Many types of messages can be used to prompt the rider. For example, the following message can be displayed on the screen 90: “Please fasten your seat belt to avoid a fine, to avoid restrictions on future transportation privileges, and to avoid injury due to hazardous conditions.”
  • In some embodiments, a computer system 34 comprises second program instructions 32 configured to send a second wireless communication 68 to a remote computing device 12 of the rider in response to the computer system 34 receiving the first wireless communication 61. The second wireless communication 68 can be configured to prompt the rider to place the first seat belt 53 in the buckled state. The second wireless communication 68 can be configured to warn the rider of a fine for the unbuckled state of the first seat belt 53.
  • In some embodiments, a computer system 34 comprises a memory 31 having a buckling history of the rider. The buckling history can comprise data indicative of past buckling behavior of the rider prior to the rider entering the first self-driving vehicle 5 a. The buckling history can comprise data indicative of buckling behavior of the rider while the rider is being transported by the first self-driving vehicle 5 a.
  • An embodiment of a buckling history of “Rider 1” can include the following information: Jun. 15, 2017—Unbuckled during at least a portion of ride 342,948; Jun. 16, 2017—Buckled during ride 343,217; Jun. 16, 2017—Buckled during ride 343,314; Jun. 19, 2017—Unbuckled during at least a portion of ride 345,152. Of course, many different types of buckling histories are possible in many diverse embodiments.
  • In some embodiments, the buckling history includes data regarding a date of a ride, instances of the rider being unbuckled while a vehicle was moving, instances of the rider being buckled while the vehicle was moving, warnings (including emails, text messages, and audible warnings emitted from a vehicle) given to the rider in response to the rider not being buckled, pictures of the rider while unbuckled, a record of fines given to the rider for not being buckled, etc.
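A buckling history like the "Rider 1" example could be stored as a simple list of ride records. This data layout is an illustrative assumption, not a structure specified by the patent.

```python
from dataclasses import dataclass, field

@dataclass
class RideRecord:
    date: str          # e.g. "2017-06-15"
    ride_id: int       # e.g. 342948
    buckled: bool      # False if unbuckled during any portion of the ride
    warnings: int = 0  # warnings issued to the rider during the ride

@dataclass
class BucklingHistory:
    rider_id: str
    records: list = field(default_factory=list)

    def unbuckled_fraction(self) -> float:
        # Fraction of recorded rides with any unbuckled portion.
        if not self.records:
            return 0.0
        unbuckled = sum(1 for r in self.records if not r.buckled)
        return unbuckled / len(self.records)
```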
  • The buckling history can be used as evidence of the rider's buckling behavior while being transported by many different vehicles (e.g., 5 a, 5 b, 5 c). If a collision occurs, the buckling history can be used to show that the rider received multiple warnings to buckle up prior to the collision (e.g., in the minutes, weeks, or months prior to the collision).
  • A rider with a buckling history that indicates the rider has failed to fasten her seat belt in the past may receive higher transportation prices in the future or may be denied transportation in the future. A rider with a buckling history that indicates the rider has failed to fasten her seat belt in the past may receive a lower ride priority ranking in the future such that if demand for transportation exceeds the supply of vehicles to provide transportation, the system may decide to deny the rider's transportation request or make the rider wait longer (than a person with a more consistent history of fastening her seat belt).
  • A rider with a buckling history that indicates the rider has consistently fastened her seat belt in the past may receive lower transportation prices in the future. A rider with a buckling history that indicates the rider has consistently fastened her seat belt in the past may receive a higher ride priority ranking in the future.
  • A computer system can comprise a memory having data regarding a buckling history of the rider. Buckling history data can be used by program instructions to decide which rider's transportation request to fulfill first.
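One way the program instructions could use buckling history to decide which transportation request to fulfill first is a simple priority sort, sketched below. The dictionary keys, the tie-breaking rule (earlier request wins), and the compliance values are illustrative assumptions.

```python
def rank_ride_requests(requests):
    # Serve riders with better buckling compliance first;
    # ties go to the earlier request (hypothetical policy).
    return sorted(requests, key=lambda r: (-r["compliance"], r["request_time"]))

pending = [
    {"rider": "A", "compliance": 0.5, "request_time": 1},
    {"rider": "B", "compliance": 1.0, "request_time": 2},
    {"rider": "C", "compliance": 0.9, "request_time": 3},
]
print([r["rider"] for r in rank_ride_requests(pending)])  # ['B', 'C', 'A']
```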
  • Buckling history data can be used by program instructions to decide insurance prices. A buckling history indicative of failing to fasten seat belts can result in higher insurance prices. A buckling history indicative of consistently fastening seat belts can result in lower insurance prices.
  • The computer system 34 can comprise second program instructions 32 configured to update the buckling history in response to receiving the first wireless communication 61. (In some embodiments, a computer system 19 comprises a memory 20 having a buckling history of the rider.)
  • In some embodiments, a computer system (e.g., 34, 19 a) comprises second program instructions configured to deny a future ride request from the rider in response to the computer system receiving the first wireless communication 61.
  • In some embodiments, a computer system 34 comprises a memory 31 having a buckling history of the rider. The computer system 34 can comprise second program instructions 32 configured to update the buckling history of the rider in response to receiving the first wireless communication 61. The second program instructions 32 can be configured to deny a future ride request of the rider in response to the computer system 34 analyzing the buckling history. For example, if buckling compliance is below a predetermined threshold, second program instructions 32 can be configured to deny a future ride request of the rider.
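The threshold test described above, denying a future ride request when buckling compliance falls below a predetermined threshold, can be sketched in a few lines. The 0.6 threshold value is a hypothetical choice, not one stated in the specification.

```python
DENY_THRESHOLD = 0.6  # hypothetical predetermined threshold

def deny_future_ride(compliance_rate: float) -> bool:
    # Deny the request when compliance falls below the threshold
    return compliance_rate < DENY_THRESHOLD
```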
  • In some embodiments, the computer system 34 comprises a memory 31 having a buckling history of the rider. The buckling history can be at least partially based on the first wireless communication 61 and/or another wireless communication 62 from a second self-driving vehicle 5 b. The computer system 34 can comprise second program instructions 32 configured to determine a future ride price for the rider at least partially based on the buckling history. For example, if buckling compliance is below a predetermined threshold, second program instructions 32 can be configured to increase the future ride price (e.g., by at least 10 percent compared to a regular price that does not include buckling history consideration). For example, if buckling compliance is above a predetermined threshold, second program instructions 32 can be configured to decrease the future ride price (e.g., by at least 10 percent compared to a regular price that does not include buckling history consideration).
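The price adjustment described above (at least 10 percent up or down relative to a regular price, depending on whether compliance is below or above a predetermined threshold) could be implemented as follows. The specific threshold values are hypothetical; the 10 percent adjustment comes from the example in the text.

```python
def future_ride_price(regular_price, compliance,
                      low=0.6, high=0.95, adjustment=0.10):
    # Raise the price at least 10% when compliance is below the low
    # threshold; lower it at least 10% when above the high threshold.
    # (Threshold values 0.6 and 0.95 are illustrative assumptions.)
    if compliance < low:
        return regular_price * (1 + adjustment)
    if compliance > high:
        return regular_price * (1 - adjustment)
    return regular_price
```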
  • In some embodiments, the seat-belt monitoring system comprises a computer system 34 having at least one computer. The computer system 34 can be located remotely relative to the first self-driving vehicle 5 a (e.g., the computer system 34 is not mechanically coupled to the first self-driving vehicle 5 a). Even if the computer system 34 is not mechanically coupled to the first self-driving vehicle 5 a, the computer system 34 may be communicatively coupled to the first self-driving vehicle 5 a (e.g., via intermediary communication systems 15).
  • In some embodiments, the first self-driving vehicle 5 a comprises first program instructions 27 configured to send a first wireless communication 61 to the computer system 34 in response to the first seat belt sensor 55 detecting the unbuckled state.
  • In some embodiments, the first self-driving vehicle 5 a comprises a first processor 26 and a first memory 20 having first program instructions 27 that when executed by the first processor 26 are configured to send a first wireless communication 61 to the computer system 34 in response to both the first occupancy sensor 57 detecting the rider sitting in the first seat 51 and the first seat belt sensor 55 detecting the unbuckled state. Relying on both conditions reduces occurrences of the system falsely determining that a rider is unbuckled (which can happen when a seat belt sensor detects an unbuckled state simply because no rider is sitting in the seat).
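The two-sensor condition above amounts to a simple logical conjunction, sketched below. The function name is illustrative; the logic is the requirement stated in the text: report only when the seat is occupied and the belt is unbuckled.

```python
def should_report_unbuckled(seat_occupied: bool, belt_buckled: bool) -> bool:
    # An unbuckled belt on an empty seat is not a violation,
    # so both conditions must hold before sending the communication.
    return seat_occupied and not belt_buckled
```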
  • The system can impose financial penalties on a rider in response to the rider (or one of her passengers) failing to fasten her seat belt. The financial penalty may be a dollar amount charged to an account 63 of the rider. The account 63 may include credit card information of the rider such that the program instructions 32 can be configured to charge a credit card in response to the rider not fastening her seat belt. If an adult requests a ride and then enters the vehicle with a child, the adult may be charged in response to the child not fastening his seat belt.
  • In some embodiments, a computer system (e.g., 34) comprises program instructions (e.g., 32) configured to fine an account 63 of the rider in response to the computer system (e.g., 34) receiving a first communication (e.g., a first wireless communication 61).
  • In some embodiments, a computer system 34 comprises a processor 35 and a memory 31 having second program instructions 32 that when executed by the processor 35 are configured to fine an account 63 of the rider in response to the computer system 34 receiving the first wireless communication 61.
  • In some embodiments, the computer system 34 needs to know an identity of the rider in order to know which account to fine and/or penalize.
  • In some embodiments, the first wireless communication 61 comprises an identification of the rider. The identification can be configured to enable the second program instructions 32 to fine the account 63. For example, the first wireless communication may include an identification code (e.g., 3921078, G382HR) that indicates the identity of the rider. The database 64 can include a table that enables the computer system 34 to determine the name and credit card associated with the identification code.
  • Some embodiments use a picture of the rider as an identification of the rider. The computer system 34 can analyze the picture to determine an identity of the rider. Of course, many types of identifications can be used to enable the computer system 34 to identify the rider.
  • In some embodiments, the first wireless communication 61 comprises data configured to enable the computer system 34 to identify an account 63 of the rider. Identifying the account 63 of the rider can enable the system to fine the account 63 of the rider instead of mistakenly fining an account of another person.
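The identification-code lookup described above (a table in the database 64 that maps a code in the wireless communication to the rider's account) could be sketched as follows. The table contents and the account identifiers are hypothetical; the example codes are the ones given in the text.

```python
# Hypothetical stand-in for the lookup table in database 64
account_table = {
    "3921078": {"name": "Rider 1", "account_id": "acct-001"},
    "G382HR": {"name": "Rider 2", "account_id": "acct-002"},
}

def account_for_code(identification_code: str):
    # Return the matching account record, or None when the code
    # is unknown (so no other person's account is mistakenly fined)
    return account_table.get(identification_code)
```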
  • In some embodiments, the seat-belt monitoring system comprises many self-driving vehicles. In some embodiments, the seat-belt monitoring system comprises a second self-driving vehicle 5 b having a second seat 52, a second seat belt 53 a, and a second seat belt sensor 56. The second seat belt sensor 56 can be configured to detect at least one of a buckled state of the second seat belt 53 a and an unbuckled state of the second seat belt 53 a.
  • In some embodiments, the computer system 34 is located remotely relative to the first self-driving vehicle 5 a and the second self-driving vehicle 5 b. In some embodiments, the computer system 34 is not located remotely relative to the first self-driving vehicle 5 a (and is mechanically coupled to the first self-driving vehicle 5 a). In some embodiments, the computer system 34 is not located remotely relative to the second self-driving vehicle 5 b (and is mechanically coupled to the second self-driving vehicle 5 b).
  • In some embodiments, the computer system 34 comprises second program instructions 32. The second self-driving vehicle 5 b can comprise third program instructions 27 a configured to send a second wireless communication 62 to the computer system 34 (e.g., via intermediary communication systems 15) in response to the second seat belt sensor 56 detecting the unbuckled state of the second seat belt 53 a.
  • The second self-driving vehicle 5 b can comprise third program instructions 27 a configured to send a second wireless communication 62 to the computer system 34 in response to both the second seat belt sensor 56 detecting the unbuckled state of the second seat belt 53 a and a second occupancy sensor 58 detecting the rider sitting in the second seat 52 of the second self-driving vehicle 5 b.
  • The second self-driving vehicle 5 b can comprise a processor 26 a and a memory 20 a having program instructions 27 a that when executed by the processor 26 a are configured to send a wireless communication 62 to a computer system 34 and/or to a computer system 19 in response to the second seat belt sensor 56 detecting the unbuckled state.
  • A first self-driving vehicle 5 a can send wireless communications to a second self-driving vehicle 5 b. The second self-driving vehicle 5 b can send wireless communications to the first self-driving vehicle 5 a.
  • In some embodiments, a computer system (e.g., 34, 19, 19 a) comprises a memory having a buckling history of the rider. The buckling history can comprise data indicative of buckling behavior of the rider. The buckling history data can be at least partially based on the first wireless communication 61 and/or on the second wireless communication 62.
  • In some embodiments, an account of a rider is fined in response to more than one vehicle reporting that the rider failed to fasten her seat belt. Program instructions 32 can compare buckling history data for a single rider from multiple vehicles.
  • In some embodiments, the second program instructions 32 of the computer system 34 are configured to fine an account 63 of the rider a first amount (e.g., in dollars) in response to the computer system 34 receiving the first wireless communication 61. The second program instructions 32 can be configured to fine the account 63 a second amount (e.g., in dollars) in response to the computer system 34 receiving the second wireless communication 62.
  • In some embodiments, the second program instructions 32 are configured to make the second amount larger than the first amount in response to the computer system 34 receiving the second wireless communication 62 after receiving the first wireless communication 61.
  • In some embodiments, the computer system 34 is configured to determine a future ride price at least partially based on the first wireless communication 61 and the second wireless communication 62 such that the future ride price is at least partially affected by a buckling history of the rider. The buckling history may affect the future ride price by increasing the future ride price. The buckling history may affect the future ride price by decreasing the future ride price.
  • In some embodiments, the second program instructions 32 are configured to increase the future ride price in response to the buckling history indicating the unbuckled state of the first seat belt 53 (of a first self-driving vehicle 5 a) and the unbuckled state of the second seat belt 53 a (of a second self-driving vehicle 5 b).
  • Some embodiments are configured to enter a safer driving mode in response to detecting that a rider is unbuckled. The safer driving mode can decrease a probability of the vehicle getting into a collision during a period in which the seat belt of the rider is unfastened.
  • In some embodiments, a seat-belt monitoring system comprises a first occupancy sensor 57 configured to detect the rider sitting in the first seat 51. The first self-driving vehicle 5 a can comprise first program instructions 27 configured to exit a first driving mode and enter a second driving mode in response to the first occupancy sensor 57 detecting the rider sitting in the first seat 51 and in response to the first seat belt sensor 55 detecting the unbuckled state.
  • In some embodiments, just detecting the unbuckled state is insufficient to trigger a reaction (because an unbuckled state could simply be due to the seat being unoccupied). Detecting the unbuckled state and detecting the rider sitting in the seat, however, can trigger a reaction, according to some embodiments.
  • In some embodiments, the first self-driving vehicle 5 a comprises an object detection system 7 a having at least one of a camera 11, a radar 8, and a lidar 9. The object detection system 7 a can include any of the features of detection system 7. To reduce unnecessary redundancy, some of the features of detection system 7 are not necessarily described again in the context of object detection system 7 a.
  • FIG. 11 illustrates a perspective view of a self-driving vehicle 5 a driving on a road 54. The first vehicle 5 a uses its object detection system 7 a to detect several vehicles 48.
  • The object detection system 7 a can detect an outline 49 of each vehicle 48 using a camera 11, radar 8, and/or lidar 9. (The outlines 49 are depicted by broken line boxes in FIG. 11.) Detecting the outlines 49 can enable the detection system 7 a to detect a center of each vehicle 48. (The centers are depicted by circles.) Measuring distances 50 a, 50 b, 50 c, 50 d, 50 e from the centers to lane markers 59 enables the vehicle 5 a to track paths of the vehicles 48.
  • In some embodiments, the second self-driving vehicle 5 b comprises an object detection system 7 b having at least one of a camera 11, a radar 8, and a lidar 9. The object detection system 7 b can include any of the features of detection system 7. To reduce unnecessary redundancy, some of the features of detection system 7 are not necessarily described again in the context of object detection system 7 b.
  • An object detection system 7 a can be configured to detect a second vehicle 48 on a road 54 to enable the first self-driving vehicle 5 a to avoid colliding with the second vehicle 48. First program instructions 27 can be configured to maintain a greater minimum distance (and/or a greater average distance) from the first self-driving vehicle 5 a to the second vehicle 48 in the second mode than in the first mode. The second mode can be configured to keep the first self-driving vehicle 5 a farther away from other vehicles 48 than the first mode.
  • In some embodiments, the minimum distance of the second mode is less than 10 meters and the minimum distance of the first mode is less than 5 meters. In some embodiments, the minimum distance of the second mode is less than 15 meters and the minimum distance of the first mode is less than 10 meters. In some embodiments, the minimum distance of the second mode is at least two meters greater than the minimum distance of the first mode.
  • In some embodiments, in the first driving mode the first program instructions 27 are configured to maintain at least a first distance from the first self-driving vehicle 5 a to the second vehicle 48, and in the second driving mode the first program instructions 27 are configured to maintain at least a second distance from the first self-driving vehicle 5 a to the second vehicle 48, wherein the second distance is greater than the first distance.
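The per-mode minimum following distance above can be expressed as a small lookup, sketched below. The 5-meter and 10-meter values come from the example ranges in the text; treating them as exact mode minimums is an assumption.

```python
# Hypothetical minimum following distances (meters) for each mode;
# the second (safer) mode keeps a greater gap than the first.
MIN_DISTANCE_M = {"first": 5.0, "second": 10.0}

def must_increase_gap(mode: str, distance_to_vehicle_m: float) -> bool:
    # True when the vehicle is closer to the second vehicle 48
    # than the current mode's minimum allows
    return distance_to_vehicle_m < MIN_DISTANCE_M[mode]
```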
  • Turning at intersections can be particularly dangerous. In the first mode, the first program instructions 27 can be configured to enable the first self-driving vehicle 5 a to turn at intersections. In the second driving mode, the first program instructions 27 can be configured to prevent the first self-driving vehicle 5 a from turning at intersections. Instead, the first self-driving vehicle 5 a would keep going along the road until the rider fastens her seatbelt.
  • In some embodiments, the memory 20 comprises a driving route that includes turning at an upcoming intersection. In the first mode, the first program instructions 27 can be configured to enable the first self-driving vehicle 5 a to turn at the intersection. In the second driving mode, the first program instructions 27 can be configured to prevent the first self-driving vehicle 5 a from turning at the intersection.
  • In some embodiments, the first self-driving vehicle 5 a comprises an object detection system 7 a having at least one of a camera 11, a radar 8, and a lidar 9. The object detection system 7 a can be configured to detect objects on roads (e.g., 54) to enable the first self-driving vehicle 5 a to avoid colliding with the objects (e.g., other vehicles 48, guardrails, street signs, pedestrians).
  • In some embodiments, in the first driving mode the first program instructions 27 are configured to exceed a first safety threshold while using the object detection system 7 a to avoid colliding with the objects while driving on the roads. In some embodiments, in the second driving mode the first program instructions 27 are configured to exceed a second safety threshold while using the object detection system 7 a to avoid colliding with the objects while driving on the roads.
  • The second safety threshold is configured to yield a lower probability of colliding with objects than the first safety threshold.
  • Exceeding the second safety threshold (rather than just exceeding the lower first safety threshold) can include driving slower, taking turns more slowly, maintaining a greater distance from other vehicles 48, stopping more slowly, and accelerating more slowly.
  • In some embodiments of the first driving mode, the first program instructions 27 are configured to cause the first self-driving vehicle 5 a to transport the rider. In some embodiments of the second driving mode, the first program instructions 27 are configured to cause the first self-driving vehicle 5 a to find a location to park. The location to park can be located on a shoulder of a road.
  • For example, the first self-driving vehicle 5 a may be driving from a pick-up location to a drop-off location in the first driving mode. Then, the rider may unfasten her seat belt 53. In response to the unfastening of the seat belt 53, the first self-driving vehicle 5 a may cease driving toward the drop-off location (due to the second driving mode). Ceasing to drive toward the drop-off location may not necessarily happen instantly. Instead, the first self-driving vehicle 5 a may search for a safe location to pull over to park. In response to the first seat belt sensor 55 detecting a fastened state of the seat belt 53, the first self-driving vehicle 5 a may resume driving toward the drop-off location.
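The pull-over-and-resume behavior described above is essentially a small state machine: transport while buckled, search for a safe place to park when the belt is unfastened, and resume toward the drop-off location once the belt is fastened again. The sketch below is illustrative; the state names are hypothetical.

```python
class VehicleController:
    # Minimal sketch of the mode transitions described in the text
    def __init__(self):
        self.mode = "transporting"  # driving toward the drop-off location

    def on_belt_event(self, buckled: bool) -> str:
        if not buckled and self.mode == "transporting":
            # Belt unfastened mid-ride: search for a safe place to park
            self.mode = "pulling_over"
        elif buckled and self.mode in ("pulling_over", "parked"):
            # Belt fastened again: resume toward the drop-off location
            self.mode = "transporting"
        return self.mode
```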
  • A rider may dispute that she was the person who had an unfastened seat belt. A picture may be used as evidence of the unfastened seat belt.
  • In some embodiments, the seat-belt monitoring system comprises a first occupancy sensor 57 configured to detect the rider sitting in the first seat 51. The first self-driving vehicle 5 a can comprise a camera (e.g., 10, 10 a, 10 b, 11 a, 11 b, 24 a, 24 b, 24 c). Camera devices 10, 10 a, 10 b can be coupled to a ceiling 20 of a self-driving vehicle 5, 5 a, 5 b, 5 c such that they include cameras (e.g., 24 a, 24 b, 24 c) directed towards the first row of seats and/or towards the second row of seats.
  • A camera device 11 a can be integrated into the rear-view mirror of a self-driving vehicle 5, 5 a, 5 b, 5 c. A camera device 11 b can be integrated into the dash of a self-driving vehicle 5, 5 a, 5 b, 5 c. Camera devices 10 a, 10 b, 11 a, 11 b can be placed in any area of a self-driving vehicle 5, 5 a, 5 b, 5 c.
  • FIG. 8 illustrates a perspective view of a camera device 10. The camera devices 10 a, 10 b illustrated in FIG. 12 can include any of the features of the camera device 10 illustrated in FIG. 8.
  • Each camera 24 a, 24 b, 24 c can include a wide-angle lens to provide a wider field of view, which can be particularly helpful in the small confines of a self-driving vehicle 5, 5 a, 5 b. The cameras 24 a, 24 b, 24 c can be high-resolution cameras with auto-focus.
  • The camera device 10 can comprise a rider detection system, a communication module (which can include an antenna, a transmitter, and a receiver), a printed circuit board populated with integrated circuits and other electrical components, an image analysis system, a battery, a power management system, a microphone, a speaker, a memory with software configured to carry out the features described herein, and lights configured to illuminate the interior of a self-driving vehicle 5, 5 a, 5 b.
  • The camera device 10 can comprise a smoke detector configured to detect if a rider is smoking (e.g., cigarettes, vaping) inside the self-driving vehicle 2. Holes 34 enable the smoke to enter the camera device 10 to enable the smoke detector to detect the smoke. Not all the holes 34 are labeled to increase the clarity of other features.
  • The camera device 10 includes buttons that can be configured to enable the rider to interact physically with the camera device. A first button 27 a is configured to summon emergency responders in response to the rider pressing the button 27 a. The camera device 10 can call “911” and can provide the GPS location of the self-driving vehicle 2 to the emergency responders.
  • A second button 27 b is configured to call a virtual assistant (or a live human assistant) in response to the rider pressing the button 27 b. The assistant can be configured to answer the rider's questions. The virtual assistant can use Apple's “Siri” technology or Amazon's “Alexa” technology.
  • Pressing a third button 27 c can notify the maintenance system that the interior of the self-driving vehicle 2 needs to be cleaned. Pressing a fourth button 27d can notify the maintenance system that the exterior of the self-driving vehicle 2 needs to be cleaned.
  • The camera device 10 can include an outer housing 33 (e.g., molded from plastic) that snaps onto a molded plastic base plate that is coupled to the ceiling 20 by screws. A hatch can be removed to enable plugging cables into the camera device 10. The cables can provide electrical power from a self-driving vehicle 5, 5 a, 5 b to the camera device 10. The cables can also communicatively couple the camera device 10 to other portions of a self-driving vehicle 5, 5 a, 5 b that communicatively couple a self-driving vehicle 5, 5 a, 5 b to a seat belt monitoring system. The cables can exit through holes 30 in the hatch.
  • In some embodiments, a camera device 10 is mechanically coupled to a self-driving vehicle 5, 5 a, 5 b and is communicatively coupled to a computer system 34 via intermediary communication systems 15. The camera device 10 can be coupled by wires or wirelessly communicatively coupled to the other elements described herein and/or incorporated by reference.
  • Embodiments can include any of the features of the cameras and other elements described in U.S. patent application Ser. No. 16/230,410. The entire contents of U.S. patent application Ser. No. 16/230,410 are incorporated by reference herein.
  • Embodiments can include any of the features of the cameras and other elements described in U.S. Patent Application 62/782,887. The entire contents of U.S. Patent Application 62/782,887 are incorporated by reference herein.
  • The first self-driving vehicle 5 a can comprise first program instructions 27. The first program instructions 27 can be configured to cause a camera (e.g., 10, 10 a, 10 b, 11 a, 11 b, 24 a, 24 b, 24 c) to take a picture 69 of the rider in the first seat 51 in response to the first occupancy sensor 57 detecting the rider sitting in the first seat 51 and in response to the first seat belt sensor 55 detecting the unbuckled state. The picture 69 can be a video. The picture 69 can be a still image.
  • In some embodiments, the seat-belt monitoring system comprises a computer system 34 having at least one computer. The computer system 34 can be located remotely relative to the first self-driving vehicle 5 a (e.g., such that the computer system 34 is not mechanically coupled to the self-driving vehicle). The first program instructions 27 can be configured to send a first wireless communication 61 comprising the picture 69 to the computer system 34 (e.g., via intermediary communication systems 15) in response to the first occupancy sensor 57 detecting the rider sitting in the first seat 51 and in response to the first seat belt sensor 55 detecting the unbuckled state.
  • In some embodiments, a camera is always recording inside a cabin of a first self-driving vehicle 5 a.
  • In some embodiments, the seat-belt monitoring system comprises a first occupancy sensor 57 configured to detect the rider sitting in the first seat 51. The seat-belt monitoring system can comprise a computer system 34 comprising at least one computer. The computer system 34 can be located remotely relative to the first self-driving vehicle 5 a (e.g., such that the computer system 34 is not mechanically coupled to the self-driving vehicle). The computer system 34 can comprise a memory 31 having buckling history data of the rider. The first self-driving vehicle 5 a can comprise a camera (e.g., 10, 10 a, 10 b, 11 a, 11 b) configured to take a picture 69 of the rider in the first seat 51 during the unbuckled state. The communication system 16 can send the picture 69 to the computer system 34.
  • In some embodiments, the first self-driving vehicle 5 a comprises first program instructions 27 configured to send a first wireless communication 61 to the computer system 34 in response to the first occupancy sensor 57 detecting the rider sitting in the first seat 51 and in response to the first seat belt sensor 55 detecting the unbuckled state. The computer system 34 can comprise second program instructions 32 configured to save the picture 69 to the memory such that the buckling history of the rider comprises the picture 69 as evidence of the unbuckled state.
  • Saving the picture 69 as evidence can enable sending the picture 69 to the rider (e.g., to show the rider that she was the one who was unbuckled as the vehicle was moving). Some embodiments comprise sending the picture 69 to a remote computing device 12 of the rider (e.g., in response to the computer system 34 receiving the first wireless communication 61).
  • In some embodiments, the seat-belt monitoring system comprises a computer system (e.g., 34, 19, 19 a) having at least one computer. The computer system can comprise program instructions (e.g., 32, 27, 27 a) configured to notify the rider (e.g., at least partially in response to the program instructions analyzing at least one of road conditions, a travel route, and traffic conditions) that the rider is permitted to unbuckle the first seat belt 53.
  • A first self-driving vehicle 5 a can comprise a road conditions monitor 29 a. The road conditions monitor 29 a can comprise a camera oriented toward a road to enable the road conditions monitor 29 a to visually analyze the road to determine if the road is icy or wet, if there is rain or snow, if there are potholes, and if there are other hazards on the road.
  • In some embodiments, the first self-driving vehicle 5 a receives road condition data from a remote road condition monitor 29. The road condition data can be based on information from other vehicles, weather reports, weather forecasts, and/or databases.
  • The object detection system 7 a can be used to monitor traffic conditions (e.g., if many vehicles 48 are located near the first self-driving vehicle 5 a). A traffic monitor 23 can receive traffic condition data from remote computers (e.g., can receive traffic data from Google).
  • A vehicle navigation system 14 can comprise travel route data. If the travel route has many turns, the system can be less likely to notify the rider that the rider is permitted to unbuckle the first seat belt 53.
  • Notifying the rider can comprise emitting audio from a speaker 71 of a self-driving vehicle 5 a that is transporting the rider. The audio can include words that instruct the rider that the rider is permitted to unbuckle the first seat belt 53. For example, the words can say, “Due to predicted safe travel conditions, you are now permitted to unbuckle your seat belt.”
  • Notifying the rider can comprise sending a wireless communication from the computer system (e.g., 34, 19, 19 a) to a remote computing device 12 of the rider. The wireless communication can prompt application software (an “app”) to display a message that instructs the rider that the rider is permitted to unbuckle the first seat belt 53. For example, the message displayed on a screen 90 of the remote computing device 12 can say, “Due to predicted safe travel conditions, you are now permitted to unbuckle your seat belt.”
  • Program instructions (e.g., 32, 27, 27 a) can analyze at least one of road conditions, a travel route, and traffic conditions, and then in response to the analysis, the second program instructions 32 can notify the rider that the rider is permitted to unbuckle the first seat belt 53.
  • Road conditions reducing the likelihood of notifying the rider that the rider is permitted to unbuckle the first seat belt 53 can include snowy roads, wet roads, rain, potholes, and curvy roads.
  • Travel route features reducing the likelihood of notifying the rider that the rider is permitted to unbuckle the first seat belt 53 can include turns in the travel route, little time until the next turn, and inconsistent travel speeds (e.g., stopping and starting).
  • Traffic conditions reducing the likelihood of notifying the rider that the rider is permitted to unbuckle the first seat belt 53 can include many vehicles on the road, vehicles driving poorly, and vehicles located too close to the first self-driving vehicle 5 a.
  • Conditions of the first self-driving vehicle 5 a can reduce the likelihood of notifying the rider that the rider is permitted to unbuckle the first seat belt 53. These conditions can include low tire pressure, dropping tire pressure, airbag defects, and motor warnings.
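One way to combine the risk factors listed in the bullets above (road conditions, travel route, traffic, and vehicle condition) into a permit/deny decision is a weighted risk score, sketched below. The factor names, weights, and threshold are all hypothetical assumptions; the specification only says such factors reduce the likelihood of permitting the rider to unbuckle.

```python
# Hypothetical weights for the risk factors named in the text
RISK_WEIGHTS = {
    "road_hazard": 0.4,     # snowy/wet roads, rain, potholes, curves
    "upcoming_turns": 0.3,  # turns in the travel route, little time to next turn
    "heavy_traffic": 0.2,   # many nearby vehicles, vehicles driving poorly
    "vehicle_fault": 0.5,   # low tire pressure, airbag defects, motor warnings
}

def may_permit_unbuckle(active_factors, threshold=0.3):
    # Permit unbuckling only while the summed risk stays below
    # a hypothetical threshold; any notable risk denies permission.
    risk = sum(RISK_WEIGHTS[f] for f in active_factors)
    return risk < threshold
```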
  • Some embodiments comprise a computer system having at least one computer. The computer system can comprise second program instructions configured to notify the rider that the rider is permitted to unbuckle the first seat belt 53. The second program instructions can be configured to notify the rider at least partially in response to analyzing at least one of road conditions, a travel route, and traffic conditions.
  • In some embodiments, the seat-belt monitoring system comprises a computer system (e.g., 19) having at least one computer. The first self-driving vehicle 5 a can comprise an object detection system 7 a having at least one of a camera 11, a radar 8, and a lidar 9. The object detection system 7 a can be configured to detect a second vehicle 48. The computer system (e.g., 19) can comprise program instructions (e.g., 27) configured to notify the rider, at least partially in response to the program instructions analyzing a distance from the first self-driving vehicle 5 a to the second vehicle 48, that the rider is permitted to unbuckle the first seat belt 53.
  • A rider may decide to unfasten her seat belt 53 during a trip, but if conditions indicate a substantial risk of a collision, the program instructions can be configured to instruct a rider to fasten her seat belt 53.
  • In some embodiments, a computer system (e.g., 34, 19, 19 a) can comprise program instructions (e.g., 32, 27, 27 a) configured to notify the rider to buckle the first seat belt 53, wherein the notifying is at least partially in response to the program instructions analyzing at least one of road conditions, a travel route, and traffic conditions. The program instructions can analyze at least one of road conditions, a travel route, and traffic conditions, and then, in response to the analysis, notify the rider to buckle the first seat belt 53.
  • Notifying the rider can comprise emitting audio from a speaker 71 of a self-driving vehicle 5 a that is transporting the rider. The audio can include words that instruct the rider to buckle the first seat belt 53. For example, the words can say, “Due to predicted potentially unsafe travel conditions, you should buckle your seat belt.”
  • Notifying the rider can comprise sending a wireless communication from the computer system (e.g., 34, 19, 19 a) to a remote computing device 12 of the rider. The wireless communication can prompt application software (an “app”) running on the remote computing device 12 to display a message that instructs the rider that the rider should buckle the first seat belt 53. For example, the message displayed on a screen 90 of the remote computing device 12 can say, “Due to predicted potentially unsafe travel conditions, you should buckle your seat belt.”
  • In some embodiments, the seat-belt monitoring system comprises a computer system (e.g., 34, 19, 19 a) having at least one computer. The computer system comprises program instructions (e.g., 32, 27, 27 a) configured to instruct the rider to buckle the first seat belt 53 in response to the first seat belt sensor 55 detecting the unbuckled state and in response to the program instructions analyzing at least one of road conditions and traffic conditions.
  • In some embodiments, program instructions are configured to notify the rider that the rider is permitted to unbuckle her seat belt. If a system detects a higher risk due to any of the factors described herein, the system can instruct the rider to buckle her seat belt.
  • In some embodiments, the seat-belt monitoring system comprises a computer system having at least one computer. The first self-driving vehicle 5 a can comprise an object detection system 7 a having at least one of a camera 11, a radar 8, and a lidar 9. (An object detection system 7, 7 a, 7 b can include many cameras 11, radars 8, and lidars 9.)
  • The object detection system 7 a can be configured to detect a second vehicle 48. The computer system (e.g., 34, 19, 19 a) can comprise program instructions (e.g., 32, 27, 27 a) configured to instruct the rider to buckle the first seat belt 53 in response to the first seat belt sensor 55 detecting the unbuckled state and in response to an object detection system 7 a analyzing a distance from the first self-driving vehicle 5 a to the second vehicle 48.
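The buckle instruction described above combines two conditions: the belt sensor reports the unbuckled state, and the object detection system reports another vehicle nearby. A minimal sketch; the 15-meter safe-distance threshold is an assumption, not a value taken from the specification:

```python
def should_instruct_buckle(seat_belt_unbuckled: bool,
                           distance_to_second_vehicle_m: float,
                           min_safe_distance_m: float = 15.0) -> bool:
    """Instruct the rider to buckle only when the belt is unbuckled
    AND the second vehicle is closer than the safe-distance threshold."""
    return seat_belt_unbuckled and (
        distance_to_second_vehicle_m < min_safe_distance_m)
```

A buckled rider, or a distant second vehicle, produces no instruction; both conditions must hold at once.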
  • Some embodiments enable the system to determine which rider is in which seat. Knowing the identity of a rider in a particular seat can enable the system to fine the correct rider, send notifications to the correct rider, and/or maintain an accurate buckling history for each rider.
  • The rider can be any age. In some cases, the rider is an adult. In some cases, the rider is a young child. In some embodiments, a first rider arranges for a self-driving vehicle to provide a ride for both the first rider and a baby.
  • In some embodiments, the seat-belt monitoring system comprises a first weight sensor 72 configured to detect first data indicative of a first weight of the rider sitting in the first seat 51 coupled to the first self-driving vehicle 5 a.
  • The seat-belt monitoring system can comprise a computer system (e.g., 34, 19 a) comprising at least one computer. The first self-driving vehicle 5 a can be configured to send a first wireless communication 61 comprising the first data to the computer system (e.g., 34, 19 a). The seat-belt monitoring system can comprise a second self-driving vehicle 5 b having a second seat 52 and a second weight sensor 72 configured to detect second data indicative of a second weight of an unknown rider sitting in the second seat. The computer system (e.g., 34, 19 a) can comprise second program instructions (e.g., 32, 27 a) configured to determine that the unknown rider is the rider in response to comparing the second data to the first data. The computer system (e.g., 34, 19 a) can comprise second program instructions (e.g., 32, 27 a) configured to determine that the unknown rider is the rider in response to analyzing which riders have been picked up and not dropped off by the vehicle.
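The weight-based identification described above might be sketched as follows. The 2 kg tolerance and the rider-ID bookkeeping are illustrative assumptions; the candidate set is restricted to riders who have been picked up and not yet dropped off, per the last sentence above.

```python
def identify_unknown_rider(second_weight_kg, candidate_weights_kg,
                           tolerance_kg=2.0):
    """candidate_weights_kg maps IDs of riders who have been picked up
    and not yet dropped off to the weight recorded by the first weight
    sensor 72. Returns the matching rider ID, or None if no candidate's
    recorded weight is within the tolerance of the second sensor's reading."""
    for rider_id, first_weight_kg in candidate_weights_kg.items():
        if abs(first_weight_kg - second_weight_kg) <= tolerance_kg:
            return rider_id
    return None
```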
  • In some embodiments, the seat-belt monitoring system comprises a first occupancy sensor 57 configured to detect the rider sitting in the first seat 51. The seat-belt monitoring system can comprise a computer system (e.g., 34, 19, 19 a) comprising at least one computer. The computer system can comprise second program instructions (e.g., 32, 27, 27 a) configured to receive an identification of the rider. The computer system can receive an identification of the rider from a database 64 and/or from a remote computing device 12 of the rider. The second program instructions can be configured to receive from the first occupancy sensor 57 an indication of the rider sitting in the first seat 51. The second program instructions can be configured to record that the rider is sitting in the first seat 51 in response to receiving the identification and the indication. The identification can be any type of identification configured to enable the system to identify the rider.
  • Some embodiments include audio and/or visual warnings. The first self-driving vehicle 5 a can comprise an audio speaker 71. The first program instructions 27 can be configured to cause the audio speaker 71 to emit words instructing the rider to put the first seat belt 53 in the buckled state in response to the first seat belt sensor 55 detecting the unbuckled state. The words can warn the rider that the rider will receive a fine if the rider does not put the first seat belt 53 in the buckled state.
  • The first self-driving vehicle 5 a can comprise at least one of an audio speaker 71 and a display screen 93. The first program instructions 27 can be configured to at least one of cause the audio speaker 71 to emit words and cause the display screen 93 to display the words. The words can be configured to warn the rider to put the first seat belt 53 in the buckled state to avoid a fine.
  • Now with reference to FIGS. 13a-13c, the self-driving vehicle 5 may be implemented with various safety features to not only detect an indication of an imminent crash or impact with another object (e.g., a car crash), but also adapt the cabin to prepare for the impact. In doing so, the cabin may provide a safer configuration to reduce the severity of an impact to one or more riders within the vehicle 5. For example, if the vehicle 5 detects an indication of an imminent impact, the seats of the vehicle 5 may rotate away from the crash to reduce injuries to the one or more riders.
  • Accordingly, in some embodiments, the self-driving vehicle 5 includes a first seat 51, a second seat 65, and a crash detection system 7 coupled to the vehicle. The crash detection system 7 may comprise at least one of a camera, a radar, a lidar, and an accelerometer. For example, the crash detection system 7 may determine that a crash is imminent due to the accelerometer(s) detecting an acceleration or deceleration that exceeds a predetermined threshold, which is indicative of an imminent impact.
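The accelerometer branch of the crash detection system 7 can be sketched as a threshold check over recent readings. The 30 m/s² value is an assumed placeholder for the predetermined threshold, which the specification leaves unspecified:

```python
def crash_imminent(accel_history_mps2, threshold_mps2=30.0):
    """Flag an imminent impact when any recent acceleration or
    deceleration magnitude (m/s^2) exceeds the predetermined threshold."""
    return any(abs(a) > threshold_mps2 for a in accel_history_mps2)
```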
  • Furthermore, the vehicle 5 may also include at least one motor 95 a coupled to the first seat 51 and communicatively coupled to the crash detection system 7 a. As well, the vehicle 5 may also include at least one motor 95 b coupled to the second seat 65 and communicatively coupled to the crash detection system 7 a. Upon detecting such a sudden change in speed, the crash detection system 7 may transmit a communication to the at least one motor 95 to move the seat 51, 65 to a safer configuration. It should also be appreciated that the crash detection system 7 may transmit the communication through any ancillary device or system of the vehicle 5, such as through the vehicle management system, which thereby transmits the communication to the at least one motor 95 to move the seat 51, 65. The at least one motor 95 may be arranged and configured to move at least one of the respective first seat 51 and second seat 65 in response to the crash detection system 7 a detecting an indication of an imminent crash.
  • With reference to FIG. 14, the seat 51 may be arranged and configured to move in a variety of directions and/or rotations to thereby re-position the rider(s) in an orientation that may reduce the severity of injuries from an imminent danger situation, such as a crash. Regarding the movement of the seats 51, 65, the at least one motor 95 may be arranged and configured to move the seat 51, 65 in a variety of movements. In some embodiments, the movement is a rotational movement whereby the at least one motor 95 rotates the respective seat 51, 65 so that the seat 51, 65 faces away from the predicted location of the crash. The at least one motor 95 may also rotate the seat 51, 65 so that the seat 51, 65 faces a center of the vehicle 5. Generally, the at least one motor 95 may rotationally move the seat 51, 65 to any position to protect the safety of the rider 1. Accordingly, with reference to arrow 97, the seat 51 may rotate and/or spin in any direction with respect to the X, Y, and/or Z axis.
  • The at least one motor 95 may also be arranged and configured to move the seat 51, 65 in other directions in response to the crash detection system detecting the indication of the imminent crash. For example, in some embodiments, the at least one motor 95 may also be arranged and configured to raise or lower the seat 51, 65. As shown by arrow 98, the seat 51, 65 may raise or lower in any direction with respect to the X, Y, and/or Z axis. Additionally, as shown by arrow 99, the seat 51, 65 may move forward or backward with respect to the X, Y, and/or Z axis.
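The "rotate away from the predicted crash location" behavior described above amounts to pointing the seat at the opposite bearing. A sketch treating headings as degrees in the vehicle's horizontal plane, which is a simplification of the multi-axis movement described above:

```python
def seat_target_heading_deg(crash_bearing_deg: float) -> float:
    """Return a seat heading that faces directly away from the
    predicted crash location (bearings in degrees, vehicle frame)."""
    return (crash_bearing_deg + 180.0) % 360.0
```

For a crash predicted dead ahead (bearing 0°), the seat would be rotated to face rearward (180°).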
  • Still further, the at least one motor 95 may be arranged and configured to move various parts of the seat 51, 65. This may occur because there is not enough time to move the entire seat 51, 65 to a desired position, or because moving the entire seat 51, 65 may risk the safety of the rider(s). Additionally, moving various parts of the seat 51, 65 may be intended to protect different parts of the rider's anatomy. For example, because of the severity of neck injuries, the at least one motor 95 may be arranged and configured to move at least a portion of a head rest of the seat 51, 65 in response to the crash detection system detecting the indication of the crash. Such movement of the head rest may better support the rider's neck in the event of the crash. Of course, it should be appreciated that the head rest may be raised or lowered.
  • Additionally, the at least one motor 95 may be arranged and configured to adjust a firmness setting of at least a portion of the seat 51, 65 in response to the crash detection system detecting the indication of the imminent crash. In some embodiments, the at least one motor 95 is arranged and configured to adjust the firmness setting to be more firm than prior to detecting the indication of the crash. Moreover, in some embodiments, the at least one motor 95 is arranged and configured to adjust the firmness setting to be less firm than prior to detecting the indication of the crash.
  • With continued reference to FIGS. 13a-13c, the vehicle 5 may also include an occupancy sensor 57 coupled to the seat 51, 65, whereby the occupancy sensor 57 is configured to detect the rider 1 sitting in the seat 51, 65. The at least one motor 95 may thereby be arranged and configured to disable movement of the seat 51, 65 in response to the occupancy sensor 57 detecting that no rider is sitting in the seat 51, 65.
  • In some embodiments, the self-driving vehicle 5 also includes a seat belt sensor 55 coupled to a seat belt 53 of the seat 51, 65. The seat belt sensor 55 may be configured to detect at least one of a buckled state of the seat belt 53 and an unbuckled state of the seat belt 53. In some embodiments, the at least one motor 95 is arranged and configured to disable movement of the seat 51, 65 in response to the seat belt sensor 55 detecting the unbuckled state of the seat belt 53. Moreover, the occupancy sensor 57 may include at least one of a camera, an infrared camera, and a weight sensor.
  • FIGS. 13b and 13c illustrate the vehicle 5 before and after a crash, respectively. Specifically, FIG. 13b shows the seat 51 and rider prior to the crash detection system 7 detecting an indication of an imminent crash, whereby the direction of the crash is the direction that the rider is facing in FIG. 13b. FIG. 13c illustrates a period of time after the car crash has occurred, whereby the at least one motor 95 has rotated the seat 51 so that the rider faces away from the crash. Furthermore, FIGS. 13b and 13c also show that the second seat 65, without a rider, did not move in response to the crash detection system 7 detecting the indication of the imminent crash.
  • In the past, a driver would own her vehicle, would enjoy full ownership rights over the vehicle, and would use her same vehicle day after day. In the future, however, fewer people will be drivers who own their own vehicle and more people will be guest riders who will receive transportation services from many different vehicles that they do not own. Any of the embodiments described herein can be used with a rider who is a guest rider.
  • In some embodiments, a fleet of self-driving vehicles is configured to provide transportation services to many guest riders. As used herein, a “guest rider” is used broadly to refer to a person who is transported by a third-party service such as a public or private transportation system configured to provide transportation services to many different people (e.g., who often do not know each other). The guest rider does not exclusively own the first self-driving vehicle 5 a or the second self-driving vehicle 5 b, but can receive rides from the first self-driving vehicle 5 a and the second self-driving vehicle 5 b (and often from many other vehicles in the fleet of self-driving vehicles).
  • Referring now primarily to FIGS. 5, 7-10, and 12, in some embodiments, a seat-belt monitoring system comprises a first self-driving vehicle 5 a configured to transport a guest rider and comprising a first seat 51, a first seat belt 53 configured to secure the guest rider in the first seat 51, and a first seat belt sensor 55 configured to detect at least one of a buckled state of the first seat belt 53 and an unbuckled state of the first seat belt 53.
  • In some embodiments, a seat-belt monitoring system comprises a second self-driving vehicle 5 b configured to transport the guest rider and comprising a second seat 52, a second seat belt 53 a configured to secure the guest rider in the second seat 52, and a second seat belt sensor 56 configured to detect at least one of a buckled state of the second seat belt 53 a and an unbuckled state of the second seat belt 53 a. The seat-belt monitoring system can comprise a computer system 34 having at least one computer. The computer system 34 can be located remotely relative to the first self-driving vehicle 5 a.
  • The first self-driving vehicle 5 a can comprise first program instructions 27 configured to send a first wireless communication to the computer system 34 in response to the first seat belt sensor 55 detecting the unbuckled state. The computer system 34 can comprise second program instructions 32. The second self-driving vehicle 5 b can comprise third program instructions 27 a configured to send a second wireless communication to the computer system 34 in response to the second seat belt sensor 56 detecting the unbuckled state of the second seat belt 53 a. The computer system 34 can be communicatively coupled (e.g., continuously or intermittently or in any other suitable way) to at least one of a first self-driving vehicle 5 a and a second self-driving vehicle 5 b.
  • The computer system 34 can comprise a memory having a buckling history of the guest rider. The buckling history can be at least partially based on the first wireless communication and the second wireless communication.
  • In some embodiments, the second program instructions 32 of the computer system 34 are configured to fine an account of the guest rider a first monetary amount in response to the computer system 34 analyzing the buckling history and/or in response to the computer system 34 receiving at least one of the first wireless communication and the second wireless communication. The first monetary amount can be a financial fine such as 20 dollars, 50 dollars, or any other suitable amount of money.
  • In some embodiments, the first wireless communication comprises an identification of the guest rider. The identification can be configured to enable the second program instructions 32 to fine the account (e.g., by enabling the computer system 34 to know which account corresponds to the person who did not buckle her seat belt or who unbuckled her seat belt while the vehicle was traveling at a speed over a predetermined threshold, which in some embodiments is at least ten miles per hour).
  • In some embodiments, the second program instructions 32 of the computer system 34 are configured to fine an account of the guest rider a first monetary amount in response to the computer system 34 receiving the first wireless communication, and the second program instructions 32 are configured to fine the account a second monetary amount in response to the computer system 34 receiving the second wireless communication. In some embodiments, the second program instructions 32 are configured to make the second monetary amount larger than the first monetary amount in response to the computer system 34 receiving the second wireless communication after receiving the first wireless communication.
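The escalating-fine behavior (the second fine larger than the first) could be sketched as a base amount plus an increment per prior unbuckled-state report. The base and increment values below are assumptions, though the text elsewhere mentions 20 or 50 dollars as example amounts:

```python
def fine_amount(prior_unbuckled_events: int,
                base_fine_usd: float = 20.0,
                increment_usd: float = 10.0) -> float:
    """Each subsequent unbuckled-state report costs more than the
    last, so the second fine exceeds the first."""
    return base_fine_usd + increment_usd * prior_unbuckled_events
```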
  • In some embodiments, a computer system 34 comprises second program instructions 32 configured to determine a future transportation price for a ride (to be given to the guest rider) at least partially based on the buckling history of the guest rider.
  • In some embodiments, the second program instructions 32 are configured to determine a future transportation price for the guest rider at least partially based on the first wireless communication and the second wireless communication such that the future transportation price is at least partially based on the buckling history of the guest rider. A first guest rider with a buckling history indicative of consistently buckling a seat belt can receive a future transportation price (e.g., for a ride at 2:00 this afternoon) of five dollars while a second guest rider with a buckling history indicative of consistently not buckling a seat belt can receive a future transportation price (e.g., for a ride at 2:00 this afternoon) of ten dollars.
  • In some embodiments, second program instructions 32 are configured to increase the future transportation price in response to the buckling history indicating the unbuckled state of the first seat belt 53 (of the first self-driving vehicle 5 a) and the unbuckled state of the second seat belt 53 a (of the second self-driving vehicle 5 b). At least one of the first program instructions 27, the second program instructions 32, and the third program instructions 27 a can be configured to charge the future transportation price to an account of the guest rider (e.g., at least partially in response to the second program instructions 32 determining the future transportation price for the guest rider based on the first wireless communication and the second wireless communication such that the future transportation price is at least partially based on the buckling history of the guest rider).
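The history-based pricing described above (five dollars for a consistent buckler versus ten dollars for a consistent non-buckler) might be sketched as follows; the surcharge schedule is an illustrative assumption:

```python
def future_ride_price(buckling_history, base_price_usd=5.0,
                      surcharge_usd=5.0):
    """buckling_history is a list of booleans, True meaning the belt
    was buckled for that ride. A rider who consistently buckles pays
    the base price; unbuckled rides add a surcharge, matching the
    5-versus-10 dollar example in the text."""
    if not buckling_history or all(buckling_history):
        return base_price_usd
    unbuckle_rate = buckling_history.count(False) / len(buckling_history)
    # Full surcharge for mostly-unbuckled histories, half otherwise.
    return base_price_usd + surcharge_usd * (1 if unbuckle_rate > 0.5 else 0.5)
```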
  • In some embodiments, the second program instructions 32 are configured to update the buckling history in response to the computer system 34 receiving the first wireless communication and the second wireless communication. The computer system 34 can comprise fourth program instructions configured to deny a future ride request from the guest rider in response to the second program instructions 32 updating the buckling history.
  • In some embodiments, so many different guest riders request rides for the same time that the seat-belt monitoring system must decide which guest riders will receive rides before other guest riders.
  • The second program instructions 32 can be configured to update the buckling history in response to the computer system 34 receiving the first wireless communication and the second wireless communication. The computer system 34 can comprise fourth program instructions configured to update a first ride priority indicator of the guest rider in response to the second program instructions 32 updating the buckling history. The fourth program instructions can be configured to prioritize a future ride request from the guest rider at least partially based on comparing the first ride priority indicator to a second ride priority indicator of another person who has requested transportation.
  • The guest rider might have a first ride priority indicator of 5.2, but the fourth program instructions can be configured to update the first ride priority indicator to 3.1 in response to the second program instructions 32 updating the buckling history (due to the computer system 34 receiving a wireless communication indicating that the guest rider did not buckle her seat belt during a recent ride from a fleet of vehicles). Another person can have a second ride priority indicator of 4.4.
  • The fourth program instructions can be configured to prioritize a future ride request from the guest rider at least partially based on comparing the first ride priority indicator (which is now 3.1) to a second ride priority indicator (which is currently 4.4) of another person who has requested transportation (by providing a ride to the other person before providing a ride to the guest rider due to the higher priority indicator of the other person).
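The priority comparison in the example above reduces to serving the pending request with the highest ride priority indicator first. A sketch using the indicator values from the text:

```python
def next_rider(requests):
    """requests maps rider IDs to ride priority indicators; the
    request with the highest indicator is served first."""
    return max(requests, key=requests.get)
```

With the guest rider's updated indicator of 3.1 and the other person's 4.4, the other person is served first.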
  • In some embodiments, the first self-driving vehicle 5 a comprises a first camera (e.g., 10, 10 a, 10 b) and a first occupancy sensor (e.g., 57) configured to detect the guest rider sitting in the first seat 51. The first program instructions 27 can be configured to cause the first camera to take a first picture of the guest rider in the first seat 51. The second self-driving vehicle 5 b can comprise a second camera (e.g., 10, 10 a, 10 b) and a second occupancy sensor (e.g., 58) configured to detect the guest rider sitting in the second seat 52. The third program instructions 27 a can be configured to cause the second camera to take a second picture of the guest rider in the second seat 52. The second program instructions 32 can be configured to save the first picture and the second picture to the memory such that the buckling history of the guest rider comprises the first picture and the second picture.
  • In some embodiments, the first self-driving vehicle 5 a comprises a camera and a first occupancy sensor configured to detect the guest rider sitting in the first seat 51. The first program instructions 27 can be configured to cause the camera to take a picture of the guest rider in the first seat 51. The first program instructions 27 can be configured to send a third wireless communication comprising the picture to the computer system 34 in response to the first occupancy sensor detecting the guest rider sitting in the first seat 51 and in response to the first seat belt sensor 55 detecting the unbuckled state.
  • In some embodiments, the first self-driving vehicle 5 a comprises a camera and a first occupancy sensor configured to detect the guest rider sitting in the first seat 51. The first program instructions 27 can be configured to cause the camera to take a picture of the guest rider in the first seat 51 in response to the first occupancy sensor detecting the guest rider sitting in the first seat 51 and in response to the first seat belt sensor 55 detecting the unbuckled state.
  • In some embodiments, the first program instructions 27 are configured to send a third wireless communication comprising the picture to the computer system 34 in response to the first occupancy sensor detecting the guest rider sitting in the first seat 51 and in response to the first seat belt sensor 55 detecting the unbuckled state.
  • Some embodiments are directed to enabling the system to determine which guest rider is sitting in which seat inside of a vehicle.
  • In some embodiments, the first self-driving vehicle 5 a comprises a first occupancy sensor configured to detect the guest rider sitting in the first seat 51. In some embodiments, the second program instructions 32 of the computer system 34 are configured to receive an identification of the guest rider, are configured to receive from the first occupancy sensor an indication of the guest rider sitting in the first seat 51, and/or are configured to record that the guest rider is sitting in the first seat 51 in response to receiving the identification and the indication.
  • In some embodiments, the first self-driving vehicle 5 a comprises a first camera configured to take a first picture of the guest rider sitting in the first seat 51. The second self-driving vehicle 5 b can comprise a second camera configured to take a second picture of the guest rider sitting in the second seat 52.
  • The camera 10 can comprise a facial recognition system. Some facial recognition system embodiments use “Amazon Rekognition” (from Amazon Web Services, Inc.).
  • The seat-belt monitoring system can comprise a facial recognition system configured to analyze the first picture to determine a first indicator of an identification of the guest rider and configured to analyze the second picture to determine a second indicator of the identification of the guest rider. The seat-belt monitoring system can be configured to assign the first wireless communication and the second wireless communication to the buckling history of the guest rider in response to the facial recognition system determining the first indicator and the second indicator.
  • In some embodiments, the first self-driving vehicle 5 a comprises a first weight sensor 72 configured to detect first data indicative of a first weight of the guest rider sitting in the first seat 51 of the first self-driving vehicle 5 a. The first program instructions 27 of the first self-driving vehicle 5 a can be configured to send a third wireless communication comprising the first data to the computer system 34. The second self-driving vehicle 5 b can comprise a second weight sensor configured to detect second data indicative of a second weight of an unknown person sitting in the second seat 52 of the second self-driving vehicle 5 b. The third program instructions 27 a of the second self-driving vehicle 5 b can be configured to send a fourth wireless communication comprising the second data to the computer system 34. The second program instructions 32 of the computer system 34 can be configured to determine that the unknown person is the guest rider in response to comparing the second data to the first data.
  • Any of the embodiments described herein can be used with one, two, three, four, five, or more vehicles. Some embodiments use hundreds or even thousands of vehicles.
  • In some embodiments, a seat-belt monitoring system comprises a first self-driving vehicle 5 a configured to transport a guest rider and comprising a first seat 51, a first seat belt 53 configured to secure the guest rider in the first seat 51, and a first seat belt sensor 55 configured to detect at least one of a buckled state of the first seat belt 53 and an unbuckled state of the first seat belt 53. The seat-belt monitoring system can comprise a computer system 34 located remotely relative to the first self-driving vehicle 5 a and communicatively coupled to the first self-driving vehicle 5 a. The computer system 34 can comprise at least one computer. In some embodiments, the computer system 34 comprises many computers located at one or more locations.
  • In some embodiments, the first self-driving vehicle 5 a comprises first program instructions 27 configured to send a first wireless communication to the computer system 34 in response to the first seat belt sensor 55 detecting the unbuckled state. The computer system 34 can comprise a memory having a buckling history of the guest rider. The buckling history can be at least partially based on the first wireless communication. The computer system 34 can comprise second program instructions 32 configured to fine an account of the guest rider a first monetary amount in response to the computer system 34 analyzing the buckling history.
  • Interpretation
  • To reduce unnecessary redundancy, not every element or feature is described in the context of every embodiment, but all elements and features described in the context of any embodiment herein and/or incorporated by reference can be combined with any elements and/or features described in the context of any other embodiments.
  • The self-driving vehicle can be any suitable vehicle. For example, the self-driving vehicle can be a Tesla Model S made by Tesla, Inc. The Tesla Model S can include the Enhanced Autopilot package and the Full Self-Driving Capability package. The Full Self-Driving Capability package includes eight active cameras to enable full self-driving in almost all circumstances.
  • The self-driving vehicle can also be a Waymo car made by Waymo LLC. Waymo was formerly the Google self-driving car project. Waymo has logged thousands of self-driving miles over many years. Waymo vehicles have sensors and software that are designed to detect pedestrians, cyclists, vehicles, roadwork and more from a distance of up to two football fields away in all directions. Waymo has stated that its software leverages over four million miles of real-world driving data. In some embodiments, self-driving vehicles sometimes drive themselves, sometimes are driven remotely by a computer system, and sometimes are driven manually by a human turning a steering wheel, operating pedals, and performing other driver functions. In several embodiments, a self-driving vehicle drives without a human inside the vehicle to pick up the human and then lets the human drive the vehicle. Although in some cases, the human may choose not to drive the vehicle and instead may allow the vehicle to drive itself (e.g., steer and control speed) (e.g., in response to a destination requested by the human).
  • A remote computing device can be a smartphone, a tablet computer, a laptop computer, a desktop computer, a server, augmented reality glasses, an implanted computer, and/or any type of computer. A rider can bring her remote computing device into the self-driving vehicle, use her remote computing device in the self-driving vehicle, and leave the self-driving vehicle with her remote computing device. In some embodiments, the rider requests a ride at her home with a remote computing device, but then leaves the remote computing device at home when she goes to get a ride from the self-driving vehicle.
  • In some embodiments, the remote computing device is an iPhone made by Apple Inc. or an Android phone based on software made by Alphabet Inc. The remote computing device can comprise a speaker configured to emit sounds, a microphone configured to record sounds, and a display screen configured to display images. The remote computing device can comprise a battery configured to provide electrical power to operate the remote computing device.
  • The phrase “communicatively coupling” can include any type of direct and/or indirect coupling between various items including, but not limited to, a self-driving vehicle, a remote computing device, and a vehicle management system. For example, a remote computing device can be communicatively coupled to a vehicle management system via servers, the Cloud, the Internet, satellites, Wi-Fi networks, cellular networks, and any other suitable communication means.
  • The term “price” shall be defined as the amount of money paid by a user (e.g., a rider) for a ride. The price of a future ride may be at least partially determined by the buckling history of the rider. To reduce liability, ride share companies may want to encourage riders to wear seat belts. As such, a ride share company may charge a rider more for a future ride if the rider has a history of leaving her or his seat belt unbuckled, which may encourage riders to wear seat belts.
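  • As a concrete illustration of pricing a future ride at least partially based on a rider's buckling history, consider the following minimal sketch. The function name, the linear surcharge formula, and the cap are illustrative assumptions for exposition only; the disclosure does not specify any particular pricing formula.

```python
# Hypothetical sketch: adjust a future ride price using the rider's
# buckling history. The surcharge scales linearly with the fraction of
# past rides in which the rider was unbuckled, capped at max_surcharge.
# All names and the formula are assumptions, not part of the disclosure.

def future_ride_price(base_price: float, unbuckled_rides: int,
                      total_rides: int, max_surcharge: float = 0.50) -> float:
    """Return a price at least partially determined by buckling history."""
    if total_rides == 0:
        # No history yet: charge the unadjusted base price.
        return base_price
    unbuckled_rate = unbuckled_rides / total_rides
    surcharge = base_price * max_surcharge * unbuckled_rate
    return round(base_price + surcharge, 2)
```

  • Under these assumptions, a rider who always buckled pays the base price, while a rider who was unbuckled on every past ride pays up to 50% more.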
  • Some of the devices, systems, embodiments, and processes use computers. Each of the routines, processes, methods, and algorithms described in the preceding sections may be embodied in, and fully or partially automated by, code modules executed by one or more computers, computer processors, or machines configured to execute computer instructions. The code modules may be stored on any type of non-transitory computer-readable storage medium or tangible computer storage device, such as hard drives, solid state memory, flash memory, optical disc, and/or the like. The processes and algorithms may be implemented partially or wholly in application-specific circuitry. The results of the disclosed processes and process steps may be stored, persistently or otherwise, in any type of non-transitory computer storage such as, e.g., volatile or non-volatile storage.
  • The term “app”, as used in this disclosure, refers to both native apps and mobile cloud apps (and Web apps). Native apps can be installed directly on remote computing devices, whereby developers can create separate app versions for each type of remote computing device (e.g., iPhone devices and Android devices). Native apps may be stored on the remote computing device out of the box, or the native apps can be downloaded from a public or private app store and installed on the remote computing device. Self-driving vehicle data associated with native apps can be stored on the remote computing device and/or can be stored remotely and accessed by the native app. Some instances of apps may use Internet connectivity, while other instances may not; in some embodiments, apps can function without Internet connectivity.
  • Mobile cloud apps are very similar to Web-based apps. The main similarity is that both mobile cloud apps and Web apps run on servers external to the remote computing device and may require the use of a browser on the remote computing device to display and then use the app user interface (UI). Mobile cloud apps can be native apps rebuilt to run in the mobile cloud; custom apps developed for mobile devices; or third-party apps downloaded to the cloud from external sources. Some organizations offer both native and mobile cloud versions of their applications. In short, the term “app” refers to both native apps and mobile cloud apps.
  • None of the steps described herein is essential or indispensable. Any of the steps can be adjusted or modified. Other or additional steps can be used. Any portion of any of the steps, processes, structures, and/or devices disclosed or illustrated in one embodiment, flowchart, or example in this specification can be combined or used with or instead of any other portion of any of the steps, processes, structures, and/or devices disclosed or illustrated in a different embodiment, flowchart, or example. The embodiments and examples provided herein are not intended to be discrete and separate from each other.
  • The section headings and subheadings provided herein are nonlimiting. The section headings and subheadings do not represent or limit the full scope of the embodiments described in the sections to which the headings and subheadings pertain. For example, a section titled “Topic 1” may include embodiments that do not pertain to Topic 1 and embodiments described in other sections may apply to and be combined with embodiments described within the “Topic 1” section.
  • The various features and processes described above may be used independently of one another, or may be combined in various ways. All possible combinations and subcombinations are intended to fall within the scope of this disclosure. In addition, certain method, event, state, or process blocks may be omitted in some implementations. The methods, steps, and processes described herein are also not limited to any particular sequence, and the blocks, steps, or states relating thereto can be performed in other sequences that are appropriate. For example, described tasks or events may be performed in an order other than the order specifically disclosed. Multiple steps may be combined in a single block or state. The example tasks or events may be performed in serial, in parallel, or in some other manner. Tasks or events may be added to or removed from the disclosed example embodiments. The example systems and components described herein may be configured differently than described. For example, elements may be added to, removed from, or rearranged compared to the disclosed example embodiments.
  • Conditional language used herein, such as, among others, “can,” “could,” “might,” “may,” “e.g.,” and the like, unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements and/or steps. Thus, such conditional language is not generally intended to imply that features, elements and/or steps are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without author input or prompting, whether these features, elements and/or steps are included or are to be performed in any particular embodiment. The terms “comprising,” “including,” “having,” and the like are synonymous and are used inclusively, in an open-ended fashion, and do not exclude additional elements, features, acts, operations and so forth. Also, the term “or” is used in its inclusive sense (and not in its exclusive sense) so that when used, for example, to connect a list of elements, the term “or” means one, some, or all of the elements in the list. Conjunctive language such as the phrase “at least one of X, Y, and Z,” unless specifically stated otherwise, is otherwise understood with the context as used in general to convey that an item, term, etc. may be either X, Y, or Z. Thus, such conjunctive language is not generally intended to imply that certain embodiments require at least one of X, at least one of Y, and at least one of Z to each be present.
  • The term “and/or” means that “and” applies to some embodiments and “or” applies to some embodiments. Thus, A, B, and/or C can be replaced with A, B, and C written in one sentence and A, B, or C written in another sentence. A, B, and/or C means that some embodiments can include A and B, some embodiments can include A and C, some embodiments can include B and C, some embodiments can only include A, some embodiments can include only B, some embodiments can include only C, and some embodiments can include A, B, and C. The term “and/or” is used to avoid unnecessary redundancy.
  • While certain example embodiments have been described, these embodiments have been presented by way of example only, and are not intended to limit the scope of the inventions disclosed herein. Thus, nothing in the foregoing description is intended to imply that any particular feature, characteristic, step, module, or block is necessary or indispensable. Indeed, the novel methods and systems described herein may be embodied in a variety of other forms; furthermore, various omissions, substitutions, and changes in the form of the methods and systems described herein may be made without departing from the spirit of the inventions disclosed herein.
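  • The core seat-belt monitoring behavior described in the embodiments above (each vehicle's seat belt sensor sends a wireless report to a remote computer system, which maintains a per-rider buckling history and may fine the rider's account based on that history) can be sketched as follows. All class, method, and parameter names, and the flat per-event fine, are illustrative assumptions, not taken from the disclosure.

```python
# Illustrative sketch of a remote computer system that records seat belt
# reports from multiple self-driving vehicles and fines an account based
# on the rider's buckling history. Names and amounts are assumptions.
from collections import defaultdict


class BucklingHistoryService:
    def __init__(self) -> None:
        # rider id -> list of (vehicle_id, buckled) events
        self.history = defaultdict(list)

    def report_seat_belt_state(self, rider_id: str, vehicle_id: str,
                               buckled: bool) -> None:
        """Record one wireless report from a vehicle's seat belt sensor."""
        self.history[rider_id].append((vehicle_id, buckled))

    def unbuckled_count(self, rider_id: str) -> int:
        """Count unbuckled events in the rider's buckling history."""
        return sum(1 for _, buckled in self.history[rider_id] if not buckled)

    def fine_amount(self, rider_id: str, per_event_fine: float = 5.0) -> float:
        """Compute a fine for the rider's account from the history."""
        return self.unbuckled_count(rider_id) * per_event_fine
```

  • In this sketch, reports from a first and a second vehicle accumulate in the same rider's history, so the fine grows with repeated unbuckled events, mirroring the escalating monetary amounts recited in the claims below.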

Claims (26)

1. A seat-belt monitoring system comprising:
a first self-driving vehicle configured to transport a guest rider and comprising a first seat, a first seat belt configured to secure the guest rider in the first seat, and a first seat belt sensor configured to detect at least one of a buckled state of the first seat belt and an unbuckled state of the first seat belt;
a second self-driving vehicle configured to transport the guest rider and comprising a second seat, a second seat belt configured to secure the guest rider in the second seat, and a second seat belt sensor configured to detect at least one of a buckled state of the second seat belt and an unbuckled state of the second seat belt; and
a computer system having at least one computer and located remotely relative to the first self-driving vehicle,
wherein the first self-driving vehicle comprises first program instructions configured to send a first wireless communication to the computer system in response to the first seat belt sensor detecting the unbuckled state, the computer system comprises second program instructions, and the second self-driving vehicle comprises third program instructions configured to send a second wireless communication to the computer system in response to the second seat belt sensor detecting the unbuckled state of the second seat belt,
wherein the computer system comprises a memory having a buckling history of the guest rider, and the buckling history is at least partially based on the first wireless communication and the second wireless communication,
wherein the second program instructions of the computer system are configured to fine an account of the guest rider a first monetary amount in response to the computer system analyzing the buckling history.
2. (canceled)
3. (canceled)
4. (canceled)
5. The seat-belt monitoring system of claim 1, wherein the first wireless communication comprises an identification of the guest rider, and the identification is configured to enable the second program instructions to fine the account.
6. The seat-belt monitoring system of claim 1, wherein the second program instructions of the computer system are configured to fine the account a second monetary amount in response to the computer system receiving the first wireless communication, and the second program instructions are configured to fine the account a third monetary amount in response to the computer system receiving the second wireless communication.
7. The seat-belt monitoring system of claim 6, wherein the second program instructions are configured to make the third monetary amount larger than the second monetary amount in response to the computer system receiving the second wireless communication after receiving the first wireless communication.
8. The seat-belt monitoring system of claim 1, wherein the second program instructions are configured to determine a future transportation price for the guest rider at least partially based on the first wireless communication and the second wireless communication such that the future transportation price is at least partially based on the buckling history of the guest rider.
9. The seat-belt monitoring system of claim 8, wherein the second program instructions are configured to increase the future transportation price in response to the buckling history indicating the unbuckled state of the first seat belt and the unbuckled state of the second seat belt, wherein at least one of the first program instructions, the second program instructions, and the third program instructions is configured to charge the future transportation price to the account.
10. The seat-belt monitoring system of claim 1, wherein the second program instructions are configured to update the buckling history in response to the computer system receiving the first wireless communication and the second wireless communication,
wherein the computer system comprises fourth program instructions configured to deny a future ride request from the guest rider in response to the second program instructions updating the buckling history.
11. A seat-belt monitoring system comprising:
a first self-driving vehicle configured to transport a guest rider and comprising a first seat, a first seat belt configured to secure the guest rider in the first seat, and a first seat belt sensor configured to detect at least one of a first buckled state of the first seat belt and a first unbuckled state of the first seat belt;
a second self-driving vehicle configured to transport the guest rider and comprising a second seat, a second seat belt configured to secure the guest rider in the second seat, and a second seat belt sensor configured to detect at least one of a second buckled state of the second seat belt and a second unbuckled state of the second seat belt; and
a computer system comprising a memory having a buckling history of the guest rider, wherein the buckling history is based at least partially on a first communication from the first seat belt sensor and a second communication from the second seat belt sensor,
wherein the computer system comprises first program instructions configured to update a first ride priority indicator of the guest rider in response to the buckling history of the guest rider,
wherein the first program instructions are configured to prioritize a future ride request from the guest rider at least partially based on comparing the first ride priority indicator to a second ride priority indicator of another person who has requested transportation.
12. The seat-belt monitoring system of claim 1, wherein the first self-driving vehicle comprises a first camera and a first occupancy sensor configured to detect the guest rider sitting in the first seat, and the first self-driving vehicle comprises first program instructions configured to cause the first camera to take a first picture of the guest rider in the first seat, wherein the computer system comprises second program instructions,
wherein the second self-driving vehicle comprises a second camera and a second occupancy sensor configured to detect the guest rider sitting in the second seat, and the second self-driving vehicle comprises third program instructions configured to cause the second camera to take a second picture of the guest rider in the second seat,
wherein the second program instructions are configured to save the first picture and the second picture to the memory such that the buckling history of the guest rider comprises the first picture and the second picture.
13. The seat-belt monitoring system of claim 1, wherein the first self-driving vehicle comprises a camera and a first occupancy sensor configured to detect the guest rider sitting in the first seat, the first program instructions are configured to cause the camera to take a picture of the guest rider in the first seat, and the first program instructions are configured to send a third wireless communication comprising the picture to the computer system in response to the first occupancy sensor detecting the guest rider sitting in the first seat and in response to the first seat belt sensor detecting the unbuckled state.
14. The seat-belt monitoring system of claim 1, wherein the first self-driving vehicle comprises a camera and a first occupancy sensor configured to detect the guest rider sitting in the first seat, and the first program instructions are configured to cause the camera to take a picture of the guest rider in the first seat in response to the first occupancy sensor detecting the guest rider sitting in the first seat and in response to the first seat belt sensor detecting the unbuckled state.
15. The seat-belt monitoring system of claim 14, wherein the first program instructions are configured to send a third wireless communication comprising the picture to the computer system in response to the first occupancy sensor detecting the guest rider sitting in the first seat and in response to the first seat belt sensor detecting the unbuckled state.
16. The seat-belt monitoring system of claim 1, wherein the first self-driving vehicle comprises a first occupancy sensor configured to detect the guest rider sitting in the first seat, and
the second program instructions of the computer system are configured to receive an identification of the guest rider, are configured to receive from the first occupancy sensor an indication of the guest rider sitting in the first seat, and are configured to record that the guest rider is sitting in the first seat in response to receiving the identification and the indication.
17. The seat-belt monitoring system of claim 1, wherein the first self-driving vehicle comprises a first camera configured to take a first picture of the guest rider sitting in the first seat, the second self-driving vehicle comprises a second camera configured to take a second picture of the guest rider sitting in the second seat, and the seat-belt monitoring system comprises a facial recognition system configured to analyze the first picture to determine a first indicator of an identification of the guest rider and configured to analyze the second picture to determine a second indicator of the identification of the guest rider,
wherein the seat-belt monitoring system is configured to assign the first wireless communication and the second wireless communication to the buckling history of the guest rider in response to the facial recognition system determining the first indicator and the second indicator.
18. The seat-belt monitoring system of claim 1, wherein the first self-driving vehicle comprises a first weight sensor configured to detect first data indicative of a first weight of the guest rider sitting in the first seat of the first self-driving vehicle, and the first program instructions of the first self-driving vehicle are configured to send a third wireless communication comprising the first data to the computer system,
the second self-driving vehicle comprises a second weight sensor configured to detect second data indicative of a second weight of an unknown person sitting in the second seat of the second self-driving vehicle, and the third program instructions of the second self-driving vehicle are configured to send a fourth wireless communication comprising the second data to the computer system, and
the second program instructions of the computer system are configured to determine that the unknown person is the guest rider in response to comparing the second data to the first data.
19. (canceled)
20. (canceled)
21. (canceled)
22. The seat-belt monitoring system of claim 11, wherein the first self-driving vehicle comprises a first camera configured to take a first picture of the guest rider sitting in the first seat, the second self-driving vehicle comprises a second camera configured to take a second picture of the guest rider sitting in the second seat, and the seat-belt monitoring system comprises a facial recognition system configured to analyze the first picture to determine a first indicator of an identification of the guest rider and configured to analyze the second picture to determine a second indicator of the identification of the guest rider.
23. The seat-belt monitoring system of claim 22, wherein the first self-driving vehicle comprises second program instructions configured to send a third wireless communication to the computer system in response to the first seat belt sensor detecting the first unbuckled state, and the second self-driving vehicle comprises third program instructions configured to send a fourth wireless communication to the computer system in response to the second seat belt sensor detecting the second unbuckled state of the second seat belt, and
wherein the computer system comprises a memory having a buckling history of the guest rider, and the buckling history is at least partially based on the third wireless communication and the fourth wireless communication.
24. The seat-belt monitoring system of claim 23, wherein the seat-belt monitoring system is configured to assign the third wireless communication and the fourth wireless communication to the buckling history of the guest rider in response to the facial recognition system determining the first indicator and the second indicator.
25. The seat-belt monitoring system of claim 11, wherein the first self-driving vehicle comprises a first weight sensor configured to detect first data indicative of a first weight of the guest rider sitting in the first seat of the first self-driving vehicle, and the first self-driving vehicle comprises second program instructions configured to send a third wireless communication comprising the first data to the computer system, and
the second self-driving vehicle comprises a second weight sensor configured to detect second data indicative of a second weight of an unknown person sitting in the second seat of the second self-driving vehicle, and the second self-driving vehicle comprises third program instructions configured to send a fourth wireless communication comprising the second data to the computer system.
26. The seat-belt monitoring system of claim 25, wherein the computer system comprises fourth program instructions configured to determine that the unknown person is the guest rider in response to comparing the second data to the first data.
US16/528,100 2019-02-04 2019-07-31 Self-driving vehicle systems and methods Active US10744976B1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US16/528,100 US10744976B1 (en) 2019-02-04 2019-07-31 Self-driving vehicle systems and methods
US16/678,751 US11221622B2 (en) 2019-03-21 2019-11-08 Self-driving vehicle systems and methods
US16/678,660 US11221621B2 (en) 2019-03-21 2019-11-08 Self-driving vehicle systems and methods

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
US16/266,698 US10377342B1 (en) 2019-02-04 2019-02-04 Self-driving vehicle systems and methods
US201962823552P 2019-03-25 2019-03-25
US201962824941P 2019-03-27 2019-03-27
US201962836525P 2019-04-19 2019-04-19
US201962841785P 2019-05-01 2019-05-01
US16/528,100 US10744976B1 (en) 2019-02-04 2019-07-31 Self-driving vehicle systems and methods

Related Parent Applications (2)

Application Number Title Priority Date Filing Date
US16/266,698 Continuation-In-Part US10377342B1 (en) 2019-02-04 2019-02-04 Self-driving vehicle systems and methods
US16/524,110 Continuation-In-Part US10479319B1 (en) 2018-09-18 2019-07-28 Self-driving vehicle systems and methods

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/678,660 Continuation-In-Part US11221621B2 (en) 2018-10-01 2019-11-08 Self-driving vehicle systems and methods

Publications (2)

Publication Number Publication Date
US20200247357A1 true US20200247357A1 (en) 2020-08-06
US10744976B1 US10744976B1 (en) 2020-08-18

Family

ID=71837294

Family Applications (1)

Application Number Title Priority Date Filing Date
US16/528,100 Active US10744976B1 (en) 2019-02-04 2019-07-31 Self-driving vehicle systems and methods

Country Status (1)

Country Link
US (1) US10744976B1 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10901415B1 (en) * 2015-05-26 2021-01-26 Waymo Llc Non-passenger requests for autonomous vehicles
US20210237681A1 (en) * 2020-02-05 2021-08-05 Ford Global Technologies, Llc Locating systems and methods for wireless seat belt monitoring in vehicles with removable or reconfigurable seats
US11161500B1 (en) * 2019-05-13 2021-11-02 GM Cruise Holdings, LLC Advanced passenger safety for an autonomous vehicle
US11741400B1 (en) 2020-12-18 2023-08-29 Beijing Didi Infinity Technology And Development Co., Ltd. Machine learning-based real-time guest rider identification

Families Citing this family (1)

Publication number Priority date Publication date Assignee Title
US11073838B2 (en) * 2018-01-06 2021-07-27 Drivent Llc Self-driving vehicle systems and methods

Citations (15)

Publication number Priority date Publication date Assignee Title
KR20050017888A (en) * 2003-08-11 2005-02-23 현대자동차주식회사 Penalty system for unfastening seatbelt and method for controlling the same
US20090132128A1 (en) * 2007-11-20 2009-05-21 Tk Holdings Inc. Occupant monitoring and restraint status system
KR20090094569A (en) * 2008-03-03 2009-09-08 안진득 System for control in order to wear seat belt
US20090289443A1 (en) * 2008-05-23 2009-11-26 Okezie Pathfins C Vehicle safety apparatus and method
US20130214919A1 (en) * 2012-02-20 2013-08-22 Fred Bassali Seatbelt usage logging and alerting system
US20150206206A1 (en) * 2014-01-23 2015-07-23 Cox Enterprises, Inc. Systems and methods for flexible vehicle sharing
US9272713B1 (en) * 2013-06-24 2016-03-01 Imperium Technologies LLC Compliance device, system and method for machine operation
US20160216130A1 (en) * 2012-06-21 2016-07-28 Cellepathy Ltd. Enhanced navigation instruction
US20180018833A1 (en) * 2016-07-18 2018-01-18 Ford Global Technologies, Llc Vehicle Database Storage And Retrieval Methods And Systems
US20180075380A1 (en) * 2016-09-10 2018-03-15 Swiss Reinsurance Company Ltd. Automated, telematics-based system with score-driven triggering and operation of automated sharing economy risk-transfer systems and corresponding method thereof
US20180338241A1 (en) * 2013-10-04 2018-11-22 Sol Mingso Li Systems and methods for programming, controlling and monitoring wireless networks
US20190050787A1 (en) * 2018-01-03 2019-02-14 Intel Corporation Rider matching in ridesharing
US20190228654A1 (en) * 2018-01-22 2019-07-25 RPMAnetworks Holding System and method of two-way wireless communication for connected car vehicle
US20200117690A1 (en) * 2018-10-15 2020-04-16 Bao Tran Smart device
US20200168008A1 (en) * 2018-11-26 2020-05-28 Uber Technologies, Inc. Managing the operational state of a vehicle

Family Cites Families (200)

Publication number Priority date Publication date Assignee Title
US4212069A (en) 1976-08-31 1980-07-08 Baumann Dwight M Paratransit fare computation and dispatching method
US7660437B2 (en) 1992-05-05 2010-02-09 Automotive Technologies International, Inc. Neural network systems for vehicles
US6748318B1 (en) 1993-05-18 2004-06-08 Arrivalstar, Inc. Advanced notification systems and methods utilizing a computer network
JP3370218B2 (en) 1995-09-04 2003-01-27 アイシン精機株式会社 Door unlocking device in case of collision
US5945919A (en) 1996-05-30 1999-08-31 Trimble Navigation Limited Dispatcher free vehicle allocation system
JP3478028B2 (en) 1996-11-11 2003-12-10 トヨタ車体株式会社 Automatic opening and closing device for flip-up doors
US5871063A (en) 1997-01-22 1999-02-16 Automotive Systems Laboratory, Inc. Seat belt latch sensor system
US5798695A (en) 1997-04-02 1998-08-25 Northrop Grumman Corporation Impaired operator detection and warning system employing analysis of operator control actions
JPH1123458A (en) 1997-05-08 1999-01-29 Nittan Co Ltd Smoke sensor and monitoring control system
JP3415014B2 (en) 1997-12-26 2003-06-09 アスモ株式会社 Automatic switchgear
US7426429B2 (en) 1998-04-27 2008-09-16 Joseph A Tabe Smart seatbelt control system
US5960523A (en) 1998-08-25 1999-10-05 Breed Automotive Technology, Inc. Seat belt buckle sensor
US6530251B1 (en) 1999-11-18 2003-03-11 Strattec Security Corporation Modular vehicle door lock and latch system and method
AU2001243285A1 (en) 2000-03-02 2001-09-12 Donnelly Corporation Video mirror systems incorporating an accessory module
US8706542B2 (en) 2000-12-18 2014-04-22 Apple Inc. Allocation of location-based orders to mobile agents
DE10110373C2 (en) 2001-03-03 2003-03-06 Wolfgang Daum Method and device for cleaning the interior of automobiles
US7262790B2 (en) 2002-01-09 2007-08-28 Charles Adams Bakewell Mobile enforcement platform with aimable violation identification and documentation system for multiple traffic violation types across all lanes in moving traffic, generating composite display images and data to support citation generation, homeland security, and monitoring
JP2004054444A (en) 2002-07-17 2004-02-19 Omron Corp Operation service information mediating system
JP2004149110A (en) 2002-10-09 2004-05-27 Nissan Motor Co Ltd Accelerator pedal device
GB0302886D0 (en) 2003-02-07 2003-03-12 Faith Jonathan D Transportation ordering system
US7119716B2 (en) 2003-05-28 2006-10-10 Legalview Assets, Limited Response systems and methods for notification systems for modifying future notifications
JP4075742B2 (en) 2003-08-29 2008-04-16 三菱自動車工業株式会社 Seat belt non-wear warning device
US20070096447A1 (en) 2003-10-07 2007-05-03 Tabe Joseph A Smart seatbelt control system
WO2006092849A1 (en) 2005-03-01 2006-09-08 Fujitsu Limited Magnetoresistive element and magnetic memory
US7413357B2 (en) 2005-06-13 2008-08-19 Silverstate Safety Image Concealed camera
AU2006306523B2 (en) 2005-10-21 2011-05-19 Deere & Company Systems and methods for switching between autonomous and manual operation of a vehicle
US9459622B2 (en) 2007-01-12 2016-10-04 Legalforce, Inc. Driverless vehicle commerce network and community
US20160027307A1 (en) 2005-12-23 2016-01-28 Raj V. Abhyanker Short-term automobile rentals in a geo-spatial environment
US20140172727A1 (en) 2005-12-23 2014-06-19 Raj V. Abhyanker Short-term automobile rentals in a geo-spatial environment
US7777619B2 (en) 2007-04-11 2010-08-17 Ford Global Technologies, Llc System and method for implementing active safety counter measures for an impaired driver
US7937202B2 (en) 2007-05-01 2011-05-03 Ronald Van Houten Seat belt/accelerator behavioral system
US20080312782A1 (en) 2007-06-15 2008-12-18 Gene Berdichevsky Electric vehicle communication interface
US9848447B2 (en) 2007-06-27 2017-12-19 Ford Global Technologies, Llc Method and system for emergency notification
US8180379B2 (en) 2007-06-28 2012-05-15 Apple Inc. Synchronizing mobile and vehicle devices
US20090140886A1 (en) 2007-12-03 2009-06-04 International Truck Intellectual Property Company, Llc Multiple geofence system for vehicles
US8374743B2 (en) 2008-05-16 2013-02-12 GM Global Technology Operations LLC Method and apparatus for driver control of a limited-ability autonomous vehicle
JP4582205B2 (en) 2008-06-12 2010-11-17 トヨタ自動車株式会社 Electric vehicle
US7999701B1 (en) 2008-06-26 2011-08-16 Bin Xu Transportation notification system
US8018329B2 (en) 2008-12-12 2011-09-13 Gordon * Howard Associates, Inc. Automated geo-fence boundary configuration and activation
US8285611B2 (en) 2008-12-31 2012-10-09 Fuller Max L Method for in-cab driver operation
US8892299B2 (en) 2009-10-05 2014-11-18 Tesla Motors, Inc. Vehicle user interface with proximity activation
US20130246301A1 (en) 2009-12-04 2013-09-19 Uber Technologies, Inc. Providing user feedback for transport services through use of mobile devices
US9230292B2 (en) 2012-11-08 2016-01-05 Uber Technologies, Inc. Providing on-demand services through use of portable computing devices
EP2507753A4 (en) 2009-12-04 2013-10-30 Uber Technologies Inc System and method for arranging transport amongst parties through use of mobile devices
US8430192B2 (en) 2010-01-04 2013-04-30 Carla R. Gillett Robotic omniwheel vehicle
US8346426B1 (en) 2010-04-28 2013-01-01 Google Inc. User interface for displaying internal state of autonomous driving system
US8260482B1 (en) 2010-04-28 2012-09-04 Google Inc. User interface for displaying internal state of autonomous driving system
US8836491B2 (en) 2010-04-29 2014-09-16 Ford Global Technologies, Llc Occupant detection
US20120009845A1 (en) 2010-07-07 2012-01-12 Juniper Holding Corp. Configurable location-aware toy capable of communicating with like toys and associated system infrastructure for communicating with such toys
US8509982B2 (en) 2010-10-05 2013-08-13 Google Inc. Zone driving
US8874773B2 (en) 2010-11-30 2014-10-28 Gary W. Grube Obtaining group and individual emergency preparedness communication information
US9542847B2 (en) 2011-02-16 2017-01-10 Toyota Motor Engineering & Manufacturing North America, Inc. Lane departure warning/assistance method and system having a threshold adjusted based on driver impairment determination using pupil size and driving patterns
GB201106555D0 (en) 2011-04-19 2011-06-01 Tomtom Int Bv Taxi dispatching system
US8686844B1 (en) 2011-06-29 2014-04-01 Intellectual Ventures Fund 79 Llc Methods, devices, and mediums associated with risk management of vehicle operation
CN103857987B (en) 2011-07-05 2016-08-24 丰田自动车株式会社 Recommendation information provides system
US8817761B2 (en) 2011-08-29 2014-08-26 Randal Gruberman System and method for remotely controlling features of wireless mobile devices
US20130085817A1 (en) 2011-09-29 2013-04-04 Michael Collins Pinkus Discount offer system and method for use with for hire vehicles
WO2013093933A1 (en) 2011-09-29 2013-06-27 Tata Consultancy Services Limited Rogue vehicle detection
US9786009B2 (en) 2011-11-29 2017-10-10 Hartford Fire Insurance Company System and method for administering a telematics-enabled test drive dealer program
US20130197674A1 (en) 2012-01-30 2013-08-01 Apple Inc. Automatic configuration of self-configurable environments
US20140350855A1 (en) 2012-02-28 2014-11-27 Google Inc. Systems and Methods for Providing Navigational Assistance to Reserved Parking Locations
US9429943B2 (en) 2012-03-05 2016-08-30 Florida A&M University Artificial intelligence valet systems and methods
US8718861B1 (en) 2012-04-11 2014-05-06 Google Inc. Determining when to drive autonomously
US8700251B1 (en) 2012-04-13 2014-04-15 Google Inc. System and method for automatically detecting key behaviors by vehicles
US10515489B2 (en) 2012-05-23 2019-12-24 Enterprise Holdings, Inc. Rental/car-share vehicle access and management system and method
US9139133B2 (en) 2012-05-31 2015-09-22 GM Global Technology Operations LLC Vehicle collision warning system and method
US8433934B1 (en) 2012-06-28 2013-04-30 Google Inc. Saving battery on wireless connections on mobile devices using internal motion detection
US8836788B2 (en) 2012-08-06 2014-09-16 Cloudparc, Inc. Controlling use of parking spaces and restricted locations using multiple cameras
US10055694B2 (en) 2012-08-07 2018-08-21 Hitachi, Ltd. Use-assisting tool for autonomous mobile device, operation management center, operation system, and autonomous mobile device
US9120485B1 (en) 2012-09-14 2015-09-01 Google Inc. Methods and systems for smooth trajectory generation for a self-driving vehicle
US9424728B2 (en) 2012-09-24 2016-08-23 Amy Rambadt Child safety seat mobile alarm and method therefor
US9196164B1 (en) 2012-09-27 2015-11-24 Google Inc. Pedestrian notifications
US8949016B1 (en) 2012-09-28 2015-02-03 Google Inc. Systems and methods for determining whether a driving environment has changed
US9686306B2 (en) 2012-11-02 2017-06-20 University Of Washington Through Its Center For Commercialization Using supplemental encrypted signals to mitigate man-in-the-middle attacks on teleoperated systems
US9026300B2 (en) 2012-11-06 2015-05-05 Google Inc. Methods and systems to aid autonomous vehicles driving through a lane merge
US8825258B2 (en) 2012-11-30 2014-09-02 Google Inc. Engaging and disengaging for autonomous driving
GB201300006D0 (en) 2013-01-01 2013-02-13 Tomtom Dev Germany Gmbh Vehicle management system
US8948993B2 (en) 2013-03-08 2015-02-03 Richard Schulman Method and system for controlling the behavior of an occupant of a vehicle
US9075415B2 (en) 2013-03-11 2015-07-07 Airphrame, Inc. Unmanned aerial vehicle and methods for controlling same
WO2014152254A2 (en) 2013-03-15 2014-09-25 Carnegie Robotics Llc Methods, systems, and apparatus for multi-sensory stereo vision for robotics
US8849494B1 (en) 2013-03-15 2014-09-30 Google Inc. Data selection by an autonomous vehicle for trajectory modification
US9008890B1 (en) 2013-03-15 2015-04-14 Google Inc. Augmented trajectories for autonomous vehicles
US9721285B2 (en) 2013-03-15 2017-08-01 Home Depot Product Authority, Llc Facilitation of authorized in-store pickup in conjunction with online ordering
US8996224B1 (en) 2013-03-15 2015-03-31 Google Inc. Detecting that an autonomous vehicle is in a stuck condition
US9632210B2 (en) 2013-05-07 2017-04-25 Google Inc. Methods and systems for detecting weather conditions using vehicle onboard sensors
US9119038B2 (en) 2013-05-21 2015-08-25 Yopima Llc Systems and methods for comparative geofencing
US9019107B2 (en) 2013-06-19 2015-04-28 GM Global Technology Operations LLC Methods and apparatus for detection and reporting of vehicle operator impairment
US20150012833A1 (en) 2013-07-02 2015-01-08 Fortis Riders Corporation Mobile application using gestures to facilitate communication
US11488241B2 (en) 2013-07-26 2022-11-01 U-Haul International, Inc. Method and apparatus for mobile rental of vehicles
US20150066284A1 (en) 2013-09-05 2015-03-05 Ford Global Technologies, Llc Autonomous vehicle control for impaired driver
US9958289B2 (en) 2013-09-26 2018-05-01 Google Llc Controlling navigation software on a portable device from the head unit of a vehicle
US20150120504A1 (en) 2013-10-25 2015-04-30 Michael Todasco Systems and methods for completion of item delivery and transactions using a mobile beacon
WO2015077745A1 (en) 2013-11-25 2015-05-28 Agco Corporation Dynamic cooperative geofence
WO2015099679A1 (en) 2013-12-23 2015-07-02 Intel Corporation In-vehicle authorization for autonomous vehicles
US9984574B2 (en) 2014-01-21 2018-05-29 Tribal Rides, Inc. Method and system for anticipatory deployment of autonomously controlled vehicles
US20150248689A1 (en) 2014-03-03 2015-09-03 Sunil Paul Systems and methods for providing transportation discounts
US10692370B2 (en) 2014-03-03 2020-06-23 Inrix, Inc. Traffic obstruction detection
US9646326B2 (en) 2014-03-13 2017-05-09 Gary Goralnick Advertising-integrated car
US10176517B2 (en) 2014-03-13 2019-01-08 Gary Goralnick Advertising-integrated car
US9960986B2 (en) 2014-03-19 2018-05-01 Uber Technologies, Inc. Providing notifications to devices based on real-time conditions related to an on-demand service
US20160071056A1 (en) 2014-03-21 2016-03-10 United Parcel Service Of America, Inc. Programmatically executing time compressed delivery
CN105094767B (en) 2014-05-06 2019-02-12 华为技术有限公司 Automatic driving vehicle dispatching method, vehicle scheduling server and automatic driving vehicle
US9631933B1 (en) 2014-05-23 2017-04-25 Google Inc. Specifying unavailable locations for autonomous vehicles
US10424036B2 (en) 2014-06-02 2019-09-24 Uber Technologies, Inc. Maintaining data for use with a transport service during connectivity loss between systems
US9774410B2 (en) 2014-06-10 2017-09-26 PB, Inc. Radiobeacon data sharing by forwarding low energy transmissions to a cloud host
CA2956063A1 (en) 2014-07-22 2016-01-28 Lyft, Inc. Ride chaining
US11107019B2 (en) 2014-07-30 2021-08-31 Uber Technologies, Inc. Arranging a transport service for multiple users
BR112017002174A2 (en) 2014-08-04 2017-11-21 Uber Technologies Inc determination and provision of predetermined location data points to service providers
US9365186B2 (en) 2014-08-17 2016-06-14 Toyota Motor Engineering & Manufacturing North America, Inc. Advanced seatbelt interlock using video recognition
CA2905904A1 (en) 2014-09-25 2016-03-25 2435603 Ontario Inc. Roving vehicle rental system and method
US9377315B2 (en) 2014-10-22 2016-06-28 Myine Electronics, Inc. System and method to provide valet instructions for a self-driving vehicle
US9290174B1 (en) 2014-10-23 2016-03-22 GM Global Technology Operations LLC Method and system for mitigating the effects of an impaired driver
US9547985B2 (en) 2014-11-05 2017-01-17 Here Global B.V. Method and apparatus for providing access to autonomous vehicles based on user context
US9358953B2 (en) 2014-11-07 2016-06-07 Ford Global Technologies, Llc Seat belt presenter fault indication
US9747655B2 (en) 2014-11-18 2017-08-29 William Michael Smith Emergency service provision with destination-specific information
CN113654561A (en) 2014-12-05 2021-11-16 苹果公司 Autonomous navigation system
US11187544B2 (en) 2014-12-30 2021-11-30 Ebay Inc. Determining and dispatching a ride-share vehicle
US9625906B2 (en) 2015-01-15 2017-04-18 Nissan North America, Inc. Passenger docking location selection
GB201503079D0 (en) 2015-02-24 2015-04-08 Addison Lee Ltd Managing a vehicle sharing facility
GB2535718A (en) 2015-02-24 2016-08-31 Addison Lee Ltd Resource management
US20160247106A1 (en) 2015-02-24 2016-08-25 Siemens Aktiengesellschaft Managing a fleet of autonomous electric vehicles for on-demand transportation and ancillary services to electrical grid
US9514623B1 (en) 2015-05-15 2016-12-06 Google Inc. Smoke detector chamber architecture and related methods using two different wavelengths of light
US9196141B1 (en) 2015-05-15 2015-11-24 Google, Inc. Smoke detector chamber
GB2535246B (en) 2015-05-19 2019-04-17 Ford Global Tech Llc A method and system for increasing driver awareness
US20160342934A1 (en) 2015-05-22 2016-11-24 Peter Michalik System and process for communicating between a drone and a handheld device
US10031522B2 (en) 2015-05-27 2018-07-24 Dov Moran Alerting predicted accidents between driverless cars
US10200824B2 (en) 2015-05-27 2019-02-05 Apple Inc. Systems and methods for proactively identifying and surfacing relevant content on a touch-sensitive device
US9841762B2 (en) 2015-05-27 2017-12-12 Comigo Ltd. Alerting predicted accidents between driverless cars
US20160364823A1 (en) 2015-06-11 2016-12-15 Raymond Cao Systems and methods for on-demand transportation
US9904900B2 (en) 2015-06-11 2018-02-27 Bao Tran Systems and methods for on-demand transportation
US20160364812A1 (en) 2015-06-11 2016-12-15 Raymond Cao Systems and methods for on-demand transportation
US9551586B2 (en) 2015-06-12 2017-01-24 Uber Technologies, Inc. System and method for providing contextual information for a location
US9733096B2 (en) 2015-06-22 2017-08-15 Waymo Llc Determining pickup and destination locations for autonomous vehicles
US9562785B1 (en) 2015-07-20 2017-02-07 Via Transportation, Inc. Continuously updatable computer-generated routes with continuously configurable virtual bus stops for passenger ride-sharing of a fleet of ride-sharing vehicles and computer transportation systems and computer-implemented methods for use thereof
US10067988B2 (en) 2015-07-21 2018-09-04 Uber Technologies, Inc. User-based content filtering and ranking to facilitate on-demand services
US9527217B1 (en) 2015-07-27 2016-12-27 Westfield Labs Corporation Robotic systems and methods
US9805605B2 (en) 2015-08-12 2017-10-31 Madhusoodhan Ramanujam Using autonomous vehicles in a taxi service
US10023231B2 (en) 2015-08-12 2018-07-17 Madhusoodhan Ramanujam Parking autonomous vehicles
US10430744B2 (en) 2015-08-21 2019-10-01 Autodesk, Inc. Robot service platform
US10139828B2 (en) 2015-09-24 2018-11-27 Uber Technologies, Inc. Autonomous vehicle operated with safety augmentation
US9754338B2 (en) 2015-10-09 2017-09-05 Gt Gettaxi Limited System to facilitate a correct identification of a service provider
US10115029B1 (en) 2015-10-13 2018-10-30 Ambarella, Inc. Automobile video camera for the detection of children, people or pets left in a vehicle
EP3368860A4 (en) 2015-10-30 2019-11-27 Zemcar, Inc. Rules-based ride security
US9916703B2 (en) 2015-11-04 2018-03-13 Zoox, Inc. Calibration for autonomous vehicle operation
US10467561B2 (en) 2015-11-05 2019-11-05 Gt Gettaxi Limited System for identifying events and preemptively navigating drivers to transport passengers from the events
US9663032B1 (en) 2015-11-09 2017-05-30 Ford Global Technologies, Llc Child seat monitoring system and method
AU2016355605B2 (en) 2015-11-20 2021-08-19 Uber Technologies, Inc. Controlling autonomous vehicles in connection with transport services
US10005563B2 (en) 2015-11-23 2018-06-26 The Boeing Company Methods and apparatus for determining seatbelt status
US10685297B2 (en) 2015-11-23 2020-06-16 Google Llc Automatic booking of transportation based on context of a user of a computing device
US10036642B2 (en) 2015-12-08 2018-07-31 Uber Technologies, Inc. Automated vehicle communications system
US10050760B2 (en) 2015-12-08 2018-08-14 Uber Technologies, Inc. Backend communications system for a fleet of autonomous vehicles
US10685416B2 (en) 2015-12-10 2020-06-16 Uber Technologies, Inc. Suggested pickup location for ride services
US20170213165A1 (en) 2016-01-26 2017-07-27 GM Global Technology Operations LLC Systems and methods for vehicle ride safety and security of person and property
CN108475466B (en) 2016-01-27 2022-07-12 北京嘀嘀无限科技发展有限公司 System and method for matching and displaying service requests and available vehicles
US10049566B2 (en) 2016-03-02 2018-08-14 Michael E. Shanahan Systems and methods for intra-vehicle pedestrian and infrastructure communication
JP6668497B2 (en) 2016-03-17 2020-03-18 スイス リインシュランス カンパニー リミテッド Telematics system and corresponding method
US9836057B2 (en) 2016-03-24 2017-12-05 Waymo Llc Arranging passenger pickups for autonomous vehicles
US9896096B2 (en) 2016-04-11 2018-02-20 David E. Newman Systems and methods for hazard mitigation
US9429947B1 (en) 2016-04-14 2016-08-30 Eric John Wengreen Self-driving vehicle systems and methods
US20170300053A1 (en) 2016-04-14 2017-10-19 Eric John Wengreen Self-driving vehicle systems and methods
US10255648B2 (en) 2016-04-14 2019-04-09 Eric John Wengreen Self-driving vehicle systems and methods
US10529221B2 (en) 2016-04-19 2020-01-07 Navio International, Inc. Modular approach for smart and customizable security solutions and other applications for a smart city
JP2017198633A (en) 2016-04-28 2017-11-02 本田技研工業株式会社 Vehicle control system, vehicle control method, and vehicle control program
US20170316533A1 (en) 2016-04-29 2017-11-02 GM Global Technology Operations LLC Personal safety and privacy features for passengers of an autonomous vehicle based transportation system
US20170316516A1 (en) 2016-04-29 2017-11-02 GM Global Technology Operations LLC Systems and methods for managing a social autonomous taxi service
US20170327082A1 (en) 2016-05-12 2017-11-16 GM Global Technology Operations LLC End-to-end accommodation functionality for passengers of fully autonomous shared or taxi-service vehicles
US10331964B2 (en) 2016-05-23 2019-06-25 Panasonic Automotive Systems Company Of America, Division Of Panasonic Corporation Of North America Trunk inventory detector
SG11201810381QA (en) 2016-05-27 2018-12-28 Uber Technologies Inc Facilitating rider pick-up for a self-driving vehicle
US10002515B2 (en) 2016-06-01 2018-06-19 Tile, Inc. User intervention based on tracking device location
DK201670622A1 (en) 2016-06-12 2018-02-12 Apple Inc User interfaces for transactions
WO2017223031A1 (en) 2016-06-21 2017-12-28 Via Transportation, Inc. Systems and methods for vehicle ridesharing management
CN109416873B (en) 2016-06-24 2022-02-15 瑞士再保险有限公司 Autonomous or partially autonomous motor vehicle with automated risk control system and corresponding method
US10438493B2 (en) 2016-08-24 2019-10-08 Uber Technologies, Inc. Hybrid trip planning for autonomous vehicles
US20180060778A1 (en) 2016-08-31 2018-03-01 Uber Technologies, Inc. Driver location prediction for a transportation service
US20180075565A1 (en) 2016-09-13 2018-03-15 Ford Global Technologies, Llc Passenger validation systems and methods
US20180096601A1 (en) 2016-10-04 2018-04-05 Toyota Motor Engineering & Manufacturing North America, Inc. Collision alert system
US20180115924A1 (en) 2016-10-20 2018-04-26 Nokia Solutions And Networks Oy Dynamic Exchange Of Wireless Communication Services
WO2018080650A2 (en) 2016-10-25 2018-05-03 725-1 Corporation Video-based data collection, image capture and analysis configuration
US10479328B2 (en) 2016-11-04 2019-11-19 Ford Global Technologies, Llc System and methods for assessing the interior of an autonomous vehicle
US10726640B2 (en) 2016-11-15 2020-07-28 At&T Mobility Ii Llc Facilitation of smart communications hub to support driverless vehicles in 5G networks or other next generation networks
US20180156625A1 (en) 2016-12-06 2018-06-07 Delphi Technologies, Inc. Automated-vehicle pickup-location evaluation system
US20180157268A1 (en) 2016-12-06 2018-06-07 Delphi Technologies, Inc. Taxi client identification for automated vehicles
US10818188B2 (en) 2016-12-13 2020-10-27 Direct Current Capital LLC Method for dispatching a vehicle to a user's location
US10924376B2 (en) 2016-12-30 2021-02-16 Google Llc Selective sensor polling
US20180196417A1 (en) 2017-01-09 2018-07-12 nuTonomy Inc. Location Signaling with Respect to an Autonomous Vehicle and a Rider
US20180211540A1 (en) 2017-01-23 2018-07-26 Delphi Technologies, Inc. Automated vehicle transportation system for multiple-segment ground-transportation
US10677602B2 (en) 2017-01-25 2020-06-09 Via Transportation, Inc. Detecting the number of vehicle passengers
US10290158B2 (en) 2017-02-03 2019-05-14 Ford Global Technologies, Llc System and method for assessing the interior of an autonomous vehicle
US9953539B1 (en) 2017-03-28 2018-04-24 Nec Corporation Method and system for providing demand-responsive dispatching of a fleet of transportation vehicles, and a mobility-activity processing module for providing a mobility trace database
US9769616B1 (en) 2017-04-04 2017-09-19 Lyft, Inc. Geohash-related location predictions
US10458802B2 (en) 2017-06-13 2019-10-29 Gt Gettaxi Limited System and method for navigating drivers to dynamically selected drop-off locations for shared rides
KR102263395B1 (en) 2017-07-25 2021-06-11 삼성전자주식회사 Electronic device for identifying external vehicle changing identification based on data associated with movement of external vehicle
US10127795B1 (en) 2017-12-31 2018-11-13 Lyft, Inc. Detecting and handling material left in vehicles by transportation requestors
US10299216B1 (en) 2018-01-06 2019-05-21 Eric John Wengreen Self-driving vehicle actions in response to a low battery
US10274950B1 (en) 2018-01-06 2019-04-30 Drivent Technologies Inc. Self-driving vehicle systems and methods
US10282625B1 (en) 2018-10-01 2019-05-07 Eric John Wengreen Self-driving vehicle systems and methods
US10289922B1 (en) 2018-09-18 2019-05-14 Eric John Wengreen System for managing lost, mislaid, or abandoned property in a self-driving vehicle
US10223844B1 (en) 2018-09-18 2019-03-05 Wesley Edward Schwie Self-driving vehicle systems and methods
US10240938B1 (en) 2018-10-22 2019-03-26 Drivent Technologies Inc. Self-driving vehicle systems and methods
US10286908B1 (en) 2018-11-01 2019-05-14 Eric John Wengreen Self-driving vehicle systems and methods

Patent Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20050017888A (en) * 2003-08-11 2005-02-23 현대자동차주식회사 Penalty system for unfastening seatbelt and method for controlling the same
US20090132128A1 (en) * 2007-11-20 2009-05-21 TK Holdings Inc. Occupant monitoring and restraint status system
KR20090094569A (en) * 2008-03-03 2009-09-08 안진득 System for control in order to wear seat belt
US20090289443A1 (en) * 2008-05-23 2009-11-26 Okezie Pathfins C Vehicle safety apparatus and method
US20130214919A1 (en) * 2012-02-20 2013-08-22 Fred Bassali Seatbelt usage logging and alerting system
US20160216130A1 (en) * 2012-06-21 2016-07-28 Cellepathy Ltd. Enhanced navigation instruction
US9272713B1 (en) * 2013-06-24 2016-03-01 Imperium Technologies LLC Compliance device, system and method for machine operation
US20180338241A1 (en) * 2013-10-04 2018-11-22 Sol Mingso Li Systems and methods for programming, controlling and monitoring wireless networks
US20150206206A1 (en) * 2014-01-23 2015-07-23 Cox Enterprises, Inc. Systems and methods for flexible vehicle sharing
US20180018833A1 (en) * 2016-07-18 2018-01-18 Ford Global Technologies, Llc Vehicle Database Storage And Retrieval Methods And Systems
US20180075380A1 (en) * 2016-09-10 2018-03-15 Swiss Reinsurance Company Ltd. Automated, telematics-based system with score-driven triggering and operation of automated sharing economy risk-transfer systems and corresponding method thereof
US20190050787A1 (en) * 2018-01-03 2019-02-14 Intel Corporation Rider matching in ridesharing
US20190228654A1 (en) * 2018-01-22 2019-07-25 RPMAnetworks Holding System and method of two-way wireless communication for connected car vehicle
US20200117690A1 (en) * 2018-10-15 2020-04-16 Bao Tran Smart device
US20200168008A1 (en) * 2018-11-26 2020-05-28 Uber Technologies, Inc. Managing the operational state of a vehicle

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10901415B1 (en) * 2015-05-26 2021-01-26 Waymo Llc Non-passenger requests for autonomous vehicles
US11947353B1 (en) 2015-05-26 2024-04-02 Waymo Llc Non-passenger requests for autonomous vehicles
US11161500B1 (en) * 2019-05-13 2021-11-02 GM Cruise Holdings, LLC Advanced passenger safety for an autonomous vehicle
US11724690B2 (en) 2019-05-13 2023-08-15 GM Cruise Holdings LLC Advanced passenger safety for an autonomous vehicle
US20210237681A1 (en) * 2020-02-05 2021-08-05 Ford Global Technologies, Llc Locating systems and methods for wireless seat belt monitoring in vehicles with removable or reconfigurable seats
US11117546B2 (en) * 2020-02-05 2021-09-14 Ford Global Technologies, Llc Locating systems and methods for wireless seat belt monitoring in vehicles with removable or reconfigurable seats
US11741400B1 (en) 2020-12-18 2023-08-29 Beijing Didi Infinity Technology And Development Co., Ltd. Machine learning-based real-time guest rider identification

Also Published As

Publication number Publication date
US10744976B1 (en) 2020-08-18

Similar Documents

Publication Publication Date Title
US10377342B1 (en) Self-driving vehicle systems and methods
US10282625B1 (en) Self-driving vehicle systems and methods
US10744976B1 (en) Self-driving vehicle systems and methods
US10471804B1 (en) Self-driving vehicle systems and methods
US10493952B1 (en) Self-driving vehicle systems and methods
US10286908B1 (en) Self-driving vehicle systems and methods
US10915100B2 (en) Control system for vehicle
US11731636B2 (en) System and method to determine responsiveness of a driver of a vehicle to feedback regarding driving behaviors
US10051411B2 (en) Method and system for guiding a person to a location
US11221622B2 (en) Self-driving vehicle systems and methods
US8260537B2 (en) Method for modifying an existing vehicle on a retrofit basis to integrate the vehicle into an information exchange system
US8892271B2 (en) Information Transmittal Techniques for Vehicles
US8060308B2 (en) Weather monitoring techniques
US7983802B2 (en) Vehicular environment scanning techniques
US7899616B2 (en) Method for obtaining information about objects outside of a vehicle
US10303181B1 (en) Self-driving vehicle systems and methods
US7647180B2 (en) Vehicular intersection management techniques
US10481606B1 (en) Self-driving vehicle systems and methods
US10479319B1 (en) Self-driving vehicle systems and methods
US20090043506A1 (en) Method and System for Controlling Timing of Vehicle Transmissions
US11644833B2 (en) Self-driving vehicle systems and methods
US10832569B2 (en) Vehicle detection systems
CN116472213A (en) Integral road-finding
US20230092933A1 (en) Systems and methods for detecting an environment external to a personal mobile vehicle in a fleet management system
WO2020061028A1 (en) Self-driving vehicle systems and methods

Legal Events

Date Code Title Description
FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO SMALL (ORIGINAL EVENT CODE: SMAL); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

AS Assignment

Owner name: DRIVENT LLC, WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:WENGREEN, ERIC JOHN;SCHWIE, WESLEY EDWARD;REEL/FRAME:050121/0140

Effective date: 20190821

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YR, SMALL ENTITY (ORIGINAL EVENT CODE: M2551); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

Year of fee payment: 4