US20230251659A1 - Systems and methods for autonomous vehicles
- Publication number
- US20230251659A1 US20230251659A1 US18/301,211 US202318301211A US2023251659A1 US 20230251659 A1 US20230251659 A1 US 20230251659A1 US 202318301211 A US202318301211 A US 202318301211A US 2023251659 A1 US2023251659 A1 US 2023251659A1
- Authority
- US
- United States
- Prior art keywords
- vehicle
- occupant
- control system
- screen
- cabin
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
- G05D1/0088—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot characterized by the autonomous decision making process, e.g. artificial intelligence, predefined behaviours
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W60/00—Drive control systems specially adapted for autonomous road vehicles
- B60W60/001—Planning or execution of driving tasks
- B60W60/0015—Planning or execution of driving tasks specially adapted for safety
- B60W60/0016—Planning or execution of driving tasks specially adapted for safety of the vehicle or its occupants
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/08—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W60/00—Drive control systems specially adapted for autonomous road vehicles
- B60W60/001—Planning or execution of driving tasks
- B60W60/0013—Planning or execution of driving tasks specially adapted for occupant comfort
- B60W60/00133—Planning or execution of driving tasks specially adapted for occupant comfort for resting
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/3407—Route searching; Route guidance specially adapted for specific applications
- G01C21/3415—Dynamic re-routing, e.g. recalculating the route when the user deviates from calculated route or after detecting real-time traffic data or accidents
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3605—Destination input or retrieval
- G01C21/3617—Destination input or retrieval using user history, behaviour, conditions or preferences, e.g. predicted or inferred from previous use or current movement
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/59—Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
- G06V20/597—Recognising the driver's state or behaviour, e.g. attention or drowsiness
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07C—TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
- G07C5/00—Registering or indicating the working of vehicles
- G07C5/008—Registering or indicating the working of vehicles communicating information to a remotely located station
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/08—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
- B60W2040/0818—Inactivity or incapacity of driver
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/08—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
- B60W2040/0872—Driver physiology
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2540/00—Input parameters relating to occupants
- B60W2540/041—Potential occupants
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2540/00—Input parameters relating to occupants
- B60W2540/221—Physiology, e.g. weight, heartbeat, health or special needs
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D2201/00—Application
- G05D2201/02—Control of position of land vehicles
- G05D2201/0213—Road vehicle, e.g. car or truck
Definitions
- This disclosure generally relates to autonomous vehicles (e.g., autonomous automobiles, trucks and other road vehicles).
- Autonomous vehicles (sometimes referred to as a “self-driving” or “driverless” vehicles), such as autonomous automobiles, trucks and other road vehicles, are operable without human control, including by steering, accelerating, and decelerating (e.g., braking) autonomously without human control, to travel to a destination.
- While autonomous vehicles can provide many benefits, such as increased safety, reduced traffic, and more free time, issues or opportunities may also arise in respect of autonomous vehicles that do not arise with, or are less relevant for, conventional vehicles driven by human drivers.
- an autonomous vehicle may be configured to facilitate its use and/or enhance what it and/or its occupants can do with it, such as, for example, by:
  - autonomously acting based on events within the autonomous vehicle, including by autonomously rerouting itself, altering a cabin of the autonomous vehicle, notifying a third party external to the autonomous vehicle, stopping (e.g., parking) the autonomous vehicle, altering how the autonomous vehicle drives itself, and/or performing other actions based on what and/or how an occupant is doing, an emergency, or another event occurring in the cabin of the autonomous vehicle;
  - autonomously acting based on interactions with (e.g., gestures of) humans external to the autonomous vehicle (e.g., police officers, school-crossing guards, traffic guards at roadwork or other temporary traffic control sites, drivers of other vehicles, etc.);
  - autonomously acting based on indicators placed at particular locations (e.g., drive-through establishments, potholes, parking spots, etc.); and
  - facilitating acts of occupants in the cabin of the autonomous vehicle.
- this disclosure relates to an autonomous vehicle comprising a control system configured to cause the autonomous vehicle to autonomously perform an action based on an event within the autonomous vehicle.
- this disclosure relates to an autonomous vehicle comprising a cabin and a control system configured to cause the autonomous vehicle to autonomously perform an action based on an event in the cabin.
- this disclosure relates to an autonomous vehicle comprising a cabin and a control system configured to cause the autonomous vehicle to autonomously perform an action based on a state of an occupant in the cabin.
- this disclosure relates to an autonomous vehicle comprising a control system configured to cause the autonomous vehicle to autonomously perform an action based on an interaction with a human external to the autonomous vehicle.
- this disclosure relates to an autonomous vehicle comprising a control system configured to cause the autonomous vehicle to autonomously perform an action based on a human protocol gesture made by a human external to the autonomous vehicle.
- this disclosure relates to an autonomous vehicle comprising a cabin and a control system configured to personalize the autonomous vehicle based on an identity of an occupant in the cabin.
- this disclosure relates to an autonomous vehicle comprising a cabin, a control system configured to operate the autonomous vehicle, and an occupant-act facilitator configured to facilitate an act of an occupant in the cabin unrelated to and normally not done while driving.
- this disclosure relates to an autonomous vehicle comprising a sleeping facilitator configured to facilitate sleeping of an occupant in a cabin of the autonomous vehicle.
- this disclosure relates to an autonomous vehicle comprising a working facilitator configured to facilitate work of an occupant in a cabin of the autonomous vehicle by providing a workspace for the occupant.
- this disclosure relates to an autonomous vehicle comprising an exercising facilitator configured to facilitate exercising of an occupant in a cabin of the autonomous vehicle by providing an exerciser for the occupant.
- this disclosure relates to an autonomous vehicle comprising an eating facilitator configured to facilitate eating by an occupant in a cabin of the autonomous vehicle by providing an eating area for the occupant.
- this disclosure relates to an autonomous vehicle comprising a cooking facilitator configured to facilitate cooking by an occupant in a cabin of the autonomous vehicle by providing a cooking area for the occupant.
- this disclosure relates to a system for an autonomous vehicle, in which the system is configured to cause the autonomous vehicle to autonomously perform an action based on an event within the autonomous vehicle.
- this disclosure relates to a system for an autonomous vehicle, in which the system is configured to cause the autonomous vehicle to autonomously perform an action based on an event in a cabin of the autonomous vehicle.
- this disclosure relates to a system for an autonomous vehicle, in which the system is configured to cause the autonomous vehicle to autonomously perform an action based on a state of an occupant in a cabin of the autonomous vehicle.
- this disclosure relates to a system for an autonomous vehicle, in which the system is configured to cause the autonomous vehicle to autonomously perform an action based on an interaction with a human external to the autonomous vehicle.
- this disclosure relates to a system for an autonomous vehicle, in which the system is configured to cause the autonomous vehicle to autonomously perform an action based on a human protocol gesture made by a human external to the autonomous vehicle.
- this disclosure relates to a system for an autonomous vehicle, in which the system is configured to personalize the autonomous vehicle based on an identity of an occupant in a cabin of the autonomous vehicle.
- this disclosure relates to a sleeping facilitator for an autonomous vehicle, in which the sleeping facilitator is configured to facilitate sleeping of an occupant in a cabin of the autonomous vehicle.
- this disclosure relates to a working facilitator for an autonomous vehicle, in which the working facilitator is configured to facilitate work of an occupant in a cabin of the autonomous vehicle by providing a workspace for the occupant.
- this disclosure relates to an exercising facilitator for an autonomous vehicle, in which the exercising facilitator is configured to facilitate exercising of an occupant in a cabin of the autonomous vehicle by providing an exerciser for the occupant.
- this disclosure relates to an eating facilitator for an autonomous vehicle, in which the eating facilitator is configured to facilitate eating by an occupant in a cabin of the autonomous vehicle by providing an eating area for the occupant.
- this disclosure relates to a cooking facilitator for an autonomous vehicle, in which the cooking facilitator is configured to facilitate cooking by an occupant in a cabin of the autonomous vehicle by providing a cooking area for the occupant.
- this disclosure relates to an indicator configured to be placed at a particular location external to an autonomous vehicle and recognized by a control system of the autonomous vehicle such that the control system autonomously operates the autonomous vehicle at the particular location based on recognition of the indicator.
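The event-driven behavior summarized above (detect an event within the vehicle, then autonomously reroute, alter the cabin, notify a third party, stop, or change driving style) can be sketched as a simple event-to-action dispatch. This is a hypothetical illustration; all names (event strings, `Action`, `actions_for`) are my own, not terminology from the patent.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Action:
    """One autonomous action the vehicle may take in response to an event."""
    name: str

def reroute_to(destination: str) -> Action:
    return Action(f"reroute:{destination}")

def notify(party: str) -> Action:
    return Action(f"notify:{party}")

def pull_over() -> Action:
    return Action("pull_over")

# Hypothetical mapping from a detected in-vehicle event to the actions taken.
EVENT_ACTIONS: dict[str, Callable[[], list[Action]]] = {
    "occupant_medical_emergency": lambda: [reroute_to("hospital"),
                                           notify("emergency_services")],
    "occupant_asleep": lambda: [Action("dim_cabin_lights"),
                                Action("smooth_driving_mode")],
    "device_malfunction": lambda: [pull_over(), notify("service_center")],
}

def actions_for(event: str) -> list[Action]:
    """Return the autonomous actions for a detected event (empty if none)."""
    handler = EVENT_ACTIONS.get(event)
    return handler() if handler else []
```

In practice the mapping would be driven by sensor-based event detection rather than string labels, but the dispatch structure is the same: one detected event can fan out to several concurrent actions.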
- FIGS. 1 to 4 show an embodiment of an autonomous vehicle;
- FIGS. 5 and 6 show an embodiment of a control system of the autonomous vehicle;
- FIG. 7 shows an embodiment of monitoring the autonomous vehicle and causing the autonomous vehicle to perform one or more actions in response to detecting an actionable event within the autonomous vehicle;
- FIG. 8 shows an embodiment of monitoring a cabin of the autonomous vehicle and causing the autonomous vehicle to perform one or more actions in response to detecting an actionable event in the cabin;
- FIGS. 9 and 10 show an embodiment of monitoring an occupant in the cabin of the autonomous vehicle and causing the autonomous vehicle to perform one or more actions in response to detecting an actionable event based on a state of the occupant;
- FIG. 11 shows an embodiment of rerouting of the autonomous vehicle;
- FIG. 12 shows an embodiment of issuing a notification to a communication device external to the autonomous vehicle;
- FIGS. 13 to 16 show an embodiment of altering the cabin of the autonomous vehicle;
- FIG. 17 shows an embodiment of altering a self-driving mode of the autonomous vehicle;
- FIG. 18 shows an embodiment of monitoring a device of the autonomous vehicle elsewhere than at the cabin of the autonomous vehicle and causing the autonomous vehicle to perform one or more actions in response to detecting an actionable event related to the device;
- FIG. 19 shows another embodiment of rerouting of the autonomous vehicle;
- FIGS. 20 and 21 show an embodiment of the autonomous vehicle acting based on interactions with (e.g., gestures of) humans external to the autonomous vehicle;
- FIG. 22 shows an embodiment of a human external to the autonomous vehicle having equipment detectable by the autonomous vehicle to facilitate recognition of gestures;
- FIG. 23 shows a variant to what is shown in FIGS. 20 and 21;
- FIGS. 24 to 31 show embodiments of occupant-act facilitators of the autonomous vehicle to facilitate acts of occupants in the cabin of the autonomous vehicle, such as sleeping, exercising, working, eating, cooking, and/or any other suitable act;
- FIG. 32 shows an embodiment of an aftermarket apparatus installable in the cabin of the autonomous vehicle;
- FIG. 33 shows an embodiment of a controller for the aftermarket apparatus;
- FIGS. 34 to 39 show embodiments of indicators placed at particular locations and recognizable by the control system of the autonomous vehicle such that the control system autonomously operates the autonomous vehicle at these particular locations based on recognition of the indicators; and
- FIGS. 40 to 42 show an embodiment in which the autonomous vehicle is personalized for an occupant based on an identity of the occupant.
- FIGS. 1 to 5 show an example of an embodiment of an autonomous vehicle 10 on a road 11 .
- the autonomous vehicle 10 is designed to legally carry people or cargo on the road 11 , which is part of a public road infrastructure (e.g., public streets, highways, etc.).
- the autonomous vehicle 10 is an automobile (e.g., a passenger car).
- the autonomous vehicle 10 may be an autonomous truck, an autonomous bus, or any other autonomous road vehicle.
- the autonomous vehicle 10 (sometimes referred to as a “self-driving” or “driverless” vehicle) is operable without human control, including by steering, accelerating, and decelerating (e.g., braking) itself autonomously without human control, to travel to a destination. Although it can drive itself, in some embodiments, the autonomous vehicle 10 may be controlled by a human driver in some situations.
- the autonomous vehicle 10 comprises a frame 12 , a powertrain 14 , a steering system 16 , a suspension 18 , wheels 20 1 - 20 4 , a cabin 22 , and a control system 15 that is configured to operate the vehicle 10 autonomously (i.e., without human control).
- the autonomous vehicle 10 has a longitudinal direction, a widthwise direction, and a heightwise direction.
- the autonomous vehicle 10 may be configured to facilitate its use and/or enhance what it and/or occupants can do with it, such as, for example, by:
- the powertrain 14 is configured to generate power for the autonomous vehicle 10 , including motive power for the wheels 20 1 - 20 4 to propel the vehicle 10 on the road 11 .
- the powertrain 14 comprises a power source 13 (e.g., a prime mover) that includes one or more motors.
- the power source 13 may comprise an electric motor (e.g., powered by a battery), an internal combustion engine, or a combination of different types of motor (e.g., an electric motor and an internal combustion engine).
- the powertrain 14 can transmit power from the power source 13 to one or more of the wheels 20 1 - 20 4 in any suitable way (e.g., via a transmission, a differential, a shaft engaging (i.e., directly connecting) a motor and a given one of the wheels 20 1 - 20 4 , etc.).
- the steering system 16 is configured to steer the autonomous vehicle 10 on the road 11 .
- the steering system 16 is configured to turn front ones of the wheels 20 1 - 20 4 to change their orientation relative to the frame 12 of the vehicle 10 in order to cause the vehicle 10 to move in a desired direction.
- the suspension 18 is connected between the frame 12 and the wheels 20 1 - 20 4 to allow relative motion between the frame 12 and the wheels 20 1 - 20 4 as the autonomous vehicle 10 travels on the road 11 .
- the suspension 18 may enhance handling of the vehicle 10 on the road 11 by absorbing shocks and helping to maintain traction between the wheels 20 1 - 20 4 and the road 11 .
- the suspension 18 may comprise an arrangement of springs and dampers.
- a spring may be a coil spring, a leaf spring, a gas spring (e.g., an air spring), or any other elastic object used to store mechanical energy.
- a damper may be a fluidic damper (e.g., a pneumatic damper, a hydraulic damper, etc.), a magnetic damper, or any other object which absorbs or dissipates kinetic energy to decrease oscillations.
- a single device may itself constitute both a spring and a damper (e.g., a hydropneumatic device).
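Why a damper "decreases oscillations" can be illustrated with a minimal mass-spring-damper model, m·x″ + c·x′ + k·x = 0, integrated with semi-implicit Euler. This is a sketch of the physics only, not of any suspension described in the patent; the parameter values are arbitrary.

```python
def simulate(m: float, c: float, k: float, x0: float,
             steps: int = 5000, dt: float = 0.001) -> list[float]:
    """Displacement trace of a mass-spring-damper released from x0 at rest."""
    x, v = x0, 0.0
    trace = []
    for _ in range(steps):
        a = -(c * v + k * x) / m   # spring force -k*x, damper force -c*v
        v += a * dt                # semi-implicit Euler: velocity first,
        x += v * dt                # then position, for better stability
        trace.append(x)
    return trace

# With c > 0 the damper dissipates kinetic energy, so the amplitude
# decays toward zero; with c = 0 the spring alone oscillates indefinitely.
damped = simulate(m=1.0, c=2.0, k=50.0, x0=0.1)
undamped = simulate(m=1.0, c=0.0, k=50.0, x0=0.1)
```

A hydropneumatic device, as noted above, combines both roles: the gas volume supplies the spring term k·x while fluid flow through orifices supplies the damping term c·x′.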
- the cabin 22 is configured to be occupied by one or more occupants of the autonomous vehicle 10 .
- the cabin 22 comprises windows 21 1 - 21 W , seats 20 1 - 20 S , and a user interface 70 that is configured to interact with one or more occupants of the vehicle 10 .
- the user interface 70 comprises an input portion 71 including one or more input devices (e.g., a set of buttons, levers, dials, etc., a touchscreen, a microphone, etc.) allowing an occupant of the vehicle 10 to input commands and/or other information into the vehicle 10 and an output portion 73 including one or more output devices (e.g., a display, a speaker, etc.) to provide information to an occupant of the vehicle 10 .
- the output portion 73 of the user interface 70 which may comprise an instrument panel (e.g., a dashboard) which provides indicators (e.g., a speedometer indicator, a tachometer indicator, etc.) related to operation of the vehicle 10 .
- the control system 15 is configured to operate the autonomous vehicle 10 , including to steer, accelerate, and decelerate (e.g., brake) the autonomous vehicle 10 , autonomously (i.e., without human control) as the autonomous vehicle 10 progresses to a destination along a route on the road 11 .
- the control system 15 comprises a controller 80 and a sensing apparatus 82 to perform actions controlling the vehicle 10 (e.g., actions to steer, accelerate, decelerate, etc.) to move it towards its destination on the road 11 based on a computerized perception of an environment of the vehicle 10 .
- the autonomous vehicle 10 may be controlled by a human driver, such as an occupant in the cabin 22 , in some situations.
- the control system 15 may allow the autonomous vehicle 10 to be selectively operable either autonomously (i.e., without human control) or under human control (i.e., by a human driver) in various situations (e.g., the autonomous vehicle 10 may be operable in either of an autonomous operational mode and a human-controlled operational mode).
- the user interface 70 of the cabin 22 may comprise an accelerator 31 (e.g., an acceleration pedal), a braking device 33 (e.g., a brake pedal), and a steering device 35 (e.g., a steering wheel) that can be operated by a human driver in the cabin 22 to control the vehicle 10 on the road 11 .
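The dual operational modes described above, in which the vehicle is selectively operable either autonomously or under a human driver, can be sketched as a mode switch that routes actuation to one command source or the other. The class and method names here are illustrative assumptions, not the patent's terminology.

```python
from enum import Enum, auto

class Mode(Enum):
    AUTONOMOUS = auto()        # control system steers/accelerates/brakes itself
    HUMAN_CONTROLLED = auto()  # accelerator, brake pedal, and wheel are live

class DriveController:
    """Selects which input source drives the vehicle's actuators."""

    def __init__(self) -> None:
        self.mode = Mode.AUTONOMOUS

    def set_mode(self, mode: Mode) -> None:
        self.mode = mode

    def command_source(self) -> str:
        """Name of the input that drives the actuators in the current mode."""
        if self.mode is Mode.AUTONOMOUS:
            return "planner"          # commands computed from sensor data
        return "driver_controls"      # pedals and steering wheel
```

A real implementation would also arbitrate handover (e.g., confirming the driver is attentive before granting control), but the mode-gated routing is the core idea.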
- the controller 80 is a processing apparatus configured to process information received from the sensing apparatus 82 and possibly other sources in order to perform actions controlling the autonomous vehicle 10 , including to steer, accelerate, and decelerate the vehicle 10 , towards its destination on the road 11 .
- the controller 80 comprises an interface 166 , a processing portion 168 , and a memory portion 170 , which are implemented by suitable hardware and software.
- the interface 166 comprises one or more inputs and outputs allowing the controller 80 to receive input signals from and send output signals to other components to which the controller 80 is connected (i.e., directly or indirectly connected), including the sensing apparatus 82 , the powertrain 14 , and the steering system 16 , and possibly other components such as the user interface 70 , a communication interface 68 configured to communicate over a communication network (e.g., a cellular, WiFi, or other wireless network, for internet and/or other communications), over one or more local communication links (e.g., Bluetooth, USB, etc.), and/or with one or more other vehicles that are near the autonomous vehicle 10 (i.e., for inter-vehicle communications), etc.
- the processing portion 168 comprises one or more processors for performing processing operations that implement functionality of the controller 80 .
- a processor of the processing portion 168 may be a general-purpose processor executing program code stored in the memory portion 170 .
- a processor of the processing portion 168 may be a specific-purpose processor comprising one or more preprogrammed hardware or firmware elements (e.g., application-specific integrated circuits (ASICs), electrically erasable programmable read-only memories (EEPROMs), etc.) or other related elements.
- the memory portion 170 comprises one or more memories for storing program code executed by the processing portion 168 and/or data (e.g., maps, vehicle parameters, etc.) used during operation of the processing portion 168 .
- a memory of the memory portion 170 may be a semiconductor medium (including, e.g., a solid-state memory), a magnetic storage medium, an optical storage medium, and/or any other suitable type of memory.
- a memory of the memory portion 170 may be read-only memory (ROM) and/or random-access memory (RAM), for example.
- the controller 80 may comprise and/or interact with one or more other control units of the autonomous vehicle 10 .
- the controller 80 may comprise and/or interact with a powertrain control unit of the powertrain 14 , such as an engine control unit (ECU), a transmission control unit (TCU), etc.
- the sensing apparatus 82 comprises a set of sensors 90 1 - 90 S to sense aspects of the environment of the vehicle 10 and generate sensor information indicative of these aspects of the environment of the vehicle 10 that is provided to the controller 80 in order to control the vehicle 10 towards its destination on the road 11 .
- the sensor information can be used by the controller 80 to determine actions which are to be performed by the autonomous vehicle 10 in order for the vehicle 10 to continue to its destination.
- the sensors 90 1 - 90 S can provide situational information proximate to the vehicle 10 , including any potential hazards proximate to the vehicle 10 .
- the sensors 90 1 - 90 S may include any suitable sensing device.
- the sensors 90 1 - 90 S may comprise a camera (e.g., video, stereoscopic, etc.) and/or other imaging device, a Light Detection and Ranging (LIDAR) device, a radar device, a wheel speed sensor, a GPS and/or other location sensor, and/or any other suitable sensing device.
- the autonomous vehicle 10 may be implemented in any suitable way.
- the autonomous vehicle 10 , including its control system 15 , may be implemented as a WaymoTM vehicle such as that described at https://waymo.com/tech/ and https://waymo.com/safetyreport/, an UberTM vehicle such as that described at https://www.uber.com/cities/pittsburgh/self-driving-ubers/, or a vehicle such as that described in U.S. Pat. No. 8,818,608 or U.S. Patent Application Publication 2014/0303827, all of which are incorporated by reference herein.
- the autonomous vehicle 10 may be for personal or private use by a user (e.g., where the vehicle 10 is owned or leased by the user or another individual personally known to the user, such as a family member, a friend, etc.). In other cases, the autonomous vehicle 10 may be for public use by various users, such as where the vehicle 10 is used as part of a taxi, ride-hailing or vehicle-sharing service.
- the autonomous vehicle 10 may be configured to facilitate its use and/or enhance what it and/or occupants can do with it. Examples of embodiments of this are described below.
- control system 15 of the autonomous vehicle 10 may be configured to cause the vehicle 10 to autonomously perform one or more actions based on one or more events within the vehicle 10 .
- control system 15 may be configured to cause the vehicle 10 to autonomously reroute itself (i.e., change its destination and/or its current route), alter the cabin 22 , notify a third party external to the vehicle 10 , stop (e.g., park) the vehicle 10 , alter how the vehicle 10 drives itself, and/or perform one or more other actions based on what and/or how an occupant is doing, an emergency, or another event occurring in the cabin 22 of the vehicle 10 .
- control system 15 is configured to monitor an interior of the cabin 22 and, in response to detecting an actionable event in the cabin 22 , cause the vehicle 10 to autonomously reroute itself and/or perform one or more other actions to address that event.
- An actionable event in the cabin 22 in response to which the control system 15 causes the autonomous vehicle 10 to reroute itself and/or perform one or more other actions may involve one or more conditions being met (e.g., one or more circumstances having arisen) in the cabin 22 . Any or all of these one or more conditions may be predefined or otherwise specified such that, when the one or more conditions are met, the actionable event is deemed to have occurred in the cabin 22 .
- Detection that one or more conditions are met in the cabin 22 , and therefore detection of an actionable event in the cabin 22 may be carried out by the controller 80 . This may be achieved based on processing of input information that may be received by the controller 80 . Examples of such input information may include information received via the user interface 70 , the communication interface 68 , and/or possibly from other sources (e.g., one or more sensors in the cabin 22 ).
- When an actionable event in the cabin 22 is detected, the controller 80 responds by effecting one or more actions to address that event. For example, the controller 80 may issue one or more control signals to the powertrain 14 , the steering system 16 , the user interface 70 , the communication interface 68 , and/or possibly other devices.
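The condition-based detection described above can be illustrated with a minimal sketch. All names here (`detect_actionable_event`, the condition table, the sensor-reading fields, and the threshold values) are hypothetical illustrations, not elements of the patent: an actionable event is deemed to have occurred when all of its predefined conditions are met on the current input information.

```python
# Hypothetical sketch of condition-based actionable-event detection.
# Field names and thresholds are illustrative assumptions only.

def detect_actionable_event(sensor_readings, conditions):
    """Return the first event whose predefined conditions are all met, else None."""
    for event_name, predicates in conditions.items():
        if all(pred(sensor_readings) for pred in predicates):
            return event_name
    return None

# Predefined conditions: each event maps to a list of predicates that
# must all hold on the current cabin sensor readings.
CONDITIONS = {
    "medical_emergency": [
        lambda r: r["heart_rate"] < 40 or r["heart_rate"] > 160,
        lambda r: r["motion_level"] < 0.1,  # occupant unusually still
    ],
    "occupant_sleeping": [
        lambda r: r["eyes_closed_seconds"] > 120,
        lambda r: r["motion_level"] < 0.3,
    ],
}

readings = {"heart_rate": 30, "motion_level": 0.05, "eyes_closed_seconds": 10}
event = detect_actionable_event(readings, CONDITIONS)  # "medical_emergency"
```

On detection, the returned event name would drive the control signals issued to the powertrain, steering system, user interface, and/or communication interface.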
- control system 15 may be configured to monitor an occupant in the cabin 22 and cause the vehicle 10 to autonomously perform one or more actions based on a state of the occupant (i.e., what and/or how the occupant is doing).
- the controller 80 may process information received or otherwise derived from one or more sensors 75 1 - 75 P of an occupant-monitoring system 83 that is monitoring the occupant.
- the sensors 75 1 - 75 P may be configured to monitor what the occupant is doing, i.e., an activity or a lack of activity of the occupant (e.g., sleeping, eating, working, exercising, watching media, etc.), and/or how the occupant is doing, i.e., a health of the occupant (e.g., whether the occupant appears to be in good condition or suffering a loss of consciousness, a stroke, a heart attack, or other physical impairment).
- the sensors 75 1 - 75 P may include a camera to view the occupant (e.g., the occupant's eyes, face, and/or other parts or all of the occupant's body), a motion sensor to sense motion of the occupant, a pressure sensor (e.g., on a given one of the seats 20 1 - 20 S that is occupied by the occupant, such as in a headrest or a seat cushion), a vital sign sensor to sense one or more vital signs (e.g., a pulse rate, a respiratory rate, a body temperature, and/or a blood pressure) of the occupant (e.g., a heart rate monitor, a temperature sensor, a blood pressure monitor, etc.), and/or any other sensor.
- Processing of information received or otherwise derived from the sensors 75 1 - 75 P by the controller 80 may comprise image processing (e.g., of images captured by a camera), comparison of parametric values (e.g., to reference or threshold values), and/or any other processing operations.
- a sensor 75 i may be built into the cabin 22 during original manufacturing of the autonomous vehicle 10 (e.g., on a console of the user interface 70 , on a given one of the seats 20 1 - 20 S , etc.). In other cases, a sensor 75 i may be installed in the cabin 22 after original manufacturing of the vehicle 10 (e.g., as part of an aftermarket device installed in the cabin 22 by an owner or leaser of the vehicle 10 ). In yet other cases, a sensor 75 i may be carried into the cabin 22 by the occupant as part of a portable device carried by the occupant, such as: a smartphone or other wireless phone; a tablet computer; a head-mounted display, smartwatch or other wearable device; a medical device; etc. In such cases, the portable device comprising the sensor 75 i may communicate with the controller 80 via a communication link, which may be wireless, wired, or partly wireless and partly wired (e.g., Bluetooth, WiFi or other wireless LAN, USB, etc.).
- the controller 80 may monitor an occupant of the vehicle 10 and, in response to detecting an event involving the occupant, reroute the vehicle 10 to a new destination 74 different from an original destination 40 of the vehicle 10 .
- the controller 80 may monitor the occupant of the vehicle 10 and, in response to detecting an emergency or other medical event involving the occupant, autonomously reroute the vehicle 10 to the new destination 74 where medical assistance is providable to the occupant (e.g., a hospital, clinic, or other medical establishment; a police station; a fire station; etc.).
- the controller 80 may detect one or more conditions indicative of the emergency or other medical event involving the occupant, such as the occupant's vital signs (e.g., heart rate), a position or movement of the occupant (e.g., a spasm, eyes rolling, a head or upper body tilt or collapse, stillness, a heavy sit, etc.), and/or physical traits of the occupant (e.g., paleness, bleeding, etc.), and may proceed to cause the vehicle 10 to be autonomously rerouted to the new destination 74 where medical assistance can be provided, by consulting a database of medical establishments and mapping information using a current location of the vehicle 10 .
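The database-plus-mapping lookup can be sketched as follows. The establishment entries, coordinates, and function names are made up for illustration; the idea is simply to select, from a database of medical establishments, the one closest to the vehicle's current location as the new destination 74.

```python
# Illustrative sketch of rerouting to the nearest establishment where
# medical assistance is providable. All entries and names are assumptions.
import math

MEDICAL_ESTABLISHMENTS = [  # (name, latitude, longitude) -- made-up entries
    ("General Hospital", 45.501, -73.567),
    ("Walk-in Clinic", 45.530, -73.620),
    ("Fire Station 12", 45.480, -73.550),
]

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearest_establishment(current_lat, current_lon):
    """Pick the new destination: the establishment closest to the vehicle."""
    return min(
        MEDICAL_ESTABLISHMENTS,
        key=lambda e: haversine_km(current_lat, current_lon, e[1], e[2]),
    )

dest = nearest_establishment(45.500, -73.570)  # -> ("General Hospital", ...)
```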
- the controller 80 may monitor an occupant of the vehicle 10 and, in response to detecting an event involving the occupant, cause issuance of a notification 85 to a communication device 87 external to the vehicle 10 .
- the communication device 87 may be a smartphone, a tablet, a head-mounted display, a smartwatch, or other device carried or worn by an individual; a server or other computer; or any other device designed for communication.
- the controller 80 may monitor the occupant of the vehicle 10 and, in response to detecting an emergency or other medical event involving the occupant, cause the notification 85 to be transmitted to the communication device 87 which is associated with a medical assistance provider (e.g., at a hospital, clinic, or other medical establishment; a police station; a fire station; etc.) to notify the medical assistance provider of what is happening with the occupant, which may help the medical assistance provider to prepare for treating the occupant.
- the notification 85 transmitted to the communication device 87 associated with the medical assistance provider may be conveyed as a text message (e.g., SMS message), an email message, a voice message, or any other suitable communication.
- the controller 80 may cause the communication interface 68 to transmit the notification 85 to the communication device 87 via a communication link 49 which may be established over a cellular network, a WiFi network, a satellite connection, and/or any other suitable connection.
- issuance of the notification 85 to the communication device 87 associated with the medical assistance provider may be done in conjunction with autonomous rerouting of the vehicle 10 to a destination where medical assistance is providable to the occupant, as discussed above.
- issuance of the notification 85 to the communication device 87 associated with the medical assistance provider may be done without autonomously rerouting the vehicle 10 to another destination (e.g., the vehicle 10 may be parked, its location may be conveyed to the medical assistance provider, and an ambulance may be dispatched to that location).
- the controller 80 may monitor an occupant of the vehicle 10 and, in response to detecting a prohibited behavior exhibited by the occupant, perform one or more actions such as causing the vehicle 10 to autonomously stop (e.g., park) and/or reroute itself and/or causing issuance of a notification 85 to a communication device 87 external to the vehicle 10 .
- a prohibited behavior is “prohibited” in that it is not allowed to be exhibited in the vehicle 10 .
- This may be specified by a provider of the vehicle 10 (e.g., a manufacturer of the vehicle 10 , a taxi, ride-hailing or vehicle-sharing service provider, etc.); a public authority (e.g., a police force, a government, etc.); etc.
- this may include a behavior that is dangerous, hazardous or otherwise risky (e.g., to the occupant, any other occupant of the vehicle 10 , or other vehicles on the road 11 ), is liable to vandalize or otherwise damage the vehicle 10 , and/or is otherwise undesirable.
- the controller 80 may monitor the occupant of the vehicle 10 and, in response to detecting a prohibited behavior exhibited by the occupant, autonomously stop (e.g., park) the vehicle 10 .
- the controller 80 may also cause the user interface 70 of the cabin 22 to advise the occupant of the prohibited behavior (e.g., by displaying or otherwise issuing a warning or other notification, which may request the occupant to get out of the vehicle 10 , etc.)
- the controller 80 may cause a notification 85 to be transmitted to a communication device 87 external to the vehicle 10 .
- the communication device 87 may be associated with a provider of the vehicle 10 or a public authority, and the notification 85 may report the prohibited behavior exhibited by the occupant.
- the controller 80 may monitor an occupant of the vehicle 10 and, in response to detecting an event involving the occupant, cause altering of the cabin 22 . That is, the controller 80 may cause one or more objects 96 1 - 96 O of the cabin 22 to be altered by changing from one state to a different state.
- the controller 80 may monitor the occupant of the vehicle 10 and, in response to detecting that the occupant is sleeping, alter the cabin 22 to facilitate the occupant's sleep.
- the controller 80 may cause the cabin 22 to be altered to reduce stimuli (e.g., light, noise, vibrations, etc.) from the vehicle 10 and/or its environment affecting the occupant who is sleeping.
- the controller 80 may detect one or more conditions indicative that the occupant is sleeping, such as the occupant's vital signs (e.g., heart rate), a position or movement of the occupant (e.g., a head tilt, stillness, etc.), and/or physical traits of the occupant (e.g., eyes closed, an open mouth, etc.), and may proceed to cause the cabin 22 to be altered to reduce stimuli from the vehicle 10 and/or its environment affecting the occupant who is sleeping.
- as shown in FIG. 14 , the cabin 22 may comprise a light-control system 55 to control (e.g., reduce) light entering into the cabin 22 via the windows 21 1 - 21 W , and the controller 80 may cause the light-control system 55 to reduce the light entering into the cabin 22 upon detecting that the occupant is sleeping.
- the light-control system 55 may comprise a light blocker 27 activatable by the controller 80 to block light from reaching the interior of the cabin 22 through at least part of the windows 21 1 - 21 W .
- the light-control system 55 may comprise a window covering 23 (e.g., comprising one or more blinds, shades, shutters, and/or curtains) deployable (e.g., extendible) to cover at least part of the windows 21 1 - 21 W , such that the controller 80 may cause deployment of the window covering 23 to reduce the light entering into the cabin 22 .
- the light-control system 55 may comprise a window transmissivity changer 25 configured to change a tint or other aspect affecting transmissivity of one or more of the windows 21 1 - 21 W , and the controller 80 may cause the window transmissivity changer 25 to change the tint or other aspect affecting transmissivity (e.g., darken, increase opacity, etc.) of one or more of the windows 21 1 - 21 W to reduce light entering into the cabin 22 .
- the window transmissivity changer 25 may comprise a film disposed on one or more of the windows 21 1 - 21 W and electrically controllable to alter the tint or other aspect affecting transmissivity (e.g., such as that commercially-available from Smart Tint at www.smartint.com/).
- the cabin 22 may comprise a noise-control system 57 configured to control (e.g., reduce) noise in the cabin 22 , and the controller 80 may cause the noise-control system 57 to reduce noise in the cabin 22 upon detecting that the occupant is sleeping.
- the noise-control system 57 may comprise a noise canceller 59 to at least partly cancel the noise entering the cabin 22 , such as by generating sound that at least partly cancels the noise entering the cabin.
- the noise canceller 59 may comprise one or more microphones and one or more speakers in the cabin 22 , possibly one or more amplifiers or other sound-generating components, and a controller configured to generate an audio signal that is reversed in phase to an audio signal picked up by the one or more microphones and that is applied to the one or more speakers to generate the sound at least partly cancelling the noise entering the cabin 22 (e.g., using active noise control technology for noise cancellation such as that commercially-available from Ford, Toyota, Nissan, and other car manufacturers).
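The phase-reversal principle underlying the noise canceller 59 can be shown with a minimal numeric sketch (the sample rate, tone frequency, and function name are illustrative assumptions): the anti-noise signal is the microphone signal with its phase reversed, so that noise plus anti-noise sums toward silence.

```python
# Minimal sketch of the phase-inversion principle behind active noise
# cancellation. Signal parameters are illustrative assumptions.
import math

def invert_phase(samples):
    """Generate the anti-noise signal (a 180-degree phase reversal)."""
    return [-s for s in samples]

# A pure tone standing in for road noise picked up by a cabin microphone.
noise = [math.sin(2 * math.pi * 100 * t / 8000) for t in range(64)]
anti_noise = invert_phase(noise)

# Superposition of the two at the occupant's ear: ideally zero.
residual = [n + a for n, a in zip(noise, anti_noise)]
```

In practice the canceller's controller must also compensate for the delay and frequency response of the microphones, amplifiers, and speakers, which is what commercial active noise control systems handle.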
- the controller 80 may cause an audio system 60 of the user interface 70 of the cabin 22 to emit relaxing sound (e.g., ocean waves, rain, forest sounds, soothing music, etc.).
- the controller 80 may cause a seat 20 i occupied by the occupant to be altered to facilitate the occupant's sleep.
- the seat 20 i may be a “driver's seat” in front of the vehicle 10 , in embodiments in which the vehicle 10 is selectively operable either autonomously (i.e., without human control) or under human control (i.e., by a human driver) or where such a driver's seat would conventionally be found in a human-driven vehicle.
- the seat 20 i occupied by the occupant may include a seat-altering system 52 configured to alter the seat 20 i , and the controller 80 may cause the seat-altering system 52 to alter the seat 20 i upon detecting that the occupant is sleeping.
- the seat-altering system 52 may comprise one or more actuators (e.g., electromechanical actuators, fluidic (e.g., hydraulic or pneumatic) actuators, etc.) connected to one or more movable portions 50 1 - 50 M , such as a seating portion, a backrest portion, and a headrest portion, of the seat 20 i to change relative positioning of the one or more movable portions 50 1 - 50 M of the seat 20 i .
- the controller 80 may cause the seat-altering system 52 to alter the seat 20 i such that the seat 20 i is converted into a reclined (e.g., bedlike) configuration in which the occupant is reclined on the seat 20 i by repositioning the one or more movable portions 50 1 - 50 M of the seat 20 i .
- the seat-altering system 52 may comprise a pillow 63 for the seat 20 i , and the controller 80 may cause the pillow 63 to be deployed upon detecting that the occupant is sleeping.
- the pillow 63 may be integrated into or otherwise associated with the headrest portion of the seat 20 i , and the controller 80 may cause the pillow 63 to be deployed by moving the pillow 63 to engage the occupant's head (e.g., by activating an actuator to move the pillow 63 into position) and/or by inflating the pillow 63 (e.g., by activating a pump to inflate the pillow 63 ).
- the controller 80 may cause the cabin 22 to be altered in various other ways in other embodiments upon detecting that the occupant is sleeping (e.g., cause a temperature-control system to adjust a temperature in the cabin 22 , activate a vibration system of the seat 20 i , etc.).
- the controller 80 may monitor an occupant of the vehicle 10 and, in response to detecting an event involving the occupant, alter a self-driving mode of the vehicle 10 , i.e., alter how the vehicle 10 autonomously drives itself.
- the controller 80 may monitor the occupant of the vehicle 10 and, in response to detecting that the occupant is sleeping, alter the self-driving mode of the vehicle 10 to facilitate the occupant's sleep, such as by reducing potential for sudden or abrupt movements (e.g., acceleration, braking, turning, shaking, etc.) of the vehicle 10 on the road 11 .
- the controller 80 may control the powertrain 14 , the steering system 16 , the suspension 18 , and/or possibly other devices of the vehicle 10 so that the self-driving mode of the vehicle 10 is smoother than when the occupant is deemed to be awake.
- the controller 80 may control the powertrain 14 (e.g., by controlling an ECU, TCU or other powertrain control unit) so that the vehicle 10 accelerates and/or decelerates (e.g., brakes) less intensely, control the steering system 16 so that the vehicle 10 turns less sharply, and/or control the suspension 18 (e.g., by controlling an active suspension system of the suspension 18 ) so that it is less stiff than when the occupant is deemed to be awake.
- the controller 80 may reroute the vehicle 10 to its destination along a new route that is different from and more suited to sleep of the occupant than its original route.
- the new route may include fewer stops (e.g., stop signs, traffic lights, etc.), fewer and/or less sharp turns, and/or a smoother roadway (e.g., less damaged or flatter roadway) than the original route.
- the controller 80 may consult mapping information to determine the new route based on a current location of the vehicle 10 .
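One way to select such a route is to score the candidates derived from mapping information and pick the least disruptive one. The field names, weights, and penalty formula below are illustrative assumptions, not part of the patent:

```python
# Hedged sketch of choosing a route "more suited to sleep": penalize
# stops, sharp turns, and roadway roughness, then pick the minimum.
# Field names and weights are illustrative assumptions.
def sleep_penalty(route, w_stops=3.0, w_turns=2.0, w_rough=5.0):
    return (w_stops * route["stops"]
            + w_turns * route["sharp_turns"]
            + w_rough * route["roughness"])  # roughness in [0, 1]

def pick_sleep_route(candidate_routes):
    """Return the candidate route with the lowest sleep penalty."""
    return min(candidate_routes, key=sleep_penalty)

routes = [
    {"name": "original", "stops": 9, "sharp_turns": 4, "roughness": 0.6},
    {"name": "highway",  "stops": 2, "sharp_turns": 1, "roughness": 0.2},
]
best = pick_sleep_route(routes)  # the smoother "highway" route
```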
- the controller 80 may cause one or more actions to be performed in the cabin 22 to awaken the occupant. For example, in some embodiments, the controller 80 may cause the user interface 70 to issue an awakening notification, such as by causing the user interface 70 to emit sound (e.g., an alarm, music, etc.), vibrate the seat 20 i of the occupant, and/or otherwise stimulate the occupant to awaken him/her.
- the controller 80 may cause the light-control system 55 to let more light into the cabin 22 via the windows 21 1 - 21 W (e.g., by retracting the window covering 23 and/or causing the window transmissivity changer 25 to lighten the tint or otherwise increase transmissivity of light through one or more of the windows 21 1 - 21 W ), the noise-control system 57 to let more noise into the cabin 22 (e.g., by stopping noise cancellation), the seat-altering system 52 to move the seat 20 i back into a seated configuration and/or retract the pillow 63 , etc.
- the controller 80 may cause one or more actions to be performed in the cabin 22 to awaken the occupant based on a current location of the vehicle 10 and/or a current time. For instance, when it determines, based on the current location of the vehicle 10 and/or the current time, that the vehicle 10 is sufficiently close to its destination and/or will arrive at its destination sufficiently soon, the controller 80 proceeds to cause these one or more actions to be performed to awaken the occupant.
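The wake-up decision reduces to a threshold check on remaining distance and/or estimated time of arrival. The function name and threshold values below are illustrative assumptions:

```python
# Illustrative sketch of the wake-up trigger: awaken the occupant once
# the vehicle is sufficiently close to its destination and/or will
# arrive sufficiently soon. Thresholds are assumed values.
def should_awaken(distance_km, eta_minutes,
                  distance_threshold_km=2.0, eta_threshold_min=5.0):
    return distance_km <= distance_threshold_km or eta_minutes <= eta_threshold_min

should_awaken(1.5, 12.0)   # True: close enough by distance
should_awaken(10.0, 4.0)   # True: arriving soon by time
should_awaken(10.0, 20.0)  # False: let the occupant keep sleeping
```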
- control system 15 of the autonomous vehicle 10 may be configured to cause the vehicle 10 to autonomously reroute itself and/or perform one or more other actions based on a state of a device (e.g., of the powertrain 14 , the steering system 16 , or the suspension 18 ) of the vehicle 10 elsewhere than in the cabin 22 , etc.
- control system 15 may be configured to monitor an energy level (e.g., a battery level or a fuel level) of the powertrain 14 (e.g., a battery and/or a fuel tank of the powertrain 14 ) and, in response to detecting that the energy level reaches a threshold, cause the vehicle 10 to autonomously reroute itself to an energy-replenishing station (e.g., a charging station for a battery for an electric motor and/or a fueling station for a fuel tank for an internal combustion engine).
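The energy-level monitoring logic amounts to a threshold comparison that switches the vehicle's destination. The function name, return values, and threshold below are illustrative assumptions:

```python
# Sketch of energy-level monitoring: when the battery or fuel level
# reaches a threshold, reroute to an energy-replenishing station.
# Names and the 15% threshold are assumptions for illustration.
def check_energy(level_fraction, threshold=0.15):
    """Return the action to take given the current energy level (0.0-1.0)."""
    if level_fraction <= threshold:
        return "reroute_to_replenishing_station"
    return "continue_to_destination"

check_energy(0.10)  # reroute: level at or below threshold
check_energy(0.50)  # continue: plenty of energy remaining
```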
- control system 15 may be configured to monitor an operability of a device (e.g., of the powertrain 14 , the steering system 16 , or the suspension 18 ) and, in response to detecting that the operability of the device is unsuitable for the vehicle 10 (e.g., the device is defective or worn to a threshold), cause the vehicle 10 to autonomously reroute itself to a repair station.
- the controller 80 may monitor the vehicle 10 and, in response to detecting an accident (e.g., a crash), reroute the vehicle 10 to a new destination 74 different from an original destination 40 of the vehicle 10 if the vehicle 10 remains operable. For example, if one or more occupants are in the vehicle 10 when the accident occurs, the controller 80 may autonomously reroute the vehicle 10 to the new destination 74 where medical assistance is providable to the one or more occupants (e.g., a hospital, clinic, or other medical establishment; a police station; a fire station; etc.). This may be done by consulting a database of medical establishments and mapping information using a current location of the vehicle 10 . The controller 80 may detect the accident based on a sensor (e.g., a crash sensor) or deployment of an airbag in the cabin 22 .
- control system 15 of the autonomous vehicle 10 may be configured to cause the vehicle 10 to autonomously perform one or more actions based on one or more interactions with humans 44 1 - 44 H (e.g., police officers, school-crossing guards, traffic guards at roadwork or other temporary traffic control sites, drivers of other vehicles, etc.) external to the vehicle 10 .
- control system 15 may be configured to detect a “human protocol gesture” being made by a human 44 i outside the vehicle 10 and to alter a manner in which the vehicle 10 drives itself based on that detected gesture.
- a human protocol gesture refers to a gesture, made by a human in a position of traffic-controlling authority, that embeds one or more commands for overriding or contradicting conventional algorithmic driving rules.
- a human protocol gesture may be made by hands of a police officer to wave traffic into a lane of opposing traffic, or by a hand and stop sign of a school-crossing guard to stop traffic when there is no actual stop sign, or by a driver of an oncoming vehicle flashing his or her vehicle's lights in the spirit of agreeing on who will have priority when crossing a narrow stretch of road such as a one-lane bridge.
- Commands embedded in a human protocol gesture could include one command or a series of commands. An example of a command could be “stop”. An example of a series of commands could be “change into oncoming traffic lane, advance, and return to original lane after a safe distance”.
- human protocol gestures include motorcycle hand signals as described in: http://www.motorcyclelegalfoundation.com/motorcycle-hand-signals-chart/. However, it should be appreciated that a human protocol gesture is not limited to being derived from hand movements.
- the human protocol gestures being discussed here would be destined for the driver of such vehicle, and there is an expectation on the part of the person making the human protocol gesture that the driver inside the vehicle will understand the gesture and will make resultant modifications to control of the vehicle. This expectation is created as part of the driving culture that has developed in North America and, similarly, in other parts of the world. For example, the United Kingdom uses the “signals by authorized persons”, found at: https://assets.publishing.service.gov.uk/media/560aa62bed915d035c00001b/the-highway-code-signals-by-authorised-persons.pdf.
- human protocol gesture detection is a result of receiving sensory data at step 310 (e.g., from the sensors 90 1 - 90 S such as cameras or LIDAR) and implementing an algorithm, by the controller 80 at step 320 , to recognize a human protocol gesture that may (or may not) be present in the received sensory data.
- Various gesture recognition algorithms may be used for this purpose.
- once a human protocol gesture is recognized at step 320 (i.e., the commands embedded therein have been decoded), a prescribed action can be taken at step 330 , involving a change to the manner in which the vehicle 10 is autonomously driven.
- a mapping between human protocol gestures and prescribed driving actions can be stored in the memory portion 170 of the controller 80 , such as in a database.
- an algorithm may be used (step 320 ), which includes recognizing hand and arm movements in images captured in the vicinity of the vehicle 10 (step 310 ).
- the recognized hand and arm movements may be correlated against a database of sequences of pre-determined hand and arm movements associated with different human protocol gestures.
- Various distance minimization algorithms used in pattern recognition can be used for the purposes of recognizing hand and arm movements with a certain level of confidence.
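A minimal version of this correlation step can be sketched as nearest-template matching: compare the observed sequence of hand positions against a database of pre-determined sequences and accept the closest one if its distance is below a confidence threshold. The templates, coordinate scheme, and threshold are illustrative assumptions, not the patent's actual algorithm:

```python
# Hypothetical sketch of step 320: match an observed sequence of hand
# positions against templates for different human protocol gestures
# using simple distance minimization. All data are assumptions.
import math

# Each template: gesture name -> sequence of (x, y) hand positions.
GESTURE_TEMPLATES = {
    "stop": [(0.0, 1.0), (0.0, 1.0), (0.0, 1.0)],
    "wave_through": [(0.0, 0.5), (0.5, 0.5), (1.0, 0.5)],
}

def sequence_distance(a, b):
    """Mean Euclidean distance between two equal-length point sequences."""
    return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)

def recognize_gesture(observed, max_distance=0.3):
    """Return the closest template's gesture, or None if confidence is too low."""
    name, dist = min(
        ((n, sequence_distance(observed, t)) for n, t in GESTURE_TEMPLATES.items()),
        key=lambda nd: nd[1],
    )
    return name if dist <= max_distance else None

observed = [(0.05, 0.95), (0.0, 1.0), (0.05, 1.0)]
recognize_gesture(observed)  # a raised, stationary hand matches "stop"
```

The recognized gesture name would then index the stored mapping of human protocol gestures to prescribed driving actions.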
- although hand gesture recognition technology is commercially available (see, for example, http://www.arcsoft.com/technology/gesture.html), it is envisaged that the person making the hand gesture (e.g., a police officer or crossing guard) may be oriented in such a way that the space defined by the movements of his or her two hands intersects to a point where it may be difficult to detect hand or arm movements without additional spatial registration or reference.
- the person executing the human protocol gesture may be provided with ancillary equipment, such as a stop sign or baton, which can be more easily detected by the sensors 90 1 - 90 S and used as a reference to allow more accurate detection of the hand and arm movements and hence the human protocol gesture.
- the ancillary equipment may include gloves 21 L , 21 R worn by the police officer or crossing guard 44 1 .
- Such gloves may be coated with, covered by, or otherwise include retroreflective material 39 L , 39 R . Such material facilitates detection from a direction that shines light onto the material.
- the left and right gloves 21 L , 21 R may have differences between them so as to provide more easily identifiable hand and arm movements. Such differences could include differences in color, contrast, or material patterns (including different patterns of retroreflective material). Stated differently, the left and right gloves 21 L , 21 R are non-symmetric in their distribution of retroreflective material. In this way, the control system is provided with two distinguishable references when processing the data obtained via the sensors 90 1 - 90 S , which could increase the accuracy of detecting the hand and arm movements, with a corresponding increase in the accuracy of recognizing the human protocol gesture underlying such hand movements.
- the human protocol gesture involves actions of a driver of an oncoming vehicle where, for example, a single lane must be shared between the vehicle 10 and the oncoming vehicle.
- the sensors 90 1 - 90 S capture images of the driver of the oncoming vehicle, who may be gesturing with his or her hands.
- the driver of the oncoming vehicle may also have triggered a detectable change to the oncoming vehicle, such as by temporarily flashing the head lights or the high beam to signal a desire to relinquish right-of-way to the vehicle 10 (not necessarily knowing a priori that the vehicle 10 is an autonomous vehicle).
- the control system 15 may cause the vehicle 10 to change modes of operation.
- the control system 15 may cause the vehicle 10 to enter a conventional (non-self-driving) mode, whereby control of the vehicle 10 is passed to a human that is occupying a driver's seat of the vehicle 10 .
- the control system 15 may enter an “autonomous override” mode of operation whereby the vehicle 10 is still in self-driving mode but behaves in a way that deviates from conventional algorithmic driving rules.
- the override mode of operation may be temporary, as it is expected that driving conditions will return to normal.
- each human protocol gesture may be associated with an expected length of time that the vehicle 10 will remain in the override mode. This amount of time may be variable, depending on the speed with which traffic is moving, the distance to the human 44 i carrying out the human protocol gesture, etc. Once the expected amount of time is reached, it is envisaged that there will no longer be any gesturing directed at the vehicle 10 and the control system 15 will have to determine autonomously the moment when it should return to a normal autonomous mode of operation (consistent with conventional algorithmic driving rules).
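A minimal sketch of this temporary override timing, assuming a simple elapsed-time check; how the expected duration itself is computed from traffic speed and distance to the gesturing human is not modeled here:

```python
import time

class OverrideMode:
    """Sketch of a temporary override mode with an expected duration.

    The duration value is hypothetical; per the description it would
    depend on traffic speed, distance to the gesturing human, etc."""

    def __init__(self, expected_duration_s):
        self.expected_duration_s = expected_duration_s
        self.entered_at = time.monotonic()

    def should_reassess(self, now=None):
        """True once the expected duration has elapsed, at which point the
        control system must autonomously decide whether to return to the
        normal autonomous mode of operation."""
        now = time.monotonic() if now is None else now
        return (now - self.entered_at) >= self.expected_duration_s
```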
- a police officer or crossing guard 44 i may have carried out a human protocol gesture that signals for the vehicle 10 to stop, despite the vehicle 10 having right-of-way under conventional algorithmic driving rules.
- the vehicle 10 enters the override mode.
- the control system 15 modifies the way in which the vehicle 10 is autonomously driven by (i) keeping the vehicle 10 stopped and (ii) continuing the detection of human protocol gestures until detecting that the vehicle 10 has been signaled to proceed.
- if the control system 15 detects a subsequent human protocol gesture that includes a command for the vehicle 10 to change lanes into an oncoming traffic lane, then modifying the way in which the vehicle is autonomously driven includes proceeding at low speed into the oncoming traffic lane (contrary to conventional algorithmic driving rules), determining a re-entry point into the regular lane, and re-entering the regular lane at that re-entry point. Thereafter, the vehicle 10 exits the override mode.
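The stop-then-proceed scenario above can be pictured as a small state machine; the state and event names below are illustrative, not taken from the specification:

```python
# Hypothetical state machine for the override scenario: the vehicle is
# signaled to stop, then either signaled to proceed normally or directed
# into the oncoming lane until a re-entry point is reached.
TRANSITIONS = {
    ("normal", "stop"): "stopped_override",
    ("stopped_override", "proceed"): "normal",
    ("stopped_override", "change_lane"): "oncoming_lane_override",
    ("oncoming_lane_override", "reentry_point_reached"): "normal",
}

def next_state(state, event):
    """Advance the override state; unknown events leave the state unchanged."""
    return TRANSITIONS.get((state, event), state)
```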
- Another way to condition the vehicle 10 to exit the override mode may be to learn about its surroundings (e.g., using the sensors 90 1 - 90 S ).
- the control system 15 may implement an accident detection module that is configured to detect a scene of an accident based on factors such as vehicle shape and position distortions, color anomalies, broken glass fragments on the ground, presence of ambulances and so on.
- the control system 15 may be configured to determine a safe distance from the scene of the accident after which the vehicle 10 may return to its original lane and exit the override mode.
- the control system 15 may perform an additional validation step, in order to confirm the authority of the source of the human protocol gesture, before proceeding to alter driving behavior.
- the control system 15 may perform the validation step based on detection of a uniform (e.g., in the case of a police officer or crossing guard, whereby the uniform could include one or more of a vest, hat, badge, pants and shoes) or based on detection of a human driver of an oncoming car, as well as detecting “eye contact” with that human driver.
- the gloves 21 L , 21 R worn by police officers or crossing guards may lead to new gestures targeted specifically at autonomous vehicles such as the vehicle 10 .
- Such gestures could involve hand and arm movements that would not be intuitively understood by human drivers yet would be ideally suited for detection by cameras and/or LIDAR.
- certain newly created movements, positions or signs may serve to cancel or reset the control system's interpretation of any ongoing human protocol gesture so as to allow the human to restart communications using hand and arm movements.
- hand and arm movements may be recorded in memory and post-processed for algorithm improvement.
- indicators 89 1 - 89 G may be configured to be placed at particular locations 94 1 - 94 L and recognizable by the control system 15 of the vehicle 10 such that the control system 15 autonomously operates the vehicle 10 (e.g., steers, decelerates, stops, opens one or more of the windows 21 1 - 21 W , unlocks one or more doors of the cabin 22 , etc.) at these particular locations based on recognition of the indicators 89 1 - 89 G .
- the indicators 89 1 - 89 G can provide information about the particular locations 94 1 - 94 L to the control system 15 of the vehicle 10 that may otherwise be unobtainable by the control system 15 through its sensing apparatus 82 monitoring the environment of the vehicle 10 if the indicators 89 1 - 89 G were absent from that environment.
- this may be useful when the vehicle 10 moves at drive-through establishments (e.g., restaurants, banks, etc.), travels where potholes are present, looks to park, and/or is in other situations in which certain aspects of the particular locations 94 1 - 94 L would otherwise not be timely known by the control system 15 of the vehicle 10 .
- An indicator 89 x is a physical object dedicated to autonomous vehicles like the vehicle 10 , designed to be placed at a particular location 94 y and recognized by the control systems of such autonomous vehicles (like the control system 15 of the vehicle 10 ) to cause those control systems to operate the autonomous vehicles based on recognition of the indicator 89 x . That is, the control system 15 of the vehicle 10 operates the vehicle 10 differently when recognizing the indicator 89 x than it would had it not recognized the indicator 89 x .
- the indicator 89 x has an associated predefined meaning such that, upon being recognized by the control system 15 of the vehicle 10 , the control system 15 knows what the indicator 89 x means.
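One way to picture these predefined meanings is a lookup table mapping recognized indicator identifiers to actions; every identifier and action field below is hypothetical, invented only to illustrate the mapping:

```python
# Hypothetical mapping of recognized indicator identifiers to their
# predefined meanings; identifiers and action parameters are illustrative.
INDICATOR_ACTIONS = {
    "drive_through_stop": {"action": "stop", "align": "window"},
    "pothole_ahead": {"action": "avoid", "margin_m": 0.5},
    "parking_spot": {"action": "park", "reference": "indicator_center"},
}

def handle_indicator(indicator_id):
    """Return the predefined action for a recognized indicator, or None for
    an unrecognized one, in which case the vehicle is operated as if no
    indicator were present."""
    return INDICATOR_ACTIONS.get(indicator_id)
```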
- the indicator 89 x is not a traffic sign (a.k.a., road sign) conventionally used for human-driven vehicles.
- the indicator 89 x at the particular location 94 y may be implemented in any suitable way in various embodiments.
- the indicator 89 x may be an optical indicator configured to be optically observed by the sensing apparatus 82 of the control system 15 of the vehicle 10 .
- the indicator 89 x may include a visual element 95 such as an image (e.g., a symbol, etc.), a color, etc., capturable by a camera of the sensing apparatus 82 and recognizable by the controller 80 of the control system 15 .
- the visual element 95 may be printed, painted or otherwise applied.
- the indicator 89 x may comprise a supporting portion 97 (e.g., a wall, panel, etc.) and the visual element 95 may include a layer that is printed, painted or otherwise applied onto the supporting portion 97 .
- the visual element 95 may be printed, painted or otherwise applied directly onto an existing structure (e.g., part of the road 11 , a building wall, etc.) at the particular location 94 y .
- the visual element 95 may include material (e.g., tape, paint, ink, etc.) more easily observable by the camera of the sensing apparatus 82 , such as by being more reflective (e.g., highly retroreflective, reflective of IR or other particular wavelengths, etc.)
- the indicator 89 x may be a signal emitter (e.g., a beacon) configured to emit a signal receivable by the communication interface 68 of the vehicle 10 and recognizable by the controller 80 of the control system 15 .
- the indicator 89 x may include a transmitter 98 configured to transmit the signal repeatedly (e.g., periodically) or in response to a trigger or interrogation signal previously issued by the vehicle 10 .
- the signal emitted by the indicator 89 x may be wirelessly conveyed via a cellular, WiFi, BlueTooth, or other wireless link.
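As one assumption-laden sketch of the signal-emitter variant, a beacon might broadcast a small JSON payload over UDP, either periodically or after an interrogation; the port number and payload format are inventions for illustration and are not specified in the source:

```python
import json
import socket
import time

def make_beacon_payload(indicator_id, meaning):
    """Encode the indicator's identifier and predefined meaning.

    The JSON payload format is an assumption made for this sketch."""
    return json.dumps({"indicator": indicator_id, "meaning": meaning}).encode("utf-8")

def broadcast_indicator(indicator_id, meaning, port=47474, period_s=1.0, repeats=3):
    """Repeatedly broadcast the payload over UDP (illustrative port)."""
    payload = make_beacon_payload(indicator_id, meaning)
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        for _ in range(repeats):
            sock.sendto(payload, ("<broadcast>", port))
            time.sleep(period_s)
```

The vehicle-side communication interface would listen on the agreed port and hand decoded payloads to the controller for recognition.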
- an indicator 89 x may be placed at a particular location 94 y where an interaction with an external element 106 that is external to the vehicle 10 is to occur, such that the control system 15 of the vehicle 10 autonomously stops the vehicle 10 at the particular location 94 y in order to allow occurrence of the interaction with the external element 106 .
- the particular location 94 y is at a drive-through establishment 108 , such as a restaurant (e.g., a fast-food restaurant, a coffee shop, etc.), in which case the external element 106 is a drive-through counter at which to pay and/or pick up an order of food and/or beverage, or a bank, in which case the external element 106 is an automated teller machine (ATM) at which to perform a financial transaction.
- Upon recognizing the indicator 89 x , the control system 15 of the vehicle 10 understands that the vehicle 10 is to be stopped at the particular location 94 y , which may be set so that a given one of the windows 21 1 - 21 W of the cabin 22 is aligned with the drive-through counter, ATM or other external element 106 to proceed with the interaction with the external element 106 .
- an indicator 89 x may be placed at a particular location 94 y that should be avoided by the vehicle 10 , such that the control system 15 of the vehicle 10 autonomously steers the vehicle 10 to avoid the particular location 94 y .
- the particular location 94 y is at a pothole 112 on the road 11 .
- Upon recognizing the indicator 89 x , the control system 15 of the vehicle 10 understands that the vehicle 10 is to avoid the pothole 112 at the particular location 94 y and determines an alternative path to steer the vehicle 10 without crossing the pothole 112 (e.g., by using the sensing apparatus 82 to assess whether there is an oncoming vehicle in an adjacent lane, etc.).
- an indicator 89 x may be placed at a particular location 94 y that is a parking spot 118 for the vehicle 10 , such that the control system 15 of the vehicle 10 autonomously parks the vehicle 10 at the parking spot 118 .
- the parking spot 118 at the particular location 94 y may not be indicated by conventional paint on the road 11 or other conventional parking signs, so that it may not be apparent to the control system 15 of the vehicle 10 that the vehicle 10 can park there.
- Upon recognizing the indicator 89 x , the control system 15 of the vehicle 10 understands that the vehicle 10 can park at the parking spot 118 at the particular location 94 y and proceeds to autonomously park the vehicle 10 there (e.g., by using the indicator 89 x as a reference for parking, such as a center, corner or other reference point of the parking spot 118 ).
- the autonomous vehicle 10 may include occupant-act facilitators 45 1 - 45 D that comprise devices configured to facilitate one or more acts of one or more occupants in the cabin 22 of the vehicle 10 , such as one or more acts unrelated to and normally not done while driving, including, for example, sleeping, exercising, working, eating, cooking, and/or any other suitable act.
- an occupant-act facilitator 45 i may be a sleeping facilitator configured to facilitate sleeping of an occupant in the cabin 22 , such as by altering the cabin 22 to reduce stimuli (e.g., light, noise, vibrations, etc.) from the vehicle 10 and/or its environment.
- the sleeping facilitator 45 i may comprise the light-control system 55 to control (e.g., reduce) light entering into the cabin 22 via the windows 21 1 - 21 W , which may comprise the light blocker 27 , such as the window covering 23 deployable to cover at least part of the windows 21 1 - 21 W and/or the window transmissivity changer 25 (e.g., film) to change the tint or other aspect affecting transmissivity of one or more of the windows 21 1 - 21 W , as discussed above.
- the sleeping facilitator 45 j may comprise the noise-control system 57 configured to control (e.g., reduce) noise in the cabin 22 , which may comprise the noise canceller 59 to at least partly cancel the noise entering the cabin 22 , as discussed above.
- the sleeping facilitator 45 i may comprise the seat-altering system 52 configured to alter a seat 20 i (e.g., a driver's seat) occupied by the occupant, which may comprise the pillow 63 for the seat 20 i , as discussed above.
- the sleeping facilitator 45 i may be manually operated within the cabin 22 by an occupant.
- the occupant may interact with the user interface 70 to input commands to activate, move, and/or otherwise control the sleeping facilitator 45 i when he/she desires to sleep.
- one or more functionalities of the sleeping facilitator 45 i that enhance privacy and comfort as discussed above may also be used by the occupant for purposes other than sleep. For instance, in some embodiments, this may be used by the occupant for relaxing (without necessarily sleeping), sex, etc.
- an occupant-act facilitator 45 j may be a working facilitator configured to facilitate work of an occupant in the cabin 22 by (e.g., altering the cabin 22 for) providing a workspace 64 for the occupant.
- the working facilitator 45 i providing the workspace 64 may comprise a desk 65 (e.g., a table) on which the occupant can work, such as by supporting a computer (e.g., a laptop computer, a tablet, etc.), papers, pens, and/or other work items used by the occupant.
- the working facilitator 45 i may include a computer mount 66 , such as a docking station and/or connectors (e.g., one or more power outlets or other electrical connectors, USB connectors, etc.) associated with the desk 65 .
- the working facilitator 45 i may also include a screen 67 connectable to the computer (e.g., via the computer mount 66 ) and integrated into the cabin 22 (e.g., in a dashboard, such as part of the user interface 70 ).
- At least part of the working facilitator 45 i providing the workspace 64 may be movable between a working position, in which it is usable by the occupant to work, and a nonworking (e.g., stowed) position, in which it is stowed (e.g., stored), concealed and/or otherwise not usable by the occupant to work.
- at least part of the working facilitator 45 i providing the workspace 64 may be deployable (e.g., extendible) from the nonworking position into the working position and retractable from the working position into the nonworking position (e.g., in which it may be concealed by a door).
- the desk 65 may be movable between the working position, in which it extends over the occupant while he/she is sitting on a seat 20 i so as to be usable by the occupant to work on the desk 65 , and the nonworking position, in which it clears (i.e., does not extend over) the occupant while he/she is sitting on the seat 20 i so that the occupant is unimpeded by the desk 65 .
- the desk 65 may be movable between the working position, in which it extends over an adjacent one of the seats 20 1 - 20 W (e.g., a passenger seat) that is adjacent to the seat 20 i of the occupant that can be rotated to face the desk 65 , and the nonworking position, in which it clears (i.e., does not extend over) that adjacent seat.
- the desk 65 may be deployable (e.g., extendible) from the nonworking position, in which it is disposed in a recess (e.g., between adjacent ones of the seats 20 1 - 20 S , below a dashboard of the user interface 70 , etc.) into the working position over the occupant in the seat 20 i and retractable from the working position into the nonworking position.
- the workspace 64 includes the screen 67 for the computer, the screen 67 may also be movable (e.g., deployable and retractable) between the working position and the nonworking position.
- an occupant-act facilitator 45 k may be an exercising facilitator configured to facilitate exercising of an occupant in the cabin 22 by (e.g., altering the cabin 22 for) providing an exerciser 71 for the occupant.
- the exerciser 71 can comprise any apparatus usable by the occupant to physically exercise.
- the exerciser 71 may comprise a cardiovascular exercising device 47 .
- the cardiovascular exercising device 47 may comprise a leg-motion mechanism configured to be operated by legs of the occupant (e.g., including pedals, gliders, and/or other feet-engaging elements configured to be engaged by the occupant's feet to operate the leg-motion mechanism, in a pedaling, swinging, or any other suitable movement).
- the cardiovascular exercising device 47 may comprise an arm-motion mechanism configured to be operated by arms of the occupant (e.g., including handles and/or other hand-engaging elements configured to be engaged by the occupant's hands to operate the arm-motion mechanism, in a rowing, pulling or any other suitable movement).
- the cardiovascular exercising device 47 may comprise both the leg-motion mechanism and the arm-motion mechanism configured to be operated by the occupant's legs and arms (e.g., akin to an elliptical exercising machine).
- the exerciser 71 may comprise a strength training device 48 .
- the strength training device 48 may comprise an arm-motion mechanism configured to be operated by the occupant's arms (e.g., including handles and/or other hand-engaging elements configured to be engaged by the occupant's hands to operate the arm-motion mechanism, in a bending, pulling or any other suitable movement).
- the strength training device 48 may comprise a leg-motion mechanism configured to be operated by the occupant's legs (e.g., including pedals, gliders, and/or other feet-engaging elements configured to be engaged by the occupant's feet to operate the leg-motion mechanism, in a pushing, raising, and/or any other suitable movement).
- the exerciser 71 may provide resistance for exercising of the occupant in any suitable way.
- the exerciser 71 may comprise free weights that can be used by the occupant to exercise.
- the exerciser 71 may include a free-weight holder (e.g., rack) to hold the free weights when not in use.
- the exerciser 71 may comprise a fluidic (e.g., hydraulic or pneumatic) resistance mechanism providing pressure to be moved against by the occupant during exercising.
- at least part of the cardiovascular exercising device 47 and at least part of the strength training device 48 of the exerciser 71 may be implemented by a common device.
- the exerciser 71 may be connected to the powertrain 14 of the vehicle 10 to recharge the battery.
- a generator 72 is drivable by the exerciser 71 to generate electrical power applied to the battery to recharge the battery.
- the user interface 70 of the cabin 22 may indicate to the occupant how much power he/she has given to the vehicle 10 by exercising (e.g., an indication of watts, a range in kilometers or miles for the vehicle 10 , etc.).
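The power-to-range indication described above reduces to a simple unit conversion; the consumption figure in the sketch below is an assumed example value, not one from the specification:

```python
def exercise_range_gain_km(avg_power_w, duration_s, efficiency_wh_per_km=150.0):
    """Convert the occupant's exercising power into added driving range.

    efficiency_wh_per_km is an assumed consumption figure for an electric
    vehicle; real values vary by vehicle, load and driving conditions."""
    energy_wh = avg_power_w * duration_s / 3600.0  # watts x hours
    return energy_wh / efficiency_wh_per_km
```

For instance, an occupant sustaining 150 W for an hour generates 150 Wh, which at the assumed 150 Wh/km adds about one kilometer of range; the user interface could display either figure.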
- At least part of the exercising facilitator 45 k providing the exerciser 71 may be movable between an exercising position, in which it is usable by the occupant to exercise, and a nonexercising (e.g., stowed) position, in which it is stowed (e.g., stored), concealed and/or otherwise not usable by the occupant to exercise.
- at least part of the exercising facilitator 45 k providing the exerciser 71 may be deployable (e.g., extendible) from the nonexercising position into the exercising position and retractable from the exercising position into the nonexercising position (e.g., in which it may be concealed by a door).
- the exerciser 71 may be movable between the exercising position, in which it extends to be engageable by the occupant while he/she is sitting on a seat 20 i (e.g., a driver's seat) so as to be usable by the occupant to exercise, and the nonexercising position, in which it clears (i.e., is unengageable by) the occupant while he/she is sitting on the seat 20 i so that the occupant is unimpeded by the exerciser 71 .
- the exerciser 71 may be deployable from the nonexercising position, in which it is disposed in a recess (e.g., between adjacent ones of the seats 20 1 - 20 S , below a dashboard of the user interface 70 , etc.), into the exercising position to engage the occupant in the seat 20 i and retractable from the exercising position into the nonexercising position.
- an occupant-act facilitator 45 m may be an eating facilitator configured to facilitate eating by an occupant in the cabin 22 by (e.g., altering the cabin 22 for) providing an eating area 75 for the occupant.
- the eating facilitator 45 m providing the eating area 75 may comprise a table 77 (e.g., a tray or other flat support) on which the occupant can eat, such as by supporting food and tableware (e.g., dishes, glasses, knives, forks, etc.) used by the occupant.
- At least part of the eating facilitator 45 m providing the eating area 75 may be movable between an eating position, in which it is usable by the occupant to eat, and a noneating (e.g., stowed) position, in which it is stowed (e.g., stored), concealed and/or otherwise not usable by the occupant to eat.
- at least part of the eating facilitator 45 m providing the eating area 75 may be deployable (e.g., extendible) from the noneating position into the eating position and retractable from the eating position into the noneating position (e.g., in which it may be concealed by a door).
- the table 77 may be movable between the eating position, in which it extends over the occupant while he/she is sitting on a seat 20 i (e.g., a driver's seat) so as to be usable by the occupant to eat at the table 77 , and the noneating position, in which it clears (i.e., does not extend over) the occupant while he/she is sitting on the seat 20 i so that the occupant is unimpeded by the table 77 .
- the table 77 may be movable between the eating position, in which it extends over an adjacent one of the seats 20 1 - 20 W (e.g., a passenger seat) that is adjacent to the seat 20 i of the occupant that can be rotated to face the table 77 , and the noneating position, in which it clears (i.e., does not extend over) that adjacent seat.
- the table 77 may be deployable (e.g., extendible) from the noneating position, in which it is disposed in a recess (e.g., between adjacent ones of the seats 20 1 - 20 S , below a dashboard of the user interface 70 , etc.) into the eating position over the occupant in the seat 20 i and retractable from the eating position into the noneating position.
- the eating facilitator 45 m may comprise a holder 78 to hold the tableware on the table 77 while the vehicle 10 is in motion.
- the holder 78 may comprise a mechanical holder (e.g., a clamp, a recess, etc.) to abut and mechanically hold the tableware in place.
- the holder 78 may comprise a magnetic holder, such as a magnet attracting an opposite magnet secured to (e.g., built into or adhesively bonded at an underside of) the tableware, to magnetically hold the tableware in place.
- the eating facilitator 45 m may comprise a waste disposer 69 to dispose of waste (i.e., garbage) such as what is not eaten by the occupant.
- the waste disposer 69 may comprise a garbage can configured to receive the waste, either directly or in a garbage bag placed in the can, and to be emptied.
- the garbage can may include a lid having a sealing mechanism to limit odors propagating in the cabin 22 .
- the waste disposer 69 may comprise a garburator to break apart the waste. In some cases, the garburator may receive water from a tank (e.g., filled by rain water falling onto the vehicle 10 ).
- the waste disposer 69 may be located in the vehicle 10 so as to be vented (e.g., open to a vent exposed to ambient air outside the vehicle 10 , such as at an underside of the vehicle 10 ). In various embodiments, the waste disposer 69 may be emptied by removing and emptying the garbage can manually or by pumping or otherwise emptying the garburator.
- an occupant-act facilitator 45 n may be a cooking facilitator configured to facilitate cooking by an occupant in the cabin 22 by (e.g., altering the cabin 22 for) providing a cooking area 79 for the occupant.
- the cooking facilitator 45 n providing the cooking area 79 may comprise the table 77 as discussed above that can be used by the occupant to cut, mix, and/or otherwise prepare food in addition to eating.
- the cooking facilitator 45 n providing the cooking area 79 may comprise one or more appliances 81 1 - 81 T configured to cook.
- an appliance 81 i may be an oven, stove, slow cooker, or grill (e.g., a microwave oven, an electric stove, an electric grill, etc.) or other heating appliance to cook by heat.
- venting may be effected by opening one or more of the windows 21 1 - 21 W and/or by providing a vent associated with the heating appliance 81 i .
- an appliance 81 j may be a refrigerator to refrigerate ingredients (e.g., produce, meat, fish, poultry, etc.) usable by the occupant to cook.
- the refrigerator 81 j may be powered by a battery (e.g., dedicated to powering the refrigerator and recharged by a solar panel including photovoltaics).
- the cooking facilitator 45 n may comprise the waste disposer 69 configured to dispose of waste which is not used by the occupant when cooking.
- At least part of the cooking facilitator 45 n providing the cooking area 79 may be movable between a cooking position, in which it is usable by the occupant to cook, and a noncooking (e.g., stowed) position, in which it is stowed (e.g., stored), concealed and/or otherwise not usable by the occupant to cook.
- at least part of the cooking facilitator 45 n providing the cooking area 79 may be deployable (e.g., extendible) from the noncooking position into the cooking position and retractable from the cooking position into the noncooking position (e.g., in which it may be concealed by a door).
- the table 77 may be movable between the cooking position, in which it extends over the occupant while he/she is sitting on a seat 20 i (e.g., a driver's seat) so as to be usable by the occupant to cook at the table 77 , and the noncooking position, in which it clears (i.e., does not extend over) the occupant while he/she is sitting on the seat 20 i so that the occupant is unimpeded by the table 77 . Alternatively, it may be movable between the cooking position, in which it extends over an adjacent one of the seats 20 1 - 20 W (e.g., a passenger seat) that is adjacent to the seat 20 i of the occupant and can be rotated to face the table 77 , and the noncooking position, in which it clears (i.e., does not extend over) that adjacent seat, as discussed above.
- an appliance 81 x may be movable between the cooking position, in which it can be used by the occupant while he/she is sitting on a seat 20 i (e.g., a driver's seat) to cook, and the noncooking position, in which it is stowed, concealed and/or otherwise unusable by the occupant while he/she is sitting on the seat 20 i .
- the appliance 81 x may be deployable (e.g., extendible) from the noncooking position, in which it is disposed in a recess (e.g., between adjacent ones of the seats 20 1 - 20 S , below a dashboard of the user interface 70 , etc.) into the cooking position for the occupant in the seat 20 i and retractable from the cooking position into the noncooking position.
- an occupant-act facilitator 45 x may be built into (i.e., integrated in) the cabin 22 during original manufacturing of the autonomous vehicle 10 (e.g., below a dashboard, on a console of the user interface 70 , on or between a given one of the seats 20 1 - 20 S , etc.).
- an occupant-act facilitator 45 x may be configured to be installed in the cabin 22 after original manufacturing of the vehicle 10 (e.g., an aftermarket device installable in the cabin 22 by an owner or lessee of the vehicle 10 ).
- the occupant-act facilitator 45 x may comprise a connector 88 configured to connect the occupant-act facilitator 45 x to a supporting portion 91 of the cabin 22 (e.g., a wall of the cabin 22 below a dashboard, adjacent to a console of the user interface 70 , on or between a given one of the seats 20 1 - 20 S , etc.).
- the connector 88 may comprise one or more fasteners, such as screws, bolts, hook-and-loop (e.g., Velcro) fasteners, clips, clamps, and/or any other fastening device.
- the supporting portion 91 of the cabin 22 may comprise a connector 92 complementary to and configured to engage and interconnect with the connector 88 of the occupant-act facilitator 45 x (e.g., one or more (e.g., threaded) openings, clips, latches, etc.).
- the connector 92 of the supporting portion 91 of the cabin 22 may be built into (i.e., integrated into) the cabin 22 during original manufacturing of the autonomous vehicle 10 .
- the connector 92 of the supporting portion 91 of the cabin 22 may be configured to be installed in the cabin 22 after original manufacturing of the vehicle 10 along with the occupant-act facilitator 45 x .
- the vehicle 10 may be personalized for an occupant based on an identity of the occupant, such that one or more aspects of the vehicle 10 , like a configuration of the cabin 22 , the self-driving mode of the control system 15 of the vehicle 10 , a destination and/or a route of the vehicle 10 , and/or other aspects of the vehicle 10 , are adjusted based on the identity of the occupant.
- this may be useful where different occupants use the vehicle 10 at different times, whether the vehicle 10 is a private one (e.g., which may be used by parents and their children) or a public one used as part of a taxi, ride-hailing or vehicle-sharing service.
- control system 15 is configured to receive an identifier 121 indicative of the identity of the occupant and to adjust one or more aspects of the vehicle 10 based on the identity of the occupant.
- the identifier 121 may include a name, a code, or other identification information input by the occupant.
- the identifier 121 may include a biometric of the occupant, such as a picture, fingerprint, voice print, etc.
- the identifier 121 may be input by the occupant via the user interface 70 of the cabin 22 (e.g., using buttons, a camera or other biometric reader, etc.). In other cases, the identifier 121 may be transmitted from a personal device carried by the occupant, such as a smartphone or other wireless phone, a tablet computer, a head-mounted display, smartwatch or other wearable device, etc., to the communication interface 68 of the vehicle 10.
- a database 130 may store a record 152 including the identifier 121 of the occupant and travel information 145 , which may be indicative of a self-driving preference, a route, a destination, etc., of the occupant when travelling in the vehicle 10 .
- the database 130 can store multiple such records including identifiers of various individuals who may travel in the vehicle 10 and travel information, which may be indicative of preferences, routes, destinations, etc., of these individuals when travelling in the vehicle 10 , so that one or more aspects of the vehicle 10 may be adjusted based on identities of these various individuals when they are occupants.
- the database 130 may be part of the controller 80 of the vehicle 10 .
- the database 130 may be part of a server external to the vehicle 10 and accessible by the controller 80 via the communication interface 68 of the vehicle 10 .
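The record lookup described above — matching the identifier 121 against stored records to retrieve travel information 145 — can be sketched as follows. This is a minimal illustration under stated assumptions; the field names and default values are hypothetical, not from the patent:

```python
from dataclasses import dataclass, field
from typing import Dict, Optional

@dataclass
class TravelInfo:
    """Illustrative stand-in for the travel information 145 of a record 152."""
    self_driving_preference: str = "normal"
    destination: Optional[str] = None
    route: Optional[str] = None

@dataclass
class OccupantDatabase:
    """Illustrative stand-in for the database 130: records keyed by the identifier 121."""
    records: Dict[str, TravelInfo] = field(default_factory=dict)

    def personalize(self, identifier: str) -> TravelInfo:
        # Unknown occupants fall back to default settings rather than failing.
        return self.records.get(identifier, TravelInfo())

# Example: registering one occupant's preferences and looking them up.
db = OccupantDatabase()
db.records["occupant-123"] = TravelInfo(self_driving_preference="relaxed", destination="home")
assert db.personalize("occupant-123").destination == "home"
assert db.personalize("unknown-occupant").self_driving_preference == "normal"
```

Keeping a per-occupant record keyed by the identifier is what lets the same vehicle serve different occupants at different times, as the text notes for private and shared vehicles alike.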
- the controller 80 may:
- information to personalize the vehicle 10 based on the identity of the occupant like the travel information 145 may be stored in a personal device carried by the occupant, such as a smartphone or other wireless phone, a tablet computer, a head-mounted display, smartwatch or other wearable device, etc., and transmitted to the controller 80 of the vehicle 10 via the communication interface 68 .
- one or more systems, devices and/or other components discussed above may be an aftermarket apparatus 30 configured to be installed in the cabin 22 after original manufacturing of the vehicle 10 and, in some cases, may be configured to be automatically controlled by a controller 93 that is implemented after original manufacturing of the vehicle 10 .
- the controller 93 may function as discussed above in respect of the controller 80 of the control system 15 of the autonomous vehicle 10 .
- the controller 93 may be configured to monitor the interior of the cabin 22 and, in response to detecting an actionable event in the cabin 22, cause the vehicle 10 to autonomously reroute itself, cause issuance of a notification 85 to a communication device 87 external to the vehicle 10, cause the cabin 22 to be altered, cause the self-driving mode of the vehicle 10 to be altered, cause the vehicle 10 to autonomously perform one or more actions based on interactions with (e.g., gestures of) humans external to the vehicle 10, etc., as discussed above in respect of the controller 80.
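The monitor-and-respond behavior above is essentially a dispatch from detected events to one or more responses. A minimal sketch follows; the event names and action names are illustrative assumptions, not terminology from the patent:

```python
# Hypothetical mapping from detected in-cabin events to the kinds of responses
# the text lists (rerouting, external notification, cabin or self-driving changes).
EVENT_ACTIONS = {
    "medical_emergency": ["reroute", "notify_external_device"],
    "occupant_asleep": ["alter_cabin", "alter_self_driving_mode"],
}

def handle_event(event: str) -> list:
    """Return the list of actions to perform for a detected actionable event."""
    # Events that are not actionable trigger no response.
    return EVENT_ACTIONS.get(event, [])

assert handle_event("medical_emergency") == ["reroute", "notify_external_device"]
assert handle_event("door_closed") == []
```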
- the controller 93 comprises an interface 266 , a processing portion 268 , and a memory portion 270 , which are implemented by suitable hardware and software.
- the interface 266 comprises one or more inputs and outputs allowing the controller 93 to receive input signals from and send output signals to other components to which the controller 93 is connected (i.e., directly or indirectly connected), including the aftermarket apparatus 30 (e.g., which may include one or more of: the window covering 23 , the window transmissivity changer 25 , and/or other components of the light-control system 55 ; the noise canceller 59 and/or other components of the noise-control system 57 ; the desk 65 , the computer mount 66 , the exerciser 71 , the one or more appliances 81 1 - 81 T , and/or other components of the occupant-act facilitators 45 1 - 45 D ); one or more of the sensors 75 1 - 75 P ; the powertrain 14 ; the steering system 16 ; the user interface 70 ; the communication interface 68 ; etc.
- the processing portion 268 comprises one or more processors for performing processing operations that implement functionality of the controller 93 .
- a processor of the processing portion 268 may be a general-purpose processor executing program code stored in the memory portion 270 .
- a processor of the processing portion 268 may be a specific-purpose processor comprising one or more preprogrammed hardware or firmware elements (e.g., application-specific integrated circuits (ASICs), electrically erasable programmable read-only memories (EEPROMs), etc.) or other related elements.
- the memory portion 270 comprises one or more memories for storing program code executed by the processing portion 268 and/or data (e.g., maps, vehicle parameters, etc.) used during operation of the processing portion 268 .
- a memory of the memory portion 270 may be a semiconductor medium (including, e.g., a solid-state memory), a magnetic storage medium, an optical storage medium, and/or any other suitable type of memory.
- a memory of the memory portion 270 may be read-only memory (ROM) and/or random-access memory (RAM), for example.
- the controller 93 may comprise and/or interact with one or more other control units of the autonomous vehicle 10 .
- the controller 93 may comprise and/or interact with a powertrain control unit of the powertrain 14 , such as an engine control unit (ECU), a transmission control unit (TCU), etc.
- the controller 93 may be configured to be installed and implemented into the vehicle 10 after original manufacturing of the vehicle 10 (e.g., an aftermarket device installable and implementable in the vehicle 10 by an owner or leaser of the vehicle 10 ).
- software implementing functionality of the controller 93 may be downloaded onto the memory portion 170 of the controller 80 built into the vehicle 10 during original manufacturing of the vehicle 10 such that the controller 80 becomes the controller 93 (i.e., the interface 166 , the processing portion 168 and the memory portion 170 of the controller 80 respectively become the interface 266 , the processing portion 268 and the memory portion 270 of the controller 93 ).
- the controller 93 may be a standalone controller that is separate from the controller 80 of the control system 15 of the vehicle 10 and provided together with the aftermarket apparatus 30 that it is configured to control (e.g., the window covering 23 , the window transmissivity changer 25 , and/or other components of the light-control system 55 ; the noise canceller 59 and/or other components of the noise-control system 57 ; the desk 65 , the computer mount 66 , the exerciser 71 , the one or more appliances 81 1 - 81 T , and/or other components of the occupant-act facilitators 45 1 - 45 D ; one or more of the sensors 75 1 - 75 P ; etc.), as part of an aftermarket kit.
Abstract
An autonomous vehicle configured to facilitate its use and/or enhance what it and/or occupants can do with it, such as, for example, by: autonomously acting based on events within the autonomous vehicle, including by autonomously rerouting itself, altering a cabin of the autonomous vehicle, notifying a third party external to the autonomous vehicle, stopping (e.g., parking) the autonomous vehicle, altering how the autonomous vehicle drives itself, and/or performing other actions based on what and/or how an occupant is doing, an emergency, or another event occurring in the cabin of the autonomous vehicle; autonomously acting based on interactions with (e.g., gestures of) humans (e.g., police officers, school-crossing guards, traffic guards at roadwork or other temporary traffic control sites, drivers of other vehicles, etc.) external to the autonomous vehicle; autonomously acting based on indicators placed at particular locations (e.g., drive-through establishments, potholes, parking spots, etc.); facilitating acts of occupants in the cabin of the autonomous vehicle, such as acts unrelated to and normally not done while driving, including, for example, sleeping, exercising, working, eating, cooking, and/or any other suitable act; and/or automatically personalizing the autonomous vehicle for an occupant (e.g., a configuration of the cabin, a self-driving mode, a destination and/or a route, etc.).
Description
- This application claims priority from U.S. Provisional Patent Application 62/631,941 filed on Feb. 19, 2018 and incorporated by reference herein.
- This disclosure generally relates to autonomous vehicles (e.g., autonomous automobiles, trucks and other road vehicles).
- Autonomous vehicles (sometimes referred to as “self-driving” or “driverless” vehicles), such as autonomous automobiles, trucks and other road vehicles, are operable without human control, including by steering, accelerating, and decelerating (e.g., braking) autonomously without human control, to travel to a destination.
- While autonomous vehicles can provide many benefits such as increased safety, reduced traffic and more free time, there may also be issues or opportunities arising in respect of autonomous vehicles which may not arise with or be less relevant for conventional vehicles driven by human drivers.
- For these and other reasons, there is a need for improvements directed to autonomous vehicles.
- According to various aspects of this disclosure, there is provided an autonomous vehicle configured to facilitate its use and/or enhance what it and/or occupants can do with it, such as, for example, by: autonomously acting based on events within the autonomous vehicle, including by autonomously rerouting itself, altering a cabin of the autonomous vehicle, notifying a third party external to the autonomous vehicle, stopping (e.g., parking) the autonomous vehicle, altering how the autonomous vehicle drives itself, and/or performing other actions based on what and/or how an occupant is doing, an emergency, or another event occurring in the cabin of the autonomous vehicle; autonomously acting based on interactions with (e.g., gestures of) humans (e.g., police officers, school-crossing guards, traffic guards at roadwork or other temporary traffic control sites, drivers of other vehicles, etc.) external to the autonomous vehicle; autonomously acting based on indicators placed at particular locations (e.g., drive-through establishments, potholes, parking spots, etc.); facilitating acts of occupants in the cabin of the autonomous vehicle, such as acts unrelated to and normally not done while driving, including, for example, sleeping, exercising, working, eating, cooking, and/or any other suitable act; and/or automatically personalizing the autonomous vehicle for an occupant (e.g., a configuration of the cabin, a self-driving mode, a destination and/or a route, etc.).
- For example, in accordance with an aspect, this disclosure relates to an autonomous vehicle comprising a control system configured to cause the autonomous vehicle to autonomously perform an action based on an event within the autonomous vehicle.
- In accordance with another aspect, this disclosure relates to an autonomous vehicle comprising a cabin and a control system configured to cause the autonomous vehicle to autonomously perform an action based on an event in the cabin.
- In accordance with another aspect, this disclosure relates to an autonomous vehicle comprising a cabin and a control system configured to cause the autonomous vehicle to autonomously perform an action based on a state of an occupant in the cabin.
- In accordance with another aspect, this disclosure relates to an autonomous vehicle comprising a control system configured to cause the autonomous vehicle to autonomously perform an action based on an interaction with a human external to the autonomous vehicle.
- In accordance with another aspect, this disclosure relates to an autonomous vehicle comprising a control system configured to cause the autonomous vehicle to autonomously perform an action based on a human protocol gesture made by a human external to the autonomous vehicle.
- In accordance with another aspect, this disclosure relates to an autonomous vehicle comprising a cabin and a control system configured to personalize the autonomous vehicle based on an identity of an occupant in the cabin.
- In accordance with another aspect, this disclosure relates to an autonomous vehicle comprising a cabin, a control system configured to operate the autonomous vehicle, and an occupant-act facilitator configured to facilitate an act of an occupant in the cabin unrelated to and normally not done while driving.
- In accordance with another aspect, this disclosure relates to an autonomous vehicle comprising a sleeping facilitator configured to facilitate sleeping of an occupant in a cabin of the autonomous vehicle.
- In accordance with another aspect, this disclosure relates to an autonomous vehicle comprising a working facilitator configured to facilitate work of an occupant in a cabin of the autonomous vehicle by providing a workspace for the occupant.
- In accordance with another aspect, this disclosure relates to an autonomous vehicle comprising an exercising facilitator configured to facilitate exercising of an occupant in a cabin of the autonomous vehicle by providing an exerciser for the occupant.
- In accordance with another aspect, this disclosure relates to an autonomous vehicle comprising an eating facilitator configured to facilitate eating by an occupant in a cabin of the autonomous vehicle by providing an eating area for the occupant.
- In accordance with another aspect, this disclosure relates to an autonomous vehicle comprising a cooking facilitator configured to facilitate cooking by an occupant in a cabin of the autonomous vehicle by providing a cooking area for the occupant.
- In accordance with another aspect, this disclosure relates to a system for an autonomous vehicle, in which the system is configured to cause the autonomous vehicle to autonomously perform an action based on an event within the autonomous vehicle.
- In accordance with another aspect, this disclosure relates to a system for an autonomous vehicle, in which the system is configured to cause the autonomous vehicle to autonomously perform an action based on an event in a cabin of the autonomous vehicle.
- In accordance with another aspect, this disclosure relates to a system for an autonomous vehicle, in which the system is configured to cause the autonomous vehicle to autonomously perform an action based on a state of an occupant in a cabin of the autonomous vehicle.
- In accordance with another aspect, this disclosure relates to a system for an autonomous vehicle, in which the system is configured to cause the autonomous vehicle to autonomously perform an action based on an interaction with a human external to the autonomous vehicle.
- In accordance with another aspect, this disclosure relates to a system for an autonomous vehicle, in which the system is configured to cause the autonomous vehicle to autonomously perform an action based on a human protocol gesture made by a human external to the autonomous vehicle.
- In accordance with another aspect, this disclosure relates to a system for an autonomous vehicle, in which the system is configured to personalize the autonomous vehicle based on an identity of an occupant in a cabin of the autonomous vehicle.
- In accordance with another aspect, this disclosure relates to a sleeping facilitator for an autonomous vehicle, in which the sleeping facilitator is configured to facilitate sleeping of an occupant in a cabin of the autonomous vehicle.
- In accordance with another aspect, this disclosure relates to a working facilitator for an autonomous vehicle, in which the working facilitator is configured to facilitate work of an occupant in a cabin of the autonomous vehicle by providing a workspace for the occupant.
- In accordance with another aspect, this disclosure relates to an exercising facilitator for an autonomous vehicle, in which the exercising facilitator is configured to facilitate exercising of an occupant in a cabin of the autonomous vehicle by providing an exerciser for the occupant.
- In accordance with another aspect, this disclosure relates to an eating facilitator for an autonomous vehicle, in which the eating facilitator is configured to facilitate eating by an occupant in a cabin of the autonomous vehicle by providing an eating area for the occupant.
- In accordance with another aspect, this disclosure relates to a cooking facilitator for an autonomous vehicle, in which the cooking facilitator is configured to facilitate cooking by an occupant in a cabin of the autonomous vehicle by providing a cooking area for the occupant.
- In accordance with another aspect, this disclosure relates to an indicator configured to be placed at a particular location external to an autonomous vehicle and recognized by a control system of the autonomous vehicle such that the control system autonomously operates the autonomous vehicle at the particular location based on recognition of the indicator.
- These and other aspects of this disclosure will now become apparent upon review of a description of embodiments that follows in conjunction with accompanying drawings.
- A detailed description of embodiments is provided below, by way of example only, with reference to accompanying drawings, in which:
- FIGS. 1 to 4 show an embodiment of an autonomous vehicle;
- FIGS. 5 and 6 show an embodiment of a control system of the autonomous vehicle;
- FIG. 7 shows an embodiment of monitoring the autonomous vehicle and causing the autonomous vehicle to perform one or more actions in response to detecting an actionable event within the autonomous vehicle;
- FIG. 8 shows an embodiment of monitoring a cabin of the autonomous vehicle and causing the autonomous vehicle to perform one or more actions in response to detecting an actionable event in the cabin;
- FIGS. 9 and 10 show an embodiment of monitoring an occupant in the cabin of the autonomous vehicle and causing the autonomous vehicle to perform one or more actions in response to detecting an actionable event based on a state of the occupant;
- FIG. 11 shows an embodiment of rerouting of the autonomous vehicle;
- FIG. 12 shows an embodiment of issuing a notification to a communication device external to the autonomous vehicle;
- FIGS. 13 to 16 show an embodiment of altering the cabin of the autonomous vehicle;
- FIG. 17 shows an embodiment of altering a self-driving mode of the autonomous vehicle;
- FIG. 18 shows an embodiment of monitoring a device of the autonomous vehicle elsewhere than at the cabin of the autonomous vehicle and causing the autonomous vehicle to perform one or more actions in response to detecting an actionable event related to the device;
- FIG. 19 shows another embodiment of rerouting of the autonomous vehicle;
- FIGS. 20 and 21 show an embodiment of the autonomous vehicle acting based on interactions with (e.g., gestures of) humans external to the autonomous vehicle;
- FIG. 22 shows an embodiment of a human external to the autonomous vehicle having equipment detectable by the autonomous vehicle to facilitate recognition of gestures;
- FIG. 23 shows a variant to what is shown in FIGS. 20 and 21;
- FIGS. 24 to 31 show embodiments of occupant-act facilitators of the autonomous vehicle to facilitate acts of occupants in the cabin of the autonomous vehicle, such as sleeping, exercising, working, eating, cooking, and/or any other suitable act;
- FIG. 32 shows an embodiment of an aftermarket apparatus installable in the cabin of the autonomous vehicle;
- FIG. 33 shows an embodiment of a controller for the aftermarket apparatus;
- FIGS. 34 to 39 show embodiments of indicators placed at particular locations and recognizable by the control system of the autonomous vehicle such that the control system autonomously operates the autonomous vehicle at these particular locations based on recognition of the indicators; and
- FIGS. 40 to 42 show an embodiment in which the autonomous vehicle is personalized for an occupant based on an identity of the occupant.
- It is to be expressly understood that the description and drawings are only for purposes of illustrating certain embodiments and are an aid for understanding. They are not intended to be limitative.
FIGS. 1 to 5 show an example of an embodiment of an autonomous vehicle 10 on a road 11. The autonomous vehicle 10 is designed to legally carry people or cargo on the road 11, which is part of a public road infrastructure (e.g., public streets, highways, etc.). In this embodiment, the autonomous vehicle 10 is an automobile (e.g., a passenger car). In other embodiments, the autonomous vehicle 10 may be an autonomous truck, an autonomous bus, or any other autonomous road vehicle. The autonomous vehicle 10 (sometimes referred to as a “self-driving” or “driverless” vehicle) is operable without human control, including by steering, accelerating, and decelerating (e.g., braking) itself autonomously without human control, to travel to a destination. Although it can drive itself, in some embodiments, the autonomous vehicle 10 may be controlled by a human driver in some situations. - In this embodiment, the
autonomous vehicle 10 comprises a frame 12, a powertrain 14, a steering system 16, a suspension 18, wheels 20 1-20 4, a cabin 22, and a control system 15 that is configured to operate the vehicle 10 autonomously (i.e., without human control). The autonomous vehicle 10 has a longitudinal direction, a widthwise direction, and a heightwise direction. - As further discussed later, in various embodiments, the
autonomous vehicle 10 may be configured to facilitate its use and/or enhance what it and/or occupants can do with it, such as, for example, by: -
- autonomously acting based on events within the
vehicle 10, including by autonomously rerouting itself, altering the cabin 22, notifying a third party external to the vehicle 10, stopping (e.g., parking) the vehicle 10, altering how the vehicle 10 drives itself, and/or performing other actions based on what and/or how an occupant is doing, an emergency, or another event occurring in the cabin 22; - autonomously acting based on interactions with (e.g., gestures of) humans (e.g., police officers, school-crossing guards, traffic guards at roadwork or other temporary traffic control sites, drivers of other vehicles, etc.) external to the
vehicle 10; - autonomously acting based on indicators placed at particular locations (e.g., drive-through establishments, potholes, parking spots, etc.);
- facilitating acts of occupants in the
cabin 22, such as acts unrelated to and normally not done while driving, including, for example, sleeping, exercising, working, eating, cooking, and/or any other suitable act; and/or - automatically personalizing the
vehicle 10 for an occupant (e.g., a configuration of the cabin 22, a self-driving mode, a destination and/or a route, etc.).
- The
powertrain 14 is configured to generate power for the autonomous vehicle 10, including motive power for the wheels 20 1-20 4 to propel the vehicle 10 on the road 11. To that end, the powertrain 14 comprises a power source 13 (e.g., a prime mover) that includes one or more motors. For example, in some embodiments, the power source 13 may comprise an electric motor (e.g., powered by a battery), an internal combustion engine, or a combination of different types of motor (e.g., an electric motor and an internal combustion engine). The powertrain 14 can transmit power from the power source 13 to one or more of the wheels 20 1-20 4 in any suitable way (e.g., via a transmission, a differential, a shaft engaging (i.e., directly connecting) a motor and a given one of the wheels 20 1-20 4, etc.). - The
steering system 16 is configured to steer the autonomous vehicle 10 on the road 11. In this embodiment, the steering system 16 is configured to turn front ones of the wheels 20 1-20 4 to change their orientation relative to the frame 12 of the vehicle 10 in order to cause the vehicle 10 to move in a desired direction. - The
suspension 18 is connected between the frame 12 and the wheels 20 1-20 4 to allow relative motion between the frame 12 and the wheels 20 1-20 4 as the autonomous vehicle 10 travels on the road 11. For example, the suspension 18 may enhance handling of the vehicle 10 on the road 11 by absorbing shocks and helping to maintain traction between the wheels 20 1-20 4 and the road 11. The suspension 18 may comprise an arrangement of springs and dampers. A spring may be a coil spring, a leaf spring, a gas spring (e.g., an air spring), or any other elastic object used to store mechanical energy. A damper (also sometimes referred to as a “shock absorber”) may be a fluidic damper (e.g., a pneumatic damper, a hydraulic damper, etc.), a magnetic damper, or any other object which absorbs or dissipates kinetic energy to decrease oscillations. In some cases, a single device may itself constitute both a spring and a damper (e.g., a hydropneumatic device). - The
cabin 22 is configured to be occupied by one or more occupants of the autonomous vehicle 10. In this embodiment, the cabin 22 comprises windows 21 1-21 W, seats 20 1-20 S, and a user interface 70 that is configured to interact with one or more occupants of the vehicle 10. The user interface 70 comprises an input portion 71 including one or more input devices (e.g., a set of buttons, levers, dials, etc., a touchscreen, a microphone, etc.) allowing an occupant of the vehicle 10 to input commands and/or other information into the vehicle 10 and an output portion 73 including one or more output devices (e.g., a display, a speaker, etc.) to provide information to an occupant of the vehicle 10. The output portion 73 of the user interface 70 may comprise an instrument panel (e.g., a dashboard) which provides indicators (e.g., a speedometer indicator, a tachometer indicator, etc.) related to operation of the vehicle 10. - The
control system 15 is configured to operate the autonomous vehicle 10, including to steer, accelerate, and decelerate (e.g., brake) the autonomous vehicle 10, autonomously (i.e., without human control) as the autonomous vehicle 10 progresses to a destination along a route on the road 11. To that end, the control system 15 comprises a controller 80 and a sensing apparatus 82 to perform actions controlling the vehicle 10 (e.g., actions to steer, accelerate, decelerate, etc.) to move it towards its destination on the road 11 based on a computerized perception of an environment of the vehicle 10. - While its
control system 15 enables it to drive itself, the autonomous vehicle 10 may be controlled by a human driver, such as an occupant in the cabin 22, in some situations. For example, in some embodiments, the control system 15 may allow the autonomous vehicle 10 to be selectively operable either autonomously (i.e., without human control) or under human control (i.e., by a human driver) in various situations (e.g., the autonomous vehicle 10 may be operable in either of an autonomous operational mode and a human-controlled operational mode). For instance, in this embodiment, the user interface 70 of the cabin 22 may comprise an accelerator 31 (e.g., an acceleration pedal), a braking device 33 (e.g., a brake pedal), and a steering device 35 (e.g., a steering wheel) that can be operated by a human driver in the cabin 22 to control the vehicle 10 on the road 11. - The
controller 80 is a processing apparatus configured to process information received from the sensing apparatus 82 and possibly other sources in order to perform actions controlling the autonomous vehicle 10, including to steer, accelerate, and decelerate the vehicle 10, towards its destination on the road 11. With additional reference to FIG. 6 , in this embodiment, the controller 80 comprises an interface 166, a processing portion 168, and a memory portion 170, which are implemented by suitable hardware and software. - The
interface 166 comprises one or more inputs and outputs allowing the controller 80 to receive input signals from and send output signals to other components to which the controller 80 is connected (i.e., directly or indirectly connected), including the sensing apparatus 82, the powertrain 14, and the steering system 16, and possibly other components such as the user interface 70, a communication interface 68 configured to communicate over a communication network (e.g., a cellular, WiFi, or other wireless network, for internet and/or other communications), over one or more local communication links (e.g., Bluetooth, USB, etc.), and/or with one or more other vehicles that are near the autonomous vehicle 10 (i.e., for inter-vehicle communications), etc. - The
processing portion 168 comprises one or more processors for performing processing operations that implement functionality of the controller 80. A processor of the processing portion 168 may be a general-purpose processor executing program code stored in the memory portion 170. Alternatively, a processor of the processing portion 168 may be a specific-purpose processor comprising one or more preprogrammed hardware or firmware elements (e.g., application-specific integrated circuits (ASICs), electrically erasable programmable read-only memories (EEPROMs), etc.) or other related elements. - The
memory portion 170 comprises one or more memories for storing program code executed by the processing portion 168 and/or data (e.g., maps, vehicle parameters, etc.) used during operation of the processing portion 168. A memory of the memory portion 170 may be a semiconductor medium (including, e.g., a solid-state memory), a magnetic storage medium, an optical storage medium, and/or any other suitable type of memory. A memory of the memory portion 170 may be read-only memory (ROM) and/or random-access memory (RAM), for example. - In some embodiments, the
controller 80 may comprise and/or interact with one or more other control units of the autonomous vehicle 10. For example, in some embodiments, the controller 80 may comprise and/or interact with a powertrain control unit of the powertrain 14, such as an engine control unit (ECU), a transmission control unit (TCU), etc. - The
sensing apparatus 82 comprises a set of sensors 90 1-90 S to sense aspects of the environment of the vehicle 10 and generate sensor information indicative of these aspects of the environment of the vehicle 10 that is provided to the controller 80 in order to control the vehicle 10 towards its destination on the road 11. The sensor information can be used by the controller 80 to determine actions which are to be performed by the autonomous vehicle 10 in order for the vehicle 10 to continue to its destination. The sensors 90 1-90 S can provide situational information proximate to the vehicle 10, including any potential hazards proximate to the vehicle 10. - The sensors 90 1-90 S may include any suitable sensing device. For instance, in some embodiments, the sensors 90 1-90 S may comprise a camera (e.g., video, stereoscopic, etc.) and/or other imaging device, a Light Detection and Ranging (LIDAR) device, a radar device, a wheel speed sensor, a GPS and/or other location sensor, and/or any other suitable sensing device.
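- Purely as an illustration (not part of this specification), the way such sensor information might be reduced to a hazard assessment and a coarse driving action can be sketched as follows; the reading names, thresholds, and action labels are all assumptions:

```python
# Illustrative sketch of a controller consuming sensor readings of the kind
# produced by sensors 90_1-90_S. Names and thresholds are assumptions.

def assess_hazards(readings):
    """Return a list of hazards detected in the sensor readings."""
    hazards = []
    # A nearby LIDAR return suggests an obstacle in the vehicle's path.
    if readings.get("lidar_min_range_m", float("inf")) < 5.0:
        hazards.append("obstacle_ahead")
    # Excessive wheel slip suggests a low-traction road surface.
    if readings.get("wheel_slip_ratio", 0.0) > 0.2:
        hazards.append("low_traction")
    return hazards

def choose_action(readings):
    """Map detected hazards to a coarse driving action."""
    hazards = assess_hazards(readings)
    if "obstacle_ahead" in hazards:
        return "brake"
    if "low_traction" in hazards:
        return "slow_down"
    return "continue"
```

In a real control system, of course, such decisions would be made by far richer perception and planning layers; the sketch only shows the sense-decide-act shape of the data flow.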
- The
autonomous vehicle 10 may be implemented in any suitable way. For example, in some embodiments, the autonomous vehicle 10, including its control system 15, may be implemented as a Waymo™ vehicle such as that described at https://waymo.com/tech/ and https://waymo.com/safetyreport/, an Uber™ vehicle such as that described at https://www.uber.com/cities/pittsburgh/self-driving-ubers/, or a vehicle such as that described in U.S. Pat. No. 8,818,608 or U.S. Patent Application Publication 2014/0303827, all of which are incorporated by reference herein. - In some cases, the
autonomous vehicle 10 may be for personal or private use by a user (e.g., where the vehicle 10 is owned or leased by the user or another individual personally known to the user, such as a family member, a friend, etc.). In other cases, the autonomous vehicle 10 may be for public use by various users, such as where the vehicle 10 is used as part of a taxi, ride-hailing or vehicle-sharing service. - The
autonomous vehicle 10 may be configured to facilitate its use and/or enhance what it and/or its occupants can do with it. Examples of embodiments of this are described below. - 1. Autonomously Acting Based on Event within Vehicle (e.g., Inside Cabin)
- In some embodiments, as shown in
FIG. 7, the control system 15 of the autonomous vehicle 10 may be configured to cause the vehicle 10 to autonomously perform one or more actions based on one or more events within the vehicle 10. - For example, in this embodiment, the
control system 15 may be configured to cause the vehicle 10 to autonomously reroute itself (i.e., change its destination and/or its current route), alter the cabin 22, notify a third party external to the vehicle 10, stop (e.g., park) the vehicle 10, alter how the vehicle 10 drives itself, and/or perform one or more other actions based on what and/or how an occupant is doing, an emergency, or another event occurring in the cabin 22 of the vehicle 10. - More particularly, in this embodiment, as shown in
FIG. 8, the control system 15 is configured to monitor an interior of the cabin 22 and, in response to detecting an actionable event in the cabin 22, cause the vehicle 10 to autonomously reroute itself and/or perform one or more other actions to address that event. - An actionable event in the
cabin 22 in response to which the control system 15 causes the autonomous vehicle 10 to reroute itself and/or perform one or more other actions may involve one or more conditions being met (e.g., one or more circumstances having arisen) in the cabin 22. Any or all of these one or more conditions may be predefined or otherwise specified such that, when the one or more conditions are met, the actionable event is deemed to have occurred in the cabin 22. - Detection that one or more conditions are met in the
cabin 22, and therefore detection of an actionable event in the cabin 22, may be carried out by the controller 80. This may be achieved based on processing of input information that may be received by the controller 80. Examples of such input information may include information received via the user interface 70, the communication interface 68, and/or possibly from other sources (e.g., one or more sensors in the cabin 22). - When an actionable event in the
cabin 22 is detected, the controller 80 responds by effecting one or more actions to address that event. For example, the controller 80 may issue one or more control signals to the powertrain 14, the steering system 16, the user interface 70, the communication interface 68, and/or possibly other devices. - For example, in some embodiments, as shown in
FIGS. 9 and 10, the control system 15 may be configured to monitor an occupant in the cabin 22 and cause the vehicle 10 to autonomously perform one or more actions based on a state of the occupant (i.e., what and/or how the occupant is doing). - The
controller 80 may process information received or otherwise derived from one or more sensors 75 1-75 P of an occupant-monitoring system 83 that is monitoring the occupant. For instance, the sensors 75 1-75 P may be configured to monitor what the occupant is doing, i.e., an activity or a lack of activity of the occupant (e.g., sleeping, eating, working, exercising, watching media, etc.), and/or how the occupant is doing, i.e., a health of the occupant (e.g., whether the occupant appears to be in good condition or suffering a loss of consciousness, a stroke, a heart attack, or other physical impairment). - For example, in some embodiments, the sensors 75 1-75 P may include a camera to view the occupant (e.g., the occupant's eyes, face, and/or other parts or all of the occupant's body), a motion sensor to sense motion of the occupant, a pressure sensor (e.g., on a given one of the seats 20 1-20 S that is occupied by the occupant, such as in a headrest or a seat cushion), a vital sign sensor to sense one or more vital signs (e.g., a pulse rate, a respiratory rate, a body temperature, and/or a blood pressure) of the occupant (e.g., a heart rate monitor, a temperature sensor, a blood pressure monitor, etc.), and/or any other sensor. Processing of information received or otherwise derived from the sensors 75 1-75 P by the
controller 80 may comprise image processing (e.g., of images captured by a camera), comparison of parametric values (e.g., to reference or threshold values), and/or any other processing operations. - In some cases, a
sensor 75 i may be built into the cabin 22 during original manufacturing of the autonomous vehicle 10 (e.g., on a console of the user interface 70, on a given one of the seats 20 1-20 S, etc.). In other cases, a sensor 75 i may be installed in the cabin 22 after original manufacturing of the vehicle 10 (e.g., as part of an aftermarket device installed in the cabin 22 by an owner or lessee of the vehicle 10). In yet other cases, a sensor 75 i may be carried into the cabin 22 by the occupant as part of a portable device carried by the occupant, such as: a smartphone or other wireless phone; a tablet computer; a head-mounted display, smartwatch or other wearable device; a medical device; etc. In such cases, the portable device comprising the sensor 75 i may communicate with the controller 80 via a communication link, which may be wireless, wired, or partly wireless and partly wired (e.g., Bluetooth, WiFi or other wireless LAN, USB, etc.). - a) Rerouting
- For example, in some embodiments, as shown in
FIG. 11, the controller 80 may monitor an occupant of the vehicle 10 and, in response to detecting an event involving the occupant, reroute the vehicle 10 to a new destination 74 different from an original destination 40 of the vehicle 10. - In some embodiments, the
controller 80 may monitor the occupant of the vehicle 10 and, in response to detecting an emergency or other medical event involving the occupant, autonomously reroute the vehicle 10 to the new destination 74 where medical assistance is providable to the occupant (e.g., a hospital, clinic, or other medical establishment; a police station; a fire station; etc.). - More particularly, in some embodiments, based on information regarding vital signs (e.g., heart rate), a position or movement (e.g., a spasm, eyes rolling, a head or upper body tilt or collapse, stillness, a heavy sit, etc.), and/or physical traits (e.g., paleness, bleeding, etc.) of the occupant derived from the sensors 75 1-75 P, the
controller 80 may detect one or more conditions indicative of the emergency or other medical event involving the occupant, and may proceed to cause the vehicle 10 to be autonomously rerouted to the new destination 74 where medical assistance can be provided, the new destination 74 being determined by consulting a database of medical establishments and mapping information using a current location of the vehicle 10. - b) External Notifying
- As another example, in some embodiments, as shown in
FIG. 12, the controller 80 may monitor an occupant of the vehicle 10 and, in response to detecting an event involving the occupant, cause issuance of a notification 85 to a communication device 87 external to the vehicle 10. The communication device 87 may be a smartphone, a tablet, a head-mounted display, a smartwatch, or other device carried or worn by an individual; a server or other computer; or any other device designed for communication. - For instance, in some embodiments, referring to an example discussed above, the
controller 80 may monitor the occupant of the vehicle 10 and, in response to detecting an emergency or other medical event involving the occupant, cause the notification 85 to be transmitted to the communication device 87 which is associated with a medical assistance provider (e.g., at a hospital, clinic, or other medical establishment; a police station; a fire station; etc.) to notify the medical assistance provider of what is happening with the occupant, which may help the medical assistance provider to prepare for treating the occupant. The notification 85 transmitted to the communication device 87 associated with the medical assistance provider may be conveyed as a text message (e.g., SMS message), an email message, a voice message, or any other suitable communication. - The
controller 80 may cause the communication interface 68 to transmit the notification 85 to the communication device 87 via a communication link 49 which may be established over a cellular network, a WiFi network, a satellite connection, and/or any other suitable connection. - In some cases, issuance of the
notification 85 to the communication device 87 associated with the medical assistance provider may be done in conjunction with autonomous rerouting of the vehicle 10 to a destination where medical assistance is providable to the occupant, as discussed above. In other cases, issuance of the notification 85 to the communication device 87 associated with the medical assistance provider may be done without autonomously rerouting the vehicle 10 to another destination (e.g., the vehicle 10 may be parked, its location may be conveyed to the medical assistance provider, and an ambulance may be dispatched to that location). - c) Responding to Prohibited Behavior
- As another example, in some embodiments, the
controller 80 may monitor an occupant of the vehicle 10 and, in response to detecting a prohibited behavior exhibited by the occupant, perform one or more actions such as causing the vehicle 10 to autonomously stop (e.g., park) and/or reroute itself and/or causing issuance of a notification 85 to a communication device 87 external to the vehicle 10. - A prohibited behavior is “prohibited” in that it is not allowed to be exhibited in the
vehicle 10. This may be specified by a provider of the vehicle 10 (e.g., a manufacturer of the vehicle 10; a taxi, ride-hailing or vehicle-sharing service provider; etc.); a public authority (e.g., a police force, a government, etc.); etc. For example, this may include a behavior that is dangerous, hazardous or otherwise risky (e.g., to the occupant, any other occupant of the vehicle 10, or other vehicles on the road 11), that may vandalize or otherwise damage the vehicle 10, and/or that is otherwise undesirable. - For instance, in some embodiments, the
controller 80 may monitor the occupant of the vehicle 10 and, in response to detecting a prohibited behavior exhibited by the occupant, autonomously stop (e.g., park) the vehicle 10. The controller 80 may also cause the user interface 70 of the cabin 22 to advise the occupant of the prohibited behavior (e.g., by displaying or otherwise issuing a warning or other notification, which may request the occupant to get out of the vehicle 10, etc.). Alternatively or additionally, in some embodiments, the controller 80 may cause a notification 85 to be transmitted to a communication device 87 external to the vehicle 10. For instance, the communication device 87 may be associated with a provider of the vehicle 10 or a public authority, and the notification 85 may report the prohibited behavior exhibited by the occupant. - d) Cabin Altering
- In some embodiments, as shown in
FIG. 13, the controller 80 may monitor an occupant of the vehicle 10 and, in response to detecting an event involving the occupant, cause altering of the cabin 22. That is, the controller 80 may cause one or more objects 96 1-96 O of the cabin 22 to be altered by changing from one state to a different state. - For example, in some embodiments, the
controller 80 may monitor the occupant of the vehicle 10 and, in response to detecting that the occupant is sleeping, alter the cabin 22 to facilitate the occupant's sleep. - In some embodiments, the
controller 80 may cause the cabin 22 to be altered to reduce stimuli (e.g., light, noise, vibrations, etc.) from the vehicle 10 and/or its environment affecting the occupant who is sleeping. - More particularly, in some embodiments, based on information regarding vital signs (e.g., heart rate), a position or movement (e.g., a head tilt, stillness, etc.), and/or physical traits (e.g., eyes closed, an open mouth, etc.) of the occupant derived from the sensors 75 1-75 P, the
controller 80 may detect one or more conditions indicative that the occupant is sleeping, and may proceed to cause the cabin 22 to be altered to reduce stimuli from the vehicle 10 and/or its environment affecting the occupant who is sleeping. - For instance, in some embodiments, as shown in
FIG. 14, the cabin 22 may comprise a light-control system 55 to control (e.g., reduce) light entering into the cabin 22 via the windows 21 1-21 W, and the controller 80 may cause the light-control system 55 to reduce the light entering into the cabin 22 upon detecting that the occupant is sleeping. The light-control system 55 may comprise a light blocker 27 activatable by the controller 80 to block light from reaching the interior of the cabin 22 through at least part of the windows 21 1-21 W. As an example, in some embodiments, as shown in FIG. 15, the light-control system 55 may comprise a window covering 23 (e.g., comprising one or more blinds, shades, shutters, and/or curtains) deployable (e.g., extendible) to cover at least part of the windows 21 1-21 W, such that the controller 80 may cause deployment of the window covering 23 to reduce the light entering into the cabin 22. As another example, in some embodiments, as shown in FIG. 16, the light-control system 55 may comprise a window transmissivity changer 25 configured to change a tint or other aspect affecting transmissivity of one or more of the windows 21 1-21 W, and the controller 80 may cause the window transmissivity changer 25 to change the tint or other aspect affecting transmissivity (e.g., darken, increase opacity, etc.) of one or more of the windows 21 1-21 W to reduce light entering into the cabin 22. For instance, in some embodiments, the window transmissivity changer 25 may comprise a film disposed on one or more of the windows 21 1-21 W and electrically controllable to alter the tint or other aspect affecting transmissivity (e.g., such as that commercially-available from Smart Tint at www.smartint.com/). - In some embodiments, the
cabin 22 may comprise a noise-control system 57 configured to control (e.g., reduce) noise in the cabin 22, and the controller 80 may cause the noise-control system 57 to reduce noise in the cabin 22 upon detecting that the occupant is sleeping. As an example, in some embodiments, the noise-control system 57 may comprise a noise canceller 59 to at least partly cancel the noise entering the cabin 22, such as by generating sound that at least partly cancels the noise entering the cabin. For instance, the noise canceller 59 may comprise one or more microphones and one or more speakers in the cabin 22, possibly one or more amplifiers or other sound-generating components, and a controller configured to generate an audio signal that is reversed in phase to an audio signal picked up by the one or more microphones and that is applied to the one or more speakers to generate the sound at least partly cancelling the noise entering the cabin 22 (e.g., using active noise control technology for noise cancellation such as that commercially-available from Ford, Toyota, Honda, and other car manufacturers). - Additionally or alternatively, in some embodiments, the
controller 80 may cause an audio system 60 of the user interface 70 of the cabin 22 to emit relaxing sound (e.g., ocean waves, rain, forest sounds, soothing music, etc.). - In some embodiments, the
controller 80 may cause a seat 20 i occupied by the occupant to be altered to facilitate the occupant's sleep. In some cases, the seat 20 i may be a “driver's seat” at the front of the vehicle 10, in embodiments in which the vehicle 10 is selectively operable either autonomously (i.e., without human control) or under human control (i.e., by a human driver), or where such a driver's seat would conventionally be found in a human-driven vehicle. - For example, in some embodiments, the
seat 20 i occupied by the occupant may include a seat-altering system 52 configured to alter the seat 20 i, and the controller 80 may cause the seat-altering system 52 to alter the seat 20 i upon detecting that the occupant is sleeping. For instance, in some embodiments, the seat-altering system 52 may comprise one or more actuators (e.g., electromechanical actuators, fluidic (e.g., hydraulic or pneumatic) actuators, etc.) connected to one or more movable portions 50 1-50 M, such as a seating portion, a backrest portion, and a headrest portion, of the seat 20 i to change relative positioning of the one or more movable portions 50 1-50 M of the seat 20 i. In some cases, the controller 80 may cause the seat-altering system 52 to alter the seat 20 i such that the seat 20 i is converted into a reclined (e.g., bedlike) configuration in which the occupant is reclined on the seat 20 i by repositioning the one or more movable portions 50 1-50 M of the seat 20 i. - In some cases, the seat-altering
system 52 may comprise a pillow 63 for the seat 20 i, and the controller 80 may cause the pillow 63 to be deployed upon detecting that the occupant is sleeping. For instance, in some embodiments, the pillow 63 may be integrated into or otherwise associated with the headrest portion of the seat 20 i and the controller 80 may cause the pillow 63 to be deployed by moving the pillow 63 to engage the occupant's head (e.g., by activating an actuator to move the pillow 63 into position) and/or by inflating the pillow 63 (e.g., by activating a pump to inflate the pillow 63). - The
controller 80 may cause the cabin 22 to be altered in various other ways in other embodiments upon detecting that the occupant is sleeping (e.g., cause a temperature-control system to adjust a temperature in the cabin 22, activate a vibration system of the seat 20 i, etc.). - e) Self-Driving Mode Altering
- In some embodiments, as shown in
FIG. 17, the controller 80 may monitor an occupant of the vehicle 10 and, in response to detecting an event involving the occupant, alter a self-driving mode of the vehicle 10, i.e., alter how the vehicle 10 autonomously drives itself. - For example, in some embodiments, referring to an example discussed above, the
controller 80 may monitor the occupant of the vehicle 10 and, in response to detecting that the occupant is sleeping, alter the self-driving mode of the vehicle 10 to facilitate the occupant's sleep, such as by reducing potential for sudden or abrupt movements (e.g., acceleration, braking, turning, shaking, etc.) of the vehicle 10 on the road 11. - In some embodiments, the
controller 80 may control the powertrain 14, the steering system 16, the suspension 18, and/or possibly other devices of the vehicle 10 so that the self-driving mode of the vehicle 10 is smoother than when the occupant is deemed to be awake. For instance, the controller 80 may control the powertrain 14 (e.g., by controlling an ECU, TCU or other powertrain control unit) so that the vehicle 10 accelerates and/or decelerates (e.g., brakes) less intensely, control the steering system 16 so that the vehicle 10 turns less sharply, and/or control the suspension 18 (e.g., by controlling an active suspension system of the suspension 18) so that it is less stiff than when the occupant is deemed to be awake. - Additionally or alternatively, in some embodiments, the
controller 80 may reroute the vehicle 10 to its destination along a new route that is different from, and more suited to sleep of the occupant than, its original route. For instance, the new route may include fewer stops (e.g., stop signs, traffic lights, etc.), fewer and/or less sharp turns, and/or a smoother roadway (e.g., a less damaged or flatter roadway) than the original route. The controller 80 may consult mapping information to determine the new route based on a current location of the vehicle 10. - In some embodiments, when the occupant is sleeping, the
controller 80 may cause one or more actions to be performed in the cabin 22 to awaken the occupant. For example, in some embodiments, the controller 80 may cause the user interface 70 to issue an awakening notification, such as by causing the user interface 70 to emit sound (e.g., an alarm, music, etc.), vibrate the seat 20 i of the occupant, and/or otherwise stimulate the occupant to awaken him/her. Also, in some embodiments, the controller 80 may cause the light-control system 55 to let more light into the cabin 22 via the windows 21 1-21 W (e.g., by retracting the window covering 23, or by causing the window transmissivity changer 25 to lighten the tint or otherwise increase transmissivity of light through one or more of the windows 21 1-21 W), the noise-control system 57 to let more noise into the cabin 22 (e.g., by stopping noise cancellation), the seat-altering system 52 to move the seat 20 i back into a seated configuration and/or retract the pillow 63, etc. - In some cases, the
controller 80 may cause one or more actions to be performed in the cabin 22 to awaken the occupant based on a current location of the vehicle 10 and/or a current time. For instance, when the controller 80 determines, based on the current location of the vehicle 10 and/or the current time relative to the destination of the vehicle 10, that the vehicle 10 is sufficiently close to its destination and/or will arrive at its destination sufficiently soon, the controller 80 proceeds to cause these one or more actions to be performed to awaken the occupant. - In other embodiments, as shown in
FIG. 18, the control system 15 of the autonomous vehicle 10 may be configured to cause the vehicle 10 to autonomously reroute itself and/or perform one or more other actions based on a state of a device (e.g., of the powertrain 14, the steering system 16, or the suspension 18) of the vehicle 10 elsewhere than in the cabin 22, etc. - For example, in some embodiments, the
control system 15 may be configured to monitor an energy level (e.g., a battery level or a fuel level) of the powertrain 14 (e.g., a battery and/or a fuel tank of the powertrain 14) and, in response to detecting that the energy level reaches a threshold, cause the vehicle 10 to autonomously reroute itself to an energy-replenishing station (e.g., a charging station for a battery for an electric motor and/or a fueling station for a fuel tank for an internal combustion engine). - As another example, in some embodiments, the
control system 15 may be configured to monitor an operability of a device (e.g., of the powertrain 14, the steering system 16, or the suspension 18) and, in response to detecting that the operability of the device is unsuitable for the vehicle 10 (e.g., the device is defective or worn to a threshold), cause the vehicle 10 to autonomously reroute itself to a repair station. - In some embodiments, as shown in
FIG. 19, the controller 80 may monitor the vehicle and, in response to detecting an accident (e.g., a crash), reroute the vehicle 10 to a new destination 74 different from an original destination 40 of the vehicle 10 if the vehicle 10 remains operable. For example, if one or more occupants are in the vehicle when the accident occurs, the controller 80 may autonomously reroute the vehicle 10 to the new destination 74 where medical assistance is providable to the one or more occupants (e.g., a hospital, clinic, or other medical establishment; a police station; a fire station; etc.). This may be done by consulting a database of medical establishments and mapping information using a current location of the vehicle 10. The controller 80 may detect the accident based on a sensor (e.g., a crash sensor) or deployment of an airbag in the cabin 22. - 2. Autonomously Acting Based on Interactions with (e.g., Gestures of) Humans External to Vehicle
- In some embodiments, as shown in
FIG. 20, the control system 15 of the autonomous vehicle 10 may be configured to cause the vehicle 10 to autonomously perform one or more actions based on one or more interactions with humans 44 1-44 H (e.g., police officers, school-crossing guards, traffic guards at roadwork or other temporary traffic control sites, drivers of other vehicles, etc.) external to the vehicle 10. - For example, in this embodiment, the
control system 15 may be configured to detect a “human protocol gesture” being made by a human 44 i outside thevehicle 10 and to alter a manner in which thevehicle 10 drives itself based on that detected gesture. A human protocol gesture refers to gestures, made by humans in positions of traffic controlling authority, that embed one or more commands for overriding or contradicting conventional algorithmic driving rules. For instance, a human protocol gesture may be made by hands of a police officer to wave traffic into a lane of opposing traffic, or by a hand and stop sign of a school-crossing guard to stop traffic when there is no actual stop sign, or by a driver of an oncoming vehicle flashing his or her vehicle's lights in the spirit of agreeing on who will have priority when crossing a narrow stretch of road such as a one-lane bridge. Commands embedded in a human protocol gesture could include one command or a series of commands. An example of a command could be “stop”. An example of a series of commands could be “change into oncoming traffic lane, advance, and return to original lane after a safe distance”. - Other examples of human protocol gestures include motorcycle hand signals as described in: http://www.motorcyclelegalfoundation.com/motorcycle-hand-signals-chart/. However, it should be appreciated that a human protocol gesture is not limited to being derived from hand movements.
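- The notion of one or more commands embedded in a human protocol gesture can be illustrated with a simple lookup; the gesture labels below are hypothetical stand-ins and are not drawn from the specification:

```python
# Hypothetical encoding of the command sequences embedded in human protocol
# gestures; gesture labels are illustrative stand-ins.

GESTURE_COMMANDS = {
    "palm_out_stop": ["stop"],
    "wave_into_opposing_lane": [
        "change into oncoming traffic lane",
        "advance",
        "return to original lane after a safe distance",
    ],
    "headlight_flash": ["go ahead"],
}

def decode(gesture_label):
    """Return the series of commands embedded in a recognized gesture."""
    return GESTURE_COMMANDS.get(gesture_label, [])
```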
- If the
vehicle 10 were not an autonomous vehicle, and instead were a conventional vehicle controlled by a human driver, the human protocol gestures being discussed here would be destined for the driver of such vehicle, and there is an expectation on the part of the person making the human protocol gesture that the driver inside the vehicle will understand the gesture and will make resultant modifications to control of the vehicle. This expectation is created as part of the driving culture that has developed in North America and, similarly, in other parts of the world. For example, the United Kingdom uses the “signals by authorized persons”, found at: https://assets.publishing.service.gov.uk/media/560aa62bed915d035c00001b/the-highway-code-signals-by-authorised-persons.pdf. - As the
vehicle 10 is an autonomous one, and with reference to FIG. 21, human protocol gesture detection is a result of receiving sensory data at step 310 (e.g., from the sensors 90 1-90 S such as cameras or LIDAR) and implementing an algorithm, by the controller 80 at step 320, to recognize a human protocol gesture that may (or may not) be present in the received sensory data. Various gesture recognition algorithms may be used for this purpose. In the event a human protocol gesture is recognized at step 320 (i.e., the commands embedded therein have been decoded), a prescribed action can then be taken at step 330, involving a change to the manner in which the vehicle 10 is autonomously driven. A mapping between human protocol gestures and prescribed driving actions can be stored, such as in a database, in the memory portion 170 of the controller 80.
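- The three-step flow of FIG. 21 can be sketched as follows; the recognizer and the gesture-to-action mapping are illustrative stand-ins, not the algorithms of the specification:

```python
# Sketch of FIG. 21: receive sensory data (step 310), recognize a human
# protocol gesture in it (step 320), take the prescribed action (step 330).

PRESCRIBED_ACTIONS = {  # gesture -> driving action (assumed examples)
    "officer_stop": "halt at current position",
    "officer_beckon": "proceed into opposing lane as directed",
}

def recognize(sensory_data):
    """Step 320: stand-in for a real gesture recognition algorithm."""
    return sensory_data.get("gesture")

def process(sensory_data):
    """Steps 310-330: return the prescribed action, or None."""
    gesture = recognize(sensory_data)          # step 320
    if gesture in PRESCRIBED_ACTIONS:
        return PRESCRIBED_ACTIONS[gesture]     # step 330
    return None  # no gesture recognized; conventional driving rules apply
```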
- For example, consider a human protocol gesture made by a police officer to control traffic in both directions where there is only one lane, e.g., as a result of an accident. This could involve the police officer using one hand to stop traffic in one direction and the other hand to beckon traffic in the other direction. It is clear that, at some point, the police officer will allow one of the directions of traffic flow to drive along a stretch of road in the opposite direction of what would be permitted under conventional algorithmic driving rules.
- In order to detect and recognize the human protocol gesture, an algorithm may be used (step 320), which includes recognizing hand and arm movements in images captured in the vicinity of the vehicle 10 (step 310). The recognized hand and arm movements may be correlated against a database of sequences of pre-determined hand and arm movements associated with different human protocol gestures. Various distance minimization algorithms used in pattern recognition can be used for the purposes of recognizing hand and arm movements to with a certain level of confidence.
- Although hand gesture recognition technology is commercially available (see, for example, http://www.arcsoft.com/technology/gesture.html), it is envisaged that the person making the hand gesture (e.g., police officer or crossing guard) may be oriented in such a way where the space defined by the movements of his or her two hands intersects to a point where it may be difficult to detect hand or arm movements without additional spatial registration or reference. To assist in this regard, the person executing the human protocol gesture may be provided with ancillary equipment, such as a stop sign or baton, which can be more easily detected by the sensors 90 1-90 S and used as a reference to allow more accurate detection of the hand and arm movements and hence the human protocol gesture.
- In another embodiment, as shown in FIG. 22, the ancillary equipment may include gloves worn by the person executing the human protocol gesture. Moreover, the left and right gloves may each be detectable by the sensors 90 1-90 S and used as references for detecting the hand and arm movements.
- Consider now the situation where the human protocol gesture involves actions of a driver of an oncoming vehicle where, for example, a single lane must be shared between the
vehicle 10 and the oncoming vehicle. In this case, the sensors 90 1-90 S capture images of the driver of the oncoming vehicle, who may be gesturing with his or her hands. Alternatively or in addition, the driver of the oncoming vehicle may also have triggered a detectable change to the oncoming vehicle, such as by temporarily flashing the head lights or the high beams to signal a desire to relinquish right-of-way to the vehicle 10 (not necessarily knowing a priori that the vehicle 10 is an autonomous vehicle). These effects (hand gestures and/or flashing headlights), in combination, can be detected and processed by the control system 15 so as to result in a conclusion that the driver of the oncoming vehicle has carried out a human protocol gesture, which embeds a command (such as "go ahead").
- b) Driving Behavior Modifications Based on Human Protocol Gesture
- Once a human protocol gesture has been detected and recognized, the control system has effectively decoded a command (or sequence of commands) issued by the human 44 i involved in making the gesture, according to cultural norms. It is envisaged that this command, if acted upon by the
vehicle 10, would be inconsistent with conventional algorithmic driving rules and therefore the control system 15 may cause the vehicle 10 to change modes of operation. For example, the control system 15 may cause the vehicle 10 to enter a conventional (non-self-driving) mode, whereby control of the vehicle 10 is passed to a human that is occupying a driver's seat of the vehicle 10. In another embodiment, the control system 15 may enter an "autonomous override" mode of operation whereby the vehicle 10 is still in self-driving mode but behaves in a way that deviates from conventional algorithmic driving rules.
- The override mode of operation may be temporary, as it is expected that driving conditions will return to normal. As such, each human protocol gesture may be associated with an expected length of time that the
vehicle 10 will remain in the override mode. This amount of time may be variable, depending on the speed with which traffic is moving, the distance to the human 44 i carrying out the human protocol gesture, etc. Once the expected amount of time is reached, it is envisaged that there will no longer be any gesturing directed at the vehicle 10 and the control system 15 will have to determine autonomously the moment when it should return to a normal autonomous mode of operation (consistent with conventional algorithmic driving rules).
- For example, a police officer or crossing guard 44 i may have carried out a human protocol gesture that signals for the
vehicle 10 to stop, despite the vehicle 10 having right-of-way under conventional algorithmic driving rules. In this case, the vehicle 10 enters the override mode. During the override mode, the control system 15 modifies the way in which the vehicle 10 is autonomously driven by (i) keeping the vehicle 10 stopped and (ii) continuing the detection of human protocol gestures until detecting that the vehicle 10 has been signaled to proceed. If the control system 15 detects a subsequent human protocol gesture that includes a command for the vehicle 10 to change lanes into an oncoming traffic lane, then modifying the way in which the vehicle is autonomously driven includes proceeding at low speed into the oncoming traffic lane (which is contrary to conventional algorithmic driving rules), determining a re-entry point into the regular lane and re-entering the regular lane at the re-entry point. Thereafter, the vehicle 10 exits the override mode.
- Another way to condition the
vehicle 10 to exit the override mode may be to learn about its surroundings (e.g., using the sensors 90 1-90 S). For example, the control system 15 may implement an accident detection module that is configured to detect a scene of an accident based on factors such as vehicle shape and position distortions, color anomalies, broken glass fragments on the ground, presence of ambulances and so on. In this case, the control system 15 may be configured to determine a safe distance from the scene of the accident after which the vehicle 10 may return to its original lane and exit the override mode.
- c) Validation
- In addition to detecting and recognizing a human protocol gesture, in some embodiments, as shown in
FIG. 23, the control system 15 may perform an additional validation step, in order to confirm the authority of the source of the human protocol gesture, before proceeding to alter driving behavior. In that sense, the control system 15 may perform the validation step based on detection of a uniform (e.g., in the case of a police officer or crossing guard, whereby the uniform could include one or more of a vest, hat, badge, pants and shoes) or based on detection of a human driver of an oncoming car, as well as detecting "eye contact" with that human driver.
- In the case where the human 44 i carrying out the hand and arm movements associated with a human protocol gesture uses the
gloves, it is possible to signal to the control system 15 that it can act upon a human protocol gesture detected as having been carried out by a wearer of such "authenticated" gloves.
- It is also envisaged that the
gloves may be used to develop gestures dedicated to communicating with autonomous vehicles such as the vehicle 10. Such gestures could involve hand and arm movements that would not be intuitively understood by human drivers yet ideally suited for detection by cameras and/or LIDAR. For example, certain newly created movements, positions or signs may serve to cancel or reset the control system's interpretation of any ongoing human protocol gesture so as to allow the human to restart communications using hand and arm movements.
- It is also envisaged that the hand and arm movements may be recorded in memory and post-processed for algorithm improvement.
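The validation step described above (uniform detection, or "eye contact" with the driver of an oncoming car) can be sketched as a gate in front of any driving-behavior change. The uniform item names come from the text; the two-item minimum and function names are assumptions for the sketch:

```python
# Gate driving-behavior changes on validation of the gesture's source.
UNIFORM_ITEMS = {"vest", "hat", "badge", "pants", "shoes"}

def validate_source(detected_items, eye_contact_with_driver=False, min_items=2):
    """True when enough uniform items are detected, or when 'eye contact'
    with the driver of an oncoming car has been detected."""
    if eye_contact_with_driver:
        return True
    return len(set(detected_items) & UNIFORM_ITEMS) >= min_items

def act_on_gesture(decoded_action, detected_items, eye_contact=False):
    """Return the decoded action only if the gesture's source is validated."""
    if validate_source(detected_items, eye_contact):
        return decoded_action
    return None
```

With this gate, a decoded command from an unvalidated source is simply discarded, and the vehicle continues under conventional algorithmic driving rules.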
- 3. Autonomously Acting Based on Indicators Placed at Particular Locations
- In some embodiments, as shown in
FIG. 34, indicators 89 1-89 G may be configured to be placed at particular locations 94 1-94 L and recognizable by the control system 15 of the vehicle 10 such that the control system 15 autonomously operates the vehicle 10 (e.g., steers, decelerates, stops, opens one or more of the windows 21 1-21 W, unlocks one or more doors of the cabin 22, etc.) at these particular locations based on recognition of the indicators 89 1-89 G.
- The indicators 89 1-89 G can provide information about the particular locations 94 1-94 L to the
control system 15 of the vehicle 10 that may otherwise be unobtainable by the control system 15 through its sensing apparatus 82 monitoring the environment of the vehicle 10 if the indicators 89 1-89 G were absent from that environment.
- For example, in various embodiments, this may be useful when the
vehicle 10 moves at drive-through establishments (e.g., restaurants, banks, etc.), travels where potholes are present, looks to park, and/or is in other situations in which certain aspects of the particular locations 94 1-94 L would otherwise not be timely known by the control system 15 of the vehicle 10.
- An
indicator 89 x is a physical object dedicated to autonomous vehicles like the vehicle 10 and designed to be placed at a particular location 94 y and recognized by the control systems of autonomous vehicles, like the control system 15 of the vehicle 10, to cause these control systems to operate the autonomous vehicles based on recognition of the indicator 89 x. That is, the control system 15 of the vehicle 10 operates the vehicle 10 differently when recognizing the indicator 89 x than if it had not recognized the indicator 89 x. The indicator 89 x has an associated predefined meaning such that, upon being recognized by the control system 15 of the vehicle 10, the control system 15 knows what the indicator 89 x means. Being dedicated to autonomous vehicles, the indicator 89 x is not a traffic sign (a.k.a. road sign) conventionally used for human-driven vehicles.
- The
indicator 89 x at the particular location 94 y may be implemented in any suitable way in various embodiments.
- For example, in some embodiments, as shown in
FIG. 35, the indicator 89 x may be an optical indicator configured to be optically observed by the sensing apparatus 82 of the control system 15 of the vehicle 10. For instance, in some embodiments, the indicator 89 x may include a visual element 95 such as an image (e.g., a symbol, etc.), a color, etc., capturable by a camera of the sensing apparatus 82 and recognizable by the controller 80 of the control system 15. The visual element 95 may be printed, painted or otherwise applied. In some cases, the indicator 89 x may comprise a supporting portion 97 (e.g., a wall, panel, etc.) and the visual element 95 may include a layer that is printed, painted or otherwise applied onto the supporting portion 97. In other cases, the visual element 95 may be printed, painted or otherwise applied directly onto an existing structure (e.g., part of the road 11, a building wall, etc.) at the particular location 94 y. The visual element 95 may include material (e.g., tape, paint, ink, etc.) more easily observable by the camera of the sensing apparatus 82, such as by being more reflective (e.g., highly retroreflective, reflective of IR or other particular wavelengths, etc.). As another example, in some embodiments, as shown in FIG. 36, the indicator 89 x may be a signal emitter (e.g., a beacon) configured to emit a signal receivable by the communication interface 68 of the vehicle 10 and recognizable by the controller 80 of the control system 15. For instance, in some embodiments, the indicator 89 x may include a transmitter 98 configured to transmit the signal repeatedly (e.g., periodically) or in response to a trigger or interrogation signal previously issued by the vehicle 10. The signal emitted by the indicator 89 x may be wirelessly conveyed via a cellular, WiFi, Bluetooth, or other wireless link.
- For instance, in some embodiments, as shown in
FIG. 37, an indicator 89 x may be placed at a particular location 94 y where an interaction with an external element 106 that is external to the vehicle 10 is to occur, such that the control system 15 of the vehicle 10 autonomously stops the vehicle 10 at the particular location 94 y in order to allow occurrence of the interaction with the external element 106. More particularly, in this embodiment, the particular location 94 y is at a drive-through establishment 108, such as a restaurant (e.g., a fast-food restaurant, a coffee shop, etc.), in which case the external element 106 is a drive-through counter to pay and/or pick up an order of food and/or beverage, or a bank, in which case the external element 106 is an automated teller machine (ATM) to perform a financial transaction. Upon recognizing the indicator 89 x, the control system 15 of the vehicle 10 understands that the vehicle 10 is to be stopped at the particular location 94 y, which may be set so that a given one of the windows 21 1-21 W of the cabin 22 is aligned with the drive-through counter, ATM or other external element 106 to proceed with the interaction with the external element 106.
- In some embodiments, as shown in
FIG. 38, an indicator 89 x may be placed at a particular location 94 y that should be avoided by the vehicle 10, such that the control system 15 of the vehicle 10 autonomously steers the vehicle 10 to avoid the particular location 94 y. More particularly, in this embodiment, the particular location 94 y is at a pothole 112 on the road 11. Upon recognizing the indicator 89 x, the control system 15 of the vehicle 10 understands that the vehicle 10 is to avoid the pothole 112 at the particular location 94 y and determines an alternative path to steer the vehicle 10 without crossing the pothole 112 (e.g., by using the sensing apparatus 82 to assess whether there is an oncoming vehicle in an adjacent lane, etc.).
- In some embodiments, as shown in
FIG. 39, an indicator 89 x may be placed at a particular location 94 y that is a parking spot 118 for the vehicle 10, such that the control system 15 of the vehicle 10 autonomously parks the vehicle 10 at the parking spot 118. More particularly, in this embodiment, the parking spot 118 at the particular location 94 y may not be indicated by conventional paint on the road 11 or other conventional parking signs, so that it may not be apparent to the control system 15 of the vehicle 10 that the vehicle 10 can park there. Upon recognizing the indicator 89 x, the control system 15 of the vehicle 10 understands that the vehicle 10 can park at the parking spot 118 at the particular location 94 y and proceeds to autonomously park the vehicle 10 there (e.g., by using the indicator 89 x as a reference for parking, such as a center, corner or other reference point of the parking spot 118).
- 4. Facilitating Acts of Occupants (e.g., Unrelated to and Normally not Done while Driving)
- In some embodiments, as shown in
FIG. 24, the autonomous vehicle 10 may include occupant-act facilitators 45 1-45 D that comprise devices configured to facilitate one or more acts of one or more occupants in the cabin 22 of the vehicle 10, such as one or more acts unrelated to and normally not done while driving, including, for example, sleeping, exercising, working, eating, cooking, and/or any other suitable act.
- a) Sleeping
- In some embodiments, as shown in
FIG. 25, an occupant-act facilitator 45 i may be a sleeping facilitator configured to facilitate sleeping of an occupant in the cabin 22, such as by altering the cabin 22 to reduce stimuli (e.g., light, noise, vibrations, etc.) from the vehicle 10 and/or its environment.
- For example, in some embodiments, the sleeping
facilitator 45 i may comprise the light-control system 55 to control (e.g., reduce) light entering into the cabin 22 via the windows 21 1-21 W, which may comprise the light blocker 27, such as the window covering 23 deployable to cover at least part of the windows 21 1-21 W and/or the window transmissivity changer 25 (e.g., film) to change the tint or other aspect affecting transmissivity of one or more of the windows 21 1-21 W, as discussed above. As another example, in some embodiments, the sleeping facilitator 45 i may comprise the noise-control system 57 configured to control (e.g., reduce) noise in the cabin 22, which may comprise the noise canceller 59 to at least partly cancel the noise entering the cabin 22, as discussed above. As yet another example, in some embodiments, the sleeping facilitator 45 i may comprise the seat-altering system 52 configured to alter a seat 20 i (e.g., a driver's seat) occupied by the occupant, which may comprise the pillow 63 for the seat 20 i, as discussed above.
- In some embodiments, instead of being controlled by the
controller 80, the sleeping facilitator 45 i may be manually operated within the cabin 22 by an occupant. For example, in some embodiments, the occupant may interact with the user interface 70 to input commands to activate, move, and/or otherwise control the sleeping facilitator 45 i when he/she desires to sleep.
- In some examples, one or more functionalities of the sleeping
facilitator 45 i that enhance privacy and comfort as discussed above may also be used by the occupant for purposes other than sleep. For instance, in some embodiments, this may be used by the occupant for relaxing (without necessarily sleeping), sex, etc. - b) Working
- In some embodiments, as shown in
FIG. 26, an occupant-act facilitator 45 j may be a working facilitator configured to facilitate work of an occupant in the cabin 22 by (e.g., altering the cabin 22 for) providing a workspace 64 for the occupant.
- For example, in some embodiments, the working
facilitator 45 j providing the workspace 64 may comprise a desk 65 (e.g., a table) on which the occupant can work, such as by supporting a computer (e.g., a laptop computer, a tablet, etc.), papers, pens, and/or other work items used by the occupant. In some cases, the working facilitator 45 j may include a computer mount 66, such as a docking station and/or connectors (e.g., one or more power outlets or other electrical connectors, USB connectors, etc.) associated with the desk 65. The working facilitator 45 j may also include a screen 67 connectable to the computer (e.g., via the computer mount 66) and integrated into the cabin 22 (e.g., in a dashboard, such as part of the user interface 70).
- In some embodiments, at least part of the working
facilitator 45 j providing the workspace 64 may be movable between a working position, in which it is usable by the occupant to work, and a nonworking (e.g., stowed) position, in which it is stowed (e.g., stored), concealed and/or otherwise not usable by the occupant to work. For example, in some embodiments, at least part of the working facilitator 45 j providing the workspace 64 may be deployable (e.g., extendible) from the nonworking position into the working position and retractable from the working position into the nonworking position (e.g., in which it may be concealed by a door).
- For instance, in some embodiments, the
desk 65 may be movable between the working position, in which it extends over the occupant while he/she is sitting on a seat 20 i so as to be usable by the occupant to work on the desk 65, and the nonworking position, in which it clears (i.e., does not extend over) the occupant while he/she is sitting on the seat 20 i so that the occupant is unimpeded by the desk 65. Alternatively, in some embodiments, the desk 65 may be movable between the working position, in which it extends over an adjacent one of the seats 20 1-20 S (e.g., a passenger seat) that is adjacent to the seat 20 i of the occupant and that can be rotated to face the desk 65, and the nonworking position, in which it clears (i.e., does not extend over) that adjacent seat. For example, in some embodiments, the desk 65 may be deployable (e.g., extendible) from the nonworking position, in which it is disposed in a recess (e.g., between adjacent ones of the seats 20 1-20 S, below a dashboard of the user interface 70, etc.), into the working position over the occupant in the seat 20 i and retractable from the working position into the nonworking position. In cases where the workspace 64 includes the screen 67 for the computer, the screen 67 may also be movable (e.g., deployable and retractable) between the working position and the nonworking position.
- c) Exercising
- In some embodiments, as shown in
FIG. 27, an occupant-act facilitator 45 k may be an exercising facilitator configured to facilitate exercising of an occupant in the cabin 22 by (e.g., altering the cabin 22 for) providing an exerciser 71 for the occupant. The exerciser 71 can comprise any apparatus usable by the occupant to physically exercise.
- For example, in some embodiments, the
exerciser 71 may comprise a cardiovascular exercising device 47. For instance, in some cases, the cardiovascular exercising device 47 may comprise a leg-motion mechanism configured to be operated by legs of the occupant (e.g., including pedals, gliders, and/or other feet-engaging elements configured to be engaged by the occupant's feet to operate the leg-motion mechanism, in a pedaling, swinging, or any other suitable movement). Alternatively or additionally, the cardiovascular exercising device 47 may comprise an arm-motion mechanism configured to be operated by arms of the occupant (e.g., including handles and/or other hand-engaging elements configured to be engaged by the occupant's hands to operate the arm-motion mechanism, in a rowing, pulling or any other suitable movement). In some cases, the cardiovascular exercising device 47 may comprise both the leg-motion mechanism and the arm-motion mechanism configured to be operated by the occupant's legs and arms (e.g., akin to an elliptical exercising machine).
- As another example, in some embodiments, the
exerciser 71 may comprise a strength training device 48. For instance, in some cases, the strength training device 48 may comprise an arm-motion mechanism configured to be operated by the occupant's arms (e.g., including handles and/or other hand-engaging elements configured to be engaged by the occupant's hands to operate the arm-motion mechanism, in a bending, pulling or any other suitable movement). Additionally or alternatively, in some cases, the strength training device 48 may comprise a leg-motion mechanism configured to be operated by the occupant's legs (e.g., including pedals, gliders, and/or other feet-engaging elements configured to be engaged by the occupant's feet to operate the leg-motion mechanism, in a pushing, raising, and/or any other suitable movement).
- The
exerciser 71 may provide resistance for exercising of the occupant in any suitable way. For example, in some embodiments, the exerciser 71 may comprise free weights that can be used by the occupant to exercise. In such embodiments, the exerciser 71 may include a free-weight holder (e.g., rack) to hold the free weights when not in use. As another example, the exerciser 71 may comprise a fluidic (e.g., hydraulic or pneumatic) resistance mechanism providing pressure to be moved against by the occupant during exercising. In some cases, at least part of the cardiovascular exercising device 47 and at least part of the strength training device 48 of the exerciser 71 may be implemented by a common device.
- In some embodiments, as shown in
FIG. 28, where the power source 13 of the powertrain 14 of the autonomous vehicle 10 comprises an electric motor powered by a battery, the exerciser 71 may be connected to the powertrain 14 of the vehicle 10 to recharge the battery. A generator 72 is drivable by the exerciser 71 to generate electrical power applied to the battery to recharge the battery. In some cases, the user interface 70 of the cabin 22 may indicate to the occupant how much power he/she has given to the vehicle 10 by exercising (e.g., an indication of watts, a range in kilometers or miles for the vehicle 10, etc.).
- In some examples, at least part of the exercising
facilitator 45 k providing the exerciser 71 may be movable between an exercising position, in which it is usable by the occupant to exercise, and a nonexercising (e.g., stowed) position, in which it is stowed (e.g., stored), concealed and/or otherwise not usable by the occupant to exercise. For example, in some embodiments, at least part of the exercising facilitator 45 k providing the exerciser 71 may be deployable (e.g., extendible) from the nonexercising position into the exercising position and retractable from the exercising position into the nonexercising position (e.g., in which it may be concealed by a door).
- For instance, in some embodiments, the
exerciser 71 may be movable between the exercising position, in which it extends to be engageable by the occupant while he/she is sitting on a seat 20 i (e.g., a driver's seat) so as to be usable by the occupant to exercise, and the nonexercising position, in which it clears (i.e., is unengageable by) the occupant while he/she is sitting on the seat 20 i so that the occupant is unimpeded by the exerciser 71. For example, in some embodiments, the exerciser 71 may be deployable from the nonexercising position, in which it is disposed in a recess (e.g., between adjacent ones of the seats 20 1-20 S, below a dashboard of the user interface 70, etc.), into the exercising position to engage the occupant in the seat 20 i and retractable from the exercising position into the nonexercising position.
- d) Eating
- In some embodiments, as shown in
FIG. 29, an occupant-act facilitator 45 m may be an eating facilitator configured to facilitate eating by an occupant in the cabin 22 by (e.g., altering the cabin 22 for) providing an eating area 75 for the occupant.
- For example, in some embodiments, the eating
facilitator 45 m providing the eating area 75 may comprise a table 77 (e.g., a tray or other flat support) on which the occupant can eat, such as by supporting food and tableware (e.g., dishes, glasses, knives, forks, etc.) used by the occupant.
- In some embodiments, at least part of the eating
facilitator 45 m providing the eating area 75 may be movable between an eating position, in which it is usable by the occupant to eat, and a noneating (e.g., stowed) position, in which it is stowed (e.g., stored), concealed and/or otherwise not usable by the occupant to eat. For example, in some embodiments, at least part of the eating facilitator 45 m providing the eating area 75 may be deployable (e.g., extendible) from the noneating position into the eating position and retractable from the eating position into the noneating position (e.g., in which it may be concealed by a door).
- For instance, in some embodiments, the table 77 may be movable between the eating position, in which it extends over the occupant while he/she is sitting on a seat 20 i (e.g., a driver's seat) so as to be usable by the occupant to eat at the table 77, and the noneating position, in which it clears (i.e., does not extend over) the occupant while he/she is sitting on the
seat 20 i so that the occupant is unimpeded by the table 77. Alternatively, in some embodiments, the table 77 may be movable between the eating position, in which it extends over an adjacent one of the seats 20 1-20 S (e.g., a passenger seat) that is adjacent to the seat 20 i of the occupant and that can be rotated to face the table 77, and the noneating position, in which it clears (i.e., does not extend over) that adjacent seat. For example, in some embodiments, the table 77 may be deployable (e.g., extendible) from the noneating position, in which it is disposed in a recess (e.g., between adjacent ones of the seats 20 1-20 S, below a dashboard of the user interface 70, etc.), into the eating position over the occupant in the seat 20 i and retractable from the eating position into the noneating position.
- In some embodiments, the eating
facilitator 45 m may comprise a holder 78 to hold the tableware on the table 77 while the vehicle 10 is in motion. For example, in some embodiments, the holder 78 may comprise a mechanical holder (e.g., a clamp, a recess, etc.) to abut and mechanically hold the tableware in place. In other embodiments, the holder 78 may comprise a magnetic holder, such as a magnet attracting an opposite magnet secured to (e.g., built into or adhesively bonded at an underside of) the tableware, to magnetically hold the tableware in place.
- In some examples, the eating
facilitator 45 m may comprise a waste disposer 69 to dispose of waste (i.e., garbage) such as what is not eaten by the occupant. For example, in some embodiments, the waste disposer 69 may comprise a garbage can configured to receive the waste, either directly or in a garbage bag placed in the can, and to be emptied. The garbage can may include a lid having a sealing mechanism to limit odors propagating in the cabin 22. As another example, in some embodiments, the waste disposer 69 may comprise a garburator to break apart the waste. In some cases, the garburator may receive water from a tank (e.g., filled by rain water falling onto the vehicle 10).
- The
waste disposer 69 may be located in the vehicle 10 so as to be vented (e.g., open to a vent exposed to ambient air outside the vehicle 10, such as at an underside of the vehicle 10). In various embodiments, the waste disposer 69 may be emptied by removing and emptying the garbage can manually or by pumping or otherwise emptying the garburator.
- e) Cooking
- In some embodiments, as shown in
FIG. 30, an occupant-act facilitator 45 n may be a cooking facilitator configured to facilitate cooking by an occupant in the cabin 22 by (e.g., altering the cabin 22 for) providing a cooking area 79 for the occupant.
- For example, in some embodiments, the
cooking facilitator 45 n providing the cooking area 79 may comprise the table 77 as discussed above, which can be used by the occupant to cut, mix, and/or otherwise prepare food in addition to eating.
- As another example, in some embodiments, the
cooking facilitator 45 n providing the cooking area 79 may comprise one or more appliances 81 1-81 T configured to cook.
- For instance, in some embodiments, an appliance 81 i may be an oven, stove, slow cooker, or grill (e.g., a microwave oven, an electric stove, an electric grill, etc.) or other heating appliance to cook by heat. In some cases, venting may be effected by opening one or more of the windows 21 1-21 W and/or by providing a vent associated with the heating appliance 81 i.
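The venting options just described (a dedicated vent associated with the heating appliance, or opening one or more of the windows 21 1-21 W) can be sketched as a small interlock; the action names and boolean model are invented for illustration:

```python
# Interlock: a heating appliance that is on must be vented, either through
# its dedicated vent or, failing that, by opening a window.

def venting_actions(appliance_on, has_dedicated_vent):
    """Return the venting actions to take for the current appliance state."""
    if not appliance_on:
        return []                        # nothing to vent
    if has_dedicated_vent:
        return ["open_appliance_vent"]   # preferred: dedicated vent
    return ["open_window"]               # fallback: open a cabin window
```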
- In some embodiments, an appliance 81 j may be a refrigerator to refrigerate ingredients (e.g., produce, meat, fish, poultry, etc.) usable by the occupant to cook. In some cases, the refrigerator 81 j may be powered by a battery (e.g., dedicated to powering the refrigerator and recharged by a solar panel including photovoltaics).
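As a rough illustration of the dedicated, solar-recharged battery powering the refrigerator 81 j, the following energy balance tracks the battery level over an interval; all power and capacity figures are assumed round numbers, not values from the patent:

```python
# Net battery change over an interval is solar input minus refrigerator
# draw, clamped between empty and the pack's capacity (all values in Wh/W).

def battery_level_wh(level_wh, solar_w, fridge_w, hours, capacity_wh):
    net_wh = (solar_w - fridge_w) * hours
    return min(capacity_wh, max(0.0, level_wh + net_wh))

# e.g., 60 W of sun against a 45 W refrigerator draw for 4 h adds 60 Wh:
level = battery_level_wh(200.0, solar_w=60.0, fridge_w=45.0, hours=4.0,
                         capacity_wh=500.0)
```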
- In some examples, the
cooking facilitator 45 n may comprise the waste disposer 69 configured to dispose of waste which is not used by the occupant when cooking.
- In some embodiments, at least part of the
cooking facilitator 45 n providing the cooking area 79 may be movable between a cooking position, in which it is usable by the occupant to cook, and a noncooking (e.g., stowed) position, in which it is stowed (e.g., stored), concealed and/or otherwise not usable by the occupant to cook. For example, in some embodiments, at least part of the cooking facilitator 45 n providing the cooking area 79 may be deployable (e.g., extendible) from the noncooking position into the cooking position and retractable from the cooking position into the noncooking position (e.g., in which it may be concealed by a door).
- For instance, in some embodiments, the table 77 may be movable between the cooking position, in which it extends over the occupant while he/she is sitting on a seat 20 i (e.g., a driver's seat) so as to be usable by the occupant to cook at the table 77, and the noncooking position, in which it clears (i.e., does not extend over) the occupant while he/she is sitting on the
seat 20 i so that the occupant is unimpeded by the table 77, or may be movable between the cooking position, in which it extends over an adjacent one of the seats 20 1-20 S (e.g., a passenger seat) that is adjacent to the seat 20 i of the occupant and that can be rotated to face the table 77, and the noncooking position, in which it clears (i.e., does not extend over) that adjacent seat, as discussed above.
- In some embodiments, an appliance 81 x may be movable between the cooking position, in which it can be used by the occupant while he/she is sitting on a seat 20 i (e.g., a driver's seat) to cook, and the noncooking position, in which it is stowed, concealed and/or otherwise unusable by the occupant while he/she is sitting on the
seat 20 i. For example, in some embodiments, the appliance 81 x may be deployable (e.g., extendible) from the noncooking position, in which it is disposed in a recess (e.g., between adjacent ones of the seats 20 1-20 S, below a dashboard of the user interface 70, etc.), into the cooking position for the occupant in the seat 20 i and retractable from the cooking position into the noncooking position.
- In some cases, an occupant-act facilitator 45 x (e.g., the window covering 23, the
window transmissivity changer 25, the noise canceller 59, the desk 65, the computer mount 66, the exerciser 71, the one or more appliances 81 1-81 T, etc.) may be built into (i.e., integrated in) the cabin 22 during original manufacturing of the autonomous vehicle 10 (e.g., below a dashboard, on a console of the user interface 70, on or between a given one of the seats 20 1-20 S, etc.).
- In other cases, an occupant-act facilitator 45 x (e.g., the window covering 23, the
window transmissivity changer 25, the noise canceller 59, the desk 65, the computer mount 66, the exerciser 71, the one or more appliances 81 1-81 T, etc.) may be configured to be installed in the cabin 22 after original manufacturing of the vehicle 10 (e.g., an aftermarket device installable in the cabin 22 by an owner or leaser of the vehicle 10). For example, in some embodiments, as shown in FIG. 31, the occupant-act facilitator 45 x may comprise a connector 88 configured to connect the occupant-act facilitator 45 x to a supporting portion 91 of the cabin 22 (e.g., a wall of the cabin 22 below a dashboard, adjacent to a console of the user interface 70, on or between a given one of the seats 20 1-20 S, etc.). The connector 88 may comprise one or more fasteners, such as screws, bolts, hook-and-loop (e.g., Velcro) fasteners, clips, clamps, and/or any other fastening device.
- In some examples, the supporting
portion 91 of the cabin 22 may comprise a connector 92 complementary to and configured to engage and interconnect with the connector 88 of the occupant-act facilitator 45 x (e.g., one or more (e.g., threaded) openings, clips, latches, etc.). For instance, in some embodiments, the connector 92 of the supporting portion 91 of the cabin 22 may be built into (i.e., integrated into) the cabin 22 during original manufacturing of the autonomous vehicle 10. Alternatively, in some embodiments, the connector 92 of the supporting portion 91 of the cabin 22 may be configured to be installed in the cabin 22 after original manufacturing of the vehicle 10 along with the occupant-act facilitator 45 x. - 5. Automatic Personalization for Occupant
- In some embodiments, as shown in
FIGS. 40 and 41, the vehicle 10 may be personalized for an occupant based on an identity of the occupant, such that one or more aspects of the vehicle 10, like a configuration of the cabin 22, the self-driving mode of the control system 15 of the vehicle 10, a destination and/or a route of the vehicle 10, and/or other aspects of the vehicle 10, are adjusted based on the identity of the occupant. - For instance, this may be useful where different occupants use the
vehicle 10 at different times, whether the vehicle 10 is a private one (e.g., which may be used by parents and their children) or a public one used as part of a taxi, ride-hailing or vehicle-sharing service. - More particularly, in this embodiment, the
control system 15 is configured to receive an identifier 121 indicative of the identity of the occupant and to adjust one or more aspects of the vehicle 10 based on the identity of the occupant. For example, in some embodiments, the identifier 121 may include a name, a code, or other identification information input by the occupant. As another example, in some embodiments, the identifier 121 may include a biometric of the occupant, such as a picture, fingerprint, voice print, etc. - In some cases, the
identifier 121 may be input by the occupant via the user interface 70 of the cabin 22 (e.g., using buttons, a camera or other biometric reader, etc.). In other cases, the identifier 121 may be transmitted from a personal device carried by the occupant, such as a smartphone or other wireless phone, a tablet computer, a head-mounted display, smartwatch or other wearable device, etc., to the communication interface 68 of the vehicle 10. - Upon receiving the
identifier 121, the controller 80 of the vehicle 10 adjusts one or more aspects of the vehicle 10 based on the identity of the occupant. For instance, in some embodiments, as shown in FIG. 42, a database 130 may store a record 152 including the identifier 121 of the occupant and travel information 145, which may be indicative of a self-driving preference, a route, a destination, etc., of the occupant when travelling in the vehicle 10. The database 130 can store multiple such records including identifiers of various individuals who may travel in the vehicle 10 and travel information, which may be indicative of preferences, routes, destinations, etc., of these individuals when travelling in the vehicle 10, so that one or more aspects of the vehicle 10 may be adjusted based on identities of these various individuals when they are occupants. In some cases, the database 130 may be part of the controller 80 of the vehicle 10. In other cases, the database 130 may be part of a server external to the vehicle 10 and accessible by the controller 80 via the communication interface 68 of the vehicle 10. - For example, in some embodiments, based on the
travel information 145 associated with the identifier 121 of the occupant, the controller 80 may:
- alter the cabin 22 (e.g., a seat 20 i, reduce stimuli (e.g., light, noise, vibrations, etc.) from the vehicle 10 and/or its environment, deploy one or more of the occupant-act facilitators 45 1-45 D, etc.), such as discussed above, based on one or more preferences of the occupant;
- alter the self-driving mode of the vehicle 10, such as by reducing potential for sudden or abrupt movements (e.g., acceleration, braking, turning, shaking, etc.) of the vehicle 10 on the road 11, based on one or more preferences of the occupant;
- set a destination and/or a route for the vehicle 10, based on one or more preferences of the occupant (e.g., a home address when entering the vehicle 10 after work or school hours, at night, etc.); and/or
- adjust any other aspect of the vehicle 10 based on the identity of the occupant.
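The record lookup and adjustments described above can be sketched as follows. This is a minimal illustration only: the record fields, key names, and defaults are assumptions standing in for the database 130, identifier 121, and travel information 145, and are not taken from the specification.

```python
# Stand-in for database 130: records keyed by identifier 121, each holding
# travel information 145 for one occupant (all fields are illustrative).
TRAVEL_RECORDS = {
    "occupant-17": {
        "cabin": {"recline_seat": True, "dim_lights": True},
        "driving_style": "gentle",       # fewer abrupt accelerations/brakes
        "default_destination": "home",
    },
}

# Neutral adjustments applied when the occupant is not recognized.
DEFAULTS = {"cabin": {}, "driving_style": "standard", "default_destination": None}

def personalize(identifier):
    """Return the vehicle adjustments for a recognized occupant,
    or neutral defaults for an unrecognized one."""
    return TRAVEL_RECORDS.get(identifier, DEFAULTS)
```

A controller analogous to controller 80 would then apply each field in turn: reconfigure the cabin 22, select a self-driving mode, and set the destination and/or route.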
- In other embodiments, information to personalize the
vehicle 10 based on the identity of the occupant, like the travel information 145, may be stored in a personal device carried by the occupant, such as a smartphone or other wireless phone, a tablet computer, a head-mounted display, smartwatch or other wearable device, etc., and transmitted to the controller 80 of the vehicle 10 via the communication interface 68. - In some embodiments, as shown in
FIG. 32, one or more systems, devices and/or other components discussed above (e.g., the window covering 23, the window transmissivity changer 25, and/or other components of the light-control system 55; the noise canceller 59 and/or other components of the noise-control system 57; the desk 65, the computer mount 66, the exerciser 71, the one or more appliances 81 1-81 T, and/or other components of the occupant-act facilitators 45 1-45 D; one or more of the sensors 75 1-75 P; etc.) may be an aftermarket apparatus 30 configured to be installed in the cabin 22 after original manufacturing of the vehicle 10 and, in some cases, may be configured to be automatically controlled by a controller 93 that is implemented after original manufacturing of the vehicle 10. - The
controller 93 may function as discussed above in respect of the controller 80 of the control system 15 of the autonomous vehicle 10. For example, in various embodiments, the controller 93 may be configured to monitor the interior of the cabin 22 and, in response to detecting an actionable event in the cabin 22, cause the vehicle 10 to autonomously reroute itself, cause issuance of a notification 85 to a communication device 87 external to the vehicle 10, cause the cabin 22 to be altered, cause the self-driving mode of the vehicle 10 to be altered, cause the vehicle 10 to autonomously perform one or more actions based on interactions with (e.g., gestures of) humans external to the vehicle 10, etc., as discussed above in respect of the controller 80. - In some embodiments, as shown in
FIG. 33, the controller 93 comprises an interface 266, a processing portion 268, and a memory portion 270, which are implemented by suitable hardware and software. - The
interface 266 comprises one or more inputs and outputs allowing the controller 93 to receive input signals from and send output signals to other components to which the controller 93 is connected (i.e., directly or indirectly connected), including the aftermarket apparatus 30 (e.g., which may include one or more of: the window covering 23, the window transmissivity changer 25, and/or other components of the light-control system 55; the noise canceller 59 and/or other components of the noise-control system 57; the desk 65, the computer mount 66, the exerciser 71, the one or more appliances 81 1-81 T, and/or other components of the occupant-act facilitators 45 1-45 D); one or more of the sensors 75 1-75 P; the powertrain 14; the steering system 16; the user interface 70; the communication interface 68; etc. - The
processing portion 268 comprises one or more processors for performing processing operations that implement functionality of the controller 93. A processor of the processing portion 268 may be a general-purpose processor executing program code stored in the memory portion 270. Alternatively, a processor of the processing portion 268 may be a specific-purpose processor comprising one or more preprogrammed hardware or firmware elements (e.g., application-specific integrated circuits (ASICs), electrically erasable programmable read-only memories (EEPROMs), etc.) or other related elements. - The
memory portion 270 comprises one or more memories for storing program code executed by the processing portion 268 and/or data (e.g., maps, vehicle parameters, etc.) used during operation of the processing portion 268. A memory of the memory portion 270 may be a semiconductor medium (including, e.g., a solid-state memory), a magnetic storage medium, an optical storage medium, and/or any other suitable type of memory. A memory of the memory portion 270 may be read-only memory (ROM) and/or random-access memory (RAM), for example. - In some embodiments, the
controller 93 may comprise and/or interact with one or more other control units of the autonomous vehicle 10. For example, in some embodiments, the controller 93 may comprise and/or interact with a powertrain control unit of the powertrain 14, such as an engine control unit (ECU), a transmission control unit (TCU), etc. - In some cases, the
controller 93 may be configured to be installed and implemented into the vehicle 10 after original manufacturing of the vehicle 10 (e.g., an aftermarket device installable and implementable in the vehicle 10 by an owner or leaser of the vehicle 10). - For example, in some embodiments, software implementing functionality of the
controller 93 may be downloaded onto the memory portion 170 of the controller 80 built into the vehicle 10 during original manufacturing of the vehicle 10 such that the controller 80 becomes the controller 93 (i.e., the interface 166, the processing portion 168 and the memory portion 170 of the controller 80 respectively become the interface 266, the processing portion 268 and the memory portion 270 of the controller 93). - As another example, in some embodiments, the
controller 93 may be a standalone controller that is separate from the controller 80 of the control system 15 of the vehicle 10 and provided, as part of an aftermarket kit, together with the aftermarket apparatus 30 that it is configured to control (e.g., the window covering 23, the window transmissivity changer 25, and/or other components of the light-control system 55; the noise canceller 59 and/or other components of the noise-control system 57; the desk 65, the computer mount 66, the exerciser 71, the one or more appliances 81 1-81 T, and/or other components of the occupant-act facilitators 45 1-45 D; one or more of the sensors 75 1-75 P; etc.). - Certain additional elements that may be needed for operation of some embodiments have not been described or illustrated as they are assumed to be within the purview of those of ordinary skill in the art. Moreover, certain embodiments may be free of, may lack and/or may function without any element that is not specifically disclosed herein.
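A controller organized as described (an interface routing input signals, a processing portion executing operations, and a memory portion holding data) might be sketched as follows. This is an illustrative sketch only: the class, method, and event names, and the event-to-response mapping, are hypothetical and are not taken from the specification.

```python
class Controller:
    """Minimal sketch of a controller like controller 93: memory holds
    data, registered handlers are the processing operations, and
    receive() plays the role of the I/O interface."""

    def __init__(self):
        self.memory = {}        # cf. memory portion 270: parameters, maps
        self._handlers = {}     # cf. processing portion 268: operations

    def register(self, signal_name, handler):
        self._handlers[signal_name] = handler

    def receive(self, signal_name, payload):
        # cf. interface 266: route an input signal to its operation
        handler = self._handlers.get(signal_name)
        return handler(payload, self.memory) if handler else None

# Example processing operation, in the spirit of the actionable-event
# handling discussed above: map a detected in-cabin event to responses.
def on_cabin_event(event, memory):
    responses = {
        "occupant_distress": ["issue_warning", "alter_self_driving_mode"],
        "medical_emergency": ["reroute_vehicle", "notify_external_device"],
    }
    return responses.get(event, [])

controller = Controller()
controller.register("cabin_event", on_cabin_event)
```

With this wiring, `controller.receive("cabin_event", "medical_emergency")` yields the reroute-and-notify pair, while unregistered signals fall through harmlessly.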
- Any feature of any embodiment discussed herein may be combined with any feature of any other embodiment discussed herein in some examples of implementation.
- In case of any discrepancy, inconsistency, or other difference between terms used herein and terms used in any document incorporated by reference herein, meanings of the terms used herein are to prevail and be used.
- Although various embodiments and examples have been presented, this was for purposes of description, but should not be limiting. Various modifications and enhancements will become apparent to those of ordinary skill in the art.
Claims (21)
1.-117. (canceled)
118. A vehicle comprising: a cabin including a driver's seat for an occupant of the vehicle and a user interface for interacting with the occupant; and a control system configured to autonomously drive the vehicle, monitor the occupant to determine a state of the occupant, and alter a screen of the user interface to allow the occupant to use the screen for a driving-unrelated act of the occupant during autonomous driving of the vehicle.
119. The vehicle of claim 118, wherein the control system is configured to alter the autonomous driving of the vehicle based on the state of the occupant.
120. The vehicle of claim 119, wherein, to alter the autonomous driving of the vehicle based on the state of the occupant, the control system is configured to autonomously stop the vehicle.
121. The vehicle of claim 119, wherein, to alter the autonomous driving of the vehicle based on the state of the occupant, the control system is configured to autonomously reroute the vehicle.
122. The vehicle of claim 118, wherein, to alter the screen, the control system is configured to make the screen usable by the occupant for the driving-unrelated act of the occupant during the autonomous driving of the vehicle after the screen was unusable by the occupant for the driving-unrelated act of the occupant during traveling of the vehicle.
123. The vehicle of claim 118, wherein the control system is configured to alter the screen by moving the screen.
124. The vehicle of claim 123, wherein the control system is configured to move the screen between a first position in which the screen is unusable for the driving-unrelated act of the occupant and a second position in which the screen is usable for the driving-unrelated act of the occupant during the autonomous driving of the vehicle.
125. The vehicle of claim 124, wherein the screen is stowed in the first position and is deployed in the second position.
126. The vehicle of claim 124, wherein: in the first position, the screen does not extend over the occupant while the occupant is sitting in the driver's seat; and, in the second position, the screen extends over the occupant while the occupant is sitting in the driver's seat.
127. The vehicle of claim 118, wherein: the user interface comprises a dashboard; and the screen is part of the dashboard.
128. The vehicle of claim 118, wherein: the cabin comprises a passenger seat adjacent to the driver's seat; and at least part of the screen is disposed between the driver's seat and the passenger seat.
129. The vehicle of claim 118, wherein the control system is configured to alter a computer element of the user interface to allow the occupant to use the screen and the computer element for the driving-unrelated act of the occupant during the autonomous driving of the vehicle.
130. The vehicle of claim 129, wherein the control system is configured to alter the screen and the computer element by moving the screen and the computer element.
131. The vehicle of claim 118, wherein the control system is configured to determine the state of the occupant based on at least one of a position of the occupant, a movement of the occupant, and a face of the occupant.
132. The vehicle of claim 118, wherein the control system is configured to issue a warning to the occupant based on the state of the occupant during the autonomous driving of the vehicle.
133. The vehicle of claim 118, wherein the control system is configured to monitor the occupant based on information received from at least one of a camera, a motion sensor, and a vital sign sensor.
134. The vehicle of claim 118, wherein the control system is configured to change between a plurality of modes of operation, which includes an autonomous mode of operation implementing the autonomous driving of the vehicle, based on a speed of traffic where the vehicle is located.
135. The vehicle of claim 118, wherein the driving-unrelated act of the occupant is working with the screen.
136. A vehicle comprising: a cabin including a driver's seat for an occupant of the vehicle and a user interface for interacting with the occupant; and a control system configured to autonomously drive the vehicle, monitor the occupant to determine a state of the occupant, alter a screen of the user interface to allow the occupant to use the screen for a driving-unrelated act of the occupant during autonomous driving of the vehicle, issue a warning to the occupant based on the state of the occupant during the autonomous driving of the vehicle, and alter the autonomous driving of the vehicle based on the state of the occupant.
137. A vehicle comprising: a cabin including a driver's seat for an occupant of the vehicle and a user interface for interacting with the occupant; and a control system configured to autonomously drive the vehicle, monitor the occupant to determine a state of the occupant, alter a screen of the user interface such that the screen becomes usable by the occupant for a driving-unrelated act of the occupant during autonomous driving of the vehicle after the screen was unusable by the occupant for the driving-unrelated act of the occupant during traveling of the vehicle, and alter the autonomous driving of the vehicle based on the state of the occupant.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US18/301,211 US20230251659A1 (en) | 2018-02-19 | 2023-04-15 | Systems and methods for autonomous vehicles |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201862631941P | 2018-02-19 | 2018-02-19 | |
US16/279,778 US11971714B2 (en) | 2018-02-19 | 2019-02-19 | Systems and methods for autonomous vehicles |
US18/301,211 US20230251659A1 (en) | 2018-02-19 | 2023-04-15 | Systems and methods for autonomous vehicles |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/279,778 Continuation US11971714B2 (en) | 2018-02-19 | 2019-02-19 | Systems and methods for autonomous vehicles |
Publications (1)
Publication Number | Publication Date |
---|---|
US20230251659A1 true US20230251659A1 (en) | 2023-08-10 |
Family
ID=67617882
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/301,211 Pending US20230251659A1 (en) | 2018-02-19 | 2023-04-15 | Systems and methods for autonomous vehicles |
Country Status (1)
Country | Link |
---|---|
US (1) | US20230251659A1 (en) |
- 2023-04-15: US application US18/301,211 filed (patent/US20230251659A1/en), status Pending
Also Published As
Publication number | Publication date |
---|---|
US20190258253A1 (en) | 2019-08-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP3470276B1 (en) | Vehicle control device and vehicle comprising the same | |
KR102387613B1 (en) | Interface system for vehicle | |
JP6773046B2 (en) | Driving support device, driving support method, and moving object | |
KR101891612B1 (en) | Autonomous vehicle | |
KR101895485B1 (en) | Drive assistance appratus and method for controlling the same | |
CN107867296B (en) | Vehicle control apparatus mounted on vehicle and method of controlling the vehicle | |
CN108121343B (en) | Autonomous driving vehicle | |
KR102368812B1 (en) | Method for vehicle driver assistance and Vehicle | |
KR101858694B1 (en) | Vehicle and control method for the same | |
US11318961B2 (en) | Robot for vehicle and control method thereof | |
KR101959300B1 (en) | Smart key for vehicle and system | |
CN107415938A (en) | Based on occupant position and notice control autonomous vehicle function and output | |
CN109204325A (en) | The method of the controller of vehicle and control vehicle that are installed on vehicle | |
US20160362080A1 (en) | Driver Assistance Apparatus For Vehicle And Vehicle | |
CN110001547B (en) | Input/output device and vehicle including the same | |
KR101823230B1 (en) | External modules and vehicles connected to the same | |
CN109835257A (en) | Display device and vehicle with the display device | |
CN109484343A (en) | The vehicle control apparatus and method for controlling a vehicle installed on vehicle | |
US11701984B2 (en) | Apparatus and method for controlling interior of vehicle | |
KR102135379B1 (en) | Robot for vehicle and control method of the robot | |
US20230251659A1 (en) | Systems and methods for autonomous vehicles | |
US11971714B2 (en) | Systems and methods for autonomous vehicles | |
KR101916426B1 (en) | Display Apparatus for Vehicle | |
JP2018181058A (en) | Automatic driving device | |
KR101916726B1 (en) | Vehicle control device mounted at vehicle and method for controlling the vehicle |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |