CN107628033B - Navigation based on occupant alertness - Google Patents
Navigation based on occupant alertness
- Publication number: CN107628033B
- Application number: CN201710548955.4A
- Authority
- CN
- China
- Prior art keywords: vehicle, accommodation, drowsiness, driver, response
- Legal status: Active (an assumption, not a legal conclusion; Google has not performed a legal analysis)
Classifications
- G05D3/00—Control of position or direction
- B60K28/066—Safety devices for propulsion-unit control, responsive to incapacity of the driver, actuating a signalling device
- G08B21/06—Alarms for ensuring the safety of persons, indicating a condition of sleep, e.g. anti-dozing alarms
- B60Q5/005—Arrangement or adaptation of acoustic signal devices, automatically actuated
- B60Q9/00—Arrangement or adaptation of signal devices not provided for in main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling
- G01C21/3605—Destination input or retrieval
- G01C21/362—Destination input or retrieval received from an external device or application, e.g. PDA, mobile phone or calendar application
- G01C21/3679—Retrieval, searching and output of POI information, e.g. hotels, restaurants, shops, filling stations, parking facilities
- G06Q10/02—Reservations, e.g. for tickets, services or events
- G06V20/597—Recognising the driver's state or behaviour, e.g. attention or drowsiness
- G08B21/0476—Cameras to detect unsafe condition, e.g. video cameras
Abstract
Systems and methods for occupant alertness-based navigation are disclosed. An example vehicle includes a camera and an occupant monitor. The camera is attached to a rear view mirror of the vehicle to detect drowsiness events associated with the driver. The occupant monitor provides feedback to the driver in response to the camera detecting a first drowsiness event. Further, the occupant monitor selects an accommodation near the geographic location of the vehicle in response to detecting a second drowsiness event subsequent to the first drowsiness event, and configures a navigation system to navigate to the selected accommodation.
Description
Technical Field
The present disclosure relates generally to vehicle occupant alertness detection, and more particularly to occupant alertness-based navigation.
Background
Driving a vehicle requires the driver to be alert to the environment surrounding the vehicle. When the driver is drowsy, the driver's reaction time slows and the driver may lose focus on the road. Thus, a drowsy driver may pose a danger to himself or herself, the occupants of the vehicle, and others in the vicinity, such as other drivers and pedestrians.
Disclosure of Invention
The appended claims define the application. This disclosure summarizes aspects of the embodiments and should not be used to limit the claims. Other embodiments will be apparent to those of ordinary skill in the art from consideration of the following drawings and detailed description, and are intended to be within the scope of the present application.
Exemplary embodiments of a vehicle with occupant alertness-based navigation are disclosed. An example vehicle includes a camera and an occupant monitor. The camera is attached to a rear view mirror of the vehicle to detect drowsiness events associated with the driver. The occupant monitor provides feedback to the driver in response to the camera detecting a first drowsiness event. Further, the occupant monitor selects an accommodation near the geographic location of the vehicle in response to detecting a second drowsiness event subsequent to the first drowsiness event, and sets the navigation system to navigate to the selected accommodation.
An exemplary method of mitigating drowsiness of a driver of a vehicle includes monitoring the driver using a camera attached to a rear view mirror to detect drowsiness events. The example method also includes providing feedback to the driver in response to detecting a first drowsiness event. Further, the example method includes selecting an accommodation near the geographic location of the vehicle in response to detecting a second drowsiness event subsequent to the first drowsiness event, and setting the navigation system to navigate to the selected accommodation.
An exemplary tangible computer readable medium includes instructions that, when executed, cause a vehicle to monitor a driver using a camera attached to a rear view mirror to detect a drowsiness event. The example instructions also cause the vehicle to provide feedback to the driver in response to detecting the first drowsiness event. Further, the example instructions cause the vehicle to select an accommodation near the geographic location of the vehicle in response to detecting a second drowsiness event subsequent to the first drowsiness event, and set the navigation system to navigate to the selected accommodation.
According to the present invention, there is provided a vehicle comprising:
a camera attached to the rear view mirror to detect a drowsiness event; and
an occupant monitor for:
providing feedback to the driver in response to detecting the first drowsiness event with the camera; and
in response to detecting a second drowsiness event after the first drowsiness event, performing:
selecting an accommodation near the geographic location of the vehicle; and
automatically setting the navigation system to navigate to the selected accommodation.
According to one embodiment of the invention, wherein the occupant monitor prompts the driver via the infotainment host unit to reserve a room at the selected accommodation in response to detecting a second drowsiness event subsequent to the first drowsiness event.
According to one embodiment of the invention, wherein the occupant monitor selects the accommodation based on at least one of a distance between the location of the vehicle and the location of the accommodation, a preference specified by the driver during the registration process, availability of rooms at the accommodation, or a price of rooms at the accommodation.
According to one embodiment of the invention, wherein the occupant monitor automatically reserves a room at the selected accommodation in response to detecting a third drowsiness event following the second drowsiness event.
According to an embodiment of the invention, wherein the feedback is at least one of an audio, visual or tactile alarm.
According to one embodiment of the invention, wherein the camera includes integrated facial feature recognition with infrared thermal imaging.
According to the present invention, there is provided a method of alleviating drowsiness of a driver of a vehicle, comprising:
monitoring the driver with a camera attached to the rear view mirror to detect drowsiness events; and
providing feedback to the driver via the processor in response to detecting the first drowsiness event; and
in response to detecting a second drowsiness event after the first drowsiness event, performing:
selecting an accommodation near the geographic location of the vehicle; and
automatically setting the navigation system to navigate to the selected accommodation.
According to one embodiment of the present invention, including in response to detecting a second drowsiness event after the first drowsiness event, prompting the driver via the infotainment host unit to book a room at the selected accommodation.
According to one embodiment of the invention, wherein the accommodation is selected according to at least one of a distance between the location of the vehicle and the location of the accommodation, a preference specified by the driver during the registration process, availability of rooms at the accommodation or a price of rooms at the accommodation.
According to one embodiment of the present invention, including automatically reserving, by the processor, a room at the selected accommodation in response to detecting a third drowsiness event subsequent to the second drowsiness event.
According to an embodiment of the invention, wherein the feedback is at least one of an audio, visual or tactile alarm.
According to one embodiment of the invention, wherein the camera includes integrated facial feature recognition with infrared thermal imaging.
According to the invention, there is provided a tangible computer readable medium containing instructions that, when executed, cause a vehicle to:
monitoring the driver with a camera attached to the rear view mirror to detect drowsiness events; and
providing feedback to the driver in response to detecting the first drowsiness event; and
in response to detecting a second drowsiness event after the first drowsiness event, performing:
selecting an accommodation near the geographic location of the vehicle; and
automatically setting the navigation system to navigate to the selected accommodation.
According to one embodiment of the invention, wherein the instructions, when executed, cause the vehicle to prompt, via the infotainment host unit, the driver to reserve a room at the selected accommodation in response to detecting a second drowsiness event subsequent to the first drowsiness event.
According to one embodiment of the invention, the accommodation is selected according to at least one of a distance between the location of the vehicle and the location of the accommodation, a preference specified by the driver during the registration process, an availability of the rooms at the accommodation or a price of the rooms at the accommodation.
According to one embodiment of the invention, wherein the instructions, when executed, cause the vehicle to automatically book a room at the selected accommodation in response to detecting a third drowsiness event after the second drowsiness event.
According to an embodiment of the invention, wherein the feedback is at least one of an audio, visual or tactile alarm.
According to one embodiment of the invention, wherein the camera includes integrated facial feature recognition with infrared thermal imaging.
According to one embodiment of the invention, the instructions include providing feedback to the driver after automatically reserving the room at the selected accommodation until the vehicle reaches the selected accommodation, wherein the feedback is at least one of an audio, visual, or tactile alert.
Drawings
For a better understanding of the invention, reference may be made to the embodiments illustrated in the following drawings. The components in the figures are not necessarily to scale, and related elements may be omitted, or in some cases the scale may be exaggerated, in order to emphasize and clearly illustrate the novel features described herein. Furthermore, the system components may be arranged differently as is known in the art. Moreover, in the drawings, like reference numerals designate corresponding parts throughout the several views.
FIG. 1 illustrates a vehicle operating in accordance with the teachings of the present disclosure;
FIG. 2 illustrates an interior view of the vehicle of FIG. 1;
FIG. 3 is a block diagram of electronic components of the vehicle of FIG. 1;
FIG. 4 is a flow diagram of a method of providing an alert for occupant alertness that may be implemented by the electronics of FIG. 3.
Detailed Description
While the present invention may be embodied in various forms, there is shown in the drawings and will hereinafter be described some exemplary and non-limiting embodiments with the understanding that the present disclosure is to be considered an exemplification of the invention and is not intended to limit the invention to the specific embodiments illustrated.
Driver fatigue is a serious problem that slows the driver's reaction time and reduces the driver's situational awareness. Various techniques may be employed to alleviate drowsiness of the driver. For example, mitigation techniques may include increasing the volume of the sound system, increasing the output of an air conditioning blower, and decreasing cabin temperature, among others. However, mitigation techniques may not remain effective for a significant period of time. As described below, the vehicle includes a camera for monitoring the driver. When the vehicle detects, via the camera, an indication of driver drowsiness (e.g., driver posture, driver eye movement, gaze location, etc.), the vehicle takes a gradually escalating series of actions. Initially, the vehicle activates audio, visual, and/or tactile warnings, and/or activates mitigation techniques (e.g., activates or increases the volume of the sound system, increases the air conditioning blower output, etc.). Upon a subsequent detection of the driver's signs of drowsiness, the vehicle (a) activates an audio, visual, and/or tactile warning and/or activates a mitigation technique, (b) communicates with an external server to locate a nearby accommodation (e.g., hotel, motel, lounge, etc.), and (c) instructs the navigation system to set the selected accommodation as the vehicle's destination. Upon a further subsequent detection of the driver's signs of drowsiness, the vehicle (a) communicates with the external server to book a room at the selected accommodation, and (b) activates an audio, visual, and/or tactile warning until the vehicle reaches the selected accommodation.
FIG. 1 shows a vehicle 100 operating in accordance with the teachings of the present disclosure. The vehicle 100 may be a standard gasoline powered vehicle, a hybrid vehicle, an electric vehicle, a fuel cell vehicle, and/or any other mobility-enabled type of vehicle. Vehicle 100 includes mobility-related components such as a powertrain with an engine, transmission, suspension, drive shafts, and/or wheels, among others. The vehicle 100 may be non-autonomous or semi-autonomous. In the example shown, the vehicle 100 includes an in-vehicle communication platform 102, an infotainment host unit 104, a driver camera 106, and an occupant monitor 108.
The in-vehicle communication platform 102 includes a wired or wireless network interface capable of communicating with the external network 110. The in-vehicle communication platform 102 also includes hardware (e.g., processor, memory, storage, antenna, etc.) and software for controlling the wired or wireless network interfaces. In some examples, the in-vehicle communication platform 102 includes a cellular modem, a dedicated short-range communication module, and/or a wireless local area network (WLAN) controller. Alternatively or additionally, in some examples, the in-vehicle communication platform 102 is communicatively coupled, by wire or wirelessly, to a mobile device (e.g., a smartphone, feature phone, smart watch, tablet, laptop, etc.) that is connected (e.g., via a cellular connection, etc.) to the external network 110. The cellular modem includes hardware and software for controlling communication over wide-area, standards-based networks operated by carriers (e.g., Global System for Mobile Communications (GSM), Universal Mobile Telecommunications System (UMTS), Long-Term Evolution (LTE), Code Division Multiple Access (CDMA), etc.). The WLAN controller includes hardware and software for communicating with networks based on wireless LAN standards, such as local area wireless networks (including IEEE 802.11 a/b/g/n/ac/p and others), WiMAX (Worldwide Interoperability for Microwave Access, IEEE 802.16m), and wireless gigabit (IEEE 802.11ad). In some examples, the in-vehicle communication platform includes a controller for a personal area network (e.g., Near Field Communication (NFC), Bluetooth, etc.). The in-vehicle communication platform 102 may also include a Global Positioning System (GPS) receiver for receiving the coordinates of the vehicle 100.
Alternatively, in some examples, the in-vehicle communication platform 102 receives coordinates from the mobile device (e.g., from a GPS receiver of the mobile device) when the vehicle 100 is connected to the mobile device. Further, the external network 110 may be a public network, such as the Internet, a private network, such as an intranet, or a combination thereof, and may utilize various network protocols now available or later developed, including but not limited to TCP/IP based network protocols.
The infotainment host unit 104 provides interaction between the vehicle 100 and users (e.g., driver, passengers, etc.). The infotainment host unit 104 includes digital and/or analog interfaces (e.g., input devices and output devices) for receiving input from a user and displaying information. The input devices may include, for example, control knobs, a dashboard, a digital camera for image capture and/or visual command recognition, a touch screen, an audio input device (such as a car microphone), buttons, or a touch pad. The output devices may include instrument cluster outputs (e.g., dials, lighting devices), actuators, heads-up displays, center stack displays (e.g., liquid crystal displays ("LCDs"), organic light emitting diode ("OLED") displays, flat panel displays, solid state displays, etc.), and/or speakers. In the illustrated example, the infotainment host unit 104 includes hardware (e.g., a processor or controller, memory, storage, etc.) and software (e.g., an operating system, etc.) for the infotainment system 112. In addition, the infotainment host unit 104 displays the infotainment system 112 on, for example, a center console display. In the illustrated example, the infotainment system 112 includes a navigation application that provides audio and visual guidance to guide the driver of the vehicle 100 to a destination when the destination is set.
The driver camera 106 monitors the driver 114 to detect when the driver 114 is drowsy. The driver camera 106 is mounted in front of a rear view mirror 116. In the illustrated example, the driver camera 106 includes integrated facial feature recognition with infrared thermal imaging. The driver camera 106 detects (a) the position of the head of the driver 114, (b) the state of the eyes of the driver 114 (e.g., open, partially open, or closed), and/or (c) the direction of the gaze of the driver 114.
The occupant monitor 108 monitors the driver 114 for signs of drowsiness. To detect the signs of drowsiness, the occupant monitor 108 detects, via the driver camera 106, (a) the position of the head of the driver 114, (b) the state of the eyes of the driver 114 (e.g., open, partially open, or closed), and/or (c) the direction of the gaze of the driver 114. The occupant monitor 108 determines that the driver 114 is drowsy when (i) the head of the driver 114 nods for a threshold period of time (e.g., three seconds, etc.), (ii) the head of the driver 114 nods and then makes a sharp sudden motion, (iii) the eyes of the driver 114 are closed for a threshold period of time, (iv) the time for the eyes of the driver 114 to transition from an open state to a closed state is greater than a threshold period of time (e.g., two seconds, etc.), or (v) the gaze of the driver 114 is at a threshold angle (e.g., 45 degrees below the horizon, etc.) for a threshold period of time (e.g., five seconds, etc.).
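As an illustrative sketch (not part of the patent disclosure), the drowsiness rules above can be expressed as a simple threshold check; the field names and default thresholds below are assumptions drawn from the examples given in the text:

```python
from dataclasses import dataclass


@dataclass
class DriverSample:
    """One observation of the driver from the camera (hypothetical fields)."""
    head_nod_seconds: float          # continuous time the head has been nodding
    eyes_closed_seconds: float       # continuous time the eyes have been closed
    eye_close_transition_s: float    # time taken to go from open to closed
    gaze_below_horizon_deg: float    # downward gaze angle
    gaze_held_seconds: float         # time the gaze has been held at that angle


def is_drowsiness_event(s: DriverSample,
                        nod_threshold_s: float = 3.0,
                        eyes_closed_threshold_s: float = 3.0,
                        transition_threshold_s: float = 2.0,
                        gaze_angle_deg: float = 45.0,
                        gaze_threshold_s: float = 5.0) -> bool:
    """Return True when any of the drowsiness signs listed above is present."""
    if s.head_nod_seconds >= nod_threshold_s:
        return True
    if s.eyes_closed_seconds >= eyes_closed_threshold_s:
        return True
    if s.eye_close_transition_s > transition_threshold_s:
        return True
    if (s.gaze_below_horizon_deg >= gaze_angle_deg
            and s.gaze_held_seconds >= gaze_threshold_s):
        return True
    return False
```

In practice each rule would be evaluated over a rolling window of camera frames; the sketch only captures the decision logic.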
When one or more signs of drowsiness are detected, the occupant monitor 108 reacts to relieve the drowsiness of the driver 114. To alleviate drowsiness of the driver 114, the occupant monitor 108 (a) activates an audio, visual, and/or tactile alert, (b) activates a mitigation technique, and/or (c) sets a destination in the navigation application to guide the driver 114. When one or more signs of drowsiness are detected after the initial reaction, the occupant monitor 108 escalates the reaction to relieve the drowsiness of the driver 114. For example, in response to detecting the first sign of drowsiness, the occupant monitor 108 may activate an audio, visual, and/or tactile alert and activate a mitigation technique. In such an example, in response to detecting the second sign of drowsiness, the occupant monitor 108 may increase the intensity and/or duration of the audio, visual, and/or tactile alert and connect to the external network 110 to determine a destination to which to guide the driver 114. In such an example, in response to detecting the third sign of drowsiness, the occupant monitor 108 may further increase the intensity and/or duration of the audio, visual, and/or tactile alert and book a room at the destination accommodation (e.g., hotel, motel, etc.).
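The escalating reaction described above can be sketched as a mapping from the count of detected drowsiness events to actions; the action names below are hypothetical placeholders, not actual vehicle interfaces:

```python
def respond_to_drowsiness(event_count: int) -> list:
    """Return the escalating actions for the Nth detected drowsiness event.

    The three-level escalation mirrors the example in the description:
    alert + mitigation, then accommodation lookup + navigation, then booking.
    """
    actions = []
    if event_count == 1:
        actions += ["activate_alert(intensity=1)",
                    "activate_mitigation()"]
    elif event_count == 2:
        actions += ["activate_alert(intensity=2)",
                    "request_accommodation_from_server()",
                    "set_navigation_destination()"]
    elif event_count >= 3:
        actions += ["activate_alert(intensity=3)",
                    "book_room_at_selected_accommodation()",
                    "alert_until_arrival()"]
    return actions
```

A real occupant monitor would dispatch these as calls into the HVAC, audio, and navigation subsystems rather than returning strings; the list form just makes the escalation policy explicit.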
The audio, visual, and/or tactile alert may include a buzzer or ring tone, a voice message, a warning displayed on the infotainment host unit 104, and/or a vibration of the steering wheel and/or the driver's seat. Mitigation techniques include adjusting settings of the heating, ventilation, and air conditioning (HVAC) system of the vehicle 100. For example, the occupant monitor 108 may decrease the temperature setting of the HVAC system so that the cabin of the vehicle 100 becomes cooler, and/or may increase the blower setting to increase air circulation within the cabin. In some examples, the mitigation techniques include activating a cooling system in the driver's seat of the vehicle 100. Additionally, in some examples, the occupant monitor 108 may activate and/or increase the volume of the sound system of the vehicle 100. In some such examples, in a vehicle 100 equipped with directional audio (such as speakers of the vehicle 100 configured to direct sound to particular seats), the occupant monitor 108 increases the volume of the sound reaching the driver's seat.
To select a destination, the occupant monitor 108 sends a destination request 118 to a destination server 120 in the external network 110 via the in-vehicle communication platform 102. The destination request 118 includes the current location of the vehicle 100. In some examples, the destination request 118 also includes a request to book a room at the located accommodation. In the illustrated example, the destination server 120 includes a destination database 122, which includes records that associate accommodations with geographic coordinates. Additionally, in some examples, the destination database 122 also includes availability data. The destination server 120 may include an application program interface (API) to facilitate the exchange of messages with the occupant monitor 108 of the vehicle 100. The destination server 120 and the destination database 122 are maintained by any suitable entity that provides locations of accommodations (e.g., a search engine provider, a travel service provider, a vehicle manufacturer, etc.). In response to receiving the destination request 118, the destination server 120 selects one of the accommodations in the destination database 122. The destination server 120 selects the accommodation based on (a) the distance between the location of the vehicle 100 and the location of the accommodation (e.g., with smaller distances having higher priority), (b) preferences specified by the driver during a registration process, (c) availability, and/or (d) price. In some examples, when requested by the destination request 118, the destination server 120 reserves a room at the selected accommodation based on information provided during the registration process (e.g., credit card, identifier, rewards program account number, etc.).
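The server-side selection might be sketched as follows; the record fields and the tie-breaking order are assumptions, since the text only enumerates the criteria (distance, preferences, availability, price):

```python
def select_accommodation(vehicle_pos, accommodations):
    """Pick the nearest available accommodation, breaking ties on price.

    vehicle_pos    -- (lat, lon) tuple for the vehicle
    accommodations -- list of dicts with hypothetical keys:
                      lat, lon, price, rooms_available
    Returns the chosen record, or None if nothing is available.
    """
    def distance(a):
        # Euclidean surrogate for geographic distance; a real server would
        # use great-circle distance or road-network travel time.
        return ((a["lat"] - vehicle_pos[0]) ** 2 +
                (a["lon"] - vehicle_pos[1]) ** 2) ** 0.5

    candidates = [a for a in accommodations if a["rooms_available"] > 0]
    if not candidates:
        return None
    # Smaller distance has higher priority; price breaks ties.
    return min(candidates, key=lambda a: (distance(a), a["price"]))
```

Driver preferences from the registration process could be applied as an additional filter on `candidates` before ranking.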
The destination server 120 sends a destination response 124 to the occupant monitor 108 with the coordinates of the selected accommodation, information about the selected accommodation (such as name, price, customer rating, etc.), and an indication of whether a room at the selected accommodation has been booked.
The occupant monitor 108 sets the destination of the navigation application of the infotainment system 112 to the coordinates provided in the destination response 124. In addition, the infotainment system 112 displays information from the destination response 124 regarding the selected accommodation. When the destination server 120 does not book a room at the selected accommodation, the infotainment system 112 prompts the driver 114 whether to book a room. If the driver 114 selects the prompt, the occupant monitor 108 sends a destination request 118 to the destination server 120 to request reservation of the room at the selected accommodation.
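The destination-response handling and driver-confirmed booking described above can be sketched as follows; the message field names are assumptions based on the described exchange, not a documented protocol:

```python
def handle_destination_response(response: dict, driver_accepts: bool):
    """Set the navigation destination, then request a booking if needed.

    response       -- hypothetical destination response 124 payload
    driver_accepts -- whether the driver selected the booking prompt
    Returns (navigation_destination, outgoing_request_or_None).
    """
    nav_destination = response["coordinates"]  # handed to the navigation app
    outgoing = None
    # Prompt for a reservation only when the server has not already booked.
    if not response.get("room_booked", False) and driver_accepts:
        outgoing = {"type": "destination_request",
                    "book_room": True,
                    "accommodation_id": response["accommodation_id"]}
    return nav_destination, outgoing
```

The returned request dict stands in for a second destination request 118 sent back to the destination server 120.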
FIG. 2 shows an interior view of the vehicle 100 of FIG. 1. In the example shown, the vehicle 100 includes a passenger camera 202. The passenger camera 202 is attached to the rear view mirror 116. The occupant monitor 108 monitors the passenger 204 for drowsiness via the passenger camera 202. When the occupant monitor 108 detects that the passenger 204 is drowsy, the occupant monitor 108 adjusts the directional audio system to a lower setting for the passenger seat. In some examples, when the HVAC system of the vehicle 100 supports independent climate zones, the occupant monitor 108 raises the temperature of the passenger's climate zone to raise the passenger's body temperature. In some examples, the passenger seat is also automatically reclined.
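The passenger-comfort branch described above can be sketched as follows; the setting names are placeholders rather than real vehicle interfaces:

```python
def comfort_drowsy_passenger(state: dict) -> dict:
    """Encourage a drowsy passenger's rest (mirror-image of driver handling).

    Lowers the passenger-seat audio level, warms the passenger climate zone
    when independent zones are supported, and reclines the passenger seat.
    Returns a new state dict; the input is left unmodified.
    """
    adjusted = dict(state)
    adjusted["passenger_audio_level"] = max(0, state["passenger_audio_level"] - 2)
    if state.get("independent_climate_zones"):
        adjusted["passenger_zone_temp_c"] = state["passenger_zone_temp_c"] + 2
    adjusted["passenger_seat_reclined"] = True
    return adjusted
```

Note the asymmetry with the driver case: for the passenger the monitor promotes sleep (quieter, warmer, reclined) instead of fighting it.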
Fig. 3 is a block diagram of the electronic components 300 of the vehicle 100 of fig. 1. In the illustrated example, the electronic components 300 include the in-vehicle communication platform 102, the infotainment host unit 104, an in-vehicle computing platform 302, sensors 304, electronic control units (ECUs) 306, a first vehicle data bus 308, and a second vehicle data bus 310.
The in-vehicle computing platform 302 includes a processor or controller 312 and memory 314. In some examples, the in-vehicle computing platform 302 is structured to include the occupant monitor 108. Alternatively, in some examples, the occupant monitor 108 may be incorporated into an ECU 306 with its own processor and memory. The processor or controller 312 may be any suitable processing device or set of processing devices, such as, but not limited to: a microprocessor, a microcontroller-based platform, suitable integrated circuitry, one or more field programmable gate arrays (FPGAs), and/or one or more application specific integrated circuits (ASICs). The memory 314 may be volatile memory (e.g., RAM, which may include non-volatile RAM, magnetic RAM, ferroelectric RAM, and any other suitable forms of RAM), non-volatile memory (e.g., disk memory, FLASH memory, EPROMs (erasable programmable read-only memory), EEPROMs (electrically erasable programmable read-only memory), memristor-based non-volatile solid-state memory, etc.), unalterable memory (e.g., EPROMs), read-only memory, and/or high-capacity storage devices (e.g., hard drives, solid-state drives, etc.). In some examples, the memory 314 includes multiple kinds of memory, particularly volatile memory and non-volatile memory.
The memory 314 is a computer-readable medium on which may be embedded one or more sets of instructions, such as software, for operating the methods of the present disclosure. The instructions may embody one or more of the methods or logic described herein. In particular embodiments, the instructions may reside, completely or at least partially, within any one or more of the memory 314, computer-readable medium, and/or processor 312 during execution thereof.
The terms "non-transitory computer-readable medium" and "computer-readable medium" should be taken to include a single medium or multiple media, such as a centralized or distributed database, and/or associated caches and servers that store one or more sets of instructions. The terms "non-transitory computer-readable medium" and "computer-readable medium" also include any tangible medium that is capable of storing, encoding or carrying a set of instructions for execution by a processor or that cause a system to perform any one or more of the methods or operations disclosed herein. As used herein, the term "computer-readable medium" is expressly defined to include any type of computer-readable storage device and/or storage disk and to exclude propagating signals.
The sensors 304 may be disposed in and around the vehicle 100 in any suitable manner. The sensors 304 may measure properties around the exterior of the vehicle 100. Additionally, some sensors 304 may be installed within the cabin of the vehicle 100 or in the body of the vehicle 100 (e.g., engine compartment, wheel well, etc.) to measure properties of the interior of the vehicle 100. For example, such sensors 304 may include accelerometers, odometers, tachometers, pitch and yaw sensors, wheel speed sensors, microphones, tire pressure sensors, biometric sensors, and the like. In the illustrated example, the sensors 304 include the driver camera 106 and a cabin temperature sensor. In some examples, the sensor 304 also includes the passenger camera 202 of fig. 2.
The ECUs 306 monitor and control subsystems of the vehicle 100. The ECUs 306 communicate and exchange information via the first vehicle data bus 308. Additionally, an ECU 306 may communicate properties (e.g., the status of the ECU 306, sensor readings, control state, error and diagnostic codes, etc.) to other ECUs 306 and/or receive requests from other ECUs 306. Some vehicles 100 may have seventy or more ECUs 306 located in various locations around the vehicle 100 communicatively coupled by the first vehicle data bus 308. Each ECU 306 is a discrete set of electronics that includes its own circuitry (e.g., integrated circuits, microprocessors, memory, storage, etc.) and firmware, sensors, actuators, and/or mounting hardware. In the illustrated example, the ECUs 306 include a body control unit and a steering control unit. The steering control unit includes a haptic feedback device to vibrate the steering wheel when the occupant monitor 108 detects that the driver 114 is drowsy.
The first vehicle data bus 308 communicatively couples the sensors 304, the ECUs 306, the in-vehicle computing platform 302, and other devices connected to the first vehicle data bus 308. In some examples, the first vehicle data bus 308 is implemented in accordance with the controller area network (CAN) bus protocol as defined by International Organization for Standardization (ISO) 11898-1. Alternatively, in some examples, the first vehicle data bus 308 may be a Media Oriented Systems Transport (MOST) bus or a CAN flexible data (CAN-FD) bus (ISO 11898-7). The second vehicle data bus 310 communicatively couples the in-vehicle communication platform 102, the infotainment host unit 104, and the in-vehicle computing platform 302. The second vehicle data bus 310 may be a MOST bus, a CAN-FD bus, or an Ethernet bus. In some examples, the in-vehicle computing platform 302 communicatively isolates the first vehicle data bus 308 and the second vehicle data bus 310 (e.g., via a firewall, a message broker, etc.). Alternatively, in some examples, the first vehicle data bus 308 and the second vehicle data bus 310 are the same data bus.
Fig. 4 is a flow diagram of a method to provide alerts based on occupant alertness that may be implemented by the electronic components 300 of fig. 3. Initially, at block 402, the occupant monitor 108 monitors the driver 114 via the driver camera 106. At block 404, the occupant monitor 108 determines whether there is an indication that the driver 114 is drowsy. For example, the occupant monitor 108 determines that the driver 114 is drowsy when (i) the head of the driver 114 nods for a threshold period of time (e.g., three seconds, etc.), (ii) the head of the driver 114 nods and then makes a sharp, abrupt motion, (iii) the eyes of the driver 114 are closed for a threshold period of time, (iv) the eyes of the driver 114 transition from an open state to a closed state for more than a threshold period of time (e.g., two seconds, etc.), or (v) the gaze of the driver 114 is at a threshold angle (e.g., 45 degrees below horizontal, etc.) for a threshold period of time (e.g., five seconds, etc.). If the occupant monitor 108 determines that the driver 114 is drowsy, the method continues to block 406. Otherwise, if the occupant monitor 108 does not determine that the driver 114 is drowsy, the method returns to block 402.
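Two of the heuristics above — sustained eye closure (conditions iii/iv) and a sustained downward gaze (condition v) — can be sketched over a window of camera frames. The frame fields (`eyes_closed`, `gaze_angle_deg`), the frame rate, and the exact thresholds are illustrative assumptions; a real driver-monitoring pipeline would derive them from the facial-feature recognition stage.

```python
def is_drowsy(frames: list, fps: int = 30) -> bool:
    """Flag drowsiness from sustained eye closure or a sustained low gaze.

    `frames` is a chronological list of per-frame observations, each a dict
    with 'eyes_closed' (bool) and 'gaze_angle_deg' (degrees below horizontal).
    """
    eye_close_threshold_s = 2.0     # eyes closed longer than this → drowsy
    gaze_angle_threshold_deg = 45.0  # gaze this far below horizontal...
    gaze_threshold_s = 5.0           # ...for this long → drowsy

    closed_run = 0  # consecutive frames with eyes closed
    gaze_run = 0    # consecutive frames with a low gaze
    for f in frames:
        closed_run = closed_run + 1 if f["eyes_closed"] else 0
        gaze_run = gaze_run + 1 if f["gaze_angle_deg"] >= gaze_angle_threshold_deg else 0
        if closed_run / fps >= eye_close_threshold_s:
            return True
        if gaze_run / fps >= gaze_threshold_s:
            return True
    return False
```

Tracking consecutive-frame run lengths (rather than totals) is what makes the check sensitive to a *sustained* condition, matching the "for a threshold period of time" wording.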
At block 406, the occupant monitor 108 determines whether this is a first indication that the driver 114 is drowsy. If so, the method continues to block 408. Otherwise, if not the first indication, the method continues to block 410. At block 408, the occupant monitor 108 provides an audio, visual, and/or tactile alert. Further, in some examples, the occupant monitor 108 activates mitigation techniques. At block 410, the occupant monitor 108 determines whether this is a second indication that the driver 114 is drowsy. If the indication is a second indication, the method continues to block 412. Otherwise, if not the second indication, the method continues to block 416.
At block 412, the occupant monitor 108 sends the destination request 118 to the destination server 120 to select an accommodation. At block 414, in response to receiving the destination response 124 with the coordinates of the selected accommodation, the occupant monitor 108 sets the navigation application of the infotainment system 112 to navigate to the coordinates received in the destination response 124.
At block 416, the occupant monitor 108 sends the destination request 118 to the destination server 120 to reserve a room at the accommodation selected at block 412. At block 418, the occupant monitor 108 provides an audio, visual, and/or tactile alert. Further, in some examples, the occupant monitor 108 may implement additional mitigation techniques. At block 420, the occupant monitor 108 determines whether the vehicle 100 is at the coordinates of the accommodation received at block 414. If the vehicle 100 is at the coordinates of the selected accommodation, the method ends. Otherwise, the method returns to block 418.
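The escalation across the blocks of fig. 4 — alert on the first indication, select and navigate on the second, reserve and keep alerting on the third and later — can be sketched as a counter-driven dispatcher. The function and action names here are illustrative, not identifiers from the disclosure.

```python
def handle_drowsiness_event(count: int, actions: list) -> int:
    """Escalating response to successive drowsiness indications.

    Appends the actions for this indication to `actions` and returns the
    updated indication count.
    """
    count += 1
    if count == 1:
        actions.append("alert")                  # block 408: audio/visual/haptic
    elif count == 2:
        actions.append("select_accommodation")   # block 412: destination request
        actions.append("set_navigation")         # block 414: navigate to it
    else:
        actions.append("reserve_room")           # block 416: book the room
        actions.append("alert")                  # block 418: alert until arrival
    return count
```

A usage loop feeding three successive indications produces the full escalation sequence, matching the first/second/third branching of blocks 406 and 410.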
The flowchart of fig. 4 represents machine readable instructions comprising one or more programs that, when executed by a processor (e.g., the processor 312 of fig. 3), cause the vehicle 100 to implement the occupant monitor 108 of fig. 1. Further, although the example program is described with reference to the flowchart of fig. 4, many other methods of implementing the example occupant monitor 108 may alternatively be used. For example, the order of execution of the blocks may be changed, and/or some of the blocks described may be changed, eliminated, or combined.
In this application, the use of the disjunctive is intended to include the conjunctive. The use of definite or indefinite articles is not intended to indicate cardinality. In particular, a reference to "the" object or "a" and "an" object is intended to denote also one of a possible plurality of such objects. Further, the conjunction "or" may be used to convey features that are simultaneously present rather than mutually exclusive alternatives. In other words, the conjunction "or" should be understood to include "and/or". The terms "includes," "including," and "include" are inclusive and have the same scope as "comprises," "comprising," and "comprise," respectively.
The above-described embodiments, particularly any "preferred" embodiments, are possible examples of implementations, and are presented merely for a clear understanding of the principles of the invention. Many variations and modifications may be made to the above-described embodiments without departing from the spirit and principles of the technology described herein. All such modifications are intended to be included within the scope of this disclosure and protected by the following claims.
Claims (15)
1. A vehicle, comprising:
a camera attached to the rear view mirror to detect a drowsiness event; and
an occupant monitor to:
providing feedback to a driver in response to detecting a first drowsiness event with the camera; and
in response to detecting a second drowsiness event subsequent to the first drowsiness event, performing:
selecting an accommodation proximate to the geographic location of the vehicle; and
setting up a navigation system to navigate to the selected accommodation;
reserving a room at the selected accommodation in response to detecting a third drowsiness event subsequent to the second drowsiness event.
2. The vehicle of claim 1, wherein the occupant monitor prompts the driver via an infotainment host unit to reserve a room at the selected accommodation in response to detecting the second drowsiness event after the first drowsiness event.
3. The vehicle of claim 1, wherein the occupant monitor selects the accommodation based on at least one of a distance between a location of the vehicle and a location of the accommodation, a preference specified by the driver during a registration process, availability of rooms at the accommodation, or a price of rooms at the accommodation.
4. The vehicle of claim 1, wherein the feedback is at least one of an audio, visual, or tactile alert.
5. The vehicle of claim 1, wherein the camera includes integrated facial feature recognition with infrared thermal imaging.
6. A method of mitigating drowsiness of a driver of a vehicle, comprising:
monitoring the driver by a camera attached to a rear view mirror to detect drowsiness events; and
providing feedback to the driver via a processor in response to detecting a first drowsiness event; and
in response to detecting a second drowsiness event subsequent to the first drowsiness event, performing:
selecting an accommodation proximate to the geographic location of the vehicle; and
setting up a navigation system to navigate to the selected accommodation;
reserving, by the processor, a room at the selected accommodation in response to detecting a third drowsiness event subsequent to the second drowsiness event.
7. The method according to claim 6, comprising prompting, via an infotainment host unit, the driver to reserve a room at the selected accommodation in response to detecting the second drowsiness event subsequent to the first drowsiness event.
8. The method of claim 6, wherein the accommodation is selected according to at least one of a distance between a location of the vehicle and a location of the accommodation, a preference specified by the driver during a registration process, availability of rooms at the accommodation, or a price of rooms at the accommodation.
9. The method of claim 6, wherein the feedback is at least one of an audio, visual, or tactile alert.
10. The method of claim 6, wherein the camera includes integrated facial feature recognition with infrared thermal imaging.
11. A tangible computer readable medium containing instructions that, when executed, cause a vehicle to:
monitoring the driver by a camera attached to the rear view mirror to detect drowsiness events; and
providing feedback to the driver in response to detecting a first drowsiness event; and
in response to detecting a second drowsiness event subsequent to the first drowsiness event, performing:
selecting an accommodation proximate to the geographic location of the vehicle; and
setting up a navigation system to navigate to the selected accommodation;
reserving a room at the selected accommodation in response to detecting a third drowsiness event subsequent to the second drowsiness event.
12. The computer readable medium of claim 11, wherein the instructions, when executed, cause the vehicle to prompt, via an infotainment host unit, the driver to reserve a room at the selected accommodation in response to detecting the second drowsiness event after the first drowsiness event.
13. The computer readable medium of claim 11, wherein the accommodation is selected according to at least one of a distance between a location of the vehicle and a location of the accommodation, preferences specified by the driver during a registration process, availability of rooms at the accommodation, or prices of rooms at the accommodation.
14. The computer-readable medium of claim 11, wherein the feedback is at least one of an audio, visual, or tactile alert.
15. The computer readable medium of claim 11, wherein the camera includes integrated facial feature recognition with infrared thermal imaging.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/209,344 | 2016-07-13 | ||
US15/209,344 US9937792B2 (en) | 2016-07-13 | 2016-07-13 | Occupant alertness-based navigation |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107628033A CN107628033A (en) | 2018-01-26 |
CN107628033B true CN107628033B (en) | 2022-04-19 |
Family
ID=59676793
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710548955.4A Active CN107628033B (en) | 2016-07-13 | 2017-07-07 | Navigation based on occupant alertness |
Country Status (6)
Country | Link |
---|---|
US (1) | US9937792B2 (en) |
CN (1) | CN107628033B (en) |
DE (1) | DE102017115317A1 (en) |
GB (1) | GB2553649A (en) |
MX (1) | MX2017009140A (en) |
RU (1) | RU2682956C2 (en) |
Families Citing this family (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102014219408A1 (en) * | 2014-09-25 | 2016-04-14 | Volkswagen Aktiengesellschaft | Method and device for setting a thermal comfort state |
US10262219B2 (en) * | 2016-04-21 | 2019-04-16 | Hyundai Motor Company | Apparatus and method to determine drowsiness of a driver |
US10549759B1 (en) * | 2017-01-19 | 2020-02-04 | State Farm Mutual Automobile Insurance Company | Apparatuses, systems and methods for improving operation of autonomous vehicles |
EP3560746B1 (en) * | 2018-04-24 | 2022-10-26 | Skoda Auto a.s. | Motor vehicle, monitoring device for a motor vehicle and corresponding method |
US10800433B2 (en) * | 2018-09-14 | 2020-10-13 | Honda Motor Co., Ltd. | Seat haptic system and method of equalizing haptic output |
JP7119984B2 (en) * | 2018-12-21 | 2022-08-17 | トヨタ自動車株式会社 | Driving support device, vehicle, information providing device, driving support system, and driving support method |
US10810966B1 (en) * | 2019-01-15 | 2020-10-20 | Ambarella International Lp | Fusion of electronic mirror systems and driver monitoring for increased convenience and added safety |
US11657694B2 (en) * | 2019-04-12 | 2023-05-23 | Stoneridge Electronics Ab | Mobile device usage monitoring for commercial vehicle fleet management |
US20200398700A1 (en) * | 2019-06-21 | 2020-12-24 | Lear Corporation | Seat system and method of control |
JP7331781B2 (en) * | 2020-05-28 | 2023-08-23 | トヨタ自動車株式会社 | Information processing device, information processing system, program, and vehicle |
JP7251524B2 (en) * | 2020-07-01 | 2023-04-04 | トヨタ自動車株式会社 | Drowsiness Sign Notification System, Drowsiness Sign Notification Method, and Drowsiness Sign Notification Program |
DE102021103806A1 (en) * | 2021-02-18 | 2022-08-18 | 4.screen GmbH | System and method for prioritizing navigation destinations of vehicles |
US20230022436A1 (en) * | 2021-07-21 | 2023-01-26 | Indiev, Inc | Forward facing rear vehicle video system |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TW201209764A (en) * | 2010-08-23 | 2012-03-01 | Univ Ishou | Vehicle anti-doze system and method thereof |
CN105313898A (en) * | 2014-07-23 | 2016-02-10 | 现代摩比斯株式会社 | Apparatus and method for detecting driver status |
Family Cites Families (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7027621B1 (en) | 2001-03-15 | 2006-04-11 | Mikos, Ltd. | Method and apparatus for operator condition monitoring and assessment |
DE102004006910A1 (en) * | 2004-02-12 | 2005-08-25 | Bayerische Motoren Werke Ag | Vehicle control procedure senses driver and passenger health using contactless biosensors and uses vehicle environment control equipment to improve situation |
US7301464B2 (en) * | 2005-05-24 | 2007-11-27 | Electronic Data Systems Corporation | Process and method for safer vehicle navigation through facial gesture recognition and operator condition monitoring |
SE535029C2 (en) * | 2010-02-08 | 2012-03-20 | Scania Cv Ab | Driver-specific vehicle configuration system and method |
US20120078509A1 (en) * | 2010-09-27 | 2012-03-29 | Honda Motor Co., Ltd | Intelligent Navigation For A Motor Vehicle |
US8698639B2 (en) * | 2011-02-18 | 2014-04-15 | Honda Motor Co., Ltd. | System and method for responding to driver behavior |
US9493130B2 (en) * | 2011-04-22 | 2016-11-15 | Angel A. Penilla | Methods and systems for communicating content to connected vehicle users based detected tone/mood in voice input |
DE102011109564A1 (en) * | 2011-08-05 | 2013-02-07 | Daimler Ag | Method and device for monitoring at least one vehicle occupant and method for operating at least one assistance device |
WO2014001928A2 (en) * | 2012-06-27 | 2014-01-03 | Koninklijke Philips N.V. | System and method for enhancing alertness. |
JP5895798B2 (en) * | 2012-10-04 | 2016-03-30 | 株式会社デンソー | Driving support device and driving support method |
US20150379362A1 (en) * | 2013-02-21 | 2015-12-31 | Iee International Electronics & Engineering S.A. | Imaging device based occupant monitoring system supporting multiple functions |
US20140240132A1 (en) * | 2013-02-28 | 2014-08-28 | Exmovere Wireless LLC | Method and apparatus for determining vehicle operator performance |
DE102013021928A1 (en) | 2013-12-20 | 2015-06-25 | Audi Ag | Comfort device control for a motor vehicle |
- 2016
  - 2016-07-13 US US15/209,344 patent/US9937792B2/en active Active
- 2017
  - 2017-07-07 CN CN201710548955.4A patent/CN107628033B/en active Active
  - 2017-07-07 DE DE102017115317.9A patent/DE102017115317A1/en active Pending
  - 2017-07-10 GB GB1711082.6A patent/GB2553649A/en not_active Withdrawn
  - 2017-07-10 RU RU2017124232A patent/RU2682956C2/en active
  - 2017-07-12 MX MX2017009140A patent/MX2017009140A/en unknown
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TW201209764A (en) * | 2010-08-23 | 2012-03-01 | Univ Ishou | Vehicle anti-doze system and method thereof |
CN105313898A (en) * | 2014-07-23 | 2016-02-10 | 现代摩比斯株式会社 | Apparatus and method for detecting driver status |
Also Published As
Publication number | Publication date |
---|---|
CN107628033A (en) | 2018-01-26 |
GB2553649A (en) | 2018-03-14 |
US20180015825A1 (en) | 2018-01-18 |
GB201711082D0 (en) | 2017-08-23 |
RU2017124232A (en) | 2019-01-10 |
US9937792B2 (en) | 2018-04-10 |
RU2682956C2 (en) | 2019-03-22 |
DE102017115317A1 (en) | 2018-01-18 |
RU2017124232A3 (en) | 2019-01-10 |
MX2017009140A (en) | 2018-01-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107628033B (en) | Navigation based on occupant alertness | |
US10317900B2 (en) | Controlling autonomous-vehicle functions and output based on occupant position and attention | |
US9977593B2 (en) | Gesture recognition for on-board display | |
CN108281069B (en) | Driver interaction system for semi-autonomous mode of vehicle | |
US10298722B2 (en) | Apparatus and method for adjusting driving position of driver | |
US20170349184A1 (en) | Speech-based group interactions in autonomous vehicles | |
US10358130B2 (en) | System and methods for adaptive cruise control based on user defined parameters | |
US20170286785A1 (en) | Interactive display based on interpreting driver actions | |
US9154923B2 (en) | Systems and methods for vehicle-based mobile device screen projection | |
US20180018179A1 (en) | Intelligent pre-boot and setup of vehicle systems | |
US20130154298A1 (en) | Configurable hardware unit for car systems | |
US20200018976A1 (en) | Passenger heads-up displays for vehicles | |
US20200339133A1 (en) | Driver distraction determination | |
KR101927170B1 (en) | System and method for vehicular and mobile communication device connectivity | |
US10666901B1 (en) | System for soothing an occupant in a vehicle | |
US10369943B2 (en) | In-vehicle infotainment control systems and methods | |
CN109562740B (en) | Fingerprint apparatus and method for remotely accessing personal functional profile of vehicle | |
CN106945671B (en) | Vehicle cruise control with multiple set points | |
CN112513708B (en) | Apparatus and method for use with a vehicle |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||