US20200099547A1 - Systems and methods for improved vehicular safety - Google Patents
- Publication number
- US20200099547A1 (U.S. application Ser. No. 16/577,987)
- Authority
- US
- United States
- Prior art keywords
- infotainment device
- processor
- infotainment
- vehicle
- driver
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L12/00—Data switching networks
- H04L12/28—Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
- H04L12/40—Bus networks
- H04L12/40006—Architecture of a communication node
- H04L12/40019—Details regarding a bus master
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/10—Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/60—Instruments characterised by their location or relative disposition in or on vehicles
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/052—Detecting movement of traffic to be counted or controlled with provision for determining speed or overspeed
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/161—Decentralised systems, e.g. inter-vehicle communication
- G08G1/163—Decentralised systems, e.g. inter-vehicle communication involving continuous checking
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/166—Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/30—Services specially adapted for particular environments, situations or purposes
- H04W4/40—Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P]
- H04W4/46—Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P] for vehicle-to-vehicle communication [V2V]
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/1523—Matrix displays
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/77—Instrument locations other than the dashboard
- B60K2360/785—Instrument locations other than the dashboard on or in relation to the windshield or windows
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/20—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
- B60K35/21—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using visual output, e.g. blinking lights or matrix displays
- B60K35/22—Display screens
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/20—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
- B60K35/21—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using visual output, e.g. blinking lights or matrix displays
- B60K35/23—Head-up displays [HUD]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L12/00—Data switching networks
- H04L12/28—Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
- H04L12/40—Bus networks
- H04L2012/40208—Bus networks characterized by the use of a particular bus standard
- H04L2012/40215—Controller Area Network CAN
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L12/00—Data switching networks
- H04L12/28—Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
- H04L12/40—Bus networks
- H04L2012/40267—Bus for use in transportation systems
- H04L2012/40273—Bus for use in transportation systems the transportation system being a vehicle
Definitions
- the present disclosure generally relates to safety systems for vehicles; and more particularly to a plurality of vehicle safety features including e.g., an enhanced infotainment device that improves upon current vehicular safety systems and reduces driver distraction.
- Distraction generally can be defined as the ratio between cognitive load exerted by external or internal factors needing attention, and cognitive ability represented by the person's ability to pay attention to and react to events.
- Driver distraction is a major cause of death in the United States and is the primary cause of fatalities among teenagers.
- the driving process is inherently distractive, as drivers need to assess the rear and side positions of the vehicle, view gauges and other features of the dashboard, and engage with the gas pedal and brake, all while maintaining attention to the road in front of them. Further, different driving situations impose different levels of cognitive load and hence require different levels of a driver's attention.
- FIG. 1 is a simplified block diagram of one embodiment of a vehicular safety system with an improved infotainment device configured to reduce driver distraction, according to aspects of the present inventive concept.
- FIG. 2 is a block diagram illustrating a possible process flow for configuring an infotainment device to reduce driver distraction, according to aspects of the present inventive concept.
- FIG. 3 is a simplified block diagram of another embodiment of a vehicular safety system configured for managing functionality associated with multiple infotainment devices, according to aspects of the present inventive concept.
- FIG. 4 is a simplified block diagram depicting an exemplary computing device that may implement various services, systems, and methods discussed herein.
- the present disclosure relates to systems and methods for reducing driver distraction and for generally improving vehicular safety.
- the present inventive concept may take the form of an improved infotainment device configured with features suitable for minimizing the driver's interaction time with an infotainment device, including the implementation of forced delay or temporary disabling of infotainment system features during adverse driving conditions, or delegation of such features to other devices associated with a passenger.
- a vehicular safety system designated system 100 , is shown for optimizing infotainment response time to reduce driver cognitive load and minimize distraction.
- the system 100 may generally include a vehicle 102 equipped with an infotainment device 104 .
- the infotainment device 104 may define any in-car entertainment device or plurality of devices, or in-vehicle infotainment and may include any entertainment and information functionality provided with the vehicle 102 and integrated within the dashboard or otherwise.
- the infotainment device 104 may include a collection of hardware and software components providing audio and visual information associated with radio, navigation, phone functionality for managing data associated with calls and texts, and the like.
- the infotainment device 104 generally includes a display 106 depicting an interface 108 (such as a graphical user interface) and a driver may interact with the interface 108 using one or more input controls 110 such as dials, keypads, or other such controls or by touching the display 106 in the case where the display 106 includes touch-screen technology.
- the infotainment device 104 may include or otherwise be in operable communication with a memory 111 or storage device, (at least one of) a processor 112 or electronic control unit (ECU), configured to execute a vehicle operating system (operating system 113 ) stored in the memory 111 for issuing instructions and managing the infotainment device 104 including its software and hardware peripherals.
- the processor 112 may further execute an application 114 defining a predictive model 116 stored in the memory 111 .
- the application 114 and the predictive model 116 are configured to provide functionality that reduces driver distraction by assisting to at least temporarily restrict the driver's ability to interact with the infotainment device 104 for more than a predetermined amount of time or for more than a predetermined number of steps, as further described herein.
- the driver should view that text message in six steps or less, such steps including, e.g., a first step of powering up a screen along a dashboard, a second step of navigating to a phone tab, etc.
- the infotainment system 104 is in communication with a controller area network bus (CAN BUS or any other vehicular data bus) 118 and a plurality of subsystems 120 defined along the vehicle 102.
- the CAN BUS 118 allows sensors and microcontrollers of the different subsystems 120 A and 120 B to communicate with each other, and the subsystems 120 include sensors, microcontrollers, or mechanical components specific to certain portions or functions of the vehicle 102 .
- the CAN BUS 118 may further communicably couple together one or more electronic control units (ECUs) (not shown) positioned along the vehicle 102 and implemented as part of any of the subsystems 120 or otherwise.
- a first subsystem 120 A may be used for the transmission of the vehicle 102
- a second subsystem 120 B may be used for operations of the doors
- the CAN BUS 118 may accommodate data sharing or interconnection between these subsystems, or separate control actuators, or sensors (not shown).
- the application 114 described herein may be used to configure the infotainment device 104 to modify, restrict, or temporarily disable certain functions and settings of the infotainment device 104 to increase the driver's attention to the road as opposed to the driver interacting with the infotainment device 104 longer than desired.
- the application 114 comprises a plug-in, add-on, extension, or update to the vehicle operating system 113 to execute the functionality associated with restricting the driver's interaction with the infotainment device 104 as described herein.
- the application 114 may merely be loaded into the memory 111 by a wired or wireless connection and may be logically layered over the operating system 113 as a separate application. In these embodiments, the application 114 issues requests or instructions to the operating system 113 or any application managing functionality associated with the infotainment device 104 to temporarily restrict such functionality in the manner described herein.
- the application 114 is configured to initially extract or access input information 220 about driving conditions (such as the use of gas and brake pedals, the acceleration or deceleration information generated by the car, the presence and rate of lane change, swerving and the driving style/type of the driver, various steering wheel movements, the car state of motion and speed as well as geolocation proximate to an intersection or traffic lights, and the like), as shown in block 202 .
- the application 114 may acquire such input information 220 by integrating with or accessing data from the subsystems 120 and/or from the event handler 122 , which may already be implemented by the operating system 113 of the vehicle 102 or separately implemented.
- An event handler for a general computing device may include any software routine that monitors or processes events or actions such as keystrokes, or engagement with input controls.
- Many components or features of modern vehicles are generally electrically connected, or are otherwise in electrical communication with a central processing unit, such as the processor 112 .
- the event handler 122 of the vehicle 102 (or any vehicle) similar to an event handler for a general computing device, monitors, records, or processes various events about such components of the vehicle as they occur, and the operating system 113 may manage and make use of such data.
- sensors 124 A and 124 B implemented along the drive train (not shown) of the vehicle 102 may be in operable electrical communication with the processor 112 , and may generate information about the motion of the vehicle 102 including when the vehicle 102 stops or accelerates. This information may be managed and recorded by the event handler 122 .
- the input information 220 accessed from the event handler 122 may further include information about the driver's interactions/events associated with the infotainment device 104 .
- an event may include actuation of the radio or the input controls 110 of the infotainment device 104 by the driver depressing a button along the dashboard of the vehicle 102 .
- the input information 220 may further include, or the application 114 may also be fed LIDAR information and on-board forward and vehicle camera information as available.
- the predictive model 116 may identify input parameters such as speed, acceleration, and other variables or features from the input information 220 , which may be used to determine a corresponding driving conditions classification 222 or class associated with a suggested level of attention suitable for the driver, given the driving conditions (e.g., low, moderate, or high, on a scale or otherwise).
- the predictive model 116 may determine that the vehicle 102 is operating within city traffic by identifying that the vehicle 102 has made a predetermined number of stops within a predefined period of time, and is averaging a speed of 30 mph, such that the predictive model 116 outputs a driving conditions classification 222 corresponding to operation of the vehicle 102 within city traffic.
- the predictive model 116 may output a different driving conditions classification 222 associated with operation of the vehicle 102 during highway traffic (with less stops and starts, and increased average speeds). Any such examples of classifications are contemplated.
- the driving conditions classification 222 may be associated with a scale from e.g., 1-10 with 10 reflecting adverse driving conditions requiring less driver distraction, and with 1 reflecting more favorable driving conditions where some driver distraction may be permissible.
- the driving conditions classification 222 may then be used to impose temporary restrictions upon the infotainment system 104 .
- the particular value of the driving conditions classification 222 may correspond with a maximum interaction time 224 that the application 114 is programmed to permit the driver to interact with the infotainment system 104 , or a maximum number of steps 226 the driver is permitted to take while interacting with the infotainment system 104 while the driving conditions remain consistent with the driving conditions classification 222 .
- the application 114 may be programmed to configure the infotainment system 104 (by software calls or otherwise) to temporarily: selectively display only limited graphical features along the interface 108 along the display 106 , terminate any number of graphical features or functions, deactivate audio transmission associated with the infotainment system 104 , completely disable the infotainment system 104 , disable certain ones of the input controls 110 , and the like.
- the application 114 may be preprogrammed with a maximum number of steps 226 of six to coincide with the NHTSA guidelines described herein. In this manner, the application 114 may, in conjunction with the processor 112 , issue calls or commands to the infotainment system 104 to disable certain functionality for a predetermined period of time where the event handler 122 and the application 114 collectively identify that the driver has caused six events to be recognized by the event handler 122 while the driver is interacting with the infotainment system 104 (e.g., while checking email) during adverse driving conditions or otherwise.
- the aforementioned inventive concept may reduce driver distraction, save lives, and reduce cost overall.
- Most modern vehicular safety technologies focus on three main considerations: (1) static responses to driving situations, regardless of the varying needs of the driver's attention, (2) adding more software features to the infotainment device, more connectivity, and larger screens, and (3) adding more processing power and faster response times. Research shows that such approaches to infotainment devices may exacerbate issues with driver distraction as opposed to reducing them.
- the application 114 may continuously and dynamically evaluate the driving conditions and configure interaction with the infotainment system 104 according to the driving conditions and any predefined infotainment interaction limitations.
- the application 114 may be programmed in any programming language or programming framework (C, C++, Java, Python, Matlab . . . , or the like).
- the application 114 may be programmed with a specialized class object (not shown) configured to interpret data from the event handler 122 associated with driving conditions and driver interaction with the infotainment system 104 .
- This class object may be implemented with a timer and may place information about events from the event handler 122 into a queue.
- the queue may define a length or amount, e.g., five seconds so that the queue stores certain events that have occurred within the last five seconds.
- this possible programming structure may include aspects of a sliding window, or implementation thereof.
- the application 114 may utilize this class object to determine whether a queue includes information about more than a desired number of events (maximum number of steps 226 ) having occurred during a particular time period associated with the queue.
- a sliding window may be implemented to count the events that have occurred in the last five seconds (reflected by the queue), and an alert may be generated if such events exceed a predetermined threshold (e.g., six) to temporarily disable functionality of the infotainment device 104 .
- system 300 for distributing or delegating infotainment interaction from the driver to one or more passengers in order to reduce overall driver distraction.
- the system 300 may generally include a vehicle 302 equipped with an infotainment device, designated first infotainment device 304 .
- the first infotainment device 304 may define any in-car entertainment device or plurality of devices, or in-vehicle infotainment and may include any entertainment and information functionality provided with the vehicle 302 and integrated within the dashboard or otherwise.
- the first infotainment device 304 may include a collection of hardware and software components providing audio and visual information associated with radio, navigation, phone functionality for managing data associated with calls and texts, and the like.
- the first infotainment device 304 generally includes a display 306 depicting an interface 308 (such as a graphical user interface) and a driver may interact with the interface 308 using one or more input controls 310 such as dials, keypads, or other such controls or by touching the display 306 in the event where the display 306 includes touch-screen technology.
- the first infotainment device 304 is generally designated for and accessible by the driver.
- the first infotainment device 304 may include or otherwise be in operable communication with a memory 311 or storage device, (at least one of) a processor 312 or ECU, configured to execute a vehicle operating system (operating system 313 ) stored in the memory 311 for issuing instructions and managing the first infotainment device 304 including its software and hardware peripherals.
- the processor 312 may further execute an application 314 as further described herein.
- the first infotainment device 304 may also be in communication with a controller area network bus (CAN BUS) 318 and a plurality of subsystems 320 defined along the vehicle 302.
- the subsystems 320 A and 320 B include sensors, microcontrollers, or mechanical components specific to certain portions or functions of the vehicle 302 .
- the system 300 includes a second infotainment device 334 designated for a passenger of the vehicle 302 .
- the second infotainment device 334 may include all (or at least some) of the features of the first infotainment device 304 including input controls, a display, and mechanical and electrical components (not shown) that enable a passenger to interact with the second infotainment device 334 and modify settings of the vehicle 302 .
- the passenger may interact with the second infotainment device 334 to modify settings of the vehicle 302 or the first infotainment device 304 (e.g., changing the radio station from the second infotainment device 334 ).
- the second infotainment device 334 may merely be a related component of the first infotainment system 304 , or an extended interface of the same.
- the system 300 may include one or more of a passenger sensor 340 , which may be integrated along a passenger seat and define a sensor or other electrical component sensitive to changes in weight, or the like.
- a passenger sensor 340 may provide information to the application 314 via the CAN BUS 318 (or any other data bus) about whether a passenger or other occupant is occupying the vehicle in addition to the driver.
- the passenger sensor 340 may include a weight sensor with a load cell that creates an electric signal proportional to forces/weight applied to the passenger sensor 340 by an occupant that can be interpreted and utilized by the application 314 as further described herein.
- the application 314 assists to manage functionality between the first infotainment device 304 and the second infotainment device 334 as described herein.
- the passenger sensor 340 may be implemented to sense when passengers are in the vehicle 302 with the driver. If a passenger is detected, this information may be accessed by the application 314 , which may then in turn make settings modifications to the first infotainment device 304 and/or the second infotainment device 334 .
- the application 314 may limit phone functionality and/or disable phone functions associated with the first infotainment device 304 when a passenger is detected, and delegate or distribute this functionality to the second infotainment device 334 so that the passenger (who is not operating the vehicle 302 ) can safely utilize phone functions as needed (versus the driver operating the vehicle 302 ).
- the application 314 may be programmed to display or announce a notification (i.e., via a screen or a speaker) and request that the driver select a type of person or persona for each passenger.
- Personas may define certain permission levels and a list of limited features or functions of the second infotainment device 334 corresponding to each persona. Most personas may duplicate features available to the first infotainment device 304 and the second infotainment device 334 , however, some may be more limiting.
- the driver may, e.g., limit a passenger's ability to watch a film or change the radio station using the second infotainment device 334 by selecting a “child” persona.
- the memory 311 may store profiles associated with specific passengers that occupy the vehicle 302 ; each profile corresponding to a particular weight identified by the passenger sensor 340 or otherwise.
- the application 314 is configured to dynamically distribute functionality between the first infotainment device 304 and the second infotainment device 334 as desired, and in some embodiments functionality available to the driver via the first infotainment device 304 may be restricted or disabled where a passenger is detected in the vehicle 302 .
- the application 314 comprises a plug-in, add-on, extension, or update to the vehicle operating system 313 to execute the functionality described herein.
- the application 314 may merely be loaded into the memory 311 by a wired or wireless connection and may be logically layered over the operating system 313 as a separate application.
- FIG. 3 is responsive to current technical problems.
- conventional infotainment technologies place undue emphasis upon static availability of features regardless of the varying needs and number of the occupants in a vehicle.
- Such conventional technologies may also implement an excess amount of software features in an attempt to reduce driver distraction.
- such technologies generally give full access and control of the vehicle's infotainment devices and overall infotainment system to the driver as a super user (a metaphor deriving from PCs). Research shows that even if we achieve perfection in technologies such as speech recognition, distraction will eventually get worse and not improve in this regard.
- the system 300 implements the application 314 with the first infotainment device 304 and the second infotainment device 334 as described to dynamically and continuously evaluate the car's occupants and distribute/divide functionality as desired, and may accommodate the selective display of different features on each of the infotainment devices.
- the application 314 may further allow the driver to control the roles and feature access of the second infotainment device 334 available to the occupants.
- FIG. 4 is an example schematic diagram of a computing device 700 that may implement various methodologies discussed herein.
- the computing device 700 may comprise any number or form of computing device used to execute the application 114 or aspects of the system 100 described herein.
- the computing device 700 includes a bus 701 (i.e., interconnect), at least one processor 702 or other computing element, at least one communication port 703 , a main memory 704 , a removable storage media 705 , a read-only memory 706 , and a mass storage device 707 .
- Processor(s) 702 can be any known processor, such as, but not limited to, an Intel® Itanium® or Itanium 2® processor(s), AMD® Opteron® or Athlon MP® processor(s), or Motorola® lines of processors.
- Communication port 703 can be any of an RS-232 port for use with a modem based dial-up connection, a 10/100 Ethernet port, a Gigabit port using copper or fiber, or a USB port.
- Communication port(s) 703 may be chosen depending on a network such as a Local Area Network (LAN), a Wide Area Network (WAN), or any network to which the computing device 700 connects.
- The computing system may further include a transport and/or transit network 755, a display screen 760, an I/O port 740, and an input device 745 such as a mouse or keyboard.
- Main memory 704 can be Random Access Memory (RAM) or any other dynamic storage device(s) commonly known in the art.
- Read-only memory 706 can be any static storage device(s) such as Programmable Read-Only Memory (PROM) chips for storing static information such as instructions for processor 702 .
- Mass storage device 707 can be used to store information and instructions.
- hard disks such as the Adaptec® family of Small Computer Serial Interface (SCSI) drives, an optical disc, an array of disks such as Redundant Array of Independent Disks (RAID), such as the Adaptec® family of RAID drives, or any other mass storage devices, may be used.
- Bus 701 communicatively couples processor(s) 702 with the other memory, storage, and communications blocks.
- Bus 701 can be a PCI/PCI-X, SCSI, or Universal Serial Bus (USB) based system bus (or other) depending on the storage devices used.
- Removable storage media 705 can be any kind of external hard drives, thumb drives, Compact Disc-Read Only Memory (CD-ROM), Compact Disc-Re-Writable (CD-RW), Digital Video Disk-Read Only Memory (DVD-ROM), etc.
- Embodiments herein may be provided as a computer program product, which may include a machine-readable medium having stored thereon instructions which may be used to program a computer (or other electronic devices) to perform a process.
- the machine-readable medium may include, but is not limited to optical discs, CD-ROMs, magneto-optical disks, ROMs, RAMs, erasable programmable read-only memories (EPROMs), electrically erasable programmable read-only memories (EEPROMs), magnetic or optical cards, flash memory, or other type of media/machine-readable medium suitable for storing electronic instructions.
- embodiments herein may also be downloaded as a computer program product, wherein the program may be transferred from a remote computer to a requesting computer by way of data signals embodied in a carrier wave or other propagation medium via a communication link (e.g., modem or network connection).
- main memory 704 is encoded with applications that support functionality as discussed herein. At least a portion of these applications (and/or other resources as described herein) can be embodied as software code such as data and/or logic instructions (e.g., code stored in the memory or on another computer readable medium such as a disk) that supports processing functionality according to different embodiments described herein.
- processor(s) 702 accesses main memory 704 via the use of bus 701 in order to launch, run, execute, interpret, or otherwise perform processes, such as through logic instructions, executing on the processor 702 and associated software modules stored in main memory or otherwise tangibly stored.
- the described disclosure may be provided as a computer program product, or software, that may include a machine-readable medium having stored thereon instructions, which may be used to program a computer system (or other electronic devices) to perform a process according to the present disclosure.
- a machine-readable medium includes any mechanism for storing information in a form (e.g., software, processing application) readable by a machine (e.g., a computer).
- the machine-readable medium may include, but is not limited to optical storage medium (e.g., CD-ROM); magneto-optical storage medium, read only memory (ROM); random access memory (RAM); erasable programmable memory (e.g., EPROM and EEPROM); flash memory; or other types of medium suitable for storing electronic instructions.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Chemical & Material Sciences (AREA)
- Combustion & Propulsion (AREA)
- Transportation (AREA)
- Mechanical Engineering (AREA)
- Computer Networks & Wireless Communication (AREA)
- Signal Processing (AREA)
- User Interface Of Digital Computer (AREA)
- Traffic Control Systems (AREA)
Abstract
Description
- This is a non-provisional application that claims benefit to U.S. provisional patent application Ser. No. 62/733,986 filed on Sep. 20, 2018, which is herein incorporated by reference in its entirety.
- The present disclosure generally relates to safety systems for vehicles; and more particularly to a plurality of vehicle safety features including e.g., an enhanced infotainment device that improves upon current vehicular safety systems and reduces driver distraction.
- Distraction generally can be defined as the ratio between cognitive load exerted by external or internal factors needing attention, and cognitive ability represented by the person's ability to pay attention to and react to events. Driver distraction is a major cause of death in the United States and is the primary cause of fatalities among teenagers. The driving process is inherently distractive, as drivers need to assess the rear and side positions of the vehicle, view gauges and other features of the dashboard, and engage with the gas pedal and brake, all while maintaining attention to the road in front of them. Further, different driving situations impose different levels of cognitive load and hence require different levels of a driver's attention.
- However, modern cars are manufactured with a growing number of distracting technologies on board, which exacerbates issues related to driver distraction. For example, current infotainment systems and devices and/or in-dash screens, which may be connected to the driver's cell phone, may display a variety of distracting symbols, notifications, or other such graphical software features. In addition, different states have different laws regarding distracted driving (e.g., texting and use of a cell phone while driving), which may be difficult if not impossible to enforce, and interaction with a large digital screen in a vehicle's dashboard (such as an infotainment system) is rarely prohibited or restricted.
- It is with these observations in mind, among others, that various aspects of the present disclosure were conceived and developed.
-
FIG. 1 is a simplified block diagram of one embodiment of a vehicular safety system with an improved infotainment device configured to reduce driver distraction, according to aspects of the present inventive concept. -
FIG. 2 is a block diagram illustrating a possible process flow for configuring an infotainment device to reduce driver distraction, according to aspects of the present inventive concept. -
FIG. 3 is a simplified block diagram of another embodiment of a vehicular safety system configured for managing functionality associated with multiple infotainment devices, according to aspects of the present inventive concept. -
FIG. 4 is a simplified block diagram depicting an exemplary computing device that may implement various services, systems, and methods discussed herein. - Corresponding reference characters indicate corresponding elements among the views of the drawings. The headings used in the figures do not limit the scope of the claims.
- The present disclosure relates to systems and methods for reducing driver distraction and for generally improving vehicular safety. In some embodiments, the present inventive concept may take the form of an improved infotainment device configured with features suitable for minimizing the driver's interaction time with an infotainment device, including the implementation of forced delay or temporary disabling of infotainment system features during adverse driving conditions, or delegation of such features to other devices associated with a passenger.
- Referring to
FIG. 1, a vehicular safety system, designated system 100, is shown for optimizing infotainment response time to reduce driver cognitive load and minimize distraction. As shown, the system 100 may generally include a vehicle 102 equipped with an infotainment device 104. The infotainment device 104 may define any in-car entertainment device or plurality of devices, or in-vehicle infotainment and may include any entertainment and information functionality provided with the vehicle 102 and integrated within the dashboard or otherwise. As such, the infotainment device 104 may include a collection of hardware and software components providing audio and visual information associated with radio, navigation, phone functionality for managing data associated with calls and texts, and the like. In some embodiments, the infotainment device 104 generally includes a display 106 depicting an interface 108 (such as a graphical user interface) and a driver may interact with the interface 108 using one or more input controls 110 such as dials, keypads, or other such controls or by touching the display 106 in the case where the display 106 includes touch-screen technology. - As further shown, the
infotainment device 104 may include or otherwise be in operable communication with a memory 111 or storage device, (at least one of) a processor 112 or electronic control unit (ECU), configured to execute a vehicle operating system (operating system 113) stored in the memory 111 for issuing instructions and managing the infotainment device 104 including its software and hardware peripherals. The processor 112 may further execute an application 114 defining a predictive model 116 stored in the memory 111. Generally, the application 114 and the predictive model 116 are configured to provide functionality that reduces driver distraction by assisting to at least temporarily restrict the driver's ability to interact with the infotainment device 104 for more than a predetermined amount of time or for more than a predetermined number of steps, as further described herein. - By way of introductory explanation, the National Highway Traffic Safety Administration (NHTSA) has previously issued guidelines to define a preferred maximum level of distraction, which is based on tasks a driver may undertake during driving; each task comprised of individual steps. These guidelines indicate that the driver should complete any interaction with a task that distracts the driver from focusing on the road within a maximum of six steps, and each step should be completed within two seconds of time. So, for example, if the driver is interacting with the
infotainment device 104 to view a text message (which has been relayed from the driver's cellphone to the infotainment device 104), under the subject NHTSA guidelines, the driver should view that text message in six steps or less, such steps including, e.g., a first step of powering up a screen along a dashboard, a second step of navigating to a phone tab, etc. These guidelines are helpful, but may not reflect adverse driving conditions.
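- For illustration only, the six-step/two-second guideline described above can be expressed as a simple programmatic check. The Python sketch below is not part of the disclosure; the task and step names are hypothetical, and only the six-step and two-second figures come from the NHTSA guidance discussed above.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class TaskStep:
    name: str          # e.g., "power up screen", "navigate to phone tab"
    duration_s: float  # observed or estimated time to complete the step

@dataclass
class DriverTask:
    name: str
    steps: List[TaskStep] = field(default_factory=list)

def within_guideline(task: DriverTask, max_steps: int = 6, max_step_s: float = 2.0) -> bool:
    """True if the task fits the six-step / two-seconds-per-step rule of thumb."""
    return len(task.steps) <= max_steps and all(s.duration_s <= max_step_s for s in task.steps)

# Hypothetical example: viewing a relayed text message on the infotainment screen.
read_text = DriverTask("view text message", [
    TaskStep("power up screen", 1.2),
    TaskStep("navigate to phone tab", 1.8),
    TaskStep("open message", 1.5),
])
print(within_guideline(read_text))  # True: three steps, each under two seconds
```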
- In view of these observations, in some embodiments, the infotainment system 104 is in communication with a controller area network bus (CAN BUS or any other vehicular data bus) 118 and a plurality of subsystems 120 defined along the vehicle 102. The CAN BUS 118 allows sensors and microcontrollers of the different subsystems 120A and 120B to communicate with each other, and the subsystems 120 include sensors, microcontrollers, or mechanical components specific to certain portions or functions of the vehicle 102. The CAN BUS 118 may further communicably couple together one or more electronic control units (ECUs) (not shown) positioned along the vehicle 102 and implemented as part of any of the subsystems 120 or otherwise. For example, a first subsystem 120A may be used for the transmission of the vehicle 102, and a second subsystem 120B may be used for operations of the doors, and the CAN BUS 118 may accommodate data sharing or interconnection between these subsystems, or separate control actuators, or sensors (not shown). - Referring back to the
infotainment system 104, by leveraging data from an event handler 122 implemented by the processor 112 or one or more ECUs along the vehicle 102, the application 114 described herein may be used to configure the infotainment device 104 to modify, restrict, or temporarily disable certain functions and settings of the infotainment device 104 to increase the driver's attention to the road as opposed to the driver interacting with the infotainment device 104 longer than desired. In some embodiments, for example, the application 114 comprises a plug-in, add-on, extension, or update to the vehicle operating system 113 to execute the functionality associated with restricting the driver's interaction with the infotainment device 104 as described herein. In other embodiments, the application 114 may merely be loaded into the memory 111 by a wired or wireless connection and may be logically layered over the operating system 113 as a separate application. In these embodiments, the application 114 issues requests or instructions to the operating system 113 or any application managing functionality associated with the infotainment device 104 to temporarily restrict such functionality in the manner described herein. - Referring to
the blocks of FIG. 2, with continuing reference to FIG. 1, to enhance the infotainment device 104 as indicated, the application 114 is configured to initially extract or access input information 220 about driving conditions (such as the use of gas and brake pedals, the acceleration or deceleration information generated by the car, the presence and rate of lane change, swerving and the driving style/type of the driver, various steering wheel movements, the car state of motion and speed as well as geolocation proximate to an intersection or traffic lights, and the like), as shown in block 202. In some embodiments, the application 114 may acquire such input information 220 by integrating with or accessing data from the subsystems 120 and/or from the event handler 122, which may already be implemented by the operating system 113 of the vehicle 102 or separately implemented. An event handler for a general computing device may include any software routine that monitors or processes events or actions such as keystrokes, or engagement with input controls. Many components or features of modern vehicles are generally electrically connected, or are otherwise in electrical communication with a central processing unit, such as the processor 112. The event handler 122 of the vehicle 102 (or any vehicle), similar to an event handler for a general computing device, monitors, records, or processes various events about such components of the vehicle as they occur, and the operating system 113 may manage and make use of such data. For example, sensors 124A and 124B implemented along the drive train (not shown) of the vehicle 102 may be in operable electrical communication with the processor 112, and may generate information about the motion of the vehicle 102 including when the vehicle 102 stops or accelerates. This information may be managed and recorded by the event handler 122. The input information 220 accessed from the event handler 122 may further include information about the driver's interactions/events associated with the infotainment device 104. For example, an event may include actuation of the radio or the input controls 110 of the infotainment device 104 by the driver depressing a button along the dashboard of the vehicle 102. In some embodiments, the input information 220 may further include, or the application 114 may also be fed, LIDAR information and on-board forward and vehicle camera information as available.
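- As a rough sketch of how the application 114 might aggregate such input information 220, the snippet below collects timestamped events from vehicle subsystems into a short history that a classifier can later query. The class name, event fields, and retention window are illustrative assumptions rather than the vehicle's actual event-handler API.

```python
import time
from collections import deque
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class VehicleEvent:
    timestamp: float    # seconds since the epoch
    source: str         # e.g., "drivetrain", "brake", "steering", "infotainment"
    name: str           # e.g., "speed_mph", "pedal_pressed", "button_pressed"
    value: float = 0.0  # optional payload such as a speed or pedal position

class EventCollector:
    """Retains recent subsystem events so they can be fed to the predictive model."""
    def __init__(self, history_s: float = 60.0):
        self.history_s = history_s
        self._events: deque = deque()

    def record(self, source: str, name: str, value: float = 0.0) -> None:
        now = time.time()
        self._events.append(VehicleEvent(now, source, name, value))
        # Drop anything older than the retained history window.
        while self._events and now - self._events[0].timestamp > self.history_s:
            self._events.popleft()

    def recent(self, source: Optional[str] = None) -> List[VehicleEvent]:
        return [e for e in self._events if source is None or e.source == source]

# Hypothetical usage: subsystems push events over the data bus as they occur.
collector = EventCollector()
collector.record("drivetrain", "speed_mph", 28.0)
collector.record("infotainment", "button_pressed")
print(len(collector.recent("infotainment")))  # 1
```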
- As shown in block 204, utilizing the input information 220, the predictive model 116, using a classifier or other machine learning method, may identify input parameters such as speed, acceleration, and other variables or features from the input information 220, which may be used to determine a corresponding driving conditions classification 222 or class associated with a suggested level of attention suitable for the driver, given the driving conditions (e.g., low, moderate, or high, on a scale or otherwise). For example, using input parameters derived from the input information 220, the predictive model 116 may determine that the vehicle 102 is operating within city traffic by identifying that the vehicle 102 has made a predetermined number of stops within a predefined period of time, and is averaging a speed of 30 mph, such that the predictive model 116 outputs a driving conditions classification 222 corresponding to operation of the vehicle 102 within city traffic. The predictive model 116 may output a different driving conditions classification 222 associated with operation of the vehicle 102 during highway traffic (with fewer stops and starts, and increased average speeds). Any such examples of classifications are contemplated. In some embodiments, the driving conditions classification 222 may be associated with a scale from, e.g., 1-10, with 10 reflecting adverse driving conditions requiring less driver distraction, and with 1 reflecting more favorable driving conditions where some driver distraction may be permissible.
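- A minimal, rule-based stand-in for the predictive model 116 could map a few input parameters to a coarse class and a value on the 1-10 scale mentioned above. The thresholds below are illustrative assumptions; the disclosure contemplates a trained classifier or other machine learning method in place of these fixed rules.

```python
from typing import Tuple

def classify_driving_conditions(avg_speed_mph: float,
                                stops_last_5_min: int,
                                lane_changes_last_5_min: int = 0) -> Tuple[str, int]:
    """Return (label, severity), where severity uses the 1-10 scale described above
    (10 = adverse conditions tolerating little distraction, 1 = favorable conditions)."""
    if stops_last_5_min >= 4 and avg_speed_mph <= 35:
        return ("city traffic", 8)              # frequent stops at low average speed
    if avg_speed_mph >= 55 and stops_last_5_min == 0:
        return ("highway traffic", 6 if lane_changes_last_5_min > 2 else 3)
    return ("mixed/unknown", 5)

print(classify_driving_conditions(avg_speed_mph=30, stops_last_5_min=5))   # ('city traffic', 8)
print(classify_driving_conditions(avg_speed_mph=65, stops_last_5_min=0))   # ('highway traffic', 3)
```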
- As shown in the remaining blocks of FIG. 2, the driving conditions classification 222 may then be used to impose temporary restrictions upon the infotainment system 104. For example, the particular value of the driving conditions classification 222 may correspond with a maximum interaction time 224 that the application 114 is programmed to permit the driver to interact with the infotainment system 104, or a maximum number of steps 226 the driver is permitted to take while interacting with the infotainment system 104 while the driving conditions remain consistent with the driving conditions classification 222. In other words, based on the driving conditions classification 222, and where the application 114 determines that the driver has met or exceeded the maximum interaction time 224 or maximum number of steps 226, the application 114 may be programmed to configure the infotainment system 104 (by software calls or otherwise) to temporarily: selectively display only limited graphical features along the interface 108 along the display 106, terminate any number of graphical features or functions, deactivate audio transmission associated with the infotainment system 104, completely disable the infotainment system 104, disable certain ones of the input controls 110, and the like.
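- One plausible way to tie the driving conditions classification 222 to the maximum interaction time 224 and maximum number of steps 226 is a simple lookup table, with the restriction itself issued through whatever hook the infotainment software exposes. Everything below other than the six-step figure is an assumption for illustration; the restrict() call stands in for the software calls described above.

```python
# Illustrative limits keyed to bands of the 1-10 classification value.
LIMITS_BY_SEVERITY = {
    range(1, 4):  {"max_interaction_s": 20.0, "max_steps": 6},
    range(4, 8):  {"max_interaction_s": 10.0, "max_steps": 4},
    range(8, 11): {"max_interaction_s": 4.0,  "max_steps": 2},
}

def limits_for(severity: int) -> dict:
    for band, limits in LIMITS_BY_SEVERITY.items():
        if severity in band:
            return limits
    raise ValueError(f"severity out of range: {severity}")

class InfotainmentStub:
    """Stand-in for the head unit interface; a real system would dim the display,
    disable selected input controls, or mute audio for the requested duration."""
    def restrict(self, duration_s: float) -> None:
        print(f"restricting infotainment features for {duration_s} s")

def enforce(severity: int, interaction_s: float, steps_taken: int, infotainment) -> None:
    limits = limits_for(severity)
    if interaction_s >= limits["max_interaction_s"] or steps_taken >= limits["max_steps"]:
        infotainment.restrict(duration_s=8.0)  # temporary restriction, then normal operation resumes

enforce(severity=8, interaction_s=5.0, steps_taken=2, infotainment=InfotainmentStub())
```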
- These examples of temporary restrictions may be imposed for a predetermined amount of time, such as 6-10 seconds, so that the driver can refocus attention back to traffic and the road. As a specific example, the application 114 may be preprogrammed with a maximum number of steps 226 of six to coincide with the NHTSA guidelines described herein. In this manner, the application 114 may, in conjunction with the processor 112, issue calls or commands to the infotainment system 104 to disable certain functionality for a predetermined period of time where the event handler 122 and the application 114 collectively identify that the driver has caused six events to be recognized by the event handler 122 while the driver is interacting with the infotainment system 104 (e.g., while checking email) during adverse driving conditions or otherwise. - The aforementioned inventive concept may reduce driver distraction, save lives, and reduce cost overall. Most modern vehicular safety technologies focus on three main considerations: (1) static responses to driving situations, regardless of the varying needs of the driver's attention, (2) adding more software features to the infotainment device, more connectivity, and larger screens, and (3) adding more processing power and faster response times. Research shows that such approaches to infotainment devices may exacerbate issues with driver distraction as opposed to reducing them. With the present inventive concept, by continuing to acquire or access the
input information 220, the application 114 may continuously and dynamically evaluate the driving conditions and configure interaction with the infotainment system 104 according to the driving conditions and any predefined infotainment interaction limitations. - In some embodiments, the
application 114 may be programmed in any programming language or programming framework (C, C++, Java, Python, Matlab, or the like). The application 114 may be programmed with a specialized class object (not shown) configured to interpret data from the event handler 122 associated with driving conditions and driver interaction with the infotainment system 104. This class object may be implemented with a timer and may place information about events from the event handler 122 into a queue. The queue may define a length or amount, e.g., five seconds, so that the queue stores certain events that have occurred within the last five seconds. In some embodiments, this possible programming structure may include aspects of a sliding window, or implementation thereof. In other embodiments, it could use artificial intelligence and other machine learning methods (for example neural networks, recurrent neural networks, or long short-term memory (LSTM) networks) to implement the same work. So, for example, the application 114 may utilize this class object to determine whether a queue includes information about more than a desired number of events (maximum number of steps 226) having occurred during a particular time period associated with the queue. In other words, in this example, a sliding window may be implemented to count the events that have occurred in the last five seconds (reflected by the queue), and an alert may be generated if such events exceed a predetermined threshold (e.g., six) to temporarily disable functionality of the infotainment device 104. A similar effect can be implemented using neural networks or other machine learning methods as explained above.
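- The queue-and-timer structure described above maps naturally onto a sliding window over timestamped interactions. The sketch below assumes the five-second window and six-event threshold mentioned in the text, with the temporary-disable action reduced to a boolean result; a learned model could replace the fixed threshold, as the disclosure notes.

```python
import time
from collections import deque
from typing import Optional

class InteractionWindow:
    """Counts driver interactions with the infotainment device inside a sliding time window."""
    def __init__(self, window_s: float = 5.0, max_events: int = 6):
        self.window_s = window_s
        self.max_events = max_events
        self._timestamps: deque = deque()

    def record_event(self, now: Optional[float] = None) -> bool:
        """Record one interaction; return True if the threshold has been exceeded."""
        now = time.time() if now is None else now
        self._timestamps.append(now)
        # Slide the window: discard events older than window_s seconds.
        while self._timestamps and now - self._timestamps[0] > self.window_s:
            self._timestamps.popleft()
        return len(self._timestamps) > self.max_events

# Hypothetical usage: the seventh event within five seconds trips the restriction.
window = InteractionWindow()
start = 1000.0
exceeded = False
for i in range(7):
    exceeded = window.record_event(now=start + i * 0.5)
print(exceeded)  # True
```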
- Referring to FIG. 3, another embodiment of a vehicular safety system, designated system 300, is shown for distributing or delegating infotainment interaction from the driver to one or more passengers in order to reduce overall driver distraction. Similar to the system 100, the system 300 may generally include a vehicle 302 equipped with an infotainment device, designated first infotainment device 304. The first infotainment device 304 may define any in-car entertainment device or plurality of devices, or in-vehicle infotainment and may include any entertainment and information functionality provided with the vehicle 302 and integrated within the dashboard or otherwise. As such, the first infotainment device 304 may include a collection of hardware and software components providing audio and visual information associated with radio, navigation, phone functionality for managing data associated with calls and texts, and the like. In some embodiments, the first infotainment device 304 generally includes a display 306 depicting an interface 308 (such as a graphical user interface) and a driver may interact with the interface 308 using one or more input controls 310 such as dials, keypads, or other such controls or by touching the display 306 in the event where the display 306 includes touch-screen technology. In the example provided by the system 300, the first infotainment device 304 is generally designated for and accessible by the driver. - As shown, the
first infotainment device 304 may include or otherwise be in operable communication with a memory 311 or storage device and at least one processor 312 or ECU configured to execute a vehicle operating system (operating system 313) stored in the memory 311 for issuing instructions and managing the first infotainment device 304, including its software and hardware peripherals. The processor 312 may further execute an application 314 as further described herein. In some embodiments, the first infotainment device 304 may also be in communication with a controller area network bus (CAN BUS) 318 and a plurality of subsystems 320 defined along the vehicle 302. The subsystems 320 may define various systems of the vehicle 302. - In addition, the
system 300 includes a second infotainment device 334 designated for a passenger of the vehicle 302. In some embodiments, the second infotainment device 334 may include all (or at least some) of the features of the first infotainment device 304, including input controls, a display, and mechanical and electrical components (not shown) that enable a passenger to interact with the second infotainment device 334 and modify settings of the vehicle 302. In some embodiments, as is common in current vehicles with multiple infotainment devices and/or interfaces, the passenger may interact with the second infotainment device 334 to modify settings of the vehicle 302 or the first infotainment device 304 (e.g., changing the radio station from the second infotainment device 334). In some embodiments, the second infotainment device 334 may merely be a related component of the first infotainment device 304, or an extended interface of the same. - Further, the
system 300 may include one or more of a passenger sensor 340, which may be integrated along a passenger seat and define a sensor or other electrical component sensitive to changes in weight, or the like. Many factory vehicles already include passenger sensors used by other systems, such as airbag activation/deactivation (depending on the presence and size of passengers), and the information available from these sensors could be used with the system 300 described herein. The passenger sensor 340 may provide information to the application 314 via the CAN BUS 318 (or any other data bus) about whether a passenger or other occupant is occupying the vehicle in addition to the driver. In some embodiments, the passenger sensor 340 may include a weight sensor with a load cell that creates an electric signal proportional to the force/weight applied to the passenger sensor 340 by an occupant, which can be interpreted and utilized by the application 314 as further described herein.
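- By way of a non-limiting illustration, interpreting the load-cell signal as an occupancy decision may be sketched as follows; the calibration constant and weight threshold are illustrative assumptions rather than values from the disclosure.

```python
def passenger_present(sensor_voltage, volts_per_kg=0.05, min_occupant_kg=15.0):
    """Convert a load-cell voltage to an estimated weight and test for occupancy."""
    estimated_kg = sensor_voltage / volts_per_kg  # signal assumed proportional to applied weight
    return estimated_kg >= min_occupant_kg


# Example: a 1.2 V reading maps to an estimated 24 kg, so an occupant is reported.
print(passenger_present(1.2))  # True
```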
- Generally, the application 314 assists in managing functionality between the first infotainment device 304 and the second infotainment device 334 as described herein. The passenger sensor 340 may be implemented to sense when passengers are in the vehicle 302 with the driver. If a passenger is detected, this information may be accessed by the application 314, which may then in turn make settings modifications to the first infotainment device 304 and/or the second infotainment device 334. For example, the application 314 may limit phone functionality and/or disable phone functions associated with the first infotainment device 304 when a passenger is detected, and delegate or distribute this functionality to the second infotainment device 334 so that the passenger (who is not operating the vehicle 302) can safely utilize phone functions as needed (versus the driver operating the vehicle 302).
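- A minimal sketch of this delegation follows, with hypothetical feature names and simple dictionaries standing in for each device's feature set; the application 314 may manage features through entirely different interfaces.

```python
def distribute_features(passenger_detected, first_device, second_device):
    """Enable or disable phone features on each infotainment device based on occupancy."""
    if passenger_detected:
        first_device["phone"] = False   # restrict the driver-facing device
        second_device["phone"] = True   # delegate phone functions to the passenger
    else:
        first_device["phone"] = True    # no passenger detected: driver retains access
        second_device["phone"] = False
    return first_device, second_device


# Example use with hypothetical feature maps for each device.
driver_ui = {"navigation": True, "phone": True}
passenger_ui = {"navigation": True, "phone": False}
distribute_features(True, driver_ui, passenger_ui)
print(driver_ui["phone"], passenger_ui["phone"])  # False True
```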
- In some embodiments, once passengers are detected, the application 314 may be programmed to display or announce a notification (e.g., via a screen or a speaker) and request that the driver select a type of person or persona for each passenger. Personas may define certain permission levels and a list of limited features or functions of the second infotainment device 334 corresponding to each persona. Most personas may duplicate the features available to the first infotainment device 304 and the second infotainment device 334; however, some may be more limiting. For example, the driver may limit a passenger's ability to watch a film or change the radio station using the second infotainment device 334 by selecting a “child” persona. In some embodiments, the memory 311 may store profiles associated with specific passengers that occupy the vehicle 302, each profile corresponding to a particular weight identified by the passenger sensor 340 or otherwise. In either case, the application 314 is configured to dynamically distribute functionality between the first infotainment device 304 and the second infotainment device 334 as desired, and in some embodiments functionality available to the driver via the first infotainment device 304 may be restricted or disabled where a passenger is detected in the vehicle 302.
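- The persona-based permission levels may be sketched as follows; the persona names and feature sets are illustrative assumptions and are not drawn from the disclosure or its claims.

```python
# Hypothetical mapping from persona to the features permitted on the second infotainment device.
PERSONAS = {
    "adult": {"phone", "media", "radio", "navigation"},
    "teen": {"media", "radio", "navigation"},
    "child": {"media"},  # e.g., cannot change the radio station or place calls
}


def allowed_features(persona, requested):
    """Return only the requested features permitted for the selected persona."""
    return requested & PERSONAS.get(persona, set())


# Example: a "child" persona requesting radio and media access is granted media only.
print(allowed_features("child", {"radio", "media"}))  # {'media'}
```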
- In some embodiments, the application 314 comprises a plug-in, add-on, extension, or update to the vehicle operating system 313 to execute the functionality described herein. In other embodiments, the application 314 may merely be loaded into the memory 311 by a wired or wireless connection and may be logically layered over the operating system 313 as a separate application. - Overall, the inventive concept of
FIG. 3 is responsive to current technical problems. For example, conventional infotainment technologies place undue emphasis upon static availability of features regardless of the varying needs and number of the occupants in a vehicle. Such conventional technologies may also implement an excess of software features in an attempt to reduce driver distraction. Furthermore, such technologies generally give full access and control of the vehicle's infotainment devices and overall infotainment system to the driver as a super user (a metaphor deriving from PC operating systems); research shows that even if perfection is achieved in technologies such as speech recognition, distraction will eventually worsen rather than improve in this regard. As a technical solution to these technical problems, the present inventive concept of FIG. 3 implements the application 314 with the first infotainment device 304 and the second infotainment device 334 as described to dynamically and continuously evaluate the vehicle's occupants and distribute/divide functionality as desired, and may accommodate the selective display of different features on each of the infotainment devices. The application 314 may further allow the driver to control the roles and the feature access of the second infotainment device 334 available to the occupants. -
FIG. 4 is an example schematic diagram of a computing device 700 that may implement various methodologies discussed herein. For example, the computing device 700 may comprise any number or form of computing device used to execute the application 114 or aspects of the system 100 described herein. The computing device 700 includes a bus 701 (i.e., interconnect), at least one processor 702 or other computing element, at least one communication port 703, a main memory 704, a removable storage media 705, a read-only memory 706, and a mass storage device 707. Processor(s) 702 can be any known processor, such as, but not limited to, an Intel® Itanium® or Itanium 2® processor(s), AMD® Opteron® or Athlon MP® processor(s), or Motorola® lines of processors. Communication port 703 can be any of an RS-232 port for use with a modem-based dial-up connection, a 10/100 Ethernet port, a Gigabit port using copper or fiber, or a USB port. Communication port(s) 703 may be chosen depending on a network such as a Local Area Network (LAN), a Wide Area Network (WAN), or any network to which the computing device 700 connects. The computing device 700 may further include a transport and/or transit network 755, a display screen 760, an I/O port 740, and an input device 745 such as a mouse or keyboard. -
Main memory 704 can be Random Access Memory (RAM) or any other dynamic storage device(s) commonly known in the art. Read-only memory 706 can be any static storage device(s), such as Programmable Read-Only Memory (PROM) chips, for storing static information such as instructions for processor 702. Mass storage device 707 can be used to store information and instructions. For example, hard disks such as the Adaptec® family of Small Computer Serial Interface (SCSI) drives, an optical disc, an array of disks such as a Redundant Array of Independent Disks (RAID), such as the Adaptec® family of RAID drives, or any other mass storage devices, may be used. - Bus 701 communicatively couples processor(s) 702 with the other memory, storage, and communications blocks. Bus 701 can be a PCI/PCI-X, SCSI, or Universal Serial Bus (USB) based system bus (or other), depending on the storage devices used.
Removable storage media 705 can be any kind of external hard drive, thumb drive, Compact Disc-Read Only Memory (CD-ROM), Compact Disc-Re-Writable (CD-RW), Digital Video Disk-Read Only Memory (DVD-ROM), etc. - Embodiments herein may be provided as a computer program product, which may include a machine-readable medium having stored thereon instructions which may be used to program a computer (or other electronic devices) to perform a process. The machine-readable medium may include, but is not limited to, optical discs, CD-ROMs, magneto-optical disks, ROMs, RAMs, erasable programmable read-only memories (EPROMs), electrically erasable programmable read-only memories (EEPROMs), magnetic or optical cards, flash memory, or other types of media/machine-readable media suitable for storing electronic instructions. Moreover, embodiments herein may also be downloaded as a computer program product, wherein the program may be transferred from a remote computer to a requesting computer by way of data signals embodied in a carrier wave or other propagation medium via a communication link (e.g., a modem or network connection).
- As shown,
main memory 704 is encoded with applications that support functionality as discussed herein. At least a portion of these applications (and/or other resources as described herein) can be embodied as software code such as data and/or logic instructions (e.g., code stored in the memory or on another computer-readable medium such as a disk) that supports processing functionality according to different embodiments described herein. During operation of one embodiment, processor(s) 702 accesses main memory 704 via the use of bus 701 in order to launch, run, execute, interpret, or otherwise perform processes, such as through logic instructions, executing on the processor 702 and associated software modules stored in main memory or otherwise tangibly stored. - The description above includes example systems, methods, techniques, instruction sequences, and/or computer program products that embody techniques of the present disclosure. However, it is understood that the described disclosure may be practiced without these specific details. In the present disclosure, the methods disclosed may be implemented as sets of instructions or software readable by a device. Further, it is understood that the specific order or hierarchy of steps in the methods disclosed are instances of example approaches. Based upon design preferences, it is understood that the specific order or hierarchy of steps in the method can be rearranged while remaining within the disclosed subject matter. The accompanying method claims present elements of the various steps in a sample order, and are not necessarily meant to be limited to the specific order or hierarchy presented.
- The described disclosure may be provided as a computer program product, or software, that may include a machine-readable medium having stored thereon instructions, which may be used to program a computer system (or other electronic devices) to perform a process according to the present disclosure. A machine-readable medium includes any mechanism for storing information in a form (e.g., software, processing application) readable by a machine (e.g., a computer). The machine-readable medium may include, but is not limited to optical storage medium (e.g., CD-ROM); magneto-optical storage medium, read only memory (ROM); random access memory (RAM); erasable programmable memory (e.g., EPROM and EEPROM); flash memory; or other types of medium suitable for storing electronic instructions.
- It is believed that the present disclosure and many of its attendant advantages should be understood by the foregoing description, and it should be apparent that various changes may be made in the form, construction, and arrangement of the components without departing from the disclosed subject matter or without sacrificing all of its material advantages. The form described is merely explanatory, and it is the intention of the following claims to encompass and include such changes.
- It should be understood from the foregoing that, while particular embodiments have been illustrated and described, various modifications can be made thereto without departing from the spirit and scope of the invention as will be apparent to those skilled in the art. Such changes and modifications are within the scope and teachings of this invention as defined in the claims appended hereto.
Claims (8)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/577,987 US20200099547A1 (en) | 2018-09-20 | 2019-09-20 | Systems and methods for improved vehicular safety |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201862733986P | 2018-09-20 | 2018-09-20 | |
US16/577,987 US20200099547A1 (en) | 2018-09-20 | 2019-09-20 | Systems and methods for improved vehicular safety |
Publications (1)
Publication Number | Publication Date |
---|---|
US20200099547A1 true US20200099547A1 (en) | 2020-03-26 |
Family
ID=69883080
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/578,114 Active US10882400B2 (en) | 2018-09-20 | 2019-09-20 | Systems and methods for improved vehicular safety |
US16/577,987 Abandoned US20200099547A1 (en) | 2018-09-20 | 2019-09-20 | Systems and methods for improved vehicular safety |
US16/577,978 Active 2040-03-12 US11142071B2 (en) | 2018-09-20 | 2019-09-20 | Systems and methods for improved vehicular safety |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/578,114 Active US10882400B2 (en) | 2018-09-20 | 2019-09-20 | Systems and methods for improved vehicular safety |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/577,978 Active 2040-03-12 US11142071B2 (en) | 2018-09-20 | 2019-09-20 | Systems and methods for improved vehicular safety |
Country Status (1)
Country | Link |
---|---|
US (3) | US10882400B2 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10893010B1 (en) * | 2019-03-25 | 2021-01-12 | Amazon Technologies, Inc. | Message filtering in a vehicle based on dynamically determining spare attention capacity from an overall attention capacity of an occupant and estimated amount of attention required given current vehicle operating conditions |
US11093767B1 (en) * | 2019-03-25 | 2021-08-17 | Amazon Technologies, Inc. | Selecting interactive options based on dynamically determined spare attention capacity |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102018008045B4 (en) * | 2018-10-11 | 2020-07-23 | Daimler Ag | Method and device for controlling display content on an output means of a vehicle |
CN114816158A (en) * | 2021-01-11 | 2022-07-29 | 华为技术有限公司 | Interface control method and device, electronic equipment and readable storage medium |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5983161A (en) | 1993-08-11 | 1999-11-09 | Lemelson; Jerome H. | GPS vehicle collision avoidance warning and control system and method |
US8762880B2 (en) * | 2007-06-29 | 2014-06-24 | Microsoft Corporation | Exposing non-authoring features through document status information in an out-space user interface |
US20090228172A1 (en) | 2008-03-05 | 2009-09-10 | Gm Global Technology Operations, Inc. | Vehicle-to-vehicle position awareness system and related operating method |
US20120117601A1 (en) * | 2010-11-09 | 2012-05-10 | Sony Corporation | User interface for audio video display device such as tv |
US9547172B2 (en) * | 2014-10-13 | 2017-01-17 | Ford Global Technologies, Llc | Vehicle image display |
WO2017130439A1 (en) * | 2016-01-28 | 2017-08-03 | 鴻海精密工業股▲ふん▼有限公司 | Vehicular image display system and vehicle having same image display system mounted therein |
US10111045B2 (en) * | 2016-06-24 | 2018-10-23 | Qualcomm Incorporated | Low power V2I/V2V mode for mobile devices |
US10399564B2 (en) * | 2016-10-25 | 2019-09-03 | Ford Global Technologies, Llc | Vehicle roundabout management |
US10431093B2 (en) * | 2017-06-20 | 2019-10-01 | Zf Friedrichshafen Ag | System and method for collision avoidance |
US20190206258A1 (en) * | 2018-01-04 | 2019-07-04 | nuTonomy Inc. | Augmented reality vehicle interfacing |
- 2019
- 2019-09-20 US US16/578,114 patent/US10882400B2/en active Active
- 2019-09-20 US US16/577,987 patent/US20200099547A1/en not_active Abandoned
- 2019-09-20 US US16/577,978 patent/US11142071B2/en active Active
Also Published As
Publication number | Publication date |
---|---|
US10882400B2 (en) | 2021-01-05 |
US20200094678A1 (en) | 2020-03-26 |
US20200098267A1 (en) | 2020-03-26 |
US11142071B2 (en) | 2021-10-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20200099547A1 (en) | Systems and methods for improved vehicular safety | |
US11400811B2 (en) | Remotely controlling electronic functions of a vehicle without an integrated touchscreen | |
US10257805B2 (en) | Preventing driver distraction from incoming notifications | |
EP4105095A1 (en) | Control method, apparatus and device, and storage medium | |
KR100742112B1 (en) | Method and apparatus for classifying vehicle operator activity state | |
US10462281B2 (en) | Technologies for user notification suppression | |
CN110843798A (en) | Coordinating delivery of notifications to vehicle drivers to reduce distraction | |
US9395702B2 (en) | Safety critical apparatus and method for controlling distraction of an operator of a safety critical apparatus | |
WO2002039761A2 (en) | Context aware wireless communication device and method | |
CN107284453A (en) | Based on the interactive display for explaining driver actions | |
US20200387278A1 (en) | Remotely controlling vehicle touchscreen controls | |
DE112012004789T5 (en) | Configurable vehicle console | |
CN117480085A (en) | Driver Monitoring System (DMS) data management | |
JP2004518461A (en) | Method and apparatus for improving vehicle driver performance | |
US20180208203A1 (en) | System, method and computer program product for braking control when approaching a traffic signal | |
US20180354433A1 (en) | In-Vehicle Infotainment Control Systems and Methods | |
CN110293903A (en) | A kind of Vehicular turn lamp control method, device, equipment and storage medium | |
WO2021216578A1 (en) | Driver screening | |
CN114312797B (en) | Agent device, agent method, and recording medium | |
Amditis et al. | Design and development of an adaptive integrated driver-vehicle interface: overview of the AIDE project | |
EP3744602A1 (en) | Method for controlling an autonomous driving configuration or driving assistance configuration | |
US20220230081A1 (en) | Generation and presentation of explanations related to behavior of an automated control system | |
CN118140496A (en) | Method, device and storage medium for scheduling notifications based on driving assistance function | |
Li et al. | Improve adaptive cruise control under driver distraction: Time headway compensation based on a random‐effect crash risk model | |
JP2021028192A (en) | Operation auxiliary device and operation auxiliary method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: ARIZONA BOARD OF REGENTS ON BEHALF OF ARIZONA STATE UNIVERSITY, ARIZONA; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: GAFFAR, ASHRAF; REEL/FRAME: 050625/0092; Effective date: 20191001 |
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
| STPP | Information on status: patent application and granting procedure in general | Free format text: ADVISORY ACTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |