CA3005402C - Systems and method to trigger vehicle events based on contextual information - Google Patents
- Publication number
- CA3005402C, CA3005402A
- Authority
- CA
- Canada
- Prior art keywords
- vehicle
- condition
- threshold value
- information
- event
- Prior art date
- Legal status: Active
Classifications
- G07C5/0808—Diagnosing performance data
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W40/04—Traffic conditions
- B60W40/09—Driving style or behaviour
- G01C21/26—Navigation specially adapted for navigation in a road network
- G01C22/00—Measuring distance traversed on the ground by vehicles, persons, animals or other moving solid bodies, e.g. using odometers, using pedometers
- G07C5/0816—Indicating performance data, e.g. occurrence of a malfunction
- B60W2420/403—Image sensing, e.g. optical camera
- B60W2510/0604—Throttle position
- B60W2510/0666—Engine power
- B60W2510/182—Brake pressure, e.g. of fluid or between pad and disc
- B60W2520/10—Longitudinal speed
- B60W2520/105—Longitudinal acceleration
- B60W2520/125—Lateral acceleration
- B60W2540/18—Steering angle
- B60W2552/00—Input parameters relating to infrastructure
- B60W2554/00—Input parameters relating to objects
- B60W2555/20—Ambient conditions, e.g. wind or rain
- B60W2556/10—Historical data
- G07C5/008—Registering or indicating the working of vehicles communicating information to a remotely located station
- G07C5/0841—Registering performance data
Abstract
This disclosure relates to a system and method for detecting vehicle events. Some or all of the system may be installed in a vehicle, operate at the vehicle, and/or be otherwise coupled with a vehicle. The system includes one or more sensors configured to generate output signals conveying information related to the vehicle. The system receives contextual information from a source external to the vehicle. The system detects a vehicle event based on the information conveyed by the output signals from the sensors and the received contextual information.
Description
SYSTEMS AND METHOD TO TRIGGER VEHICLE EVENTS BASED ON CONTEXTUAL INFORMATION
FIELD
(01) The systems and methods disclosed herein are related to detection of vehicle events, and, in particular, detection of vehicle events that are based, at least in part, on current environmental conditions near and/or around a vehicle.
BACKGROUND
(02) Systems configured to record, store, and transmit video, audio, and/or sensor data associated with a vehicle, e.g., in response to an accident involving the vehicle, are known. Typically, such systems detect an accident based on data from a single sensor, such as an accelerometer mounted on the vehicle. Video from the accident is usually analyzed by a user at a later time, after the accident. Vehicle Engine Control Component (ECM) systems are known. Such systems interface/interoperate with external computers (e.g., at an automobile mechanic) where the data stored by the ECM system is analyzed.
SUMMARY
(03) One aspect of the disclosure relates to a system configured to detect vehicle events. The system may be coupled and/or otherwise related to a vehicle. Some or all of the system may be installed in the vehicle and/or be otherwise coupled with the vehicle. In some implementations, the system may include one or more sensors, one or more servers, one or more physical processors, electronic storage, one or more external providers, and/or other components. The one or more sensors may be configured to generate output signals conveying information related to the vehicle and/or one or more current operating conditions of the vehicle. In some implementations, the system may detect vehicle events based on a comparison of the information conveyed by the output signals from the sensors to predetermined (variable and/or fixed) values, thresholds, functions, and/or other information. Advantageously, the system may identify vehicle events in real-time or near real-time during operation of the vehicle. As used herein, the term "processor" is used interchangeably with the term "physical processor."
(04) The one or more sensors may be configured to generate output signals conveying information related to the operation and/or one or more operating conditions of the vehicle. Information related to the operation of the vehicle may include feedback information from one or more of the mechanical systems of the vehicle, and/or other information. In some implementations, at least one of the sensors may be a vehicle system sensor included in an engine control module (ECM) system or an electronic control module (ECM) system of the vehicle. In some implementations, one or more sensors may be carried by the vehicle.
(05) The one or more servers may include one or more processors configured to execute one or more computer program components. The computer program components may include one or more of an operation component, a context component, a detection component, a record component, a notification component, a location component, and/or other components.
(06) The operation component may be configured to determine current operating conditions and/or vehicle parameters of vehicles. The operation component may determine current operating conditions based on the information conveyed by the output signals from the sensors and/or other information. The one or more current operating conditions may be related to the vehicle, the operation of the vehicle, physical characteristics of the vehicle, and/or other information. In some implementations, the operation component may be configured to determine one or more of the current operating conditions one or more times in an ongoing manner during operation of the vehicle.
(07) The context component may be configured to obtain, receive, and/or determine contextual information related to environmental conditions near and/or around vehicles.
Environmental conditions may be related to weather conditions, road surface conditions, traffic conditions, visibility, and/or other environmental conditions. In some implementations, one or more environmental conditions may be received from one or more sources external to the vehicle. For example, a source external to the vehicle may include a remote server and/or an external provider.
(08) The detection component may be configured to detect vehicle events.
Detection of vehicle events may be based on one or more current operating conditions of the vehicle. In some implementations, detection may be further based on one or more types of contextual information. In some implementations, detection may be accomplished and/or performed at the vehicle, e.g. by a physical processor that is carried by the vehicle.
(09) The record component may be configured to record, store, and/or transmit information, including but not limited to information related to vehicle events. In some implementations, information related to vehicle events may be used to create vehicle event records. Vehicle event records may include video information, audio information, data from an ECM system, metadata, information based on sensor-generated output, and/or other information.
(10) Vehicle event records may be stored locally in a vehicle and/or transmitted from a vehicle to a system, server, and/or service that is external to the vehicle, including but not limited to a remote server and/or an external provider. In some implementations, a system, server, and/or service that is external to the vehicle may query and/or request information from a particular vehicle. The record component may be configured to respond to a query or request by transmitting information as queried and/or requested.
In some implementations, the record component may be configured to facilitate communication of information between vehicles, remote servers, external providers, and/or other systems, servers, and/or services external to vehicles.
Communication may be in real-time or near real-time. Communication may be wireless.
(11) The notification component may be configured to generate and/or determine notifications related to vehicle events. In some implementations, notifications may be intended for drivers of vehicles. For example, the notification component may be configured to provide notifications to drivers, including but not limited to warnings or requests (for example to reduce speed). In some implementations, notifications may be transmitted from a vehicle to a system, server, and/or service that is external to the vehicle, including but not limited to a remote server and/or an external provider.
(12) The location component may be configured to obtain and/or determine information related to the locations of vehicles and/or other locations (which may be referred to as location information). In some implementations, the location component may be configured to receive information related to the current location of a vehicle.
By way of non-limiting example, location information may include global positioning system (GPS) information.
(12a) According to one aspect of the present invention, there is provided a system configured to detect vehicle events, the system configured to couple with a vehicle, the system comprising: one or more sensors configured to generate output signals conveying information related to one or more current operating conditions of the vehicle associated with movement of the vehicle, wherein the one or more sensors are carried by the vehicle; electronic storage configured to store information, and one or more processors configured to: determine a current operating condition of the vehicle based on the generated output signals; receive contextual information related to two or more current environmental conditions near the vehicle, wherein the two or more current environmental conditions include a first condition related to traffic conditions, and wherein the two or more current environmental conditions further include a second condition related to at least one of weather conditions and visibility;
determine a first threshold value at which the current operating condition of the vehicle effectuates detection of vehicle events, wherein the first threshold value varies based on the first condition; at the vehicle, detect a vehicle event based on a first comparison of the current operating condition of the vehicle and the first threshold value, wherein detection of the vehicle event is further based on a second comparison of the first condition with a second threshold value, and wherein the second threshold value varies based on the second condition; create a vehicle event record based on the detected vehicle event, wherein the vehicle event record includes the received contextual information and the information related to the one or more current operating conditions of the vehicle; store the vehicle event record in the electronic storage; and effectuate presentation to a user of the system, via a user interface, of a notification based on the vehicle event record.
(12b) According to another aspect of the present invention, there is provided a system configured to detect vehicle events, the system configured to couple with a vehicle, the system comprising: one or more sensors configured to generate output signals conveying information related to one or more current operating conditions of the vehicle associated with movement of the vehicle, wherein the one or more sensors are carried by the vehicle; electronic storage configured to store information;
and one or more processors configured to: determine a first operating condition and a second operating condition of the vehicle based on the generated output signals, wherein the first operating condition corresponds to a first moment in time, wherein the second operating condition corresponds to a second moment in time, and wherein the second moment in time is subsequent to the first moment in time;
at the vehicle, analyze the first operating condition and determine that no vehicle event has occurred that needs to be reported to either a driver of the vehicle or an external server; receive contextual information related to two or more current environmental conditions near the vehicle, wherein the two or more current environmental conditions include a first condition related to traffic conditions, and wherein the two or more current environmental conditions further include a second condition related to at least one of weather conditions and visibility; determine a first threshold value at which the second operating condition of the vehicle effectuates detection of vehicle events, wherein the first threshold value varies based on the first condition; at the vehicle, detect a vehicle event based on a first comparison of the second operating condition of the vehicle and the first threshold value, wherein detection of the vehicle event is further based on a second comparison of the first condition of the vehicle with a second threshold value, and wherein the second threshold value varies based on the second condition; create a vehicle event record based on the detected vehicle event, wherein the vehicle event record includes the received contextual information and the information related to the one or more current operating conditions of the vehicle;
store the vehicle event record in the electronic storage; and effectuate presentation to a user of the system, via a user interface, of a notification based on the vehicle event record.
(12c) According to still another aspect of the present invention, there is provided a method to detect vehicle events for a vehicle, the method being implemented in a computer system that includes one or more sensors and one or more physical processors, the method comprising: generating, by the one or more sensors, output signals conveying information related to one or more current operating conditions of the vehicle associated with movement of the vehicle, wherein the one or more sensors are carried by the vehicle; determining a current operating condition of the vehicle based on the generated output signals; receiving contextual information related to two or more current environmental conditions near the vehicle, wherein the two or more current environmental conditions include a first condition related to traffic conditions, and wherein the two or more current environmental conditions further include a second condition related to at least one of weather conditions and visibility;
determining a first threshold value at which the current operating condition of the vehicle effectuates detection of vehicle events, wherein the first threshold value varies based on the first condition; detecting, at the vehicle, a vehicle event based on a first comparison of the current operating condition of the vehicle and the first threshold value, wherein detection of the vehicle event is further based on a second comparison of the first condition with a second threshold value, and wherein the second threshold value varies based on the second condition; creating a vehicle event record based on the detected vehicle event, wherein the vehicle event record includes the received contextual information and the information related to the one or more current operating conditions of the vehicle; storing the vehicle event record in electronic storage; and effectuating presentation to a user of the system, via a user interface, of a notification based on the vehicle event record.
(12d) According to yet another aspect of the present invention, there is provided a system configured to detect vehicle events, the system configured to couple with a vehicle, the system comprising: one or more sensors configured to generate output signals conveying information related to a current speed of the vehicle;
electronic storage configured to store information, and one or more processors configured to:
determine the current speed of the vehicle based on the generated output signals;
receive contextual information related to two or more current environmental conditions near the vehicle, wherein the two or more current environmental conditions include a traffic condition and a weather condition; determine a first threshold value at which the current speed of the vehicle effectuates detection of vehicle events, wherein the first threshold value varies based on the traffic condition;
determine a second threshold value for the traffic condition, wherein the second threshold value varies based on the weather condition; at the vehicle, detect a vehicle event based on: (i) a first comparison of the current speed of the vehicle with the first threshold value, and (ii) a second comparison of the traffic condition with the second threshold value; create a vehicle event record based on the detected vehicle event, wherein the vehicle event record includes the received contextual information and the current speed of the vehicle; store the vehicle event record in the electronic storage; and effectuate presentation to a user of the system, via a user interface, of a notification based on the vehicle event record.
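By way of illustration only, the following Python sketch mirrors the two-level threshold logic recited above: the speed threshold varies with the traffic condition, and the traffic threshold in turn varies with the weather condition. All function names, scales, and numeric values are assumptions introduced for the example and are not taken from the claims.

```python
# Illustrative sketch of the claimed two-level threshold logic; all names,
# scales, and values are assumptions, not the patented implementation.

def speed_threshold_mph(traffic_density: float) -> float:
    """First threshold: speed above which an event is detected; varies with traffic."""
    return max(30.0, 70.0 - 40.0 * min(traffic_density, 1.0))

def traffic_threshold(weather_severity: float) -> float:
    """Second threshold: traffic density considered 'heavy'; varies with weather."""
    return max(0.2, 0.6 - 0.4 * min(weather_severity, 1.0))

def detect_vehicle_event(current_speed_mph, traffic_density, weather_severity):
    """Return a vehicle event record if both comparisons trigger, else None."""
    first_comparison = current_speed_mph > speed_threshold_mph(traffic_density)
    second_comparison = traffic_density > traffic_threshold(weather_severity)
    if first_comparison and second_comparison:
        return {
            "event": "unsafe driving speed",
            "speed_mph": current_speed_mph,
            "context": {"traffic": traffic_density, "weather": weather_severity},
        }
    return None

# Same speed, different context: no event in light traffic and clear weather,
# but an event record in heavy traffic and rain.
print(detect_vehicle_event(50.0, traffic_density=0.2, weather_severity=0.0))
print(detect_vehicle_event(50.0, traffic_density=0.8, weather_severity=0.7))
```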
(13) As used herein, any association (or relation, or reflection, or indication, or correspondency) involving vehicles, sensors, vehicle events, operating conditions, parameters, thresholds, functions, notifications, and/or another entity or object that interacts with any part of the system and/or plays a part in the operation of the system, may be a one-to-one association, a one-to-many association, a many-to-one association, and/or a many-to-many association or N-to-M association (note that N
and M may be different numbers greater than 1).
(14) As used herein, the term "obtain" (and derivatives thereof) may include active and/or passive retrieval, determination, derivation, transfer, upload, download, submission, and/or exchange of information, and/or any combination thereof. As used herein, the term "effectuate" (and derivatives thereof) may include active and/or passive causation of any effect. As used herein, the term "determine" (and derivatives thereof) may include measure, calculate, compute, estimate, approximate, generate, and/or otherwise derive, and/or any combination thereof.
(15) These and other objects, features, and characteristics of the servers, systems, and/or methods disclosed herein, as well as the methods of operation and functions of the related elements of structure and the combination of parts and economies of manufacture, will become more apparent upon consideration of the following description and the appended claims with reference to the accompanying drawings, all of which form a part of this disclosure, wherein like reference numerals designate corresponding parts in the various figures. It is to be expressly understood, however, that the drawings are for the purpose of illustration and description only and are not intended as a definition of the limits of the invention. As used in the specification and in the claims, the singular form of "a", "an", and "the" include plural referents unless the context clearly dictates otherwise.
BRIEF DESCRIPTION OF THE DRAWINGS
(16) FIG. 1 illustrates a system configured to detect vehicle events, in accordance with one or more embodiments.
(17) FIG. 2 illustrates a method to detect vehicle events, in accordance with one or more embodiments.
(18) FIGS. 3, 4, and 5 illustrate exemplary dashboard views for a vehicle that includes a system configured to detect vehicle events, in accordance with one or more embodiments.
DETAILED DESCRIPTION
(19) FIG. 1 illustrates a system 10 configured to detect vehicle events of a vehicle 12.
Some or all of system 10 may be installed in vehicle 12, carried by vehicle 12, and/or be otherwise coupled with and/or related to vehicle 12. In some implementations, system 10 may include one or more sensors 142, one or more servers 11, one or more physical processors 110, electronic storage 60, a network 13, one or more external providers 18, and/or other components. One or more sensors 142 may be configured to generate output signals. The output signals may convey information related to vehicle 12 and/or one or more current operating conditions of vehicle 12. In some implementations, one or more sensors 142 may be carried by vehicle 12.
(20) Information related to current operating conditions of the vehicle may include feedback information from one or more of the mechanical systems of vehicle 12, and/or other information. The mechanical systems of vehicle 12 may include, for example, the engine, the drive train, the lighting systems (e.g., headlights, brake lights), the braking system, the transmission, fuel delivery systems, and/or other mechanical systems. The mechanical systems of vehicle 12 may include one or more mechanical sensors, electronic sensors, and/or other sensors that generate the output signals (e.g., seat belt sensors, tire pressure sensors, etc.). In some implementations, at least one of sensors 142 may be a vehicle system sensor included in an ECM system of vehicle 12.
(21) In some implementations, one or more sensors 142 may include a video camera, an image sensor, and/or a microphone. Based on an analysis of images and/or sounds captured, system 10 may determine, using algorithms, that vehicle 12 is moving forward, is in reverse, has maneuvered outside of its lane of traffic, is making a turn, and/or other maneuvers. By way of non-limiting example, driving maneuvers may include swerving, a U-turn, freewheeling, over-revving, lane-departure, short following distance, imminent collision, unsafe turning that approaches rollover and/or vehicle stability limits, hard braking, rapid acceleration, idling, driving outside a geo-fence boundary, crossing double-yellow lines, passing on single-lane roads, a certain number of lane changes within a certain amount of time or distance, fast lane change, cutting off other vehicles during lane-change, speeding, running a red light, running a stop sign, and/or other driving maneuvers.
(22) In some implementations, information related to current operating conditions of vehicle 12 may include information related to the environment in and/or around vehicle 12. The vehicle environment may include spaces in and around an interior and an exterior of vehicle 12. The information may include information related to movement of vehicle 12, an orientation of vehicle 12, a geographic position of vehicle 12, a spatial position of vehicle 12 relative to other objects, a tilt angle of vehicle 12, an inclination/declination angle of vehicle 12, and/or other information. In some implementations, the output signals conveying information may be generated via non-standard aftermarket sensors installed in vehicle 12. Non-standard aftermarket sensors may include, for example, a video camera, a microphone, an accelerometer, a gyroscope, a geolocation sensor (e.g., a GPS device), a radar detector, a magnetometer, radar (e.g. for measuring distance of leading vehicle), and/or other sensors. In some implementations, one or more sensors 142 may include multiple cameras positioned around vehicle 12 and synchronized together to provide a 360 degree view of the inside of vehicle 12 and/or a 360 degree view of the outside of vehicle 12.
(23) Although sensors 142 are depicted in FIG. 1 as a single element, this is not intended to be limiting. Sensors 142 may include one or more sensors located adjacent to and/or in communication with the various mechanical systems of vehicle 12, in one or more positions (e.g., at or near the front of vehicle 12, at or near the back of vehicle 12, etc.) to accurately acquire information representing the vehicle environment (e.g. visual information, spatial information, orientation information), and/or in other locations. For example, in some implementations, system 10 may be configured such that a first sensor is located near or in communication with a rotating tire of vehicle 12, and a second sensor located on top of vehicle 12 is in communication with a geolocation satellite. In some implementations, sensors 142 may be configured to generate output signals continuously during operation of vehicle 12.
(24) As shown in FIG. 1, server 11 may include one or more processors 110 configured to execute one or more computer program components. The computer program components may comprise one or more of an operation component 21, a context component 22, a detection component 23, a record component 24, a notification component 25, a location component 26, and/or other components.
(25) Operation component 21 may be configured to determine current operating conditions and/or vehicle parameters of vehicles, e.g. vehicle 12. Operation component 21 may determine current operating conditions based on the information conveyed by the output signals from sensors 142 and/or other information. The one or more current operating conditions may be related to vehicle 12, the operation of vehicle 12, physical characteristics of vehicle 12, and/or other information. In some implementations, operation component 21 may be configured to determine one or more of the current operating conditions one or more times in an ongoing manner during operation of vehicle 12.
(26) In some implementations, operating conditions may include vehicle parameters.
For example, vehicle parameters may be related to one or more of an acceleration, a direction of travel, a turn diameter, a vehicle speed, an engine speed (e.g.
RPM), a duration of time, a closing distance, a lane departure from an intended travelling lane of the vehicle, a following distance, physical characteristics of vehicle 12 (such as mass and/or number of axles, for example), a tilt angle of vehicle 12, an inclination/declination angle of vehicle 12, and/or other parameters.
(27) The physical characteristics of vehicle 12 may be physical features of vehicle 12 set during manufacture of vehicle 12, during loading of vehicle 12, and/or at other times.
For example, the one or more vehicle parameters may include a vehicle type (e.g., a car, a bus, a semi-truck, a tanker truck), a vehicle size (e.g., length), a vehicle weight (e.g., including cargo and/or without cargo), a number of gears, a number of axles, a type of load carried by vehicle 12 (e.g., food items, livestock, construction materials, hazardous materials, an oversized load, a liquid), vehicle trailer type, trailer length, trailer weight, trailer height, a number of axles, and/or other physical features. In some implementations, one or more vehicle parameters may be based on (and/or interpreted differently in the presence of) systems that a vehicle is equipped with, including but not limited to a stability system, a forward collision warning system, automatic brake system, and/or other systems that a vehicle may be equipped with. For example, the presence or absence of a particular system, e.g. a forward collision warning system, may modify the sensitivity of the process and/or mechanism by which vehicle events are detected.
(28) In some implementations, operation component 21 may be configured to determine one or more vehicle parameters based on the output signals from at least two different sensors. For example, operation component 21 may determine one or more of the vehicle parameters based on output signals from a sensor 142 related to the ECM
system and an external aftermarket added sensor 142. In some implementations, a determination of one or more of the vehicle parameters based on output signals from at least two different sensors 142 may be more accurate and/or precise than a determination based on the output signals from only one sensor 142. For example, on an icy surface, output signals from an accelerometer may not convey that a driver of vehicle 12 is applying the brakes of vehicle 12. However, a sensor in communication with the braking system of vehicle 12 would convey that the driver is applying the brakes. System 10 may determine a value of a braking parameter based on the braking sensor information even though the output signals from the accelerometer may not convey that the driver is applying the brakes.
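As a minimal illustration of this multi-sensor determination (with assumed sensor names and thresholds, not taken from the disclosure), a braking parameter may be derived from both an accelerometer and a brake-pressure sensor:

```python
# Illustrative only: fusing an accelerometer with a brake-pressure sensor so
# that braking on an icy surface is still detected. Thresholds are assumed values.

def braking_detected(longitudinal_accel_g: float, brake_pressure_kpa: float) -> bool:
    """Report braking if either sensor indicates it."""
    decelerating = longitudinal_accel_g < -0.15      # accelerometer-based cue
    brakes_applied = brake_pressure_kpa > 500.0      # brake-system sensor cue
    return decelerating or brakes_applied

# Icy surface: almost no measured deceleration, yet the brake sensor shows
# the driver is applying the brakes.
print(braking_detected(longitudinal_accel_g=-0.02, brake_pressure_kpa=1800.0))  # True
```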
(29) Operation component 21 may be configured to determine vehicle parameters that are not directly measurable by any of the available sensors. For example, an inclinometer may not be available to measure the road grade, but vehicle speed data as measured by a GPS system and/or by a wheel sensor ECM may be combined with accelerometer data, engine load, and/or other information to determine the road grade.
If an accelerometer measures a force that is consistent with braking, but the vehicle speed remains constant, the parameter component can determine that the measured force is a component of the gravity vector that is acting along the longitudinal axis of the vehicle. By using trigonometry, the magnitude of the gravity vector component can be used to determine the road grade (e.g., pitch angle of the vehicle with respect to the horizontal plane).
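A short sketch of this derivation follows, under the stated assumption that vehicle speed is constant so the longitudinal accelerometer reading is purely the gravity component g·sin(pitch); the numeric reading is illustrative.

```python
import math

# Sketch of the road-grade derivation, assuming constant vehicle speed so the
# longitudinal accelerometer reading equals g * sin(pitch). Values are illustrative.

G = 9.81  # gravitational acceleration, m/s^2

def road_grade_percent(longitudinal_accel_ms2: float) -> float:
    """Recover the pitch angle by trigonometry and express it as percent grade."""
    pitch_rad = math.asin(max(-1.0, min(1.0, longitudinal_accel_ms2 / G)))
    return math.tan(pitch_rad) * 100.0

# A sustained 0.59 m/s^2 reading at constant speed corresponds to roughly a 6% grade.
print(round(road_grade_percent(0.59), 1))
```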
(30) In some implementations, one or more of the vehicle parameters may be determined one or more times in an ongoing manner during operation of vehicle 12. In some implementations, one or more of the vehicle parameters may be determined at regular time intervals during operation of vehicle 12. The timing of the vehicle parameter determinations (e.g., in an ongoing manner, at regular time intervals, etc.) may be programmed at manufacture, obtained responsive to user entry and/or selection of timing information via a user interface and/or a remote computing device, and/or may be determined in other ways. The time intervals of parameter determination may be significantly less (e.g. more frequent) than the time intervals at which various sensor measurements are available. In such cases, system 10 may estimate vehicle parameters in between the actual measurements of the same vehicle parameters by the respective sensors, to the extent that the vehicle parameters are measurable.
This may be established by means of a physical model that describes the behavior of various vehicle parameters and their interdependency. For example, a vehicle speed parameter may be estimated at a rate of 20 times per second, although the underlying speed measurements are much less frequent (e.g., four times per second for ECM
speed, one time per second for GPS speed). This may be accomplished by integrating vehicle acceleration, as measured by the accelerometer sensor where the measurements are available 1000 times per second, across time to determine the change in speed that has accumulated since the most recent vehicle speed measurement. The benefits of these more frequent estimates of vehicle parameters are many; they include improved operation of other components of system 10, reduced complexity of downstream logic and system design (e.g., all vehicle parameters are updated at the same interval, rather than being updated irregularly and at the interval of each respective sensor), and more pleasing (e.g., "smooth") presentation of vehicle event recorder data in an event player apparatus.
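The following sketch illustrates one way such higher-rate estimation could work (sampling rates and class names are assumptions for illustration): accelerometer samples are integrated to propagate the speed estimate between the slower ECM/GPS readings, which re-anchor the estimate when they arrive.

```python
# Illustrative sketch of higher-rate speed estimation between sensor readings:
# accelerometer samples (assumed 1000 Hz) are integrated to propagate the speed
# estimate, and each ECM/GPS speed measurement re-anchors it. Names are assumed.

class SpeedEstimator:
    def __init__(self, initial_speed_ms: float):
        self.speed_ms = initial_speed_ms

    def on_accel_sample(self, accel_ms2: float, dt_s: float) -> float:
        """Integrate longitudinal acceleration over the sample interval."""
        self.speed_ms += accel_ms2 * dt_s
        return self.speed_ms

    def on_speed_measurement(self, measured_speed_ms: float) -> float:
        """Reset the estimate whenever an ECM or GPS speed reading arrives."""
        self.speed_ms = measured_speed_ms
        return self.speed_ms

estimator = SpeedEstimator(initial_speed_ms=20.0)
for _ in range(250):                        # 0.25 s of accelerometer data at 1000 Hz
    estimator.on_accel_sample(0.8, dt_s=0.001)
print(round(estimator.speed_ms, 2))         # ~20.2 m/s estimated between measurements
estimator.on_speed_measurement(20.3)        # next ECM speed sample corrects the estimate
```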
(31) In some implementations, system 10 may be configured to detect specific driving maneuvers based on one or more of a vehicle speed, an engine load, a throttle level, vehicle direction, a gravitational force, and/or other parameters being sustained at or above threshold levels for pre-determined amounts of time. In some implementations, an acceleration and/or force threshold may be scaled based on a length of time that an acceleration and/or force are maintained, and/or the particular speed the vehicle is travelling. System 10 may be configured such that force maintained over a period of time at a particular vehicle speed may decrease a threshold force the longer that the force is maintained. System 10 may be configured such that, combined with engine load data, throttle data may be used to determine a risky event, a fuel wasting event, and/or other events.
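A hypothetical sketch of such duration- and speed-scaled thresholding is shown below; the base threshold and scaling factors are invented for illustration and are not values from the disclosure.

```python
# Hypothetical duration- and speed-scaled force threshold; the base value and
# scaling factors are invented for illustration.

def force_threshold_g(duration_s: float, speed_mph: float) -> float:
    """Threshold (in g) that decreases the longer a force is sustained and at higher speed."""
    duration_relief = 0.05 * min(duration_s, 4.0)
    speed_relief = 0.002 * max(0.0, speed_mph - 30.0)
    return max(0.15, 0.45 - duration_relief - speed_relief)

def maneuver_detected(force_g: float, duration_s: float, speed_mph: float) -> bool:
    return force_g >= force_threshold_g(duration_s, speed_mph)

print(maneuver_detected(0.30, duration_s=0.5, speed_mph=35.0))  # False: brief, low speed
print(maneuver_detected(0.30, duration_s=3.0, speed_mph=65.0))  # True: sustained at speed
```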
(32) Context component 22 may be configured to obtain, receive, and/or determine contextual information related to environmental conditions near and/or around vehicles.
Environmental conditions may be related to weather conditions, road surface conditions, traffic conditions, visibility, and/or other environmental conditions. In some implementations, environmental conditions may be related to proximity of certain objects that are relevant to driving, including but not limited to traffic signs, railroad crossings, time of day, ambient light conditions, altitude, and/or other objects relevant to driving. In some implementations, one or more environmental conditions may be received from one or more sources external to vehicle 12. For example, a source external to vehicle 12 may include a remote server and/or an external provider 18. In some implementations, contextual information may include a likelihood of traffic congestion near a particular vehicle, and/or near a particular location. In some implementations, contextual information may include a likelihood of the road surface near a particular vehicle and/or a particular location being icy, wet, and/or otherwise potentially having an effect on braking and steering. In some implementations, environmental conditions may include information related to a particular driver and/or a particular trip. For example, with every passing hour that a particular driver drives his vehicle during a particular trip, the likelihood of drowsiness may increase. In some implementations, the function between trip duration or distance and likelihood of drowsiness may be driver-specific.
(33) In some implementations, contextual information may be received by system 10 through network 13, e.g. the internet. Network 13 may include private networks, public networks, and/or combinations thereof. For example, contextual information related to weather conditions may be received from a particular external provider 18 that provides weather information. For example, contextual information related to road surface conditions may be received from a particular external provider 18 that provides road condition information. For example, contextual information related to traffic conditions may be received from a particular external provider 18 that provides traffic information.
(34) Detection component 23 may be configured to detect vehicle events.
Detection of vehicle events may be based on one or more current operating conditions of vehicle 12. In some implementations, detection may be further based on one or more types of contextual information. In some implementations, detection may be accomplished and/or performed at vehicle 12, e.g. by processor 110 that is carried by vehicle 12. Vehicle events may include speeding, unsafe driving speed, collisions, near-collisions, and/or other events. In some implementations, vehicle events may include the distance between two vehicles being dangerously small, which may for example indicate an increased likelihood of a collision. In some implementations, vehicle events may include one or more driving maneuvers.
(35) In some implementations, a value of a current operating condition that effectuates detection of a vehicle event may vary as a function of the contextual information. For example, a speed of 50 mph (in a particular geographical location) may not effectuate detection of a vehicle event when the road surface is dry and/or when traffic is light, but the same speed in the same geographical location may effectuate detection of a vehicle event responsive to contextual information indicating that the road surface is wet and/or icy (and/or may be wet and/or icy), or responsive to contextual information that traffic is heavy (and/or may be heavy). In this example, the contextual information may have an effect on the detection of vehicle events. In some implementations, contextual information may modify the sensitivity of the process and/or mechanism by which vehicle events are detected.
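A minimal sketch of the 50 mph example above, assuming illustrative threshold values (a 55 mph base, reduced when the road is slippery or traffic is heavy); the function names and adjustment amounts are hypothetical.

```python
def unsafe_speed_threshold_mph(base_mph: float,
                               road_wet_or_icy: bool,
                               heavy_traffic: bool) -> float:
    """Hypothetical threshold: the same speed can trip an 'unsafe driving speed'
    event when the road is slippery or traffic is heavy."""
    threshold = base_mph
    if road_wet_or_icy:
        threshold -= 15.0
    if heavy_traffic:
        threshold -= 10.0
    return threshold

def detect_unsafe_speed(speed_mph: float, road_wet_or_icy: bool, heavy_traffic: bool,
                        base_mph: float = 55.0) -> bool:
    return speed_mph > unsafe_speed_threshold_mph(base_mph, road_wet_or_icy, heavy_traffic)

# Dry road, light traffic: 50 mph stays below the 55 mph threshold.
print(detect_unsafe_speed(50.0, road_wet_or_icy=False, heavy_traffic=False))  # False
# Wet road reported: the same 50 mph now exceeds the reduced 40 mph threshold.
print(detect_unsafe_speed(50.0, road_wet_or_icy=True, heavy_traffic=False))   # True
```

The two calls mirror the first-moment/second-moment scenario described in paragraph (36).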
(36) For example, a particular vehicle 12 operates at a particular operating condition (as determined based on output signals generated by a particular sensor 142).
In light of a particular current environmental condition at a first moment (e.g. sunny weather and/or light traffic), the particular operating condition may provide an insufficient impetus to determine and/or detect a particular vehicle event (e.g. "unsafe driving speed").
Subsequently, at a second moment after the first moment, a different environmental condition (e.g. rain, snow, and/or heavy traffic) becomes operative (e.g., the different environmental condition may be received at particular vehicle 12 as contextual information). In light of the different environmental condition, the combination of the different environmental condition and the particular operating condition may provide a sufficient impetus to determine and/or detect a particular vehicle event.
(37) By way of non-limiting example, FIG. 3 illustrates a dashboard view 30 for a vehicle that includes a system configured to detect vehicle events, similar to system 10 (not shown in FIG. 3). Dashboard view 30 illustrates the view of a driver of a vehicle, including a dashboard 33, a fuel indicator 36, a steering wheel 34, a road 31, and a user interface 35. User interface 35 may be configured to indicate the presence or absence of vehicle events, including but not limited to a vehicle event referred to as "unsafe driving conditions." As illustrated in dashboard view 30, the absence of unsafe driving conditions may be indicated by an indicator labeled "safe driving conditions."
User interface 35 may indicate one or more current operating conditions (e.g.
speed) as well as one or more current environmental conditions (e.g. light or heavy traffic, dry or wet road surface, etc.). The one or more current environmental conditions may be related to and/or based on contextual information received, e.g., from an external source of contextual information.
(38) By way of non-limiting example, FIG. 4 illustrates a dashboard view 40 for a vehicle that includes a system configured to detect vehicle events, similar to system 10 (not shown in FIG. 4). Dashboard view 40 illustrates the view of a driver of a vehicle, including a dashboard 33, a fuel indicator 36, a steering wheel 34, a road 31, and a user interface 35, similar to dashboard view 30 in FIG. 3. Referring to FIG. 4, user interface 35 may be configured to indicate the presence or absence of vehicle events, including but not limited to a vehicle event referred to as "unsafe driving conditions."
As illustrated in dashboard view 40, the presence of unsafe driving conditions may be indicated by an indicator labeled "unsafe driving conditions." User interface 35 may indicate one or more current operating conditions (e.g. speed) as well as one or more current environmental conditions (e.g. light or heavy traffic, dry or wet road surface, etc.). As illustrated in user interface 35, current traffic conditions are characterized as heavy (this may for example be based on average driving speed on a particular section of the road, and/or on other traffic-related information). The one or more current environmental conditions may be related to and/or based on contextual information received, e.g., from an external source of contextual information. The combination of one or more current operating conditions and one or more current environmental conditions may provide a sufficient impetus to determine and/or detect a particular vehicle event (e.g. "unsafe driving speed" or "unsafe driving conditions"). This particular vehicle event may be notified to the driver, e.g.
through user interface 35 as shown in FIG. 4.
(39) By way of non-limiting example, FIG. 5 illustrates a dashboard view 50 for a vehicle that includes a system configured to detect vehicle events, similar to system 10 (not shown in FIG. 5). Dashboard view 50 illustrates the view of a driver of a vehicle, including a dashboard 33, a fuel indicator 36, a steering wheel 34, a road 31, and a user interface 35, similar to dashboard view 30 in FIG. 3. Referring to FIG. 5, user interface 35 may be configured to indicate the presence or absence of vehicle events, including but not limited to a vehicle event referred to as "unsafe driving conditions."
As illustrated in dashboard view 50, the presence of unsafe driving conditions may be indicated by an indicator labeled "unsafe driving conditions." User interface 35 may indicate one or more current operating conditions (e.g. speed) as well as one or more current environmental conditions (e.g. light or heavy traffic, dry or wet road surface, etc.). As illustrated in user interface 35, current weather and/or road-surface conditions are characterized as wet (this may for example be based on weather information for an area that includes a particular section of the road, and/or on other weather-related information).
The one or more current environmental conditions may be related to and/or based on contextual information received, e.g., from an external source of contextual information.
The combination of one or more current operating conditions and one or more current environmental conditions may provide a sufficient impetus to determine and/or detect a particular vehicle event (e.g. "unsafe driving speed" or "unsafe driving conditions"). This particular vehicle event may be notified to the driver, e.g. through user interface 35 as shown in FIG. 5.
(40) In some implementations, detection of vehicle events may be based on one or more comparisons of the values of current operating conditions with threshold values. In some implementations, a particular threshold value may vary as a function of contextual information.
(41) By way of non-limiting example, lateral forces of about -0.3g (e.g., swerve left) and/or about +0.3g (e.g., swerve right) may be used as a basis to detect a swerve. In some implementations, the -0.3g and/or +0.3g criteria may be used at vehicle 12 speeds less than about 10 kph. The -0.3g and/or +0.3g criteria may be scaled as vehicle 12 increases in speed. In some implementations, the -0.3g and/or +0.3g criteria may be scaled (e.g., reduced) by about 0.0045g per kph of speed over 10 kph. To prevent excessive sensitivity, system 10 may not reduce the lateral force criteria below about +/-0.12g, regardless of the speed of vehicle 12, for example. In some implementations, the minimum period of time between detected swerves may be about 3 seconds.
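The swerve criterion described in this paragraph can be written out directly; the sketch below uses the stated figures (about +/-0.3g at low speed, reduced by about 0.0045g per kph over 10 kph, a floor of about +/-0.12g, and roughly 3 seconds between swerves), while the function and variable names are illustrative.

```python
def swerve_threshold_g(speed_kph: float) -> float:
    """Lateral-force magnitude needed to flag a swerve: about 0.3g below
    roughly 10 kph, reduced by about 0.0045g per kph above 10 kph, and
    never below about 0.12g."""
    if speed_kph <= 10.0:
        return 0.3
    return max(0.3 - 0.0045 * (speed_kph - 10.0), 0.12)

def is_swerve(lateral_g: float, speed_kph: float,
              seconds_since_last_swerve: float, min_gap_s: float = 3.0) -> bool:
    """Flag a swerve when the lateral force magnitude exceeds the speed-scaled
    threshold and at least about 3 seconds have passed since the last swerve."""
    return (abs(lateral_g) >= swerve_threshold_g(speed_kph)
            and seconds_since_last_swerve >= min_gap_s)

print(swerve_threshold_g(5.0))    # 0.3
print(swerve_threshold_g(50.0))   # 0.12 (0.3 - 0.0045 * 40)
print(swerve_threshold_g(100.0))  # 0.12 (clamped at the 0.12g floor)
print(is_swerve(-0.2, 60.0, seconds_since_last_swerve=5.0))  # True
```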
(42) Record component 24 may be configured to record, store, and/or transmit information, including but not limited to information related to vehicle events. In some implementations, information related to vehicle events may be used to create vehicle event records. Vehicle event records may include video information, audio information, data from an ECM system, metadata, information based on sensor-generated output, and/or other information.
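One plausible, purely illustrative shape for such a vehicle event record is sketched below; the field names and example values are assumptions and are not taken from the claims.

```python
from dataclasses import dataclass, field
from typing import Any, Dict, Optional

@dataclass
class VehicleEventRecord:
    """Hypothetical vehicle event record combining the kinds of data listed
    above: video, audio, ECM data, metadata, and sensor-based information."""
    event_type: str                        # e.g. "unsafe driving speed"
    timestamp_utc: str                     # e.g. "2018-05-17T14:03:22Z"
    video_clip_path: Optional[str] = None
    audio_clip_path: Optional[str] = None
    ecm_data: Dict[str, Any] = field(default_factory=dict)
    sensor_output: Dict[str, float] = field(default_factory=dict)
    metadata: Dict[str, Any] = field(default_factory=dict)

record = VehicleEventRecord(event_type="unsafe driving speed",
                            timestamp_utc="2018-05-17T14:03:22Z",
                            sensor_output={"speed_mph": 50.0, "lateral_g": 0.05})
```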
(43) Vehicle event records may be stored locally in vehicle 12 and/or transmitted from vehicle 12 to system 10, server 11, and/or a service that is external to the vehicle, including but not limited to a remote server and/or external provider 18. In some implementations, system 10, server 11, and/or a service that is external to vehicle 12 may query and/or request information from a particular vehicle 12. Record component 24 may be configured to respond to a query or request by transmitting information as queried and/or requested. In some implementations, record component 24 may be configured to facilitate communication of information between vehicles, remote servers, external providers, and/or other systems, servers, and/or services external to vehicles.
Communication may be in real-time or near real-time. Communication may be wireless.
(44) Notification component 25 may be configured to generate and/or determine notifications related to vehicle events. In some implementations, notifications may be intended for drivers of vehicles. For example, notification component 25 may be configured to provide notifications to drivers, including but not limited to warnings or requests (for example to reduce speed). In some implementations, notifications may be transmitted from vehicle 12 to system 10, server 11, and/or a service that is external to vehicle 12, including but not limited to a remote server and/or external provider 18. For example, a notification that a collision has occurred may be transmitted to a remote server and/or external provider 18. In some implementations, notifications may be stored locally, in electronic storage of a particular vehicle 12. Stored notifications may be retrieved later, e.g. after the particular vehicle 12 has returned to fleet headquarters, or subsequent to the particular vehicle 12 entering a particular geographical area (e.g.
within range of wireless communication with a particular external provider 18).
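A minimal store-and-forward sketch of this behavior, assuming a simple in-memory buffer and a caller-supplied send function; the class and method names are hypothetical.

```python
class NotificationBuffer:
    """Hypothetical store-and-forward buffer: notifications are kept locally
    and transmitted once the vehicle is within wireless range (or has
    returned to fleet headquarters)."""
    def __init__(self) -> None:
        self._pending: list[str] = []

    def notify(self, message: str, in_wireless_range: bool, send) -> None:
        self._pending.append(message)
        if in_wireless_range:
            self.flush(send)

    def flush(self, send) -> None:
        while self._pending:
            send(self._pending.pop(0))

buf = NotificationBuffer()
buf.notify("collision detected", in_wireless_range=False, send=print)  # stored locally
buf.flush(send=print)  # later, e.g. back in range, prints "collision detected"
```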
(45) Location component 26 may be configured to obtain and/or determine information related to the locations of vehicles and/or other locations (which may be referred to as location information). In some implementations, location component 26 may be configured to receive information related to the current location of vehicle 12. By way of non-limiting example, location information may include global positioning system (GPS) information. Operation by other components of system 10 may be based, at least in part, on information obtained and/or determined by location component 26. For example, detection of vehicle events may be affected by proximity and/or orientation to objects near vehicle 12, a geo-fence around vehicle 12, and/or other conditions related to vehicle 12.
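As an illustration of how a geo-fence condition might be evaluated from GPS location information, the sketch below checks a circular fence using the haversine distance; the fence center, radius, and function names are assumptions.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance between two GPS fixes, in kilometres."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def outside_geofence(vehicle_fix: tuple[float, float],
                     fence_center: tuple[float, float],
                     radius_km: float) -> bool:
    """True when the vehicle's GPS fix falls outside a circular geo-fence."""
    return haversine_km(*vehicle_fix, *fence_center) > radius_km

print(outside_geofence((37.7749, -122.4194), (37.7700, -122.4200), radius_km=1.0))  # False
```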
(46) In some implementations, system 10 may include a user interface configured to provide an interface between system 10 and users through which the users may provide information to and receive information from system 10. This enables information to be communicated between a user and one or more of processor 110, sensors 142, vehicle 12, and/or other components of system 10. As an example, a dangerous driving maneuver and/or vehicle event may be displayed to the driver of vehicle 12 via the user interface, e.g. as a notification.
(47) Examples of interface devices suitable for inclusion in a user interface include a keypad, buttons, switches, a keyboard, knobs, levers, a display screen, a touch screen, speakers, a microphone, an indicator light, an audible alarm, a printer, a tactile feedback device, and/or other interface devices.
(48) It is to be understood that other communication techniques, either hard-wired or wireless, are also contemplated by the present disclosure as a user interface.
Information may be loaded into system 10 wirelessly from a remote location, from removable storage (e.g., a smart card, a flash drive, a removable disk, etc.), and/or other sources that enable the user(s) to customize the implementation of system 10.
Other exemplary input devices and techniques adapted for use with system 10 include, but are not limited to, an RS-232 port, an RF link, an IR link, a modem (telephone, cable, and/or other modems), a cellular network, a Wi-Fi network, a local area network, and/or other devices and/or systems. In short, any technique for communicating information with system 10 is contemplated by the present disclosure as a user interface.
(49) Electronic storage 60 may comprise electronic storage media that electronically stores information. The electronic storage media of electronic storage 60 may comprise one or both of system storage that is provided integrally (i.e., substantially non-removable) with system 10 and/or removable storage that is removably connectable to system 10 via, for example, a port (e.g., a USB port, a firewire port, etc.) or a drive (e.g., a disk drive, etc.). Electronic storage 60 may comprise one or more of optically readable storage media (e.g., optical disks, etc.), magnetically readable storage media (e.g., magnetic tape, magnetic hard drive, floppy drive, etc.), electrical charge-based storage media (e.g., EEPROM, RAM, etc.), solid-state storage media (e.g., flash drive, etc.), and/or other electronically readable storage media. Electronic storage 60 may store software algorithms, recorded video event data, information determined by processor 110, information received via a user interface, and/or other information that enables system 10 to function properly. Electronic storage 60 may be (in whole or in part) a separate component within system 10, or electronic storage 60 may be provided (in whole or in part) integrally with one or more other components of system 10.
(50) In some implementations, a remote server may include communication lines, or ports to enable the exchange of information with a network, processor 110 of system 10, and/or other computing platforms. The remote server may include a plurality of processors, electronic storage, hardware, software, and/or firmware components operating together to provide the functionality attributed herein to a remote device. For example, the server may be implemented by a cloud of computing platforms operating together as a system server.
(51) As described above, processor 110 may be configured to provide information processing capabilities in system 10. As such, processor 110 may comprise one or more of a digital processor, an analog processor, a digital circuit designed to process information, an analog circuit designed to process information, a state machine, and/or other mechanisms for electronically processing information. Although processor 110 is shown in FIG. 1 as a single entity, this is for illustrative purposes only. In some implementations, processor 110 may comprise a plurality of processing units.
These processing units may be physically located within the same device (e.g., a vehicle event recorder), or processor 110 may represent processing functionality of a plurality of devices operating in coordination.
(52) Processor 110 may be configured to execute components 21-26 by software;
hardware; firmware; some combination of software, hardware, and/or firmware;
and/or other mechanisms for configuring processing capabilities on processor 110. It should be appreciated that although components 21-26 are illustrated in FIG. 1 as being co-located within a single processing unit, in implementations in which processor 110 comprises multiple processing units, one or more of components 21-26 may be located remotely from the other components. The description of the functionality provided by the different components 21-26 described herein is for illustrative purposes, and is not intended to be limiting, as any of components 21-26 may provide more or less functionality than is described. For example, one or more of components 21-26 may be eliminated, and some or all of its functionality may be provided by other components 21-26. As another example, processor 110 may be configured to execute one or more additional components that may perform some or all of the functionality attributed below to one of components 21-26.
(53) FIG. 2 illustrates a method 200 to detect vehicle events. The operations of method 200 presented below are intended to be illustrative. In some implementations, method 200 may be accomplished with one or more additional operations not described, and/or without one or more of the operations discussed. Additionally, the order in which the operations of method 200 are illustrated (in FIG. 2) and described below is not intended to be limiting. In some implementations, two or more of the operations may occur substantially simultaneously.
(54) In some implementations, method 200 may be implemented in one or more processing devices (e.g., a digital processor, an analog processor, a digital circuit designed to process information, an analog circuit designed to process information, a state machine, and/or other mechanisms for electronically processing information). The one or more processing devices may include one or more devices executing some or all of the operations of method 200 in response to instructions stored electronically on one or more electronic storage mediums. The one or more processing devices may include one or more devices configured through hardware, firmware, and/or software to be specifically designed for execution of one or more of the operations of method 200.
(55) Referring to FIG. 2 and method 200, at an operation 202, output signals conveying information related to one or more current operating conditions of the vehicle are generated by one or more sensors. The one or more sensors are carried by the vehicle. In some embodiments, operation 202 is performed by a sensor the same as or similar to sensor 142 (shown in FIG. 1 and described herein).
(56) At an operation 204, a current operating condition of the vehicle is determined based on the generated output signals. In some embodiments, operation 204 is performed by an operation component the same as or similar to operation component 21 (shown in FIG. 1 and described herein).
(57) At an operation 206, contextual information is received related to one or more current environmental conditions near and/or around the vehicle. In some embodiments, operation 206 is performed by a context component the same as or similar to context component 22 (shown in FIG. 1 and described herein).
(58) At an operation 208, at the vehicle, a vehicle event is detected based on the current operating condition of the vehicle and further based on the received contextual information. A value of the current operating condition of the vehicle that effectuates detection of the vehicle event varies as a function of the received contextual information.
In some embodiments, operation 208 is performed by a detection component the same as or similar to detection component 23 (shown in FIG. 1 and described herein).
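Read together, operations 202 through 208 form a small pipeline; the sketch below strings them together with stubbed, caller-supplied callables so that the ordering is explicit. All names are illustrative and no particular implementation is implied.

```python
def run_method_200(read_sensors, determine_condition, receive_context, detect_event):
    """Illustrative flow of operations 202-208: generate output signals,
    determine a current operating condition, receive contextual information,
    then detect a vehicle event at the vehicle."""
    output_signals = read_sensors()                            # operation 202
    operating_condition = determine_condition(output_signals)  # operation 204
    contextual_info = receive_context()                        # operation 206
    return detect_event(operating_condition, contextual_info)  # operation 208

# Example with stubbed callables:
event = run_method_200(
    read_sensors=lambda: {"speed_mph": 50.0},
    determine_condition=lambda signals: signals["speed_mph"],
    receive_context=lambda: {"road_wet_or_icy": True},
    detect_event=lambda speed, ctx: ("unsafe driving speed"
                                     if ctx["road_wet_or_icy"] and speed > 40.0 else None),
)
print(event)  # unsafe driving speed
```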
(59) Although the system(s) and/or method(s) of this disclosure have been described in detail for the purpose of illustration based on what is currently considered to be the most practical and preferred implementations, it is to be understood that such detail is solely for that purpose and that the disclosure is not limited to the disclosed implementations, but, on the contrary, is intended to cover modifications and equivalent arrangements that are within the spirit and scope of the appended claims. For example, it is to be understood that the present disclosure contemplates that, to the extent possible, one or more features of any implementation can be combined with one or more features of any other implementation.
Claims (20)
1. A system configured to detect vehicle events, the system configured to couple with a vehicle, the system comprising:
one or more sensors configured to generate output signals conveying information related to one or more current operating conditions of the vehicle associated with movement of the vehicle, wherein the one or more sensors are carried by the vehicle;
electronic storage configured to store information, and one or more processors configured to:
determine a current operating condition of the vehicle based on the generated output signals;
receive contextual information related to two or more current environmental conditions near the vehicle, wherein the two or more current environmental conditions include a first condition related to traffic conditions, and wherein the two or more current environmental conditions further include a second condition related to at least one of weather conditions and visibility;
determine a first threshold value at which the current operating condition of the vehicle effectuates detection of vehicle events, wherein the first threshold value varies based on the first condition;
at the vehicle, detect a vehicle event based on a first comparison of the current operating condition of the vehicle and the first threshold value, wherein detection of the vehicle event is further based on a second comparison of the first condition with a second threshold value, and wherein the second threshold value varies based on the second condition;
create a vehicle event record based on the detected vehicle event, wherein the vehicle event record includes the received contextual information and the information related to the one or more current operating conditions of the vehicle;
store the vehicle event record in the electronic storage; and effectuate presentation to a user of the system, via a user interface, of a notification based on the vehicle event record.
2. The system of claim 1, wherein at least some of the contextual information is received from one or more sources external to the vehicle.
3. The system of claim 1, wherein the one or more current operating conditions of the vehicle include vehicle speed of the vehicle, and wherein the vehicle event includes a particular driving maneuver.
4. The system of claim 1, wherein the value of the current operating condition of the vehicle does not effectuate detection of the vehicle event absent the received contextual information.
5. The system of claim 1, wherein the current operating condition of the vehicle includes at least one of vehicle speed, an engine load, a throttle level or accelerator position, a particular acceleration level of the vehicle, a particular change in vehicle direction, and multiple changes in vehicle direction.
6. The system of claim 1, wherein the current operating condition of the vehicle is vehicle speed, wherein the contextual information includes a likelihood of traffic congestion near the vehicle, and wherein the vehicle event is related to unsafe vehicle speed.
7. The system of claim 1, wherein the current operating condition of the vehicle is brake pressure, wherein the contextual information includes a likelihood of road surface near the vehicle being at least one of wet and icy, and wherein the vehicle event is related to potential collision conditions.
8. The system of claim 1, wherein the one or more sensors include a video camera, and wherein the one or more sensors are configured such that the information related to the one or more current operating conditions of the vehicle includes video information.
9. The system of claim 1, wherein the one or more processors are further configured to obtain geographical information of the vehicle, and wherein detection of the vehicle event is further based on the geographical information of the vehicle.
10. A system configured to detect vehicle events, the system configured to couple with a vehicle, the system comprising:
one or more sensors configured to generate output signals conveying information related to one or more current operating conditions of the vehicle associated with movement of the vehicle, wherein the one or more sensors are carried by the vehicle;
electronic storage configured to store information; and
one or more processors configured to:
determine a first operating condition and a second operating condition of the vehicle based on the generated output signals, wherein the first operating condition corresponds to a first moment in time, wherein the second operating condition corresponds to a second moment in time, and wherein the second moment in time is subsequent to the first moment in time;
at the vehicle, analyze the first operating condition and determine that no vehicle event has occurred that needs to be reported to either a driver of the vehicle or an external server;
receive contextual information related to two or more current environmental conditions near the vehicle, wherein the two or more current environmental conditions include a first condition related to traffic conditions, and wherein the two or more current environmental conditions further include a second condition related to at least one of weather conditions and visibility;
determine a first threshold value at which the second operating condition of the vehicle effectuates detection of vehicle events, wherein the first threshold value varies based on the first condition;
at the vehicle, detect a vehicle event based on a first comparison of the second operating condition of the vehicle and the first threshold value, wherein detection of the vehicle event is further based on a second comparison of the first condition of the vehicle with a second threshold value, and wherein the second threshold value varies based on the second condition;
create a vehicle event record based on the detected vehicle event, wherein the vehicle event record includes the received contextual information and the information related to the one or more current operating conditions of the vehicle;
store the vehicle event record in the electronic storage; and
effectuate presentation to a user of the system, via a user interface, of a notification based on the vehicle event record.
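Purely for illustration (not part of the claims), the short Python sketch below mirrors the two-moments idea recited in claim 10 in simplified form: the same operating-condition reading that warrants no report at a first moment can exceed a tightened, context-dependent threshold at a second moment once contextual information has been received. The threshold function and numeric values are hypothetical.

```python
# Simplified sketch of the claim-10 scenario; names and values are hypothetical.
from typing import Optional

def speed_threshold_kph(congestion: Optional[float]) -> float:
    """Hypothetical threshold: lenient by default, tightened once a traffic
    condition (0.0 = free-flowing .. 1.0 = gridlock) is known."""
    if congestion is None:
        return 100.0
    return 100.0 - 25.0 * congestion

def needs_report(speed_kph: float, congestion: Optional[float]) -> bool:
    return speed_kph > speed_threshold_kph(congestion)

# First moment: 90 km/h, no contextual information yet -> nothing to report
# to the driver or to an external server.
print(needs_report(90.0, None))   # False

# Contextual information arrives (heavy congestion near the vehicle).
# Second moment: the same 90 km/h now exceeds the tightened threshold.
print(needs_report(90.0, 0.8))    # True
```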
11. A method to detect vehicle events for a vehicle, the method being implemented in a computer system that includes one or more sensors and one or more physical processors, the method comprising:
generating, by the one or more sensors, output signals conveying information related to one or more current operating conditions of the vehicle associated with movement of the vehicle, wherein the one or more sensors are carried by the vehicle;
determining a current operating condition of the vehicle based on the generated output signals;
receiving contextual information related to two or more current environmental conditions near the vehicle, wherein the two or more current environmental conditions include a first condition related to traffic conditions, and wherein the two or more current environmental conditions further include a second condition related to at least one of weather conditions and visibility;
determining a first threshold value at which the current operating condition of the vehicle effectuates detection of vehicle events, wherein the first threshold value varies based on the first condition;
detecting, at the vehicle, a vehicle event based on a first comparison of the current operating condition of the vehicle and the first threshold value, wherein detection of the vehicle event is further based on a second comparison of the first condition with a second threshold value, and wherein the second threshold value varies based on the second condition;
creating a vehicle event record based on the detected vehicle event, wherein the vehicle event record includes the received contextual information and the information related to the one or more current operating conditions of the vehicle;
storing the vehicle event record in electronic storage; and
effectuating presentation to a user of the system, via a user interface, of a notification based on the vehicle event record.
12. The method of claim 11, wherein at least some of the contextual information is received from one or more sources external to the vehicle.
13. The method of claim 11, wherein the one or more current operating conditions of the vehicle include vehicle speed of the vehicle, and wherein the vehicle event includes a particular driving maneuver.
14. The method of claim 11, wherein the value of the current operating condition of the vehicle does not effectuate detection of the vehicle event absent the received contextual information.
15. The method of claim 11, wherein the current operating condition of the vehicle includes at least one of vehicle speed, an engine load, a throttle level or accelerator position, a particular acceleration level of the vehicle, a particular change in vehicle direction, and multiple changes in vehicle direction.
16. The method of claim 11, wherein the current operating condition of the vehicle is vehicle speed, wherein the contextual information includes a likelihood of traffic congestion near the vehicle, and wherein the vehicle event is related to unsafe vehicle speed.
17. The method of claim 11, wherein the current operating condition of the vehicle pertains to brake pressure of the vehicle, wherein the contextual information includes a likelihood of road surface near the vehicle being at least one of wet and icy, and wherein the vehicle event is related to potential collision conditions.
18. The method of claim 11, wherein the one or more sensors include a video camera, and wherein the information related to the one or more current operating conditions of the vehicle includes video information.
19. The method of claim 11, further comprising obtaining geographical information of the vehicle, and wherein detecting the vehicle event is further based on the geographical information of the vehicle.
20. A system configured to detect vehicle events, the system configured to couple with a vehicle, the system comprising:
one or more sensors configured to generate output signals conveying information related to a current speed of the vehicle;
electronic storage configured to store information, and
one or more processors configured to:
determine the current speed of the vehicle based on the generated output signals;
receive contextual information related to two or more current environmental conditions near the vehicle, wherein the two or more current environmental conditions include a traffic condition and a weather condition;
determine a first threshold value at which the current speed of the vehicle effectuates detection of vehicle events, wherein the first threshold value varies based on the traffic condition;
determine a second threshold value for the traffic condition, wherein the second threshold value varies based on the weather condition;
at the vehicle, detect a vehicle event based on:
(i) a first comparison of the current speed of the vehicle with the first threshold value, and
(ii) a second comparison of the traffic condition with the second threshold value;
create a vehicle event record based on the detected vehicle event, wherein the vehicle event record includes the received contextual information and the current speed of the vehicle;
store the vehicle event record in the electronic storage; and
effectuate presentation to a user of the system, via a user interface, of a notification based on the vehicle event record.
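Purely for illustration (not part of the claims), the following Python sketch gives one possible reading of the two comparisons recited in claim 20, and of the analogous limitations in claims 1 and 11: the speed threshold varies with the traffic condition, the traffic-condition threshold varies with the weather condition, and an event is detected here only when both comparisons are satisfied. All names and numeric values are hypothetical assumptions.

```python
# Illustrative sketch only: one minimal reading of the two comparisons in
# claim 20. All names and numeric values are hypothetical assumptions.
from dataclasses import dataclass

@dataclass
class Context:
    congestion: float      # traffic condition, 0.0 (none) .. 1.0 (gridlock)
    precipitation: float   # weather condition, 0.0 (dry) .. 1.0 (heavy)

def first_threshold_kph(context: Context) -> float:
    """Speed threshold; varies with the traffic condition."""
    return 110.0 - 40.0 * context.congestion

def second_threshold(context: Context) -> float:
    """Congestion threshold; varies with the weather condition
    (worse weather -> less congestion needed before speed is scrutinised)."""
    return 0.6 - 0.4 * context.precipitation

def detect_event(current_speed_kph: float, context: Context) -> bool:
    first_comparison = current_speed_kph > first_threshold_kph(context)
    second_comparison = context.congestion > second_threshold(context)
    return first_comparison and second_comparison

def make_event_record(current_speed_kph: float, context: Context) -> dict:
    """Hypothetical event record bundling the current speed and the context."""
    return {
        "event": "unsafe_speed_for_conditions",
        "speed_kph": current_speed_kph,
        "context": {"congestion": context.congestion,
                    "precipitation": context.precipitation},
    }

if __name__ == "__main__":
    ctx = Context(congestion=0.5, precipitation=0.8)
    if detect_event(95.0, ctx):
        record = make_event_record(95.0, ctx)   # stored, then used to notify
        print("notify user:", record)
```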
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/006,066 US9786104B2 (en) | 2016-01-25 | 2016-01-25 | Systems and method to trigger vehicle events based on contextual information |
US15/006,066 | 2016-01-25 | ||
PCT/IB2017/050397 WO2017130122A1 (en) | 2016-01-25 | 2017-01-25 | Systems and method to trigger vehicle events based on contextual information |
Publications (2)
Publication Number | Publication Date |
---|---|
CA3005402A1 CA3005402A1 (en) | 2017-08-03 |
CA3005402C true CA3005402C (en) | 2019-11-05 |
Family
ID=59359478
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CA3005402A Active CA3005402C (en) | 2016-01-25 | 2017-01-25 | Systems and method to trigger vehicle events based on contextual information |
Country Status (4)
Country | Link |
---|---|
US (3) | US9786104B2 (en) |
EP (1) | EP3408617A4 (en) |
CA (1) | CA3005402C (en) |
WO (1) | WO2017130122A1 (en) |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10109120B2 (en) * | 2016-10-25 | 2018-10-23 | International Business Machines Corporation | Predicting vehicular failures using autonomous collaborative comparisons to detect anomalies |
US11414050B2 (en) * | 2017-08-02 | 2022-08-16 | Ford Global Technologies, Llc | Multimode vehicle proximity security |
LU100760B1 (en) * | 2018-04-09 | 2019-10-11 | Motion S | Vehicular motion assessment method |
US11019084B2 (en) | 2018-12-14 | 2021-05-25 | Intel Corporation | Controller, a context broadcaster and an alert processing device |
FR3092302B1 (en) * | 2019-02-01 | 2021-01-08 | Continental Automotive | Road hazard detection device |
US11670286B2 (en) | 2019-12-31 | 2023-06-06 | Beijing Didi Infinity Technology And Development Co., Ltd. | Training mechanism of verbal harassment detection systems |
US20210201893A1 (en) * | 2019-12-31 | 2021-07-01 | Beijing Didi Infinity Technology And Development Co., Ltd. | Pattern-based adaptation model for detecting contact information requests in a vehicle |
US11664043B2 (en) | 2019-12-31 | 2023-05-30 | Beijing Didi Infinity Technology And Development Co., Ltd. | Real-time verbal harassment detection system |
US11620987B2 (en) | 2019-12-31 | 2023-04-04 | Beijing Didi Infinity Technology And Development Co., Ltd. | Generation of training data for verbal harassment detection |
US11562550B1 (en) * | 2021-10-06 | 2023-01-24 | Qualcomm Incorporated | Vehicle and mobile device interface for vehicle occupant assistance |
Family Cites Families (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5680123A (en) | 1996-08-06 | 1997-10-21 | Lee; Gul Nam | Vehicle monitoring system |
US20010018628A1 (en) * | 1997-03-27 | 2001-08-30 | Mentor Heavy Vehicle Systems, Lcc | System for monitoring vehicle efficiency and vehicle and driver perfomance |
US5978017A (en) | 1997-04-08 | 1999-11-02 | Tino; Jerald N. | Multi-camera video recording system for vehicles |
WO2002008057A1 (en) | 2000-07-20 | 2002-01-31 | Kapadia Viraf S | System and method for transportation vehicle monitoring, feedback and control |
JP4425495B2 (en) | 2001-06-08 | 2010-03-03 | 富士重工業株式会社 | Outside monitoring device |
US20030055557A1 (en) | 2001-09-20 | 2003-03-20 | International Business Machines Corporation | Method of calibrating a car alarm depending on the crime statistics of an area VIA intergration with road navigation display systems |
US6691032B1 (en) | 2002-09-09 | 2004-02-10 | Groundspeak, Inc. | System and method for executing user-definable events triggered through geolocational data describing zones of influence |
US7821421B2 (en) * | 2003-07-07 | 2010-10-26 | Sensomatix Ltd. | Traffic information system |
US20050185052A1 (en) | 2004-02-25 | 2005-08-25 | Raisinghani Vijay S. | Automatic collision triggered video system |
US7348895B2 (en) | 2004-11-03 | 2008-03-25 | Lagassey Paul J | Advanced automobile accident detection, data recordation and reporting system |
US7564348B2 (en) * | 2004-11-05 | 2009-07-21 | Wirelesswerx International, Inc. | Method and system to monitor movable entities |
US7536457B2 (en) | 2006-05-08 | 2009-05-19 | Drivecam, Inc. | System and method for wireless delivery of event data |
US9134133B2 (en) | 2008-05-30 | 2015-09-15 | Here Global B.V. | Data mining to identify locations of potentially hazardous conditions for vehicle operation and use thereof |
US20100045451A1 (en) | 2008-08-25 | 2010-02-25 | Neeraj Periwal | Speed reduction, alerting, and logging system |
JP4962449B2 (en) | 2008-08-28 | 2012-06-27 | 株式会社デンソー | Driving support system |
US8972147B2 (en) * | 2011-01-10 | 2015-03-03 | Bendix Commercial Vehicle Systems Llc | ACC and AM braking range variable based on internal and external factors |
US9395702B2 (en) * | 2011-07-20 | 2016-07-19 | Freescale Semiconductor, Inc. | Safety critical apparatus and method for controlling distraction of an operator of a safety critical apparatus |
US8996234B1 (en) * | 2011-10-11 | 2015-03-31 | Lytx, Inc. | Driver performance determination based on geolocation |
US9524269B1 (en) * | 2012-12-19 | 2016-12-20 | Allstate Insurance Company | Driving event data analysis |
US9751534B2 (en) | 2013-03-15 | 2017-09-05 | Honda Motor Co., Ltd. | System and method for responding to driver state |
KR101579098B1 (en) | 2014-05-23 | 2015-12-21 | 엘지전자 주식회사 | Stereo camera, driver assistance apparatus and Vehicle including the same |
US9242654B2 (en) | 2014-06-27 | 2016-01-26 | International Business Machines Corporation | Determining vehicle collision risk |
US10460600B2 (en) * | 2016-01-11 | 2019-10-29 | NetraDyne, Inc. | Driver behavior monitoring |
- 2016
  - 2016-01-25 US US15/006,066 patent/US9786104B2/en active Active
- 2017
  - 2017-01-25 CA CA3005402A patent/CA3005402C/en active Active
  - 2017-01-25 WO PCT/IB2017/050397 patent/WO2017130122A1/en active Application Filing
  - 2017-01-25 EP EP17743811.6A patent/EP3408617A4/en active Pending
  - 2017-10-06 US US15/727,227 patent/US10796504B2/en active Active
- 2020
  - 2020-06-03 US US16/891,486 patent/US11631287B2/en active Active
Also Published As
Publication number | Publication date |
---|---|
EP3408617A1 (en) | 2018-12-05 |
CA3005402A1 (en) | 2017-08-03 |
US10796504B2 (en) | 2020-10-06 |
WO2017130122A1 (en) | 2017-08-03 |
US11631287B2 (en) | 2023-04-18 |
US20200294334A1 (en) | 2020-09-17 |
EP3408617A4 (en) | 2019-08-21 |
US9786104B2 (en) | 2017-10-10 |
US20170213397A1 (en) | 2017-07-27 |
US20180033225A1 (en) | 2018-02-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11587374B2 (en) | Systems and methods for capturing and offloading different information based on event trigger type | |
US12002358B2 (en) | Systems and methods for using a distributed data center to create map data | |
US11631287B2 (en) | Systems and method to trigger vehicle events based on contextual information | |
US11734964B2 (en) | System and method to detect execution of driving maneuvers | |
US11776327B2 (en) | Systems and methods for querying fleet information stored in a distributed data center | |
US20240021081A1 (en) | Systems and methods for generating data describing physical surroundings of a vehicle |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| EEER | Examination request | Effective date: 20180802 |