WO2014106854A2 - Driving support - Google Patents
- Publication number
- WO2014106854A2 (PCT/IL2014/050017)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- velocity
- location
- contour
- bearing
- images
- Prior art date
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/0104—Measuring and analyzing of parameters relative to traffic conditions
- G08G1/0108—Measuring and analyzing of parameters relative to traffic conditions based on the source of data
- G08G1/0112—Measuring and analyzing of parameters relative to traffic conditions based on the source of data from the vehicle, e.g. floating car data [FCD]
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/22—Platooning, i.e. convoy of communicating vehicles
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Arrangement of adaptations of instruments
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/0104—Measuring and analyzing of parameters relative to traffic conditions
- G08G1/0108—Measuring and analyzing of parameters relative to traffic conditions based on the source of data
- G08G1/012—Measuring and analyzing of parameters relative to traffic conditions based on the source of data from other sources than vehicle or roadside beacons, e.g. mobile networks
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/0104—Measuring and analyzing of parameters relative to traffic conditions
- G08G1/0125—Traffic data processing
- G08G1/0129—Traffic data processing for creating historical data or processing based on historical data
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/0104—Measuring and analyzing of parameters relative to traffic conditions
- G08G1/0137—Measuring and analyzing of parameters relative to traffic conditions for specific applications
- G08G1/0141—Measuring and analyzing of parameters relative to traffic conditions for specific applications for traffic information dissemination
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/052—Detecting movement of traffic to be counted or controlled with provision for determining speed or overspeed
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/056—Detecting movement of traffic to be counted or controlled with provision for distinguishing direction of travel
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0967—Systems involving transmission of highway information, e.g. weather, speed limits
- G08G1/096708—Systems involving transmission of highway information, e.g. weather, speed limits where the received information might be used to generate an automatic action on the vehicle control
- G08G1/096716—Systems involving transmission of highway information, e.g. weather, speed limits where the received information might be used to generate an automatic action on the vehicle control where the received information does not generate an automatic action on the vehicle control
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0967—Systems involving transmission of highway information, e.g. weather, speed limits
- G08G1/096733—Systems involving transmission of highway information, e.g. weather, speed limits where a selection of the information might take place
- G08G1/096741—Systems involving transmission of highway information, e.g. weather, speed limits where a selection of the information might take place where the source of the transmitted information selects which information to transmit to each vehicle
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0967—Systems involving transmission of highway information, e.g. weather, speed limits
- G08G1/096766—Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission
- G08G1/096775—Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission where the origin of the information is a central station
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2310/00—Arrangements, adaptations or methods for cruise controls
- B60K2310/22—Displays for target speed
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
- B60W2050/146—Display means
Definitions
- the present invention, in some embodiments thereof, relates to driving support systems and methods and, more specifically, but not exclusively, to driving support systems and methods which are based on movement data analysis and/or visual and audible signal analysis.
- a method of calculating a commonly driven velocity recommendation comprising: gathering a plurality of data messages from a plurality of client devices located in a plurality of different vehicles, each data message comprising a current location value, a current bearing value, and a current velocity value estimated for a hosting vehicle; clustering the plurality of data messages into a plurality of clusters by matching the respective location values and bearing values; calculating a commonly driven velocity per cluster of the plurality of clusters by combining data from respective cluster members; and retrieving the commonly driven velocity in response to an indication of a current location and a current bearing which matches the location and bearing of members of a respective cluster.
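The gather/cluster/combine steps above can be sketched as follows. This is an illustrative reading of the claim, not the patent's implementation: the field names, the grid cell size (~0.001 degrees), the 10-degree bearing bins, and the choice of the median as the combining statistic are all assumptions.

```python
# Hypothetical sketch: cluster data messages by quantized location and
# bearing, then take the median velocity of each cluster as the
# "commonly driven velocity". Bucket sizes are illustrative only.
from collections import defaultdict
from statistics import median

def cluster_key(lat, lon, bearing, cell=0.001, bearing_bin=10):
    # Quantize location (~100 m cells) and bearing (10-degree bins)
    return (round(lat / cell), round(lon / cell), int(bearing // bearing_bin))

def commonly_driven_velocity(messages):
    clusters = defaultdict(list)
    for m in messages:  # each m: dict with lat, lon, bearing, velocity
        clusters[cluster_key(m["lat"], m["lon"], m["bearing"])].append(m["velocity"])
    return {key: median(v) for key, v in clusters.items()}

msgs = [
    {"lat": 32.0853, "lon": 34.7818, "bearing": 25, "velocity": 52},
    {"lat": 32.0854, "lon": 34.7819, "bearing": 27, "velocity": 48},
    {"lat": 32.0853, "lon": 34.7818, "bearing": 205, "velocity": 60},  # opposite direction
]
recommendations = commonly_driven_velocity(msgs)
```

Note how the third message, taken at the same location but on an opposite bearing, falls into its own cluster, matching the bearing-range distinction in the claim.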
- the clustering includes: filtering the plurality of data messages to remove unnecessary data.
- the retrieving is done to one of the plurality of client devices.
- a computer readable medium comprising computer executable instructions adapted to perform the method.
- a method of generating and updating a commonly driven velocity dataset comprising: hosting in a memory a dataset of a commonly driven velocity associated with a plurality of location points and bearing ranges; communicating with each of a plurality of velocity monitoring modules over a wireless connection to gather location, bearing and velocity data received from each of the plurality of velocity monitoring modules, each one of the plurality of velocity monitoring modules is hosted in one of a plurality of client devices; clustering the location, bearing and velocity data in clusters according to predefined parameters; processing the location, bearing and velocity data in the clusters to provide updated commonly driven velocity; and updating the commonly driven velocity associated with at least some of the plurality of location points and bearing ranges in the dataset according to the gathered location, bearing and velocity data.
- the method of also using the recommended velocity dataset further comprises: receiving a current location and bearing of a first client device; forwarding a local commonly driven velocity for a location point and bearing range corresponding with the current location, based on the updated dataset, to the first client device; and presenting the local commonly driven velocity to a user of the first client device.
- the predefined parameters include at least one of driver age, driver gender, vehicle type, recency, time of day and weather condition.
- the location, bearing and velocity data further include other data acquired by the one of a plurality of client devices.
- the dataset of a commonly driven velocity further includes for each location point and bearing range at least one of velocity distribution, mean velocity, average velocity and standard deviation of velocity.
- the client device is one of the plurality of client devices.
- a method of using a commonly driven velocity dataset comprising: receiving a current location and bearing of a client device; choosing a local commonly driven velocity associated with a location point and a bearing range corresponding with the current location and bearing, based on a dataset of commonly driven velocities; forwarding the local commonly driven velocity to the client device; and presenting the local commonly driven velocity to a user of the client device.
- the presenting includes an alert based on a current velocity of the client device and the local commonly driven velocity.
- the alert is one of audio alert, vibration and visual alert.
- the presenting includes an alert based on at least one of velocity distribution, mean velocity, average velocity and standard deviation of velocity.
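An alert based on the current velocity and the velocity statistics of the cluster could look like the sketch below. The band width (two standard deviations) and the mapping of conditions to alert types are illustrative assumptions, not taken from the patent text.

```python
# Illustrative alert rule: warn when the current velocity falls
# outside a band around the commonly driven mean velocity, scaled by
# its standard deviation. The k=2 band and alert names are assumptions.
def velocity_alert(current, mean, std, k=2.0):
    """Return an alert kind, or None when driving within the band."""
    if current > mean + k * std:
        return "audio"      # e.g. beep when well above the common speed
    if current < mean - k * std:
        return "visual"     # e.g. icon when well below the common speed
    return None

alert = velocity_alert(current=90, mean=60, std=5)  # well above the band
```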
- a system of generating a commonly driven velocity dataset comprising: a network node which manages a dataset of a commonly driven velocity associated with a plurality of location points and bearing ranges; and a plurality of velocity monitoring modules, each of which: is hosted in one of a plurality of client devices; records location, bearing and velocity data of the hosting client device; and forwards the location, bearing and velocity data over a wireless connection to the network node; wherein the network node: gathers the location, bearing and velocity data from each of the plurality of velocity monitoring modules; and updates the dataset according to the gathered location, bearing and velocity data.
- a method of generating a contour dataset comprising: acquiring a plurality of landscape images each associated with a respective capturing location; creating from each one of the plurality of landscape images one of a plurality of contour images by identifying and documenting a contour of at least one route element depicted thereby, each one of the plurality of contour images being associated with the respective capturing location; and updating in a memory a contour image dataset by correlating between each of the plurality of contour images and one of a plurality of locations along a plurality of routes documented in the contour image dataset according to the respective capturing location.
- the creating of one of a plurality of contour images includes using a plurality of landscape images, each acquired from a different source.
- the plurality of contour images includes three dimensional contour data.
- the method of also using the contour dataset further comprises: facilitating a retrieval of at least one of the plurality of contour images in response to a request indicative of a motion vector of one of a plurality of client devices along one of the plurality of routes. More optionally, the at least one of the plurality of contour images is compressed before the retrieval.
- the plurality of client devices are a plurality of mobile devices, each selected from a group consisting of a Smartphone, a tablet, and a wearable computing device.
- the plurality of client devices is a plurality of infotainment units of a plurality of vehicles.
- the acquiring comprises capturing the plurality of landscape images by using a plurality of image sensors each installed in one of a plurality of client devices.
- the acquiring further comprises associating each one of the plurality of landscape images with a respective capturing location.
- the associated capturing location is acquired from a respective client device from the plurality of client devices.
- the capturing comprises communicating the plurality of landscape images from each of the image recording modules installed in the plurality of client devices to a network node over a wireless connection, the network node performs the creating, the updating, and the facilitating.
- the network node gathers the plurality of landscape images from the plurality of image recording modules each with the respective capturing location.
- an output of a non imaging sensor is associated with each of at least some of the plurality of landscape images; the output and a respective associated landscape image from the plurality of landscape images are captured in a related time frame.
- the output is analyzed to identify a vehicle road interaction.
- the creating comprises adding an indication about the vehicle road interaction to be presented in association with a respective contour image.
- the output is an audio signal and the vehicle road interaction is selected from a group consisting of a bumper, lane marker having at least one trembling element, and a side wall.
- the plurality of contour images is rendered to be presented on an augmented reality display.
- the plurality of contour images is rendered to be projected on a windshield augmented reality display.
- the plurality of contour images is rendered to be displayed on a head-up display.
- the plurality of contour images is rendered to be embedded as a frame on a screen display.
- the plurality of landscape images is acquired from a geographical dataset.
- a method of using a contour dataset comprising: receiving a request indicative of a motion vector and location of a client device along one of a plurality of routes; locating at least one of a plurality of contour images corresponding with the motion vector based on a contour image dataset documenting the plurality of routes; and forwarding the at least one of a plurality of contour images to the client device.
- the forwarding further includes at least one landscape image stored in the contour image dataset.
- a system of generating a contour dataset comprising: a plurality of image recording modules, each of the plurality of image recording modules is: hosted in one of a plurality of client devices; instructs the capturing of some of a plurality of landscape images by using an image sensor of the hosting client device and associates each of the some of the plurality of landscape images with a respective capturing location; at least one contour image generation module which creates from each one of the plurality of landscape images one of a plurality of contour images by identifying and documenting a contour of at least one route element depicted thereby, each one of the plurality of contour images being associated with a respective the associated capturing location; a memory which stores a contour image dataset geographically mapping a plurality of routes; a contour image updating module which updates the contour image dataset by correlating between each of the plurality of landscape images and one of a plurality of locations along the plurality of routes documented in the contour image dataset according to the respective capturing location; and a communication module which
- a method of estimating a current revolutions per minute (RPM) measure of a motor of a vehicle comprising: recording an output of at least one accelerometer which is mechanically connected to a vehicle having a motor during a monitoring period; analyzing the output to generate an RPM of the motor; estimating a current RPM of the vehicle.
- RPM revolutions per minute
- the method further comprises: identifying vehicle events using the current RPM.
- the method further comprises: estimating driver behavior measures using the current RPM and additional driver information.
- the method further comprises: triggering a data acquisition module according to the current RPM.
- a system of estimating a current revolutions per minute (RPM) measure of a motor of a vehicle comprising: at least one accelerometer; at least one mount for mechanically connecting the at least one accelerometer to a vehicle having a motor during a monitoring period; a processor; and a module which: instructs the processor to generate an RPM of the motor by an analysis of an output of the at least one accelerometer during the monitoring period; and estimates a current RPM of the vehicle.
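One plausible way to generate an RPM estimate from an accelerometer output, as the claims describe, is to find the dominant spectral peak of the vibration signal. The sketch below is an assumption about the analysis, not the patent's method; the engine frequency band, sample rate, and the one-peak-per-revolution assumption are all illustrative.

```python
# Hedged sketch: estimate motor RPM from accelerometer vibration by
# locating the dominant FFT peak inside an assumed engine band.
import numpy as np

def estimate_rpm(samples, sample_rate_hz, min_hz=10.0, max_hz=120.0):
    """Return the dominant vibration frequency in the engine band, as RPM."""
    spectrum = np.abs(np.fft.rfft(samples - np.mean(samples)))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate_hz)
    band = (freqs >= min_hz) & (freqs <= max_hz)
    peak_hz = freqs[band][np.argmax(spectrum[band])]
    return peak_hz * 60.0  # assumes one dominant peak per revolution

# Synthetic vibration: 50 Hz (3000 RPM) plus noise, sampled at 400 Hz for 2 s
rng = np.random.default_rng(0)
t = np.arange(0, 2, 1 / 400)
signal = np.sin(2 * np.pi * 50 * t) + 0.1 * rng.standard_normal(len(t))
rpm = estimate_rpm(signal, 400)
```

Real engine vibration carries harmonics of the rotation frequency, so a production analysis would need harmonic disambiguation; this sketch only shows the spectral-peak idea.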
- Implementation of the method and/or system of embodiments of the invention can involve performing or completing selected tasks manually, automatically, or a combination thereof. Moreover, according to actual instrumentation and equipment of embodiments of the method and/or system of the invention, several selected tasks could be implemented by hardware, by software or by firmware or by a combination thereof using an operating system.
- selected tasks are optionally performed by a data processor, such as a computing platform for executing a plurality of instructions.
- the data processor includes a volatile memory for storing instructions and/or data and/or a non-volatile storage, for example, a magnetic hard-disk and/or removable media, for storing instructions and/or data.
- a network connection is provided as well.
- a display and/or a user input device such as a keyboard or mouse are optionally provided as well.
- FIG. 1 is a flowchart schematically representing a method of generating, updating and/or using a commonly driven velocity dataset and/or calculating a commonly driven velocity, according to some embodiments of the present invention
- FIG. 2 is a schematic illustration of a system of generating, updating and/or using a commonly driven velocity dataset, according to some embodiments of the present invention
- FIG. 3 is a flowchart schematically representing a method of generating, updating and/or using a contour dataset, according to some embodiments of the present invention
- FIG. 4 is a schematic illustration of a system of generating, updating and/or using a contour dataset, according to some embodiments of the present invention
- FIG. 5A is an exemplary landscape image taken at night, according to some embodiments of the present invention.
- FIG. 5B is FIG. 5A combined with an exemplary contour image created from the landscape images of the same location, according to some embodiments of the present invention.
- FIG. 6A is a schematic illustration of a front view of possible layouts of systems installed on a vehicle, according to some embodiments of the present invention.
- FIG. 6B is another schematic illustration of a front view of possible layouts of systems installed on a vehicle, according to some embodiments of the present invention.
- FIG. 7 is a flowchart schematically representing a method of estimating a current revolutions per minute (RPM) measure of a motor of a vehicle, according to some embodiments of the present invention.
- FIG. 8 is a schematic illustration of a system of estimating a current RPM measure of a motor of a vehicle, according to some embodiments of the present invention.
- the present invention, in some embodiments thereof, relates to driving support systems and methods and, more specifically, but not exclusively, to driving support systems and methods which are based on movement data analysis and/or visual and audible signal analysis.
- data is collected by client devices, such as Smartphones and infotainment devices, and analyzed to map driving support data and/or recommendations, which are presented to a user, optionally by the client devices, when he or she is located in a respective location.
- client devices may be mobile devices, such as a Smartphone, a mobile phone, a tablet and/or a wearable device such as a head-up display or a band.
- the client devices may also be infotainment units of vehicles and/or vehicle add-ons, such as a display incorporated in the rearview mirror.
- a dataset of a commonly driven velocity associated with location points and bearing ranges is hosted in a memory of a network node.
- Location, bearing and velocity data received from velocity monitoring modules running on client devices is gathered by the network node, either by pulling or pushing update sessions.
- the gathered data is clustered and processed, and the commonly driven velocity is updated according to the processed location, bearing and velocity data.
- a client device may then send its location and bearing and receive a recommended velocity for its current location point and bearing range, and/or for an estimated location where it will be shortly, according to the commonly driven velocities in the dataset, to be presented to a user such as a driver of a car.
- the recommended velocity is sent to the client device to allow alerting the user when he drives at a speed which is out of a range set around the recommended velocity estimated for his location.
- a contour image is created from one or more respective landscape images by identifying and documenting the contour of route element(s) depicted in the landscape image(s), such as lane markings, signs and/or curbs.
- a contour image dataset is updated by correlating between each contour images and a location along routes documented in the contour image dataset.
- a client device being in motion along one of the routes, sends a request and receives contour images according to its motion, for instance motion vector.
- This may help a user of the client device, such as a driver of a car, who is presented with the contour images, for instance on an augmented reality display such as a head-up display or the windshield, to better navigate using the contours when vision is not optimal, such as at night and/or in fog.
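The patent does not prescribe a specific contour-extraction algorithm; a simple gradient-magnitude edge detector is one way the "identifying and documenting a contour" step could work. The sketch below is an assumption to that effect, with an illustrative threshold.

```python
# Hedged sketch: derive a binary contour image from a grayscale
# landscape image via gradient magnitude (Sobel-style edge strength).
import numpy as np

def contour_image(gray, threshold=0.25):
    """gray: 2-D float array in [0, 1]; returns a binary contour mask."""
    gy, gx = np.gradient(gray.astype(float))   # per-axis intensity gradients
    magnitude = np.hypot(gx, gy)               # edge strength
    if magnitude.max() > 0:
        magnitude = magnitude / magnitude.max()
    return magnitude > threshold

# Toy landscape: dark road with one bright lane marking down column 4
img = np.zeros((8, 8))
img[:, 4] = 1.0
mask = contour_image(img)   # edges appear on both sides of the marking
```

The resulting mask keeps only route-element outlines, which is what would be overlaid on a head-up or windshield display at night.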
- RPM measurements are used to trigger data recording, for instance by a GPS module and/or an image sensor, and/or a data processing action, for example to identify driving parameters from image analysis and/or GPS data. For instance, an identification of an RPM change above or below a certain rate, which may be indicative of trying to avoid an accident, drowsiness and/or reckless driving, may trigger an image recording session. In such a manner, battery consumption may be reduced.
- aspects of the present invention may be embodied as a system, method or computer program product. Accordingly, aspects of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a "circuit," "module" or "system." Furthermore, aspects of the present invention may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied thereon.
- the computer readable medium may be a computer readable signal medium or a computer readable storage medium.
- a computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing.
- a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
- a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof.
- a computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
- Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
- Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages.
- the program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server.
- the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer, for example, through the Internet using an Internet Service Provider.
- These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
- FIG. 1 is a flowchart schematically representing a method of generating, updating and/or using a commonly driven velocity dataset and/or calculating a commonly driven velocity based on data gathered in real time from a plurality of client terminals, according to some embodiments of the present invention.
- FIG. 2 is a schematic illustration of a system of generating, updating and/or using a commonly driven velocity dataset, according to some embodiments of the present invention. The dataset is used to present a driver with a real-time recommended velocity according to his location.
- a dataset of a commonly driven velocity associated with location points and bearing ranges is hosted in a memory 202 of a network node 201.
- a bearing range is a range of driving directions, for example, 20-30 degrees relative to the north.
- Network node 201 may be, for example, a remote server computing device, a personal computing device, and/or any other device that has sufficient processor and storage abilities to perform the tasks required by the system.
- memory 202 contains data of location points, each representing a location, and each location point includes a commonly driven velocity.
- a commonly driven velocity is calculated for a group of location points.
- the location points data may include movement parameters of vehicles, for example, velocity distribution, standard deviation of velocity, average velocity, average acceleration, mean velocity and/or mean acceleration.
- the location points data may include road classification, such as highway, rural road, dirt road, country road, urban road, good/bad road condition, one way road or two way road and/or any other type of road.
- the location points data may include intersection classification, such as 3-way, T-junction, T-intersection, fork, 4-way and/or roundabout.
- the location points may also represent locations of interest where drivers have changed velocity and/or acceleration materially. Examples of locations that may cause these types of changes are places where drivers slow down, such as schools, speed traps, police posts, hospitals and/or crossroads.
- Wireless connection 204 may be, for example, a wireless local area network (WLAN) protocol such as Wireless Fidelity (WiFi™), a wireless personal area network (WPAN) protocol such as the Bluetooth™ protocol, and/or a cellular network.
- Each velocity monitoring module 203 records a current location value, current bearing value and current velocity value of a hosting client device 205, and the hosting client device 205 forwards a data message containing the values over a wireless connection to network node 201.
- Location, bearing and/or velocity data may be acquired, for example, by global positioning system (GPS), triangulation of cellular network, image recognition and/or any other locating method.
- Velocity data may also be acquired by vehicle controller area network (CAN) bus data and/or Wi-Fi networks.
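When CAN bus data is unavailable, a velocity monitoring module could derive speed from two successive GPS fixes using a great-circle distance over the elapsed time. This is one way to realize the velocity acquisition described above, not the patent's stated method; the fix format is an assumption.

```python
# Hedged sketch: speed from successive GPS fixes via haversine distance.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def speed_mps(fix_a, fix_b):
    """Each fix is (lat, lon, unix_time_s); returns metres per second."""
    d = haversine_m(fix_a[0], fix_a[1], fix_b[0], fix_b[1])
    return d / (fix_b[2] - fix_a[2])

# ~156 m northward over 10 s -> roughly 15.6 m/s
v = speed_mps((32.0853, 34.7818, 0.0), (32.0867, 34.7818, 10.0))
```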
- Additional data may be gathered by client devices 205, for example, compass azimuth, images, videos, time, date, weather and/or vehicle CAN bus data such as steering wheel angle, temperature, gas consumption and/or acceleration.
- the information may be acquired by components of client devices 205, for example, camera, GPS, gyroscope, accelerometer, air/gas sensor, vehicle CAN bus, storage, processor and/or clock.
- The information may also be acquired from other vehicle systems, such as the brakes system, light system, signal system, wipers system and/or any other system available on the vehicle.
- the information may also be acquired from external sources such as maps, other client devices 205, other mobile applications, cellular towers and/or external databases.
- driver metadata is also gathered.
- the driver metadata may include, for example, driver's gender, driver's age and/or vehicle type such as motorcycle or private truck.
- the driver metadata may be supplied by the driver and/or may be extracted from external sources such as social networks, means of data mining, insurance company data and/or the internet.
- location, bearing, velocity and/or other data is recorded by client devices 205 and saved locally.
- the gathered data is filtered to remove unnecessary data by filtering module 207.
- the filtering may be used to remove data points with non-physical values, for example, speeds greater than 200 miles per hour (mph) and/or locations positioned at sea.
- the filtering may also be used to select a certain part of the data for analysis, for example, data with certain speed values (e.g. 50-80mph); data which is in a specific location, area, city and/or country; data which is in a certain period of time such as the last hour, day and/or previous week; data which is in recurring periods of time such as 14:00-15:00 each Sunday, holidays and/or special events and/or any combination thereof.
- the filtering may also be performed based on driver metadata.
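As a non-limiting illustration of the filtering described above, a minimal Python sketch (thresholds, field names and the speed band are assumptions for the example):

```python
def is_physical(sample: dict, max_speed_mph: float = 200.0) -> bool:
    # Reject data points with non-physical values, e.g. speeds greater
    # than 200 mph or coordinates outside valid latitude/longitude ranges.
    if not (0.0 <= sample["velocity"] <= max_speed_mph):
        return False
    if not (-90.0 <= sample["latitude"] <= 90.0
            and -180.0 <= sample["longitude"] <= 180.0):
        return False
    return True

def filter_samples(samples, min_v=None, max_v=None):
    # Keep only physical samples, optionally restricted to a speed band
    # (e.g. 50-80 mph) selected for analysis.
    kept = [s for s in samples if is_physical(s)]
    if min_v is not None:
        kept = [s for s in kept if s["velocity"] >= min_v]
    if max_v is not None:
        kept = [s for s in kept if s["velocity"] <= max_v]
    return kept

samples = [
    {"latitude": 32.1, "longitude": 34.8, "velocity": 60.0},
    {"latitude": 32.1, "longitude": 34.8, "velocity": 250.0},  # non-physical
    {"latitude": 32.1, "longitude": 34.8, "velocity": 40.0},   # below the band
]
band = filter_samples(samples, min_v=50, max_v=80)
```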
- the gathered data from the data messages is clustered by clustering module 208 by matching the respective location values and bearing values.
- the clustering may be based on location, bearing, velocity, weather condition, vehicle type, driver metadata and/or any other gathered data or combination thereof.
- the clustering may be based on longitude and latitude ranges, for example, a cluster of longitude coordinate in the range 0-10 and latitude coordinate in the range 30-32 or a cluster of longitude coordinate in the range 12-15 and latitude coordinate in the range 30-32.
- the clusters may be based on bearing, for example, same bearing value, bearing in a certain range such as bearing direction of 20-30 degrees. The distinction between different bearing directions is due to the fact that different client devices 205 may travel in a specific location in different directions, for example on different roads, lanes and/or different direction of a junction.
- the clustering is based on predetermined periods of time, for example, the past day, week and/or month.
- the clustering is based on reoccurring time periods and/or events, for example, time of day, day/night, weekdays, weekends and/or seasons.
- the clustering is based on recency of the data so recent data may have more weight in calculating the commonly driven velocity.
- the clustering is repeated, in order to create sub-clusters to the previously created clusters.
- first clusters are computed based on latitude range 20-25 and longitude range 10-15 which may represent a wide area of 200 square kilometers and for each cluster, sub-clustering is done based on latitude range 20-22 and longitude range 10-12 which may represent an area of 40 square kilometers and the clustering process may be repeated to represent smaller areas.
- specific data may be clustered into more than one group, for example, latitude and/or longitude ranges of various clusters are overlapping.
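As a non-limiting illustration of clustering by location and bearing ranges, a Python sketch that quantizes coordinates into grid cells and bearings into bins; cell sizes and bin widths are assumed values:

```python
from collections import defaultdict

def cluster_key(sample, cell_deg=0.01, bearing_bin=10.0):
    # Quantize latitude/longitude to the nearest grid node and bearing
    # into fixed-width bins, so data points driven in the same place and
    # direction fall into the same cluster.
    lat_cell = round(sample["latitude"] / cell_deg)
    lon_cell = round(sample["longitude"] / cell_deg)
    brg_cell = int((sample["bearing"] % 360.0) // bearing_bin)
    return (lat_cell, lon_cell, brg_cell)

def cluster(samples, **kw):
    groups = defaultdict(list)
    for s in samples:
        groups[cluster_key(s, **kw)].append(s)
    return groups

samples = [
    {"latitude": 32.100, "longitude": 34.800, "bearing": 25.0, "velocity": 58.0},
    {"latitude": 32.101, "longitude": 34.801, "bearing": 28.0, "velocity": 62.0},
    {"latitude": 32.100, "longitude": 34.800, "bearing": 205.0, "velocity": 40.0},
]
groups = cluster(samples)  # the opposite-direction sample clusters separately
```

Sub-clustering, as described above, would amount to repeating the same grouping with a smaller `cell_deg`.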
- the gathered data in the clusters is processed by processing module 209 to calculate commonly driven velocity for each cluster represented by a location point.
- the processing is performed by combining data from the location, bearing and velocity values that are members of the cluster.
- the processing may include, for example, statistic calculation, Kalman filtering, noise filtering, convolution, distribution calculation, histogram calculation, estimation, correlation, neural networking, pattern recognition and/or any other mathematical process.
- the processing includes calculating a histogram and selecting a threshold based on predefined percentile of the distribution.
- the processing may be performed in real time and/or at a later time.
- the processing may be performed on client devices 205 and/or on network node 201.
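As a non-limiting illustration of selecting a threshold from a predefined percentile of the velocity distribution, a Python sketch (the percentile values and sample velocities are assumptions):

```python
def commonly_driven_velocity(velocities, percentile=0.5):
    # Sort the cluster's velocities and pick the value at the requested
    # percentile of the empirical distribution (the median by default).
    ordered = sorted(velocities)
    idx = min(int(percentile * len(ordered)), len(ordered) - 1)
    return ordered[idx]

cluster_velocities = [48, 50, 52, 51, 49, 53, 50, 70]  # 70 is an outlier
common = commonly_driven_velocity(cluster_velocities)           # median-like value
threshold = commonly_driven_velocity(cluster_velocities, 0.8)   # 80th-percentile threshold
```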
- the commonly driven velocity of at least some of the location points and bearing ranges is updated according to the gathered location, bearing and velocity data.
- a commonly driven velocity of a segment is initially 50mph, and is updated to 52mph.
- a temporary commonly driven velocity is updated, for example, when velocity is changed due to a temporary event, such as construction works on the road, an accident and/or a flooded road.
- the location points each representing a location and bearing are created and/or updated according to the clusters.
- the longitude and latitude of a location point may be calculated as an average of all the longitude and latitude values of the data points included in the cluster and/or by any other statistical method.
- more than one location point represents a location and bearing, and the location points are differentiated by other parameters according to the clustering parameters and/or filtering parameters.
- the commonly driven velocity of a location point and bearing range is updated according to the group of location points.
- other movement parameters of drivers are calculated for the location points, for example, average/median velocity or acceleration, speed limit, and/or any function of velocity, acceleration and/or both.
- interest location, road classification and/or intersection classification may be determined for a location point according to the movement parameters.
- the filtering, clustering and/or processing are repeated when new data is gathered.
- the data is filtered and clustered with existing data to create new clusters and/or update existing clusters.
- a recommended velocity dataset is used.
- a current location and a current bearing of a client device 206 are received by network node 201.
- Client device 206 may be one of client devices 205.
- the location may be acquired, for example, by GPS, cellular network, image recognition and/or any other locating method.
- a local commonly driven velocity for a location point and bearing range corresponding with the current location and current bearing based on the updated dataset is forwarded to client device 206.
- the commonly driven velocity is chosen from location points along that segment.
- the location points are chosen according to other parameters, such as bearing and/or driver metadata.
- the commonly driven velocity is chosen from data stored locally on client device 206.
- the commonly driven velocity is chosen for an estimated location where client device 206 will be in a short time, for example, 5, 10, 30 seconds, 1, 5, 10 minutes and/or any shorter, longer or intermediate time.
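As a non-limiting illustration of estimating where the device will be shortly, a Python sketch that projects the current location forward along the bearing; the flat-earth approximation and the numbers are assumptions suitable only for short horizons:

```python
import math

def predict_location(lat, lon, bearing_deg, velocity_mps, dt_s):
    # Dead-reckon the position dt_s seconds ahead along the current
    # bearing, using a flat-earth approximation around the current
    # latitude (adequate for a few seconds to minutes of travel).
    EARTH_R = 6371000.0  # mean Earth radius in meters
    d = velocity_mps * dt_s
    brg = math.radians(bearing_deg)
    dlat = d * math.cos(brg) / EARTH_R
    dlon = d * math.sin(brg) / (EARTH_R * math.cos(math.radians(lat)))
    return lat + math.degrees(dlat), lon + math.degrees(dlon)

# Where will the device be in 10 seconds, heading due north at 20 m/s?
lat2, lon2 = predict_location(32.0, 34.8, 0.0, 20.0, 10.0)
```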
- the local commonly driven velocity is presented to a user of client device 206.
- the local commonly driven velocity is presented on a display of client device 206, for example, on a mobile phone screen, tablet screen, infotainment screen, projected over vehicle windshield and/or vehicle dashboard, head-up display (HUD) and/or wearable computing device with a screen such as digital glasses.
- other data and/or parameters are forwarded to and/or presented by client device 206.
- the presentation may include, for example, a visual indication such as a graph, a map and/or numbers, an audio indication such as alert and/or recorded human voice and/or any other indication.
- the commonly driven velocity is calculated for all segments of a route or part of a route.
- a starting location and an ending location are received from the user of client device 206.
- the route is received from navigation system such as mobile phone app and/or a system that is part of a vehicle infotainment system.
- the route is determined by learning the user's behavior, for example, using the location and time of day to match a most probable route from previously collected data of the user of client device 206.
- parts of a route are determined in real time according to the driving.
- a next location point matching a future location of client device 206 is predicted according to the route.
- the next location point is predicted according to the current location and current bearing of client device 206.
- the commonly driven velocity is compared with the current velocity of client device 206.
- other data and/or parameters are also compared.
- an alert is presented to the user of client device 206 when the current velocity is higher, lower and/or related by any function to the commonly driven velocity and/or other movement parameters.
- an alert is presented when current velocity is higher than the 80th percentile of the distribution and/or when current velocity is higher than the average velocity plus some multiple of the standard deviation.
- the alert may be, for example, a visual alert, an audio alert, vibration and/or a combination thereof.
- an alert is presented when approaching a location of interest where drivers have changed velocity and/or acceleration materially.
- an alert is presented when approaching a location of interest and current velocity is substantially higher than the commonly driven velocity in the location.
- an alert is presented when current speed is out of a range set around the recommended velocity estimated for the location.
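As a non-limiting illustration of the mean-plus-standard-deviation alert condition mentioned above, a Python sketch (the multiplier and sample velocities are assumed):

```python
import statistics

def should_alert(current_v, cluster_velocities, k=2.0):
    # Alert when the current velocity exceeds the mean of the velocities
    # commonly driven at this location plus k standard deviations.
    mean = statistics.mean(cluster_velocities)
    sd = statistics.pstdev(cluster_velocities)
    return current_v > mean + k * sd

vels = [48, 50, 52, 51, 49, 53, 50, 47]
alert_fast = should_alert(70, vels)  # well above the common velocity
alert_ok = should_alert(51, vels)    # within the normal range
```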
- a visual representation is presented to the user of client device 206, such as a graph, bar and/or distribution of the velocities and/or accelerations.
- a visual representation contains a comparison between the commonly driven velocity and recent velocities of client device 206.
- FIG. 3 is a flowchart schematically representing a method of generating, updating and/or using a contour dataset based on data gathered from a plurality of client devices, according to some embodiments of the present invention.
- FIG. 4 is a schematic illustration of a system of generating, updating and/or using a contour dataset, according to some embodiments of the present invention.
- the contour images are presented to users for assisting them in navigation when vision conditions are poor, for example at night.
- landscape images are acquired.
- the images may be acquired, for example, by client devices 405 during motion and/or from an external source such as online databases.
- the landscape images are captured by using image sensors 408 of client devices 405, for example, a camera of a mobile phone and/or a video device of a vehicle.
- the capturing of the landscape images is instructed by image recording modules 403, each hosted in one of client devices 405.
- each one of the landscape images is associated with the respective capturing location, for example, by geo-tagging, indexing and/or embedding location metadata to each landscape image.
- the capturing location may be acquired by any location means as described above.
- the capturing location may be acquired, for example, by a GPS, triangulation of cellular network or any other method.
- output(s) of one or more non imaging sensor(s) are also associated with each of some of the landscape images.
- the output may be any kind of data, for example, sound, bearing, compass azimuth, images, videos, time, date, weather and/or vehicle CAN bus data as described above.
- the output and the respective associated landscape image are captured in a related time frame.
- the output is analyzed to identify a vehicle road interaction.
- the vehicle road interaction may be, for example, a bumper, a lane marker with trembling elements and/or a side wall.
- the vehicle road interaction data may, for example, aid in identification of contour elements when creating the contour images and/or be presented to the user by way of alert.
- output of non imaging sensors gathered at a location of interest may be combined with landscape images of the location of interest taken from a different location.
- a bumper identified by vibration caused to the vehicle when driving on the bumper may be used to identify the bumper in a landscape image taken before arriving to the bumper.
- output of non imaging sensors may be used to increase the accuracy of the capturing location, for example, by comparing to data acquired by GPS and/or known route structure on a map.
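As a non-limiting illustration of associating a capturing location and non-imaging sensor output with each landscape image, a Python sketch; the field names and the "speed_bump" label are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class TaggedImage:
    pixels: bytes          # raw or encoded image data
    latitude: float        # capturing location (e.g. from GPS)
    longitude: float
    bearing: float         # compass azimuth at capture time
    timestamp: int
    sensor_events: list = field(default_factory=list)

def tag_event(img: TaggedImage, label: str) -> TaggedImage:
    # Associate a non-imaging sensor observation captured in a related
    # time frame (e.g. vibration indicating a bump) with the image.
    img.sensor_events.append(label)
    return img

img = TaggedImage(b"...", 32.08, 34.78, 12.0, 1388966400)
tag_event(img, "speed_bump")
```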
- the landscape images are acquired from an external source, for example, a geographical dataset such as Open Street Map and/or Google mapsTM, images or videos from the internet such as Google street view and/or YouTubeTM, images or videos from security cameras, traffic cameras and/or other stationary or mobile cameras and/or any other source of road images or videos.
- the landscape images are communicated from each of image recording modules 403 to a network node 401 over a wireless connection 404.
- network node 401 gathers the landscape images from image recording modules 403, each with the associated respective capturing location.
- the landscape images are stored in a memory 402.
- a contour image is created from each landscape image, by at least one contour image generation module 409, by identifying and documenting a contour of at least one route element depicted in the landscape image.
- Each contour image is associated with the respective capturing location.
- the route elements may be, for example, lane marking position and orientation, lane marking type (continuous line, fragmented line, etc.), road and/or surroundings contour lines, bumpers, sharp turns, junctions, road crossings, traffic signs, traffic lights, curb stones, traffic islands, separation walls, fences and/or bridges (overhead).
- the contour images may be any kind of bitmap or vector image using any graphics file format, for example, portable network graphics (PNG), graphics interchange format (GIF), joint photographic experts group (JPEG), computer graphics metafile (CGM) and/or scalable vector graphics (SVG).
- the contour images only contain data of the contours that may be rendered to an image.
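By way of a crude, non-limiting illustration of identifying contours in a landscape image, a Python sketch that marks pixels with a strong intensity gradient; a real implementation would typically use an established edge detector (e.g., Canny), and the threshold and synthetic image here are assumptions:

```python
def contour_mask(gray, threshold=40):
    # Mark pixels whose horizontal or vertical intensity gradient
    # exceeds the threshold -- a crude stand-in for a real edge
    # detector applied to a grayscale landscape image.
    h, w = len(gray), len(gray[0])
    mask = [[0] * w for _ in range(h)]
    for y in range(h - 1):
        for x in range(w - 1):
            gx = abs(gray[y][x + 1] - gray[y][x])
            gy = abs(gray[y + 1][x] - gray[y][x])
            if max(gx, gy) > threshold:
                mask[y][x] = 1
    return mask

# Synthetic "road": dark asphalt (20) with one bright lane marking (200)
road = [[200 if x == 2 else 20 for x in range(5)] for _ in range(4)]
mask = contour_mask(road)  # edges appear on both sides of the marking
```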
- the landscape images may be compressed before being communicated.
- the compression may include any type of compression method, such as deflation, run-length encoding and/or transform coding.
- Reference is also made to FIG. 5A, which is an exemplary landscape image taken at night, and to FIG. 5B, which is FIG. 5A combined with an exemplary contour image created from landscape images of the same location.
- the contour images may be created in client devices 405 or in network node 401.
- a contour image dataset geographically mapping routes is stored in memory 402 and includes the contour images.
- each contour image is created from multiple landscape images. This may be performed by image processing algorithms, for example, by creating a contour of an object according to its mean location on different landscape images and/or by using landscape images of an object from different directions to generate an accurate contour.
- the landscape images for creating a contour image are originated from multiple client devices 405 and/or other sources and integrated to create the contour image by contour image generation module 409.
- the landscape images are captured at a proximate time.
- newer landscape images are used to generate the most updated contour image.
- the landscape images are chosen to be used in processing according to usability and/or visibility, for example, images taken in daylight and/or in good weather conditions, images with a clear view of the horizon, images with no objects in view (such as cars, pedestrians, trucks, etc.) and/or high resolution images.
- the contour images include a three dimensional contour data.
- the three dimensional contours may be constructed using multiple landscape images of the same location from different angles.
- vibrations caused by driving and measured by sensors may facilitate the capturing of landscape images of the same location from slightly different angles.
- a video by a camera taking 120 frames per second combined with vibrations data from an accelerometer taking 1000 samples per second may be used to construct three-dimensional contour data by calculating the differences between two frames relative to the displacement of the camera measured using the vibrations data.
- the three-dimensional contour data is constructed using landscape images of the same location taken by different devices, such as mobile devices of drivers and/or systems of different vehicles, by calculating the differences between two or more landscape images relative to the displacement of the devices as measured by location elements such as GPS.
- the three-dimensional contour data is presented to a user using stereoscopic methods, such as presenting two offset two-dimensional images, one for each of the user's eyes.
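The geometry underlying the three-dimensional construction above can be illustrated, in a non-limiting way, by classic two-view triangulation: two images of the same point taken from positions a known baseline apart (here, the camera displacement estimated from vibration/accelerometer or GPS data) yield depth from the pixel disparity. All numbers below are hypothetical:

```python
def depth_from_disparity(disparity_px, baseline_m, focal_px):
    # Two-view triangulation: for a camera with focal length f (pixels)
    # displaced by baseline B (meters) between two frames, a point that
    # shifts d pixels between the frames lies at depth Z = f * B / d.
    return focal_px * baseline_m / disparity_px

# Hypothetical numbers: 0.02 m displacement between frames, 1000 px
# focal length, and a road element shifting 4 px between the frames.
z = depth_from_disparity(4.0, 0.02, 1000.0)  # depth in meters
```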
- other data is extracted from the landscape images, such as orientation of a vehicle relative to the roadway and/or position of a vehicle in the lane.
- optimization on the contour images is performed to produce the best possible image quality.
- landscape images of proximate location and/or similar bearing are used for the optimization.
- the optimization may include, for example, statistical methods such as Kalman filter and/or statistical prediction and/or any kind of mathematical process as described above.
- irrelevant objects that appear in the landscape image are removed by image processing when creating the contour image, for example, cars, trucks and/or pedestrians.
- a contour image dataset is updated in memory 402 by a contour image updating module 407 by correlating between each of the contour images and a location along the routes documented in the contour image dataset according to the respective capturing location.
- contour image updating module 407 combines new contour images with contour images existing in the contour image dataset to create updated contour images. Optionally, this is performed by optimization methods, as described above.
- contour images correlated to one location are stored in the contour image dataset, for example, when intersecting roads are identified according to landscape images data such as bearing and/or when different lanes are identified.
- a contour image dataset is used.
- the retrieval of at least one of the contour images is facilitated and managed by a communication module 410, in response to a request indicative of a motion vector and location of a client device 406 along one of the routes.
- the motion vector includes bearing and velocity of client device 406.
- the motion vector of client device 406 may be a result of any kind of motion by a user of client device 406, for example, driving a car, riding a bicycle and/or walking.
- the location of client device 406 at the time of sending the request is acquired by a GPS of client device 406.
- the location is calculated in real time by comparing landscape images taken by client device 206 at the time of sending the request with contour images from the contour image dataset and finding the contour image most similar to the landscape images.
- the calculation includes other data related to the landscape images and/or the contour images, such as bearing.
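As a non-limiting illustration of finding the contour image most similar to a freshly captured one, a Python sketch that scores binary contour masks by overlap and returns the best-matching stored location; the masks, location labels and scoring rule are assumptions:

```python
def overlap_score(mask_a, mask_b):
    # Fraction of contour pixels on which two binary masks agree
    # (intersection over union of the marked pixels).
    match = total = 0
    for row_a, row_b in zip(mask_a, mask_b):
        for a, b in zip(row_a, row_b):
            if a or b:
                total += 1
                if a and b:
                    match += 1
    return match / total if total else 0.0

def localize(query_mask, dataset):
    # dataset maps a location label to its stored contour mask; return
    # the location whose contour is most similar to the query image.
    return max(dataset, key=lambda loc: overlap_score(query_mask, dataset[loc]))

query = [[1, 0], [1, 0]]
dataset = {
    "junction_12": [[1, 0], [1, 0]],
    "bridge_3": [[0, 1], [0, 1]],
}
best = localize(query, dataset)
```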
- the contour images may be compressed before being retrieved.
- the compression may include any type of compression method, as described for landscape images.
- landscape images may also be retrieved, for example, a landscape image taken earlier or in good light conditions and clear view.
- the contour image dataset is updated by a contour image updating module 407 according to new landscape images taken by client device 406 or any other source.
- the contour images are rendered to be presented on an augmented reality display, projected on a windshield augmented reality display and/or displayed on wearable equipment such as digital glasses.
- contour images are presented as an embedded frame on a screen display, such as a display of a mobile phone or an infotainment system.
- contour images are retrieved for all segments of a route or part of a route along the route map.
- the route may be determined in any method, as described above.
- the creating of the contour images, the updating of the contour image dataset, and/or the facilitating of the retrieval of contour images is performed by network node 401.
- FIG. 6A and FIG. 6B are schematic illustrations of a front view of possible layouts of systems installed on a vehicle, according to some embodiments of the present invention.
- the system may include a client device 601, such as a mobile phone or a tablet.
- Client device 601 may be held by vehicle mount 604 having an attachment unit 605 attached to windshield 602 or to dashboard 603 of the vehicle, such as vacuum cap or adhesive, an arm 606 connected to attachment unit 605 and a cradle 607 with side holders 608 that holds client device 601.
- Client device 601 may include cameras 609 and 610 for capturing landscape images.
- Client device 601 may also include a wireless communication module 611, such as an antenna.
- Client device 601 may be supplied with power by cable 612 from a vehicle lighter socket 613.
- the system may include a client device such as an infotainment 614.
- Information such as commonly driven velocity and/or contour images may be presented on a screen of client device 601 or on screen 615 of infotainment 614.
- the information may also be embedded as an image or video 616 on windshield 602, on a HUD 617 or a dashboard display 618.
- the presented information is based on data gathered in real time from a plurality of client devices such as client device 601 and analyzed to produce datasets of driver- assisting information.
- FIG. 7 is a flowchart schematically representing a method of estimating one or more current revolutions per minute (RPM) measure(s) of a motor of a vehicle, according to some embodiments of the present invention.
- FIG. 8 is a schematic illustration of a system of estimating a current RPM measure of a motor of a vehicle, according to some embodiments of the present invention.
- System 800 may include, for example, a mobile phone, an infotainment system, dedicated hardware of vehicle 802 and/or an on-board diagnostics (OBD) system. System 800 may be located on the outside or the inside of vehicle 802.
- accelerometers 801 are mechanically connected to a vehicle 802 by mounts 804, for example, vacuum caps.
- accelerometers 801 are attached to the windshield of vehicle 802 when accelerometers 801 are part of a mobile device.
- the output is analyzed to generate an RPM of motor 803 by processor 805 instructed by module 806.
- the RPM is generated according to accelerations caused by vibration of motor 803.
- the analysis may include any mathematical process as described above, such as Fourier transform.
- noise filtering and/or noise reduction are used to clear the output before analyzing.
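As a non-limiting illustration of generating an RPM estimate from vibration output, a Python sketch in which a simple zero-crossing count stands in for the Fourier analysis mentioned above; the signal, sample rate and motor frequency are all synthetic assumptions:

```python
import math

def estimate_rpm(samples, sample_rate_hz):
    # Count sign changes of the mean-removed vibration signal; each full
    # oscillation contributes two zero crossings, so the dominant
    # frequency is crossings / 2 / duration, and RPM = 60 * frequency.
    mean = sum(samples) / len(samples)
    centered = [s - mean for s in samples]
    crossings = sum(1 for a, b in zip(centered, centered[1:]) if a * b < 0)
    duration = len(samples) / sample_rate_hz
    return 60.0 * (crossings / 2.0) / duration

# Synthetic 30 Hz vibration (a motor turning at ~1800 RPM), 1 kHz sampling
fs = 1000
sig = [math.sin(2 * math.pi * 30 * t / fs) for t in range(fs)]
rpm = estimate_rpm(sig, fs)
```

Real accelerometer output would first be cleaned by the noise filtering mentioned above, since spurious sign changes inflate the crossing count.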
- a match between an RPM pattern and a reference pattern is identified.
- the reference patterns are stored in a memory 807.
- the matching may include any mathematical process as described above, such as statistical algorithms.
- other patterns may be detected, for example, current gear state and/or events such as gear shifting.
- a current RPM of vehicle 802 is estimated according to the generated RPM.
- RPM measurements are used to identify events, for example, accelerations, braking events and/or malfunctions.
- RPM measurements are correlated with other driver data to estimate driver behavior measures, for instance correlated with GPS data, accelerometer data and/or the like.
- RPM measurements are used to trigger data recording, for instance by a GPS module and/or an image and/or a data processing action, for example to identify driving parameters from image analysis and/or GPS data. For instance an identification of an RPM change above or below a certain rate, an RPM change which may be indicative of trying to avoid an accident and/or drowsiness and/or a reckless driving, may trigger an image recording session. In such a manner, data recording is only active when triggered, thus battery consumption may be reduced.
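As a non-limiting illustration of triggering data recording only on a rapid RPM change, a Python sketch; the threshold value and example figures are assumptions:

```python
def should_record(prev_rpm, curr_rpm, dt_s, rate_threshold=1500.0):
    # Trigger recording only when RPM changes faster than rate_threshold
    # RPM per second (e.g. hard acceleration or braking), so the camera
    # and GPS stay idle most of the time and battery consumption drops.
    rate = abs(curr_rpm - prev_rpm) / dt_s
    return rate > rate_threshold

calm = should_record(2000, 2050, 1.0)    # gentle change: stay idle
sudden = should_record(2000, 4000, 0.5)  # hard acceleration: record
```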
- the identifying of the match, the estimating and/or the triggering is performed by module 806.
- each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s).
- the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
- composition or method may include additional ingredients and/or steps, but only if the additional ingredients and/or steps do not materially alter the basic and novel characteristics of the claimed composition or method.
- a compound or “at least one compound” may include a plurality of compounds, including mixtures thereof.
- range format is merely for convenience and brevity and should not be construed as an inflexible limitation on the scope of the invention. Accordingly, the description of a range should be considered to have specifically disclosed all the possible subranges as well as individual numerical values within that range. For example, description of a range such as from 1 to 6 should be considered to have specifically disclosed subranges such as from 1 to 3, from 1 to 4, from 1 to 5, from 2 to 4, from 2 to 6, from 3 to 6 etc., as well as individual numbers within that range, for example, 1, 2, 3, 4, 5, and 6. This applies regardless of the breadth of the range.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015551254A JP6468563B2 (en) | 2013-01-06 | 2014-01-06 | Driving support |
CN201480012276.XA CN105074493B (en) | 2013-01-06 | 2014-01-06 | Drive support technology |
US14/759,260 US10083613B2 (en) | 2013-01-06 | 2014-01-06 | Driving support |
EP14735176.1A EP2941656B1 (en) | 2013-01-06 | 2014-01-06 | Driving support |
IL239803A IL239803B (en) | 2013-01-06 | 2015-07-06 | Driving support |
Applications Claiming Priority (10)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201361749340P | 2013-01-06 | 2013-01-06 | |
US61/749,340 | 2013-01-06 | ||
US201361753004P | 2013-01-16 | 2013-01-16 | |
US201361753008P | 2013-01-16 | 2013-01-16 | |
US61/753,008 | 2013-01-16 | ||
US61/753,004 | 2013-01-16 | ||
US201361760093P | 2013-02-03 | 2013-02-03 | |
US61/760,093 | 2013-02-03 | ||
US201361767329P | 2013-02-21 | 2013-02-21 | |
US61/767,329 | 2013-02-21 |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2014106854A2 true WO2014106854A2 (en) | 2014-07-10 |
WO2014106854A3 WO2014106854A3 (en) | 2014-09-12 |
Family
ID=51062530
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IL2014/050017 WO2014106854A2 (en) | 2013-01-06 | 2014-01-06 | Driving support |
Country Status (5)
Country | Link |
---|---|
US (1) | US10083613B2 (en) |
EP (1) | EP2941656B1 (en) |
JP (1) | JP6468563B2 (en) |
CN (1) | CN105074493B (en) |
WO (1) | WO2014106854A2 (en) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017091042A (en) * | 2015-11-05 | 2017-05-25 | 株式会社デンソー | Driving support transmitter, driving support receiver, and program |
US9820108B1 (en) | 2015-10-20 | 2017-11-14 | Allstate Insurance Company | Connected services configurator |
EP3272612A1 (en) * | 2016-07-15 | 2018-01-24 | Tata Consultancy Services Limited | Method and system for vehicle speed profile generation |
US10083613B2 (en) | 2013-01-06 | 2018-09-25 | Ionroad Technologies Ltd. | Driving support |
DE102018200134B3 (en) | 2018-01-08 | 2019-03-21 | Audi Ag | Method for acquiring training data for a driver assistance system, motor vehicle and server device |
CN110361024A (en) * | 2018-04-10 | 2019-10-22 | 丰田自动车株式会社 | Utilize the dynamic lane grade automobile navigation of vehicle group mark |
Families Citing this family (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105324267B (en) * | 2013-07-05 | 2018-08-07 | 歌乐株式会社 | Drive supporting device |
US9747506B2 (en) * | 2015-10-21 | 2017-08-29 | Ford Global Technologies, Llc | Perception-based speed limit estimation and learning |
WO2017147530A1 (en) * | 2016-02-25 | 2017-08-31 | Greenovations, Inc. | Automated mobile device onboard camera recording |
JP6376159B2 (en) * | 2016-03-15 | 2018-08-22 | オムロン株式会社 | Data flow control device and data flow control method |
US10451435B2 (en) * | 2016-06-03 | 2019-10-22 | Panasonic Automotive Systems Company of America, Division of Panasonic of North American | Method of using GPS map information to highlight road markings on a head up display that otherwise may be non-visible due to inclement weather |
US10168174B2 (en) * | 2017-05-09 | 2019-01-01 | Toyota Jidosha Kabushiki Kaisha | Augmented reality for vehicle lane guidance |
JP7014949B2 (en) * | 2017-06-15 | 2022-02-02 | 富士通株式会社 | Hazard calculation device, risk calculation method, and risk calculation program |
CN107323458B (en) * | 2017-06-15 | 2019-08-27 | 长安大学 | Family car driving assistance method based on driver's gender |
US10803665B1 (en) * | 2017-09-26 | 2020-10-13 | Amazon Technologies, Inc. | Data aggregation for augmented reality applications |
US10684627B2 (en) * | 2018-02-06 | 2020-06-16 | Ford Global Technologies, Llc | Accelerometer-based external sound monitoring for position aware autonomous parking |
CN110315973B (en) * | 2018-03-30 | 2022-01-07 | 比亚迪股份有限公司 | Vehicle-mounted display system, vehicle and control method of vehicle-mounted display system |
US11210936B2 (en) * | 2018-04-27 | 2021-12-28 | Cubic Corporation | Broadcasting details of objects at an intersection |
WO2020202451A1 (en) * | 2019-04-01 | 2020-10-08 | ヤマハ発動機株式会社 | Leaning vehicle traveling data analysis method, leaning vehicle traveling data analysis device, information processing method using analysis data, and information processing device using analysis data |
US10685248B1 (en) * | 2019-05-30 | 2020-06-16 | Moj.Io, Inc. | Computing system with driver behavior detection mechanism and method of operation thereof |
US11479268B2 (en) | 2020-07-30 | 2022-10-25 | Toyota Motor Engineering & Manufacturing North America, Inc. | Autonomous control of vehicle driving modes in certain traffic situations |
DE102021201063A1 (en) | 2021-02-04 | 2022-08-04 | Volkswagen Aktiengesellschaft | Method for operating a system for an at least partially assisted motor vehicle, computer program product and system |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100094532A1 (en) | 2003-05-09 | 2010-04-15 | Dimitri Vorona | System for transmitting, processing, receiving, and displaying traffic information |
Family Cites Families (27)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3285597B2 (en) | 1991-08-13 | 2002-05-27 | 松下電器産業株式会社 | Navigation device |
US5708427A (en) * | 1996-04-18 | 1998-01-13 | Bush; E. William | Vehicle in-lane positional indication/control by phase detection of RF signals induced in completely-passive resonant-loop circuits buried along a road lane |
US6680694B1 (en) * | 1997-08-19 | 2004-01-20 | Siemens Vdo Automotive Corporation | Vehicle information system |
US6067031A (en) | 1997-12-18 | 2000-05-23 | Trimble Navigation Limited | Dynamic monitoring of vehicle separation |
US20100262489A1 (en) * | 2002-12-13 | 2010-10-14 | Robert Salinas | Mobile enabled advertising and marketing methods for computer games, simulations, demonstrations, and the like |
US20070262860A1 (en) * | 2006-04-23 | 2007-11-15 | Robert Salinas | Distribution of Targeted Messages and the Serving, Collecting, Managing, and Analyzing and Reporting of Information relating to Mobile and other Electronic Devices |
US7389178B2 (en) * | 2003-12-11 | 2008-06-17 | Greenroad Driving Technologies Ltd. | System and method for vehicle driver behavior analysis and evaluation |
JP2006038558A (en) | 2004-07-26 | 2006-02-09 | Denso Corp | Car navigation system |
JP4259430B2 (en) * | 2004-08-24 | 2009-04-30 | Denso Corporation | Vehicle information transmitting apparatus and vehicle |
US7469827B2 (en) | 2005-11-17 | 2008-12-30 | Google Inc. | Vehicle information systems and methods |
JP4788426B2 (en) * | 2006-03-23 | 2011-10-05 | Denso Corporation | Vehicle display system |
JP4762026B2 (en) * | 2006-03-29 | 2011-08-31 | DE Tech Co., Ltd. | Road sign database construction device |
DE102006062061B4 (en) * | 2006-12-29 | 2010-06-10 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Apparatus, method and computer program for determining a position based on a camera image from a camera |
JP2009059258A (en) * | 2007-09-03 | 2009-03-19 | Alpine Electronics Inc | Obstacle approach warning apparatus |
US8315786B2 (en) | 2008-06-27 | 2012-11-20 | Microsoft Corporation | Local decision policies about the sharing of sensed data that enhance privacy and lower communication costs for services that aggregate data from personal devices |
US8494496B2 (en) | 2009-11-13 | 2013-07-23 | At&T Mobility Ii Llc | System and method for using cellular network components to derive traffic information |
JP2011217116A (en) | 2010-03-31 | 2011-10-27 | Mitsui Sumitomo Insurance Co Ltd | Reproducing device, imaging reproducing system, imaging reproducing method, reproducing program and imaging program |
ES2424397B1 (en) | 2010-07-28 | 2014-09-12 | Traffic Network Solutions, S.L. | METHOD AND SYSTEM FOR MONITORING VEHICLE TRAFFIC |
GB2483094A (en) * | 2010-08-26 | 2012-02-29 | Sivapathalingham Sivavakeesar | Taxi location and availability reporting system |
US20120053754A1 (en) | 2010-08-31 | 2012-03-01 | Karen Pease | Electronic communications and control module |
US8494759B2 (en) * | 2010-09-08 | 2013-07-23 | Toyota Motor Engineering & Manufacturing North America, Inc. | Vehicle speed indication using vehicle-infrastructure wireless communication |
EP2469230A1 (en) | 2010-12-23 | 2012-06-27 | Research In Motion Limited | Updating map data from camera images |
JP2012137320A (en) | 2010-12-24 | 2012-07-19 | Pioneer Electronic Corp | Guidance apparatus, guidance method, guidance program and recording medium |
JP5331146B2 (en) * | 2011-03-22 | 2013-10-30 | Toshiba Corporation | Monocular head mounted display |
JP2012256138A (en) | 2011-06-08 | 2012-12-27 | Daihatsu Motor Co Ltd | Portable terminal device and driving evaluation system having the same |
GB201205125D0 (en) | 2012-02-08 | 2012-05-09 | Tomtom Int Bv | Methods using speed distribution profiles |
CN105074493B (en) | 2013-01-06 | 2018-09-25 | Ionroad Technologies Ltd. | Driving support technology |
- 2014
  - 2014-01-06 CN CN201480012276.XA patent/CN105074493B/en active Active
  - 2014-01-06 EP EP14735176.1A patent/EP2941656B1/en active Active
  - 2014-01-06 US US14/759,260 patent/US10083613B2/en active Active
  - 2014-01-06 WO PCT/IL2014/050017 patent/WO2014106854A2/en active Application Filing
  - 2014-01-06 JP JP2015551254A patent/JP6468563B2/en active Active
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10083613B2 (en) | 2013-01-06 | 2018-09-25 | Ionroad Technologies Ltd. | Driving support |
US9820108B1 (en) | 2015-10-20 | 2017-11-14 | Allstate Insurance Company | Connected services configurator |
US10038986B1 (en) | 2015-10-20 | 2018-07-31 | Allstate Insurance Company | Connected services configurator |
US10306431B1 (en) | 2015-10-20 | 2019-05-28 | Allstate Insurance Company | Connected services configurator for connecting a mobile device to applications to perform tasks |
US10567935B1 (en) | 2015-10-20 | 2020-02-18 | Allstate Insurance Company | Connected services configuration for connecting a mobile device to applications to perform tasks |
US10917752B1 (en) | 2015-10-20 | 2021-02-09 | Allstate Insurance Company | Connected services configurator |
JP2017091042A (en) * | 2015-11-05 | 2017-05-25 | Denso Corporation | Driving support transmitter, driving support receiver, and program |
EP3272612A1 (en) * | 2016-07-15 | 2018-01-24 | Tata Consultancy Services Limited | Method and system for vehicle speed profile generation |
DE102018200134B3 (en) | 2018-01-08 | 2019-03-21 | Audi Ag | Method for acquiring training data for a driver assistance system, motor vehicle and server device |
CN110361024A (en) * | 2018-04-10 | 2019-10-22 | Toyota Motor Corporation | Dynamic lane-level vehicle navigation using vehicle group identification |
CN110361024B (en) * | 2018-04-10 | 2023-12-08 | 丰田自动车株式会社 | Method and system for dynamic lane-level vehicle navigation with vehicle group identification |
Also Published As
Publication number | Publication date |
---|---|
CN105074493A (en) | 2015-11-18 |
CN105074493B (en) | 2018-09-25 |
JP6468563B2 (en) | 2019-02-13 |
JP2016511860A (en) | 2016-04-21 |
EP2941656A4 (en) | 2016-10-12 |
EP2941656A2 (en) | 2015-11-11 |
US20150356872A1 (en) | 2015-12-10 |
EP2941656B1 (en) | 2022-10-26 |
WO2014106854A3 (en) | 2014-09-12 |
US10083613B2 (en) | 2018-09-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10083613B2 (en) | Driving support | |
US11946748B1 (en) | Automated vehicle control and guidance based on real-time blind corner navigational analysis | |
JP6414255B2 (en) | Semantic representation of road scene situations for understanding and sharing of road scene situations | |
US11175152B2 (en) | Method and system for risk determination of a route | |
US10782138B2 (en) | Method, apparatus, and computer program product for pedestrian behavior profile generation | |
CN106662458B (en) | Wearable sensor data for improving map and navigation data | |
JP2019525185A (en) | Method and apparatus for providing goal-oriented navigation instructions | |
JP2021515286A (en) | Systems and methods for anonymizing navigation information | |
CN105489005B (en) | A kind of acquisition of path formation index and the method and system shared | |
CN110945320B (en) | Vehicle positioning method and system | |
EP3671688A1 (en) | Methods and systems for autonomous vehicle navigation | |
JP2021525370A (en) | Enhanced navigation instructions with landmarks under difficult driving conditions | |
US20220057218A1 (en) | Method and apparatus for automatic generation of context-based guidance information from behavior and context-based machine learning models | |
US20220319336A1 (en) | Method and apparatus for estimating false positive reports of detectable road events | |
US20220207994A1 (en) | Methods and systems for predicting road closure in a region | |
US11691646B2 (en) | Method and apparatus for generating a flood event warning for a flood prone location | |
US20200372727A1 (en) | Methods and systems for emergency event management | |
US20220203973A1 (en) | Methods and systems for generating navigation information in a region | |
EP3993453A1 (en) | Method, apparatus, and computer program product for anonymizing trajectories including endogenous events | |
US20230073956A1 (en) | Systems and methods for evaluating user reviews | |
US20240144812A1 (en) | System and method for verification of traffic incidents | |
US20220290995A1 (en) | System and method for validating road object data |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase | Ref document number: 201480012276.X; Country of ref document: CN |
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 14735176; Country of ref document: EP; Kind code of ref document: A2 |
ENP | Entry into the national phase | Ref document number: 2015551254; Country of ref document: JP; Kind code of ref document: A |
WWE | Wipo information: entry into national phase | Ref document number: 239803; Country of ref document: IL; Ref document number: 14759260; Country of ref document: US |
WWE | Wipo information: entry into national phase | Ref document number: 2014735176; Country of ref document: EP |