DE102016208239A1 - Method for determining dynamic 3D map data, method for determining analysis data, device, computer program and computer program product - Google Patents

Method for determining dynamic 3D map data, method for determining analysis data, device, computer program and computer program product

Info

Publication number
DE102016208239A1
Authority
DE
Germany
Prior art keywords
data
vehicle
3d map
position
dynamic 3d
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
DE102016208239.6A
Other languages
German (de)
Inventor
Jürgen Leimbach
Albert Kos
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Continental Automotive GmbH
Original Assignee
Continental Automotive GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Continental Automotive GmbH filed Critical Continental Automotive GmbH
Priority to DE102016208239.6A
Publication of DE102016208239A1
Application status: Ceased

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in preceding groups G01C1/00-G01C19/00
    • G01C21/26: Navigation; Navigational instruments not provided for in preceding groups G01C1/00-G01C19/00 specially adapted for navigation in a road network
    • G01C21/28: Navigation; Navigational instruments not provided for in preceding groups G01C1/00-G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G01C21/30: Map- or contour-matching
    • G01C21/32: Structuring or formatting of map data
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING; COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20: Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/29: Geographical information databases
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING; COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90: Details of database functions independent of the retrieved data types
    • G06F16/95: Retrieval from the web
    • G06F16/953: Querying, e.g. by the use of web search engines
    • G06F16/9537: Spatial or temporal dependent retrieval, e.g. spatiotemporal queries
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING; COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00: Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/05: Geographic models
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B29/00: Maps; Plans; Charts; Diagrams, e.g. route diagram
    • G09B29/003: Maps

Abstract

In a method for determining dynamic 3D map data, position data of a vehicle are provided which are representative of a position of the vehicle. Environment data of the vehicle are provided which are representative of environment information determined by means of vehicle sensors of the vehicle. A static map is provided. The dynamic 3D map data are determined as a function of the position data, the environment data and the static map.

Description

  • The invention relates to a method for determining dynamic 3D map data. The invention further relates to a method for determining analysis data. The invention further relates to a device for determining dynamic 3D map data and/or for determining analysis data. The invention further relates to a computer program and a computer program product.
  • Modern vehicles have a large number of sensors, which are mainly used for safety functions and driver assistance systems.
  • US 7860994 B2 and US 8041829 B2 disclose a system and a method for remote data acquisition and distribution.
  • US 20090231432 A1 discloses a method of determining which video stream to display on a video display.
  • WO 2014114751 A1 discloses a traffic surveillance and guidance system.
  • The object underlying the invention is to contribute to making versatile use of information gathered by vehicle sensors.
  • The object is achieved by the features of the independent claims. Advantageous embodiments are characterized in the dependent claims.
  • According to a first aspect, the invention is characterized by a method for determining dynamic 3D map data. In the method, position data of a vehicle are provided which are representative of a position of the vehicle. Environment data of the vehicle are provided which are representative of environment information determined by means of vehicle sensors of the vehicle. A static map is provided. The dynamic 3D map data are determined as a function of the position data, the environment data and the static map.
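  • A minimal Python sketch of this data flow is given below. The structures PositionData and EnvironmentData and the function determine_dynamic_3d_map_data are illustrative assumptions, not part of the application; the sketch only shows how the three inputs could be combined into dynamic 3D map data.

```python
from dataclasses import dataclass, field
from typing import Any, Dict, List, Optional


@dataclass
class PositionData:
    """Position report of one vehicle (hypothetical structure)."""
    latitude: float
    longitude: float
    heading_deg: Optional[float] = None   # optional direction information
    timestamp: Optional[float] = None     # optional time information (UNIX seconds)


@dataclass
class EnvironmentData:
    """Sensor observations of one vehicle (hypothetical structure)."""
    images_2d: List[Any] = field(default_factory=list)   # 2D camera images
    points_3d: List[Any] = field(default_factory=list)   # stereo / radar / lidar data
    timestamp: Optional[float] = None


def determine_dynamic_3d_map_data(position: PositionData,
                                  environment: EnvironmentData,
                                  static_map: Dict[str, Any]) -> Dict[str, Any]:
    """Placeholder combination of the three inputs into dynamic 3D map data."""
    return {
        "static_layer": static_map,
        "dynamic_layer": {
            "position": (position.latitude, position.longitude),
            "heading_deg": position.heading_deg,
            "observations": environment.points_3d or environment.images_2d,
            "timestamp": environment.timestamp or position.timestamp,
        },
    }
```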
  • The position data may additionally comprise time information and direction information, that is to say in particular information about the time at which the position of the vehicle was determined and information about the direction in which the vehicle is traveling.
  • The environment data include, for example, 2D image data of a single camera, 3D image data of a stereo camera and/or abstract 3D image data of a radar and/or lidar sensor. The image data may additionally comprise time information, that is to say in particular information about the time at which the image data were acquired.
  • In particular, position data and environment data of a plurality of vehicles are provided for determining the dynamic 3D map data.
  • By means of the dynamic 3D map data, for example, a dynamic 3D map can be generated and/or visualized. Furthermore, by means of the dynamic 3D map data, a dynamic video can be reproduced from a predetermined position, similar to a webcam. The dynamic 3D map data are therefore in particular representative of a 3D model. In this way, the information collected by vehicle sensors is used in many ways.
  • According to an optional embodiment, the position data and the environment data are checked for plausibility, and depending on the check, the dynamic 3D map data are determined.
  • Thus, only data are used that meet a given plausibility criterion. This increases the quality of dynamic 3D map data.
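  • As an illustration, one possible plausibility criterion is sketched below, reusing the data structures assumed in the sketch above: the coordinates must lie in valid ranges, the position and environment timestamps must not diverge by more than a tolerance, and at least one sensor observation must be present. The concrete criteria and the 5 s tolerance are assumptions, not prescribed by the application.

```python
def is_plausible(position, environment, max_time_skew_s=5.0):
    """Illustrative plausibility check for a PositionData / EnvironmentData pair;
    the criteria and the tolerance are assumptions."""
    if not (-90.0 <= position.latitude <= 90.0):
        return False                                  # invalid latitude
    if not (-180.0 <= position.longitude <= 180.0):
        return False                                  # invalid longitude
    if position.timestamp is not None and environment.timestamp is not None:
        if abs(position.timestamp - environment.timestamp) > max_time_skew_s:
            return False                              # inconsistent timestamps
    # At least one kind of sensor observation must be present.
    return bool(environment.images_2d or environment.points_3d)
```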
  • According to a further optional embodiment, a position and/or direction of the environment data is determined as a function of the position data, the environment data and the static map in order to determine the dynamic 3D map data.
  • Thus, a more accurate 3D map can be created. Optionally, additional older dynamic 3D map data can be used to determine the position and / or direction of the environment data.
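  • One common way to sketch such a position and direction determination is to snap the reported position onto the nearest road segment of the static map and to derive the direction from that segment's bearing. The following Python sketch is a simplified stand-in that uses only the reported position and the static map in flat, local coordinates; matching the environment data (for example camera images) or older dynamic 3D map data is left out, and the whole approach is an assumption rather than the claimed procedure.

```python
import math
from typing import Sequence, Tuple

Point = Tuple[float, float]      # (x, y) in a local metric coordinate frame
Segment = Tuple[Point, Point]    # straight road segment of the static map


def snap_to_static_map(raw: Point, segments: Sequence[Segment]) -> Tuple[Point, float]:
    """Return the closest point on the static map and that segment's bearing in degrees."""
    best_point, best_bearing, best_dist = raw, 0.0, float("inf")
    for a, b in segments:
        ax, ay = a
        bx, by = b
        seg_len_sq = (bx - ax) ** 2 + (by - ay) ** 2
        t = 0.0
        if seg_len_sq > 0.0:
            # Project the raw position onto the segment and clamp to its ends.
            t = ((raw[0] - ax) * (bx - ax) + (raw[1] - ay) * (by - ay)) / seg_len_sq
            t = max(0.0, min(1.0, t))
        proj = (ax + t * (bx - ax), ay + t * (by - ay))
        dist = math.hypot(raw[0] - proj[0], raw[1] - proj[1])
        if dist < best_dist:
            best_point, best_dist = proj, dist
            best_bearing = math.degrees(math.atan2(by - ay, bx - ax)) % 360.0
    return best_point, best_bearing
```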
  • According to a further optional embodiment, anonymization of the environmental data is performed, and depending on the anonymized environmental data, the dynamic 3D map data are determined.
  • This makes it possible, for example, to make license plates and / or faces unrecognizable, so that the data can be used in many different applications.
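  • A minimal anonymization sketch is given below: detected face or license-plate regions of a camera frame are pixelated. The detector is passed in as a caller-supplied function, since the application does not specify how faces or license plates are found; both the detector and the block size are assumptions.

```python
from typing import Callable, List, Tuple

import numpy as np

Box = Tuple[int, int, int, int]   # (x, y, width, height) in pixels


def anonymize_frame(frame: np.ndarray,
                    detect_regions: Callable[[np.ndarray], List[Box]],
                    block: int = 16) -> np.ndarray:
    """Pixelate detected faces / license plates in a camera frame (sketch only)."""
    out = frame.copy()
    for x, y, w, h in detect_regions(frame):
        roi = out[y:y + h, x:x + w]
        # Subsample the region and blow it up again to remove identifying detail.
        small = roi[::block, ::block]
        out[y:y + h, x:x + w] = np.repeat(np.repeat(small, block, axis=0),
                                          block, axis=1)[:h, :w]
    return out
```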
  • According to a further optional embodiment, the dynamic 3D map data and/or the environment data are analyzed for predetermined traffic influences and, depending on the analysis, the dynamic 3D map data are determined anew.
  • The predetermined traffic influences include, for example, a traffic jam, the end of a traffic jam, a construction site, an accident, a danger spot, a traffic environment and/or a road user, such as a car, a bicycle or a truck, and/or weather information, such as rain, ice, fog and the like.
  • The analysis data can be used, for example, to display or mark the determined traffic influences in the dynamic 3D map data.
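  • The sketch below illustrates how determined traffic influences could be attached to the dynamic 3D map data as annotations for later display or marking. The TrafficInfluence structure, the confidence threshold and the category names are assumptions; how the influences themselves are detected is left open here.

```python
from dataclasses import dataclass
from typing import Any, Dict, List


@dataclass
class TrafficInfluence:
    """One detected traffic influence (illustrative structure)."""
    kind: str            # e.g. "traffic_jam", "jam_end", "construction_site", "accident", "rain"
    latitude: float
    longitude: float
    confidence: float    # 0..1, from whatever classifier produced it


def annotate_map(dynamic_map: Dict[str, Any],
                 influences: List[TrafficInfluence],
                 min_confidence: float = 0.5) -> Dict[str, Any]:
    """Attach sufficiently confident traffic influences to the dynamic 3D map data
    so they can later be displayed or marked; the threshold is an assumption."""
    marked = dict(dynamic_map)
    marked["traffic_influences"] = [
        {"kind": i.kind,
         "position": (i.latitude, i.longitude),
         "confidence": i.confidence}
        for i in influences
        if i.confidence >= min_confidence
    ]
    return marked
```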
  • According to a further aspect, the invention is characterized by a method for determining analysis data. In the method, position data of a vehicle are provided which are representative of a position of the vehicle. Environment data of the vehicle are provided which are representative of environment information determined by means of vehicle sensors of the vehicle. Depending on the position data and the environment data, the environment data are analyzed for predetermined traffic influences. Analysis data are determined which include information of the analysis of the traffic influences.
  • The position data, the environment data and the predetermined traffic influences correspond in particular to the position data, the environment data and the predetermined traffic influences which were described in the method according to the first aspect. In particular, position data and environment data of a plurality of vehicles are provided for determining the analysis data.
  • The analysis data can be used, for example, to display or mark the determined traffic influences in a 3D map. Alternatively or additionally, the analysis data may be provided for other traffic services. In this way, the information collected by vehicle sensors is used in many ways.
  • According to an optional embodiment, the method according to the first aspect is performed. The dynamic 3D map data is analyzed for given traffic influences. The analysis data is determined, which includes information of the analysis of the traffic influences.
  • Since the analysis data can be used to display or mark the determined traffic influences in the dynamic 3D map data, it is particularly advantageous to combine the method according to the first aspect with the method for determining analysis data. In particular, the traffic influences can be determined very accurately by means of the 3D map data. A combination of the method for determining analysis data and the method for determining dynamic 3D map data is therefore particularly advantageous, since, for example, the traffic influences can first be analyzed and then visualized in a 3D map.
  • According to a further aspect, the invention is characterized by a device for determining dynamic 3D map data and/or for determining analysis data, wherein the device is designed to execute the method for determining dynamic 3D map data and/or the method for determining analysis data.
  • According to a further aspect, the invention is characterized by a computer program, wherein the computer program is designed to carry out the method for determining dynamic 3D map data and/or the method for determining analysis data.
  • According to a further aspect, the invention is characterized by a computer program product comprising executable program code, wherein the program code, when executed by a data processing device, carries out the method for determining dynamic 3D map data and/or the method for determining analysis data.
  • In particular, the computer program product comprises a medium which can be read by the data processing device and on which the program code is stored.
  • Embodiments of the invention are explained in more detail below with reference to the schematic drawings, in which:
  • FIG. 1 shows a communication of a vehicle,
  • FIG. 2 shows a flow chart for determining dynamic 3D map data,
  • FIG. 3 shows a flow chart for determining analysis data,
  • FIG. 4 shows a further flow chart for determining dynamic 3D map data.
  • Elements of the same construction or function are identified across the figures with the same reference numerals.
  • FIG. 1 shows an exemplary communication of a vehicle 1 with a backend 100. The backend 100 includes, for example, one or more databases, such as a vehicle database with vehicle images, a road map database and/or a 3D or 2D model database. The backend 100 may be implemented on one server, on multiple servers and/or in a computer cloud.
  • The vehicle 1 has a position determination sensor 11, such as a GPS sensor. The position determination sensor 11 communicates with a position calculation unit 13, which is adapted to determine the position of the vehicle 1. For this purpose, the position calculation unit 13 can also use information from a navigation system, for example in order to additionally determine the position on a map, for example on a road segment or on a specific road. The position calculation unit 13 can also determine the direction in which the vehicle 1 is moving. The position calculation unit 13 can also determine the time at which the position was determined.
  • The position calculation unit 13 is further adapted to communicate with a control unit 15, for example to send the determined position and/or the position on the map and/or the direction and/or the time to the control unit 15.
  • The position determination sensor 11 and the position calculation unit 13 can be permanently installed in the vehicle 1 and/or be mobile and assigned to the vehicle 1. For example, they can also be implemented in a smartphone or another mobile device and connected to the vehicle 1 by means of an app.
  • The vehicle 1 further includes, for example, a camera 10 as an environment sensor for determining environment information. The camera 10 sends, for example, environment data to the control unit 15.
  • The vehicle 1 may include further environment sensors. For example, the vehicle 1 may have a front camera and/or a top-view camera comprising four individual cameras which are arranged rotated by 90° relative to one another. Alternatively or additionally, the vehicle 1 may have a radar and/or lidar sensor. Alternatively or additionally, the vehicle 1 may have a temperature sensor, a rain sensor and the like.
  • The environment data thus include, for example, 2D image data of a single camera, 3D image data of a stereo camera and/or abstract 3D image data of a radar and/or lidar sensor. The environment data may additionally comprise time information, that is to say in particular information about the time at which the environment data were acquired. In addition, the environment data may include information from the rain sensor or a temperature value from the temperature sensor.
  • The control unit 15 is further configured to communicate with a communication unit 17, for example to send position data of the vehicle 1 and the environment data to the communication unit 17.
  • Furthermore, the communication unit 17 can communicate with a display 19 in order to display information to the driver.
  • The communication unit 17 includes, for example, an antenna in order to communicate with the backend 100 via a radio link.
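  • The following sketch illustrates, purely by way of example, how the control unit 15 could bundle position data and environment data into a report and hand it to the communication unit 17 for transmission to the backend 100. The message layout and the JSON serialization are assumptions; the application does not prescribe a transport format.

```python
import json
import time
from typing import Any, Callable, Dict, List


def build_report(latitude: float, longitude: float, heading_deg: float,
                 camera_frames: List[Any], extra_sensors: Dict[str, Any]) -> Dict[str, Any]:
    """Bundle position data and environment data as the control unit 15 might
    hand them to the communication unit 17; the message layout is an assumption."""
    now = time.time()
    return {
        "position": {"lat": latitude, "lon": longitude,
                     "heading_deg": heading_deg, "timestamp": now},
        "environment": {"frame_count": len(camera_frames),  # image payload sent separately
                        "sensors": extra_sensors,            # e.g. rain flag, temperature
                        "timestamp": now},
    }


def send_to_backend(report: Dict[str, Any], transmit: Callable[[bytes], None]) -> None:
    """Serialize the report and pass it to a radio-link transmit function;
    the transport itself is not prescribed by the application."""
    transmit(json.dumps(report).encode("utf-8"))
```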
  • FIG. 2 shows a flow chart of a program for determining dynamic 3D map data. The program can be executed on the vehicle side or distributed between the vehicle 1 and the backend 100.
  • The program is started in a step S1 in which variables can be initialized if necessary.
  • In a step S3, the position data of the vehicle 1 are provided which are representative of a position of the vehicle 1. The position data are, for example, determined and provided by the vehicle 1 as described above and are sent, for example, to the backend 100. In a step S5, the environment data of the vehicle 1 are provided which are representative of environment information determined by means of vehicle sensors of the vehicle 1. The environment data are, for example, determined and provided by the vehicle 1 as described above and are sent, for example, to the backend 100.
  • In particular, position data and environment data are provided by a plurality of vehicles. In addition, environmental data and position data can be provided by permanently installed webcams, such as street webcams or webcams of buildings.
  • In a step S7, a static map is provided. For example, the static map is provided from a database of the backend 100.
  • In a step S9, the dynamic 3D map data is determined depending on the position data, the environment data and the static map.
  • In a step S11, the program is ended and can be started again in step S1. Alternatively, the program can be continued in step S3.
  • The dynamic 3D map data can then be transmitted back to the vehicle 1, for example, and/or used in particular for displaying a dynamic video from a predetermined position, similar to a webcam, for example by means of the display 19.
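  • A backend-side sketch of the loop formed by steps S1 to S11 is given below. All collaborators (report reception, static-map loading, plausibility check, determination, publication) are injected as functions so that the sketch stays independent of any concrete implementation; the loop structure itself is an assumption based on the description above.

```python
def run_map_update_loop(receive_report, load_static_map, plausible, determine, publish,
                        iterations=1):
    """Sketch of steps S1 to S11 of FIG. 2; all collaborators are injected functions."""
    static_map = load_static_map()                    # step S7: provide the static map (loaded once here for simplicity)
    for _ in range(iterations):
        position, environment = receive_report()      # steps S3 and S5: position and environment data
        if not plausible(position, environment):      # optional plausibility check
            continue                                  # implausible reports are discarded
        dynamic_map = determine(position, environment, static_map)   # step S9
        publish(dynamic_map)                          # e.g. transmit back to vehicle 1 / display 19
    # Step S11: the program ends and may be restarted at S1 or continued at S3.
```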
  • FIG. 3 shows a flow chart of a program for determining analysis data. The program can be executed on the vehicle side or distributed between the vehicle 1 and the backend 100. The program can be combined with the program of FIG. 2 or carried out separately, as will be explained below.
  • Before step S31, steps S1 to S5 of the method shown in FIG. 2 are executed.
  • In step S31, the position data and the environment data are checked for plausibility. If the plausibility check is negative, the position data and the environment data are discarded. If it is positive, the program is continued in a step S33.
  • In step S33, depending on the position data and the environment data, the environment data is analyzed for predetermined traffic influences.
  • In a step S35, analysis data are determined which include information of the analysis of the traffic influences. If the method is executed separately from the method of FIG. 2, the analysis data can be provided and the program can be terminated. If the method is combined with the method of FIG. 2, the program can be continued in step S7 of FIG. 2.
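  • The following sketch condenses steps S31 to S35: implausible reports are discarded, the remaining environment data are analyzed for predetermined traffic influences, and analysis data are returned. The classifier that detects the traffic influences is passed in as a function, since the application leaves its realization open.

```python
def run_analysis(position, environment, plausible, classify_influences):
    """Sketch of steps S31 to S35 of FIG. 3; the classifier is an injected function."""
    if not plausible(position, environment):                  # step S31
        return None                                           # data are discarded
    influences = classify_influences(position, environment)   # step S33
    return {"traffic_influences": influences}                 # step S35: analysis data
```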
  • FIG. 4 shows an example of the determination of the dynamic 3D map data, i.e. in particular step S7 of FIG. 2.
  • In particular, step S5 or step S35 is executed before step S43.
  • In step S43, a position and/or direction of the environment data is determined in order to determine the dynamic 3D map data. For this purpose, in particular in a step S41, which can be performed at any time prior to step S43, a static map is provided, and the position and/or direction of the environment data is determined depending on the position data, the environment data and the static map. Furthermore, if older dynamic 3D map data are already available, the position and/or direction can be determined depending on the older dynamic 3D map data.
  • The determined position and/or direction and the environment data can also be provided for other applications, for example.
  • In step S45, anonymization of the environment data is performed. The environment data are, in particular, checked for faces and/or license plates, and these faces and/or license plates are made unrecognizable. Furthermore, a data reduction can be performed in this step.
  • The anonymized environment data may also be provided for other applications, for example.
  • In a step S47, the dynamic 3D map data are determined depending on the position data, the environment data and the static map.
  • Furthermore, optionally in this step, by means of the dynamic 3D map data, the analysis data can be determined, which includes information of the analysis of the traffic influences.
  • Then the analysis data can be provided for other applications. The program can now be terminated or alternatively continued in step S49.
  • In step S49, a 3D map is created by means of the dynamic 3D map data and visualized, for example, on the display 19. The method may, for example, be repeated over and over again in order to constantly update and improve the dynamic 3D map data.
  • For example, the 3D map includes dynamic content from the environmental data and static content from the static map.
  • For example, the dynamic contents of the environment data may include timestamps, and depending on the age of the dynamic content, static content may be displayed again. There are elements that remain constant for a relatively long time, such as construction sites, buildings, plants and the like. For other contents, such as individual road users or even a traffic jam, the fallback to the static content can take place more quickly.
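  • One possible aging rule for this fallback is sketched below: each content category is given a time to live, and dynamic content older than that is replaced by static content again. The categories and the concrete time-to-live values are assumptions chosen only to illustrate that slowly changing elements are kept longer than fast changing ones.

```python
import time

# Assumed time-to-live per content category, in seconds. The application only states
# that slowly changing elements may be kept longer than fast changing ones.
TTL_SECONDS = {
    "building": 30 * 24 * 3600,
    "construction_site": 24 * 3600,
    "traffic_jam": 15 * 60,
    "road_user": 10,
}


def use_dynamic_content(category: str, content_timestamp: float, now=None) -> bool:
    """Return True while the dynamic content is fresh enough to be shown;
    afterwards the renderer falls back to the static map content."""
    if now is None:
        now = time.time()
    return (now - content_timestamp) <= TTL_SECONDS.get(category, 60)
```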
  • The dynamic 3D map data and/or the analysis data can thus be used in a very versatile manner, for example for a virtual webcam, so that a virtual webcam can be created for any position and viewing direction. Furthermore, a 3D map with textures and traffic situations enriched with live images can be created. Furthermore, alerts and location-based messages are possible. Furthermore, the data can be used for highly automated driving, for example for highly accurate positioning including the position on a lane.
  • Furthermore, the data can be used for end-user applications via web offerings on the Internet, via apps on users' mobile devices, or in the vehicle 1 via Internet-enabled infotainment systems.
  • CITATIONS INCLUDED IN THE DESCRIPTION
  • This list of the documents listed by the applicant has been generated automatically and is included solely for the better information of the reader. The list is not part of the German patent or utility model application. The DPMA assumes no liability for any errors or omissions.
  • Cited patent literature
    • US 7860994 B2 [0003]
    • US 8041829 B2 [0003]
    • US 20090231432 A1 [0004]
    • WO 2014114751 A1 [0005]

Claims (10)

  1. Method for determining dynamic 3D map data, in which - position data of a vehicle (1) are provided which are representative of a position of the vehicle (1), - environment data of the vehicle (1) are provided which are representative of environment information determined by means of vehicle sensors of the vehicle (1), - a static map is provided, - depending on the position data, the environment data and the static map, the dynamic 3D map data are determined.
  2.  The method of claim 1, in which the position data and the environment data are checked for plausibility and, depending on the check, the dynamic 3D map data are determined.
  3.  Method according to Claim 1 or 2, in which, depending on the position data, the environment data and the static map, a position and/or direction of the environment data is determined in order to determine the dynamic 3D map data.
  4.  Method according to one of the preceding claims, in which anonymization of the environment data is carried out and, depending on the anonymized environment data, the dynamic 3D map data are determined.
  5.  Method according to one of the preceding claims, in which the dynamic 3D map data and/or the environment data are analyzed for predetermined traffic influences and, depending on the analysis, the dynamic 3D map data are determined anew.
  6. Method for determining analysis data, in which - position data of a vehicle (1) are provided which are representative of a position of the vehicle (1), - environment data of the vehicle (1) are provided which are representative of environment information determined by means of vehicle sensors of the vehicle (1), - depending on the position data and the environment data, the environment data are analyzed for predetermined traffic influences, - analysis data are determined which include information of the analysis of the traffic influences.
  7.  The method of claim 6, in which - the method according to one of claims 1 to 5 is carried out, - the dynamic 3D map data are analyzed for predetermined traffic influences, - the analysis data are determined which include information of the analysis of the traffic influences.
  8.  Device for determining dynamic 3D map data and/or for determining analysis data, wherein the device is designed to carry out a method according to one of claims 1 to 5 and/or 6 to 7.
  9.  Computer program, wherein the computer program is designed to carry out a method according to one of claims 1 to 5 and/or 6 to 7 when executed on a data processing device.
  10.  Computer program product comprising executable program code, wherein the program code, when executed by a data processing device, carries out the method according to one of claims 1 to 5 and/or 6 to 7.
DE102016208239.6A 2016-05-12 2016-05-12 Method for determining dynamic 3D map data, method for determining analysis data, device, computer program and computer program product Ceased DE102016208239A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
DE102016208239.6A DE102016208239A1 (en) 2016-05-12 2016-05-12 Method for determining dynamic 3D map data, method for determining analysis data, device, computer program and computer program product


Publications (1)

Publication Number Publication Date
DE102016208239A1 true DE102016208239A1 (en) 2017-11-16

Family

ID=60163468

Family Applications (1)

Application Number Title Priority Date Filing Date
DE102016208239.6A Ceased DE102016208239A1 (en) 2016-05-12 2016-05-12 Method for determining dynamic 3D map data, method for determining analysis data, device, computer program and computer program product

Country Status (1)

Country Link
DE (1) DE102016208239A1 (en)


Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7860994B2 (en) 2006-01-17 2010-12-28 Reality Mobile Llc System and method for remote data acquisition and distribution
US8041829B2 (en) 2006-01-17 2011-10-18 Reality Mobile Llc System and method for remote data acquisition and distribution
DE102008008706A1 (en) * 2007-03-20 2008-11-06 Deutsches Zentrum für Luft- und Raumfahrt e.V. Surrounding data and/or motor vehicle condition data processing device, has evaluation and control unit transmitting data to center and/or to vehicle through satellite-assisted and terrestrial position determining units
US20090231432A1 (en) 2008-03-17 2009-09-17 International Business Machines Corporation View selection in a vehicle-to-vehicle network
DE102010040803A1 (en) * 2010-09-15 2012-03-15 Continental Teves Ag & Co. Ohg Visual driver information and warning system for a driver of a motor vehicle
US20130226448A1 (en) * 2010-10-26 2013-08-29 Tim Bekaert Method for detecting grade separated crossings and underpasses
DE102012020568A1 (en) * 2012-10-19 2014-04-24 Audi Ag Method for operating e.g. computer of passenger car, involves reproducing detected property and nature in natural image of environment, combining natural image with map of environment, and transmitting combined graph to display device
WO2014114751A1 (en) 2013-01-24 2014-07-31 Eilertsen Roger André A traffic surveillance and guidance system
DE102013018315A1 (en) * 2013-10-31 2015-04-30 Bayerische Motoren Werke Aktiengesellschaft Environment model with adaptive grid
DE102014107317A1 (en) * 2014-05-23 2015-11-26 Bayerische Motoren Werke Aktiengesellschaft Electronic periscope for a motor vehicle


Legal Events

Date Code Title Description
R012 Request for examination validly filed
R016 Response to examination communication
R002 Refusal decision in examination/registration proceedings
R003 Refusal decision now final