KR101782423B1 - Audio navigation device, vehicle having the same, user device, and method for controlling vehicle - Google Patents
- Publication number
- KR101782423B1 (Application No. KR1020150038431A)
- Authority
- KR
- South Korea
- Prior art keywords
- information
- vehicle
- lane
- generating
- received
- Prior art date
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/167—Driving aids for lane monitoring, lane changing, e.g. blind spot detection
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/10—Path keeping
- B60W30/12—Lane keeping
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/01—Satellite radio beacon positioning systems transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/03—Cooperating elements; Interaction or communication between different cooperating elements or between cooperating elements and receivers
- G01S19/07—Cooperating elements; Interaction or communication between different cooperating elements or between cooperating elements and receivers providing data for correcting measured positioning data, e.g. DGPS [differential GPS] or ionosphere corrections
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/161—Decentralised systems, e.g. inter-vehicle communication
-
- H04N5/225—
Abstract
A vehicle and a method of controlling the vehicle are disclosed. The vehicle includes: a determination unit that determines whether a front view of the vehicle can be secured; a communication unit that, when it is determined that the front view of the vehicle cannot be secured, receives lane information in which position information at the time the image information was photographed is mapped to image information photographed by a camera; a generating unit that generates a virtual lane based on the received lane information and the position information of the vehicle; and a control unit that controls one or more devices in the vehicle to provide the generated virtual lane.
Description
A vehicle that provides information about roads, and a method of controlling the vehicle.
Recent vehicles have introduced Advanced Driver Assistance Systems (ADAS). A driver assistance system assists the driver in operating the vehicle, for example by warning of an impending collision with a vehicle ahead or of a lane departure caused by drowsiness. In addition, various other systems have been introduced to assist the driver in driving the vehicle.
A vehicle according to one aspect includes: a determination unit that determines whether a front view of the vehicle can be secured; a communication unit that, when it is determined that the front view of the vehicle cannot be secured, receives lane information in which position information at the time the image information was photographed is mapped to image information photographed by a camera; a generating unit that generates a virtual lane based on the received lane information and the position information of the vehicle; and a control unit that controls one or more devices in the vehicle to provide the generated virtual lane.
In addition, the determination unit may determine whether or not the front view of the vehicle can be secured based on a result of the detection using the camera and the rain sensor.
In addition, the communication unit may receive sensor information and trajectory information detected by at least one other vehicle located in front of the vehicle through inter-device communication.
The generating unit may generate the virtual lane on the basis of the received lane information, the position information of the vehicle, and the trajectory information.
The generating unit may exclude the received trajectory information when generating the virtual lane if the difference between the received trajectory information and the trajectory information obtained from the received lane information exceeds a predetermined level.
The control unit may control one or more devices in the vehicle to provide the generated virtual lane and the sensor information.
A vehicle according to one aspect includes: a camera that photographs the front of the vehicle and generates image information; a position measuring device that measures position information at the time the image information is photographed; an acquiring unit that acquires lane information based on the image information and the position information; and a communication unit that transmits the lane information to a database.
Further, the position measuring apparatus may correspond to DGPS.
The obtaining unit may obtain lane information by mapping the position information of the image information to the image information.
A vehicle according to one aspect includes: a communication unit that receives, from a database, lane information in which position information at the time of photographing is mapped to image information photographed through a camera, and that receives sensor information and trajectory information from another vehicle located in the vicinity of the vehicle; a generating unit that generates a safety trajectory on the traveling path of the vehicle based on the received lane information, sensor information, trajectory information, and position information of the vehicle; and a control unit that controls at least one device in the vehicle to provide the generated safety trajectory.
The communication unit may receive the lane information from the database through a base station, and may receive at least one of the sensor information and the trajectory information through inter-device communication from another vehicle located in the vicinity of the vehicle.
The generating unit may generate a virtual lane based on the received lane information and the position information of the vehicle, determine obstacles and dangerous vehicles located on the virtual lane based on the received sensor information and trajectory information of the other vehicles located on the virtual lane, and generate the safety trajectory based on the determination result.
A method of controlling a vehicle according to one aspect includes: determining whether a front view of the vehicle can be secured; receiving, when it is determined that the front view of the vehicle cannot be secured, lane information in which position information at the time the image information was photographed is mapped to image information photographed through a camera; generating a virtual lane based on the received lane information and the position information of the vehicle; and controlling one or more devices in the vehicle to provide the generated virtual lane.
Also, the determining step may determine whether the front view of the vehicle can be secured based on the result of the detection using the camera and the rain sensor.
In addition, the receiving step may receive sensor information and trajectory information detected by at least one other vehicle located in front of the vehicle through inter-device communication.
The generating step may generate the virtual lane based on the received lane information, the position information of the vehicle, and the trajectory information.
The generating step may generate the virtual lane by excluding the received trajectory information if the difference between the received trajectory information and the trajectory information obtained from the received lane information exceeds a predetermined level.
Further, the providing step may control at least one device in the vehicle to provide the generated virtual lane and the sensor information.
A method of controlling a vehicle according to one aspect includes: photographing the front of the vehicle to generate image information; measuring position information at the time the image information is captured; obtaining lane information based on the image information and the position information; and transmitting the lane information to a database.
Further, the position measuring apparatus may correspond to DGPS.
The acquiring may acquire the lane information by mapping the position information of the image information to the image information.
A method of controlling a vehicle according to one aspect includes: receiving, from a database, lane information in which position information at the time of photographing is mapped to image information photographed through a camera, and receiving at least one of sensor information and trajectory information from another vehicle located in the vicinity of the vehicle; generating a safety trajectory on the traveling path of the vehicle based on the received lane information, sensor information, trajectory information, and position information of the vehicle; and controlling one or more devices in the vehicle to provide the generated safety trajectory.
The receiving step may receive the lane information from the database through a base station, and may receive at least one of the sensor information and the trajectory information through inter-device communication from another vehicle located in the vicinity of the vehicle.
The generating step may include generating a virtual lane based on the received lane information and the position information of the vehicle, determining obstacles and dangerous vehicles located on the virtual lane based on the sensor information and trajectory information of the other vehicles located on the virtual lane, and generating the safety trajectory based on the determination result.
FIG. 1 is a view schematically showing the external configuration of a vehicle according to an embodiment.
FIG. 2 is a view showing the internal configuration of a vehicle according to an embodiment.
FIG. 3 is a block diagram of a vehicle that generates lane information and transmits it to a database.
FIG. 4 is a diagram illustrating a large-scale antenna system of a base station according to a 5G communication scheme according to an embodiment.
FIGS. 5A to 5C are diagrams showing communication methods in a 5G network.
FIG. 6 is an operational flow chart of a vehicle that generates lane information and transmits it to a database.
FIG. 7 is a block diagram of a vehicle that provides a virtual lane using lane information.
FIG. 8 is an operational flow chart of a vehicle that provides a virtual lane using lane information.
FIGS. 9 and 10 are views showing virtual lanes according to different embodiments displayed on the front window.
FIG. 11 is a view showing a screen displaying sensor information of a preceding vehicle through a display according to an embodiment.
FIG. 12 is a flowchart illustrating the operation of a vehicle that provides a safety trajectory according to an embodiment.
FIG. 13 is a view showing a safety trajectory added to a virtual lane according to an embodiment and displayed on the front window.
FIG. 14 is a view showing a screen on which a safety trajectory is displayed through a display according to an embodiment.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings.
FIG. 1 is a view schematically showing the external configuration of a vehicle according to an embodiment.
1, a
A
FIG. 3 is a block diagram of a vehicle according to an embodiment, FIG. 4 is a diagram illustrating a large-scale antenna system of a base station according to an embodiment, and FIGS. 5A to 5C are diagrams showing communication methods in a 5G network. These figures are described together to avoid duplication.
The
Referring to FIG. 2, an AVN (Audio Video Navigation) terminal 100 may be provided inside the vehicle.
According to one embodiment, the AVN
On the other hand, the
A
The
Meanwhile, the
3, a
The
On the other hand, the position measuring device 132 is provided in the vehicle.
Meanwhile, the position measuring device 132 may include a GPS (Global Positioning System), which determines a position through satellites, or a DGPS (Differential Global Positioning System), which measures a position with higher accuracy by complementing GPS, but is not limited thereto. In general, a position transmitted from a satellite to a terrestrial GPS receiver contains an error. For example, when there are N (N ≥ 2) GPS receivers located close to each other, the N receivers have similar errors. A DGPS is a device that can obtain more precise data by offsetting the error common to the N receivers.
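The common-error offsetting described above can be sketched numerically. In the sketch below, all coordinates and names are hypothetical: a reference station at a precisely known position measures the current GPS error and broadcasts it as a correction, which a nearby receiver subtracts from its own raw fix.

```python
def dgps_correct(measured, correction):
    """Subtract the broadcast (lat, lon) correction from a raw GPS fix."""
    return (measured[0] - correction[0], measured[1] - correction[1])

# Reference station: known true position vs. what its GPS reports
# (hypothetical values for illustration).
station_true = (37.5665, 126.9780)
station_measured = (37.5668, 126.9784)

# The error common to nearby receivers, measured at the station.
correction = (station_measured[0] - station_true[0],
              station_measured[1] - station_true[1])

# A nearby vehicle's raw fix contains approximately the same error,
# so subtracting the correction offsets it.
vehicle_measured = (37.5701, 126.9820)
vehicle_corrected = dgps_correct(vehicle_measured, correction)
print(vehicle_corrected)
```

This is only the core idea; real DGPS corrections are transmitted per satellite as pseudorange adjustments rather than as a flat position offset.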
The acquiring
Referring to FIG. 3, a
The
In addition, the
A large-scale antenna system may be employed for the 5G communication system. A large-scale antenna system refers to a system that can cover up to ultra-high frequency by using dozens or more antennas, and can transmit / receive a large amount of data simultaneously through multiple connections. Specifically, the large-scale antenna system can adjust the arrangement of the antenna elements to transmit and receive radio waves farther in a specific direction, thereby enabling a large-capacity transmission and expanding the usable area of the 5G communication network.
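The directional transmission described above can be illustrated with a small beam-steering sketch. The element count, spacing, and angles below are illustrative assumptions, not values from the patent: applying a progressive phase shift across a uniform linear array makes the combined signal add constructively in one chosen direction.

```python
import cmath
import math

def array_gain(n_elements, spacing_wl, steer_deg, look_deg):
    """Normalized power gain of an n-element uniform linear array steered
    to steer_deg, observed from look_deg (element spacing in wavelengths)."""
    steer = math.radians(steer_deg)
    look = math.radians(look_deg)
    # Sum the per-element phasors; they align when look == steer.
    total = sum(
        cmath.exp(2j * math.pi * spacing_wl * k * (math.sin(look) - math.sin(steer)))
        for k in range(n_elements)
    )
    return abs(total) ** 2 / n_elements ** 2

# A 32-element array steered to 30 degrees: full gain in that direction,
# nearly zero off-beam.
print(array_gain(32, 0.5, 30, 30))  # -> 1.0 (peak of the steered beam)
print(array_gain(32, 0.5, 30, 0))   # small off-beam gain
```

The sharpening of the beam with more elements is what lets a large-scale antenna system transmit farther in a specific direction.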
Referring to FIG. 4, the
In addition, unlike conventional methods that modulate a transmission signal through Orthogonal Frequency Division Multiplexing (OFDM), the 5G communication method may modulate radio signals using Non-Orthogonal Multiple Access (NOMA), which allows more devices to be connected and enables large-capacity transmission and reception at the same time.
For example, the 5G communication method can provide a transmission rate of up to 1Gbps. 5G communication method can support immersive communication that requires high-capacity transmission such as UHD (Ultra-HD), 3D, hologram, etc. through large capacity transmission. As a result, users can send and receive more sophisticated and immersive ultra-high-capacity data faster through the 5G communication method.
In addition, the 5G communication method enables real-time processing with a maximum response time of 1 ms or less, so it can support real-time services that respond before the user even notices. For example, a vehicle can receive sensor information from various devices while driving and provide an autonomous driving system through real-time processing, as well as various remote-control functions. Moreover, the vehicle can process in real time the sensor information exchanged with other vehicles around it through the 5G communication system, informing the user in real time both of the possibility of a collision and of traffic conditions on the driving route.
In addition, through the real-time processing and large-capacity transmission provided by 5G communication, the vehicle can provide big-data services to its passengers. For example, the vehicle can analyze various web information, SNS information, and the like, and provide customized information suited to the passengers' situation. In one embodiment, the vehicle collects, through big-data mining, various kinds of tourist information about attractions near the traveling route and provides it in real time, so that passengers can directly check the information available around the area through which they are traveling.
Meanwhile, a 5G communication network can further subdivide its cells to support high network density and large-capacity transmission. Here, a cell means a small area into which a wide region is divided so that frequencies can be used efficiently, and a low-power base station is installed in each cell to support communication between terminals. For example, a 5G communication network can be formed in a two-stage structure of macro-cell base station, distributed small base stations, and communication terminals by further reducing the cell size.
Also, in a network of 5G communication, relay transmission of a radio signal through a multihop method can be performed. For example, as shown in FIG. 5A, the
Meanwhile, the 5G communication system supports device-to-device (D2D) communication, which can be applied to vehicles, wearable devices, and the like. Device-to-device communication means that a device transmits radio signals containing not only data sensed through its sensors but also various data stored in the device, directly to another device. With this method, radio signals need not pass through a base station, so unnecessary energy can be saved. To use 5G communication, a device such as a vehicle or wearable device must have a built-in antenna.
The
As another example, the
On the other hand, the 5G communication network extends the area in which device-to-device communication is supported, enabling communication between devices located farther apart. In addition, since it supports real-time processing with a response time of 1 ms or less and high-capacity communication of 1 Gbps or more, signals containing the desired data can be exchanged between moving vehicles.
For example, a vehicle can communicate in real time through the 5G communication system with other vehicles, servers, and systems in its vicinity, transmitting and receiving data, and can provide various services such as route guidance through augmented reality.
In addition, the vehicle can transmit and receive wireless signals including data through a base station or inter-device communication using a band outside the above-mentioned frequency band, and is not limited to the communication method using the above-mentioned frequency band.
For example, as will be described later, the
Meanwhile, the
A cloud service is a service that stores various data on a cloud server and downloads it whenever necessary. For example, the cloud service may be accessed through an application programming interface (API), but is not limited thereto.
Hereinafter, an operation flow chart of a vehicle for acquiring lane information and delivering it to a database will be described.
The vehicle can generate image information of the area in front of the vehicle through the camera (1000). The front of the vehicle means the direction in which the driver looks through the front window from inside the vehicle. The image information includes images or moving images of the road ahead of the vehicle; as described later, the vehicle acquires information such as lanes and obstacles from these images or moving images of the road.
Also, the vehicle can measure the position information when the image information is captured through the position measuring apparatus (1010). As described above, the position measuring apparatus can measure the position information of the vehicle and determine the position of the vehicle when photographing the corresponding image information.
Accordingly, the vehicle can acquire lane information by mapping the position information at the time of photographing onto the image information (1020). Because the position information at the time of photographing is mapped onto the image information, the vehicle can judge what the road environment is like at a given position. For example, the vehicle can determine from the lane information how sharply the road bends at any position.
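The mapping in step 1020 can be sketched minimally as follows. The record shape, field names, and values here are hypothetical assumptions for illustration: lane information is formed by attaching the capture-time position to each piece of image information, so the road environment can later be looked up by position.

```python
from dataclasses import dataclass

@dataclass
class LaneInfo:
    image_id: str    # identifier of the captured image/frame
    position: tuple  # (lat, lon) measured when the image was taken
    curvature: float # e.g. how sharply the road bends at that point

def build_lane_info(image_id, position, curvature):
    """Map capture-time position information onto image information."""
    return LaneInfo(image_id=image_id, position=position, curvature=curvature)

database = []  # stands in for the external lane-information database
database.append(build_lane_info("frame_0001", (37.5665, 126.9780), 0.02))

# Later, the vehicle can judge the road environment at a given position.
record = database[0]
print(record.position, record.curvature)
```

In practice the image information would be the image data itself and the curvature would be extracted from it; the sketch only shows the position-to-image association.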
Further, the vehicle can receive sensor information from a vehicle traveling ahead of it through inter-device communication. The sensor information includes the results detected by the various sensors built into the preceding vehicle, so the vehicle can determine the state of the road from it. For example, the vehicle can detect from the sensor information whether there is ice on the road, whether the road is slippery, and whether a speed limiter is present. Details will be described later.
The vehicle can send lane information to the database. As described above, since the vehicle is equipped with an antenna, the vehicle can transmit lane information to the database through various communication methods.
7, the
The description of the
The
The
According to an embodiment, the
Alternatively, the
Meanwhile, the
Further, the
Further, the trajectory information means information on the running trajectory of the preceding vehicle. For example, it includes information on the steering wheel angle, the degree to which the brake is depressed, the degree to which the accelerator is depressed, and whether the ESP (Electronic Stability Program) has intervened.
The generating
At this time, the
That is, the
However, the trajectory information received from the preceding vehicle may deviate from the actual lane due to a driving error by the driver of the preceding vehicle. Accordingly, the generating unit may exclude the received trajectory information when generating the virtual lane if it deviates from the trajectory obtained from the lane information by more than a predetermined level.
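The exclusion rule above can be sketched with assumed data shapes: the trajectory received from the preceding vehicle is compared point-for-point with the trajectory derived from the stored lane information, and discarded when the mean deviation exceeds a predetermined level. All names and numbers are illustrative.

```python
def should_exclude(received_traj, lane_traj, threshold):
    """Return True if the received trajectory deviates too far from the
    trajectory obtained from the lane information (mean lateral offset)."""
    if len(received_traj) != len(lane_traj):
        return True  # cannot compare point-for-point; be conservative
    mean_dev = sum(abs(r - l) for r, l in zip(received_traj, lane_traj)) / len(lane_traj)
    return mean_dev > threshold

lane_centerline = [0.0, 0.1, 0.2, 0.3]     # lateral offsets from lane info
good_preceding  = [0.0, 0.12, 0.19, 0.31]  # follows the lane closely
bad_preceding   = [0.0, 0.8, 1.5, 2.0]     # driver error: drifted out

print(should_exclude(good_preceding, lane_centerline, threshold=0.2))  # False
print(should_exclude(bad_preceding, lane_centerline, threshold=0.2))   # True
```

The choice of metric (mean lateral offset) and the threshold value are assumptions; the patent only specifies that a deviation beyond a predetermined level causes exclusion.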
Meanwhile, the
The generating
At this time, the
On the other hand, the
The
In addition, the
On the other hand, the head-up display may be of a windshield type, in which light is projected onto a reflector and the light reflected from the reflector is projected onto the front window.
In addition, the
On the other hand, the
FIG. 8 is an operational flow chart of a vehicle that provides a virtual lane using lane information, FIGS. 9 and 10 are views showing virtual lanes according to different embodiments displayed on the front window, and FIG. 11 is a view showing a screen on which sensor information of a preceding vehicle is displayed through a display according to an embodiment.
Referring to FIG. 8, the vehicle can determine whether a front view of the vehicle can be secured through the in-vehicle device (1100). Here, the in-vehicle device refers to a device capable of collecting data as a basis for determining whether or not the front view of the vehicle can be secured.
For example, the vehicle can photograph the front of the vehicle through the camera to generate image information and determine from the generated image information whether the driver can secure a view. Alternatively, the vehicle may sense rainwater through the rain sensor and, if a predetermined level or more is sensed, judge that the driver cannot secure visibility; there is no limitation on the method.
If it is determined that the forward view cannot be secured, the vehicle may receive lane information from the database (1110) to aid the driver. For example, if the database is located on an external cloud server, the vehicle can retrieve, using an API key, only the lane information that lies within the driver's driving route out of all the lane information stored in the database.
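The route-scoped retrieval in step 1110 can be sketched as a filter over stored records. The filtering criterion (distance in degree space to any route point), record shapes, and all names are assumptions for illustration; a real system would query the cloud database server-side.

```python
import math

def within_route(record_pos, route, max_deg=0.01):
    """True if a lane-info record lies near any point of the route
    (simple degree-space distance, adequate over short distances)."""
    return any(
        math.hypot(record_pos[0] - p[0], record_pos[1] - p[1]) <= max_deg
        for p in route
    )

lane_db = [
    {"pos": (37.5665, 126.9780), "image": "frame_a"},  # on the route
    {"pos": (37.9000, 127.3000), "image": "frame_b"},  # far away
]
driving_route = [(37.5660, 126.9775), (37.5700, 126.9800)]

# Keep only the lane information within the driver's driving route.
relevant = [rec for rec in lane_db if within_route(rec["pos"], driving_route)]
print([rec["image"] for rec in relevant])  # ['frame_a']
```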
The vehicle may generate a virtual lane based on the lane information and the location information of the vehicle (1120). For example, the vehicle can compare the position information mapped with the image information of the lane information and the position information of the vehicle, and generate the virtual lane using the image information corresponding to the position information of the vehicle. At this time, the vehicle receives the trajectory information from the preceding vehicle through the inter-device communication, and can generate a virtual lane reflecting the trajectory information. Further, the vehicle receives the sensor information from the vehicle that is traveling ahead through the inter-device communication, and provides the sensor information to the driver, so that the driver can safely operate the vehicle.
The vehicle may control the in-vehicle device to provide a virtual lane (1130). The in-vehicle device includes a device mounted on the vehicle and capable of providing information on the virtual lane to the driver. For example, referring to FIG. 9, a vehicle may display a virtual lane generated through an augmented reality on a windshield display. Accordingly, the in-vehicle driver can keep driving while viewing the virtual lane even if the lane is not visible due to mist, shower, or the like. Alternatively, as shown in Fig. 10, by displaying a virtual lane on the head-up display displayed in the front window of the vehicle, the driver can view it and keep driving.
On the other hand, the vehicle can provide sensor information received from another vehicle to the driver, thereby helping the driver to drive more safely. As shown in FIG. 11, the vehicle can display a pop-up message including sensor information on the display. For example, a vehicle may display a pop-up message on the display that says 'Please note that there is ice on the road 100m ahead' to ensure the safety of the driver.
12 is a flowchart illustrating an operation of a vehicle that provides a safety trajectory according to an embodiment.
As described above, the vehicle can exchange data with an external database through a base station, and can directly exchange data with other vehicles through inter-device communication. For example, the vehicle can receive from the database, through the base station, lane information in which position information at the time of photographing is mapped to the image information, and can receive sensor information and trajectory information of other vehicles through inter-device communication. In addition, the vehicle can receive position information of other vehicles through inter-device communication: each vehicle is provided with a position measuring device that measures its own position.
The vehicle can generate the lane corresponding to the actual lane using the image information corresponding to the position information of the vehicle from the position information of the vehicle measured through the position measuring device and the image information received from the base station.
The vehicle can place the other vehicles on the lane corresponding to the actual lane using their position information, and can associate the received sensor information and trajectory information with each of the other vehicles located on that lane. Accordingly, the vehicle can determine from the sensor information whether obstacles have been detected near each of the other vehicles. Further, the vehicle can compare the trajectory information of each other vehicle with the virtual lane to judge whether that vehicle is running normally or dangerously. By determining which lane the vehicle should travel in for each segment of the driving route, the vehicle can generate a safety trajectory based on the determination results.
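The per-segment lane decision above can be sketched as follows. The data shapes are illustrative assumptions: for each road segment, each lane carries what the other vehicles on it report (obstacles from sensor information, abnormal driving judged from trajectory information), and a lane with no reported danger is chosen.

```python
def choose_safe_lanes(segments):
    """segments: list of {lane_id: {'obstacle': bool, 'abnormal': bool}}.
    Returns one safe lane id per segment (lowest id wins), or None if
    every lane in the segment reports a danger."""
    path = []
    for lanes in segments:
        safe = [lane for lane, report in sorted(lanes.items())
                if not report["obstacle"] and not report["abnormal"]]
        path.append(safe[0] if safe else None)
    return path

route = [
    {1: {"obstacle": False, "abnormal": False},
     2: {"obstacle": True,  "abnormal": False}},  # ice reported in lane 2
    {1: {"obstacle": False, "abnormal": True},    # erratic vehicle in lane 1
     2: {"obstacle": False, "abnormal": False}},
]
print(choose_safe_lanes(route))  # [1, 2]
```

The resulting lane sequence plays the role of the safety trajectory: a per-segment recommendation of which lane to travel in.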
The vehicle can display the safety trajectory generated through various displays. For example, the vehicle can display a safety trajectory through a windshield display, a head-up display, an AVN display, etc., thereby guiding the driver to drive in a safer lane.
For example, as shown in FIG. 13, the vehicle can display the safety trajectory on the virtual lane through augmented reality on the windshield display. Accordingly, the driver can keep driving according to the safety trajectory displayed on the virtual lane even if the lane is not visible due to mist, a shower, or the like. The safety trajectory is not limited to being displayed on the virtual lane; the vehicle may also display it over the lane visible through the actual front window. Alternatively, as shown in FIG. 14, the virtual lane and the safety trajectory may be displayed on the head-up display in the front window of the vehicle so that the driver can view them and keep driving.
The method according to an embodiment may be implemented in the form of program instructions that can be executed through various computer means and recorded on a computer-readable medium. The computer-readable medium may include program instructions, data files, data structures, and the like, alone or in combination. The program instructions recorded on the medium may be those specially designed and configured for the embodiments, or those known and available to persons skilled in computer software. Examples of computer-readable media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and DVDs; magneto-optical media; and hardware devices specifically configured to store and execute program instructions, such as ROM, RAM, and flash memory.
Examples of program instructions include machine language code such as those produced by a compiler, as well as high-level language code that can be executed by a computer using an interpreter or the like. The hardware devices described above may be configured to operate as one or more software modules to perform the operations of the embodiments, and vice versa.
While the present invention has been particularly shown and described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed embodiments. For example, the described techniques may be performed in a different order than described, and components of the described systems, structures, devices, and circuits may be combined in a different manner, or replaced or supplemented by other components or their equivalents, while still achieving suitable results.
Therefore, other implementations, other embodiments, and equivalents to the claims are also within the scope of the following claims.
10: Dashboard, 11: Center Fascia
12: steering wheel, 13: head lining
21: driver's seat, 22: passenger seat
40: center console, 41: gear operating lever
42: Tray, 43: Center input
80: body, 81: hood, 82: front fender
84: Door, 85: Trunk lid, 86: Quarter panel
87: front window, 88: side window
90: rear window, 91, 92: side mirror
100: AVN terminal, 101: AVN display, 102: navigation input unit
143: speaker, 153: ventilation hole, 190: audio input unit
Claims (24)
A communication unit for receiving, when it is determined that a front view of the vehicle cannot be secured, lane information in which image information photographed by a camera is mapped with positional information indicating where the image information was photographed;
A generating unit for generating a virtual lane based on the received lane information and position information of the vehicle; and
A controller for controlling one or more devices in the vehicle and providing the generated virtual lane,
A vehicle comprising the above,
Wherein the communication unit receives sensor information and trajectory information detected by at least one other vehicle located in front of the vehicle through inter-device communication,
Wherein the generating unit generates the virtual lane based on the received lane information, the position information of the vehicle, and the trajectory information.
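A minimal sketch of the kind of computation such a generating unit might perform — translating lane points, which the lane information maps to absolute positions, into the ego vehicle's frame (not the patented implementation; the data layout and names are hypothetical):

```python
from dataclasses import dataclass

@dataclass
class LanePoint:
    x: float  # east offset of the lane marking in map coordinates (m)
    y: float  # north offset in map coordinates (m)

def generate_virtual_lane(lane_points, vehicle_pos):
    """Shift lane points from the map frame into the vehicle frame so a
    virtual lane can be rendered relative to the ego vehicle."""
    vx, vy = vehicle_pos
    return [LanePoint(p.x - vx, p.y - vy) for p in lane_points]

# Lane markings recorded at absolute positions; vehicle currently at (10, 0).
lane = [LanePoint(10.0, 0.0), LanePoint(12.0, 5.0), LanePoint(14.0, 10.0)]
virtual = generate_virtual_lane(lane, (10.0, 0.0))
```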
Wherein whether or not the front view of the vehicle can be secured is determined based on a result of detection using a camera and a rain sensor.
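One toy way to fuse the two sensor readings into such a determination (the thresholds and the fusion rule here are invented for illustration and are not taken from the patent):

```python
def front_view_obstructed(image_contrast: float, rain_level: float,
                          contrast_thresh: float = 0.2,
                          rain_thresh: float = 0.5) -> bool:
    """Declare the forward view not securable when the camera image is
    low-contrast and the rain sensor simultaneously reads heavy rain."""
    return image_contrast < contrast_thresh and rain_level > rain_thresh

# Heavy rain plus a washed-out camera image -> view cannot be secured.
print(front_view_obstructed(0.1, 0.9))
```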
Wherein the generating unit excludes the received trajectory information in generating the virtual lane when a difference between the received trajectory information and trajectory information acquired from the received lane information exceeds a predetermined level.
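That comparison could be sketched as a simple pointwise deviation check (the 1.5 m level and the data format are assumptions, not values from the patent):

```python
import math

def max_deviation(received_traj, lane_traj):
    """Largest pointwise distance between a trajectory reported by another
    vehicle and the trajectory derived from the lane information."""
    return max(math.dist(a, b) for a, b in zip(received_traj, lane_traj))

def usable_trajectories(trajectories, lane_traj, level=1.5):
    """Exclude any received trajectory whose deviation from the
    lane-derived trajectory exceeds the predetermined level (in meters)."""
    return [t for t in trajectories if max_deviation(t, lane_traj) <= level]
```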
Wherein the controller controls one or more devices in the vehicle to provide the generated virtual lane and the sensor information.
A generating unit for generating a safety trajectory on the traveling path of the vehicle based on the received lane information, sensor information, trajectory information, and position information of the vehicle; and
A controller for controlling at least one device in the vehicle and providing the generated safety trajectory,
A vehicle comprising the above,
Wherein the communication unit receives the lane information from a database through a base station, and receives at least one of sensor information and trajectory information via inter-device communication from another vehicle located in the vicinity of the vehicle.
Wherein the generating unit generates a virtual lane based on the received lane information and the position information of the vehicle, detects an obstacle and a dangerous vehicle located on a traveling route using the sensor information and the trajectory information of another vehicle located on the virtual lane, and generates the safety trajectory based on the result of the detection.
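A toy version of that last step — nudging the lane centerline away from detected obstacles — might look like this (purely illustrative; the step size, clearance, and point representation are invented):

```python
def generate_safety_trajectory(virtual_lane, obstacles,
                               lateral_step=0.5, clearance=1.0):
    """For each virtual-lane point, shift laterally away from any obstacle
    closer than `clearance` meters, producing a simple safety trajectory."""
    trajectory = []
    for x, y in virtual_lane:
        for ox, oy in obstacles:
            if abs(y - oy) < clearance and abs(x - ox) < clearance:
                # steer away from the obstacle's side of the lane
                x += lateral_step if x >= ox else -lateral_step
        trajectory.append((x, y))
    return trajectory
```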
Receiving, when it is determined that a front view of the vehicle cannot be secured, lane information in which image information photographed through a camera is mapped with positional information indicating where the image information was photographed;
Generating a virtual lane based on the received lane information and the location information of the vehicle; And
Controlling one or more devices in the vehicle to provide the generated virtual lane
A method for controlling a vehicle, comprising the above,
Wherein the receiving step receives the sensor information and the trajectory information sensed by at least one other vehicle located in front of the vehicle through inter-device communication,
Wherein the generating step generates the virtual lane based on the received lane information, the position information of the vehicle, and the trajectory information,
Wherein the determining step comprises:
Determining whether or not the front view of the vehicle can be secured based on a result of sensing using a camera and a rain sensor.
Wherein the generating comprises:
Excluding the received trajectory information and generating the virtual lane when a difference between the received trajectory information and trajectory information acquired from the received lane information exceeds a predetermined level.
Wherein the providing step comprises:
Controlling one or more devices in the vehicle to provide the generated virtual lane and the sensor information.
Generating a safety trajectory on the traveling path of the vehicle based on the received lane information, sensor information, trajectory information, and position information of the vehicle; and
Controlling one or more devices in the vehicle to provide the generated safety trajectory
And controlling the vehicle.
Wherein the receiving comprises:
Receiving the lane information from a database through a base station, and receiving at least one of sensor information and trajectory information via inter-device communication from another vehicle located in the vicinity of the vehicle.
Wherein the generating comprises:
Generating a virtual lane based on the received lane information and the position information of the vehicle, detecting an obstacle and a dangerous vehicle located on a traveling route using the sensor information and the trajectory information of another vehicle located on the virtual lane, and generating the safety trajectory based on the result of the detection.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150038431A KR101782423B1 (en) | 2015-03-19 | 2015-03-19 | Audio navigation device, vehicle having the same, user device, and method for controlling vehicle |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150038431A KR101782423B1 (en) | 2015-03-19 | 2015-03-19 | Audio navigation device, vehicle having the same, user device, and method for controlling vehicle |
Publications (2)
Publication Number | Publication Date |
---|---|
KR20160113417A KR20160113417A (en) | 2016-09-29 |
KR101782423B1 true KR101782423B1 (en) | 2017-09-29 |
Family
ID=57073632
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020150038431A KR101782423B1 (en) | 2015-03-19 | 2015-03-19 | Audio navigation device, vehicle having the same, user device, and method for controlling vehicle |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR101782423B1 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20200101518A (en) * | 2019-01-30 | 2020-08-28 | 한국자동차연구원 | Method for road lane management based on multi-vehicle driving information and system for the same |
KR20210148518A (en) | 2020-05-29 | 2021-12-08 | 서울대학교산학협력단 | Apparatus and method for virtual lane generation based on traffic flow for autonomous driving in severe weather condition |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20190061137A (en) * | 2017-11-27 | 2019-06-05 | 현대모비스 주식회사 | Apparatus for keeping virtual lane and method thereof |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007210458A (en) * | 2006-02-09 | 2007-08-23 | Nissan Motor Co Ltd | Display device for vehicle and image display control method for vehicle |
2015-03-19: KR application KR1020150038431A granted as patent KR101782423B1 (active, IP Right Grant)
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007210458A (en) * | 2006-02-09 | 2007-08-23 | Nissan Motor Co Ltd | Display device for vehicle and image display control method for vehicle |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20200101518A (en) * | 2019-01-30 | 2020-08-28 | 한국자동차연구원 | Method for road lane management based on multi-vehicle driving information and system for the same |
KR102269625B1 (en) | 2019-01-30 | 2021-06-28 | 한국자동차연구원 | Method for road lane management based on multi-vehicle driving information and system for the same |
KR20210148518A (en) | 2020-05-29 | 2021-12-08 | 서울대학교산학협력단 | Apparatus and method for virtual lane generation based on traffic flow for autonomous driving in severe weather condition |
Also Published As
Publication number | Publication date |
---|---|
KR20160113417A (en) | 2016-09-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR101744724B1 (en) | Audio navigation device, vehicle having the same, user device, and method for controlling vehicle | |
US10753757B2 (en) | Information processing apparatus and information processing method | |
EP3683102B1 (en) | Systems for driver assistance | |
US20190193738A1 (en) | Vehicle and Control Method Thereof | |
WO2017057055A1 (en) | Information processing device, information terminal and information processing method | |
CN106467060A (en) | Display device and the vehicle including this display device | |
KR20180132922A (en) | Vehicle display devices and vehicles | |
US20160275360A1 (en) | Vehicle and method for controlling the same | |
CN107102347B (en) | Position sensing device, vehicle having the same, and method of controlling the same | |
US11915452B2 (en) | Information processing device and information processing method | |
CN109196557A (en) | Image processing apparatus, image processing method and vehicle | |
JP7374098B2 (en) | Information processing device, information processing method, computer program, information processing system, and mobile device | |
EP3538846B1 (en) | Using map information to smooth objects generated from sensor data | |
US10099616B2 (en) | Vehicle and method for controlling the vehicle | |
JP7371629B2 (en) | Information processing device, information processing method, program, and vehicle | |
US11590985B2 (en) | Information processing device, moving body, information processing method, and program | |
WO2019049828A1 (en) | Information processing apparatus, self-position estimation method, and program | |
US20200230820A1 (en) | Information processing apparatus, self-localization method, program, and mobile body | |
KR101782423B1 (en) | Audio navigation device, vehicle having the same, user device, and method for controlling vehicle | |
KR101650791B1 (en) | Vehicle, and method for controlling thereof | |
US20200357284A1 (en) | Information processing apparatus and information processing method | |
KR20170110800A (en) | Navigation Apparutaus and Driver Assistance Apparatus Having The Same | |
JP2019100942A (en) | Mobile object, positioning system, positioning program and positioning method | |
WO2023068116A1 (en) | On-vehicle communication device, terminal device, communication method, information processing method, and communication system | |
WO2022075062A1 (en) | Object position detection device, object position detection system, and object position detection method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
A201 | Request for examination | ||
E902 | Notification of reason for refusal | ||
E90F | Notification of reason for final refusal | ||
E701 | Decision to grant or registration of patent right | ||
GRNT | Written decision to grant |