KR101782423B1 - Audio navigation device, vehicle having the same, user device, and method for controlling vehicle

Audio navigation device, vehicle having the same, user device, and method for controlling vehicle

Info

Publication number
KR101782423B1
Authority
KR
South Korea
Prior art keywords
information
vehicle
lane
generating
received
Prior art date
Application number
KR1020150038431A
Other languages
Korean (ko)
Other versions
KR20160113417A (en)
Inventor
강경현
강기동
노희진
윤석영
김성운
백빛나
김가희
허종혁
김치성
Original Assignee
Hyundai Motor Company (현대자동차주식회사)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hyundai Motor Company
Priority to KR1020150038431A
Publication of KR20160113417A
Application granted
Publication of KR101782423B1

Classifications

    • G — PHYSICS
    • G08 — SIGNALLING
    • G08G — TRAFFIC CONTROL SYSTEMS
    • G08G1/00 — Traffic control systems for road vehicles
    • G08G1/16 — Anti-collision systems
    • G08G1/167 — Driving aids for lane monitoring, lane changing, e.g. blind spot detection
    • B — PERFORMING OPERATIONS; TRANSPORTING
    • B60 — VEHICLES IN GENERAL
    • B60W — CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00 — Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/10 — Path keeping
    • B60W30/12 — Lane keeping
    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01S — RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00 — Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/01 — Satellite radio beacon positioning systems transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/03 — Cooperating elements; Interaction or communication between different cooperating elements or between cooperating elements and receivers
    • G01S19/07 — Cooperating elements providing data for correcting measured positioning data, e.g. DGPS [differential GPS] or ionosphere corrections
    • G — PHYSICS
    • G08 — SIGNALLING
    • G08G — TRAFFIC CONTROL SYSTEMS
    • G08G1/00 — Traffic control systems for road vehicles
    • G08G1/16 — Anti-collision systems
    • G08G1/161 — Decentralised systems, e.g. inter-vehicle communication
    • H04N5/225

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Traffic Control Systems (AREA)

Abstract

Disclosed are a vehicle and a method of controlling the vehicle. The vehicle according to one aspect includes: a determination unit that determines whether a forward view from the vehicle can be secured; a communication unit that, when it is determined that the forward view cannot be secured, receives lane information in which position information at the time of capture is mapped to image information captured by a camera; a generating unit that generates a virtual lane based on the received lane information and the position information of the vehicle; and a control unit that controls one or more devices in the vehicle to provide the generated virtual lane.

Description

BACKGROUND OF THE INVENTION

1. Field of the Invention

The present disclosure relates to a vehicle that provides information about roads, and to a method of controlling the vehicle.

Advanced Driver Assistance Systems (ADAS) have recently been introduced into vehicles. A driver assistance system is a system that assists the driver in driving the vehicle, for example by warning of an impending collision with a vehicle ahead or of lane departure due to drowsy driving. In addition, various other systems have been introduced to assist the driver in driving the vehicle.

A vehicle according to one aspect includes: a determination unit that determines whether a forward view from the vehicle can be secured; a communication unit that, when it is determined that the forward view cannot be secured, receives lane information in which position information at the time of capture is mapped to image information captured by a camera; a generating unit that generates a virtual lane based on the received lane information and the position information of the vehicle; and a control unit that controls one or more devices in the vehicle to provide the generated virtual lane.

In addition, the determination unit may determine whether the forward view from the vehicle can be secured based on the result of detection using the camera and the rain sensor.

In addition, the communication unit may receive sensor information and trajectory information detected by at least one other vehicle located ahead of the vehicle through device-to-device communication.

The generating unit may generate the virtual lane based on the received lane information, the position information of the vehicle, and the trajectory information.

Further, when the difference between the received trajectory information and the trajectory information obtained from the received lane information exceeds a predetermined level, the generating unit may exclude the received trajectory information when generating the virtual lane.
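The patent does not specify how this exclusion would be computed. Purely as an illustration, a trajectory might be represented as a list of heading angles sampled along the road, with the received trajectory discarded when its mean deviation from the lane-derived trajectory exceeds a hypothetical threshold:

```python
def filter_trajectory(received, derived, threshold_deg=15.0):
    """Keep the received trajectory only if it agrees with the trajectory
    derived from the lane information; otherwise exclude it.

    received, derived: lists of heading angles (degrees) sampled at the
    same positions along the road. threshold_deg is a hypothetical limit.
    """
    # Mean absolute difference between the two heading sequences.
    diff = sum(abs(r - d) for r, d in zip(received, derived)) / len(received)
    if diff > threshold_deg:
        return None  # exclude the received trajectory from lane generation
    return received

# A received trajectory close to the derived one is kept...
assert filter_trajectory([0.0, 5.0, 10.0], [1.0, 4.0, 11.0]) is not None
# ...while one that deviates too much is excluded.
assert filter_trajectory([0.0, 5.0, 10.0], [40.0, 50.0, 60.0]) is None
```

The threshold and the mean-absolute-difference metric are assumptions for illustration; any measure of trajectory disagreement would serve the same role.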

The control unit may control one or more devices in the vehicle to provide the generated virtual lane and the sensor information.

A vehicle according to another aspect includes: a camera that photographs the scene ahead of the vehicle and generates image information; a position measuring device that measures position information at the time the image information is captured; an acquiring unit that acquires lane information based on the image information and the position information; and a communication unit that transmits the lane information to a database.

Further, the position measuring device may be a DGPS (Differential Global Positioning System).

The acquiring unit may acquire the lane information by mapping the capture-time position information onto the image information.

A vehicle according to another aspect includes: a communication unit that receives, from a database, lane information in which position information at the time of capture is mapped to image information captured through a camera, and that receives sensor information and trajectory information from another vehicle located in the vicinity of the vehicle; a generating unit that generates a safety trajectory on the traveling path of the vehicle based on the received lane information, sensor information, trajectory information, and position information of the vehicle; and a control unit that controls at least one device in the vehicle to provide the generated safety trajectory.

The communication unit may receive the lane information from the database through a base station, and may receive at least one of the sensor information and the trajectory information from another vehicle located in the vicinity of the vehicle through device-to-device communication.

The generating unit may generate a virtual lane based on the received lane information and the position information of the vehicle, determine obstacles and dangerous vehicles located on the virtual lane based on the received sensor information and trajectory information of the other vehicle, and generate the safety trajectory based on the determination result.
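The patent does not give an algorithm for this step. Purely as an illustrative sketch, a safety trajectory that steers the virtual lane's centerline around obstacles reported by other vehicles' sensor information might look like this, where paths are lists of (distance-along-road, lateral-offset) points and the clearance margin is a hypothetical parameter:

```python
def generate_safety_trajectory(virtual_lane, obstacles, clearance=1.5):
    """Shift the virtual lane's centerline sideways around reported obstacles.

    virtual_lane: list of (s, lateral_offset) points along the path.
    obstacles: list of (s, lateral_offset) positions taken from other
    vehicles' sensor information. clearance is a hypothetical margin in metres.
    """
    safety = []
    for s, offset in virtual_lane:
        for obs_s, obs_offset in obstacles:
            # Obstacle near this point of the lane: steer around it.
            if abs(s - obs_s) < 5.0 and abs(offset - obs_offset) < clearance:
                offset = obs_offset + clearance
        safety.append((s, offset))
    return safety

lane = [(0, 0.0), (10, 0.0), (20, 0.0)]
# Sensor info from a vehicle ahead reports an obstacle at s=10, centered.
result = generate_safety_trajectory(lane, [(10, 0.0)])
assert result == [(0, 0.0), (10, 1.5), (20, 0.0)]
```

A real implementation would also smooth the resulting path and account for dangerous vehicles that are moving, which this sketch omits.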

A method of controlling a vehicle according to one aspect includes: determining whether a forward view from the vehicle can be secured; receiving, when it is determined that the forward view cannot be secured, lane information in which position information at the time of capture is mapped to image information captured through a camera; generating a virtual lane based on the received lane information and the position information of the vehicle; and controlling one or more devices in the vehicle to provide the generated virtual lane.
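The four steps of this control method can be sketched as a simple control flow; the unit interfaces (`front_view_secured`, `receive_lane_info`, and so on) are hypothetical names introduced only for illustration and are not part of the claims:

```python
def control_vehicle(determiner, comm, generator, controller, vehicle_pos):
    """One pass of the claimed control method, as an illustrative sketch."""
    # Step 1: determine whether the forward view can be secured.
    if determiner.front_view_secured():
        return None  # view is clear; no virtual lane needed
    # Step 2: receive lane information (image info with mapped positions).
    lane_info = comm.receive_lane_info()
    # Step 3: generate a virtual lane from lane info and vehicle position.
    virtual_lane = generator.generate(lane_info, vehicle_pos)
    # Step 4: provide the virtual lane through in-vehicle devices.
    controller.provide(virtual_lane)
    return virtual_lane
```

In the claimed vehicle, the four roles would correspond to the determination unit, communication unit, generating unit, and control unit described above.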

Also, the determining step may determine whether the front view of the vehicle can be secured based on the result of the detection using the camera and the rain sensor.

In addition, the receiving step may receive sensor information and trajectory information detected by at least one other vehicle located ahead of the vehicle through device-to-device communication.

The generating step may generate the virtual lane based on the received lane information, the position information of the vehicle, and the trajectory information.

The generating step may generate the virtual lane by excluding the received trajectory information when the difference between the received trajectory information and the trajectory information obtained from the received lane information exceeds a predetermined level.

Further, the providing step may control at least one device in the vehicle to provide the generated virtual lane and the sensor information.

A method of controlling a vehicle according to another aspect includes: photographing the scene ahead of the vehicle to generate image information; measuring position information at the time the image information is captured; acquiring lane information based on the image information and the position information; and transmitting the lane information to a database.

Further, the position information may be measured by a DGPS.

The acquiring step may acquire the lane information by mapping the capture-time position information onto the image information.

A method of controlling a vehicle according to another aspect includes: receiving, from a database, lane information in which position information at the time of capture is mapped to image information captured through a camera, and receiving at least one of sensor information and trajectory information from another vehicle located in the vicinity of the vehicle; generating a safety trajectory on the traveling path of the vehicle based on the received lane information, sensor information, trajectory information, and position information of the vehicle; and controlling one or more devices in the vehicle to provide the generated safety trajectory.

The receiving step may receive the lane information from the database through a base station, and may receive at least one of the sensor information and the trajectory information from another vehicle located in the vicinity of the vehicle through device-to-device communication.

The generating step may include generating a virtual lane based on the received lane information and the position information of the vehicle, determining obstacles and dangerous vehicles located on the virtual lane based on sensor information and trajectory information of the other vehicle, and generating the safety trajectory based on the determination result.

FIG. 1 is a view schematically showing the external configuration of a vehicle according to an embodiment.
FIG. 2 is a view showing the internal configuration of a vehicle according to an embodiment.
FIG. 3 is a block diagram of a vehicle that generates lane information and transmits it to a database.
FIG. 4 is a diagram illustrating a large-scale antenna system of a base station according to a 5G communication scheme according to an embodiment.
FIGS. 5A to 5C are diagrams showing communication methods in a 5G network.
FIG. 6 is an operational flow chart of a vehicle that generates lane information and transmits it to a database.
FIG. 7 is a block diagram of a vehicle that provides a virtual lane using lane information.
FIG. 8 is an operational flow chart of a vehicle that provides a virtual lane using lane information.
FIGS. 9 and 10 are views showing virtual lanes according to different embodiments displayed on the front window.
FIG. 11 is a view showing a screen displaying sensor information of a preceding vehicle through a display according to an embodiment.
FIG. 12 is a flowchart illustrating the operation of a vehicle that provides a safety trajectory according to an embodiment.
FIG. 13 is a view showing a safety trajectory added to a virtual lane and displayed on the front window according to an embodiment.
FIG. 14 is a view showing a screen on which a safety trajectory is displayed through a display according to an embodiment.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings.

FIG. 1 is a view schematically showing the external configuration of a vehicle according to an embodiment.

Referring to FIG. 1, a vehicle 200 includes a vehicle body 80 that forms the exterior of the vehicle 200, and wheels 93 and 94 that move the vehicle 200. The vehicle body 80 includes a hood 81, a front fender 82, doors 84, a luggage compartment lid 85, quarter panels 86, and the like.

The vehicle body 80 may be provided with a front window 87 on its front side to give a view ahead of the vehicle 200, side windows 88 to give side views, side mirrors 91 and 92 on the doors 84 to give views of the rear and sides of the vehicle 200, and a rear window 90 on the rear side of the vehicle body 80 to give a view behind the vehicle 200. Hereinafter, the internal configuration of the vehicle 200 will be described in detail.

FIG. 3 is a block diagram of a vehicle according to an embodiment, FIG. 4 is a diagram illustrating a large-scale antenna system of a base station according to a 5G communication scheme according to an embodiment, and FIGS. 5A to 5C are diagrams showing communication methods in a 5G network. These figures are described together to avoid duplication.

The vehicle 200 is provided with an air conditioner that performs both heating and cooling, and can control the temperature inside the vehicle 200 by discharging heated or cooled air through a vent 153. The air conditioner 196 described below can automatically control the air-conditioning environment, including the indoor and outdoor environmental conditions of the vehicle 200, the intake and exhaust of air, circulation, and the cooling/heating state.

Referring to FIG. 2, an AVN (Audio Video Navigation) terminal 100 may be provided inside the vehicle 200. The AVN terminal 100 is a terminal that can provide a navigation function, guiding the user along a route to a destination, and can integrally provide audio and video functions. The AVN terminal 100 can selectively display at least one of an audio screen, a video screen, and a navigation screen through the AVN display 101, as well as various control screens related to the control of the vehicle 200 and screens for additional functions that can be executed on the AVN terminal 100.

According to one embodiment, the AVN terminal 100 can display various control screens related to the control of the air conditioner through the AVN display 101 in cooperation with the above-described air conditioner. In addition, the AVN terminal 100 can control the operation state of the air conditioner 196 to adjust the air conditioning environment in the vehicle.

Meanwhile, the AVN display 101 may be located in the center fascia 11, the central area of the dashboard 10. According to one embodiment, the AVN display 101 may be implemented as a liquid crystal display (LCD), light emitting diode (LED) display, plasma display panel (PDP), organic light emitting diode (OLED) display, or cathode ray tube (CRT), but is not limited thereto. In addition, as will be described later, the vehicle 200 can display information necessary for driving through a head-up display (HUD) or a windshield display.

A speaker 143 capable of outputting sound may be provided inside the vehicle 200. Accordingly, the vehicle 200 can output the sounds necessary for performing the audio function, the video function, the navigation function, and other additional functions through the speaker 143.

The navigation input unit 102 may be located in the center fascia 11, the central area of the dashboard 10. The driver can input various control commands by operating the navigation input unit 102. The navigation input unit 102 may also be provided as hard keys in an area adjacent to the AVN display 101. When the AVN display 101 is implemented as a touch screen, the AVN display 101 can also perform the function of the navigation input unit 102.

Meanwhile, the center console 40 may be provided with a center input unit 43 of a jog-shuttle or hard-key type. The center console 40 refers to the portion between the driver's seat 21 and the front passenger's seat 22 where the gear operating lever 41 and the tray 42 are formed. The center input unit 43 may perform all or some of the functions of the navigation input unit 102.

Referring to FIG. 3, a vehicle 200 according to an embodiment may include a camera 131, a position measuring device 132, an acquiring unit 130, a communication unit 140, and a control unit 150. The acquiring unit 130, the communication unit 140, and the control unit 150 may be integrated into one or more system-on-chips (SoCs) built into the vehicle 200 and may be operated by a processor.

The camera 131 is provided on the vehicle 200 and can photograph the road ahead of the vehicle 200 to generate image information. Here, the image information includes any still images or moving images taken of the road ahead of the vehicle 200. According to one embodiment, the camera 131 is installed in the headlining 13 so as to photograph the road ahead, but the position of the camera 131 is not limited to this embodiment; it can be installed anywhere from which the road ahead can be photographed. Here, the forward direction refers to the direction of view through the front window 87 shown in FIG. 2 from inside the vehicle 200.

Meanwhile, the position measuring device 132 is provided in the vehicle 200 and can measure the position information of the vehicle 200. The position information means information that can identify the location of the vehicle 200. For example, the position information includes coordinate information consisting of longitude, latitude, altitude, and the like, but is not limited thereto; it includes any information capable of locating the vehicle 200.

Meanwhile, the position measuring device 132 includes a GPS (Global Positioning System), which determines position via satellites, and a DGPS (Differential Global Positioning System), which measures position with higher accuracy by correcting the GPS, but is not limited thereto. In general, the position transmitted from a satellite to a terrestrial GPS receiver contains an error. For example, when N (N ≥ 2) GPS receivers are located close to each other, the N receivers have similar errors. A DGPS can obtain more precise data by cancelling out this common error of the N receivers.
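The common-error cancellation described here can be shown with a deliberately simplified, one-dimensional toy model (not an actual DGPS algorithm): a reference receiver at a known position observes the shared error, which is then subtracted from a nearby rover's measurement.

```python
def dgps_correct(rover_measured, ref_measured, ref_true):
    """Correct a rover position using a nearby reference station.

    All values are 1-D positions in metres. The reference station knows
    its true position, so (ref_measured - ref_true) estimates the error
    common to both receivers, which is then removed from the rover's
    reading.
    """
    common_error = ref_measured - ref_true
    return rover_measured - common_error

# Both receivers see the same +3.0 m satellite-induced error.
assert dgps_correct(rover_measured=103.0, ref_measured=53.0, ref_true=50.0) == 100.0
```

Real DGPS works on per-satellite pseudorange corrections rather than final positions, but the cancellation principle is the same.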

The acquiring unit 130 may acquire lane information by mapping the position information measured through the position measuring device 132 at the time of capture onto the image information generated through the camera 131. As will be described later, the vehicle 200 receives the lane information stored in the database 170 and can use it to provide a virtual lane to the driver.

Referring to FIG. 3, a communication unit 140 may be provided in the vehicle 200. The communication unit 140 may transmit a radio signal including the lane information to a database 170 provided outside the vehicle 200. Here, the lane information includes information about the lanes, for example the travel angles of a plurality of vehicles driving on the road, that is, their travel trajectories. The travel direction of vehicles differs depending on the road; unlike on a straight road, on a curved road the travel angle, that is, the travel direction, may change continuously with the position of the vehicle. The database 170 accumulates information about travel trajectories as a function of position on the road, so that information about the average travel trajectory can be stored. Information about the travel trajectory can be measured from the steering wheel angle in the vehicle or from a compass sensor and uploaded to the database 170 via the communication network. Here, the compass sensor is a sensor that can sense bearing (magnetic north) and can be used to determine the heading of the vehicle.
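The database-side accumulation of an average travel trajectory per road position could be sketched as follows; the bucketing of positions and all class and method names are assumptions introduced only for illustration:

```python
from collections import defaultdict

class TrajectoryDatabase:
    """Accumulates heading-angle reports keyed by a rounded road position
    and answers queries with the average heading at that position."""

    def __init__(self, bucket_size=10.0):
        self.bucket_size = bucket_size      # metres per position bucket
        self.reports = defaultdict(list)    # bucket -> list of headings

    def _bucket(self, position_m):
        return round(position_m / self.bucket_size)

    def upload(self, position_m, heading_deg):
        # A vehicle reports its heading (e.g. from steering angle or compass).
        self.reports[self._bucket(position_m)].append(heading_deg)

    def average_heading(self, position_m):
        samples = self.reports[self._bucket(position_m)]
        return sum(samples) / len(samples) if samples else None

db = TrajectoryDatabase()
db.upload(102.0, 10.0)   # one vehicle reports heading 10° near 100 m
db.upload(104.0, 14.0)   # another reports 14° at almost the same spot
assert db.average_heading(100.0) == 12.0
```

A production database would key trajectories by latitude/longitude and lane rather than a 1-D distance, but the accumulate-then-average structure is the same.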

The communication unit 140 can transmit and receive wireless signals between devices via a base station through communication methods such as 3G (3rd Generation) and 4G (4th Generation). In addition, through methods such as wireless LAN, Wi-Fi, Bluetooth, ZigBee, Wi-Fi Direct, UWB (Ultra-Wideband), IrDA (Infrared Data Association), BLE (Bluetooth Low Energy), and NFC (Near Field Communication), it can transmit and receive wireless signals containing data with terminals within a predetermined distance.

In addition, the communication unit 140 can transmit and receive radio signals through a 5G (5th Generation) communication system. While the 4G communication method uses frequency bands of 2 GHz and below, the 5G communication method uses a frequency band around 28 GHz. However, the frequency band used by the 5G communication system is not limited thereto.

A large-scale antenna system may be employed for the 5G communication system. A large-scale antenna system refers to a system that can cover up to ultra-high frequency by using dozens or more antennas, and can transmit / receive a large amount of data simultaneously through multiple connections. Specifically, the large-scale antenna system can adjust the arrangement of the antenna elements to transmit and receive radio waves farther in a specific direction, thereby enabling a large-capacity transmission and expanding the usable area of the 5G communication network.

Referring to FIG. 4, the base station 400 can simultaneously exchange data with many devices through a large-scale antenna system. In addition, the large-scale antenna system minimizes radio waves radiated outside the intended direction of propagation, reducing noise, thereby improving transmission quality and reducing power consumption.

In addition, unlike conventional methods that modulate the transmission signal through OFDM (Orthogonal Frequency Division Multiplexing), the 5G communication method can modulate the radio signal using NOMA (Non-Orthogonal Multiple Access). This allows more devices to be connected simultaneously and enables large-capacity transmission and reception at the same time.

For example, the 5G communication method can provide a transmission rate of up to 1 Gbps. Through large-capacity transmission, it can support immersive communication that requires high-capacity transfers, such as UHD (Ultra-HD) video, 3D content, and holograms. As a result, users can exchange increasingly sophisticated and immersive ultra-high-capacity data quickly through the 5G communication method.

In addition, the 5G communication method enables real-time processing with a maximum response time of 1 ms or less. Accordingly, 5G can support real-time services that respond before the user even notices. For example, a vehicle can receive sensor information from various devices while driving and, through real-time processing, provide an autonomous driving system as well as various remote-control functions. The vehicle can also process sensor information from other vehicles around it in real time through the 5G communication system, informing the user in real time of the possibility of a collision and of traffic conditions on the driving route.

In addition, through the real-time processing and large-capacity transmission provided by 5G communication, the vehicle can provide big-data services to its passengers. For example, the vehicle can analyze various web information, SNS information, and the like and provide customized information suited to the passengers' situation. In one embodiment, the vehicle collects various kinds of nearby attraction and tourist information along the travel route through big-data mining and provides it in real time, so that passengers can directly check the information available around the area they are driving through.

Meanwhile, a 5G communication network can further subdivide its cells to support high-density, large-capacity transmission. Here, a cell is a small region into which a wide area is divided in order to use frequencies efficiently, with a low-power base station installed in each cell to support communication between terminals. For example, by further reducing the cell size, the 5G network can be formed as a two-tier structure of macro-cell base station, distributed small base stations, and communication terminals.

Also, in a 5G network, radio signals can be relayed through multihop transmission. For example, as shown in FIG. 5A, the first terminal 401 may relay to the base station 400 a radio signal that the third terminal 403, located outside the network of the base station 400, wants to transmit. Likewise, the first terminal 401 may relay to the base station 400 a radio signal that the second terminal 402, located inside the network of the base station 400, wants to transmit. As described above, at least one of the devices able to use the 5G network may perform relay transmission through the multihop method, but the method is not limited thereto. This makes it possible to expand the area covered by the 5G network and to mitigate the buffering problem caused by a large number of users in a cell.
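The single-hop relay of FIG. 5A can be illustrated schematically. Terminal reach is modelled here with a simple 1-D range check, which is a toy assumption for illustration only:

```python
def deliver(sender, base_station, terminals):
    """Deliver sender's signal to the base station, relaying through at
    most one in-range terminal when the sender is out of range (one hop,
    as in FIG. 5A). Positions are 1-D coordinates; in_range is a toy
    model of radio reach."""
    def in_range(a, b, radius=10.0):
        return abs(a - b) <= radius

    if in_range(sender, base_station):
        return [sender, base_station]             # direct transmission
    for relay in terminals:
        if in_range(sender, relay) and in_range(relay, base_station):
            return [sender, relay, base_station]  # relayed transmission
    return None                                   # undeliverable

# The terminal at 18 is outside the base station's range (radius 10
# around 0) but can reach it via the relay terminal at 9.
assert deliver(sender=18, base_station=0, terminals=[9]) == [18, 9, 0]
```

A general multihop scheme would search for chains of any length (e.g. breadth-first over the terminal graph); the one-hop case shown matches the FIG. 5A scenario.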

Meanwhile, the 5G communication system supports device-to-device (D2D) communication, applicable to vehicles, wearable devices, and the like. D2D communication means that devices communicate directly, transmitting wireless signals containing not only data sensed through their sensors but also various data stored in the devices. With D2D communication there is no need to exchange wireless signals via a base station; since radio signals travel directly between devices, unnecessary energy can be saved. For a vehicle, wearable device, or the like to use the 5G communication system, an antenna must be built into the device.

The vehicle 200 can transmit and receive wireless signals to and from other vehicles around it through D2D communication. For example, as shown in FIG. 5B, the vehicle 200 can communicate with other vehicles 201, 202, and 203 in its vicinity. In addition, the vehicle 200 can communicate with a traffic information device (not shown) installed at an intersection or the like.

As another example, as shown in FIG. 5C, the vehicle 200 can transmit and receive wireless signals to and from the first vehicle 201 and the third vehicle 203 through D2D communication, and the third vehicle 203 can exchange data with the vehicle 200 and the second vehicle 202 through D2D communication. That is, a virtual network is formed among the plurality of vehicles 200, 201, 202, and 203 located within D2D communication range, through which wireless signals can be transmitted and received.

Meanwhile, the 5G communication network expands the area in which D2D communication is supported, enabling communication with devices located farther away. In addition, since it supports real-time processing with response times of 1 ms or less and high-capacity communication of 1 Gbps or more, moving vehicles can exchange signals containing the desired data.

For example, through the 5G communication system a vehicle can communicate in real time with other vehicles, servers, systems, and the like around it, exchanging data, and can provide various services such as route guidance through augmented reality.

In addition, the vehicle can transmit and receive wireless signals containing data through a base station or through D2D communication using bands other than those mentioned above; it is not limited to communication methods using the above frequency bands.

For example, as will be described later, the communication unit 140 can receive trajectory information and sensor information from at least one other vehicle nearby through D2D communication. This is described in detail later.

Meanwhile, the database 170 may be provided on a server, system, storage medium, or the like located outside the vehicle 200 and operated by the manufacturer of the vehicle 200 or by another operator. For example, the database 170 may be stored in a cloud server capable of uploading and downloading data through a cloud service.

A cloud service is a service that stores various data on a cloud server and downloads it whenever necessary. For example, the cloud service may be built on an application programming interface (API), but is not limited thereto. When a clear forward view is difficult to secure, the vehicle 200 can provide the driver with a virtual lane using the lane information stored in the database 170, helping the driver drive safely. A detailed description is given later.

Hereinafter, an operation flow chart of a vehicle for acquiring lane information and delivering it to a database will be described.

The vehicle can generate image information of the scene ahead through the camera (1000). Here, "ahead" means the direction in which the driver looks through the front window from inside the vehicle. The image information includes still images or moving images of the road ahead of the vehicle; as will be described later, the vehicle acquires information such as lanes and obstacles from these images.

Also, the vehicle can measure the position information at the time the image information is captured, through the position measuring device (1010). As described above, the position measuring device can measure the position information of the vehicle and thus determine where the vehicle was when the corresponding image information was captured.

Accordingly, the vehicle can acquire the lane information by mapping the position information at the time the image information was captured to the image information (1020). Because the lane information maps the capture-time position information to the image information, the vehicle can determine what the road environment looks like at each position. For example, the vehicle can determine from the lane information how sharply the road curves at any given position.
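Step 1020 amounts to tagging a camera frame with the position measured in step 1010. A minimal sketch, assuming invented field names (the patent does not specify a record format):

```python
from dataclasses import dataclass

@dataclass
class LaneInfo:
    latitude: float      # position when the frame was captured (step 1010)
    longitude: float
    heading_deg: float   # vehicle heading, lets later steps judge road curvature
    image: bytes         # front-camera frame from step 1000

def make_lane_info(frame: bytes, gps_fix: tuple) -> LaneInfo:
    """Map capture-time position information onto the image information (step 1020)."""
    lat, lon, heading = gps_fix
    return LaneInfo(lat, lon, heading, frame)

info = make_lane_info(b"\x89JPG-bytes", (37.5665, 126.9780, 92.5))
```

The resulting record is what the text calls "lane information": image data from which lanes can be extracted, plus the position it belongs to.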

Further, the vehicle can receive sensor information from a vehicle traveling ahead of it through inter-device communication. The sensor information includes the results detected by the various sensors built into the preceding vehicle, so the vehicle can determine the state of the road from it. For example, from the sensor information the vehicle can determine whether there is ice on the road, whether the road surface is slippery, or whether a speed bump is present. Details will be described later.
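Interpreting such a sensor message can be sketched as below. The message schema (`ice_detected`, `low_friction`, `speed_bump`) is invented for illustration; the patent only says the message carries road-state detections from the preceding vehicle's sensors.

```python
def road_warnings(sensor_msg: dict) -> list:
    """Translate road-state flags from a preceding vehicle into driver warnings."""
    warnings = []
    if sensor_msg.get("ice_detected"):
        warnings.append("icy surface ahead")
    if sensor_msg.get("low_friction"):
        warnings.append("slippery road ahead")
    if sensor_msg.get("speed_bump"):
        warnings.append("speed bump ahead")
    return warnings

# a preceding vehicle reports ice and a speed bump over device-to-device comms
msgs = road_warnings({"ice_detected": True, "speed_bump": True})
```

The resulting strings correspond to the pop-up messages described later (e.g. the ice warning shown via the display in FIG. 11).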

The vehicle can transmit the lane information to the database. As described above, since the vehicle is equipped with an antenna, it can transmit the lane information to the database through various communication methods.

Referring to FIG. 7, the vehicle 200 includes an AVN display 101, a camera 131, a position measuring device 132, a rain sensor 133, a determination unit 110, a generating unit 120, a communication unit 140, a speaker 143, and a control unit 160.

Since the camera 131 has already been described above with reference to FIG. 3, a description thereof will be omitted.

The rain sensor 133 refers to a sensor that detects the intensity and amount of rainwater. Based on its detection result, the speed or operation interval of the wipers can be controlled automatically, even without the driver operating them.

The determination unit 110 may determine whether the front view of the vehicle 200 can be secured by using at least one of the camera 131 and the rain sensor 133. Here, a case where the front view cannot be secured means a situation in which it is difficult to secure the front view of the vehicle 200, for example when the driver cannot identify the lane because of heavy rain or fog.

According to an embodiment, the determination unit 110 may detect the intensity or amount of rainwater using the rain sensor 133, and may determine that the front view of the vehicle 200 cannot be secured if the detected value exceeds a preset level.

Alternatively, the determination unit 110 can determine whether the front view of the vehicle 200 can be secured through the camera 131. For example, the determination unit 110 may determine whether objects are normally identified in the image information generated through the camera 131. If the objects included in the image information are not clearly identified, or cannot be identified at all, the determination unit 110 can determine that the front view cannot be secured. In addition, the determination unit 110 may combine the determination results obtained through the camera 131 and the rain sensor 133.
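Combining the two checks can be sketched as follows. The thresholds and the use of a single contrast number as the image-clarity measure are assumptions for the sketch; the patent does not specify how "normally identified" is quantified.

```python
RAIN_LIMIT = 0.7       # normalized rain intensity above which the view is deemed lost
CONTRAST_LIMIT = 0.2   # below this, lane markings are assumed unidentifiable

def front_view_secured(rain_intensity: float, image_contrast: float) -> bool:
    """Return False when either the rain sensor or the camera check fails,
    mirroring the combined determination described above."""
    if rain_intensity > RAIN_LIMIT:      # rain-sensor check (rain sensor 133)
        return False
    if image_contrast < CONTRAST_LIMIT:  # image-clarity check (camera 131)
        return False
    return True
```

When this returns `False`, the flow in FIG. 8 proceeds to request lane information from the database (step 1110).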

Meanwhile, the communication unit 140 may be provided in the vehicle 200. Since its communication methods are the same as those described above, a description thereof is omitted. The communication unit 140 can receive a radio signal including the lane information from the database 170. At this time, the communication unit 140 may receive all the lane information on the driving route from the database 170 at once, or may receive, in real time, only the lane information corresponding to the position of the area where the view is difficult to secure. The lane information is information in which the image information about the road is mapped to the position information at the time the image information was captured; its detailed description is the same as above and is therefore omitted.

Further, the communication unit 140 can receive at least one of sensor information and trajectory information from a nearby preceding vehicle through inter-device communication. The sensor information means information detected by the various sensors built into the preceding vehicle, for example information detected through radar and the like.

Further, the trajectory information means information on the running trajectory of the preceding vehicle. For example, it includes information on the steering wheel angle of the preceding vehicle, the degree to which the brake pedal is pressed, the degree to which the accelerator pedal is pressed, and whether the ESP (Electronic Stability Program) has intervened.

The generating unit 120 can generate the virtual lane by associating the received lane information with the current position information obtained from the position measuring device 132. Here, the virtual lane may be displayed as a lane corresponding to the actual lane, superimposed on the real world through augmented reality, or displayed in a specific view such as a bird's-eye view. That is, the vehicle 200 according to the disclosed embodiment provides the driver with a virtual lane corresponding to the actual lane at a position where visibility is difficult to secure, thereby helping to secure the driver's view.

At this time, the generating unit 120 may generate the virtual lane by combining the lane information, the position information, and the trajectory information. As described above, the trajectory information describes the running trajectory of a preceding vehicle, transmitted from another vehicle ahead of the vehicle 200, so it can reflect the current state of the road, such as the road surface condition.

That is, the vehicle 200 according to the disclosed embodiment can generate the virtual lane using the trajectory information in addition to the lane information and the position information, thereby improving on the accuracy of a virtual lane generated from the lane information and the position information alone. In addition, the generating unit 120 can reduce error by correcting a previously generated virtual lane using the trajectory information.

However, the trajectory information received from the preceding vehicle may deviate from the actual lane due to a fault of the preceding vehicle's driver. Accordingly, the generating unit 120 compares the trajectory information obtained from the lane information received from the database 170 with the trajectory information received from the preceding vehicle, and if the difference exceeds a predetermined level, it can exclude the received trajectory information when generating the virtual lane. That is, if the trajectory information received from the preceding vehicle differs greatly from the average trajectory, the generating unit 120 can prevent an accident by excluding it.
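The outlier rejection described above can be sketched as a lateral-offset test. The deviation metric (mean lateral offset from a database-derived centerline) and the threshold are assumptions for illustration; the patent only states that trajectories differing by more than a predetermined level are excluded.

```python
MAX_DEVIATION_M = 1.5  # assumed "predetermined level" of lateral deviation

def filter_trajectories(db_centerline, trajectories):
    """Keep only preceding-vehicle trajectories whose mean lateral offset from
    the database centerline stays within MAX_DEVIATION_M."""
    kept = []
    for traj in trajectories:
        offsets = [abs(y - cy) for (_, y), (_, cy) in zip(traj, db_centerline)]
        if sum(offsets) / len(offsets) <= MAX_DEVIATION_M:
            kept.append(traj)
    return kept

centerline = [(x, 0.0) for x in range(5)]     # lane center from database lane info
good = [(x, 0.2) for x in range(5)]           # preceding vehicle following the lane
drifting = [(x, 3.0) for x in range(5)]       # driver fault: far off the lane
kept = filter_trajectories(centerline, [good, drifting])
```

Only the conforming trajectory survives, so the drifting vehicle's path never distorts the virtual lane.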

Meanwhile, the generating unit 120 can generate a safety trajectory predicted on the traveling route. Here, the safety trajectory means a trajectory that guides the vehicle 200 to a safe lane when, in view of obstacles predicted on the traveling route or the running trajectories of other vehicles, it is judged dangerous for the vehicle 200 to continue as it is.

As described above, the generating unit 120 can identify all the lanes of the road, and the lane in which the vehicle 200 is located, from the lane information received from the database 170 and the position information of the vehicle 200.

At this time, the generating unit 120 can determine the positions of other vehicles from the position information received from those vehicles. Also, the generating unit 120 can determine whether there is an obstacle in the vicinity of another vehicle based on that vehicle's sensor information, and can compare that vehicle's trajectory information with the virtual lane to determine whether there is an abnormality in its running. Accordingly, if it is determined that an obstacle is present or that another vehicle is running abnormally, the generating unit 120 can generate a safety trajectory that keeps the vehicle safely away from the obstacle or the other vehicle. That is, the generating unit 120 can identify various risks located on the driving route and generate a safety trajectory that avoids them, and the driver can follow the safety trajectory to drive safely away from obstacles or dangerous vehicles.
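A toy version of this decision: per-lane hazard flags are built from other vehicles' sensor and trajectory data, and the nearest hazard-free lane is chosen. The hazard encoding and the nearest-safe-lane rule are assumptions for illustration, not the patent's algorithm.

```python
def choose_safe_lane(current_lane: int, hazards: dict) -> int:
    """hazards maps lane index -> True when an obstacle or abnormally running
    vehicle was detected there; return the closest hazard-free lane."""
    candidates = [lane for lane, bad in hazards.items() if not bad]
    if not candidates:
        return current_lane  # nowhere safer to go; stay put
    return min(candidates, key=lambda lane: abs(lane - current_lane))

# lane 1 has an obstacle reported via a preceding vehicle's sensor information
safe = choose_safe_lane(1, {0: False, 1: True, 2: False})
```

The chosen lane would then anchor the safety trajectory displayed to the driver (FIGS. 13 and 14).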

Meanwhile, the control unit 160 can control the overall operation. Specifically, the control unit 160 can control not only the various modules built into the AVN terminal, but also the operation of all components, such as the devices mounted on the vehicle 200. The control unit 160 may generate control signals for the components of the vehicle 200 to control the operations of the components described above.

The control unit 160 can control one or more devices in the vehicle 200 to provide virtual lanes. For example, the control unit 160 may control the AVN display 101 to display a virtual lane. Further, the control unit 160 can control the speaker 143 to output information about the virtual lane.

In addition, the control unit 160 can control a windshield display to display the virtual lane on the front window 87, or control a head-up display to do so. The head-up display is a display that provides information using the front window 87 of the vehicle 200; it shows, in the front window 87, the information required for driving in real time, so that the driver can obtain the necessary information while looking ahead. Such a head-up display can reduce the risk of an accident caused by the driver's gaze leaving the road, thereby enabling smooth driving.

Meanwhile, head-up displays include a windshield type, in which light is projected onto a reflector and the reflected light is projected onto the front window 87 to display the virtual lane, and a type that projects the image onto a separate screen; the head-up display of the vehicle 200 according to the disclosed embodiment includes both.

In addition, the control unit 160 can control the various devices mounted on the vehicle 200 that are capable of providing information about the virtual lane to the driver, thereby providing the virtual lane so that the driver can operate the vehicle safely.

Meanwhile, the control unit 160 can also provide the sensor information by controlling those devices. For example, the control unit 160 may display a pop-up message including the sensor information via the AVN display 101, the windshield display, or the head-up display, or may output the sensor information via the speaker 143.

FIG. 8 is an operation flow chart of a vehicle providing a virtual lane using lane information, FIGS. 9 and 10 are views showing virtual lanes according to different embodiments displayed on the front window, and FIG. 11 is a view showing a screen on which sensor information of a preceding vehicle is displayed through a display according to an embodiment.

Referring to FIG. 8, the vehicle can determine whether a front view of the vehicle can be secured through the in-vehicle device (1100). Here, the in-vehicle device refers to a device capable of collecting data as a basis for determining whether or not the front view of the vehicle can be secured.

For example, the vehicle can photograph the front of the vehicle through the camera to generate image information, and determine from the generated image information whether the driver can secure a view. Alternatively, the vehicle may sense rainwater through the rain sensor and, if a predetermined level or more is sensed, judge that the driver cannot secure the view; the method is not limited thereto.

If it is determined that the forward view cannot be secured, the vehicle may receive lane information from the database to aid the driver in driving (1110). For example, if the database is located in an external cloud server, the vehicle can retrieve, using an API key, only the lane information within the driver's driving route out of the lane information stored in the database.

The vehicle may generate a virtual lane based on the lane information and the position information of the vehicle (1120). For example, the vehicle can compare the position information mapped to the image information in the lane information with its own position information, and generate the virtual lane using the image information corresponding to the vehicle's position. At this time, the vehicle may receive trajectory information from a preceding vehicle through inter-device communication and generate a virtual lane reflecting that trajectory information. Further, the vehicle may receive sensor information from the vehicle traveling ahead through inter-device communication and provide it to the driver, so that the driver can operate the vehicle safely.
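The position-matching part of step 1120 can be sketched as a nearest-entry lookup against the downloaded lane information. The tuple layout and function name are assumptions for illustration.

```python
def closest_lane_entry(vehicle_pos, lane_infos):
    """lane_infos: list of ((lat, lon), image_ref) pairs from the database.
    Return the entry whose capture position is closest to the vehicle."""
    def sq_dist(p, q):
        return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2
    return min(lane_infos, key=lambda entry: sq_dist(vehicle_pos, entry[0]))

entries = [((37.10, 127.00), "a.jpg"), ((37.20, 127.00), "b.jpg")]
pos, image = closest_lane_entry((37.19, 127.00), entries)
```

The matched entry's image information would then be the source from which the virtual lane is drawn at the vehicle's current position.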

The vehicle may control an in-vehicle device to provide the virtual lane (1130). The in-vehicle device includes any device mounted on the vehicle that can provide information on the virtual lane to the driver. For example, referring to FIG. 9, the vehicle may display the virtual lane through augmented reality on a windshield display. Accordingly, the driver can keep driving while viewing the virtual lane even when the actual lane is not visible due to mist, showers, or the like. Alternatively, as shown in FIG. 10, the vehicle may display the virtual lane on a head-up display projected in the front window, so that the driver can view it and keep driving.

On the other hand, the vehicle can provide sensor information received from another vehicle to the driver, thereby helping the driver to drive more safely. As shown in FIG. 11, the vehicle can display a pop-up message including sensor information on the display. For example, a vehicle may display a pop-up message on the display that says 'Please note that there is ice on the road 100m ahead' to ensure the safety of the driver.

FIG. 12 is a flowchart illustrating an operation of a vehicle that provides a safety trajectory according to an embodiment.

As described above, the vehicle can exchange data with an external database through a base station, and can exchange data directly with other vehicles through inter-device communication. For example, the vehicle can receive, from the database through the base station, lane information in which image information is mapped to the position information measured when the image information was captured, and can receive sensor information and trajectory information of another vehicle through inter-device communication. In addition, the vehicle can receive position information of another vehicle through inter-device communication; each vehicle is provided with a position measuring device and can measure its own position information.

The vehicle can generate a lane corresponding to the actual lane using the position information of the vehicle measured through the position measuring device and the image information, received through the base station, that corresponds to that position information.

The vehicle can place the other vehicles on the lanes corresponding to the actual lanes using their position information, and can associate the sensor information and trajectory information of each other vehicle with its position on those lanes. Accordingly, the vehicle can determine from each other vehicle's sensor information whether an obstacle or the like has been detected in its vicinity. Further, the vehicle can compare each other vehicle's trajectory information against the virtual lane to judge whether that vehicle is running normally or dangerously. Based on these determination results, the vehicle can decide which lane it should travel in along the driving route and generate a safety trajectory accordingly.

The vehicle can display the safety trajectory generated through various displays. For example, the vehicle can display a safety trajectory through a windshield display, a head-up display, an AVN display, etc., thereby guiding the driver to drive in a safer lane.

For example, as shown in FIG. 13, the vehicle can display the safety trajectory on the virtual lane through augmented reality on the windshield display. Accordingly, the driver can keep driving according to the safety trajectory displayed on the virtual lane, even when the lane is not visible due to mist, showers, or the like. The safety trajectory is not limited to being displayed on the virtual lane; the vehicle may also display it on the lane visible through the actual front window. Alternatively, as shown in FIG. 14, the vehicle may display the virtual lane and the safety trajectory on the head-up display projected in the front window, so that the driver can view them and keep driving.

The method according to an embodiment may be implemented in the form of program instructions that can be executed through various computer means and recorded on a computer-readable medium. The computer-readable medium may include program instructions, data files, data structures, and the like, alone or in combination. The program instructions recorded on the medium may be those specially designed and configured for the embodiments, or those known and available to those skilled in the art of computer software. Examples of computer-readable media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and DVDs; magneto-optical media; and hardware devices specifically configured to store and execute program instructions, such as ROM, RAM, and flash memory.

Examples of program instructions include machine language code such as those produced by a compiler, as well as high-level language code that can be executed by a computer using an interpreter or the like. The hardware devices described above may be configured to operate as one or more software modules to perform the operations of the embodiments, and vice versa.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it is to be understood that the invention is not limited to the disclosed embodiments. For example, appropriate results may be achieved even if the described techniques are performed in a different order, and/or if components of the described systems, structures, devices, circuits, and the like are combined in a different form or replaced by other components or equivalents.

Therefore, other implementations, other embodiments, and equivalents to the claims are also within the scope of the following claims.

10: Dashboard, 11: Center fascia
12: steering wheel, 13: head lining
21: driver's seat, 22: passenger seat
40: center console, 41: gear operating lever
42: Tray, 43: Center input unit
80: body, 81: hood, 82: front fender
84: Door, 85: Trunk lid, 86: Quarter panel
87: front window, 88: side window
90: rear window, 91, 92: side mirror
100: AVN terminal, 101: AVN display, 102: navigation input unit
143: speaker, 153: ventilation hole, 190: audio input section

Claims (24)

A determination unit for determining whether or not a front view of the vehicle can be secured;
A communication unit for receiving, when it is determined that the front view of the vehicle cannot be secured, lane information in which image information photographed by a camera is mapped to the position information at the time the image information was captured;
A generating unit for generating a virtual lane based on the received lane information and the position information of the vehicle; and
A controller for controlling one or more devices in the vehicle to provide the generated virtual lane,
Wherein the communication unit receives sensor information and trajectory information detected by at least one other vehicle located in front of the vehicle through inter-device communication,
And wherein the generating unit generates the virtual lane based on the received lane information, the position information of the vehicle, and the trajectory information.
The apparatus according to claim 1,
Wherein the determination unit determines whether or not the front view of the vehicle can be secured based on results of detection using the camera and a rain sensor.
delete

delete

The apparatus according to claim 1,
Wherein the generating unit excludes the received trajectory information in generating the virtual lane when the difference between the received trajectory information and the trajectory information acquired from the received lane information exceeds a predetermined level.
The apparatus according to claim 1,
Wherein the controller controls one or more devices in the vehicle to provide the generated virtual lane and the sensor information.
delete delete delete A communication unit for receiving lane information in which positional information at the time of photographing the image information is captured in image information photographed from a database through a camera and receiving sensor information and sign information from another vehicle located in the vicinity of the vehicle through inter- ;
A generating unit for generating a safety trajectory on the traveling path of the vehicle based on the received lane information, sensor information, locus information, and position information of the vehicle; And
A controller for controlling at least one device in the vehicle and providing the generated safety trajectory,
≪ / RTI >
The apparatus according to claim 10,
Wherein the communication unit receives the lane information from the database through a base station, and receives at least one of the sensor information and the trajectory information from another vehicle located in the vicinity of the vehicle via inter-device communication.
The apparatus according to claim 10,
Wherein the generating unit generates a virtual lane based on the received lane information and the position information of the vehicle, detects an obstacle and a dangerous vehicle located on the traveling route using the sensor information and trajectory information of other vehicles located on the virtual lane, and generates the safety trajectory based on the determination result.
Determining whether or not a front view of the vehicle can be secured;
Receiving, when it is determined that the front view of the vehicle cannot be secured, lane information in which image information photographed through a camera is mapped to the position information at the time the image information was captured;
Generating a virtual lane based on the received lane information and the position information of the vehicle; and
Controlling one or more devices in the vehicle to provide the generated virtual lane,
Wherein the receiving comprises receiving sensor information and trajectory information sensed by at least one other vehicle located in front of the vehicle through inter-device communication,
And wherein the generating comprises generating the virtual lane based on the received lane information, the position information of the vehicle, and the trajectory information.
The method according to claim 13,
Wherein the determining comprises determining whether or not the front view of the vehicle can be secured based on results of sensing using the camera and a rain sensor.
delete

delete

The method according to claim 13,
Wherein the generating comprises excluding the received trajectory information and generating the virtual lane when the difference between the received trajectory information and the trajectory information acquired from the received lane information exceeds a predetermined level.
The method according to claim 17,
Wherein the providing comprises controlling one or more devices in the vehicle to provide the generated virtual lane and the sensor information.
delete

delete

delete

A method of controlling a vehicle, comprising:
Receiving, from a database, lane information in which image information photographed by a camera is mapped to the position information at the time the image information was captured, and receiving sensor information and trajectory information from another vehicle located in the vicinity of the vehicle through inter-device communication;
Generating a safety trajectory on the traveling path of the vehicle based on the received lane information, the sensor information, the trajectory information, and the position information of the vehicle; and
Controlling one or more devices in the vehicle to provide the generated safety trajectory.
The method according to claim 22,
Wherein the receiving comprises receiving the lane information from the database through a base station, and receiving at least one of the sensor information and the trajectory information from another vehicle located in the vicinity of the vehicle via inter-device communication.
The method according to claim 22,
Wherein the generating comprises generating a virtual lane based on the received lane information and the position information of the vehicle, detecting an obstacle and a dangerous vehicle located on the traveling route using the sensor information and trajectory information of other vehicles located on the virtual lane, and generating the safety trajectory based on the determination result.
KR1020150038431A 2015-03-19 2015-03-19 Audio navigation device, vehicle having the same, user device, and method for controlling vehicle KR101782423B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150038431A KR101782423B1 (en) 2015-03-19 2015-03-19 Audio navigation device, vehicle having the same, user device, and method for controlling vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150038431A KR101782423B1 (en) 2015-03-19 2015-03-19 Audio navigation device, vehicle having the same, user device, and method for controlling vehicle

Publications (2)

Publication Number Publication Date
KR20160113417A KR20160113417A (en) 2016-09-29
KR101782423B1 true KR101782423B1 (en) 2017-09-29

Family

ID=57073632

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150038431A KR101782423B1 (en) 2015-03-19 2015-03-19 Audio navigation device, vehicle having the same, user device, and method for controlling vehicle

Country Status (1)

Country Link
KR (1) KR101782423B1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20200101518A (en) * 2019-01-30 2020-08-28 한국자동차연구원 Method for road lane management based on multi-vehicle driving information and system for the same
KR20210148518A (en) 2020-05-29 2021-12-08 서울대학교산학협력단 Apparatus and method for virtual lane generation based on traffic flow for autonomous driving in severe weather condition

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20190061137A (en) * 2017-11-27 2019-06-05 현대모비스 주식회사 Apparatus for keeping virtual lane and method thereof

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007210458A (en) * 2006-02-09 2007-08-23 Nissan Motor Co Ltd Display device for vehicle and image display control method for vehicle

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007210458A (en) * 2006-02-09 2007-08-23 Nissan Motor Co Ltd Display device for vehicle and image display control method for vehicle

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20200101518A (en) * 2019-01-30 2020-08-28 한국자동차연구원 Method for road lane management based on multi-vehicle driving information and system for the same
KR102269625B1 (en) 2019-01-30 2021-06-28 한국자동차연구원 Method for road lane management based on multi-vehicle driving information and system for the same
KR20210148518A (en) 2020-05-29 2021-12-08 서울대학교산학협력단 Apparatus and method for virtual lane generation based on traffic flow for autonomous driving in severe weather condition

Also Published As

Publication number Publication date
KR20160113417A (en) 2016-09-29

Similar Documents

Publication Publication Date Title
KR101744724B1 (en) Audio navigation device, vehicle having the same, user device, and method for controlling vehicle
US10753757B2 (en) Information processing apparatus and information processing method
EP3683102B1 (en) Systems for driver assistance
US20190193738A1 (en) Vehicle and Control Method Thereof
WO2017057055A1 (en) Information processing device, information terminal and information processing method
CN106467060A (en) Display device and the vehicle including this display device
KR20180132922A (en) Vehicle display devices and vehicles
US20160275360A1 (en) Vehicle and method for controlling the same
CN107102347B (en) Position sensing device, vehicle having the same, and method of controlling the same
US11915452B2 (en) Information processing device and information processing method
CN109196557A (en) Image processing apparatus, image processing method and vehicle
JP7374098B2 (en) Information processing device, information processing method, computer program, information processing system, and mobile device
EP3538846B1 (en) Using map information to smooth objects generated from sensor data
US10099616B2 (en) Vehicle and method for controlling the vehicle
JP7371629B2 (en) Information processing device, information processing method, program, and vehicle
US11590985B2 (en) Information processing device, moving body, information processing method, and program
WO2019049828A1 (en) Information processing apparatus, self-position estimation method, and program
US20200230820A1 (en) Information processing apparatus, self-localization method, program, and mobile body
KR101782423B1 (en) Audio navigation device, vehicle having the same, user device, and method for controlling vehicle
KR101650791B1 (en) Vehicle, and method for controlling thereof
US20200357284A1 (en) Information processing apparatus and information processing method
KR20170110800A (en) Navigation Apparutaus and Driver Assistance Apparatus Having The Same
JP2019100942A (en) Mobile object, positioning system, positioning program and positioning method
WO2023068116A1 (en) On-vehicle communication device, terminal device, communication method, information processing method, and communication system
WO2022075062A1 (en) Object position detection device, object position detection system, and object position detection method

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E90F Notification of reason for final refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant