WO2019132504A1 - Destination guide apparatus and method - Google Patents

Destination guide apparatus and method

Info

Publication number
WO2019132504A1
WO2019132504A1 (PCT/KR2018/016645)
Authority
WO
WIPO (PCT)
Prior art keywords
destination
unit
feature
feature map
vicinity
Prior art date
Application number
PCT/KR2018/016645
Other languages
French (fr)
Korean (ko)
Inventor
박승일 (Park Seung-il)
Original Assignee
현대엠엔소프트 주식회사 (Hyundai Mnsoft, Inc.)
Priority date
Filing date
Publication date
Application filed by 현대엠엔소프트 주식회사 (Hyundai Mnsoft, Inc.)
Publication of WO2019132504A1 publication Critical patent/WO2019132504A1/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/3446 Details of route searching algorithms, e.g. Dijkstra, A*, arc-flags, using precalculated routes
    • G01C21/36 Input/output arrangements for on-board computers
    • G01C21/3626 Details of the output of route guidance instructions
    • G01C21/3632 Guidance using simplified or iconic instructions, e.g. using arrows
    • G01C21/3647 Guidance involving output of stored or live camera images or video streams
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays

Definitions

  • The present invention relates to a destination guidance apparatus and method, and more particularly, to a destination guidance apparatus and method for accurately guiding a driver to a destination through a head-up display.
  • A navigation system reads data for the current position from a road database, based on position information received through GPS (Global Positioning System) satellites, and displays the current position together with the location of the vehicle on a screen. It helps the driver identify the position of a road and find a destination easily, even when driving on an unfamiliar road.
  • Recently, head-up displays have come to be used to present the information provided by the navigation system.
  • However, because the navigation information provided through the head-up display has conventionally been simply projected onto the windshield, the driver may still fail to find the destination in its vicinity, or may drive past it, even when the information is shown on the head-up display.
  • The present invention has been made to overcome the above problems. It is an object of one aspect of the present invention to provide a destination guidance apparatus and method that search for street images near a destination to generate a feature map of the destination's vicinity, match feature points detected in an actually captured image against the feature map to recognize the destination, and display the destination through a head-up display.
  • A destination guidance apparatus according to one aspect includes: a navigation unit that receives destination information and guides a route to the destination; a feature map detection unit that detects a feature map of feature points in the vicinity of the destination from a street image of that vicinity, using the destination information input to the navigation unit; a feature point detection unit that detects feature points in the vicinity of the destination from an actually captured image; a destination recognition unit that recognizes the destination using the feature map and the feature points detected by the feature map detection unit and the feature point detection unit, respectively; and a head-up display that, when the destination is recognized by the destination recognition unit, displays a destination indicator on the windshield glass to guide the driver to the destination.
  • The feature map detection unit may include a street image search unit that searches for a street image in the vicinity of the destination according to the destination information input to the navigation unit; and a feature map generation unit that detects feature points in the vicinity of the destination in the street image retrieved by the street image search unit and generates a feature map using the detected feature points.
  • In the present invention, the feature map generation unit detects at least one of a surrounding contour of the destination, a road surface object, and a roadside object as a feature point.
  • The feature point detection unit of the present invention includes a photographing unit that captures an image of the area in front of the vehicle, and a feature point generation unit that generates feature points from the image captured by the photographing unit.
  • The feature point generation unit of the present invention generates the feature points near the destination when the current position of the vehicle detected by the navigation unit is within a predetermined set distance from the destination.
  • The destination recognition unit of the present invention includes: a destination coordinate detection unit that detects destination coordinates by matching the feature map and the feature points detected by the feature map detection unit and the feature point detection unit, respectively; and a coordinate conversion unit that converts the destination coordinates detected by the destination coordinate detection unit into HUD coordinates.
  • The head-up display of the present invention displays the destination indicator according to the HUD coordinates converted by the coordinate conversion unit.
  • The apparatus may further include a current position correction unit that calculates the distance to the destination corresponding to the destination coordinates detected in the actually captured image, newly detects the current position of the vehicle by applying that distance to the destination's coordinates on the map, and inputs the newly detected current position of the vehicle to the navigation unit.
  • A destination guidance method according to another aspect includes: receiving, by a navigation unit, destination information;
  • detecting, by a feature map detection unit, a feature map of feature points in the vicinity of the destination from a street image of that vicinity, using the destination information input to the navigation unit; detecting, by a feature point detection unit, feature points near the destination from an actually captured image;
  • recognizing, by a destination recognition unit, the destination using the feature map and the feature points detected by the feature map detection unit and the feature point detection unit, respectively; and displaying a destination indicator that guides the driver to the destination recognized by the destination recognition unit.
  • In the present invention, the detecting of the feature map in the vicinity of the destination may include searching for a street image in the vicinity of the destination according to the destination information input to the navigation unit, detecting feature points in the vicinity of the destination in the retrieved street image, and generating the feature map from the detected feature points.
  • The detecting of the feature map detects at least one of a surrounding contour of the destination, a road surface object, and a roadside object as a feature point.
  • The detecting of the feature points in the vicinity of the destination photographs the area in front of the vehicle and generates the feature points from the captured image.
  • The detecting of the feature points near the destination generates them when the current position of the vehicle detected by the navigation unit is within a predetermined set distance from the destination.
  • The recognizing of the destination includes detecting destination coordinates by matching the feature map and the feature points detected by the feature map detection unit and the feature point detection unit, respectively, and converting the detected destination coordinates into HUD coordinates of the head-up display.
  • The displaying of the destination indicator displays the indicator according to the HUD coordinates converted by the destination recognition unit.
  • The method may further include: calculating, by a current position correction unit, the distance to the destination corresponding to the destination coordinates detected in the actually captured image, newly detecting the current position of the vehicle by applying that distance to the destination's coordinates on the map, and inputting the newly detected current position to the navigation unit; and correcting, by the navigation unit, the current position of the vehicle based on the newly detected position, and guiding the route to the destination based on the corrected current position.
  • A destination guidance apparatus according to still another aspect includes: a navigation terminal that guides a route to an input destination; and a server that detects a feature map of feature points in the vicinity of the destination from a street image of that vicinity, using the destination input to the navigation terminal. The navigation terminal detects feature points near the destination from an actually captured image, recognizes the destination using the feature map detected by the server and the detected feature points, and then displays, through a head-up display, a destination indicator that guides the driver to the destination on the windshield glass.
  • The server of the present invention detects the feature map of the feature points in the vicinity of the destination either by generating it when the destination is input from the navigation terminal, or by extracting, from pre-built feature maps for a plurality of destinations, the feature map for the destination currently input from the navigation terminal.
  • When generating the feature map, the server of the present invention searches for a street image in the vicinity of the destination according to the destination input by the navigation terminal, detects feature points in the vicinity of the destination in the retrieved street image, and generates the feature map from the detected feature points.
  • The navigation terminal of the present invention detects destination coordinates by matching the feature map detected by the server with the detected feature points, converts the detected destination coordinates into HUD coordinates, and transmits the converted coordinates to the head-up display.
  • According to one aspect, the destination guidance apparatus and method, when a destination is set, search for street images near the destination to generate a feature map of the destination's vicinity, recognize the destination by matching feature points detected in an actually captured image against the feature map, and display the destination through the head-up display.
  • The destination guidance apparatus and method accurately display the destination through the head-up display, thereby preventing situations where a user cannot find the destination in its vicinity or drives past it.
  • FIG. 1 is a block diagram of a destination guide apparatus according to an embodiment of the present invention.
  • FIG. 2 is a block diagram of a feature map detecting unit according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating an example of extracting a destination and a neighboring feature point according to an embodiment of the present invention.
  • FIG. 4 is a block diagram of a feature point detector according to an embodiment of the present invention.
  • FIG. 5 is a block diagram of a destination recognizer according to an exemplary embodiment of the present invention.
  • FIG. 6 is a diagram illustrating an example of displaying a destination according to an exemplary embodiment of the present invention.
  • FIG. 7 is a flowchart of a destination guidance method according to an embodiment of the present invention.
  • FIG. 8 is a block diagram of a destination guidance apparatus according to another embodiment of the present invention.
  • Referring to FIG. 1, a destination guidance apparatus according to an embodiment includes a navigation unit 10, a feature map detection unit 20, a feature point detection unit 30, a destination recognition unit 40, and a head-up display 50.
  • The navigation unit 10 reads data for the current location from a road database, based on location information received through GPS (Global Positioning System) satellites, and displays the current location together with the position of the vehicle on a screen.
  • The navigation unit 10 receives destination information from the user and guides the route to the destination according to the input destination information. At this time, the navigation unit 10 guides the route using the vehicle's current location received through the GPS satellites and the destination information: it outputs the map stored in the road database on the screen, displays the route on the map, and also announces the route by voice.
  • The navigation unit 10 inputs the destination information entered by the user to the feature map detection unit 20, and inputs the current position of the vehicle to the feature point detection unit 30 in real time.
  • In addition, the navigation unit 10 may guide the route to the destination using a corrected current position. This will be described later.
  • The feature map detection unit 20 detects a feature map of feature points in the vicinity of the destination from a street image of that vicinity, using the destination information input from the navigation unit 10. When the destination information is input, the feature map detection unit 20 may either generate the feature map for the feature points near the destination, or extract the feature map for the currently input destination from pre-built feature maps for a plurality of destinations.
  • Referring to FIG. 2, the feature map detection unit 20 includes a street image search unit 21 and a feature map generation unit 22.
  • The street image search unit 21 searches for and acquires a street image in the vicinity of the destination according to the destination information input from the navigation unit 10.
  • In this case, the street image search unit 21 searches street views of the destination's vicinity in the various directions from which the destination can be approached. This is because the driver may approach the destination from a direction different from the planned route.
  • The street image may be an actually captured image of a street; for example, a street view in which the streets near the destination have been photographed may be employed. The street images may be stored on an external web server or in a database, and the street image search unit 21 can search for and acquire a street image near the destination from among them.
  • The feature map generation unit 22 detects feature points in the vicinity of the destination from the street image retrieved by the street image search unit 21 and generates a feature map using the detected feature points.
  • Here, a feature point is a distinctive point on an object that identifies the vicinity of the destination, and is used as information for recognizing that vicinity in the street image.
  • The feature points may include a surrounding contour of the destination, a road surface object, and a roadside object.
  • Road surface objects may include markings painted on nearby roads, the various lines of the road, and the like.
  • Roadside objects may include traffic lights, traffic signs, buildings, or signboards on nearby roads.
  • In the example of FIG. 3, a crosswalk line is detected as feature point 1, a traffic light as feature point 2, a signboard as feature point 3, and a building contour as a further feature point.
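  • As a rough illustration (not taken from the patent itself), the feature map can be thought of as a collection of labeled feature points, each carrying an appearance descriptor and a position within the street image. A minimal sketch, assuming a simple dataclass-based representation with illustrative names:

```python
from dataclasses import dataclass, field

@dataclass
class FeaturePoint:
    """A distinctive point near the destination (e.g. crosswalk, traffic light)."""
    label: str          # e.g. "crosswalk", "traffic_light", "signboard"
    x: float            # position within the street image (pixels)
    y: float
    descriptor: tuple   # appearance descriptor used later for matching

@dataclass
class FeatureMap:
    """Feature map of a destination's vicinity, built from street-view images."""
    destination: str
    points: list = field(default_factory=list)

    def add(self, point: FeaturePoint) -> None:
        self.points.append(point)

# Build a tiny feature map mirroring the FIG. 3 example.
fmap = FeatureMap(destination="example cafe")
fmap.add(FeaturePoint("crosswalk", 120.0, 410.0, (0.1, 0.9)))
fmap.add(FeaturePoint("traffic_light", 300.0, 80.0, (0.8, 0.2)))
fmap.add(FeaturePoint("signboard", 450.0, 150.0, (0.5, 0.5)))
print(len(fmap.points))  # -> 3
```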
  • On the other hand, the feature map detection unit 20 may extract, from pre-built feature maps for a plurality of destinations, the feature map for the destination currently input from the navigation unit 10, or, if no feature map has been generated for that destination, may generate one through the method described above.
  • The function of the feature map detection unit may also be implemented in a server built outside the vehicle, as described later.
  • The feature point detection unit 30 detects feature points near the destination from the actually captured image.
  • Referring to FIG. 4, the feature point detection unit 30 includes a photographing unit 31 and a feature point generation unit 32.
  • The photographing unit 31 is installed in the vehicle, captures an image of the area in front of the vehicle, and inputs the captured image to the feature point generation unit 32.
  • The photographing unit 31 may be a camera provided inside the vehicle, but may also be a camera mounted on a black box (not shown), a camera for detecting a preceding vehicle (not shown), or a camera for detecting lanes.
  • The feature point generation unit 32 generates feature points from the image actually captured by the photographing unit 31.
  • The manner in which the feature point generation unit 32 extracts feature points, and the objects from which it extracts them, are the same as those used by the feature map generation unit 22 on the street view, so a detailed description is omitted here.
  • The feature point generation unit 32 extracts the feature points from the captured image based on the current position of the vehicle.
  • The feature point generation unit 32 could extract feature points continuously until the vehicle reaches the vicinity of the destination; instead, it determines whether the current position of the vehicle is within a predetermined set distance from the destination, and generates the feature points near the destination according to the result of that determination.
  • That is, the feature point generation unit 32 receives the destination coordinates and the current position of the vehicle from the navigation unit 10 in real time, determines whether the current position is within the set distance from the destination coordinates, and, if so, controls the photographing unit 31 to capture the area in front of the vehicle and extracts feature points from the captured image. By detecting feature points only within the set distance from the destination coordinates, the load on the feature point generation unit 32 can be relatively reduced and detection errors minimized.
  • The set distance may be set variously depending on the presence or position of objects near the destination that can be detected as feature points, or on the distance from which the user can visually confirm the destination from the vehicle.
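  • The set-distance gating described above can be sketched as follows. This is an illustrative approximation, not the patent's implementation; the 200 m threshold is an assumed value:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS coordinates."""
    r = 6_371_000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

SET_DISTANCE_M = 200.0  # assumed set distance; tunable in practice

def within_set_distance(vehicle, destination):
    """True when the vehicle is close enough to start feature-point detection."""
    return haversine_m(*vehicle, *destination) <= SET_DISTANCE_M

# ~111 m north of the destination -> start extracting feature points.
print(within_set_distance((37.5010, 127.0396), (37.5000, 127.0396)))  # -> True
```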
  • The destination recognition unit 40 recognizes the destination using the feature map and the feature points detected by the feature map detection unit 20 and the feature point detection unit 30, respectively.
  • Referring to FIG. 5, the destination recognition unit 40 includes a destination coordinate detection unit 41 and a coordinate conversion unit 42.
  • The destination coordinate detection unit 41 detects the destination coordinates by matching the feature map detected by the feature map detection unit 20 with the feature points detected by the feature point detection unit 30.
  • That is, when the destination coordinate detection unit 41 receives the feature map from the feature map detection unit 20 and the feature points from the feature point detection unit 30, it matches the feature map against the feature points and identifies the destination according to the result.
  • When it determines that the feature map matches the feature points, the destination coordinate detection unit 41 detects the destination coordinates within the actually captured image.
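  • A toy version of this matching step might pair each map feature with its nearest detected feature by descriptor distance and accept the match when the average distance falls below a threshold. This is illustrative only; a production system would use a robust matcher (e.g. ORB descriptors with RANSAC outlier rejection), and the threshold here is an assumed value:

```python
import math

def descriptor_dist(a, b):
    return math.dist(a, b)  # Euclidean distance between descriptors

def match_feature_map(map_points, detected_points, threshold=0.25):
    """Match map feature points to detected ones; return matched pairs or None.

    map_points / detected_points: lists of (descriptor, (x, y)) tuples.
    """
    pairs, total = [], 0.0
    for desc_m, _pos_m in map_points:
        best = min(detected_points, key=lambda d: descriptor_dist(desc_m, d[0]))
        total += descriptor_dist(desc_m, best[0])
        pairs.append((desc_m, best))
    if total / len(map_points) > threshold:
        return None  # vicinity not recognized yet
    return pairs

fmap = [((0.1, 0.9), (120, 410)), ((0.8, 0.2), (300, 80))]
seen = [((0.12, 0.88), (640, 300)), ((0.79, 0.21), (900, 120))]
matches = match_feature_map(fmap, seen)
print(matches is not None)  # -> True: vicinity recognized
```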
  • The coordinate conversion unit 42 converts the destination coordinates detected by the destination coordinate detection unit 41 into HUD coordinates.
  • That is, the coordinate conversion unit 42 converts the destination coordinates into HUD coordinates by matching them against the HUD's coordinate system.
  • This coordinate matching aligns the coordinates of the actually captured street image with the display range and HUD coordinates of the head-up display 50.
  • To this end, the coordinate conversion unit 42 stores in advance the displayable range and coordinates of the head-up display 50 corresponding to the coordinates of the actually captured image, and converts the destination coordinates into HUD coordinates by looking up the HUD coordinates that match the destination coordinates.
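  • Assuming the stored correspondence is a simple scale-and-offset from camera pixels to the HUD's displayable range (a simplification; a real calibration would typically use a homography), the lookup can be sketched as:

```python
def make_pixel_to_hud(cam_size, hud_range):
    """Build a camera-pixel -> HUD-coordinate mapping.

    cam_size:  (width, height) of the captured image in pixels.
    hud_range: ((x_min, y_min), (x_max, y_max)) displayable HUD area.
    """
    cw, ch = cam_size
    (x0, y0), (x1, y1) = hud_range
    def convert(px, py):
        return (x0 + px / cw * (x1 - x0), y0 + py / ch * (y1 - y0))
    return convert

# Assumed 1280x720 camera and a HUD area spanning 0..100 in both axes.
to_hud = make_pixel_to_hud((1280, 720), ((0.0, 0.0), (100.0, 100.0)))
print(to_hud(640, 360))  # image center -> (50.0, 50.0)
```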
  • Meanwhile, as the destination coordinates are detected from the actually captured image as described above, the current position correction unit 60 corrects the current position based on those coordinates. That is, the current position correction unit 60 holds, in advance, the actual distance corresponding to each coordinate in the captured image, and calculates the distance to the destination corresponding to the detected destination coordinates. It then newly detects the current position of the vehicle by applying the calculated distance to the destination's coordinates on the map, and inputs the newly detected current position to the navigation unit 10.
  • The navigation unit 10 corrects the current position of the vehicle to the position input from the current position correction unit 60, and guides the route to the destination more accurately based on the corrected current position.
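  • Under the simplifying assumption that the vehicle lies at the measured distance from the destination along a known bearing (the patent leaves the exact geometry unspecified), the correction can be sketched with a flat-earth approximation:

```python
import math

M_PER_DEG_LAT = 111_320.0  # meters per degree of latitude (approximation)

def correct_position(dest_lat, dest_lon, distance_m, bearing_deg):
    """Re-derive the vehicle position from the destination's map coordinates.

    bearing_deg: direction from the destination toward the vehicle.
    Flat-earth approximation; adequate over a few hundred meters.
    """
    b = math.radians(bearing_deg)
    dlat = distance_m * math.cos(b) / M_PER_DEG_LAT
    dlon = distance_m * math.sin(b) / (M_PER_DEG_LAT * math.cos(math.radians(dest_lat)))
    return dest_lat + dlat, dest_lon + dlon

# Vehicle measured 150 m due south (bearing 180 deg) of the destination.
lat, lon = correct_position(37.5000, 127.0396, 150.0, 180.0)
print(round(lat, 5), round(lon, 5))  # -> 37.49865 127.0396
```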
  • The head-up display 50 displays a destination indicator on the windshield glass to guide the driver to the destination when the destination is recognized by the destination recognition unit 40.
  • That is, when the destination recognition unit 40 detects the HUD coordinates corresponding to the destination coordinates, the head-up display 50 displays the destination indicator at those HUD coordinates, as shown in FIG. 6.
  • In this case, the head-up display 50 can display the destination indicator directly at the destination coordinates, but it can also be displayed around the destination.
  • The destination indicator may be formed of polygons, or of symbols such as dots, lines, and arrows.
  • FIG. 7 is a flowchart of a destination guidance method according to an embodiment of the present invention.
  • Referring to FIG. 7, the navigation unit 10 first receives destination information from the user (S10). At this time, the navigation unit 10 inputs the destination information entered by the user to the feature map detection unit 20, and inputs the current position of the vehicle to the feature point detection unit 30 in real time.
  • In addition, the navigation unit 10 guides the route to the destination.
  • The street image search unit 21 searches for a street image in the vicinity of the destination according to the destination information input from the navigation unit 10 (S20).
  • At this time, the feature map detection unit 20 may search street views in the various directions from which the destination can be approached.
  • Next, the feature map generation unit 22 detects feature points in the vicinity of the destination in the street image retrieved by the street image search unit 21, and generates the feature map using the detected feature points (S30).
  • Here, the feature points may include a surrounding contour of the destination, a road surface object, and a roadside object.
  • The feature point generation unit 32 receives the destination coordinates and the current position of the vehicle from the navigation unit 10 in real time, and determines whether the current position of the vehicle is within the set distance from the destination coordinates (S40).
  • If the vehicle is within the set distance, the feature point generation unit 32 controls the photographing unit 31 to capture an image of the area in front of the vehicle, and generates feature points from the captured image (S50).
  • Next, the destination coordinate detection unit 41 matches the detected feature map against the feature points, determines whether they correspond, identifies the destination in the actually captured image, and detects the destination coordinates within that image (S60).
  • The coordinate conversion unit 42 then converts the destination coordinates of the image actually captured by the photographing unit 31 into HUD coordinates (S70).
  • As the destination coordinates are converted into HUD coordinates, the head-up display 50 displays a destination indicator that guides the driver to the destination according to those HUD coordinates (S80).
  • Meanwhile, the current position correction unit 60 calculates the distance to the destination corresponding to the destination coordinates detected in the actually captured image, newly detects the current position of the vehicle by applying that distance to the destination's coordinates on the map, and inputs the newly detected current position to the navigation unit 10.
  • The navigation unit 10 accordingly detects the current position of the vehicle.
  • In this case, the navigation unit 10 can correct the current position of the vehicle to the position input from the current position correction unit 60, and guide the route to the destination more accurately based on the corrected position.
  • Referring to FIG. 8, in another embodiment, the feature map detection unit 20 may be embodied as an external server 200 that is distinct from the navigation terminal 100.
  • In this embodiment, the navigation terminal 100 guides a route to the input destination, and the server 200 detects a feature map of feature points in the vicinity of the destination from a street image of that vicinity, using the destination input by the navigation terminal 100.
  • The navigation terminal 100 detects feature points near the destination from the actually captured image and recognizes the destination using the feature map detected by the server 200 and the detected feature points.
  • Then, a destination indicator that guides the driver to the destination is displayed on the windshield glass through the head-up display.
  • Here, when the destination is input from the navigation terminal 100, the server 200 may generate the feature map for the feature points in the vicinity of the destination, or may detect it by extracting, from pre-built feature maps for a plurality of destinations, the feature map for the destination currently input from the navigation terminal 100.
  • When generating the feature map, the server 200 searches for a street image near the destination according to the destination input by the navigation terminal 100, detects feature points near the destination in the retrieved street image, and generates the feature map from the detected feature points.
  • In addition, the navigation terminal 100 may detect the destination coordinates by matching the feature map detected by the server 200 with the detected feature points, convert the detected destination coordinates into HUD coordinates, and transmit the converted coordinates to the head-up display; accordingly, a destination indicator guiding the driver to the destination can be displayed on the windshield glass through the head-up display.
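  • The server's lookup-or-generate behavior described above can be sketched as follows, with an in-memory store standing in for server 200 (all names here are illustrative, not from the patent):

```python
class FeatureMapServer:
    """Stands in for server 200: caches feature maps per destination."""
    def __init__(self, generator):
        self._generator = generator   # builds a map from street images
        self._cache = {}              # destination -> feature map

    def get_feature_map(self, destination):
        # Extract a pre-built map if present; otherwise generate and store it.
        if destination not in self._cache:
            self._cache[destination] = self._generator(destination)
        return self._cache[destination]

def build_from_street_images(destination):
    # Placeholder for the street-image search + feature extraction pipeline.
    return {"destination": destination, "points": ["crosswalk", "signboard"]}

server = FeatureMapServer(build_from_street_images)
fmap = server.get_feature_map("example cafe")   # generated on first request
fmap2 = server.get_feature_map("example cafe")  # served from the cache
print(fmap is fmap2)  # -> True
```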
  • the implementations described herein may be implemented, for example, as a method or process, an apparatus, a software program, a data stream, or a signal. Although discussed only in the context of a single type of implementation (e.g., discussed only as a method), implementations of the discussed features may also be implemented in other forms (e.g., devices or programs).
  • the device may be implemented with suitable hardware, software, firmware, and the like.
  • the method may be implemented in an apparatus such as, for example, a processor, which refers generally to processing devices including a computer, a microprocessor, an integrated circuit, or a programmable logic device.
  • processors also include communication devices, such as computers, cell phones, personal digital assistants ("PDAs"), and other devices that facilitate communication of information between end-users.
  • the destination guidance apparatus and method according to the embodiment of the present invention display the destination precisely through the head-up display, thereby preventing situations in which the user cannot find the destination in its vicinity or drives past it.

Abstract

Disclosed is a destination guide apparatus and method. A destination guide apparatus according to the present invention comprises: a navigation unit for receiving destination information as input and providing route guidance to a destination; a feature map detection unit for detecting a feature map of feature points in the vicinity of the destination, from a street image in the vicinity of the destination by using the destination information input to the navigation unit; a feature point detection unit for detecting a feature point in the vicinity of the destination, from an actually-captured image; a destination recognition unit for recognizing the destination through the feature map and the feature point detected by the feature map detection unit and the feature point detection unit, respectively; and a head-up display for, when the destination has been recognized by the destination recognition unit, displaying a destination indicator, used for destination guidance, on a destination on a windshield glass.

Description

Destination guide apparatus and method
The present invention relates to a destination guidance apparatus and method, and more particularly, to a destination guidance apparatus and method for accurately guiding a user to a destination through a head-up display.
Generally, a navigation system reads data for the current position from a road database based on position information received from a route search device through GPS (Global Positioning System) satellites and displays it on a screen together with the position of the vehicle, thereby helping the driver identify the road currently being driven or easily find a destination when driving on an unfamiliar road.
Recently, to meet users' needs, advanced techniques such as announcing the estimated arrival time from the departure point when the driver sets a destination, or suggesting a detour according to road conditions on the way to the destination, have been incorporated into route search devices, providing drivers with more accurate route information.
The background art of the present invention is disclosed in Korean Patent Laid-Open Publication No. 10-2016-0001358 (published January 6, 2016), entitled "Head-Up Display System, Head-Up Display Apparatus, and Navigation Information Display Method."
Conventionally, a head-up display has been used to present the information provided by a navigation system. However, because the information provided through the head-up display has simply reproduced the navigation information on the windshield, the driver may still fail to find the destination in its vicinity or drive past it, even when information is provided through the head-up display.
The present invention has been devised to address the above problems. An object of one aspect of the present invention is to provide a destination guidance apparatus and method that, when a destination is set, search for images in the vicinity of the destination to generate a feature map of that vicinity and, when the vehicle later arrives near the destination, match feature points detected in a captured image against the feature map to recognize the destination and display it through a head-up display.
A destination guidance apparatus according to one aspect of the present invention includes: a navigation unit that receives destination information and guides a route to the destination; a feature map detection unit that detects a feature map of feature points in the vicinity of the destination from a street image in the vicinity of the destination using the destination information input to the navigation unit; a feature point detection unit that detects feature points in the vicinity of the destination from an actually captured image; a destination recognition unit that recognizes the destination through the feature map and the feature points detected by the feature map detection unit and the feature point detection unit, respectively; and a head-up display that, when the destination is recognized by the destination recognition unit, displays a destination indicator for destination guidance at the destination on the windshield glass.
In the present invention, the feature map detection unit includes: a street image search unit that searches for a street image in the vicinity of the destination according to the destination information input to the navigation unit; and a feature map generation unit that detects feature points in the vicinity of the destination in the street image retrieved by the street image search unit and generates a feature map using the detected feature points.
In the present invention, the feature map generation unit detects, as feature points, at least one of a surrounding outline of the destination, a road surface object, and a roadside object.
In the present invention, the feature point detection unit includes: a photographing unit that photographs the area ahead of the vehicle; and a feature point generation unit that generates feature points from the image captured by the photographing unit.
In the present invention, the feature point generation unit generates feature points in the vicinity of the destination when the current position of the vehicle detected by the navigation unit is within a preset distance from the destination.
In the present invention, the destination recognition unit includes: a destination coordinate detection unit that detects destination coordinates by matching the feature map and the feature points detected by the feature map detection unit and the feature point detection unit, respectively; and a coordinate conversion unit that converts the destination coordinates detected by the destination coordinate detection unit into HUD coordinates.
In the present invention, the head-up display displays the destination indicator according to the HUD coordinates converted by the coordinate conversion unit.
The present invention further includes a current position correction unit that calculates the distance to the destination corresponding to the destination coordinates detected in the actually captured image, newly determines the current position of the vehicle by applying the distance to the destination to the destination's coordinates on the map, and inputs the newly determined current position to the navigation unit.
A destination guidance method according to one aspect of the present invention includes: receiving, by a navigation unit, destination information; detecting, by a feature map detection unit, a feature map of feature points in the vicinity of the destination from a street image in the vicinity of the destination using the destination information input to the navigation unit; detecting, by a feature point detection unit, feature points in the vicinity of the destination from an actually captured image; recognizing, by a destination recognition unit, the destination using the feature map and the feature points detected by the feature map detection unit and the feature point detection unit, respectively; and displaying, by a head-up display, a destination indicator for destination guidance at the destination recognized by the destination recognition unit.
In the present invention, detecting the feature map in the vicinity of the destination includes searching for a street image in the vicinity of the destination according to the destination information input to the navigation unit, detecting feature points in the vicinity of the destination in the retrieved street image, and then generating the feature map using the detected feature points.
In the present invention, detecting the feature map includes detecting, as feature points, at least one of a surrounding outline of the destination, a road surface object, and a roadside object.
In the present invention, detecting the feature points in the vicinity of the destination includes photographing the area ahead of the vehicle and generating feature points from the captured image.
In the present invention, detecting the feature points in the vicinity of the destination includes generating feature points in the vicinity of the destination when the current position of the vehicle detected by the navigation unit is within a preset distance from the destination.
In the present invention, recognizing the destination includes detecting destination coordinates by matching the feature map and the feature points detected by the feature map detection unit and the feature point detection unit, respectively, and converting the detected destination coordinates into HUD coordinates of the head-up display.
In the present invention, displaying the destination indicator includes displaying the destination indicator according to the HUD coordinates converted by the destination recognition unit.
The present invention further includes: calculating, by a current position correction unit, the distance to the destination corresponding to the destination coordinates detected in the actually captured image, newly determining the current position of the vehicle by applying the distance to the destination to the destination's coordinates on the map, and inputting the newly determined current position to the navigation unit; and correcting, by the navigation unit, the current position of the vehicle using the current position newly determined by the current position correction unit, and guiding the route to the destination based on the corrected current position.
A destination guidance apparatus according to another aspect of the present invention includes: a navigation terminal that guides a route to an input destination; and a server that detects a feature map of feature points in the vicinity of the destination from a street image in the vicinity of the destination using the destination input by the navigation terminal, wherein the navigation terminal detects feature points in the vicinity of the destination from an actually captured image, recognizes the destination through the feature map detected by the server and the detected feature points, and then causes a destination indicator for guiding the user to the destination to be displayed at the destination on the windshield glass through a head-up display.
In the present invention, the server detects the feature map of the feature points in the vicinity of the destination either by generating a feature map of those feature points when the destination is input from the navigation terminal, or by extracting, from pre-generated feature maps for a plurality of destinations, the feature map for the destination currently input from the navigation terminal.
In the present invention, when generating a feature map, the server searches for a street image in the vicinity of the destination according to the destination input by the navigation terminal, detects feature points in the vicinity of the destination in the retrieved street image, and generates the feature map using the detected feature points.
In the present invention, the navigation terminal detects destination coordinates by matching the feature map detected by the server with the detected feature points, converts the detected destination coordinates into HUD coordinates, and transmits the converted coordinates to the head-up display.
A destination guidance apparatus and method according to one aspect of the present invention search for images in the vicinity of a destination when the destination is set and generate a feature map of that vicinity; thereafter, when the vehicle arrives near the destination, feature points detected in a captured image are matched against the feature map to recognize the destination, which is displayed through a head-up display.
A destination guidance apparatus and method according to another aspect of the present invention display the destination precisely through a head-up display, thereby preventing situations in which the user cannot find the destination in its vicinity or drives past it.
FIG. 1 is a block diagram of a destination guidance apparatus according to an embodiment of the present invention.
FIG. 2 is a block diagram of a feature map detection unit according to an embodiment of the present invention.
FIG. 3 is a diagram illustrating an example of extracting a destination and surrounding feature points according to an embodiment of the present invention.
FIG. 4 is a block diagram of a feature point detection unit according to an embodiment of the present invention.
FIG. 5 is a block diagram of a destination recognition unit according to an embodiment of the present invention.
FIG. 6 is a diagram illustrating an example of displaying a destination according to an embodiment of the present invention.
FIG. 7 is a flowchart of a destination guidance method according to an embodiment of the present invention.
FIG. 8 is a block diagram of a destination guidance apparatus according to another embodiment of the present invention.
Hereinafter, a destination guidance apparatus and method according to an embodiment of the present invention will be described in detail with reference to the accompanying drawings. In this process, the thicknesses of lines and the sizes of components shown in the drawings may be exaggerated for clarity and convenience of description. In addition, the terms described below are defined in consideration of their functions in the present invention, and may vary according to the intention or custom of a user or operator. Therefore, definitions of these terms should be made based on the contents throughout this specification.
FIG. 1 is a block diagram of a destination guidance apparatus according to an embodiment of the present invention, FIG. 2 is a block diagram of a feature map detection unit according to an embodiment of the present invention, FIG. 3 is a diagram illustrating an example of extracting a destination and surrounding feature points according to an embodiment of the present invention, FIG. 4 is a block diagram of a feature point detection unit according to an embodiment of the present invention, FIG. 5 is a block diagram of a destination recognition unit according to an embodiment of the present invention, and FIG. 6 is a diagram illustrating an example of displaying a destination according to an embodiment of the present invention.
Referring to FIG. 1, the destination guidance apparatus according to an embodiment of the present invention includes a navigation unit 10, a feature map detection unit 20, a feature point detection unit 30, a destination recognition unit 40, and a head-up display 50.
The navigation unit 10 reads the data of the current position from a road database based on the position information of the route search device received through GPS (Global Positioning System) satellites, and displays it on a screen together with the position of the vehicle.
That is, the navigation unit 10 receives destination information from the user and guides the route to the destination according to the input destination information. At this time, the navigation unit 10 guides the route to the destination using the destination information and the current position of the vehicle received through the GPS satellites: it outputs a map stored in the road database on the screen, displays the route on the map, and also provides voice guidance.
In addition, the navigation unit 10 inputs the destination information input by the user to the feature map detection unit 20, and inputs the current position of the vehicle to the feature point detection unit 30 in real time.
Meanwhile, when a current position is input from the current position correction unit 60, the navigation unit 10 may guide the route to the destination using this current position. This will be described later.
The feature map detection unit 20 detects a feature map of feature points in the vicinity of the destination from a street image in the vicinity of the destination using the destination information input by the navigation unit 10. The feature map detection unit 20 may detect the feature map either by generating a feature map of the feature points in the vicinity of the destination when the destination information is input from the navigation unit 10, or by extracting, from pre-generated feature maps for a plurality of destinations, the feature map for the destination currently input from the navigation unit 10.
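By way of illustration only, the two detection paths described above (reuse a pre-generated feature map, or generate one on demand) can be sketched as follows. This sketch is not part of the disclosed embodiments; the cache, function names, and data structures are all assumptions:

```python
# Hypothetical sketch of the feature-map detection strategy: reuse a
# pre-generated feature map for the input destination if one exists,
# otherwise generate one from street images near that destination.

_feature_map_cache = {}  # destination identifier -> feature map

def build_feature_map(street_images):
    # Placeholder generation step: aggregate the feature points found in
    # each street image near the destination into one map.
    return {"points": [p for img in street_images for p in img["features"]]}

def detect_feature_map(destination_id, street_images):
    if destination_id in _feature_map_cache:       # pre-generated map exists:
        return _feature_map_cache[destination_id]  # just extract it
    fmap = build_feature_map(street_images)        # otherwise generate it now
    _feature_map_cache[destination_id] = fmap
    return fmap
```

The same lookup-then-generate structure applies whether the detection unit runs on the terminal or, as described later, on an external server.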
First, the method by which the feature map detection unit 20 generates a feature map will be described.
Referring to FIG. 2, the feature map detection unit 20 includes a street image search unit 21 and a feature map generation unit 22.
The street image search unit 21 searches for and acquires a street image in the vicinity of the destination according to the destination information input by the navigation unit 10.
When the destination information is input from the navigation unit 10, the street image search unit 21 searches for street images in the vicinity of the destination, including street views from the various directions from which the destination can be approached. This is because the direction of arrival at the destination may differ; that is, the user may approach along a route different from the preset one.
A street image is a photographic image of an actual street; a Street View image in which the streets near the destination have been photographed may be employed. Such street images may be stored on an external web server or in a database, and accordingly the street image search unit 21 can search for and acquire the street images near the destination from among the street images stored on the web server or in the database.
The feature map generation unit 22 detects feature points in the vicinity of the destination in the street image retrieved by the street image search unit 21 and generates a feature map using the detected feature points.
A feature point is a distinguishing point of an object by which the vicinity of the destination can be identified, and is used as information for identifying the vicinity of the destination in a street image.
The feature map generation unit 22 detects feature points in the street image near the destination retrieved by the street image search unit 21; the feature points may include the surrounding outline of the destination, road surface objects, and roadside objects.
Road surface objects may include marks formed on the surrounding roads, various road lines, and the like.
Roadside objects may include traffic lights, traffic signs, buildings, signboards, and the like on the surrounding roads.
FIG. 3 shows, as feature points, a crosswalk line on the road (feature point 1), a traffic light (feature point 2), a signboard (feature point 3), and a building. Based on these feature points, it can be recognized that the current position is near the destination. This will be described later.
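As an illustration only (the disclosure does not specify an extraction algorithm), the feature-point categories above — destination outline, road surface objects, roadside objects — can be sketched as a simple classifier over objects already detected in a street image; all labels are hypothetical:

```python
# Hypothetical labels; a real system would run an object detector first.
ROAD_SURFACE_OBJECTS = {"crosswalk_line", "lane_marking", "road_mark"}
ROADSIDE_OBJECTS = {"traffic_light", "traffic_sign", "building", "signboard"}

def extract_feature_points(detected_labels):
    """Keep only detections usable as destination-vicinity feature points."""
    features = []
    for label in detected_labels:
        if label in ROAD_SURFACE_OBJECTS:
            features.append(("road_surface", label))
        elif label in ROADSIDE_OBJECTS:
            features.append(("roadside", label))
        elif label == "destination_outline":
            features.append(("outline", label))
    return features
```

Objects outside these categories (e.g., pedestrians or other vehicles) are transient and are not useful for identifying the vicinity of the destination, so they are discarded.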
Meanwhile, when a destination is input from the navigation unit 10, the feature map detection unit 20 may extract the feature map for the currently input destination from among pre-generated feature maps for a plurality of destinations, as described above, or, if no feature map has been generated for that destination, may generate the feature map for the currently input destination in the manner described above. The function of this feature map detection unit may also be implemented in a server built outside the vehicle, as described later.
The feature point detection unit 30 detects feature points in the vicinity of the destination from an actually captured image.
Referring to FIG. 4, the feature point detection unit 30 includes a photographing unit 31 and a feature point generation unit 32.
The photographing unit 31 is installed in the vehicle, captures an image of the area ahead of the vehicle, and inputs the captured image to the feature point generation unit 32.
The photographing unit 31 may be provided separately inside the vehicle, but a camera mounted on a black box (not shown), or a camera for photographing a preceding vehicle (not shown) or lanes while the vehicle is driven, may also be employed.
The feature point generation unit 32 generates feature points from the image actually captured by the photographing unit 31. Here, the manner in which the feature point generation unit 32 extracts feature points, and the objects it extracts, are the same as those used by the feature map generation unit 22 for the street view described above; a detailed description is therefore omitted here.
In addition, the feature point generation unit 32 extracts feature points from the image actually captured by the photographing unit 31, doing so based on the current position of the vehicle.
Here, the feature point generation unit 32 may extract feature points from the images actually captured by the photographing unit 31 until the vehicle reaches the vicinity of the destination; alternatively, it may determine whether the current position of the vehicle is within a preset distance and generate feature points in the vicinity of the destination according to the determination result.
That is, the feature point generation unit 32 may receive the destination coordinates and the current position of the vehicle from the navigation unit 10 in real time, determine whether the current position of the vehicle is within the set distance from the destination coordinates, and, if so, control the photographing unit 31 to capture the area ahead of the vehicle and extract feature points from the actually captured image. By detecting feature points only within the set distance from the destination coordinates in this way, the load on the feature point generation unit 32 can be relatively reduced and detection errors minimized.
The set distance may be set in various ways depending on the presence or position of objects that can be detected as feature points near the destination, or on the distance from which the user can visually confirm the destination from inside the vehicle.
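By way of illustration only, the set-distance trigger described above can be sketched as follows; the 300 m threshold, the coordinate format, and the function names are assumptions, not values from the disclosure:

```python
import math

SET_DISTANCE_M = 300.0  # assumed threshold; the disclosure leaves it configurable

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def should_extract_features(current, destination):
    """Start capture/feature extraction only once the vehicle is near the destination."""
    return haversine_m(*current, *destination) <= SET_DISTANCE_M
```

Gating the extraction this way reflects the stated motivation: the feature point generation unit stays idle along most of the route and only processes camera frames once they can plausibly contain the destination.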
The destination recognition unit 40 recognizes the destination using the feature map and the feature points detected by the feature map detection unit 20 and the feature point detection unit 30, respectively.
Referring to FIG. 5, the destination recognition unit 40 includes a destination coordinate detection unit 41 and a coordinate conversion unit 42.
The destination coordinate detection unit 41 detects the destination coordinates by matching the feature map detected by the feature map detection unit 20 against the feature points detected by the feature point detection unit 30.
That is, on receiving the feature map from the feature map detection unit 20 and the feature points from the feature point detection unit 30, the destination coordinate detection unit 41 matches the feature map against the feature points, determines whether they agree, and identifies the destination according to the result of that determination.
For example, if the feature map and the feature points are judged to match, the destination coordinate detection unit 41 identifies the destination in the actually captured image and detects the destination coordinates within that image.
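A toy sketch of this matching step follows. It is an assumption-laden illustration: the patent does not name a descriptor type or matcher, so descriptors here are plain numeric tuples compared by squared Euclidean distance, where a real system would more likely use ORB or SIFT descriptors with a robust matcher. The `max_dist` and `min_matches` thresholds are invented for the example.

```python
def match_and_locate(map_features, live_features, max_dist=2.0, min_matches=3):
    """Sketch of the destination coordinate detection unit 41: each feature
    is an (x, y, descriptor) tuple. Map features (from the street-image
    feature map) are matched to their nearest live-image feature; if enough
    matches pass the distance threshold, the centroid of the matched live
    points is returned as the destination's image coordinates, else None."""
    matched = []
    for _, _, d_map in map_features:
        # nearest live feature by squared descriptor distance
        best = min(live_features,
                   key=lambda f: sum((a - b) ** 2 for a, b in zip(d_map, f[2])))
        dist2 = sum((a - b) ** 2 for a, b in zip(d_map, best[2]))
        if dist2 <= max_dist ** 2:
            matched.append((best[0], best[1]))
    if len(matched) < min_matches:
        return None  # feature map and live image judged not to agree
    xs, ys = zip(*matched)
    return (sum(xs) / len(xs), sum(ys) / len(ys))
```

Returning None corresponds to the "no agreement" branch of the determination; a coordinate pair corresponds to a successfully identified destination.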
The coordinate conversion unit 42 converts the destination coordinates detected by the destination coordinate detection unit 41 into HUD coordinates, by matching the destination coordinates against the HUD coordinate system.
Here, coordinate matching means that the coordinates of the actually captured street image are matched to the display range and HUD coordinates of the head-up display 50.
That is, the coordinate conversion unit 42 stores in advance the displayable range and coordinates of the head-up display 50 corresponding to the coordinates of the actually captured street image, and converts the destination coordinates into HUD coordinates by detecting the HUD coordinates that correspond to the destination coordinates within the street image captured by the photographing unit 31.
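The pre-stored image-to-HUD correspondence can be sketched as a simple linear mapping. This is an assumption: the patent only says the correspondence is stored in advance, and a production system would typically calibrate a full homography per vehicle rather than the axis-aligned scaling shown here.

```python
def image_to_hud(dest_xy, image_size, hud_viewport):
    """Sketch of the coordinate conversion unit 42: map a destination pixel
    (u, v) in the camera image onto HUD coordinates. `hud_viewport` is
    (x0, y0, width, height), the pre-stored HUD region that corresponds to
    the full camera frame; the linear model is a simplifying assumption."""
    img_w, img_h = image_size
    x0, y0, hud_w, hud_h = hud_viewport
    u, v = dest_xy
    return (x0 + u / img_w * hud_w, y0 + v / img_h * hud_h)
```

The head-up display 50 would then draw the destination indicator at the returned HUD coordinates.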
When destination coordinates are detected in the actually captured image as described above, the current position correction unit 60 corrects the current position based on those coordinates. Specifically, the actual distance corresponding to each coordinate in the captured image is set in advance, and the current position correction unit 60 uses this to calculate the distance to the destination corresponding to the destination coordinates detected in the image. The current position correction unit 60 then newly derives the vehicle's current position by applying the calculated distance to the destination's map coordinates, and inputs the derived position to the navigation unit 10.
Accordingly, the navigation unit 10 corrects the vehicle's current position to the position received from the current position correction unit 60 and guides the route to the destination more accurately based on the corrected position.
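The back-projection from the destination's map coordinates can be sketched as follows. The flat local map plane (metres east/north) and the explicit heading input are simplifying assumptions; the patent states only that the computed distance is applied to the destination's map coordinates.

```python
import math

def correct_current_position(dest_map_xy, distance_to_dest_m, heading_rad):
    """Sketch of the current position correction unit 60: given the
    destination's known map coordinates (metres east, metres north), the
    image-derived distance to it, and the vehicle heading (radians,
    0 = north), step back along the heading to obtain a corrected
    vehicle position on the same local plane."""
    ex, ny = dest_map_xy
    return (ex - distance_to_dest_m * math.sin(heading_rad),
            ny - distance_to_dest_m * math.cos(heading_rad))
```

The corrected position would then be fed to the navigation unit 10 in place of the GNSS-derived position.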
When the destination is recognized by the destination recognition unit 40, the head-up display 50 displays, at the destination on the windshield glass, a destination indicator for guiding the user to the destination.
That is, when the HUD coordinates corresponding to the destination coordinates are detected by the destination recognition unit 40, the head-up display 50 displays a destination indicator at those HUD coordinates, as shown in FIG. 6.
In this case, the head-up display 50 may display the destination indicator directly at the destination coordinates, or alternatively around the destination.
The destination indicator may be formed as a polygon, or as a symbol such as a dot, a line, or an arrow.
Hereinafter, a destination guidance method according to an embodiment of the present invention is described in detail with reference to FIG. 7.
FIG. 7 is a flowchart of a destination guidance method according to an embodiment of the present invention.
Referring to FIG. 7, the navigation unit 10 first receives destination information from the user (S10). The navigation unit 10 inputs the destination information to the feature map detection unit 20 and inputs the vehicle's current position to the feature point detection unit 30 in real time.
Meanwhile, as the destination information is received from the user, the navigation unit 10 guides the route to that destination.
The street image search unit 21 searches for street images near the destination according to the destination information input by the navigation unit 10 (S20). In this case, the feature map detection unit 20 may search street views from the various directions in which the destination can be approached.
The feature map generation unit 22 then detects feature points near the destination in the street images retrieved by the street image search unit 21 and generates a feature map from the detected feature points (S30). The feature points may include the destination's surrounding contours, road-surface objects, and roadside objects.
Meanwhile, the feature point generation unit 32 receives the destination coordinates and the vehicle's current position from the navigation unit 10 in real time, and determines whether the current position is within the set distance of the destination coordinates (S40).
If, as a result of the determination in step S40, the vehicle's current position is within the set distance of the destination coordinates, the feature point generation unit 32 controls the photographing unit 31 to capture a street image near the destination and detects feature points in the actually captured image (S50).
The destination coordinate detection unit 41 then matches the detected feature map against the feature points, determines whether they agree, identifies the destination in the actually captured image according to the result, and detects the destination coordinates within that image (S60).
Once the destination coordinates are detected, the coordinate conversion unit 42 converts them into HUD coordinates by matching the destination coordinates of the street image captured by the photographing unit 31 against the HUD coordinate system (S70).
As the destination coordinates are converted into HUD coordinates as described above, the head-up display 50 displays a destination indicator at those HUD coordinates to guide the user to the destination (S80).
In addition, as the destination coordinates are detected in the actually captured image, the current position correction unit 60 calculates the distance to the destination corresponding to the detected destination coordinates, newly derives the vehicle's current position by applying that distance to the destination's map coordinates, and inputs the derived position to the navigation unit 10.
Accordingly, the navigation unit 10 can correct the vehicle's current position to the position received from the current position correction unit 60 and guide the route to the destination more accurately based on the corrected position.
Although the navigation unit 10, the feature map detection unit 20, the feature point detection unit 30, the destination recognition unit 40, and the current position correction unit 60 have been described above as separate components, in some embodiments, as shown in FIG. 8, the navigation unit 10, the feature point detection unit 30, the destination recognition unit 40, and the current position correction unit 60 may be integrated into a navigation terminal 100 installed in the vehicle, while the feature map detection unit 20 is implemented as an external server 200 distinct from the navigation terminal 100.
That is, in this embodiment the navigation terminal 100 guides the route to the input destination, and the server 200 uses the destination input through the navigation terminal 100 to detect a feature map of the feature points near the destination from street images of its vicinity; the navigation terminal 100 then detects feature points near the destination from the actually captured image, recognizes the destination using the feature map detected by the server 200 together with the detected feature points, and causes a destination indicator guiding the user to the destination to be displayed at the destination on the windshield glass through the head-up display.
In this case, as described above, the server 200 may detect the feature map for the feature points near the destination either by generating it when the destination is input from the navigation terminal 100, or by extracting, from feature maps already generated for a plurality of destinations, the feature map corresponding to the destination currently input from the navigation terminal 100. When generating a feature map, the server 200 may search for street images near the destination according to the destination input by the navigation terminal 100, detect feature points near the destination in the retrieved street images, and generate the feature map from the detected feature points.
The navigation terminal 100 may also detect the destination coordinates by matching the feature map detected by the server 200 against the detected feature points, convert the detected destination coordinates into HUD coordinates, and pass them to the head-up display, whereupon a destination indicator guiding the user to the destination is displayed on the windshield glass through the head-up display.
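The server's generate-or-extract behavior amounts to caching feature maps per destination, which can be sketched as follows. The class name and the `build_feature_map` callable are hypothetical stand-ins for the patent's search-and-extract step, whose implementation is not specified.

```python
class FeatureMapServer:
    """Sketch of server 200: return a previously generated feature map for
    a destination if one exists, otherwise build and cache one from street
    imagery near the destination. `build_feature_map` is an injected
    stand-in for the street-image search and feature extraction."""

    def __init__(self, build_feature_map):
        self._build = build_feature_map
        self._cache = {}  # destination -> feature map

    def get_feature_map(self, destination):
        if destination not in self._cache:
            # generate-on-demand branch: search street images and extract
            self._cache[destination] = self._build(destination)
        return self._cache[destination]  # extract-existing branch
```

Repeated requests for the same destination hit the cache, so the expensive street-image processing runs once per destination.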
The implementations described herein may be embodied, for example, in a method or process, an apparatus, a software program, a data stream, or a signal. Even if discussed only in the context of a single form of implementation (for example, only as a method), the features discussed may also be implemented in other forms (for example, an apparatus or a program). An apparatus may be implemented in suitable hardware, software, firmware, and the like. A method may be implemented in an apparatus such as a processor, which refers generally to a processing device including, for example, a computer, a microprocessor, an integrated circuit, or a programmable logic device. Processors also include communication devices that facilitate the communication of information between end users, such as computers, cell phones, portable/personal digital assistants ("PDAs"), and other devices.
As described above, the destination guidance apparatus and method according to an embodiment of the present invention display the destination accurately through the head-up display, thereby preventing situations in which the user cannot find the destination in its vicinity or drives past it.
Although the present invention has been described with reference to the embodiments shown in the drawings, these are merely illustrative, and those of ordinary skill in the art will understand that various modifications and other equivalent embodiments are possible. Therefore, the true technical scope of the present invention should be determined by the following claims.

Claims (20)

  1. A destination guidance apparatus comprising: a navigation unit that receives destination information and guides a route to a destination;
    a feature map detection unit that detects a feature map of feature points near the destination from street images of the destination's vicinity, using the destination information input by the navigation unit;
    a feature point detection unit that detects feature points near the destination from an actually captured image;
    a destination recognition unit that recognizes the destination using the feature map and the feature points detected by the feature map detection unit and the feature point detection unit, respectively; and
    a head-up display that, when the destination is recognized by the destination recognition unit, displays a destination indicator on the windshield glass to guide the user to the destination.
  2. The destination guidance apparatus of claim 1, wherein the feature map detection unit comprises:
    a street image search unit that searches for street images near the destination according to the destination information input by the navigation unit; and
    a feature map generation unit that detects feature points near the destination in the street images retrieved by the street image search unit and generates a feature map from the detected feature points.
  3. The destination guidance apparatus of claim 2, wherein the feature map generation unit detects, as feature points, at least one of the destination's surrounding contours, road-surface objects, and roadside objects.
  4. The destination guidance apparatus of claim 1, wherein the feature point detection unit comprises:
    a photographing unit that captures the view ahead of the vehicle; and
    a feature point generation unit that generates feature points from the image captured by the photographing unit.
  5. The destination guidance apparatus of claim 4, wherein the feature point generation unit generates feature points near the destination when the vehicle's current position detected by the navigation unit is within a preset distance of the destination.
  6. The destination guidance apparatus of claim 1, wherein the destination recognition unit comprises:
    a destination coordinate detection unit that detects destination coordinates by matching the feature map and the feature points detected by the feature map detection unit and the feature point detection unit, respectively; and
    a coordinate conversion unit that converts the destination coordinates detected by the destination coordinate detection unit into HUD coordinates.
  7. The destination guidance apparatus of claim 6, wherein the head-up display displays the destination indicator according to the HUD coordinates converted by the coordinate conversion unit.
  8. The destination guidance apparatus of claim 1, further comprising a current position correction unit that calculates the distance to the destination corresponding to the destination coordinates detected in the actually captured image, newly derives the vehicle's current position by applying that distance to the destination's map coordinates, and inputs the newly derived current position to the navigation unit.
  9. A destination guidance method comprising: receiving, by a navigation unit, destination information;
    detecting, by a feature map detection unit, a feature map of feature points near the destination from street images of the destination's vicinity, using the destination information input by the navigation unit;
    detecting, by a feature point detection unit, feature points near the destination from an actually captured image;
    recognizing, by a destination recognition unit, the destination using the feature map and the feature points detected by the feature map detection unit and the feature point detection unit, respectively; and
    displaying, by a head-up display, a destination indicator for guiding the user to the destination recognized by the destination recognition unit.
  10. The destination guidance method of claim 9, wherein detecting the feature map near the destination comprises searching for street images near the destination according to the destination information input by the navigation unit, detecting feature points near the destination in the retrieved street images, and generating the feature map from the detected feature points.
  11. The destination guidance method of claim 9, wherein detecting the feature map comprises detecting, as feature points, at least one of the destination's surrounding contours, road-surface objects, and roadside objects.
  12. The destination guidance method of claim 9, wherein detecting the feature points near the destination comprises capturing the view ahead of the vehicle and generating feature points from the captured image.
  13. The destination guidance method of claim 12, wherein detecting the feature points near the destination comprises generating feature points near the destination when the vehicle's current position detected by the navigation unit is within a preset distance of the destination.
  14. The destination guidance method of claim 9, wherein recognizing the destination comprises detecting destination coordinates by matching the feature map and the feature points detected by the feature map detection unit and the feature point detection unit, respectively, and converting the detected destination coordinates into HUD coordinates of the head-up display.
  15. The destination guidance method of claim 14, wherein displaying the destination indicator comprises displaying the destination indicator according to the HUD coordinates converted by the destination recognition unit.
  16. The destination guidance method of claim 9, further comprising: calculating, by a current position correction unit, the distance to the destination corresponding to the destination coordinates detected in the actually captured image, newly deriving the vehicle's current position by applying that distance to the destination's map coordinates, and inputting the newly derived current position to the navigation unit; and correcting, by the navigation unit, the vehicle's current position using the position newly derived by the current position correction unit and guiding the route to the destination based on the corrected current position.
  17. A destination guidance apparatus comprising: a navigation terminal that guides a route to an input destination; and
    a server that detects a feature map of feature points near the destination from street images of the destination's vicinity, using the destination input through the navigation terminal,
    wherein the navigation terminal detects feature points near the destination from an actually captured image, recognizes the destination using the feature map detected by the server together with the detected feature points, and causes a destination indicator guiding the user to the destination to be displayed at the destination on the windshield glass through a head-up display.
  18. The destination guidance apparatus of claim 17, wherein the server detects the feature map of the feature points near the destination either by generating it when the destination is input from the navigation terminal, or by extracting, from feature maps already generated for a plurality of destinations, the feature map corresponding to the destination currently input from the navigation terminal.
  19. The destination guidance apparatus of claim 18, wherein, when generating the feature map, the server searches for street images near the destination according to the destination input by the navigation terminal, detects feature points near the destination in the retrieved street images, and generates the feature map from the detected feature points.
  20. The destination guidance apparatus of claim 17, wherein the navigation terminal detects destination coordinates by matching the feature map detected by the server against the detected feature points, converts the detected destination coordinates into HUD coordinates, and passes them to the head-up display.
PCT/KR2018/016645 2017-12-28 2018-12-26 Destination guide apparatus and method WO2019132504A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020170182196A KR102541069B1 (en) 2017-12-28 2017-12-28 Apparatus and method for guiding destination
KR10-2017-0182196 2017-12-28

Publications (1)

Publication Number Publication Date
WO2019132504A1 true WO2019132504A1 (en) 2019-07-04

Family

ID=67067860

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2018/016645 WO2019132504A1 (en) 2017-12-28 2018-12-26 Destination guide apparatus and method

Country Status (2)

Country Link
KR (1) KR102541069B1 (en)
WO (1) WO2019132504A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112577524A (en) * 2020-12-16 2021-03-30 北京百度网讯科技有限公司 Information correction method and device

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102349652B1 (en) * 2020-03-26 2022-01-12 주식회사 라이드플럭스 Method, apparatus and computer program for providing route guidance service using traffic lights information and location information of vehicle
KR102427713B1 (en) * 2020-06-03 2022-08-03 홍기방 Method for door to car delivery service

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20120067479A (en) * 2010-12-16 2012-06-26 에스케이플래닛 주식회사 Navigation system using picture and method of cotnrolling the same
KR20140064424A (en) * 2012-11-20 2014-05-28 엘지전자 주식회사 Apparatus and method for recognizing position of car
KR20150034997A (en) * 2013-09-27 2015-04-06 네이버 주식회사 Method and system for notifying destination by route guide
KR20170016203A (en) * 2015-08-03 2017-02-13 현대모비스 주식회사 Route guidandce apparatus and control method for the same
KR20170105165A (en) * 2016-03-08 2017-09-19 주식회사 비에스피 Verifying device and verifying method of driving path using the road view and the shooting image

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20150066036A (en) * 2013-12-06 2015-06-16 이동녀 Position Guidance System and Method Using Transparency Navigation
KR101885356B1 (en) * 2016-05-04 2018-08-03 임재형 Apparatus for determining position information of object and method thereof


Also Published As

Publication number Publication date
KR102541069B1 (en) 2023-06-07
KR20190080030A (en) 2019-07-08

Similar Documents

| Publication | Title |
|---|---|
| WO2019132504A1 | Destination guide apparatus and method |
| WO2012157850A2 | Real-time map data updating system and method |
| WO2011055978A2 | User terminal, method for providing position and method for guiding route thereof |
| WO2021006441A1 | Road sign information collection method using mobile mapping system |
| WO2011025254A2 | Method for providing vehicle information and terminal device applying the same |
| WO2017131334A1 | System and method for recognizing location of mobile robot and making map |
| EP2740103A1 | Traffic lane recognizing apparatus and method thereof |
| WO2019054593A1 | Map production apparatus using machine learning and image processing |
| WO2018230845A1 | Method for positioning on basis of vision information and robot implementing same |
| WO2020159076A1 | Landmark location estimation apparatus and method, and computer-readable recording medium storing computer program programmed to perform method |
| WO2020067751A1 | Device and method for data fusion between heterogeneous sensors |
| WO2019240340A1 | Speeding guide device capable of measuring speed of vehicle by using camera, and operation method therefor |
| WO2020075954A1 | Positioning system and method using combination of results of multimodal sensor-based location recognition |
| WO2015064892A1 | Method for emphasizing point of interest of navigation user for each time period and route guidance server |
| CN110998684A | Image collection system, image collection method, image collection device, recording medium, and vehicle communication device |
| WO2021045445A1 | Driver's license test processing device |
| WO2018021870A1 | Navigation system and location correction method of navigation system |
| WO2021241847A1 | Method and system for generating visual feature map |
| WO2021194109A1 | Method, device, and computer program for providing driving guide by using vehicle position information and signal light information |
| WO2015108401A1 | Portable device and control method using plurality of cameras |
| KR20140064424A | Apparatus and method for recognizing position of car |
| WO2020071573A1 | Location information system using deep learning and method for providing same |
| WO2020209551A1 | Portable apparatus for measuring air quality and method for displaying information about air quality |
| WO2013022153A1 | Apparatus and method for detecting lane |
| WO2016003023A1 | Route search method, device and server |

Legal Events

Date Code Title Description
121 Ep: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 18893721

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: PCT application non-entry in European phase

Ref document number: 18893721

Country of ref document: EP

Kind code of ref document: A1