WO2019132504A1 - Destination guidance apparatus and method - Google Patents
- Publication number
- WO2019132504A1 (PCT/KR2018/016645)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- destination
- unit
- feature
- feature map
- vicinity
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3626—Details of the output of route guidance instructions
- G01C21/3647—Guidance involving output of stored or live camera images or video streams
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/3446—Details of route searching algorithms, e.g. Dijkstra, A*, arc-flags, using precalculated routes
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3626—Details of the output of route guidance instructions
- G01C21/3632—Guidance using simplified or iconic instructions, e.g. using arrows
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
Definitions
- The present invention relates to a destination guidance apparatus and method and, more particularly, to a destination guidance apparatus and method for accurately guiding a user to a destination through a head-up display.
- A navigation system reads current-position data from a road database based on the position information received through GPS (Global Positioning System) satellites and displays the current position of the vehicle on an on-screen map. It is a device that helps the driver identify roads and find a destination easily, even when driving on an unfamiliar road.
- A head-up display may be used to present the information provided by the navigation system.
- In the past, however, the navigation information provided through the head-up display was simply projected onto the windshield; even with a head-up display, the driver could fail to find the destination in its vicinity or drive past it.
- The present invention has been made to overcome the above problems. It is an object of one aspect of the present invention to provide a destination guidance apparatus and method that search for street images near a destination to generate a feature map of the vicinity of the destination, match feature points detected in an actually captured image against the feature map to recognize the destination, and display the destination through a head-up display.
- A destination guidance apparatus according to one aspect of the present invention includes: a navigation unit that receives destination information and guides a route to the destination; a feature map detection unit that detects a feature map of feature points in the vicinity of the destination from a street image of the vicinity, using the destination information input by the navigation unit; a feature point detection unit that detects feature points in the vicinity of the destination from an actually captured image; a destination recognition unit that recognizes the destination through the feature map and the feature points detected by the feature map detection unit and the feature point detection unit, respectively; and a head-up display that, when the destination is recognized by the destination recognition unit, displays a destination indicator for guiding to the destination on the windshield glass.
- The feature map detection unit may include: a street image search unit that searches for a street image of the vicinity of the destination according to the destination information input by the navigation unit; and a feature map generation unit that detects feature points in the vicinity of the destination in the street image retrieved by the street image search unit and generates the feature map using the detected feature points.
- The feature map generation unit of the present invention detects at least one of a surrounding contour of the destination, a road surface object, and a roadside object as a feature point.
- The feature point detection unit of the present invention includes: a photographing unit that photographs the area in front of the vehicle; and a feature point generation unit that generates feature points from the image captured by the photographing unit.
- The feature point generation unit of the present invention generates the feature points near the destination when the current position of the vehicle detected by the navigation unit is within a predetermined set distance from the destination.
- The destination recognition unit of the present invention includes: a destination coordinate detection unit that detects destination coordinates by matching the feature map and the feature points detected by the feature map detection unit and the feature point detection unit, respectively; and a coordinate conversion unit that converts the destination coordinates detected by the destination coordinate detection unit into HUD coordinates.
- The head-up display of the present invention displays the destination indicator according to the HUD coordinates converted by the coordinate conversion unit.
- The present invention may further include a current position correction unit that calculates the distance to the destination corresponding to the destination coordinates detected in the actually captured image, newly detects the current position of the vehicle by applying that distance to the map coordinates of the destination, and inputs the newly detected current position of the vehicle to the navigation unit.
- A destination guidance method according to another aspect of the present invention includes: a navigation unit receiving destination information;
- the feature map detection unit detecting a feature map of feature points in the vicinity of the destination from a street image of the vicinity, using the destination information input by the navigation unit; the feature point detection unit detecting feature points near the destination from an actually captured image;
- the destination recognition unit recognizing the destination using the feature map and the feature points detected by the feature map detection unit and the feature point detection unit, respectively; and the head-up display displaying a destination indicator for guiding to the destination recognized by the destination recognition unit.
- The step of detecting the feature map in the vicinity of the destination may include searching for a street image of the vicinity of the destination according to the destination information input by the navigation unit, detecting feature points in the vicinity of the destination in the retrieved street image, and generating the feature map using the detected feature points.
- The step of detecting the feature map detects at least one of a surrounding contour of the destination, a road surface object, and a roadside object as a feature point.
- The step of detecting the feature points in the vicinity of the destination photographs the area in front of the vehicle and generates the feature points from the captured image.
- In the step of detecting the feature points near the destination, the feature points are generated when the current position of the vehicle detected by the navigation unit is within a predetermined set distance from the destination.
- The step of recognizing the destination includes detecting the destination coordinates by matching the feature map and the feature points detected by the feature map detection unit and the feature point detection unit, respectively, and converting the detected destination coordinates into HUD coordinates of the head-up display.
- In the step of displaying the destination indicator, the destination indicator is displayed according to the HUD coordinates converted by the destination recognition unit.
- The method may further include: the current position correction unit calculating the distance to the destination corresponding to the destination coordinates detected in the actually captured image, newly detecting the current position of the vehicle by applying that distance to the map coordinates of the destination, and inputting the newly detected current position to the navigation unit; and the navigation unit correcting the current position of the vehicle based on the newly detected current position and guiding the route to the destination based on the corrected current position.
- A destination guidance apparatus according to still another aspect of the present invention includes: a navigation terminal that guides a route to an input destination; and a server that detects a feature map of feature points in the vicinity of the destination from a street image of the vicinity, using the destination input by the navigation terminal. The navigation terminal detects feature points in the vicinity of the destination from an actually captured image, recognizes the destination through the feature map detected by the server and the detected feature points, and then displays, through a head-up display, a destination indicator for guiding to the destination on the windshield glass.
- The server of the present invention detects the feature map of the feature points in the vicinity of the destination either by generating the feature map when the destination is input from the navigation terminal, or by extracting, from pre-generated feature maps for a plurality of destinations, the feature map for the destination currently input from the navigation terminal.
- When generating the feature map, the server of the present invention searches for a street image of the vicinity of the destination according to the destination input by the navigation terminal, detects feature points in the vicinity of the destination in the retrieved street image, and generates the feature map using the detected feature points.
- The navigation terminal of the present invention detects the destination coordinates by matching the feature map detected by the server with the detected feature points, converts the detected destination coordinates into HUD coordinates, and transmits the converted coordinates to the head-up display.
- When a destination is set, the destination guidance apparatus and method according to one aspect of the present invention search for street images near the destination to generate a feature map of the vicinity of the destination, recognize the destination by matching feature points detected in an actually captured image with the feature map, and display the destination through the head-up display.
- The destination guidance apparatus and method according to one aspect of the present invention accurately display the destination through the head-up display, thereby preventing situations in which the user cannot find the destination in its vicinity or passes it.
- FIG. 1 is a block diagram of a destination guide apparatus according to an embodiment of the present invention.
- FIG. 2 is a block diagram of a feature map detecting unit according to an embodiment of the present invention.
- FIG. 3 is a diagram illustrating an example of extracting a destination and a neighboring feature point according to an embodiment of the present invention.
- FIG. 4 is a block diagram of a feature point detector according to an embodiment of the present invention.
- FIG. 5 is a block diagram of a destination recognizer according to an exemplary embodiment of the present invention.
- FIG. 6 is a diagram illustrating an example of displaying a destination according to an exemplary embodiment of the present invention.
- FIG. 7 is a flowchart of a destination guidance method according to an embodiment of the present invention.
- FIG. 8 is a block diagram of a destination guidance apparatus according to another embodiment of the present invention.
- A destination guidance apparatus according to an embodiment of the present invention includes a navigation unit 10, a feature map detection unit 20, a feature point detection unit 30, a destination recognition unit 40, and a head-up display 50.
- The navigation unit 10 reads current-position data from the road database based on the position information received through GPS (Global Positioning System) satellites and displays the current position of the vehicle on an on-screen map.
- The navigation unit 10 receives destination information from the user and guides the route to the destination according to the input destination information. At this time, the navigation unit 10 guides the route using the current position of the vehicle received through the GPS satellites and the destination information, outputs the map stored in the road database on the screen, displays the route on the map, and provides voice guidance.
- The navigation unit 10 inputs the destination information entered by the user to the feature map detection unit 20, and inputs the current position of the vehicle to the feature point detection unit 30 in real time.
- The navigation unit 10 may also guide the route to the destination using a corrected current position; this will be described later.
- The feature map detection unit 20 detects a feature map of feature points in the vicinity of the destination from a street image of the vicinity, using the destination information input by the navigation unit 10. When the destination information is input from the navigation unit 10, the feature map detection unit 20 may generate a feature map of the feature points near the destination, or may detect the feature map by extracting, from pre-generated feature maps for a plurality of destinations, the feature map for the destination currently input.
- The feature map detection unit 20 includes a street image search unit 21 and a feature map generation unit 22.
- The street image search unit 21 searches for and acquires a street image of the vicinity of the destination according to the destination information input by the navigation unit 10.
- In this case, the street image search unit 21 searches for street images of the vicinity of the destination, including street views in the various directions from which the destination can be approached. This is because the user may approach the destination from a direction different from the guided route.
- The street image may be an actually captured image of a street; for example, a street view in which the streets in the vicinity of the destination have been photographed may be employed. The street images may be built in an external web server or database, and the street image search unit 21 can search for and acquire the street images near the destination from that web server or database.
- The feature map generation unit 22 detects feature points in the vicinity of the destination from the street image retrieved by the street image search unit 21 and generates a feature map using the detected feature points.
- A feature point is a distinctive point of an object by which the vicinity of the destination can be identified, and is used as information for identifying the vicinity of the destination in the street image.
- The feature map generation unit 22 detects feature points in the street image of the vicinity of the destination retrieved by the street image search unit 21.
- The feature points may include a surrounding contour of the destination, a road surface object, and a roadside object.
- The road surface object may include marks formed on the surrounding road surface, various road lines, and the like.
- Roadside objects may include traffic lights, traffic signs, buildings or signboards on nearby roads.
- In the example of FIG. 3, a crosswalk line (feature point 1), a traffic light (feature point 2), a signboard (feature point 3), and a building may be extracted as feature points.
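The patent does not prescribe a data layout for the feature map. As an illustrative sketch (all names hypothetical), the feature map generation unit 22 could store the labeled points of FIG. 3 per destination as follows:

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class FeaturePoint:
    kind: str    # e.g. "crosswalk", "traffic_light", "signboard"
    xy: tuple    # pixel coordinates in the street image

@dataclass
class FeatureMap:
    destination: str
    points: list = field(default_factory=list)

    def add(self, kind, xy):
        # Each object detected near the destination becomes one labeled point.
        self.points.append(FeaturePoint(kind, xy))

# Example mirroring FIG. 3: crosswalk, traffic light, signboard.
fmap = FeatureMap("destination_1")
fmap.add("crosswalk", (120, 340))
fmap.add("traffic_light", (200, 80))
fmap.add("signboard", (310, 150))
```

A server-side store could then keep one such feature map per destination, matching the pre-generated-map variant described above.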
- The feature map detection unit 20 may detect the feature map by extracting, from pre-generated feature maps for a plurality of destinations, the feature map for the destination currently input from the navigation unit 10, or, if a feature map for that destination has not been generated, may generate it through the method described above.
- The function of the feature map detection unit may also be implemented in a server built outside the vehicle, as described later.
- The feature point detection unit 30 detects feature points near the destination from the actually captured image.
- The feature point detection unit 30 includes a photographing unit 31 and a feature point generation unit 32.
- The photographing unit 31 is installed in the vehicle, captures an image of the area in front of the vehicle, and inputs the captured image to the feature point generation unit 32.
- The photographing unit 31 may be a dedicated camera provided inside the vehicle, but may also be a camera mounted in a black box (not shown), a camera for photographing a preceding vehicle (not shown), or a camera for photographing lanes.
- The feature point generation unit 32 generates feature points from the image actually captured by the photographing unit 31.
- The manner in which the feature point generation unit 32 extracts feature points, and the objects it extracts, are the same as those used by the feature map generation unit 22 to extract feature points from the street view; a detailed description is therefore omitted here.
- The feature point generation unit 32 extracts feature points from the image actually captured by the photographing unit 31, based on the current position of the vehicle.
- The feature point generation unit 32 could extract feature points from the captured image continuously until the vehicle reaches the vicinity of the destination, but it may instead determine whether the current position of the vehicle is within a predetermined set distance from the destination and generate the feature points near the destination according to the determination result.
- That is, the feature point generation unit 32 receives the destination coordinates and the current position of the vehicle from the navigation unit 10 in real time, determines whether the current position of the vehicle is within the set distance from the destination coordinates, and, if so, controls the photographing unit 31 to photograph the area in front of the vehicle and extracts the feature points from the actually captured image. By detecting the feature points only within the set distance from the destination coordinates in this manner, the load on the feature point generation unit 32 can be relatively reduced and detection errors can be minimized.
- The set distance may be set variously depending on the existence or position of objects that can be detected as feature points in the vicinity of the destination, or on the distance at which the user can visually confirm the destination from the vehicle.
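The set-distance check reduces to comparing the distance between the vehicle's GPS position and the destination against a threshold. A minimal sketch, assuming WGS-84 (latitude, longitude) pairs and a haversine great-circle distance (the patent does not specify the distance formula, and the 300 m default is an illustrative value):

```python
import math

EARTH_RADIUS_M = 6371000.0

def within_set_distance(vehicle, destination, set_distance_m=300.0):
    """True when the vehicle is within set_distance_m of the destination.

    vehicle and destination are (latitude, longitude) pairs in degrees.
    """
    lat1, lon1 = map(math.radians, vehicle)
    lat2, lon2 = map(math.radians, destination)
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = math.sin(dlat / 2) ** 2 + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2
    distance_m = 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))
    return distance_m <= set_distance_m
```

The feature point generation unit would evaluate this gate each time the navigation unit reports a new position and start image capture only once it returns True.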
- The destination recognition unit 40 recognizes the destination through the feature map and the feature points detected by the feature map detection unit 20 and the feature point detection unit 30, respectively.
- The destination recognition unit 40 includes a destination coordinate detection unit 41 and a coordinate conversion unit 42.
- The destination coordinate detection unit 41 detects the destination coordinates by matching the feature map detected by the feature map detection unit 20 with the feature points detected by the feature point detection unit 30.
- That is, the destination coordinate detection unit 41 receives the feature map from the feature map detection unit 20 and, upon receiving the feature points from the feature point detection unit 30, matches the feature map with the feature points and identifies the destination according to the matching result.
- When it is determined that the feature map matches the feature points, the destination coordinate detection unit 41 detects the destination coordinates from the actually captured image.
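The patent leaves the matching algorithm open. One common approach is nearest-neighbour descriptor matching with a ratio test; the sketch below (with hypothetical descriptors represented as NumPy vectors) illustrates how feature-map points could be matched against feature points extracted from the captured image:

```python
import numpy as np

def match_feature_map(map_desc, img_desc, ratio=0.75):
    """Match feature-map descriptors against captured-image descriptors.

    Uses a Lowe-style ratio test: a map point matches only when its nearest
    image descriptor is clearly closer than the second nearest.
    Returns a list of (map_index, image_index) pairs.
    """
    matches = []
    for i, d in enumerate(map_desc):
        dists = np.linalg.norm(img_desc - d, axis=1)
        order = np.argsort(dists)
        if len(order) > 1 and dists[order[0]] < ratio * dists[order[1]]:
            matches.append((i, int(order[0])))
    return matches
```

If enough map points find a match, the destination can be considered identified, and the destination coordinates read from the matched image locations.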
- The coordinate conversion unit 42 converts the destination coordinates detected by the destination coordinate detection unit 41 into HUD coordinates.
- That is, the coordinate conversion unit 42 converts the destination coordinates detected by the destination coordinate detection unit 41 into HUD coordinates through coordinate matching.
- Here, the coordinate matching maps the coordinates of the actually captured street image to the display range and HUD coordinates of the head-up display 50.
- To this end, the coordinate conversion unit 42 stores in advance the displayable range and HUD coordinates of the head-up display 50 corresponding to the coordinates of the actually captured street image, and converts the destination coordinates into HUD coordinates by looking up the HUD coordinates that match the destination coordinates.
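Functionally, the stored image-to-HUD correspondence acts like a pre-calibrated projective mapping. As an illustrative sketch (the patent only states that the mapping is stored in advance; the homography H is an assumed calibration artifact), a 3x3 homography can carry an image pixel to HUD display coordinates:

```python
import numpy as np

def image_to_hud(dest_xy, H):
    """Map destination pixel coordinates in the captured street image to
    HUD coordinates via a pre-calibrated 3x3 homography H (assumed known
    from a one-time camera/HUD calibration)."""
    x, y = dest_xy
    u, v, w = H @ np.array([x, y, 1.0])
    return (u / w, v / w)
```

With such a mapping, the lookup described above becomes a single matrix multiplication per detected destination coordinate.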
- The current position correction unit 60 corrects the current position based on the destination coordinates detected from the actually captured image as described above. That is, the current position correction unit 60 calculates in advance the actual distance corresponding to each coordinate in the captured image, and calculates the distance to the destination corresponding to the detected destination coordinates. The current position correction unit 60 then newly detects the current position of the vehicle by applying the calculated distance to the map coordinates of the destination, and inputs the newly detected current position to the navigation unit 10.
- The navigation unit 10 corrects the current position of the vehicle to the current position input from the current position correction unit 60, and guides the route to the destination more accurately based on the corrected current position.
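The correction step can be sketched as stepping back from the destination's known map coordinates by the camera-estimated distance, along the vehicle's heading. The local east-north frame and heading convention below are assumptions for illustration; the patent only states that the distance is applied to the destination's map coordinates:

```python
import math

def correct_position(dest_map_xy, heading_deg, dist_to_dest_m):
    """Re-estimate the vehicle position in a local east-north frame (metres).

    heading_deg is measured clockwise from north; the vehicle is assumed to
    face the destination, so it sits dist_to_dest_m behind the destination
    along that heading.
    """
    dx = dist_to_dest_m * math.sin(math.radians(heading_deg))
    dy = dist_to_dest_m * math.cos(math.radians(heading_deg))
    return (dest_map_xy[0] - dx, dest_map_xy[1] - dy)
```

The navigation unit would then replace its raw GPS fix with this estimate, which is anchored to the visually confirmed destination and so can be tighter than GPS alone near the destination.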
- The head-up display 50 displays a destination indicator for guiding to the destination on the windshield glass when the destination is recognized by the destination recognition unit 40.
- That is, when the destination recognition unit 40 detects the HUD coordinates corresponding to the destination coordinates, the head-up display 50 displays the destination indicator for guiding to the destination according to those HUD coordinates, as shown in FIG. 6.
- The head-up display 50 can display the destination indicator directly at the destination coordinates, but may also display it around the destination.
- The destination indicator may be formed of polygons, or of symbols such as dots, lines, and arrows.
- FIG. 7 is a flowchart of a destination guidance method according to an embodiment of the present invention.
- The navigation unit 10 receives destination information from the user (S10). At this time, the navigation unit 10 inputs the destination information entered by the user to the feature map detection unit 20, and inputs the current position of the vehicle to the feature point detection unit 30 in real time.
- The navigation unit 10 then guides the route to the destination.
- The street image search unit 21 searches for a street image of the vicinity of the destination according to the destination information input by the navigation unit 10 (S20).
- In this case, the feature map detection unit 20 can search street views in the various directions from which the destination can be approached.
- The feature map generation unit 22 detects feature points in the vicinity of the destination in the street image retrieved by the street image search unit 21, and generates the feature map using the detected feature points (S30).
- The feature points may include a surrounding contour of the destination, a road surface object, and a roadside object.
- The feature point generation unit 32 receives the destination coordinates and the current position of the vehicle from the navigation unit 10 in real time, and determines whether the current position of the vehicle is within the set distance from the destination coordinates (S40).
- When the current position of the vehicle is within the set distance, the feature point generation unit 32 controls the photographing unit 31 to photograph the area in front of the vehicle and extracts feature points from the actually captured image (S50).
- The destination coordinate detection unit 41 matches the detected feature map with the feature points, determines whether they correspond, identifies the destination in the actually captured image, and detects the destination coordinates in that image (S60).
- The coordinate conversion unit 42 converts the destination coordinates of the street image actually captured by the photographing unit 31 into HUD coordinates (S70).
- As the destination coordinates are converted into HUD coordinates, the head-up display 50 displays a destination indicator for guiding to the destination according to those HUD coordinates (S80).
- Meanwhile, the current position correction unit 60 calculates the distance to the destination corresponding to the destination coordinates detected in the actually captured image, newly detects the current position of the vehicle by applying that distance to the map coordinates of the destination, and then inputs the newly detected current position to the navigation unit 10.
- The navigation unit 10 corrects the current position of the vehicle to the current position input from the current position correction unit 60, and can guide the route to the destination more accurately based on the corrected current position.
- The feature map detection unit 20 may also be implemented as an external server 200 that is separate from the navigation terminal 100.
- In the present embodiment, the navigation terminal 100 guides a route to an input destination, and the server 200 detects a feature map of feature points in the vicinity of the destination from a street image of the vicinity, using the destination input by the navigation terminal 100.
- The navigation terminal 100 detects feature points in the vicinity of the destination from the actually captured image, recognizes the destination through the feature map detected by the server 200 and the detected feature points, and then displays, through the head-up display, a destination indicator for guiding to the destination on the windshield glass.
- When the destination is input from the navigation terminal 100, the server 200 may generate a feature map of the feature points in the vicinity of the destination, or may detect the feature map by extracting, from pre-generated feature maps for a plurality of destinations, the feature map for the destination currently input from the navigation terminal 100.
- When generating the feature map, the server 200 searches for a street image of the vicinity of the destination according to the destination input by the navigation terminal 100, detects feature points near the destination in the retrieved street image, and generates the feature map using the detected feature points.
- The navigation terminal 100 may detect the destination coordinates by matching the feature map detected by the server 200 with the detected feature points, convert the detected destination coordinates into HUD coordinates, and transmit the converted coordinates to the head-up display, whereby a destination indicator for guiding to the destination is displayed on the windshield glass through the head-up display.
- The implementations described herein may be implemented, for example, as a method or process, an apparatus, a software program, a data stream, or a signal. Even if discussed only in the context of a single form of implementation (for example, only as a method), the discussed features may also be implemented in other forms (for example, an apparatus or a program).
- The apparatus may be implemented in suitable hardware, software, firmware, and the like.
- The method may be implemented in an apparatus such as a processor, which refers generally to a processing device including a computer, a microprocessor, an integrated circuit, or a programmable logic device.
- Processors also include communication devices, such as computers, cell phones, personal digital assistants ("PDAs"), and other devices that facilitate communication of information between end users.
- the destination guidance apparatus and method according to the embodiment of the present invention precisely indicate the destination through the head-up display, thereby preventing the user from having to search for the destination in its vicinity or from driving past it.
Abstract
The present invention relates to a destination guidance apparatus and method. A destination guidance apparatus according to the present invention comprises: a navigation unit for receiving destination information as input and providing route guidance to a destination; a feature map detection unit for detecting a feature map of feature points in the vicinity of the destination from a street image in the vicinity of the destination, using the destination information input to the navigation unit; a feature point detection unit for detecting a feature point in the vicinity of the destination from an actually captured image; a destination recognition unit for recognizing the destination via the feature map and the feature point detected by the feature map detection unit and the feature point detection unit, respectively; and a head-up display for, when the destination has been recognized by the destination recognition unit, displaying a destination indicator, used for destination guidance, for the destination on a windshield glass.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020170182196A KR102541069B1 (ko) | 2017-12-28 | 2017-12-28 | Destination guidance apparatus and method |
KR10-2017-0182196 | 2017-12-28 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2019132504A1 true WO2019132504A1 (fr) | 2019-07-04 |
Family
ID=67067860
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2018/016645 WO2019132504A1 (fr) | 2017-12-28 | 2018-12-26 | Appareil et procédé de guidage de destination |
Country Status (2)
Country | Link |
---|---|
KR (1) | KR102541069B1 (fr) |
WO (1) | WO2019132504A1 (fr) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112577524A (zh) * | 2020-12-16 | 2021-03-30 | 北京百度网讯科技有限公司 | Information correction method and apparatus |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102349652B1 (ko) * | 2020-03-26 | 2022-01-12 | 주식회사 라이드플럭스 | Method, apparatus and computer program for providing driving guidance using vehicle location information and traffic light information |
KR102427713B1 (ko) * | 2020-06-03 | 2022-08-03 | 홍기방 | Method for a service of receiving products reserved and ordered at an offline store |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20120067479A (ko) * | 2010-12-16 | 2012-06-26 | 에스케이플래닛 주식회사 | Route guidance system and method using captured images |
KR20140064424A (ko) * | 2012-11-20 | 2014-05-28 | 엘지전자 주식회사 | Apparatus and method for recognizing vehicle position |
KR20150034997A (ko) * | 2013-09-27 | 2015-04-06 | 네이버 주식회사 | Method and system for clearly recognizing a destination according to route guidance |
KR20170016203A (ko) * | 2015-08-03 | 2017-02-13 | 현대모비스 주식회사 | Route guidance apparatus and control method thereof |
KR20170105165A (ko) * | 2016-03-08 | 2017-09-19 | 주식회사 비에스피 | Driving route confirmation apparatus and method using road view and captured images |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20150066036A (ko) * | 2013-12-06 | 2015-06-16 | 이동녀 | Location guidance system using transparent navigation and method thereof |
KR101885356B1 (ko) * | 2016-05-04 | 2018-08-03 | 임재형 | Apparatus for measuring position information of an object and control method thereof |
2017
- 2017-12-28 KR KR1020170182196A patent/KR102541069B1/ko active IP Right Grant

2018
- 2018-12-26 WO PCT/KR2018/016645 patent/WO2019132504A1/fr active Application Filing
Also Published As
Publication number | Publication date |
---|---|
KR20190080030A (ko) | 2019-07-08 |
KR102541069B1 (ko) | 2023-06-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2019132504A1 (fr) | Destination guidance apparatus and method | |
WO2011055978A2 (fr) | User terminal, position providing method, and route guidance method thereof | |
WO2013018962A1 (fr) | Traffic lane recognition apparatus and method thereof | |
WO2017131334A1 (fr) | System and method for mobile robot location recognition and map building | |
WO2019054593A1 (fr) | Map production apparatus using machine learning and image processing | |
WO2021241847A1 (fr) | Method and system for generating a visual feature map | |
WO2018230845A1 (fr) | Positioning method based on visual information, and robot for implementing such a positioning method | |
EP2471054A2 (fr) | Method for providing information on a vehicle, and terminal device for implementing the method | |
WO2020159076A1 (fr) | Landmark location estimation apparatus and method, and computer-readable recording medium storing a computer program programmed to perform the method | |
CN110998684B (zh) | Image collection system, image collection method, image collection device, and recording medium | |
WO2020067751A1 (fr) | Device and method for data fusion between heterogeneous sensors | |
WO2019240340A1 (fr) | Speeding indicator device for measuring the speed of a vehicle using a camera, and operating method therefor | |
WO2020075954A1 (fr) | Positioning system and method using a combination of multimodal sensor-based location recognition results | |
WO2015064892A1 (fr) | Method for highlighting a navigation user's point of interest for each time period, and route guidance server | |
WO2021221334A1 (fr) | Device for generating a color palette based on GPS information and a LiDAR signal, and control method therefor | |
WO2018021870A1 (fr) | Navigation system and location correction method of the navigation system | |
WO2013022153A1 (fr) | Lane detection apparatus and method | |
WO2021194109A1 (fr) | Method, device and computer program for providing traffic guidance using vehicle position information and traffic light information | |
WO2015108401A1 (fr) | Portable device and control method using a plurality of cameras | |
KR20140064424A (ko) | Apparatus and method for recognizing vehicle position | |
WO2020071573A1 (fr) | Location information system using deep learning and method for obtaining same | |
WO2020209551A1 (fr) | Portable apparatus for measuring air quality and method for displaying air quality information | |
WO2016003023A1 (fr) | Route search method, device and server | |
WO2013022159A1 (fr) | Traffic lane recognition apparatus and method thereof | |
WO2022231316A1 (fr) | Unmanned parking management system for automatically correcting camera angle changes, and method therefor | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 18893721 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 18893721 Country of ref document: EP Kind code of ref document: A1 |