KR20120070887A - Electronic device and course guide method of electronic device - Google Patents


Info

Publication number
KR20120070887A
Authority
KR
South Korea
Prior art keywords
vehicle
route
lane
driving
driving direction
Prior art date
Application number
KR1020100132405A
Other languages
Korean (ko)
Inventor
김진영
우승완
이해일
조창빈
Original Assignee
팅크웨어(주)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 팅크웨어(주) filed Critical 팅크웨어(주)
Priority to KR1020100132405A
Publication of KR20120070887A


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/3407 Route searching; Route guidance specially adapted for specific applications
    • G01C21/3415 Dynamic re-routing, e.g. recalculating the route when the user deviates from the calculated route or after detecting real-time traffic data or accidents
    • G01C21/3446 Details of route searching algorithms, e.g. Dijkstra, A*, arc-flags, using precalculated routes
    • G01C21/36 Input/output arrangements for on-board computers
    • G01C21/3605 Destination input or retrieval
    • G01C21/3626 Details of the output of route guidance instructions
    • G01C21/3632 Guidance using simplified or iconic instructions, e.g. using arrows
    • G01C21/3658 Lane guidance
    • G01C21/3679 Retrieval, searching and output of POI information, e.g. hotels, restaurants, shops, filling stations, parking facilities
    • G01C21/3682 Output of POI information on a road map
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/09 Arrangements for giving variable traffic instructions
    • G08G1/0962 Arrangements having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0968 Systems involving transmission of navigation instructions to the vehicle

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Navigation (AREA)

Abstract

An electronic device and a method for guiding a path of the electronic device are disclosed.
According to the present invention, a navigation system determines whether to re-search a route when the vehicle departs from the route, based on route information corresponding to the movement route to a destination and the driving direction of the vehicle obtained from a driving image of the vehicle.

Description

ELECTRONIC DEVICE AND COURSE GUIDE METHOD OF ELECTRONIC DEVICE

The present invention relates to an electronic device and a route guidance method for the electronic device.

With the expansion of Internet access and the revision of laws relating to location data, industries based on location-based services (LBS) are growing. A representative device using such services is the vehicle navigation device, which provides a route guidance service that locates the current position of a vehicle and guides the user along a moving route to a destination.

In addition, objective data are increasingly needed to determine the degree of fault in accidents that occur while a vehicle is stopped or driving. Accordingly, vehicle black boxes capable of providing such objective data are coming into use.

Recently, various studies have been actively conducted on using the vehicle driving images acquired through a vehicle black box to provide route guidance services in a vehicle navigation device.

The present invention provides an electronic device and a route guidance method of the electronic device that minimize the time delay until a route re-search is performed.

In one aspect of the present invention for realizing the above object, an electronic device includes: a position data module for obtaining position data of a vehicle; and a control unit that obtains a driving direction of the vehicle based on at least one object included in a driving image of the vehicle, and determines whether the vehicle has deviated from its route based on the driving direction and the route information generated from the position data.

In another aspect of the present invention for realizing the above object, a system includes: a first electronic device that obtains a driving image of a vehicle; and a second electronic device that performs a route search for the vehicle, obtains a driving direction of the vehicle based on at least one object extracted from the driving image, and determines whether the vehicle has left its route based on the route information obtained through the route search and the driving direction of the vehicle.

In another aspect of the present invention for realizing the above object, a route guidance method of an electronic device comprises: generating route information to a destination using position data of a vehicle; obtaining a driving image of the vehicle; extracting at least one object displayed on the road from the driving image; obtaining a driving direction of the vehicle based on the extracted object; and determining whether the vehicle has deviated from the route by comparing the route information with the driving direction of the vehicle.
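The specification does not give a concrete algorithm for this comparison, but the idea can be sketched as follows: derive the bearing of the current route segment from consecutive route points, then compare it against the driving direction obtained from the image. The function names and the 45-degree tolerance below are illustrative assumptions, not part of the patent.

```python
import math

def bearing_deg(p1, p2):
    """Approximate bearing in degrees (0 = north) from fix p1 to fix p2,
    each given as (lat, lon) in degrees."""
    dlat = p2[0] - p1[0]
    # scale the longitude difference by cos(mean latitude)
    dlon = (p2[1] - p1[1]) * math.cos(math.radians((p1[0] + p2[0]) / 2))
    return math.degrees(math.atan2(dlon, dlat)) % 360

def is_off_route(route_bearing_deg, driving_direction_deg, tolerance_deg=45.0):
    """Flag a deviation when the image-derived driving direction differs
    from the bearing of the current route segment by more than the
    tolerance (all angles in degrees)."""
    diff = abs(route_bearing_deg - driving_direction_deg) % 360
    diff = min(diff, 360 - diff)  # smallest angular difference
    return diff > tolerance_deg
```

For example, if the current route segment runs east (bearing 90) while the lane markings indicate the vehicle is heading north, the check reports a deviation and a route re-search can be triggered without waiting for further position fixes.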

In another aspect of the present invention for realizing the above object, a computer-readable recording medium stores a program that performs any one of the above methods.

According to the present invention, when the vehicle deviates from its route, the time delay until the route is re-searched is minimized.

FIG. 1 is a diagram illustrating an example of a system for implementing a route guidance method according to an embodiment of the present invention.
FIG. 2 is a diagram illustrating another example of a system for implementing a route guidance method according to an embodiment of the present invention.
FIG. 3 is a structural diagram showing an example of a vehicle navigation apparatus according to an embodiment of the present invention.
FIG. 4 is a structural diagram showing an example of a vehicle black box according to an embodiment of the present invention.
FIG. 5 is a block diagram illustrating an example of a communication network including a navigation system according to an embodiment of the present invention.
FIG. 6 is a flowchart illustrating a route guidance method of the navigation system 10 according to an exemplary embodiment.
FIG. 7 illustrates an example of selecting, from an image, the lane display line associated with the lane in which a vehicle is driving, in the navigation system 10 according to an exemplary embodiment.
FIG. 8 illustrates an example of an object displayed on a road.
FIG. 9 is a flowchart illustrating an example of obtaining object information using a candidate object database in the navigation system 10 according to an embodiment of the present invention.
FIG. 10 is a flowchart illustrating another example of obtaining object information using a candidate object database in the navigation system 10 according to an exemplary embodiment.
FIG. 11 is a flowchart illustrating an example of determining whether to re-search a route based on the driving direction of the lane in which a vehicle is driving.
FIG. 12 is a flowchart illustrating another example of determining whether to re-search a route based on the driving direction of the lane in which a vehicle is driving.

The above objects, features, and advantages of the present invention will become more apparent from the following detailed description taken in conjunction with the accompanying drawings. The invention is not, however, limited to the disclosed embodiments; it is intended to cover various modifications and equivalents. Like reference numerals designate like elements throughout the specification. Detailed descriptions of known functions or configurations are omitted where they would unnecessarily obscure the subject matter of the present invention. Numerals used in the description (e.g., first, second, etc.) are merely identifiers for distinguishing one component from another.

When an element is referred to as being "connected" or "coupled" to another element, it may be directly connected or coupled to that element, or intervening elements may be present. In contrast, when an element is referred to as being "directly connected" or "directly coupled" to another element, it should be understood that no intervening elements are present.

In addition, the suffixes "module" and "unit" for components used in the following description are given or used interchangeably for ease of description only, and do not by themselves carry distinct meanings or roles.

Hereinafter, the present invention will be described in detail with reference to the accompanying drawings.

FIG. 1 is a diagram illustrating an example of a system for implementing a route guidance method according to an embodiment of the present invention.

Referring to FIG. 1, the system 10 may include a first electronic device 100 that informs drivers and passengers of various data related to the driving and maintenance of a vehicle, and a second electronic device 200 that records data related to the driving of the vehicle.

In the following, an example is described in which the system 10 implementing the route guidance method is a 'navigation system' and the first electronic device 100 and the second electronic device 200 are a 'vehicle navigation apparatus' and a 'vehicle black box', respectively. The present invention is not limited thereto; other types of electronic devices may be used to implement the route guidance method. For example, the first electronic device 100 or the second electronic device 200 may be implemented as a mobile communication terminal, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), or the like. Likewise, although the embodiments describe the vehicle navigation apparatus 100 and the vehicle black box 200 as devices provided separately from each other, the present invention is not limited thereto, and the two may also be integrated into a single device.

The vehicle navigation apparatus 100 may include a display unit 145 provided on the front surface of the navigation housing 191, a navigation operation key 193, and a navigation microphone 195.

The navigation housing 191 forms the exterior of the vehicle navigation apparatus 100. The apparatus may be exposed to various environmental conditions, such as high or low temperatures due to seasonal factors or direct and indirect external shocks. The navigation housing 191 protects the electronic components inside the apparatus from changes in the external environment and enhances its appearance. To this end, the housing may be injection-molded from a material such as ABS, PC, or strength-reinforced engineering plastic.

The display unit 145 visually displays various data, such as map data combined with route data, broadcast screens including DMB broadcasts, or images stored in memory. The display unit 145 may be divided physically or logically. A physically divided display consists of two or more adjacent display units 145; a logically divided display shows a plurality of independent screens on one physical display unit 145. For example, while a DMB broadcast is received and displayed, route data may be shown in one area of the display unit 145, or the DMB broadcast and a map screen may be shown in separate areas. As more functions converge on the vehicle navigation apparatus 100, the display 145 is increasingly divided logically to display various data, and display units are gradually becoming larger.

All or part of the display unit 145 may be a touch screen that receives a user's touch input. For example, a function may be activated by touching its selection button displayed on the display unit 145. That is, the display unit 145 may serve as both an image output unit (140 of FIG. 3) and an input unit (120 of FIG. 3).

The navigation operation key 193 may be provided for executing various functions of the vehicle navigation apparatus 100 or for allowing a user to directly input necessary data. The convenience of use can be improved by mapping a frequently used specific function to the navigation operation key 193.

The navigation microphone 195 may be provided to receive a sound including voice and sound. For example, the specific function of the vehicle navigation apparatus 100 may be executed based on the voice signal received by the navigation microphone 195. In addition, the current state of the vehicle, such as the occurrence of an accident, may be detected based on the acoustic signal received by the navigation microphone 195.

The vehicle black box 200 may exchange data with the vehicle navigation apparatus 100 and store data needed for the accident-handling process of the vehicle. For example, when an accident occurs while driving, the images acquired and stored in the vehicle black box 200 may be analyzed to determine the extent of the accident and the degree of fault. In addition, a vehicle black box 200 connected to the vehicle navigation apparatus 100 may make use of various data stored in the apparatus; for example, images obtained by the black box may be mapped to the map data stored in the vehicle navigation apparatus 100, increasing the utility of the black box.

The vehicle black box 200 may acquire vehicle data both while driving and while stopped; that is, it can capture images not only while the vehicle is moving but also when it is parked. The image quality of the acquired video may be constant or variable. For example, the recording quality may be raised before and after an accident and kept lower in normal operation, minimizing the storage space required while preserving the critical footage.
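A rough sketch of such a quality-switching policy follows; the 30-second window, class name, and trigger interface are assumptions for illustration, not details from the patent.

```python
class DashcamRecorder:
    """Record at low quality normally; switch to high quality for a
    window around a detected impact to preserve the critical footage."""

    HIGH_QUALITY_WINDOW_S = 30  # assumed protection window around an impact

    def __init__(self):
        self.last_impact_t = None

    def on_impact(self, t):
        """Called when a sensor (e.g. accelerometer) detects an impact at time t."""
        self.last_impact_t = t

    def quality_at(self, t):
        """Quality level to use for the frame captured at time t (seconds)."""
        if (self.last_impact_t is not None
                and abs(t - self.last_impact_t) <= self.HIGH_QUALITY_WINDOW_S):
            return "high"
        return "low"
```

Recording at low quality by default and promoting only the window around an event is what lets a fixed-size storage medium hold hours of footage without losing the frames that matter for fault determination.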

The vehicle black box 200 may include a black box camera 222, a black box microphone 224, and an attachment part 281.

The black box camera 222 may photograph the exterior or interior of the vehicle, and may be provided singly or in plural. When there are multiple black box cameras 222, one may be integrated with the vehicle black box 200 while the others, attached to various parts of the vehicle, transmit their captured images to it. When there is a single black box camera 222, it may be installed to photograph the front of the vehicle. Images photographed by the black box camera 222 may be stored in the vehicle black box 200 or the vehicle navigation apparatus 100.

The black box microphone 224 may acquire sound generated inside and outside the vehicle. The black box microphone 224 may perform a function similar to the navigation microphone 195 described above.

The attachment part 281 fixes the vehicle black box 200 to the vehicle. It may be a suction plate for attaching the black box to the windshield, or a fixing device that couples to the rear-view mirror of the vehicle.

FIG. 2 is a diagram illustrating another example of a system for implementing a route guidance method according to an embodiment of the present invention.

Referring to FIG. 2, the vehicle navigation apparatus 100 and the vehicle black box 200 constituting the navigation system 10 may be connected wirelessly. That is, the two are separate devices with no physical connection between them, and may communicate using a scheme such as Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra WideBand (UWB), or ZigBee.

FIG. 3 is a structural diagram of a vehicle navigation apparatus according to an embodiment of the present invention.

Referring to FIG. 3, the vehicle navigation apparatus 100 may include a first communication unit 110, a first input unit 120, a first sensing unit 130, an output unit 140, a first storage unit 150, a first power supply unit 160, and a first control unit 170. The components shown in FIG. 3 are not all essential; an electronic device having more or fewer components may be implemented.

Hereinafter, the components will be described in order.

The first communication unit 110 may include one or more modules that enable communication between the vehicle navigation apparatus 100 and a communication system, between the apparatus and the network in which it is located, or between the apparatus and another electronic device. For example, the first communication unit 110 may include a first location data module 111, a first wireless internet module 113, a broadcast transmission/reception module 115, a first short-range communication module 117, and a first wired communication module 119.

The first position data module 111 checks or obtains position data of the vehicle navigation apparatus 100, for example through a global navigation satellite system (GNSS). GNSS refers to a navigation system that calculates the position of a receiving terminal using radio signals received from satellites (20 of FIG. 5). Specific examples of GNSS, depending on the operating entity, include the Global Positioning System (GPS), Galileo, the Global Orbiting Navigational Satellite System (GLONASS), COMPASS, the Indian Regional Navigational Satellite System (IRNSS), and the Quasi-Zenith Satellite System (QZSS). The first position data module 111 may obtain location data by receiving the GNSS signal serving the area in which the apparatus is used, continuously calculates the current position of the apparatus in real time, and calculates speed information from it.
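Deriving speed from consecutive position fixes, as the paragraph above describes, can be sketched minimally as follows; the representation of a fix as ((lat, lon), timestamp_s) is an assumption for illustration.

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in metres

def haversine_m(p1, p2):
    """Great-circle distance in metres between two (lat, lon) fixes."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p1, *p2))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def speed_mps(fix_a, fix_b):
    """Speed from two consecutive fixes, each given as ((lat, lon), timestamp_s)."""
    (pos_a, t_a), (pos_b, t_b) = fix_a, fix_b
    dt = t_b - t_a
    return haversine_m(pos_a, pos_b) / dt if dt > 0 else 0.0
```

In practice a navigation receiver would smooth this estimate over several fixes (or use Doppler measurements), since single-fix noise of a few metres translates directly into speed error.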

The first wireless internet module 113 accesses the wireless Internet to acquire or transmit data. Wireless Internet technologies accessible through the first wireless internet module 113 include wireless LAN (WLAN), Wireless Broadband (WiBro), World Interoperability for Microwave Access (WiMAX), and High Speed Downlink Packet Access (HSDPA).

The broadcast transmission/reception module 115 receives broadcast signals through various broadcast systems, such as Digital Multimedia Broadcasting-Terrestrial (DMB-T), Digital Multimedia Broadcasting-Satellite (DMB-S), Media Forward Link Only (MediaFLO), Digital Video Broadcast-Handheld (DVB-H), and Integrated Services Digital Broadcast-Terrestrial (ISDB-T). Broadcast signals received through this module may include traffic data, living data, and the like.

The first short range communication module 117 refers to a module for short range communication. Bluetooth, radio frequency identification (RFID), infrared data association (IrDA), ultra wideband (UWB), ZigBee, and the like can be used as the short distance communication technology.

The first wired communication module 119 serves to provide an interface with another electronic device connected to the vehicle navigation apparatus 100. For example, the first wired communication module 119 may be a USB module capable of communicating through a USB port.

The first input unit 120 is a module for generating input data for controlling the operation of the vehicle navigation apparatus 100. The first input unit 120 may generate input data by converting a physical input from the outside into a specific electrical signal. The first input unit 120 may include a first user input module 121, a first microphone 123, and the like.

The first user input module 121 receives a control input for controlling the operation of the vehicle navigation apparatus 100 from the user. It may include a keypad, a dome switch, a touch pad (resistive/capacitive), a jog wheel, a jog switch, and the like. For example, the first user input module 121 may be implemented as the navigation operation key (193 of FIG. 1) provided on the outside of the housing (191 of FIG. 1) of the vehicle navigation apparatus 100.

The first microphone 123 is a device that receives a user's voice and sound generated in and out of the vehicle. The first microphone 123 may be implemented as a navigation microphone 195 provided outside the housing (191 of FIG. 1) of the vehicle navigation apparatus 100.

The first sensing unit 130 detects the current state of the vehicle navigation apparatus 100 and generates a sensing signal for controlling the operation of the vehicle navigation apparatus 100. The first sensing unit 130 may include a first motion sensing module 131, an optical sensing module 133, and the like.

The first motion sensing module 131 may detect a motion in a three-dimensional space of the vehicle navigation apparatus 100. The first motion sensing module 131 may include a three-axis geomagnetic sensor and a three-axis acceleration sensor. The motion data obtained through the first motion sensing module 131 may be combined with the position data obtained through the position data module 111 to calculate a more accurate trajectory of the vehicle to which the vehicle navigation apparatus 100 is attached.
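One simple way to combine motion data with position data, consistent with the trajectory refinement described above, is dead reckoning: advancing the last GNSS fix using the heading and speed implied by the motion sensors, e.g. during brief satellite outages in tunnels. The flat-earth metres-per-degree constant and the function interface are assumptions for illustration.

```python
import math

M_PER_DEG = 111_320.0  # approximate metres per degree of latitude

def dead_reckon(last_fix, heading_deg, speed_mps, dt_s):
    """Advance the last (lat, lon) GNSS fix by dt_s seconds of travel at the
    given speed (m/s) and heading (degrees, 0 = north). Uses a flat-earth
    approximation, which is adequate only for short gaps between fixes."""
    lat, lon = last_fix
    d = speed_mps * dt_s  # metres travelled since the last fix
    dlat = d * math.cos(math.radians(heading_deg)) / M_PER_DEG
    dlon = (d * math.sin(math.radians(heading_deg))
            / (M_PER_DEG * math.cos(math.radians(lat))))
    return lat + dlat, lon + dlon
```

A production system would instead fuse the two sources continuously (e.g. with a Kalman filter), but the sketch shows why the combined trajectory is more accurate than either source alone: the motion sensors fill the gaps between, and smooth the noise in, the satellite fixes.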

The light sensing module 133 is a device for measuring the ambient illuminance of the vehicle navigation apparatus 100. The brightness of the display unit 145 may be changed to correspond to the ambient brightness by using the illumination data acquired through the light sensing module 133.
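The illuminance-to-brightness mapping might look like the following sketch; the lux ceiling and percentage range are assumed values, not specified in the patent.

```python
def display_brightness(lux, min_pct=20, max_pct=100, max_lux=1000):
    """Map ambient illuminance (lux) to a display brightness percentage,
    clamped so the screen never goes fully dark at night and does not
    exceed full brightness in direct sunlight."""
    frac = min(max(lux, 0), max_lux) / max_lux  # clamp to [0, 1]
    return round(min_pct + frac * (max_pct - min_pct))
```

Linear mapping with clamping is the simplest choice; real devices often add hysteresis so the backlight does not flicker when the ambient level hovers near a threshold.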

The output unit 140 is a device in which the vehicle navigation apparatus 100 outputs data. The output unit 140 may include a display module 141, an audio output module 143, and the like.

The display module 141 is a device that outputs data that can be visually recognized by the vehicle navigation apparatus 100. The display module 141 may be implemented as a display unit 145 of FIG. 1 provided on the front surface of the housing (191 of FIG. 1) of the vehicle navigation apparatus 100. Meanwhile, as described above, when the display module 141 is a touch screen, the display module 141 may serve as the output unit 140 and the first input unit 120 at the same time.

The audio output module 143 outputs audio data that can be recognized acoustically. The audio output module 143 outputs an audio signal related to a function (eg, a route guidance function) performed in the vehicle navigation apparatus 100. The audio output module 143 may include a receiver, a speaker, a buzzer, and the like.

The first storage unit 150 may store programs for the operation of the vehicle navigation apparatus 100 and temporarily store input/output data (route data, image data, and the like). It may also store the map data used for the route guidance service. The first storage unit 150 may be embedded in or detachable from the apparatus, and may include at least one type of storage medium: flash memory, hard disk, multimedia card micro, card-type memory (e.g., SD or XD memory), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, magnetic disk, or optical disk. The vehicle navigation apparatus 100 may also operate in association with web storage that performs the storage function of the first storage unit 150 on the Internet.

The first power supply unit 160 receives external power and internal power to supply power required for the operation of each component of the vehicle navigation apparatus 100 or another device connected to the vehicle navigation apparatus 100.

The first controller 170 typically controls the overall operation of the vehicle navigation apparatus 100. In addition, the first controller 170 may output a control signal for controlling another device connected to the vehicle navigation apparatus 100.

FIG. 4 is a structural diagram of a vehicle black box according to an embodiment of the present invention.

Referring to FIG. 4, the vehicle black box 200 may include a second communication unit 210, a second input unit 220, a second sensing unit 230, a second storage unit 250, a second power supply unit 260, and a second control unit 270. The components shown in FIG. 4 are not all essential; an electronic device having more or fewer components may be implemented.

Hereinafter, the components will be described in order.

The second communication unit 210 may include one or more modules that enable communication between the vehicle black box 200 and a communication system, between the black box and the network in which it is located, or between the black box and another electronic device (for example, the vehicle navigation apparatus 100). For example, the second communication unit 210 may include a second location data module 211, a second wireless internet module 213, a second short-range communication module 217, and a second wired communication module 219. These perform operations similar to the first location data module 111, the first wireless internet module 113, the first short-range communication module 117, and the first wired communication module 119, respectively, so detailed descriptions are omitted below.

The second input unit 220 may generate input data by converting an input from the outside of the vehicle black box 200 into a specific electric signal. The second input unit 220 may include a second user input module 221, a second microphone 223, a camera 225, and the like.

The second user input module 221 may receive a control input for controlling the operation of the vehicle black box 200 from the user. It may consist of a keypad, a dome switch, a touch pad (resistive/capacitive), a jog wheel, a jog switch, a push button, and the like. The second user input module 221 is not essential; when the vehicle black box 200 is controlled through the vehicle navigation apparatus 100, it may be omitted from the black box.

The second microphone 223 generates audio data by converting an audio signal received from the outside into a specific electric signal. For example, the second microphone 223 may acquire audio data related to driving of the vehicle. The second microphone 223 may correspond to the microphone 224 of the vehicle black box 200 shown in FIG. 1.

The camera 225 generates image data by converting an image signal received from the outside into a specific electric signal. For example, the camera 225 may acquire image data of the inside and outside of the vehicle related to the driving of the vehicle. The camera 225 may correspond to the camera 222 of the vehicle black box 200 shown in FIG. 1.

The second sensing unit 230 detects the current state of the vehicle black box 200 and generates a sensing signal for controlling the operation of the vehicle black box 200. The second sensing unit 230 may include a second motion sensing module 231, and the second motion sensing module 231 performs an operation similar to that of the aforementioned first motion sensing module 131 of FIG. 3.

The second storage unit 250 may store a program for operating the vehicle black box 200, and may store data (image data, audio data, etc.) input/output in relation to the vehicle black box 200. For example, the second storage unit 250 may store audio data obtained through the second microphone 223 or image data obtained through the camera 225. The second storage unit 250 may be embedded in or detachable from the vehicle black box 200, and may include a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (for example, SD or XD memory), RAM, ROM, PROM, magnetic memory, a magnetic disk, or an optical disk. The vehicle black box 200 may also operate in association with a web storage that performs the storage function of the second storage unit 250 on the Internet.

The second power supply unit 260 receives external power and internal power to supply power necessary for the operation of each component of the vehicle black box 200 or another device connected to the vehicle black box 200.

The second control unit 270 typically controls the overall operation of the vehicle black box 200. The second control unit 270 may be controlled by a control signal from the first control unit 170 of the vehicle navigation apparatus 100 of FIG. 3. That is, the second control unit 270 may be dependent on the first control unit 170.

FIG. 5 is a configuration diagram of a communication network including a navigation system according to embodiments of the present invention.

Referring to FIG. 5, the navigation system 10 may be connected to various communication networks and other electronic devices 61 to 64.

The navigation system 10 can calculate the current position using radio wave signals received from the satellites 20. Each satellite 20 may transmit an L-band frequency in a different frequency band. The navigation system 10 may calculate the current position based on the time it takes for the L-band frequency transmitted from each satellite 20 to reach the navigation system 10.
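The position calculation described above can be illustrated with a minimal trilateration sketch. This is an assumption-laden toy, not the system's actual algorithm: it uses an idealized 2-D geometry, perfectly synchronized clocks, and made-up satellite coordinates (a real GPS receiver solves the 3-D problem with a receiver clock-bias term over four or more satellites).

```python
# Toy 2-D trilateration from signal travel times. All coordinates and
# the geometry are illustrative assumptions, not real GPS processing.

C = 299_792_458.0  # speed of light in m/s

def trilaterate(sats, travel_times):
    """Solve for (x, y) from three satellites using the range
    equations r_i^2 = (x - x_i)^2 + (y - y_i)^2."""
    r1, r2, r3 = (C * t for t in travel_times)
    (x1, y1), (x2, y2), (x3, y3) = sats
    # Subtracting equation 1 from equations 2 and 3 cancels the
    # quadratic terms, leaving two linear equations in x and y.
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1  # Cramer's rule for the 2x2 system
    return (c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det

# Synthetic check: travel times generated from a known true position.
sats = [(0.0, 0.0), (20000.0, 0.0), (0.0, 20000.0)]
true_pos = (3000.0, 4000.0)
times = [((sx - true_pos[0])**2 + (sy - true_pos[1])**2) ** 0.5 / C
         for sx, sy in sats]
est = trilaterate(sats, times)
```

With exact travel times the linear solve recovers the true position up to floating-point error; real pseudoranges additionally carry clock and atmospheric errors.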

The navigation system 10 can connect wirelessly to the network 30 via the control station (ACR) 40, the base station (RAS) 50, or the like. When the navigation system 10 is connected to the network 30, it can exchange data with the electronic devices 61 and 62 connected to the network 30.

The navigation system 10 may also connect to the network 30 indirectly through another device 63 having a communication function. For example, when the navigation system 10 is not provided with a module capable of connecting to the network 30, it can communicate with the other device 63 using short-range communication.

Embodiments disclosed in this document may be implemented in the navigation system 10 described with reference to FIGS. 1 to 5. In the following description, a line drawn at regular intervals along the driving direction to distinguish lanes on a road is referred to as a lane display line.

Hereinafter, a method of guiding a route according to an embodiment of the present invention and an operation of a navigation system for implementing the same will be described in detail with reference to the accompanying drawings.

FIG. 6 is a flowchart illustrating a route guidance method of the navigation system 10 according to an exemplary embodiment. FIGS. 7 to 13 are diagrams for describing the route guidance method of the navigation system 10 according to an exemplary embodiment.

Referring to FIG. 6, the first controller 170 of the vehicle navigation apparatus 100 receives a destination through the first input unit 120, acquires location data of the vehicle through the first position data module 111, and calculates route information by performing a route search for a moving route to the destination based on the position data of the vehicle and the map data stored in the first storage unit 150 (S101).

The first controller 170 is connected to the vehicle black box 200 through the first communication unit 110, and obtains an image related to driving of the vehicle from the vehicle black box 200 (S102). The vehicle black box 200 acquires the image related to driving of the vehicle on which the navigation system 10 is mounted in real time using at least one camera 225, and transmits the image to the vehicle navigation apparatus 100. Meanwhile, in one embodiment of the present invention, the vehicle navigation apparatus 100 and the vehicle black box 200 are described as separate devices communicating with each other through wired/wireless communication, but the present invention is not limited thereto; the vehicle navigation apparatus 100 and the vehicle black box 200 may be implemented as one integrated device. In addition, although one embodiment of the present invention is described taking the case of receiving the driving image of the vehicle from the vehicle black box 200 as an example, the present invention is not limited thereto; the image related to driving of the vehicle may be received directly from at least one external camera (not shown).

The first controller 170 of the vehicle navigation apparatus 100 obtains lane information by analyzing the image received from the vehicle black box 200 (S103). The lane information may be obtained based on the lane display lines, figures, text, and the like displayed on the road.

The lane information may include information related to the lane display lines for identifying lanes. The lane display line related information may include the display type (e.g., a straight line, a curve, a solid line, a dotted line, etc.), the color, and the like of the lane display line.

The first controller 170 may recognize the lane display lines displayed on the road from the driving image of the vehicle using various filtering algorithms. For example, the first controller 170 may extract the lane display lines dividing the lanes from the image by using an edge filter. When the edge filter is used, the first controller 170 searches for parts forming a boundary in the image by using the brightness values of the pixels included in the image, and recognizes the lane display lines based on the retrieved boundary parts. Meanwhile, recognizing lane display lines with an edge filter is only one example; the navigation system 10 disclosed in this document may extract the lane display lines included in the image using various filtering algorithms other than the edge filter. When a lane display line is extracted, the first controller 170 may acquire its display form, such as straight line, curve, solid line, or dotted line, based on the extracted outline of the lane display line. In addition, the first controller 170 may acquire color information of the lane display line through color recognition of the extracted lane display line.
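As a rough illustration of the boundary search described above, the following sketch applies a minimal 1-D edge filter to a single image row. The brightness values and threshold are assumptions for illustration; a real implementation would operate on full 2-D frames (e.g., with a Sobel or Canny kernel) rather than one scanline.

```python
# Minimal 1-D edge filter: mark pixels where brightness changes
# sharply between neighbors, i.e. where painted lane display lines
# meet the darker road surface. Values are illustrative.

def find_edges(row, threshold):
    """Return indices i where |row[i] - row[i-1]| >= threshold."""
    return [i for i in range(1, len(row))
            if abs(row[i] - row[i - 1]) >= threshold]

# Dark asphalt (~30) with a bright painted line (~200) at columns 5-7.
scanline = [30, 32, 31, 29, 33, 200, 205, 198, 31, 30, 28]
edges = find_edges(scanline, threshold=100)
```

The two reported indices bracket the painted stripe: one rising edge entering the line and one falling edge leaving it.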

When a plurality of lane display lines are extracted from the image, the first controller 170 may recognize at least one lane display line corresponding to the lane in which the vehicle is driving among the plurality of lane display lines. Here, the first controller 170 may use the display position of each extracted lane display line within the image to select the lane display lines corresponding to the lane in which the vehicle is driving. For example, when a plurality of lane display lines are detected in the image, the first controller 170 sets a reference line that vertically crosses the image and selects the lane display lines corresponding to the lane in which the vehicle is driving based on the distance between the reference line and each lane display line. That is, the lane display lines close to the reference line may be selected as the lane display lines associated with the lane in which the vehicle is driving. In addition, the first controller 170 may distinguish the left display line and the right display line of the lane in which the vehicle is driving based on which side of the reference line each lane display line is located.

FIG. 7 illustrates an example of selecting the lane display lines associated with the lane in which a vehicle is driving from an image. Referring to FIG. 7, the first controller 170 receives an image 7 related to driving of the vehicle from the vehicle black box 200 (S201), and detects lane display lines 7a, 7b, 7c, and 7d from the image 7 by using a filtering algorithm (S202). The first controller 170 may select the lane display lines 7b and 7c related to the lane in which the vehicle is driving, based on the position of each of the detected lane display lines 7a, 7b, 7c, and 7d within the image 7. To this end, the first controller 170 sets a reference line 7e that vertically traverses the image 7. The lane display line 7b closest to the reference line 7e among the lane display lines 7a and 7b positioned on the left side of the reference line 7e is recognized as the left lane display line of the lane in which the vehicle is traveling. Also, among the lane display lines 7c and 7d positioned on the right side of the reference line 7e, the lane display line 7c closest to the reference line 7e is recognized as the right lane display line of the lane in which the vehicle is traveling. Meanwhile, selecting the lane display lines based on the distance or relative position between the reference line and each lane display line is only one example, and the present invention is not limited thereto. The navigation system 10 disclosed in this document may select the lane display lines related to the lane in which the vehicle is driving using various other methods.
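The FIG. 7 selection rule can be sketched as follows, assuming each detected lane display line is reduced to a single horizontal image coordinate; the positions and the reference-line coordinate are illustrative values, not taken from the patent's figures.

```python
# Pick the driving lane's left/right display lines: the closest
# detected line on each side of a vertical reference line (here,
# an assumed image-center x coordinate).

def select_driving_lane_lines(line_positions, reference_x):
    left = [x for x in line_positions if x < reference_x]
    right = [x for x in line_positions if x >= reference_x]
    left_line = max(left) if left else None    # nearest on the left
    right_line = min(right) if right else None # nearest on the right
    return left_line, right_line

# Four detected lines (analogous to 7a..7d) at these x positions,
# with the reference line at x=320.
positions = [80, 240, 410, 580]
lane = select_driving_lane_lines(positions, reference_x=320)
```

Here the pair (240, 410) plays the role of lines 7b and 7c in FIG. 7: the innermost line on each side of the reference line.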

The lane information may include object information related to at least one object displayed on the road. Here, an object is distinguished from a lane display line and refers to a non-text figure (for example, an arrow) or text displayed on the road to provide information related to the corresponding road or lane. An object displayed on the road may provide information such as the driving direction of the corresponding lane (straight, left, right, u-turn, etc.), a waypoint or destination reachable from the corresponding lane, whether the road branches, or whether the lane is one-way. When the object is a figure, the first controller 170 may obtain object information corresponding to the figure based on its shape. For example, when an arrow-shaped figure is extracted, the driving direction corresponding to the direction of the arrow may be obtained as object information. When the object is text, the first controller 170 may obtain the text itself as object information. For example, if the text indicates a waypoint or destination of the lane, the waypoint or destination may be acquired as object information. Also, in the case of a one-way lane, object information indicating that the lane is one-way may be obtained by analyzing the text displayed on the lane. To obtain the object information, the first controller 170 may extract the road area from the image and extract, from the extracted road area, the objects distinguished from the lane display lines. Here, the first controller 170 may distinguish the lane display lines from the objects based on their positions in the image, their display forms, and the like.
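The figure/text split described above can be sketched as a small dispatch step. The shape labels, direction names, and dictionary layout are assumptions for illustration; the patent does not specify a data model for object information.

```python
# Map a recognized road marking to object information: arrow figures
# yield a lane driving direction, text markings are carried through
# as-is. All labels here are hypothetical.

def object_info(kind, value):
    if kind == "figure":
        arrow_directions = {
            "arrow_up": "straight",
            "arrow_left": "left",
            "arrow_right": "right",
            "arrow_uturn": "u-turn",
        }
        return {"type": "direction", "value": arrow_directions[value]}
    if kind == "text":
        # Text markings carry waypoint/destination or one-way info.
        return {"type": "text", "value": value}
    raise ValueError("unknown object kind")

info = object_info("figure", "arrow_left")
```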

FIG. 8 illustrates an example of objects displayed on a road. Referring to FIG. 8, the road included in the image 8 displays a first object 8b in the form of an arrow and a second object 8c in the form of text, in addition to the display line 8a for distinguishing lanes. Although FIG. 8 shows a case in which the non-text type object 8b and the text type object 8c are displayed together, the present invention is also applicable when only one of a non-text type object and a text type object is displayed on the road.

As with the lane recognition method described above, the first controller 170 may obtain the object information displayed on the road from the image by using various filtering algorithms. For example, the first controller 170 may recognize an object displayed on the road by using an edge filter. When the edge filter is used, the first controller 170 searches for boundary parts of the image by using the brightness values of the pixels included in the image, as when detecting the lane display lines, and extracts the object from the image based on the retrieved boundary parts. The object may be distinguished from the lane display lines based on its display type, display position, and the like. Meanwhile, recognizing an object with an edge filter is only one example; the navigation system 10 disclosed in this document may extract the objects included in the image using various filtering algorithms other than the edge filter. When an object is extracted, the first controller 170 may obtain object information including the display form of the object (for example, an arrow or text) and, when the object is text, the corresponding text, based on the extracted outline of the object.

Meanwhile, when a plurality of lanes are included in the image, the first controller 170 may recognize the lane in which the vehicle is driving among the plurality of lanes, and may recognize the objects displayed on that driving lane. As described above, when the first controller 170 recognizes the lane display lines of the lane in which the vehicle is driving, the first controller 170 may acquire the area of the driving lane from the image. That is, the area between the display lines of the lane in which the vehicle is traveling is recognized as the lane currently being driven. Based on this, an object displayed on the driving lane can be distinguished from an object displayed on a neighboring lane.

When extracting an object displayed on the road from an image, the vehicle navigation apparatus 100 may use previously stored candidate objects to increase the object recognition rate and increase the accuracy of the object information. In this case, the first storage unit 150 stores a candidate object database for managing candidate objects to be compared with the objects extracted from the image. In addition, object information corresponding to each candidate object included in the candidate object database may be stored in advance. Also, location data corresponding to each candidate object stored in the candidate object database may be stored.

FIG. 9 is a flowchart illustrating an example of obtaining object information using the candidate object database. Referring to FIG. 9, when an object is extracted from an image related to driving of the vehicle (S301), the first controller 170 performs a matching process between the extracted object and the candidate object database stored in the first storage unit 150 (S302). That is, among the candidate objects previously stored in the candidate object database, a candidate object matching the object extracted from the image by a predetermined ratio or more is searched for. When a matching candidate object is found, the first controller 170 obtains object information using the corresponding candidate object (S303). For example, the first controller 170 may read the object information corresponding to the matching candidate object from the first storage unit 150. Also, for example, the first controller 170 may acquire text information by recognizing text using the matching candidate object, or by analyzing the figure represented by the candidate object. Meanwhile, when no candidate object in the candidate object database matches the extracted object, the first controller 170 may ignore the extracted object. When all objects that can be displayed on a road are stored as candidate objects in the candidate object database, the first controller 170 may determine that an extracted object with no matching candidate is an incorrectly recognized object, and may ignore it or output guide information indicating that object recognition has failed.
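The S302/S303 matching step might look like the sketch below. The similarity measure (pixel-wise agreement between tiny binary masks), the 0.8 ratio, and the database entries are all assumptions; the patent only requires "matching a predetermined ratio or more".

```python
# Match an extracted object against a candidate object database and
# accept the best candidate whose similarity meets a minimum ratio.
# Masks, names, and the threshold are illustrative assumptions.

def similarity(a, b):
    """Fraction of positions where two equal-length masks agree."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def match_candidate(extracted, database, min_ratio=0.8):
    """Return (name, info) of the best match, or None when nothing
    reaches min_ratio (the extracted object is then ignored)."""
    best = max(database, key=lambda c: similarity(extracted, c["mask"]))
    if similarity(extracted, best["mask"]) >= min_ratio:
        return best["name"], best["info"]
    return None

db = [
    {"name": "arrow_left", "mask": [1, 1, 0, 0, 1, 1, 0, 0], "info": "left"},
    {"name": "arrow_up",   "mask": [0, 0, 1, 1, 0, 0, 1, 1], "info": "straight"},
]
hit = match_candidate([1, 1, 0, 0, 1, 1, 0, 1], db)  # one pixel differs
```

A 7/8 agreement clears the assumed 0.8 threshold, so the near-identical arrow is accepted; an object equally distant from all candidates falls below it and is discarded, matching the "ignore" branch of FIG. 9.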

FIG. 10 is a flowchart illustrating another example of obtaining object information using the candidate object database. Referring to FIG. 10, the first controller 170 extracts an object from an image related to driving of the vehicle (S401) and obtains the current position data of the vehicle through the first position data module 111 (S402). The first controller 170 then performs a matching process on the extracted object using the candidate object database stored in the first storage unit 150 (S403). Here, the first controller 170 selects only some of the candidate objects stored in the candidate object database for comparison, based on the current position of the vehicle. That is, only the candidate objects corresponding to the current position of the vehicle are selected for comparison. When a candidate object matching the extracted object is found among the candidate objects selected for comparison, the first controller 170 obtains object information using the corresponding candidate object (S404). Here, the method of acquiring the object information is similar to the method shown in FIG. 9 described above, and thus a detailed description is omitted.
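The position-based pre-filter of step S403 can be sketched as a radius query over the stored candidate locations. The coordinates, radius, and entry names are illustrative assumptions; real entries would carry map-matched positions alongside the masks used for matching.

```python
# Keep only the candidate objects registered near the vehicle's
# current position, shrinking the comparison set before matching.
# Coordinates and radius are illustrative.

def candidates_near(database, position, radius):
    px, py = position
    return [c for c in database
            if (c["pos"][0] - px) ** 2 + (c["pos"][1] - py) ** 2
            <= radius ** 2]

db = [
    {"name": "arrow_left_at_junction_a", "pos": (100.0, 200.0)},
    {"name": "oneway_text_downtown",     "pos": (5000.0, 9000.0)},
]
nearby = candidates_near(db, position=(110.0, 205.0), radius=50.0)
```

Only the nearby entry survives, so the subsequent matching step compares the extracted object against one candidate instead of the whole database.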

As shown in FIG. 9 and FIG. 10, the vehicle navigation apparatus 100 according to an embodiment of the present invention pre-stores the objects that can be displayed on a road as comparison targets and matches the objects extracted from an image against the previously stored candidate objects, thereby increasing the object recognition rate and improving the accuracy of the object information.

Referring back to FIG. 6, the first controller 170, having acquired the lane information, determines whether a route re-search is necessary using the lane information (S104). If it is determined that a route re-search is necessary (S105), the route search for calculating the moving route to the destination is performed again based on the position data of the vehicle (S106).

In operation S104, the first controller 170 may determine whether to re-search the route based on the object information obtained from the driving image of the vehicle and the route information of the vehicle. The first controller 170 generates the route information based on the moving route of the vehicle, which is calculated from the destination data input through the user input module 121 and the current position data of the vehicle acquired through the first position data module 111. In addition, the first controller 170 may extract the objects displayed on the road to guide the driving direction of each lane from the driving image of the vehicle, and analyze the extracted objects to obtain the driving direction of the lane in which the vehicle is driving. The first controller 170 may also predict the driving direction of the vehicle in an alley, at an intersection, at a branch point, and the like based on the route information. The first controller 170 determines whether the vehicle is currently driving in the correct lane by comparing the driving direction of the vehicle predicted from the route information with the driving direction of the lane in which the vehicle is currently driving. If the vehicle is not driving in the correct lane, the route to the destination is searched again.

FIG. 11 is a flowchart illustrating an example of determining whether to re-search the route based on the driving direction of the lane in which the vehicle is driving. Referring to FIG. 11, the first controller 170 compares the driving direction predicted using the route information with the driving direction of the lane in which the vehicle is traveling (S501). Based on this, it is determined whether the vehicle is driving in the correct lane, and if the vehicle is not driving in the correct lane (S502), the first controller 170 decides on a route re-search (S503). For example, when the vehicle must travel in a first direction from a branch point according to the route information but is currently traveling in a second direction from the branch point, the first controller 170 decides on a route re-search. In FIG. 11, the route re-search is decided immediately when the driving direction according to the route information differs from the driving direction of the currently driving lane, but the present invention is not limited thereto.
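The FIG. 11 flow reduces to a single comparison, sketched below with hypothetical direction labels (the patent does not define a direction encoding):

```python
# S501-S503: decide a route re-search as soon as the driving
# direction of the current lane differs from the direction
# predicted from the route information.

def needs_reroute(predicted_direction, lane_direction):
    return lane_direction != predicted_direction

decision = needs_reroute("first_direction", "second_direction")
```

This is the "immediate" variant; FIG. 12 replaces the instant decision with lane-change guidance plus a deadline.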

FIG. 12 is a flowchart illustrating another example of determining whether to re-search the route based on the driving direction of the lane in which the vehicle is driving. Referring to FIG. 12, the first controller 170 compares the driving direction predicted using the route information with the driving direction of the lane in which the vehicle is driving (S601). Based on this, it is determined whether the vehicle is driving in the correct lane, and if the vehicle is not driving in the correct lane (S602), the first controller 170 outputs information for guiding a lane change (S603). For example, when the driving direction predicted using the route information corresponds to a left turn at the next intersection but the vehicle is currently driving in a straight lane, the first controller 170 outputs information guiding a lane change to the left-turn lane. If the lane change to the correct lane is not performed within a preset time after the lane change information is output, or before the vehicle passes a specific area, the first controller 170 decides on a route re-search (S604). Here, the specific area may include an intersection, a branch point, and the like. Whether the lane has been changed may be determined based on the lane information acquired from the driving image of the vehicle.
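The deferred decision of FIG. 12 can be sketched as a small state function. The time limit, the return labels, and the boolean inputs are assumptions for illustration; in the actual flow, "in correct lane" would come from the image-based lane information and "passed area" from map matching.

```python
# After lane-change guidance (S603), only decide a route re-search
# (S604) if the vehicle has not reached the correct lane within a
# preset time or before passing a specific area (e.g. an
# intersection or branch point). Thresholds are illustrative.

def decide_after_guidance(in_correct_lane, elapsed_s, passed_area,
                          time_limit_s=10.0):
    if in_correct_lane:
        return "keep_route"      # lane change succeeded, no re-search
    if elapsed_s >= time_limit_s or passed_area:
        return "re-search"       # S604: deadline or area passed
    return "keep_guiding"        # keep outputting lane-change guidance

state = decide_after_guidance(in_correct_lane=False,
                              elapsed_s=12.0, passed_area=False)
```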

Meanwhile, when the actual road has a plurality of lanes with the same driving direction, the first controller 170 may determine in which of those lanes the vehicle is currently driving by using the color information of the lane display lines obtained from the image. The center line, which separates lanes traveling in opposite directions, is displayed in a color different from that of the other lane display lines. Accordingly, the first controller 170 recognizes, based on the color information obtained from the image, whether a display line of the currently driving lane is the center line or a line located away from the center line, and based on this can determine in which lane the vehicle is currently driving.

Referring back to FIG. 6, in step S106, the first controller 170 may re-search the route based on the driving direction of the lane in which the vehicle is currently driving. That is, when the driving direction of the lane in which the vehicle is currently traveling differs from the driving direction according to the route information, the first controller 170 recalculates the route information on the assumption that the vehicle continues to travel in the driving direction of the currently driving lane.

In general, when a path deviation of a vehicle is detected using only location data received from satellites, it takes a long time to detect the deviation because of delays in receiving the location data. However, according to the embodiments of the present invention described above, the navigation system 10 detects the driving direction of the vehicle in real time using the objects displayed on the road, can detect a route deviation based on the detected direction, and, when the vehicle is not traveling in the correct driving direction, can quickly decide whether to re-search the route.

Embodiments of the present invention include a computer-readable medium containing program instructions for performing various computer-implemented operations. The medium records a program for executing the route guidance method described above. The medium may include program instructions, data files, data structures, and the like, alone or in combination. Examples of such media include magnetic media such as hard disks, floppy disks, and magnetic tapes; optical recording media such as CDs and DVDs; and hardware devices specifically configured to store and execute program instructions, such as ROM, RAM, and flash memory. Examples of program instructions include machine code, such as that produced by a compiler, as well as high-level language code that can be executed by a computer using an interpreter or the like.

It is apparent to those skilled in the art that the present invention is not limited to the described embodiments, and that various modifications and changes can be made without departing from the spirit and scope of the present invention. Therefore, such modifications or variations should be regarded as falling within the scope of the claims of the present invention.

Claims (16)

An electronic device comprising:
a position data module for obtaining position data of a vehicle; and
a control unit configured to obtain a driving direction of the vehicle based on at least one object included in a driving image of the vehicle, and to determine whether the vehicle deviates from a route based on the driving direction of the vehicle and route information of the vehicle generated based on the position data.
The method of claim 1,
The controller may determine whether the vehicle deviates from the path by comparing the driving direction of the vehicle with the driving direction based on the route information.
The method of claim 1,
The controller may re-search for a route to a destination when the vehicle leaves the route according to the route information.
The method of claim 3,
The controller, when the vehicle deviates from the route according to the route information, re-searches the route to the destination based on the driving direction of the vehicle.
The method of claim 1,
The controller is configured to output guide information for inducing a lane change when the vehicle leaves the route according to the route information.
The method of claim 5,
The controller determines that the vehicle has deviated from the route according to the route information if the lane change to the lane of the driving direction according to the route information is not performed within a predetermined time after the guide information is output, or before the vehicle passes through a specific region.
The method of claim 1,
Further comprising a storage for storing a candidate object database for managing candidate objects,
The controller is further configured to obtain the object information by using at least one candidate object that matches the extracted at least one object among the candidate objects.
The method of claim 1,
And the at least one object is displayed on a road and includes at least one figure or at least one text indicating a driving direction of a corresponding lane.
The method of claim 1,
The driving image of the vehicle is received from the black box.
The method of claim 1,
The driving image of the vehicle is obtained from at least one camera.
A navigation system comprising:
a first electronic device for obtaining a driving image of a vehicle; and
a second electronic device configured to perform a route search for the vehicle, obtain a driving direction of the vehicle based on at least one object extracted from the driving image, and determine whether the vehicle deviates from the route based on route information obtained through the route search and the driving direction of the vehicle.
The method of claim 11,
And the second electronic device compares a driving direction of the vehicle with a driving direction based on the route information to determine whether the vehicle leaves the path.
The method of claim 11,
And the second electronic device re-searches a route to a destination based on the driving direction in which the vehicle travels when the vehicle leaves the route according to the route information.
The method of claim 11,
And the at least one object is displayed on a road and includes at least one figure or at least one text representing a corresponding driving direction.
A route guidance method of an electronic device, the method comprising:
generating route information to a destination using location data of a vehicle;
obtaining a driving image of the vehicle;
extracting at least one object displayed on a road from the driving image;
obtaining a driving direction of the vehicle based on the extracted at least one object; and
determining whether the vehicle deviates from the route by comparing the route information with the driving direction of the vehicle.
A computer-readable recording medium having recorded thereon a program for performing the method of claim 15.
KR1020100132405A 2010-12-22 2010-12-22 Electronic device and course guide method of electronic device KR20120070887A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020100132405A KR20120070887A (en) 2010-12-22 2010-12-22 Electronic device and course guide method of electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020100132405A KR20120070887A (en) 2010-12-22 2010-12-22 Electronic device and course guide method of electronic device

Related Child Applications (1)

Application Number Title Priority Date Filing Date
KR1020180084573A Division KR102041300B1 (en) 2018-07-20 2018-07-20 Electronic device and course guide method of electronic device

Publications (1)

Publication Number Publication Date
KR20120070887A (en) 2012-07-02

Family

ID=46706062

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020100132405A KR20120070887A (en) 2010-12-22 2010-12-22 Electronic device and course guide method of electronic device

Country Status (1)

Country Link
KR (1) KR20120070887A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101455765B1 (en) * 2013-04-16 2014-11-03 주식회사 비온시이노베이터 Ship navigation system and its operating method
KR20190037535A (en) * 2017-09-29 2019-04-08 주식회사 만도 Apparatus and method for supporting vehicle driving


Similar Documents

Publication Publication Date Title
KR102396731B1 (en) Method of providing detailed map data and system therefor
CN108680173B (en) Electronic device, control method of electronic device, and computer-readable recording medium
KR102123844B1 (en) Electronic apparatus, control method of electronic apparatus and computer readable recording medium
WO2012086054A1 (en) Navigation device, control method, program, and storage medium
KR20120079341A (en) Method, electronic device and recorded medium for updating map data
KR102411421B1 (en) Method, electronic apparatus and computer readable recording medium for displaying information regarding user's poi
KR101001842B1 (en) Navigation device for a vehicle and method for inducing right position of blackbox device of a navigation system
JP2007240380A (en) Intra-tunnel position detector
KR101759975B1 (en) Navigation for vehicel and land departure warning method of navigation for vehicel
KR101049372B1 (en) Navigation system, and control method of a navigation device for vehicle
KR20120070886A (en) Electronic device and lane guide method of electronic device
KR101911522B1 (en) Electronic device and navigation service method of electronic device
JP2013036930A (en) Navigation device and navigation system comprising the same
KR20120126175A (en) Electronic Device And Operating Method Thereof
KR20120070887A (en) Electronic device and course guide method of electronic device
KR101906436B1 (en) Driving-related guidance method for a moving body, electronic apparatus and computer-readable recording medium
KR102041300B1 (en) Electronic device and course guide method of electronic device
KR102058620B1 (en) Electronic device and driving related guidance maethod for moving body
KR101912853B1 (en) Electronic device and lane guide method of electronic device
KR101055155B1 (en) Navigation system, navigation for vehicle and navigation method of navigation for vehicel
KR101818956B1 (en) Electronic apparatus, method and recording medium for providing lane-changing information
KR20120078876A (en) Electronic device and lane guide method of electronic device
KR102048712B1 (en) Electronic device and navigation service method of electronic device
KR101126516B1 (en) Electronic device and method for receiving information, recording medium recording program performing method of receiving information
KR101421613B1 (en) Electronic device, server, mehotd for controlling of the electronic device and method for providing of traffic information

Legal Events

Date Code Title Description
N231 Notification of change of applicant
A201 Request for examination
E902 Notification of reason for refusal
E601 Decision to refuse application
E801 Decision on dismissal of amendment
E902 Notification of reason for refusal
J201 Request for trial against refusal decision
J301 Trial decision

Free format text: TRIAL NUMBER: 2018101003071; TRIAL DECISION FOR APPEAL AGAINST DECISION TO DECLINE REFUSAL REQUESTED 20180720

Effective date: 20191202