WO2016063999A1 - Navigation device and control method thereof - Google Patents
Navigation device and control method thereof
- Publication number
- WO2016063999A1 (PCT/KR2014/009912)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- destination
- vehicle
- navigation device
- loaded
- identified object
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3602—Input other than that of destination using image analysis, e.g. detection of road signs, lanes, buildings, real preceding vehicles using a camera
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3605—Destination input or retrieval
- G01C21/362—Destination input or retrieval received from an external device or application, e.g. PDA, mobile phone or calendar application
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3605—Destination input or retrieval
- G01C21/3617—Destination input or retrieval using user history, behaviour, conditions or preferences, e.g. predicted or inferred from previous use or current movement
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/3407—Route searching; Route guidance specially adapted for specific applications
- G01C21/3415—Dynamic re-routing, e.g. recalculating the route when the user deviates from calculated route or after detecting real-time traffic data or accidents
Definitions
- the present invention relates to a navigation device and a control method thereof.
- Vehicles (e.g., automobiles) have traditionally been collections of mechanical devices.
- Increasingly, however, electronic devices are being mounted in vehicles.
- the vehicle may track the location of the smart key and perform a function corresponding to the location of the smart key.
- the navigation device is an example of an electronic device of a vehicle.
- various navigation devices have been used.
- a portable navigation device, a navigation device embedded in a vehicle, or a cellular phone equipped with a navigation application may perform a navigation function.
- such navigation devices can track the position of the vehicle in real time while moving with the vehicle to guide the direction of travel.
- a navigation device performs a destination search for destination setting.
- For the destination search, the user must enter the destination into the navigation device.
- the navigation device may provide a recommended destination, such as a list of recent destinations.
- However, the list of recent destinations does not consider the current state of the user. Therefore, there is a need for an improved method of providing recommended destinations that takes the user's context into account.
- An object of the present specification is to provide a navigation device and a control method thereof that provide a recommended destination based on an external object.
- In particular, the present specification is intended to present a more advanced navigation device that provides a recommended destination by generating a destination history associated with an external object.
- According to one embodiment, a navigation device includes a display unit that displays at least one image and receives a touch input, a positioning unit that determines the position of the navigation device, a detecting unit that detects at least one object loaded in a vehicle, and a processor that controls the display unit, the positioning unit, and the detecting unit. Each of the at least one object includes attribute information, and the navigation device is loaded in the vehicle.
- The processor detects an object loaded in the vehicle using the detecting unit, identifies the detected object based on the attribute information of the detected object, and generates a destination history for the identified object that includes destination information of the vehicle in which the identified object was loaded. After the destination history has been generated, once the identified object is loaded back into the vehicle, the processor may provide at least one recommended destination based on the destination history for the object.
- A method of controlling a navigation device may include detecting an object loaded in a vehicle using a detecting unit, identifying the detected object based on attribute information included in the detected object, generating a destination history for the identified object that includes destination information of the vehicle in which the identified object was loaded, and, after the identified object has been reloaded into the vehicle, providing at least one recommended destination based on the destination history for the object.
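The destination history described in this summary can be pictured as a small record keyed by the object's identity. The following Python sketch is illustrative only; the class and field names (`DestinationEntry`, `record_visit`, etc.) are hypothetical and not part of the specification.

```python
from dataclasses import dataclass, field

@dataclass
class DestinationEntry:
    """One destination associated with an identified object."""
    name: str             # e.g. a point of interest such as "Basketball court A"
    location: tuple       # (latitude, longitude)
    last_visit: str       # ISO date of the most recent visit
    visit_count: int = 1  # how many times the vehicle went there with the object

@dataclass
class DestinationHistory:
    """Destination history generated for one identified object."""
    object_id: str                       # unique identifier from the object's tag
    entries: list = field(default_factory=list)

    def record_visit(self, name, location, date):
        # If the destination is already known, update it; otherwise add it.
        for e in self.entries:
            if e.name == name:
                e.last_visit = date
                e.visit_count += 1
                return
        self.entries.append(DestinationEntry(name, location, date))

history = DestinationHistory(object_id="basketball-301")
history.record_visit("Basketball court A", (37.50, 127.03), "2014-09-20")
history.record_visit("Basketball court A", (37.50, 127.03), "2014-09-21")
```

Repeated visits to the same destination update a single entry rather than duplicating it, which is what makes frequency-based ranking (described later with FIG. 4) possible.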
- the navigation device may provide a user with a recommended destination.
- the navigation device may provide a recommendation destination that matches the user's context by generating a recommendation destination based on the identified object.
- the navigation device may statistically analyze the user's context information by generating a destination history for the object.
- FIG. 1 illustrates a network environment of a vehicle.
- FIG. 2 is a block diagram of a navigation device according to an exemplary embodiment.
- FIG. 3 illustrates a destination setting of a navigation device according to an embodiment.
- FIG. 4 illustrates a destination history, according to one embodiment.
- FIG. 5 illustrates an additional destination recommendation, according to one embodiment.
- FIG. 6 illustrates an input interface according to one embodiment.
- FIG. 7 illustrates object notifications for a plurality of objects, according to one embodiment.
- FIG. 8 illustrates an example of object notification using a user device.
- FIG. 9 is a flowchart of a control method of a navigation device, according to an exemplary embodiment.
- FIG. 1 illustrates a network environment of a vehicle.
- vehicle 200 can communicate with various devices.
- the vehicle 200 may communicate with the user device 351 via a network.
- the vehicle 200 may communicate with the user device 351 through a system mounted on the vehicle 200 or through a navigation device that communicates with the system mounted on the vehicle 200.
- the vehicle 200 may communicate with various objects 301, 302, 303, 304 loaded in the vehicle 200. Communication with the objects 301, 302, 303, 304 may be performed directly or indirectly.
- the vehicle 200 may include other transportation means.
- the vehicle 200 may be a motorcycle, a bicycle, a ship or an airplane.
- The vehicle 200 of the present specification may include an autonomous driving device.
- In FIG. 1, a basketball 301, a shopping cart 302, a soccer ball 303, and a laptop computer 304 are shown as examples of objects, but various other objects may be used as objects of the present specification.
- Although a mobile phone is illustrated as the user device 351 in FIG. 1, various portable devices may be used as the user device 351 of the present specification.
- The user device 351 may include a mobile phone, an electronic notebook, a head mounted display (HMD), and various other portable devices.
- the objects 301, 302, 303, and 304 of the present specification may directly or indirectly communicate with the vehicle 200 or the navigation device.
- the communication of the objects 301, 302, 303, 304 herein may be performed by a simple tag.
- the objects 301, 302, 303, 304 may not have a function for separate data transmission / reception.
- the vehicle 200 or the navigation device may identify the tag of the objects 301, 302, 303, 304. The identification of these tags can be included in the communication in a broad sense.
- the navigation device of the present specification is not directly shown.
- the navigation device described below may be embedded in the vehicle 200.
- vehicle 200 may include a navigation device as part of a vehicle system.
- the navigation device herein may be a portable device that is not embedded in an automobile.
- a mobile phone may operate as the navigation device herein.
- the navigation device of the present disclosure may be powered from the vehicle 200 and may be a removable device.
- FIG. 2 is a block diagram of a navigation device according to an exemplary embodiment.
- the navigation device may be part of the vehicle system or detachable from the vehicle.
- the navigation device may include a positioning unit 130, a display unit 120, a detecting unit 140, and a processor 110.
- the positioning unit 130 may determine the position of the navigation device 100.
- The positioning unit 130 may include a global positioning system (GPS), a geographic information system (GIS), a terrestrial-network-based positioning system, and/or a hybrid assisted-GPS wireless positioning system.
- the detecting unit 140 may detect at least one object loaded in the vehicle.
- the detecting unit 140 may detect the loading of the object.
- the detecting unit 140 may detect the loading of the object by communicating with the object.
- the detecting unit 140 may detect the loading of the object based on the strength of the signal from the object, the strength of the signal reflected from the object, and / or the response time from the object.
- the detecting unit 140 may detect the loading of an object using an object sensor provided in the vehicle.
- the vehicle may include an object sensor for sensing the loading of the object, and the detecting unit 140 may communicate with the object sensor.
- the detecting unit 140 may determine the loading / unloading of the object based on a signal received from the object sensor of the vehicle.
- the detecting unit 140 may detect the object loaded in the vehicle. In addition, the detecting unit 140 may identify the object based on the attribute information of the object. For example, the attribute information of the object may include the name, ID, type and / or unique identifier of the object. The detecting unit 140 may receive attribute information of the object by communicating with the object and identify the object based on the attribute information. In addition, the detecting unit 140 may identify the object by reading a tag included in the object.
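As an illustration of this identification step, the sketch below derives a stable identity from a tag's attribute information (name, ID, type). The payload format is an assumption made for illustration; an actual tag would define its own encoding.

```python
def identify_object(tag_payload):
    """Return a stable identity for a detected object, or None.

    `tag_payload` is a dict of attribute information read from the
    object's tag, e.g. {"id": ..., "name": ..., "type": ...}.
    """
    # A unique identifier is the strongest key; fall back to name + type.
    uid = tag_payload.get("id")
    if uid:
        return uid
    name, kind = tag_payload.get("name"), tag_payload.get("type")
    if name and kind:
        return f"{kind}:{name}"
    return None  # not enough attribute information to identify the object

basketball_identity = identify_object({"id": "basketball-301"})
fallback_identity = identify_object({"name": "basketball", "type": "ball"})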
- the detecting unit 140 may include a communication unit communicating with an object, a user device, and / or a vehicle.
- the detecting unit 140 may be coupled with a separate communication unit embedded in the navigation device 100.
- the communication unit may perform communication and transmit / receive data via a wired or wireless network.
- For connection to a wireless network, the communication unit may use wireless LAN (WLAN) / IEEE 802.11-based wireless LAN communication, Wireless Broadband (WiBro), World Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), Bluetooth, and/or Near Field Communication (NFC) standards.
- the communication unit can access the Internet via a wired / wireless network.
- the display unit 120 may display at least one image and receive a touch input.
- the display unit 120 may include a liquid crystal display (LCD), a plasma display, or another type of display.
- the display unit 120 may include a touch sensor. That is, the display unit 120 may include a touch sensitive display unit.
- the touch sensor may be located on or within the display unit 120.
- The touch sensor may sense a variety of contact or non-contact touch inputs, such as sliding touch inputs, multi-touch inputs, long-press touch inputs, short-press touch inputs, drag touch inputs, hovering inputs, and flicking touch inputs.
- the touch sensor may also sense touch input by various input tools such as a touch pen and a stylus pen.
- the touch sensor may transmit a result of sensing the touch input to the processor 110.
- the processor 110 may control the display unit 120, the positioning unit 130, and the detecting unit 140. In addition, the processor 110 may control other components included in the navigation device 100 described later.
- the processor 110 may process data of the navigation device 100 to execute various applications.
- The processor 110 may control the navigation device 100 and the content executed in the navigation device 100 based on received commands.
- the navigation device 100 may further include components not shown in FIG. 2.
- the navigation device 100 may further include a memory, a power source, a housing, an audio receiving unit, an audio output unit, or an image sensing unit.
- the image sensing unit may sense the image using visible light, infrared light, ultraviolet light, magnetic field, and / or sound waves.
- the above-described configuration may be selectively combined according to the manufacturer's selection or the shape of the navigation device 100.
- the above-described components may be connected to each other via a bus, and may be controlled by the processor 110.
- The configuration diagram of the navigation device 100 illustrated in FIG. 2 is a block diagram according to an exemplary embodiment.
- the navigation device 100 of the present specification may be controlled based on various inputs.
- the navigation device 100 may include a physical button and may receive input from the physical button.
- the navigation device 100 may include a voice receiving unit, perform voice recognition based on the received voice, and may be controlled based on the voice recognition.
- the navigation device 100 may perform speech recognition in units of syllables, words, or sentences, or may perform a function by combining recognized syllables, words, or sentences.
- the navigation device 100 may perform image analysis using the image sensing unit, and may be controlled based on the analyzed image.
- the navigation device 100 may include a touch sensing unit and may be controlled based on a touch input to the touch sensing unit.
- the navigation device 100 may be controlled based on the combination of the above-described inputs.
- Hereinafter, operations performed by the navigation device 100 will be described with reference to FIGS. 3 to 9.
- the configuration of the navigation device 100 described above with reference to FIGS. 1 and 2 may be used for the operation of the navigation device 100 described later.
- In the following description, the operations of the navigation device 100 may equally be understood as operations of the processor 110.
- the navigation device 100 may be embedded in or loaded into the vehicle.
- FIG. 3 illustrates a destination setting of a navigation device according to an embodiment.
- the navigation device (not shown) is embedded in or loaded into the vehicle 200.
- a basketball 301 is loaded into the vehicle 200.
- The navigation device may identify the object (the basketball 301) loaded in the vehicle 200 using the detecting unit.
- basketball 301 may include a wirelessly identifiable tag.
- the navigation device may detect the basketball 301 loaded in the vehicle 200.
- The tag of the basketball 301 may include attribute information for the basketball 301.
- The basketball 301 may be in communication with the navigation device. In this case, the navigation device may receive attribute information based on the communication with the basketball 301. Therefore, the navigation device may identify the basketball 301 loaded in the vehicle 200 based on the attribute information of the basketball 301.
- The navigation device may generate a destination history for the object (the basketball 301) that includes destination information of the vehicle 200 in which the basketball 301 is loaded.
- a user boarding the vehicle 200 sets a destination.
- the navigation device may include the set destination in the destination history for the basketball 301.
- the user may move to the destination without setting the destination in the navigation device.
- In this case, the navigation device may include, in the destination history for the basketball 301, the position where the vehicle 200 stopped and the basketball 301 was unloaded.
- In FIG. 3, a basketball court on the map is set as the destination.
- Accordingly, the navigation device includes the basketball court in the destination history for the basketball 301.
- After the destination history for the object (for example, the basketball 301) has been generated, the navigation device may provide at least one recommended destination based on the generated destination history once the object is loaded back into the vehicle. That is, when the basketball 301 is loaded into the vehicle 200, the navigation device may provide a recommended destination based on the destination history generated for the basketball 301. For example, the navigation device may provide the highest-ranked destination in the destination history for the basketball 301 as the recommended destination, such as the basketball court in that history. The sorting/classifying of the destinations in the destination history will be described later with reference to FIG. 4. The navigation device may provide only one destination as the recommended destination.
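A minimal sketch of this recommendation step, assuming each history entry carries a visit count and a last-visit date (the dict layout is hypothetical): the highest-ranked destination is the most visited one, with recency as a tie-breaker.

```python
def recommend(history_entries, k=1):
    """Return up to k recommended destination names, most-visited first.

    Ties are broken by the most recent visit; ISO date strings sort
    chronologically, so plain string comparison suffices here.
    """
    ranked = sorted(history_entries,
                    key=lambda e: (e["visit_count"], e["last_visit"]),
                    reverse=True)
    return [e["name"] for e in ranked[:k]]

entries = [
    {"name": "Basketball court A", "visit_count": 5, "last_visit": "2014-09-20"},
    {"name": "Park",               "visit_count": 2, "last_visit": "2014-09-22"},
]
top_one = recommend(entries)        # single recommended destination
top_two = recommend(entries, k=2)   # two or more may also be provided
```

Setting `k=1` corresponds to providing only the highest-ranked destination; larger `k` corresponds to the multi-destination recommendation mentioned next.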
- the navigation device may provide two or more destinations (destinations in the destination history) as recommended destinations.
- The navigation device may provide a recommended destination through the display unit and/or through the voice output unit.
- the navigation device may automatically set the highest ranking destination as the destination.
- the vehicle 200 may be driven to the destination based on the set destination.
- FIG. 4 illustrates a destination history, according to one embodiment.
- the navigation device may generate a destination history for the identified object (eg, basketball 301).
- the destination history may include the location of the destination, the last visit, and the number of visits.
- the location of the destination may include geographic coordinates.
- The destination may be identified by its name. In FIG. 4, a name and an ID are shown as attribute information of the basketball 301.
- the attribute information may include the name, ID, type and / or unique identifier of the object.
- the destination history of FIG. 4 includes a destination, a last visit, and a visit frequency.
- the destination history may include other information.
- The navigation device may distinguish between the location where an object (e.g., the basketball 301) is loaded into the vehicle and the location where it is unloaded from the vehicle, and include both in the destination history.
- Accordingly, the navigation device can learn, from the accumulated history, where an object is often loaded and where it is often unloaded.
- the navigation device may classify the destination history, and the classification of the destination history may be reflected in the provision of the recommended destination described later. For example, only the highest ranked destination in the destination history may be provided as the recommended destination.
- the navigation device may also provide a plurality of recommended destinations in the order in which the destination histories are arranged. For example, the navigation device may sort the destination history based on the last visit date and / or visit frequency.
- the navigation device may also classify destinations in the destination history based on the location of the navigation device. For example, the navigation device may sort the destinations in the destination history in the order closest to the current location of the navigation device.
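The location-based ordering described here can be sketched as follows. The great-circle (haversine) distance is one reasonable choice for "closest"; the entry layout is hypothetical.

```python
import math

def haversine_km(a, b):
    """Great-circle distance between two (lat, lon) points, in km."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    h = (math.sin(dlat / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2)
    return 2 * 6371.0 * math.asin(math.sqrt(h))  # mean Earth radius ~6371 km

def sort_by_proximity(entries, current_pos):
    """Order destinations in the history nearest-first relative to the
    current position of the navigation device."""
    return sorted(entries, key=lambda e: haversine_km(e["location"], current_pos))

entries = [
    {"name": "Court far",  "location": (37.60, 127.10)},
    {"name": "Court near", "location": (37.51, 127.03)},
]
nearest = sort_by_proximity(entries, (37.50, 127.02))
```

The same `sorted` call with a different key (last visit date or visit frequency) yields the other orderings mentioned above.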
- the sorting / classification of the destination history of the navigation device described above with respect to FIG. 4 may optionally be combined with the operation of the navigation device described above with respect to FIG. 3.
- FIG. 5 illustrates an additional destination recommendation, according to one embodiment.
- the navigation device may provide at least one recommended destination.
- the navigation device may provide at least one additional destination based on the attribute information of the object and the location of the navigation device.
- a basketball 301 is loaded into the vehicle 200 as an object.
- the basketball 301 can be identified by the navigation device.
- The navigation device may search for a destination matching the basketball 301 based on the attribute information of the basketball 301 (for example, the name "basketball").
- For example, the navigation device may find a basketball court as a destination that matches the basketball 301.
- The position of the navigation device can also be taken into account.
- For example, basketball courts within a predetermined distance from the location of the navigation device may be provided as recommended destinations. That is, the additional recommended destination additionally provides the user with a destination that does not exist in the destination history for the identified object (e.g., the basketball 301).
- the navigation device may determine the type of the object based on attribute information of the object loaded in the vehicle 200.
- The navigation device may provide, as at least one recommended destination, a location existing within a predetermined distance from the location of the navigation device among the locations corresponding to the determined type of the object. That is, the navigation device may perform a similar/semantic search based on the attributes of the basketball 301, rather than matching only the name of the basketball 301 itself.
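A sketch of this additional recommendation: filter a place database by the determined object type and by distance from the device. The place database, its field names, and the flat-earth distance approximation are all illustrative assumptions.

```python
def approx_km(a, b):
    # Flat-earth approximation: adequate for the few-km radius used here.
    # ~111 km per degree latitude; ~88 km per degree longitude near 37°N.
    dlat = (a[0] - b[0]) * 111.0
    dlon = (a[1] - b[1]) * 88.0
    return (dlat ** 2 + dlon ** 2) ** 0.5

def additional_recommendations(place_db, object_type, current_pos, max_km=5.0):
    """Places matching the object's type within max_km of the device,
    nearest first. These need not appear in any destination history."""
    nearby = [p for p in place_db
              if p["type"] == object_type
              and approx_km(p["location"], current_pos) <= max_km]
    return sorted(nearby, key=lambda p: approx_km(p["location"], current_pos))

place_db = [
    {"name": "Basketball court B", "type": "basketball", "location": (37.51, 127.03)},
    {"name": "Soccer field",       "type": "soccer",     "location": (37.51, 127.04)},
    {"name": "Basketball court C", "type": "basketball", "location": (38.50, 128.00)},
]
recs = additional_recommendations(place_db, "basketball", (37.50, 127.02))
```

Court C matches the type but is far outside the radius, and the soccer field is nearby but of the wrong type, so only court B survives — a new place the user may never have visited.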
- the navigation device can provide a recommendation for a new place that the user has never been to.
- provision of the additional recommended destination described above with respect to FIG. 5 may optionally be combined with the operation of the navigation device described above with respect to FIGS. 3 and 4.
- FIG 6 illustrates an input interface according to one embodiment.
- the navigation device 100 includes a display unit 120 and is mounted in the vehicle 200. As described above, the navigation device 100 may be removable from the vehicle 200. As shown in FIG. 6, the navigation device 100 may provide an interface 151 on the display unit 120 for setting a destination of the vehicle 200. The user may retrieve or set a destination through a virtual keyboard on the interface 151.
- the navigation device 100 may set a destination based on an input to the interface 151.
- The navigation device 100 may provide a notification of the absence of the identified object if the set destination exists in the destination history for the identified object and the identified object is not loaded in the vehicle.
- Hereinafter, the notification of the absence of the identified object is described.
- For example, assume that a particular basketball court is included in the destination history for the basketball 301.
- the user may set the basketball court as a destination using the interface 151 of FIG. 6.
- the navigation device may inform the user that the basketball 301 is not loaded.
- For example, the navigation device may provide the user with a notification such as "Would you like to take the basketball with you?" through the display unit and/or the audio output unit. Therefore, the user can remember to take the basketball to the basketball court.
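The absence check described above can be sketched as a single lookup: the set destination is matched against each object's history, and a reminder is produced for any associated object that is not on board. The data shapes are hypothetical.

```python
def absence_notification(set_destination, histories, loaded_object_ids):
    """Return a reminder message if the chosen destination appears in
    some object's destination history but that object is not loaded.

    `histories` maps object id -> (object name, set of destination names).
    `loaded_object_ids` is the set of ids currently detected in the vehicle.
    """
    for obj_id, (obj_name, destinations) in histories.items():
        if set_destination in destinations and obj_id not in loaded_object_ids:
            return f"Would you like to take the {obj_name} with you?"
    return None  # nothing to remind the user about

histories = {"basketball-301": ("basketball", {"Basketball court A"})}
reminder = absence_notification("Basketball court A", histories, set())
no_reminder = absence_notification("Basketball court A", histories, {"basketball-301"})
```

When the basketball is already loaded, the same destination produces no notification, matching the behavior described for FIG. 6.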
- Providing the notification described above with respect to FIG. 6 may optionally be combined with the operations of the navigation device described above with respect to FIGS. 3 to 5.
- As described above, the navigation device may generate a destination history for the identified object; that is, the object and the destination may be associated with each other. Thus, as described above with respect to FIG. 3, the navigation device can provide the associated destination as the recommended destination when the identified object is loaded into the vehicle. Conversely, as described above with respect to FIG. 6, the navigation device may notify the user of the absence of the associated object if that object is not loaded into the vehicle even though a destination in its destination history has been set as the destination.
- the navigation device may identify one or more objects and generate a destination history for each of the identified objects. For example, as shown in FIG. 7, each object 301, 302, 303, 304 may be associated with a different place from each other.
- The navigation device may also provide a notification prompting the unloading of the identified object after arriving at the destination. For example, in FIG. 7, a user may drive to a location associated with the soccer ball 303 while the soccer ball 303 is loaded in the vehicle. In this case, the navigation device may prompt the user to unload the soccer ball 303 when the vehicle arrives at the associated location. Such a notification can prevent the user from leaving the vehicle without the specific object at the place with which that object is associated.
- the navigation device may identify the object, distinguish the location where the identified object is loaded / unloaded, and store it in the destination history. Thus, the navigation device may not only provide a recommended destination based on the object, but may also recommend loading / unloading of the object based on the destination.
- the provision of the notification described above with respect to FIG. 7 may optionally be combined with the operations of the navigation device described above with reference to FIGS. 3-6.
- FIG. 8 illustrates an example of object notification using a user device.
- the navigation device may use a separate user device 351 to provide a recommended destination or notification.
- the user device 351 may include a schedule application.
- the navigation device may communicate with a user device 351 loaded in the vehicle 200 using a communication unit.
- the navigation device may receive schedule information from the user device 351.
- the schedule information may include a time, a place, and a brief description.
- the navigation device may identify an object associated with a current or later schedule of the user device 351 based on the schedule information.
- For example, the navigation device may identify the basketball 301 as an object associated with a schedule entry called "basketball".
- The navigation device may provide a notification of the absence of an object if an object associated with a current or upcoming schedule is not loaded in the vehicle 200. For example, assume that a user boards the vehicle 200 at 4 p.m. on September 22, 2014. In this case, the navigation device may receive schedule information from the user device 351, and the basketball 301 may be identified as an associated object based on the received schedule information. At this time, if the basketball 301, which is the identified object, is not loaded in the vehicle 200, the navigation device may suggest that the user bring the basketball 301 along. Therefore, the user can remember to take the objects necessary for an upcoming schedule.
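One way to sketch this schedule-based reminder: keyword-match object names against current-or-later schedule entries and report any matched object that is not on board. The schedule format and the simple substring matching are illustrative assumptions; a real implementation could use richer semantic matching.

```python
def schedule_reminders(schedule_entries, known_objects, loaded_object_ids, now):
    """Objects associated with a current-or-later schedule entry that are
    not loaded in the vehicle.

    `known_objects` maps object id -> object name; ISO timestamps are
    compared as strings, which preserves chronological order.
    """
    missing = []
    for entry in schedule_entries:
        if entry["time"] < now:  # past entries are ignored
            continue
        for obj_id, name in known_objects.items():
            if name in entry["description"].lower() and obj_id not in loaded_object_ids:
                missing.append(name)
    return missing

schedule = [
    {"time": "2014-09-22T18:00", "description": "Basketball with friends"},
    {"time": "2014-09-21T10:00", "description": "Soccer practice"},  # already past
]
objects = {"basketball-301": "basketball", "soccer-303": "soccer ball"}
reminders = schedule_reminders(schedule, objects, set(), "2014-09-22T16:00")
```

At 4 p.m. the basketball game is still ahead, so the basketball is flagged; the soccer practice has passed, so the soccer ball is not.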
- the user device 351 and the navigation device are described as separate devices throughout the present specification.
- the user device 351 and the navigation device may be the same device.
- the navigation device may be a mobile phone that includes a navigation application.
- the mobile phone may include a schedule application.
- the mobile phone may provide the notification for the absence of the object described above with respect to FIG. 8 based on the schedule information of the schedule application.
- FIG. 9 is a flowchart of a control method of a navigation device, according to an exemplary embodiment.
- The navigation device may detect (901) an object loaded in the vehicle using the detecting unit and identify (902) the detected object based on attribute information included in the detected object. As described above in connection with FIGS. 2 and 3, the navigation device may identify the object by communicating with the object, by identifying the tag of the object, or by using a sensor embedded in the vehicle. The navigation device then generates (903) a destination history for the identified object that includes destination information of the vehicle in which the identified object was loaded. As described above with respect to FIG. 3, the navigation device may generate the destination history in various ways, and, as described above with reference to FIG. 4, may sort the destination history.
- The navigation device may provide (904) at least one recommended destination based on the destination history for the object if the identified object is reloaded into the vehicle after the destination history has been generated. As described above with respect to FIG. 3, one or more recommended destinations may be provided. The navigation device may also set the recommended destination as the destination of the vehicle.
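The four steps of this flow (detect, identify, update history, recommend) can be sketched as one pass of a control loop. This is a simplified model under stated assumptions: tags carry an `id` attribute, the history is a per-object visit-count map, and a recommendation is produced only when an object is reloaded on a later trip.

```python
class Device:
    """Minimal stand-in for the navigation device's state."""
    def __init__(self):
        self.histories = {}    # object id -> {destination name: visit count}
        self.seen_before = set()

def control_step(device, detected_tags, trip_destination):
    """One pass of the FIG. 9 flow: detect (901), identify (902),
    update the destination history (903), and recommend (904)."""
    recommendations = {}
    for tag in detected_tags:             # 901: detect loaded objects
        obj_id = tag.get("id")            # 902: identify via attribute info
        if obj_id is None:
            continue
        history = device.histories.setdefault(obj_id, {})
        if obj_id in device.seen_before and history:
            # 904: object reloaded -> recommend the top-ranked destination
            recommendations[obj_id] = max(history, key=history.get)
        # 903: record this trip's destination in the object's history
        history[trip_destination] = history.get(trip_destination, 0) + 1
        device.seen_before.add(obj_id)
    return recommendations

dev = Device()
# First trip: the basketball is new, so no recommendation yet.
first = control_step(dev, [{"id": "basketball-301"}], "Court A")
# Second trip: the object is reloaded and the history suggests Court A.
second = control_step(dev, [{"id": "basketball-301"}], "Court A")
```

On the first trip the history is only being built; from the second trip onward the reloaded object triggers a recommendation, mirroring steps 901-904.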
- the control method of the navigation device of FIG. 9 may be selectively combined with the operations of the navigation device described above with reference to FIGS. 3 to 8. In addition, the method of controlling the navigation device of the present specification may be performed by the navigation device described above with reference to FIGS. 1 and 2.
- the navigation device and its control method according to the present disclosure are not limited to the configurations and methods of the above-described embodiments; rather, all or some of the embodiments may be selectively combined to produce various modifications.
- the navigation device and its control method of the present specification can be implemented as processor-readable code on a processor-readable recording medium provided in the navigation device.
- the processor-readable recording medium includes all kinds of recording devices that store data readable by a processor. Examples include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage devices. The recording medium may also be distributed over network-coupled computer systems so that the processor-readable code is stored and executed in a distributed fashion.
- the present invention is industrially applicable, as it can be used in terminal devices and can be carried out repeatedly.
Claims (21)
- A navigation device comprising: a display unit configured to display at least one image and receive a touch input; a position determining unit configured to determine a position of the navigation device; a detecting unit configured to detect at least one object loaded in a vehicle; and a processor configured to control the display unit, the position determining unit, and the detecting unit, wherein each of the at least one object includes attribute information, wherein the navigation device is loaded in the vehicle, and wherein the processor is configured to: detect the object loaded in the vehicle using the detecting unit; identify the detected object based on the attribute information of the detected object; generate a destination history for the identified object, the destination history including destination information of the vehicle in which the identified object was loaded; and, when the identified object is loaded in the vehicle again after the destination history has been generated, provide at least one recommended destination based on the destination history for the object.
- The navigation device of claim 1, wherein the at least one recommended destination is sorted based on a visit frequency of destinations in the destination history for the identified object.
- The navigation device of claim 2, wherein the processor is configured to set, as the destination of the vehicle, the most frequently visited destination among the destinations in the destination history for the identified object.
- The navigation device of claim 1, wherein the at least one recommended destination is sorted based on a last visit date of destinations in the destination history for the identified object.
- The navigation device of claim 1, wherein the at least one recommended destination is sorted based on the position of the navigation device.
- The navigation device of claim 1, wherein the processor is configured to, when the identified object is loaded in the vehicle after the destination history has been generated, provide at least one additional recommended destination based on the attribute information of the identified object and the position of the navigation device.
- The navigation device of claim 6, wherein the processor is configured to: determine a type of the identified object based on the attribute information; and provide, as the at least one additional recommended destination, at least one location corresponding to the type of the identified object within a predetermined distance from the position of the navigation device.
- The navigation device of claim 1, wherein the processor is configured to provide, on the display unit, an interface for setting the destination of the vehicle.
- The navigation device of claim 8, wherein the processor is configured to: set a destination based on an input to the interface; and, when the set destination exists in the destination history for the identified object and the identified object is not loaded in the vehicle, provide a notification of the absence of the identified object.
- The navigation device of claim 1, wherein the detecting unit receives, from the object, a signal including the attribute information of the object.
- The navigation device of claim 1, wherein the processor is configured to determine whether the object has been loaded into or unloaded from the vehicle based on at least one of a strength of a signal received from the object and a response time of the object.
- The navigation device of claim 11, wherein the processor is configured to generate the destination history for the object based on a first position at which the object was loaded into the vehicle and a second position at which the object was unloaded from the vehicle.
- The navigation device of claim 12, wherein the processor is configured to, when the object is loaded in the vehicle and the navigation device moves from the first position to the second position, provide a notification for unloading the object after arriving at the second position.
- The navigation device of claim 1, wherein the detecting unit communicates with at least one object sensor provided in the vehicle, and the processor is configured to determine whether the object has been loaded into or unloaded from the vehicle based on a signal received from the at least one object sensor.
- The navigation device of claim 1, further comprising a communication unit configured to communicate with a user device, wherein the processor is configured to: receive schedule information from the user device loaded in the vehicle; and identify, based on the schedule information, an object associated with a current or upcoming schedule of the user device.
- The navigation device of claim 15, wherein, when the object associated with the current or upcoming schedule is not loaded in the vehicle, a notification of the absence of the object is provided.
- A method of controlling a navigation device, the method comprising: detecting an object loaded in a vehicle using a detecting unit; identifying the detected object based on attribute information included in the detected object; generating a destination history for the identified object, the destination history including destination information of the vehicle in which the identified object was loaded; and, when the identified object is loaded in the vehicle again after the destination history has been generated, providing at least one recommended destination based on the destination history for the object.
- The method of claim 17, wherein the at least one recommended destination is sorted based on a visit frequency of destinations in the destination history for the identified object.
- The method of claim 17, wherein the at least one recommended destination is sorted based on a last visit date of destinations in the destination history for the identified object.
- The method of claim 17, further comprising: determining a position of the navigation device using a position determining unit; and, when the identified object is loaded in the vehicle after the destination history has been generated, providing at least one additional recommended destination based on the attribute information of the identified object and the position of the navigation device.
- The method of claim 17, further comprising: receiving a destination of the vehicle from a user; and, when the received destination is included in the destination history for the identified object and the identified object is not loaded in the vehicle, providing, to the user, a notification of the absence of the identified object.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020177013032A KR102224491B1 (ko) | 2014-10-22 | 2014-10-22 | 네비게이션 디바이스 및 그 제어 방법 |
PCT/KR2014/009912 WO2016063999A1 (ko) | 2014-10-22 | 2014-10-22 | 네비게이션 디바이스 및 그 제어 방법 |
US15/129,107 US20180003519A1 (en) | 2014-10-22 | 2014-10-22 | Navigation device and method of controlling the same |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/KR2014/009912 WO2016063999A1 (ko) | 2014-10-22 | 2014-10-22 | 네비게이션 디바이스 및 그 제어 방법 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016063999A1 true WO2016063999A1 (ko) | 2016-04-28 |
Family
ID=55761029
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2014/009912 WO2016063999A1 (ko) | 2014-10-22 | 2014-10-22 | 네비게이션 디바이스 및 그 제어 방법 |
Country Status (3)
Country | Link |
---|---|
US (1) | US20180003519A1 (ko) |
KR (1) | KR102224491B1 (ko) |
WO (1) | WO2016063999A1 (ko) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090046538A1 (en) * | 1995-06-07 | 2009-02-19 | Automotive Technologies International, Inc. | Apparatus and method for Determining Presence of Objects in a Vehicle |
US20100079256A1 (en) * | 2008-09-29 | 2010-04-01 | Avaya Inc. | Monitoring Responsive Objects in Vehicles |
KR20110130570A (ko) * | 2010-05-28 | 2011-12-06 | 권영택 | 네비게이션단말기를 이용한 택배화물 배송방법 |
JP2013180634A (ja) * | 2012-03-01 | 2013-09-12 | Panasonic Corp | 車載用電子機器とそれを搭載した自動車 |
KR20140031611A (ko) * | 2012-09-05 | 2014-03-13 | 김동석 | 택배화물 배송시스템 및 이를 이용한 택배화물 배송방법 |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6832092B1 (en) * | 2000-10-11 | 2004-12-14 | Motorola, Inc. | Method and apparatus for communication within a vehicle dispatch system |
US6974928B2 (en) * | 2001-03-16 | 2005-12-13 | Breakthrough Logistics Corporation | Method and apparatus for efficient package delivery and storage |
US6484094B1 (en) * | 2002-02-19 | 2002-11-19 | Alpine Electronics, Inc. | Display method and apparatus for navigation system |
CN1914633A (zh) * | 2004-01-28 | 2007-02-14 | Ww格兰杰有限公司 | 用于管理订货商品配送的系统和方法 |
US7136747B2 (en) * | 2005-01-08 | 2006-11-14 | Stephen Raney | Method for GPS carpool rendezvous tracking and personal safety verification |
KR20080022864A (ko) * | 2006-09-08 | 2008-03-12 | 김도환 | 명함을 이용한 네비게이션 시스템 |
KR100835319B1 (ko) * | 2007-02-06 | 2008-06-04 | 엘지전자 주식회사 | 네비게이션 장치 및 이의 주유소 검색 및 표시방법 |
JP2012502861A (ja) * | 2008-07-15 | 2012-02-02 | カルメル コンテナ システムズ リミテッド | フォークリフトのツメにrfidリーダを搭載したスマート・ロジスティック・システム |
KR101061836B1 (ko) * | 2009-01-07 | 2011-09-02 | 이선영 | 유체 도난 감시시스템 |
KR20110041669A (ko) * | 2009-10-16 | 2011-04-22 | (주)한국공간정보통신 | Rfid 기술을 이용하는 내비게이션 장치 및 이의 동작 방법 |
KR101077054B1 (ko) * | 2011-05-13 | 2011-10-26 | 주식회사 모리아타운 | 길 안내 서비스 시스템 및 방법 |
US20140378159A1 (en) * | 2013-06-24 | 2014-12-25 | Amazon Technologies, Inc. | Using movement patterns to anticipate user expectations |
- 2014
- 2014-10-22 KR KR1020177013032A patent/KR102224491B1/ko active IP Right Grant
- 2014-10-22 US US15/129,107 patent/US20180003519A1/en not_active Abandoned
- 2014-10-22 WO PCT/KR2014/009912 patent/WO2016063999A1/ko active Application Filing
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090046538A1 (en) * | 1995-06-07 | 2009-02-19 | Automotive Technologies International, Inc. | Apparatus and method for Determining Presence of Objects in a Vehicle |
US20100079256A1 (en) * | 2008-09-29 | 2010-04-01 | Avaya Inc. | Monitoring Responsive Objects in Vehicles |
KR20110130570A (ko) * | 2010-05-28 | 2011-12-06 | 권영택 | 네비게이션단말기를 이용한 택배화물 배송방법 |
JP2013180634A (ja) * | 2012-03-01 | 2013-09-12 | Panasonic Corp | 車載用電子機器とそれを搭載した自動車 |
KR20140031611A (ko) * | 2012-09-05 | 2014-03-13 | 김동석 | 택배화물 배송시스템 및 이를 이용한 택배화물 배송방법 |
Also Published As
Publication number | Publication date |
---|---|
KR20170072258A (ko) | 2017-06-26 |
KR102224491B1 (ko) | 2021-03-08 |
US20180003519A1 (en) | 2018-01-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107172590B (zh) Activity state information processing method and apparatus based on a mobile terminal, and mobile terminal | |
US20170091335A1 (en) | Search method, server and client | |
WO2015016569A1 (en) | Method and apparatus for constructing multi-screen display | |
WO2014073850A1 (en) | Method and apparatus for managing message in electronic device | |
US9241242B2 (en) | Information recommendation method and apparatus | |
WO2019061443A1 (zh) Notification display method and terminal | |
WO2014163330A1 (en) | Apparatus and method for providing additional information by using caller phone number | |
CN107305493A (zh) Graphical keyboard application with integrated search | |
CN108701137A (zh) Icon suggestions within a keyboard | |
CN106104528A (zh) Model-based approach for on-screen item selection and disambiguation | |
KR20160021637A (ko) Content processing method and electronic device therefor | |
CN106294308B (zh) Named entity recognition method and apparatus | |
CN105606118B (zh) Navigation device, system for inputting a position to the navigation device, and method for inputting a position to the navigation device from a terminal | |
WO2017146327A1 (en) | System and method for locating an occupant within a vehicle | |
CN109489674B (zh) Method and apparatus for determining a road segment based on position, and storage medium | |
WO2019235653A1 (ko) Method and system for identifying nearby acquaintances based on short-range wireless communication, and non-transitory computer-readable recording medium | |
CN108151716A (zh) Method, apparatus, and terminal for planning the surveying and mapping work area of a flight device | |
CN107341226B (zh) Information display method and apparatus, and mobile terminal | |
WO2013125785A1 (en) | Task performing method, system and computer-readable recording medium | |
WO2016108336A1 (en) | Pen type multimedia device for processing image data by using handwriting input and method for controlling the same | |
WO2015109992A1 (en) | Information retrieval method, apparatus and system | |
WO2016063999A1 (ko) Navigation device and method of controlling the same | |
WO2013115493A1 (en) | Method and apparatus for managing an application in a mobile electronic device | |
CN108270660A (zh) Quick message reply method and apparatus | |
KR20140127948A (ko) Boarding and alighting management method, and apparatus and system therefor | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14904283 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15129107 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 20177013032 Country of ref document: KR Kind code of ref document: A |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 14904283 Country of ref document: EP Kind code of ref document: A1 |