WO2012086054A1 - Navigation device, control method, program, and storage medium - Google Patents
Navigation device, control method, program, and storage medium
- Publication number
- WO2012086054A1 (PCT/JP2010/073326)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- landmark
- feature
- guidance
- guide
- system controller
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3626—Details of the output of route guidance instructions
- G01C21/3644—Landmark guidance, e.g. using POIs or conspicuous other objects
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3602—Input other than that of destination using image analysis, e.g. detection of road signs, lanes, buildings, real preceding vehicles using a camera
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
- G06V20/582—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads of traffic signs
Definitions
- the present invention relates to a technique for performing guidance using an image obtained from imaging means mounted on a moving body.
- Patent Document 1 discloses a technique for extracting an object serving as a mark from an image obtained by photographing a periphery such as an intersection, and generating and outputting presentation information corresponding to the object serving as the mark.
- Patent Document 2 discloses a technique for performing pattern matching between an image acquired from a camera and a stored template, and determining which position in the image corresponds to the template.
- since the landmark information stored in advance is not updated in real time, the landmark may no longer exist at the time of guidance, or the store occupying the corresponding facility may have changed. In such a case, if guidance is performed using that landmark information, the user may be confused. Further, as in the technique described in Patent Document 1, when an object to serve as a mark is searched for across the entire image, a problem such as a large processing load arises.
- the present invention has been made in order to solve the above-described problems, and has as its main object to provide a navigation device capable of providing appropriate guidance without confusing the user.
- a navigation device according to the present invention includes: an acquisition unit that acquires an image captured by an imaging unit; a storage unit that stores landmark information regarding facilities; an extraction unit that extracts a landmark corresponding to a guidance point from the landmark information stored in the storage unit when the moving body moves along a route; a guidance unit that guides the guidance point using the extracted landmark; and a determination unit that determines whether a feature corresponding to the landmark information can be recognized from the image acquired by the acquisition unit at a point where the feature can be photographed, wherein, when the determination unit determines that the feature cannot be recognized, the guidance unit performs guidance excluding the landmark corresponding to the feature from the landmarks extracted by the extraction unit.
- the invention according to claim 8 is a control method executed by a navigation device having a storage means for storing facility information related to a facility, the method comprising: an acquisition step of acquiring an image captured by an imaging means; an extraction step of extracting a landmark corresponding to a guide point from the landmark information stored in the storage means when the moving body moves along a route; a guide step of guiding the guide point using the extracted landmark; and a determination step of determining whether a feature corresponding to the landmark information can be recognized from the image acquired in the acquisition step at a point where the feature can be photographed, wherein, when it is determined in the determination step that the feature cannot be recognized, the guide step performs guidance excluding the landmark corresponding to the feature from the landmarks extracted in the extraction step.
- the invention according to claim 9 is a program executed by a navigation device having a storage means for storing landmark information relating to a facility, the program causing the navigation device to function as: an acquisition means for acquiring an image captured by an imaging means; an extraction means for extracting a landmark corresponding to a guidance point from the landmark information stored in the storage means when the moving body moves along a route; a guidance means for guiding the guidance point using the extracted landmark; and a determination means for determining whether a feature corresponding to the landmark information can be recognized from the image acquired by the acquisition means at a point where the feature can be photographed, wherein, when the determination means determines that the feature cannot be recognized, the guidance means performs guidance excluding the landmark corresponding to the feature from the landmarks extracted by the extraction means.
- (B) is an example of a flowchart showing the processing procedure of the second guidance example. Also shown are an example of a flowchart showing the processing procedure of the third guidance example, an example image in which a traffic control object is displayed, and an example of a flowchart showing the processing procedure of the fourth guidance example.
- (A) is a diagram showing the outline.
- (B) is an example of a flowchart showing the processing procedure of the fifth guidance example.
- in order to solve the above problems, a navigation device includes: an acquisition unit that acquires an image captured by an imaging unit; a storage unit that stores landmark information related to a facility; an extraction unit that extracts a landmark corresponding to a guide point from the landmark information stored in the storage unit when the moving body moves along a route; a guide unit that guides the guide point using the extracted landmark; and a determination unit that determines whether a feature corresponding to the landmark information can be recognized from the image acquired by the acquisition unit at a point where the feature can be photographed, wherein, when the determination unit determines that the feature cannot be recognized, the guide unit performs guidance excluding the landmark corresponding to the feature from the landmarks extracted by the extraction unit.
- the navigation apparatus includes an acquisition unit, a storage unit, an extraction unit, a guide unit, and a determination unit.
- the acquisition unit acquires an image captured by the imaging unit.
- the storage means stores landmark information related to the facility.
- the extraction unit extracts a landmark corresponding to the guide point from the landmark information stored in the storage unit when the moving body moves along the route.
- the guide means guides the guide point using the extracted landmark.
- the determination unit determines whether the feature can be recognized from the image acquired by the acquisition unit at a point where the feature corresponding to the landmark information can be photographed. When the determination unit determines that the feature cannot be recognized, the guide unit performs guidance excluding the landmark corresponding to the feature from the landmarks extracted by the extraction unit.
- in this aspect, the navigation device determines, based on the captured image, whether or not a feature corresponding to the stored landmark information actually exists, and performs guidance based on the landmark information only when the landmark is determined to be present.
- thereby, even when the facility corresponding to the landmark information no longer exists at the time of guidance, the navigation device can execute guidance based on other information without confusing the user.
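The exclusion rule described above can be sketched in a few lines. This is an illustrative sketch only, not the patented implementation: the `Landmark` type, the `visible` flag (a stand-in for the image-recognition result), and `select_guidance_landmarks` are all hypothetical names.

```python
# Hedged sketch: guidance uses a stored landmark only if the corresponding
# feature was actually recognized in the captured image; unrecognized
# landmarks are excluded before guidance is produced.
from dataclasses import dataclass

@dataclass
class Landmark:
    name: str
    visible: bool  # stand-in for the determination unit's recognition result

def select_guidance_landmarks(extracted):
    """Keep only landmarks whose features were recognized in the image."""
    return [lm for lm in extracted if lm.visible]

landmarks = [Landmark("convenience store A", True),
             Landmark("gas station B", False)]
usable = select_guidance_landmarks(landmarks)  # excludes the unseen landmark
```

Under these assumptions, a facility that has closed since the map data was compiled simply drops out of the guidance message instead of misleading the user.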
- the determination unit determines whether another feature corresponding to another landmark not extracted by the extraction unit can be recognized.
- when the other feature can be recognized, the guide unit guides the guide point using the other landmark.
- in another aspect, the determination unit determines whether a new feature to be used for guidance can be extracted from the image, and when the determination unit can extract the new feature, the guide unit guides the guide point using information on the new feature.
- the navigation apparatus can perform guidance without causing confusion to the user even when there is no facility corresponding to the landmark information at the time of guidance.
- the new feature is a traffic control object installed at the guide point.
- the navigation apparatus can widely perform guidance at various guidance points using the traffic control object regardless of the landmark information.
- in another aspect, the storage unit stores related information of the guide point, and when the determination unit determines that the feature cannot be recognized, the guide unit guides the guide point based on the related information of the guide point other than the landmark information.
- the “related information of the guide point” corresponds to information such as the name of the guide point and the direction of the road that branches at the guide point. According to this aspect, the navigation device can execute guidance without causing confusion to the user even when the feature object cannot be detected.
- in another aspect, when the determination unit determines that the feature cannot be recognized, the guide unit guides the guide point based on information on the traveling direction at the guide point. According to this aspect, the navigation device can execute guidance without confusing the user even when the feature cannot be detected.
- in another aspect, when the determination unit determines that the feature cannot be recognized, the guide unit provides guidance indicating that the landmark extracted by the extraction unit cannot be recognized. According to this aspect, the navigation apparatus can explicitly notify the user that the landmark no longer exists.
- another aspect of the present invention is a control method executed by a navigation device having a storage unit that stores facility information related to a facility, the method including an acquisition step of acquiring an image captured by an imaging unit and the steps described above performed as the moving body moves along the route.
- the navigation device is based on other information without causing confusion to the user even when the facility corresponding to the landmark information does not exist at the time of guidance. Guidance can be executed.
- another aspect of the present invention is a program executed by a navigation device having a storage unit that stores landmark information regarding a facility. The program causes the navigation device to function as: an acquisition unit that acquires an image captured by an imaging unit; an extraction unit that extracts a landmark corresponding to the guide point from the landmark information stored in the storage unit when the moving body moves along the route; a guide unit that guides the guide point using the extracted landmark; and a determination unit that determines whether the feature corresponding to the landmark information can be recognized from the image acquired by the acquisition unit at a point where the feature can be photographed.
- when the determination unit determines that the feature cannot be recognized, the guide unit performs guidance excluding the landmark corresponding to the feature from the landmarks extracted by the extraction unit.
- thereby, even when the facility corresponding to the landmark information does not exist at the time of guidance, the navigation device can execute guidance based on other information without confusing the user.
- the above program is stored in a storage medium.
- the “destination” refers to a destination point set in the navigation device 1 by the user.
- the “stop-by place” refers to a point set in the navigation device 1 as a point where the user stops on the way to the destination.
- the “landmark” refers to a mark of a facility that serves as a guide during route guidance.
- the “guidance point” refers to a point where guidance is provided at the time of route guidance.
- FIG. 1 shows a schematic configuration of the navigation apparatus 1.
- the navigation device 1 is mounted on a vehicle and connected to the camera 5.
- the navigation device 1 includes a self-supporting positioning device 10, a GPS receiver 18, a system controller 20, a disk drive 31, a data storage unit 36, a communication interface 37, a communication device 38, a display unit 40, an audio output unit 50, and an input device 60.
- the self-supporting positioning device 10 includes an acceleration sensor 11, an angular velocity sensor 12, and a distance sensor 13.
- the acceleration sensor 11 is made of, for example, a piezoelectric element, detects vehicle acceleration, and outputs acceleration data.
- the angular velocity sensor 12 is composed of, for example, a vibrating gyroscope, detects the angular velocity of the vehicle when the direction of the vehicle is changed, and outputs angular velocity data and relative azimuth data.
- the distance sensor 13 measures a vehicle speed pulse composed of a pulse signal generated with the rotation of the vehicle wheel.
- the GPS receiver 18 receives radio waves 19 carrying downlink data including positioning data from a plurality of GPS satellites.
- the positioning data is used to detect the absolute position of the vehicle (hereinafter also referred to as “current position”) from latitude and longitude information.
- the system controller 20 includes an interface 21, a CPU (Central Processing Unit) 22, a ROM (Read Only Memory) 23, and a RAM (Random Access Memory) 24, and controls the entire navigation device 1.
- the interface 21 performs an interface operation with the acceleration sensor 11, the angular velocity sensor 12, the distance sensor 13, and the GPS receiver 18. From these, vehicle speed pulses, acceleration data, relative azimuth data, angular velocity data, GPS positioning data, absolute azimuth data, and the like are input to the system controller 20.
- the CPU 22 controls the entire system controller 20 by executing a program prepared in advance.
- the ROM 23 includes a nonvolatile memory (not shown) in which a control program for controlling the system controller 20 is stored.
- the RAM 24 stores various data such as route data preset by the user via the input device 60 so as to be readable, and provides a working area to the CPU 22.
- in the system controller 20, a disk drive 31 such as a CD-ROM drive or a DVD-ROM drive, the data storage unit 36, the communication interface 37, the display unit 40, the audio output unit 50, and the input device 60 are connected to one another via a bus line 30.
- the disk drive 31 reads and outputs content data such as music data and video data from a disk 33 such as a CD or DVD under the control of the system controller 20.
- the disk drive 31 may be either a CD-ROM drive or a DVD-ROM drive, or may be a CD and DVD compatible drive.
- the data storage unit 36 is configured by, for example, an HDD or the like, and stores various data used for navigation processing such as map data.
- the data storage unit 36 is an example of the “storage unit” in the present invention.
- the map data includes a database (also referred to as “landmark information DB”) in which landmark information (also referred to as “landmark information IL”) corresponding to each guide point is stored in association with each guide point.
- the landmark information IL specifically corresponds to information relating to the landmark, such as an image indicating the landmark, a facility name indicated by the landmark, and the position of the landmark.
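One record of landmark information IL, as described above, can be pictured as the following data structure. The field names and the keying by node identifier are illustrative assumptions, not the patent's actual schema.

```python
# Illustrative structure for one landmark information IL record: the image
# indicating the landmark, the facility name, and the landmark position,
# stored per guide point in the landmark information DB.
from dataclasses import dataclass

@dataclass
class LandmarkInfo:
    facility_name: str   # e.g. the name "convenience store A"
    logo_image: str      # image indicating the landmark, e.g. "ImgA"
    position: tuple      # (latitude, longitude) of the landmark

# landmark information DB keyed by the guide point's node identifier
landmark_db = {
    "NPi": [LandmarkInfo("convenience store A", "ImgA", (35.68, 139.77))],
}
```

A lookup by the node identifier of the approaching guide point then yields the list of candidate landmarks to verify against the captured image.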
- the communication device 38 includes, for example, an FM tuner, a beacon receiver, a mobile phone, or a dedicated communication card, and acquires information distributed from a VICS (Vehicle Information Communication System) center or the like (hereinafter referred to as “VICS information”) from the radio wave 39.
- the communication interface 37 performs an interface operation for the communication device 38 and inputs the VICS information to the system controller 20 and the like.
- the display unit 40 displays various display data on a display device such as a display under the control of the system controller 20.
- the system controller 20 reads map data from the data storage unit 36.
- the display unit 40 displays the map data read from the data storage unit 36 by the system controller 20 on the display screen.
- the display unit 40 includes: a graphic controller 41 that controls the entire display unit 40 based on control data sent from the CPU 22 via the bus line 30; a buffer memory 42, composed of a memory such as a VRAM (Video RAM), that temporarily stores image information ready for immediate display; a display control unit 43 that controls the display of a display 44, such as a liquid crystal or CRT (Cathode Ray Tube) display, based on image data output from the graphic controller 41; and the display 44.
- the display 44 functions as an image display unit, and includes, for example, a liquid crystal display device having a diagonal size of about 5 to 10 inches and is mounted near the front panel in the vehicle.
- the audio output unit 50 includes: a D/A converter 51 that performs D/A (Digital to Analog) conversion of audio digital data sent from the CD-ROM drive 31, the DVD-ROM 32, the RAM 24, or the like via the bus line 30 under the control of the system controller 20; an amplifier (AMP) 52 that amplifies the audio analog signal output from the D/A converter 51; and a speaker 53 that converts the amplified audio analog signal into sound and outputs it into the vehicle.
- the input device 60 includes keys, switches, buttons, a remote controller, a voice input device, and the like for inputting various commands and data.
- the input device 60 is disposed around the front panel and the display 44 of the main body of the in-vehicle electronic system mounted in the vehicle.
- when the display 44 is a touch panel system, the touch panel provided on the display screen of the display 44 also functions as the input device 60.
- the camera 5 has a predetermined angle of view and generates an image (also simply referred to as “captured image”) captured based on light incident on the image sensor.
- the camera 5 is directed in the forward direction of the vehicle.
- the camera 5 is an example of the “imaging unit” in the present invention.
- the system controller 20 corresponds to “acquisition means”, “extraction means”, “guidance means”, and “determination means” in the present invention.
- FIG. 2A is an example of a photographed image when approaching the intersection “Pi”, which is a guide point, within a predetermined distance.
- FIG. 2B is a diagram showing an outline of processing for reading the landmark information IL from the landmark information DB.
- the intersection “Pi” is represented by the node “NPi”
- the roads “R1” to “R4” connected to the intersection Pi are represented by the links “LR1” to “LR4”.
- in FIG. 2(b), the scene shown in the captured image of (a) is represented by these nodes and links.
- FIGS. 2 (a) and 2 (b) do not show facilities existing near the intersection Pi.
- the system controller 20 identifies the current position based on the detection signals of the GPS receiver 18 and the self-supporting positioning device 10, refers to the map data, and identifies the intersection Pi that is the guide point from the current position. Then, the system controller 20 refers to the landmark information DB based on the identification number of the node NPi indicating the intersection Pi, and acquires the landmark information IL associated with the intersection Pi. In FIG. 2(b), the system controller 20 acquires, as the landmark information IL, information such as the name “convenience store A” of the facility associated with the intersection Pi and the image “ImgA”, which is the logo mark of the facility “convenience store A”. Here, “A” represents a predetermined character string. The system controller 20 stores the landmark information IL in a primary storage area such as the RAM 24.
- the system controller 20 determines whether or not the landmark indicated by the landmark information IL exists from the captured image including the intersection Pi in the display range by image recognition by pattern matching.
- a captured image that is a target for pattern matching is referred to as a “target image Itag”.
- FIGS. 3A and 3B show an example of the target image Itag in which the intersection Pi that is a guide point is captured.
- in FIGS. 3(a) and 3(b), a feature “Ft1”, which is a signboard bearing the facility name of the facility “Convenience Store A”, and a feature “Ft2”, which includes the image ImgA serving as the logo mark of the facility “Convenience Store A”, appear in the target image Itag.
- the system controller 20 compares the image indicating the facility name “convenience store A” specified from the landmark information IL and the image ImgA, used as templates, with ranges of a predetermined size in the target image Itag shown in FIGS. 3(a) and 3(b). In other words, the system controller 20 defines a range in the target image Itag to be compared with the template (also referred to as the “search range”) and calculates the similarity between the template and the search range. A method for determining the search range will be described later.
- the system controller 20 determines that the landmark indicated by the template exists in the target image Itag when the similarity is equal to or greater than a predetermined value.
- the predetermined value is determined in advance, for example based on experiments, as the lower limit of the similarity at which the feature in the captured image is determined to match the landmark information IL.
- the system controller 20 changes the search range and calculates the similarity between the changed search range and the template.
- for example, when the image indicating the facility name is used as a template, the system controller 20 determines that the similarity is equal to or greater than the predetermined value when the template is compared with the search range “W1” as shown in FIG. 3(a). Further, when the image ImgA is used as a template, the system controller 20 determines that the similarity is equal to or greater than the predetermined value when the template is compared with the search range “W2” as shown in FIG. 3(b).
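The comparison of a template with a search range can be sketched as below. This is a hedged toy model (cell-equality over small grids with a fixed threshold), not the patent's actual pattern-matching method; a real system would compare pixel data, for example by normalized cross-correlation.

```python
# Toy stand-in for the template/search-range comparison: two equal-size 2-D
# grids are scored by the fraction of matching cells, and a match is declared
# when the score reaches a predetermined threshold.

def similarity(search_range, template):
    """Fraction of identical cells between two equally sized 2-D grids."""
    total = sum(len(row) for row in template)
    same = sum(a == b
               for row_s, row_t in zip(search_range, template)
               for a, b in zip(row_s, row_t))
    return same / total

def matches(search_range, template, threshold=0.8):
    """True when the similarity is equal to or greater than the threshold."""
    return similarity(search_range, template) >= threshold

template = [[1, 0], [0, 1]]
found = matches([[1, 0], [0, 1]], template)  # identical grids: a match
```

The threshold plays the role of the experimentally determined predetermined value described above.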
- next, a method for determining the search range in a captured image will be described with reference to FIGS. 4(a) and 4(b), using two specific examples: a first example and a second example.
- FIG. 4A is a diagram conceptually showing a search range setting method in the first example.
- the ranges “W10” to “W12” in FIG. 4(a) show, in this order, the range formed by combining the search ranges that have already been searched (also referred to as the “searched range”).
- in the first example, the system controller 20 expands the searched range evenly in the vertical and horizontal directions, as indicated by the arrows “Y1” to “Y4”, with the intersection Pi as the center. That is, the system controller 20 sets the search range in order from positions close to the center of the intersection Pi, based on the assumption that a landmark normally exists at a position close to the intersection. Thereby, the system controller 20 can detect a feature matching the landmark information IL from the captured image at an early stage.
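The first strategy, even expansion around the intersection center, can be sketched as a sequence of growing boxes. The coordinates, step size, and helper name are illustrative assumptions.

```python
# Sketch of the first search strategy: the searched range grows evenly
# up/down/left/right from the intersection center, so image regions near the
# intersection are examined before regions farther away.

def expanding_ranges(center, step, count, bounds):
    """Yield (left, top, right, bottom) boxes growing evenly around center,
    clipped to the image bounds (width, height)."""
    cx, cy = center
    w, h = bounds
    for i in range(1, count + 1):
        r = i * step
        yield (max(cx - r, 0), max(cy - r, 0), min(cx + r, w), min(cy + r, h))

boxes = list(expanding_ranges(center=(50, 40), step=10, count=3, bounds=(100, 80)))
# each box contains the previous one, so near-center pixels are searched first
```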
- FIG. 4B is a diagram conceptually illustrating a search range setting method in the second example.
- the ranges “W13” to “W15” in FIG. 4B show the searched ranges in time series in this order.
- the system controller 20 expands the searched range mainly in the direction of the arrow “Y5” where the landmark is located.
- the system controller 20 expands the searched range from the position of the intersection Pi toward the landmark position.
- the system controller 20 can detect from the captured image a feature that matches the landmark indicated by the landmark information IL at an early stage.
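The second strategy, expanding the searched range from the intersection toward a landmark whose position is known, can be sketched as sampling search-range centers along that line. The geometry and names are illustrative assumptions, not the patent's implementation.

```python
# Sketch of the second search strategy: when the landmark's position is known,
# search-range centers march from the intersection toward the landmark, so the
# likely location is reached with few comparisons.

def directional_centers(intersection, landmark, steps):
    """Sample points along the line from the intersection to the landmark."""
    (x0, y0), (x1, y1) = intersection, landmark
    return [(x0 + (x1 - x0) * i / steps, y0 + (y1 - y0) * i / steps)
            for i in range(steps + 1)]

centers = directional_centers((50, 40), (90, 40), steps=4)
# centers advance from the intersection (50, 40) toward the landmark (90, 40)
```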
- when the system controller 20 detects a feature that matches the landmark indicated by the landmark information IL from the captured image as a result of the above-described processing, it performs guidance at the guidance point using the landmark information IL.
- for example, after detecting the features Ft1 and Ft2 that match the landmark indicated by the landmark information IL at the intersection Pi, the system controller 20 performs voice guidance to turn right or left at the intersection Pi using “convenience store A” as a mark, and highlights one or both of the features Ft1 and Ft2 in the captured image by surrounding them with an outer frame.
- in this way, when performing guidance, the system controller 20 confirms whether the landmark indicated by the landmark information IL stored in the data storage unit 36 actually exists before using the landmark information IL for guidance. As a result, the system controller 20 can prevent the user from being confused by guidance that relies on a landmark which no longer exists.
- FIG. 5 is an example of a flowchart showing a processing procedure of landmark detection processing.
- the system controller 20 executes the flowchart shown in FIG. 5 when, for example, it is determined that the vehicle has approached the guidance point within a predetermined distance during route guidance.
- the system controller 20 specifies a guide point (step S101). Specifically, the system controller 20 refers to the map data and specifies the next guide point to pass based on the current position and the route scheduled to travel to the destination. Then, the system controller 20 acquires landmark information IL corresponding to the guidance point from the landmark information DB (step S102). At this time, the system controller 20 stores the landmark information IL in the primary storage area.
- the system controller 20 stores the target image Itag where the guide point is photographed in a memory such as the data storage unit 36 (step S103).
- the system controller 20 sets a landmark indicated by one landmark information IL as a template (step S104).
- the system controller 20 determines whether or not the position of the landmark can be specified (step S105).
- when the position of the landmark can be specified (step S105; Yes), the system controller 20 specifies the search direction (step S106). Specifically, the system controller 20 sets the search direction to the direction from the guidance point toward the location of the facility.
- when the system controller 20 determines that the position of the landmark cannot be specified (step S105; No), it does not specify a search direction.
- the system controller 20 specifies a search range based on the position of the guide point (step S107).
- the system controller 20 may determine the position of the guide point in the target image Itag in advance, or may specify the position of the guide point in the target image Itag by referring to a predetermined map or formula based on the distance between the current position and the guide point. The above-described map and the like are determined in advance based on, for example, experiments.
- the system controller 20 calculates the similarity between the search range and the template (step S108).
- when the similarity is equal to or greater than the predetermined value (step S109; Yes), the system controller 20 provides guidance based on the landmark information IL corresponding to the template (step S110).
- as described above, the system controller 20 provides guidance using only the landmark information IL of landmarks that actually appear in the captured image, so that the user is not confused even when a facility corresponding to the landmark information IL no longer exists.
- when the similarity is less than the predetermined value in step S109, the system controller 20 determines whether or not the entire target image Itag is covered by the searched range (step S111).
- when the entire target image Itag is within the searched range (step S111; Yes), the system controller 20 determines whether other landmark information IL exists (step S112).
- otherwise (step S111; No), the search range is changed (step S114), and the processing from step S108 is executed again.
- when the system controller 20 determines in step S112 that other landmark information IL exists (step S112; Yes), it sets the landmark indicated by that landmark information IL as a template (step S104) and performs the processing from step S105.
- when the system controller 20 determines that there is no other landmark information IL (step S112; No), it provides guidance without using the landmark information IL (step S113). The processing in step S113 will be described in detail in a later section.
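The flowchart of steps S104 to S114 can be condensed into two nested loops: for each landmark's template, scan search ranges until a match is found or the image is exhausted, and fall back to landmark-free guidance otherwise. This sketch is a hedged restatement of the flow; the helper names and the toy similarity function are hypothetical, not the patent's code.

```python
# Sketch of the landmark detection loop (steps S104-S114 described above).
def detect_landmark(templates, search_ranges, similarity, threshold=0.8):
    """Return the first template matching any range, else None (S113 path)."""
    for template in templates:        # S104/S112: try each landmark IL
        for rng in search_ranges:     # S107/S114: move the search range
            if similarity(rng, template) >= threshold:   # S108/S109
                return template       # S110: guide using this landmark
    return None                       # S113: guide without landmark IL

# toy similarity: exact equality scores 1.0, anything else 0.0
sim = lambda rng, tpl: 1.0 if rng == tpl else 0.0
found = detect_landmark(["logoA"], ["sign", "logoA"], sim)
```

Returning `None` corresponds to branching into the guidance examples that do not rely on stored landmark information.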
- in the first guidance example, the system controller 20 detects, from the target image Itag, another feature corresponding to a landmark other than the landmark indicated by the landmark information IL (hereinafter also referred to as the “other feature Fex”).
- when the system controller 20 can detect the other feature Fex, it performs guidance using the other feature Fex.
- FIG. 6(a) illustrates the target image Itag in the case where a feature “Ft3”, a signboard of the convenience store A that is not registered in the landmark information DB as landmark information IL, exists in the vicinity of the intersection Pi serving as the guide point.
- in this case, the system controller 20 again specifies a search range based on the position of the intersection Pi and, by detecting character strings and the like, detects the presence of the feature Ft3 within the search range “W3”.
- then, the system controller 20 displays the captured image to the user with the feature Ft3 emphasized, and performs voice guidance to the effect that the vehicle should turn left or right at the feature Ft3. For example, at the time of guidance, the system controller 20 superimposes the broken-line frame of the search range W3 on the captured image and blinks it, and also performs voice guidance such as “Turn right xx meters ahead. The blinking point is a landmark.”
- FIG. 6B is an example of a flowchart showing a processing procedure of the first guidance example.
- the system controller 20 executes the process of the flowchart shown in FIG. 6B when the process proceeds to step S113 in FIG.
- first, the system controller 20 searches the target image Itag for another feature Fex (step S201). Specifically, the system controller 20 searches the target image Itag for a feature, such as a signboard, that can serve as a landmark.
- If the system controller 20 determines that another feature Fex has been detected (step S202; Yes), it highlights the other feature Fex when guiding the guide point (step S203). For example, when displaying the captured image on the display unit 40, the system controller 20 causes the outer frame of the other feature Fex to blink. Further, preferably, the system controller 20 notifies the user by voice guidance that the highlighted part is a landmark.
- On the other hand, if the system controller 20 cannot detect another feature Fex (step S202; No), it performs guidance without using a landmark (step S204). Specifically, in this case, the system controller 20 executes the fourth guidance example or the fifth guidance example described later.
- In this way, even when the landmark information IL cannot be used, the system controller 20 enables the user to clearly identify the guide point by using another landmark.
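The flow of steps S201 to S204 can be condensed into a short sketch. The detector, highlighter, and fallback callbacks are stand-ins for the system controller's actual routines, which the patent does not detail:

```python
# Minimal sketch of the first guidance example (steps S201-S204).
# All callbacks are hypothetical stand-ins for the controller's routines.

def first_guidance_flow(detect_other_feature, highlight, fallback_guidance):
    fex = detect_other_feature()      # step S201: search Itag for another feature
    if fex is not None:               # step S202: was one detected?
        highlight(fex)                # step S203: emphasise it during guidance
        return ("highlighted", fex)
    fallback_guidance()               # step S204: guide without any landmark
    return ("fallback", None)

log = []
result = first_guidance_flow(
    detect_other_feature=lambda: "Ft3",
    highlight=lambda f: log.append(f"blink {f}"),
    fallback_guidance=lambda: log.append("no-landmark guidance"),
)
print(result, log)  # ('highlighted', 'Ft3') ['blink Ft3']
```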
- In the second guidance example, in addition to the first guidance example, the system controller 20 refers to a database (hereinafter also referred to as the “feature DB”) that associates an image of a feature, such as a facility signboard, with information on the facility name or other attributes indicated by the feature. If the detected other feature Fex matches an image registered in the feature DB, the system controller 20 performs guidance using the information in the feature DB.
- FIG. 7A is a diagram conceptually showing the process of the second guidance example.
- The system controller 20 detects the feature Ft3 from the target image Itag and determines whether the feature Ft3 matches an image registered in the feature DB. For example, as in the method of comparing the search range with the template described for the landmark detection process, the system controller 20 determines that the images match when the similarity between the image of the feature Ft3 and an image registered in the feature DB is equal to or greater than a predetermined value.
- When the system controller 20 determines that an image matching the image of the feature Ft3 exists in the feature DB, it acquires from the feature DB the facility name “convenience store A” corresponding to the feature Ft3, which is stored in association with the matched image. Then, the system controller 20 performs guidance using the information obtained from the feature DB. For example, as in the first guidance example, the system controller 20 highlights the feature Ft3 in the captured image and performs voice guidance such as “xx meters ahead, turn right. Convenience store A is a landmark”.
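The feature DB lookup just described can be sketched as a thresholded best-match search. A toy similarity over equal-length grayscale vectors stands in for the real image comparison; the DB layout and the 0.8 threshold are assumptions for illustration:

```python
# Hedged sketch of the second guidance example's DB lookup: compare the
# detected feature image against registered images and accept the best
# match whose similarity clears a threshold (all values illustrative).

def similarity(a, b):
    """1.0 for identical pixel vectors, falling toward 0 as they differ."""
    diff = sum(abs(x - y) for x, y in zip(a, b)) / len(a)
    return max(0.0, 1.0 - diff / 255.0)

def lookup_feature_db(feature_img, feature_db, threshold=0.8):
    best_name, best_score = None, 0.0
    for entry in feature_db:
        score = similarity(feature_img, entry["image"])
        if score >= threshold and score > best_score:
            best_name, best_score = entry["name"], score
    return best_name  # e.g. "convenience store A", or None when no match

db = [{"name": "convenience store A", "image": [10, 200, 30, 40]}]
print(lookup_feature_db([12, 198, 33, 41], db))  # matches convenience store A
```

When the lookup returns a facility name, the controller can speak it as in the guidance phrase above; when it returns `None`, processing falls through to the later examples.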
- FIG. 7B is an example of a flowchart showing a processing procedure of the second guidance example.
- The flowchart of FIG. 7B shows a processing procedure combining the first guidance example and the second guidance example.
- the system controller 20 executes the process of the flowchart shown in FIG. 7B when the process proceeds to step S113 in FIG.
- First, the system controller 20 searches the target image Itag for another feature Fex (step S301). If the system controller 20 determines that another feature Fex has been detected (step S302; Yes), it determines whether an image of the other feature Fex is registered in the feature DB (step S303). When the system controller 20 determines that the image of the other feature Fex is registered in the feature DB (step S303; Yes), it performs guidance using the information in the feature DB (step S304). For example, the system controller 20 highlights the other feature Fex in the captured image and performs voice guidance using the facility name corresponding to the other feature Fex acquired from the feature DB.
- On the other hand, when the system controller 20 determines in step S303 that the image of the other feature Fex is not registered in the feature DB (step S303; No), it highlights the other feature Fex at the time of guidance, as in the first guidance example (step S305).
- If the system controller 20 cannot detect another feature Fex (step S302; No), it performs guidance without using a landmark (step S306). Specifically, in this case, the system controller 20 executes the fourth guidance example or the fifth guidance example described later.
- In this way, by using the feature DB, the system controller 20 enables the user to identify the guide point based on more information than in the first guidance example.
- In the third guidance example, in addition to the first or second guidance example, the system controller 20 performs guidance using the character or character string indicated by the other feature Fex. For example, in the examples of FIG. 6A and FIG. 7A, the system controller 20 recognizes the character string “A” displayed in the image of the feature Ft3 and performs voice guidance such as “xx meters ahead, turn right. A is a landmark”.
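The character-based guidance above can be sketched as a small phrase builder. `read_text` is a stand-in for an OCR step the patent leaves unspecified, and the English phrase template is an assumption:

```python
# Sketch of the third guidance example: if a string such as "A" can be read
# from the feature image, weave it into the spoken instruction; otherwise
# fall back to a plain instruction. `read_text` is a hypothetical OCR hook.

def compose_guidance(distance_m, turn, read_text, feature_img):
    text = read_text(feature_img)
    base = f"{distance_m} meters ahead, turn {turn}."
    if text:
        return f"{base} {text} is a landmark."
    return base  # no recognisable characters: plain instruction only

msg = compose_guidance(100, "right", lambda img: "A", object())
print(msg)  # "100 meters ahead, turn right. A is a landmark."
```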
- FIG. 8 is an example of a flowchart showing a processing procedure of the third guidance example.
- the flowchart of FIG. 8 shows a processing procedure combining the first guidance example to the third guidance example.
- the system controller 20 executes the process of the flowchart shown in FIG. 8 when the process proceeds to step S113 of FIG.
- the system controller 20 searches for another feature Fex from the target image Itag (step S401).
- When another feature Fex is detected, the system controller 20 determines whether an image of the other feature Fex exists in the feature DB (step S403). If the system controller 20 determines that the image of the other feature Fex exists in the feature DB (step S403; Yes), it performs guidance using the information in the feature DB (step S404).
- When the system controller 20 determines in step S403 that the image of the other feature Fex does not exist in the feature DB (step S403; No), it determines whether character recognition of the other feature Fex is possible (step S405). When the system controller 20 determines that character recognition of the other feature Fex is possible (step S405; Yes), it performs guidance using the recognized characters (step S406). For example, the system controller 20 highlights the other feature Fex in the captured image and performs voice guidance using the recognized characters.
- When the system controller 20 determines in step S405 that character recognition is not possible (step S405; No), it highlights the other feature Fex when guiding the guide point, as in the first guidance example (step S407).
- If no other feature Fex can be detected, the system controller 20 performs guidance without using a landmark (step S408). Specifically, in this case, the system controller 20 executes the fourth guidance example or the fifth guidance example described later.
- In this way, by recognizing the characters of the other feature Fex, the system controller 20 enables the user to identify the guide point based on more information than in the first guidance example.
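The combined cascade of FIG. 8 (steps S401 to S408) can be sketched as a chain of fallbacks: try the feature DB, then character recognition, then plain highlighting, and finally landmark-free guidance. All callbacks here are hypothetical stand-ins:

```python
# Sketch of the FIG. 8 cascade (steps S401-S408); callbacks are stand-ins
# for the controller's routines, which the patent does not detail.

def third_example_cascade(detect, db_lookup, ocr, highlight, no_landmark):
    fex = detect()                       # step S401: search for another feature
    if fex is None:
        no_landmark()                    # step S408: guide without a landmark
        return "no-landmark"
    name = db_lookup(fex)                # steps S403/S404: feature DB lookup
    if name is not None:
        return f"db:{name}"
    text = ocr(fex)                      # steps S405/S406: character recognition
    if text is not None:
        return f"ocr:{text}"
    highlight(fex)                       # step S407: highlight only
    return "highlight"

r = third_example_cascade(
    detect=lambda: "Ft3",
    db_lookup=lambda f: None,   # not registered in the feature DB
    ocr=lambda f: "A",          # but its text can be read
    highlight=lambda f: None,
    no_landmark=lambda: None,
)
print(r)  # "ocr:A"
```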
- In the fourth guidance example, in addition to or instead of the first to third guidance examples, the system controller 20 detects an object generally provided at an intersection for traffic control, such as a traffic light, an intersection name signboard, or a direction signboard (guide sign) (also referred to as a “traffic control object”), and performs guidance using the traffic control object.
- FIG. 9A shows an example of a target image Itag on which a feature “Ft4” corresponding to a traffic light is displayed.
- the system controller 20 designates a predetermined search range, and calculates the similarity between the traffic light template and the search range.
- The above-mentioned traffic light template is stored in advance in a memory such as the data storage unit 36.
- When the search range “W4” is compared with the traffic light template, the system controller 20 determines that the similarity is equal to or greater than a predetermined value. In this case, the system controller 20 highlights, for example, the feature Ft4 and performs voice guidance such as “xx meters ahead, turn right. The traffic light is a landmark”.
- The system controller 20 may further detect whether the traffic light is red, yellow, or green, and perform voice guidance using the state of the traffic light. For example, when the detected traffic light is red, the system controller 20 provides voice guidance such as “xx meters ahead, turn right. The red light is a landmark”.
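A very rough sketch of the traffic-light state check: with the three lamp regions already located, report the lamp whose region is brightest. Real systems would use colour segmentation; this layout-based stand-in only illustrates the idea and all values are invented:

```python
# Illustrative stand-in for traffic-light state detection: given pixel
# intensity samples from the three lamp regions, the lit lamp is assumed
# to be the brightest one.

def light_state(red_region, yellow_region, green_region):
    def brightness(pixels):
        return sum(pixels) / len(pixels)
    states = {
        "red": brightness(red_region),
        "yellow": brightness(yellow_region),
        "green": brightness(green_region),
    }
    return max(states, key=states.get)

# A lit red lamp dominates the two dark lamps.
print(light_state([250, 240, 255], [20, 30, 25], [15, 20, 10]))  # "red"
```

The returned state would then be substituted into the guidance phrase, e.g. “The red light is a landmark”.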
- FIG. 9B shows an example of the target image Itag on which a feature “Ft5”, corresponding to an intersection name signboard reading “B” (“B” is a character string), is displayed.
- The system controller 20 determines that the similarity is equal to or greater than a predetermined value. Then, the system controller 20 highlights the feature Ft5, for example, and performs voice guidance such as “xx meters ahead, turn right. B is a landmark” or “xx meters ahead, turn right at B”.
- FIG. 9C shows an example of the target image Itag on which the feature “Ft6” corresponding to the direction signboard is displayed.
- When the search range “W6” is compared with the direction signboard template, the system controller 20 determines that the similarity is equal to or greater than a predetermined value. Then, the system controller 20 highlights the feature Ft6, for example, and performs voice guidance such as “xx meters ahead, turn right. The direction signboard (guide sign) is a landmark”.
- FIG. 10 is an example of a flowchart showing a processing procedure of the fourth guidance example.
- the system controller 20 executes the process of the flowchart shown in FIG. 10 when the process proceeds to step S113 or when the process proceeds to step S204, step S306, or step S408.
- the system controller 20 acquires one image of the traffic control object from the database storing the image of the traffic control object (step S501). Then, the system controller 20 sets the image of the traffic control object as a template (step S502).
- the system controller 20 determines a search range from the target image Itag based on the position of the guide point (step S503). Then, the system controller 20 calculates the similarity between the search range and the template (step S504). When the similarity is equal to or higher than the predetermined value (step S505; Yes), the system controller 20 performs guidance based on information on the traffic control target corresponding to the template (step S506). For example, the system controller 20 highlights the search range and performs voice guidance using the name of the traffic control object.
- When the similarity is less than the predetermined value in step S505 (step S505; No), the system controller 20 determines whether the entire target image Itag has been searched (step S507). When the entire target image Itag has been searched (step S507; Yes), the system controller 20 determines whether an image of another traffic control object exists in the database (step S508). On the other hand, when the system controller 20 determines that the entire target image Itag has not yet been searched (step S507; No), it changes the search range (step S510) and executes the processing from step S504 again.
- When the system controller 20 determines in step S508 that an image of another traffic control object exists (step S508; Yes), the process returns to step S501.
- When the system controller 20 determines that no image of another traffic control object exists (step S508; No), it performs guidance based on another method (step S509). Specifically, in this case, the system controller 20 performs guidance based on the fifth guidance example or the like.
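The FIG. 10 loop (steps S501 to S510) amounts to a double iteration: for each stored traffic-control template, slide the search range across the target image until a match clears the threshold; if every template fails, fall back. The scoring function and threshold below are assumptions standing in for the real template comparison:

```python
# Sketch of the FIG. 10 loop (steps S501-S510). Matching is abstracted
# behind a caller-supplied scoring function; the 0.8 threshold is assumed.

def fourth_example_loop(templates, windows, score, threshold=0.8):
    for name, template in templates:              # steps S501/S502 and S508
        for window in windows:                    # steps S503/S510
            if score(template, window) >= threshold:   # steps S504/S505
                return name                       # step S506: guide with it
    return None                                   # step S508; No -> step S509

templates = [("traffic light", "TL"), ("direction signboard", "DS")]
windows = ["w1", "w2", "w3"]
found = fourth_example_loop(
    templates, windows,
    score=lambda t, w: 0.9 if (t, w) == ("DS", "w2") else 0.1,
)
print(found)  # "direction signboard"
```

Returning `None` corresponds to proceeding to step S509, the fifth guidance example.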
- In this way, the system controller 20 can guide the user based on a traffic control object, enabling the user to identify the guide point at a wide variety of guidance points.
- In the fifth guidance example, after identifying the intersection serving as the guidance point, the system controller 20 performs guidance based on information associated with that intersection in the map data (hereinafter also simply referred to as “related information”).
- FIG. 11A is a diagram showing an outline of the fifth guidance example.
- The system controller 20 extracts the related information of the intersection Pi from the map data based on the identification number of the node NPi of the intersection Pi.
- For example, the system controller 20 acquires from the map data the intersection name “B” of the intersection Pi and the direction names “C”, “D”, and “E” corresponding to the destinations of the roads branching at the intersection Pi, such as the Shibuya direction. Here, “B” to “E” denote characters or character strings.
- The system controller 20 uses the intersection name “B” or the direction names “C” to “E” acquired from the map data when guiding the direction to travel at the intersection Pi. Specifically, the system controller 20 decides whether to use the intersection name “B” or the direction names “C” to “E” according to a priority predetermined for each type of related information. When using the intersection name “B”, the system controller 20 performs voice guidance such as “xx meters ahead, turn right at B”. When using the direction names “C” to “E”, the system controller 20 performs voice guidance such as “xx meters ahead, direction E, turn right”.
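The priority-based choice between related-information types can be sketched as below. The priority order, the phrase templates, and the choice of which direction name to speak are all assumptions; the patent only says a predetermined per-type priority is used:

```python
# Sketch of the fifth example's selection: pick the highest-priority kind
# of related information that is actually present (priority order assumed).

PRIORITY = ["intersection_name", "direction_names"]  # assumed ordering

def choose_related_info(related):
    for kind in PRIORITY:
        value = related.get(kind)
        if value:
            return kind, value
    return None, None

def phrase(distance_m, turn, related):
    kind, value = choose_related_info(related)
    if kind == "intersection_name":
        return f"{distance_m} meters ahead, turn {turn} at {value}."
    if kind == "direction_names":
        # which branch name to speak is an assumption; "E" in the example
        return f"{distance_m} meters ahead, direction {value[-1]}, turn {turn}."
    return f"{distance_m} meters ahead, turn {turn}."

info = {"intersection_name": "B", "direction_names": ["C", "D", "E"]}
print(phrase(100, "right", info))  # "100 meters ahead, turn right at B."
```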
- FIG. 11B is an example of a flowchart showing a processing procedure of the fifth guidance example.
- the system controller 20 executes the process of the flowchart shown in FIG. 11B when the process proceeds to step S113 or when the process proceeds to step S204, step S306, step S408, or step S509.
- the system controller 20 acquires the relevant information of the intersection Pi from the map data (step S601). Next, the system controller 20 determines the related information used for guidance among the acquired related information according to the priority (step S602). Then, the system controller 20 provides guidance based on the determined related information (step S603).
- Instead of or in addition to the guidance examples described above, the system controller 20 may perform guidance using information on the direction to travel at the guidance point, without using information, such as a landmark, for identifying the guidance point. For example, in this case, the system controller 20 performs voice guidance such as “xx meters ahead, turn right”. In addition to or instead of this, the system controller 20 may display the captured image with an arrow indicating the traveling direction superimposed on it. In this case, therefore, the system controller 20 performs guidance without using any information for identifying the guidance point, such as a landmark.
- The system controller 20 may also notify the user, by voice or display, that a feature corresponding to the landmark indicated by the landmark information IL was not detected in the captured image.
- For example, the system controller 20 provides voice guidance such as “Convenience store A, which serves as a landmark, could not be detected”. In this way, the system controller 20 can notify the user that a landmark that once existed no longer exists.
- In the first guidance example, the system controller 20 designates the landmark image indicated by the landmark information IL as a template, calculates its similarity with the search range, and determines whether the similarity is equal to or greater than a predetermined value. Instead, when the landmark information IL is a character or character string, the system controller 20 may read the character or character string from the search range and determine whether these characters or character strings match. Similarly, in the second guidance example and the like, instead of comparing the template and search-range images with each other, the system controller 20 may determine whether a character or character string identical to a predetermined character or character string stored in the feature DB exists in the search range.
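The string-based variant just described reduces to reading a string from the search range and testing it for equality with the registered landmark string. `read_string` is a hypothetical OCR hook:

```python
# Sketch of the string-matching alternative to image similarity:
# `read_string` is a stand-in for reading characters from the search range.

def landmark_string_matches(expected, read_string, search_range):
    found = read_string(search_range)
    return found is not None and found == expected

print(landmark_string_matches("A", lambda r: "A", "W3"))   # True
print(landmark_string_matches("A", lambda r: "B", "W3"))   # False
```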
- The present invention can be suitably applied to in-vehicle navigation devices, PNDs (Personal Navigation Devices), and other devices that perform guidance using images acquired from a camera.
Abstract
Description
FIG. 1 shows a schematic configuration of the navigation device 1. The navigation device 1 is mounted on a vehicle and connected to a camera 5. As shown in FIG. 1, the navigation device 1 includes a self-contained positioning device 10, a GPS receiver 18, a system controller 20, a disk drive 31, a data storage unit 36, a communication interface 37, a communication device 38, a display unit 40, an audio output unit 50, and an input device 60.
Next, the route guidance method executed by the system controller 20 will be described. Schematically, when the vehicle approaches a guidance point during route guidance, the system controller 20 determines whether the landmark indicated by the landmark information IL associated with the guidance point actually exists in the captured image. If the landmark exists, the system controller 20 performs guidance using the landmark; if it does not, the system controller 20 performs guidance based on information other than the landmark.
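The top-level decision just outlined can be sketched in a few lines. The callbacks are hypothetical stand-ins for the landmark detection and guidance routines described elsewhere in this document:

```python
# Sketch of the top-level route-guidance decision: use the registered
# landmark if it is visible in the captured image, otherwise fall back to
# other information (the first to fifth guidance examples, etc.).

def guide_at_point(landmark, landmark_visible, guide_with, guide_without):
    if landmark_visible(landmark):     # landmark detection on the captured image
        return guide_with(landmark)
    return guide_without()             # guidance based on other information

msg = guide_at_point(
    "convenience store A",
    landmark_visible=lambda lm: False,        # not found in the image
    guide_with=lambda lm: f"turn at {lm}",
    guide_without=lambda: "turn at the traffic light",
)
print(msg)  # "turn at the traffic light"
```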
First, the process of detecting the landmark indicated by the landmark information IL from the captured image (also referred to as the "landmark detection process") will be described. When the vehicle approaches within a predetermined distance of a guidance point, the system controller 20 reads the landmark information IL from the landmark information DB. This will be described with reference to FIG. 2.
FIG. 5 is an example of a flowchart showing the processing procedure of the landmark detection process. The system controller 20 executes the flowchart shown in FIG. 5, for example, when it determines during route guidance that the vehicle has approached within a predetermined distance of a guidance point.
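At the heart of the landmark detection process is a template-versus-search-range comparison with a similarity threshold. A minimal sketch, assuming grayscale patches of equal size and normalised cross-correlation as the similarity measure (the patent does not fix a particular measure):

```python
# Sketch of the template-vs-search-range comparison: normalised
# cross-correlation over equally sized grayscale patches, with detection
# declared above a fixed threshold (measure and threshold are assumptions).
import math

def ncc(template, patch):
    """Normalised cross-correlation of two equal-length pixel lists."""
    mt = sum(template) / len(template)
    mp = sum(patch) / len(patch)
    num = sum((t - mt) * (p - mp) for t, p in zip(template, patch))
    den = math.sqrt(sum((t - mt) ** 2 for t in template)
                    * sum((p - mp) ** 2 for p in patch))
    return num / den if den else 0.0

def landmark_detected(template, patch, threshold=0.9):
    return ncc(template, patch) >= threshold

tmpl = [10, 50, 90, 130]
print(landmark_detected(tmpl, [12, 52, 88, 131]))  # True: nearly identical
print(landmark_detected(tmpl, [130, 90, 50, 10]))  # False: reversed patch
```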
Next, a guidance method for the case where, as a result of the above landmark detection process, the feature corresponding to the landmark information IL cannot be recognized in the captured image will be described. Below, the first to fifth guidance examples, which are specific examples of guidance methods that do not use the landmark information IL, are described. These guidance examples may be executed in any combination.
In the first guidance example, the system controller 20 detects, from the target image Itag, another feature corresponding to a landmark other than the landmark indicated by the landmark information IL (hereinafter also referred to as the "other feature Fex"). Then, if the system controller 20 can detect the other feature Fex, it performs guidance using that other feature Fex.
In the second guidance example, in addition to the first guidance example, the system controller 20 refers to a database (also referred to as the "feature DB") that associates an image of a feature, such as a facility signboard, with information such as the name of the facility indicated by the feature; if the detected other feature Fex matches an image registered in the feature DB, the system controller 20 performs guidance using the information in the feature DB.
In the third guidance example, in addition to the first or second guidance example, the system controller 20 performs guidance using the character or character string indicated by the other feature Fex. For example, in the examples of FIG. 6A and FIG. 7A, the system controller 20 recognizes the character string "A" displayed in the image of the feature Ft3 and performs voice guidance such as "xx meters ahead, turn right. A is a landmark."
In the fourth guidance example, in addition to or instead of the first to third guidance examples, the system controller 20 detects an object generally provided at an intersection for traffic control, such as a traffic light, an intersection name signboard, or a direction signboard (guide sign) (also referred to as a "traffic control object"), and performs guidance using the traffic control object.
In the fifth guidance example, in addition to or instead of the first to fourth guidance examples, the system controller 20, after identifying the intersection serving as the guidance point, performs guidance based on information associated with that intersection in the map data (also simply referred to as "related information").
Instead of or in addition to the above guidance examples, the system controller 20 may perform guidance using information on the direction to travel at the guidance point, without using information, such as a landmark, for identifying the guidance point. For example, in this case, the system controller 20 performs voice guidance such as "xx meters ahead, turn right." In addition to or instead of this, the system controller 20 may display the captured image with an arrow indicating the traveling direction superimposed on it. In this case, therefore, the system controller 20 performs guidance without using any information for identifying the guidance point, such as a landmark.
In the first guidance example, the system controller 20 designates the landmark image indicated by the landmark information IL as a template, calculates its similarity with the search range, and determines whether the similarity is equal to or greater than a predetermined value. Instead, when the landmark information IL is a character or character string, the system controller 20 may read the character or character string from the search range and determine whether these characters or character strings match. Similarly, in the second guidance example and the like, instead of comparing the template and search-range images with each other, the system controller 20 may determine whether a character or character string identical to a predetermined character or character string stored in the feature DB exists in the search range.
10 Self-contained positioning device
12 GPS receiver
20 System controller
22 CPU
36 Data storage unit
38 Communication device
40 Display unit
44 Display
Claims (10)
- A navigation device comprising: acquisition means for acquiring an image captured by imaging means; storage means for storing landmark information relating to facilities; extraction means for extracting, when a mobile body moves along a route, a landmark corresponding to a guidance point from the landmark information stored in the storage means; guidance means for guiding the guidance point using the extracted landmark; and determination means for determining, at a point where a feature corresponding to the landmark information can be photographed, whether the feature can be recognized from the image acquired by the acquisition means, wherein, when the determination means determines that the feature cannot be recognized, the guidance means performs guidance while excluding the landmark corresponding to the feature from the landmarks extracted by the extraction means.
- The navigation device according to claim 1, wherein, when the determination means determines that the feature cannot be recognized, the determination means determines whether another feature corresponding to another landmark not extracted by the extraction means can be recognized, and when the determination means determines that the other feature has been recognized, the guidance means guides the guidance point using the other landmark.
- The navigation device according to claim 1 or 2, wherein, when the feature cannot be recognized, the determination means determines whether a new feature to be used for guidance can be extracted from the image, and when the determination means has been able to extract the new feature, the guidance means guides the guidance point using the new feature information.
- The navigation device according to claim 3, wherein the new feature is a traffic control object installed at the guidance point.
- The navigation device according to any one of claims 1 to 4, wherein the storage means stores related information of the guidance point, and when the determination means determines that the feature cannot be recognized, the guidance means guides the guidance point based on the related information of the guidance point other than the landmark information.
- The navigation device according to any one of claims 1 to 5, wherein, when the determination means determines that the feature cannot be recognized, the guidance means guides the guidance point based on information on the traveling direction at the guidance point.
- The navigation device according to any one of claims 1 to 6, wherein, when the determination means determines that the feature cannot be recognized, the guidance means provides guidance to the effect that the landmark extracted by the extraction means cannot be recognized.
- A control method executed by a navigation device having storage means for storing facility information relating to facilities, the control method comprising: an acquisition step of acquiring an image captured by imaging means; an extraction step of extracting, when a mobile body moves along a route, a landmark corresponding to a guidance point from the landmark information stored in the storage means; a guidance step of guiding the guidance point using the extracted landmark; and a determination step of determining, at a point where a feature corresponding to the landmark information can be photographed, whether the feature can be recognized from the image acquired in the acquisition step, wherein, when the determination step determines that the feature cannot be recognized, the guidance step performs guidance while excluding the landmark corresponding to the feature from the landmarks extracted in the extraction step.
- A program executed by a navigation device having storage means for storing landmark information relating to facilities, the program causing the navigation device to function as: acquisition means for acquiring an image captured by imaging means; extraction means for extracting, when a mobile body moves along a route, a landmark corresponding to a guidance point from the landmark information stored in the storage means; guidance means for guiding the guidance point using the extracted landmark; and determination means for determining, at a point where a feature corresponding to the landmark information can be photographed, whether the feature can be recognized from the image acquired by the acquisition means, wherein, when the determination means determines that the feature cannot be recognized, the guidance means performs guidance while excluding the landmark corresponding to the feature from the landmarks extracted by the extraction means.
- A storage medium storing the program according to claim 9.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/993,563 US20130261969A1 (en) | 2010-12-24 | 2010-12-24 | Navigation apparatus, control method, program, and storage medium |
PCT/JP2010/073326 WO2012086054A1 (ja) | 2010-12-24 | 2010-12-24 | ナビゲーション装置、制御方法、プログラム、及び記憶媒体 |
JP2011543943A JP4881493B1 (ja) | 2010-12-24 | 2010-12-24 | ナビゲーション装置、制御方法、プログラム、及び記憶媒体 |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2010/073326 WO2012086054A1 (ja) | 2010-12-24 | 2010-12-24 | ナビゲーション装置、制御方法、プログラム、及び記憶媒体 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012086054A1 true WO2012086054A1 (ja) | 2012-06-28 |
Family
ID=45851235
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2010/073326 WO2012086054A1 (ja) | 2010-12-24 | 2010-12-24 | ナビゲーション装置、制御方法、プログラム、及び記憶媒体 |
Country Status (3)
Country | Link |
---|---|
US (1) | US20130261969A1 (ja) |
JP (1) | JP4881493B1 (ja) |
WO (1) | WO2012086054A1 (ja) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014038464A (ja) * | 2012-08-15 | 2014-02-27 | Zenrin Datacom Co Ltd | 標識情報通知装置および標識情報通知方法 |
US20150071493A1 (en) * | 2013-09-11 | 2015-03-12 | Yasuhiro Kajiwara | Information processing apparatus, control method of the information processing apparatus, and storage medium |
WO2015059812A1 (ja) * | 2013-10-25 | 2015-04-30 | 三菱電機株式会社 | 移動支援装置及び移動支援方法 |
JP2017129451A (ja) * | 2016-01-20 | 2017-07-27 | 株式会社トヨタマップマスター | ナビゲーションシステム、poi提示方法、poi提示プログラム、記録媒体 |
JPWO2016203506A1 (ja) * | 2015-06-15 | 2017-09-14 | 三菱電機株式会社 | 経路案内装置及び経路案内方法 |
JP2020193871A (ja) * | 2019-05-28 | 2020-12-03 | 株式会社Nttドコモ | 情報処理装置及びプログラム |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5625987B2 (ja) * | 2011-02-16 | 2014-11-19 | アイシン・エィ・ダブリュ株式会社 | 案内装置、案内方法、および、案内プログラム |
US10126141B2 (en) * | 2016-05-02 | 2018-11-13 | Google Llc | Systems and methods for using real-time imagery in navigation |
US20180045530A1 (en) * | 2016-08-12 | 2018-02-15 | Blackberry Limited | System and method for generating an acoustic signal for localization of a point of interest |
US10853643B2 (en) | 2017-10-27 | 2020-12-01 | Rakuten, Inc. | Image extraction device, image extraction method, and image extraction program |
US10769428B2 (en) * | 2018-08-13 | 2020-09-08 | Google Llc | On-device image recognition |
US11972616B2 (en) * | 2018-11-20 | 2024-04-30 | Google Llc | Enhanced navigation instructions with landmarks under difficult driving conditions |
EP3822585A1 (en) * | 2019-11-15 | 2021-05-19 | Volkswagen AG | Vehicle navigation system and method for providing turn guidance for a driver of a vehicle |
JP2022178701A (ja) * | 2021-05-20 | 2022-12-02 | フォルシアクラリオン・エレクトロニクス株式会社 | ナビゲーション装置 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH09178497A (ja) * | 1995-12-26 | 1997-07-11 | Aisin Aw Co Ltd | 車両用ナビゲーション装置 |
JPH11337359A (ja) * | 1998-05-27 | 1999-12-10 | Fujitsu Ten Ltd | ナビゲーション装置 |
JP2003344084A (ja) * | 2002-05-24 | 2003-12-03 | Alpine Electronics Inc | ナビゲーション装置および交差点案内方法 |
JP2009162722A (ja) * | 2008-01-10 | 2009-07-23 | Pioneer Electronic Corp | 案内装置、案内方法、及び案内プログラム |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3749821B2 (ja) * | 1999-09-30 | 2006-03-01 | 株式会社東芝 | 歩行者用道案内システムおよび歩行者用道案内方法 |
US8060302B2 (en) * | 2009-03-31 | 2011-11-15 | Microsoft Corporation | Visual assessment of landmarks |
-
2010
- 2010-12-24 US US13/993,563 patent/US20130261969A1/en not_active Abandoned
- 2010-12-24 JP JP2011543943A patent/JP4881493B1/ja active Active
- 2010-12-24 WO PCT/JP2010/073326 patent/WO2012086054A1/ja active Application Filing
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH09178497A (ja) * | 1995-12-26 | 1997-07-11 | Aisin Aw Co Ltd | 車両用ナビゲーション装置 |
JPH11337359A (ja) * | 1998-05-27 | 1999-12-10 | Fujitsu Ten Ltd | ナビゲーション装置 |
JP2003344084A (ja) * | 2002-05-24 | 2003-12-03 | Alpine Electronics Inc | ナビゲーション装置および交差点案内方法 |
JP2009162722A (ja) * | 2008-01-10 | 2009-07-23 | Pioneer Electronic Corp | 案内装置、案内方法、及び案内プログラム |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014038464A (ja) * | 2012-08-15 | 2014-02-27 | Zenrin Datacom Co Ltd | 標識情報通知装置および標識情報通知方法 |
US20150071493A1 (en) * | 2013-09-11 | 2015-03-12 | Yasuhiro Kajiwara | Information processing apparatus, control method of the information processing apparatus, and storage medium |
US9378558B2 (en) * | 2013-09-11 | 2016-06-28 | Ricoh Company, Ltd. | Self-position and self-orientation based on externally received position information, sensor data, and markers |
WO2015059812A1 (ja) * | 2013-10-25 | 2015-04-30 | 三菱電機株式会社 | 移動支援装置及び移動支援方法 |
JPWO2015059812A1 (ja) * | 2013-10-25 | 2017-03-09 | 三菱電機株式会社 | 移動支援装置及び移動支援方法 |
US10082401B2 (en) | 2013-10-25 | 2018-09-25 | Mitsubishi Electric Corporation | Movement support apparatus and movement support method |
JPWO2016203506A1 (ja) * | 2015-06-15 | 2017-09-14 | 三菱電機株式会社 | 経路案内装置及び経路案内方法 |
JP2017129451A (ja) * | 2016-01-20 | 2017-07-27 | 株式会社トヨタマップマスター | ナビゲーションシステム、poi提示方法、poi提示プログラム、記録媒体 |
JP2020193871A (ja) * | 2019-05-28 | 2020-12-03 | 株式会社Nttドコモ | 情報処理装置及びプログラム |
JP7246254B2 (ja) | 2019-05-28 | 2023-03-27 | 株式会社Nttドコモ | 情報処理装置及びプログラム |
Also Published As
Publication number | Publication date |
---|---|
JP4881493B1 (ja) | 2012-02-22 |
JPWO2012086054A1 (ja) | 2014-05-22 |
US20130261969A1 (en) | 2013-10-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP4881493B1 (ja) | ナビゲーション装置、制御方法、プログラム、及び記憶媒体 | |
EP2728313B1 (en) | Method of displaying objects on a navigation map | |
JP4562471B2 (ja) | ナビゲーション装置及び進行方向案内方法 | |
JP4683380B2 (ja) | 車線変更案内装置 | |
JP6106495B2 (ja) | 検出装置、制御方法、プログラム及び記憶媒体 | |
JP2009162722A (ja) | 案内装置、案内方法、及び案内プログラム | |
JP2005326956A (ja) | 駐車場空きスペース案内装置及び駐車場空きスペース案内方法 | |
JP2006292656A (ja) | ナビゲーション装置および気象情報表示方法 | |
WO2013145146A1 (ja) | ナビゲーション装置、ナビゲーション方法及びナビゲーションプログラム | |
JP4619442B2 (ja) | 画像表示装置、表示制御方法、表示制御プログラムおよび記録媒体 | |
JP2009026164A (ja) | 車線認識装置及びナビゲーション装置 | |
JP2012137482A (ja) | ナビゲーション装置及び制御方法 | |
JP2009198508A (ja) | 経路案内装置 | |
JP2008286585A (ja) | 車載用ナビゲーション装置 | |
JP4817993B2 (ja) | ナビゲーション装置および誘導経路設定方法 | |
JP4274913B2 (ja) | 目的地検索装置 | |
JP2005321268A (ja) | ナビゲーション装置 | |
JP2008151754A (ja) | 経路誘導装置、経路誘導方法、経路誘導プログラムおよび記録媒体 | |
JP2010156815A (ja) | 地図情報管理装置、地図情報管理方法及び地図情報管理プログラム | |
JP2012063367A (ja) | 地物画像データ通知装置、地物画像データ通知方法及び地物画像データ通知プログラム | |
JP2014066595A (ja) | ナビゲーション装置 | |
JP2011179920A (ja) | 表示制御装置、表示制御方法、プログラム、および記録媒体 | |
JP4895123B2 (ja) | 地物画像データ変更通知装置及び地物画像データ変更通知プログラム | |
JP2007205983A (ja) | ナビゲーション装置 | |
KR101906436B1 (ko) | 이동체의 주행 관련 안내 방법, 전자 기기 및 컴퓨터 판독 가능한 기록 매체 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 2011543943 Country of ref document: JP |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 10861000 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13993563 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 10861000 Country of ref document: EP Kind code of ref document: A1 |