WO2007088915A1 - Route guidance device, route guidance method, route guidance program, and recording medium - Google Patents


Info

Publication number
WO2007088915A1
WO2007088915A1 PCT/JP2007/051658
Authority
WO
WIPO (PCT)
Prior art keywords
information
image
route guidance
feature
map
Prior art date
Application number
PCT/JP2007/051658
Other languages
French (fr)
Japanese (ja)
Inventor
Hiroaki Shibasaki
Original Assignee
Pioneer Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Pioneer Corporation filed Critical Pioneer Corporation
Publication of WO2007088915A1 publication Critical patent/WO2007088915A1/en

Classifications

    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B29/00Maps; Plans; Charts; Diagrams, e.g. route diagram
    • G09B29/10Map spot or coordinate position indicators; Map reading aids
    • G09B29/102Map spot or coordinate position indicators; Map reading aids using electrical means
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • G01C21/36Input/output arrangements for on-board computers
    • G01C21/3626Details of the output of route guidance instructions
    • G01C21/3647Guidance involving output of stored or live camera images or video streams
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/09Arrangements for giving variable traffic instructions
    • G08G1/0962Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0968Systems involving transmission of navigation instructions to the vehicle
    • G08G1/096805Systems involving transmission of navigation instructions to the vehicle where the transmitted instructions are used to compute a route
    • G08G1/096827Systems involving transmission of navigation instructions to the vehicle where the transmitted instructions are used to compute a route where the route is computed onboard
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/09Arrangements for giving variable traffic instructions
    • G08G1/0962Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0968Systems involving transmission of navigation instructions to the vehicle
    • G08G1/096833Systems involving transmission of navigation instructions to the vehicle where different aspects are considered when computing the route
    • G08G1/096844Systems involving transmission of navigation instructions to the vehicle where different aspects are considered when computing the route where the complete route is dynamically recomputed based on new data

Definitions

  • Route guidance device, route guidance method, route guidance program, and recording medium
  • the present invention relates to a route guidance device, a route guidance method, a route guidance program, and a recording medium for guiding a route along which a moving body moves using information on features on the route.
  • the use of the present invention is not limited to the above-described route guidance device, route guidance method, route guidance program, and recording medium.
  • a navigation device mounted on a vehicle or the like searches for a route to a destination point and performs route guidance along the searched route.
  • Route guidance is mainly performed by outputting guidance voices; at major points on the route, such as intersections, a guidance voice may be generated based on landmarks (features) around the point so that the user can easily recognize that point.
  • Such a navigation device includes, for example, a landmark detection unit that detects landmarks near a branch point, and a mark position determination unit that determines the position of the branch point as seen from the driver. It stores a landmark priority/effective time table that assigns priorities to the landmarks that can be checked within the time required for voice guidance, and an optimal landmark determination unit determines, based on this table, which of the landmarks detected by the landmark detection unit are to be voice-guided. Route guidance is then provided to the driver while the landmarks are voice-guided together with the position information (see, for example, Patent Document 1 below).
  • Patent Document 1 Japanese Unexamined Patent Publication No. 2000-131084
  • However, in the conventional technique described above, the landmark used for the voice guidance is not necessarily within the user's field of view at the time the voice guidance is output, which is an example of the problem. For example, if an obstacle such as a street signboard stands between the landmark and the vehicle in which the user rides, or if the field of view is blocked by a slope or a curve, the user cannot see the landmark used for the voice guidance.
  • To solve the problems described above, the route guidance device according to the invention of claim 1 comprises: image acquisition means for acquiring an image obtained by capturing the traveling direction from the vicinity of the driver's seat of a moving body; map acquisition means for acquiring map information around the current location of the moving body; identification means for identifying, based on the image acquired by the image acquisition means and the map information acquired by the map acquisition means, a feature that can be visually recognized by the driver of the moving body (hereinafter referred to as a visible feature); generation means for generating guidance information of the route along which the moving body moves, using information on the visible feature identified by the identification means; and output means for outputting the guidance information generated by the generation means.
  • The route guidance method according to the invention of claim 6 includes: an image acquisition step of acquiring an image around the current location of a moving body; a map acquisition step of acquiring map information around the current location of the moving body; an identification step of identifying, based on the image acquired in the image acquisition step and the map information acquired in the map acquisition step, a feature that can be seen by a passenger of the moving body (hereinafter referred to as a visible feature); a generation step of generating guidance information of the route along which the moving body moves, using information on the visible feature identified in the identification step; and an output step of outputting the guidance information generated in the generation step.
  • The route guidance program according to the invention of claim 7 causes a computer to execute the route guidance method according to claim 6.
  • The recording medium according to the invention of claim 8 is a computer-readable recording medium on which the route guidance program according to claim 7 is recorded.
  • FIG. 1 is a block diagram showing a functional configuration of a route guidance apparatus according to an embodiment.
  • FIG. 2 is a flowchart showing a procedure of route guidance processing by the route guidance device.
  • FIG. 3 is an explanatory view showing the vicinity of a dashboard of a vehicle in which a navigation device that is effective in the embodiment is installed.
  • FIG. 4 is a block diagram showing a hardware configuration of the navigation device.
  • FIG. 5 is a flowchart showing the procedure of route guidance processing based on landmarks extracted from map data.
  • FIG. 6 is a flowchart showing the procedure of route guidance processing based on landmarks extracted from images.
  • FIG. 1 is a block diagram showing a functional configuration of a route guidance apparatus according to an embodiment.
  • the route guidance device 100 is mounted on a moving body, and includes an image acquisition unit 101, a map acquisition unit 102, an identification unit 103, a generation unit 104, and an output unit 105.
  • the identification unit 103 includes a selection unit 103a and a determination unit 103b.
  • The image acquisition unit 101 acquires an image obtained by capturing the traveling direction from the vicinity of the driver's seat of the moving body.
  • The image acquisition unit 101 is, for example, a video camera installed on the dashboard of the moving body or near the rear-view mirror, and captures scenery that is almost the same as that seen by the driver of the moving body.
  • the map acquisition unit 102 acquires map information around the current location of the mobile object.
  • the area around the current location of the moving object is, for example, a range that is at least visible to the driver of the moving object.
  • the map acquisition unit 102 extracts, for example, map information around the current location of the moving object from map information used for route guidance to the destination location.
  • The identification unit 103 identifies, based on the image acquired by the image acquisition unit 101 and the map information acquired by the map acquisition unit 102, a feature that can be visually recognized by the driver of the moving body (hereinafter referred to as a visible feature). More specifically, the identification unit 103 includes the selection unit 103a and the determination unit 103b.
  • the selection unit 103a selects a feature that is a candidate for a visible feature (hereinafter, referred to as a candidate feature) from features whose information is included in the map information.
  • the determination unit 103b determines whether or not the candidate feature selected by the selection unit 103a is captured in the image acquired by the image acquisition unit 101. Then, the identification unit 103 identifies the candidate feature determined to be captured in the image by the determination unit 103b as a visible feature.
  • Here, the determination unit 103b extracts information on at least one of the name, trademark, and shape of the candidate feature from the map information, and may determine that the candidate feature is captured in the image if at least one of the extracted name, trademark, and shape is captured in the image.
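The determination described above can be sketched in Python. This is a minimal illustration, assuming hypothetical record layouts for the candidate feature and for attributes recognized in the camera image; it is not the patent's actual implementation.

```python
# Sketch of the identification unit (103): a candidate feature selected from
# the map information is judged "visible" when at least one of its name,
# trademark, or shape is also found among the objects recognized in the image.
# The dict keys and the detected-object records are illustrative assumptions.

def is_visible_feature(candidate, detected_objects):
    """candidate: dict with optional 'name', 'trademark', 'shape' keys taken
    from the map information; detected_objects: list of dicts with the same
    keys, recognized in the camera image."""
    for obj in detected_objects:
        for key in ('name', 'trademark', 'shape'):
            if candidate.get(key) and candidate[key] == obj.get(key):
                return True
    return False

def identify_visible_features(candidates, detected_objects):
    # Keep only the candidates judged to be captured in the image.
    return [c for c in candidates if is_visible_feature(c, detected_objects)]
```

For example, a candidate whose name matches a sign read from the image, or whose characteristic shape matches a detected building outline, would be kept as a visible feature.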
  • the generation unit 104 generates guidance information of a route along which the moving body moves, using information on the visible features identified by the identification unit 103.
  • Information on a visible feature is, for example, information such as the name of the visible feature and its distance from a traffic light or an intersection on the route along which the moving body moves.
  • guidance information is information that instructs the driver of a moving body in a direction to proceed or calls attention to movement.
  • the output unit 105 outputs the guidance information generated by the generation unit 104.
  • the output unit 105 outputs the guidance information by, for example, voice output or image output (message display or superimposed display on map information).
  • FIG. 2 is a flowchart showing a procedure of route guidance processing by the route guidance device.
  • The route guidance device 100 first acquires, by the image acquisition unit 101, an image of the traveling direction captured from near the driver's seat of the moving body (step S201).
  • the map acquisition unit 102 acquires map information around the current location of the moving object (step S202).
  • Next, the selection unit 103a selects a candidate feature from the features whose information is included in the map information acquired in step S202 (step S203). Then, the determination unit 103b determines whether the candidate feature is a visible feature (step S204); specifically, it determines whether the candidate feature selected by the selection unit 103a is captured in the image acquired by the image acquisition unit 101.
  • If the candidate feature is a visible feature (step S204: Yes), the generation unit 104 generates guidance information using information on the visible feature (step S205), the output unit 105 outputs the guidance information (step S206), and the processing according to this flowchart ends.
  • If the candidate feature is not a visible feature (step S204: No), the process returns to step S203 and the subsequent processing is repeated; in other words, a candidate feature is selected again and a visible feature is identified.
  • Note that a plurality of candidate features may be selected in step S203. In this case, the guidance information may be generated using, among the visible features, the one closest to the point to be guided (such as an intersection), the one occupying the largest range in the image, or the one most easily seen by the driver.
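One cycle of the flow in FIG. 2 can be sketched as follows. The four callables stand in for the functional units and are assumptions for illustration; a real device would repeat the cycle, re-selecting candidates when nothing visible is found.

```python
# Compact sketch of steps S201-S206: acquire image and map information,
# select candidate features, keep the first one judged visible, and return
# the generated guidance. All parameter names are illustrative stand-ins.

def route_guidance_cycle(capture_image, get_map_info, is_captured, make_guidance):
    image = capture_image()                    # S201: acquire image
    map_info = get_map_info()                  # S202: acquire map information
    for feature in map_info['features']:       # S203: select a candidate
        if is_captured(feature, image):        # S204: is it a visible feature?
            return make_guidance(feature)      # S205/S206: generate and output
    return None                                # no visible feature this cycle
```

Callers would loop this cycle as the vehicle moves, which mirrors the "return to step S203" branch in the flowchart.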
  • In the description above, the selection unit 103a selects a candidate feature from the features whose information is included in the map information, and the determination unit 103b determines whether the candidate feature selected by the selection unit 103a is captured in the image acquired by the image acquisition unit 101. Conversely, the selection unit 103a may select a candidate feature from the features captured in the image, and the determination unit 103b may determine whether information on the candidate feature selected by the selection unit 103a is included in the map information acquired by the map acquisition unit 102.
  • In this case, the determination unit 103b extracts information on at least one of the name, trademark, and shape of the candidate feature from the image, and determines that the map information contains information on the candidate feature if the map information contains information on at least one of the extracted name, trademark, and shape.
  • As described above, according to the route guidance device 100, the route along which the moving body moves is guided using information on features visible to the driver. Accordingly, the driver can actually see the features used in the guidance information, making the route guidance easier for the driver to follow.
  • For example, in urban areas with many buildings or on mountain roads with poor visibility, the features used for guidance are often not visible, so the driver may take a wrong route even though the guidance information has been output; moreover, in such urban areas and on mountain roads it is often difficult to recover once the route is wrong. According to the route guidance device 100, since route guidance is performed using features that can actually be seen, the possibility of taking a wrong route can be reduced. Furthermore, since the driver can easily understand the contents of the guidance, the driver can drive with composure and is encouraged to drive safely.
  • FIG. 3 is an explanatory diagram showing the vicinity of the dashboard of a vehicle in which the navigation device according to the embodiment is installed.
  • the navigation device 300 is installed on the dashboard of the vehicle.
  • the navigation apparatus 300 includes a main body M and a display unit (display) D.
  • the display unit D displays the current location of the vehicle, map information, current time, and the like.
  • the navigation apparatus 300 is connected to an in-vehicle camera 311 installed on the dashboard.
  • the in-vehicle camera 311 can change the direction of the lens and can take images inside and outside the vehicle.
  • the lens of the in-vehicle camera 311 is facing the inside of the car.
  • The in-vehicle camera 311 captures the same scenery as the driver's field of view from the driver's seat.
  • FIG. 4 is a block diagram showing the hardware configuration of the navigation device.
  • The navigation device 300 includes a CPU 401, a ROM 402, a RAM (memory) 403, a magnetic disk drive 404, a magnetic disk 405, an optical disk drive 406, an optical disk 407, an audio I/F (interface) 408, a microphone 409, a speaker 410, an input device 411, a video I/F 412, a camera 413, a display 414, a communication I/F 415, a GPS unit 416, and various sensors 417.
  • The components 401 to 417 are connected by a bus 420.
  • CPU 401 governs overall control of navigation device 300.
  • ROM 402 records programs such as a boot program, a communication program, a database creation program, and a data analysis program.
  • the RAM 403 is used as a work area for the CPU 401.
  • the magnetic disk drive 404 controls reading and writing of data to the magnetic disk 405 according to the control of the CPU 401.
  • the magnetic disk 405 records data written under the control of the magnetic disk drive 404.
  • As the magnetic disk 405, for example, an HD (hard disk) or an FD (flexible disk) can be used.
  • The optical disk drive 406 controls reading and writing of data to the optical disk 407 according to the control of the CPU 401.
  • the optical disk 407 is a detachable recording medium from which data is read according to the control of the optical disk drive 406.
  • A writable recording medium can also be used as the optical disc 407.
  • Besides the optical disc 407, the removable recording medium may be an MO, a memory card, or the like.
  • Map data includes background data representing features such as buildings, rivers, and the ground surface, and road shape data representing the shape of roads, and is rendered two-dimensionally or three-dimensionally on the display screen of the display 414. When the navigation device 300 is guiding a route, the map data and the current vehicle position acquired by the CPU 401 are displayed in an overlapping manner.
  • the background data further includes background shape data representing the shape of the background and background type data representing the type of the background.
  • The background shape data includes, for example, the coordinates of the representative point, polyline, polygon, and the like of each feature.
  • the background type data includes, for example, text data representing the name, address, and telephone number of the feature, and type data of the feature such as a building, a river, and the ground surface.
  • The background type data also includes, as facility information, information such as business hours and the presence or absence of a parking lot.
  • Facilities provided for a predetermined purpose, such as gas stations and convenience stores, are classified into genres according to their type, and the representative point and coordinate data of each feature are associated with its facility information and genre information.
  • The genres into which facilities are classified include, for example, parking lots and train stations, in addition to the gas stations and convenience stores mentioned above.
  • the road shape data further includes traffic condition data.
  • The traffic condition data includes, for example, the presence or absence of traffic lights and pedestrian crossings for each node, the presence or absence of expressway entrances and junctions, and, for each link, the length (distance), road width, traveling direction, and road type (highway, toll road, general road, etc.).
  • The traffic condition data also stores past congestion information obtained by statistically processing congestion records by season, day of the week, major holidays, and time of day.
  • As described later, the navigation device 300 obtains information on currently occurring congestion from road traffic information received by the communication I/F 415, and can also predict traffic conditions at a specified time using the past congestion information.
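The statistical processing of past congestion records can be sketched as follows. The table layout, the keying by day of week and hour, and the congestion levels are all illustrative assumptions; the patent only states that past records are statistically processed by season, day, holidays, and time.

```python
# Hypothetical sketch: average past congestion levels per (day, hour) slot,
# then look the slot up to predict the congestion at a specified time.
from statistics import mean

def build_congestion_table(records):
    """records: [(day_of_week, hour, congestion_level), ...]"""
    table = {}
    for day, hour, level in records:
        table.setdefault((day, hour), []).append(level)
    # Statistically process (here: average) the records per time slot.
    return {key: mean(levels) for key, levels in table.items()}

def predict_congestion(table, day, hour, default=0.0):
    # Slots with no past records fall back to a default level.
    return table.get((day, hour), default)
```

A real implementation would also key on season and holidays, as the text describes.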
  • The audio I/F 408 is connected to a microphone 409 for audio input and a speaker 410 for audio output.
  • Sound received by the microphone 409 is A/D converted in the audio I/F 408.
  • sound is output from the speaker 410.
  • the voice input from the microphone 409 can be recorded on the magnetic disk 405 or the optical disk 407 as voice data.
  • Examples of the input device 411 include a remote controller, a keyboard, a mouse, and a touch panel that are provided with a plurality of keys for inputting characters, numerical values, various instructions, and the like. Further, the input device 411 can be connected to another information processing terminal such as a digital camera or a mobile phone terminal to input / output data.
  • The video I/F 412 is connected to a camera 413 for video input (for example, the in-vehicle camera 311 in FIG. 3) and a display 414 for video output.
  • The video I/F 412 is composed of, for example, a graphic controller that controls the entire display 414, a buffer memory such as a VRAM (Video RAM) that temporarily records image information that can be displayed immediately, and a control IC that controls the display of the display 414 based on image data output from the graphic controller.
  • The camera 413 (in-vehicle camera 311) captures images inside and outside the vehicle and outputs them as image data.
  • An image captured by the camera 413 can be recorded on the magnetic disk 405 or the optical disk 407 as image data.
  • the display 414 displays icons, cursors, menus, windows, or various data such as characters and images.
  • As the display 414, for example, a CRT, a TFT liquid crystal display, or a plasma display can be adopted.
  • The communication I/F 415 is wirelessly connected to a network and functions as an interface between the navigation device 300 and the CPU 401.
  • The communication I/F 415 is also wirelessly connected to communication networks such as the Internet, and functions as an interface between those networks and the CPU 401.
  • Communication networks include LANs, WANs, public line networks, and mobile phone networks.
  • Specifically, the communication I/F 415 is composed of, for example, an FM tuner, a VICS (Vehicle Information and Communication System)/beacon receiver, a wireless communication device, or another communication device, and acquires road traffic information.
  • VICS is a registered trademark.
  • The GPS unit 416 receives radio waves from GPS satellites and calculates information indicating the current position of the vehicle (the current position of the navigation device 300).
  • the output information of the GPS unit 416 is used when the CPU 401 calculates the current position of the vehicle together with output values of various sensors 417 described later.
  • The information indicating the current position is information specifying one point on the map data, such as latitude, longitude, and altitude.
  • Various sensors 417 output information that can determine the position and behavior of the vehicle, such as a vehicle speed sensor, an acceleration sensor, and an angular velocity sensor.
  • the output values of the various sensors 417 are used to calculate the current position of the vehicle by the CPU 401 and to measure the amount of change in speed and direction.
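Combining the sensor outputs with the last known position can be illustrated with a single dead-reckoning step. The function, its units (metres, degrees, seconds), and the planar coordinate frame are assumptions for illustration; the CPU 401 in the text performs a more elaborate fusion with the GPS output.

```python
# Hypothetical dead-reckoning step: advance a planar position using the
# vehicle speed sensor and a heading (e.g., from the angular velocity
# sensor), as might be done between GPS fixes.
import math

def dead_reckon(x_m, y_m, speed_mps, heading_deg, dt_s):
    """Heading is measured from the +x axis toward +y; returns the new
    (x, y) position in metres after dt_s seconds."""
    heading = math.radians(heading_deg)
    return (x_m + speed_mps * dt_s * math.cos(heading),
            y_m + speed_mps * dt_s * math.sin(heading))
```

For example, two seconds at 10 m/s heading 90 degrees moves the position 20 m along the +y axis.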
  • the navigation device 300 may use the output values of the various sensors 417 as data to be recorded by the drive recorder function.
  • The image acquisition unit 101 shown in FIG. 1 realizes its function by the camera 413; the map acquisition unit 102 by the magnetic disk drive 404 and the magnetic disk 405, or the optical disk drive 406 and the optical disk 407; the identification unit 103 and the generation unit 104 by the CPU 401; and the output unit 105 by the speaker 410 and the display 414.
  • the navigation device 300 searches for a route to the destination point set by the user and performs route guidance along the searched route.
  • guidance information is output as appropriate on the traveling route to instruct the driver on the direction of travel.
  • Guidance information may be based on the current position of the vehicle, for example, "Turn right at the intersection 200 m ahead," or based on landmarks (features) around the road, for example, "Turn left at the crossing by the bank." This allows the user to visually grasp the route to be taken.
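The two phrasings of guidance information can be sketched with a small message generator. The templates and parameter names are illustrative assumptions, not the device's actual wording.

```python
# Hypothetical guidance-message generator: use a visible landmark when one
# is available, otherwise fall back to a distance-based instruction.

def make_guidance(direction, distance_m=None, landmark=None):
    if landmark is not None:
        return f"Turn {direction} at the intersection in front of {landmark}"
    return f"Turn {direction} at the intersection {distance_m} m ahead"
```

The landmark form is preferred when a visible feature has been identified, since the driver can confirm it directly.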
  • FIG. 5 is a flowchart showing the procedure of route guidance processing based on landmarks extracted from map data.
  • the navigation apparatus 300 first determines a travel route to the destination point (step S501).
  • The travel route is determined, for example, based on input by the user: the navigation device 300 searches for routes to the destination point entered via the input device 411, and the user selects, from the searched routes, the route that best matches his or her preferences and needs as the travel route.
  • Next, the navigation device 300 stands by until the vehicle in which it is installed starts traveling (step S502: No loop). Whether the vehicle has started traveling is determined, for example, by whether the current position of the device has changed.
  • When the vehicle starts traveling, the in-vehicle camera 311 captures the scenery in the traveling direction of the vehicle as seen from the driver's seat (step S503), that is, the scenery seen by the driver of the vehicle.
  • A guidance point is a point set on the travel route at which route guidance (voice output or screen display) is performed; for example, it is set a short distance before an intersection where a right or left turn should be made.
  • The order of step S503 and step S504 may be reversed; that is, capturing of the scenery in the traveling direction may be started (step S503) when the vehicle has come within the predetermined distance of the guidance point (step S504: Yes). In this case, the in-vehicle camera 311 may remain stopped until the vehicle comes within the predetermined distance of the guidance point, or may capture the inside or the rear of the vehicle (in the case of a movable camera or the like).
  • When the vehicle comes within the predetermined distance of the guidance point (step S504: Yes), the navigation device 300 extracts landmarks from the map data recorded on the magnetic disk 405 or the optical disk 407 (step S505).
  • a landmark is a land target such as a mountain or a high-rise building.
  • a landmark is extracted according to the matter to be guided at a guide point.
  • For example, when the information to be guided at the guidance point instructs a left turn at an intersection 300 m ahead of the guidance point, landmarks such as gas stations, family restaurants, and banks near that intersection are extracted. Here, "near" means within a predetermined distance of the intersection. This makes guidance such as "Turn left at the intersection in front of OO Mart" possible. A single landmark or a plurality of landmarks may be extracted.
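The extraction in step S505 can be sketched as a radius query over map features. The feature records, planar coordinates, and the use of Euclidean distance are assumptions for illustration.

```python
# Hypothetical sketch of step S505: extract, from map data, the landmarks
# lying within a predetermined distance of the guided intersection.
import math

def landmarks_near(features, intersection, radius_m):
    """features: [{'name': ..., 'pos': (x, y)}, ...] with positions in
    metres; intersection: (x, y); returns features within radius_m."""
    ix, iy = intersection
    return [f for f in features
            if math.hypot(f['pos'][0] - ix, f['pos'][1] - iy) <= radius_m]
```

The radius corresponds to the "predetermined distance from the intersection" that defines the neighborhood.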
  • Next, the extracted landmark is collated with the image captured in step S503 (step S506), and it is determined whether the landmark extracted in step S505 appears in the image (step S507). To determine whether a landmark appears in the image, for example, the type of the landmark (gas station, family restaurant, bank, and so on), its name and trademark, and its distance from the current position of the vehicle are extracted from the map data. If an object appearing at the corresponding relative distance in the image matches the type, name, or trademark of the landmark, it is determined that the landmark appears in the image. It may also be determined whether an object in the image matches the building shape characteristic of the type of landmark (for example, a gas station has a low, flat shape).
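The collation of steps S506 and S507 can be sketched as follows. The detection records, the distance tolerance, and the attribute keys are illustrative assumptions; an actual device would rely on image recognition to produce the detections.

```python
# Hypothetical sketch of steps S506-S507: a landmark is judged to appear in
# the image when an object detected at roughly the expected relative
# distance matches the landmark's type, name, or trademark.

def landmark_in_image(landmark, detections, tol_m=20.0):
    """landmark: {'type': ..., 'name': ..., 'trademark': ...,
    'distance_m': ...} from map data; detections: objects recognized in
    the image with a 'distance_m' estimate and the same attribute keys."""
    for det in detections:
        if abs(det['distance_m'] - landmark['distance_m']) > tol_m:
            continue  # object is not at the expected relative distance
        if any(landmark.get(k) and landmark[k] == det.get(k)
               for k in ('type', 'name', 'trademark')):
            return True
    return False
```

The distance check filters out look-alike objects elsewhere in the scene before attribute matching is applied.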
  • If the landmark appears in the image (step S507: Yes), the position of the vehicle and the relative positions of the guidance point and the landmark are acquired (step S508), and guidance information based on the landmark is output (step S509). Guidance information based on a landmark is guidance that instructs a left or right turn based on a landmark the driver can see, for example, "Turn right onto the road next to the OO Mart visible on your right" or "Turn left at the intersection 50 m before the OO Oil station visible 200 m ahead on your left."
  • At this time, the display on the display 414 may be changed, for example, by enlarging the display of the landmark used for guidance. When a plurality of landmarks are collated, for example, the landmark with the shortest distance from the guided intersection may be determined to be the optimum landmark and used for guidance.
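The optimum-landmark choice just described is a one-line selection; the record layout is an assumption for illustration.

```python
# Hypothetical sketch: among the collated landmarks, pick the one with the
# shortest distance from the guided intersection as the optimum landmark.

def optimal_landmark(landmarks):
    """landmarks: [{'name': ..., 'dist_to_intersection_m': ...}, ...]"""
    return min(landmarks, key=lambda lm: lm['dist_to_intersection_m'])
```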
  • Until the destination point is reached (step S510: No), the process returns to step S503 and the subsequent processing is repeated. When the destination point is reached (step S510: Yes), the processing according to this flowchart ends. If the landmark does not appear in the image in step S507 (step S507: No), the process returns to step S505 and the subsequent processing is repeated.
  • The order of step S507 and step S508 may be reversed. Landmarks suitable for guidance, such as those close to the intersection, may also be collated with the image one by one in order. Alternatively, the processing from step S505 to step S508 may be performed continuously so that landmarks visible to the driver are always being extracted, and when the guidance point is reached, guidance may be based on the most recently visible landmark.
  • In addition, after the landmark to be used for guidance is determined, whether the landmark remains in the image (that is, remains visible to the driver) may be continuously monitored. If the determined landmark disappears from the image partway (becomes invisible to the driver), another landmark in the image is used as the landmark for the guidance information.
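The monitoring with fallback can be sketched as a per-frame update rule. The landmark names and the choice of the first remaining on-image landmark as the fallback are illustrative assumptions.

```python
# Hypothetical sketch: keep the chosen landmark while it stays in the image;
# if it disappears, switch to another landmark still visible in the frame.

def update_guidance_landmark(current, on_image_names):
    """current: name of the landmark now used for guidance;
    on_image_names: names of landmarks visible in the latest frame."""
    if current in on_image_names:
        return current                 # still visible, keep using it
    if on_image_names:
        return on_image_names[0]       # fall back to another visible one
    return None                        # nothing visible this frame
```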
  • FIG. 6 is a flowchart showing the procedure of route guidance processing based on landmarks extracted from images.
  • First, the navigation device 300 determines the travel route to the destination point (step S601) and waits until the vehicle in which it is installed starts traveling, as in the flowchart of FIG. 5 (step S602: No loop).
  • When the vehicle starts traveling (step S602: Yes), the in-vehicle camera 311 captures the scenery in the traveling direction of the vehicle as seen from the driver's seat (step S603). Then, visible landmarks are extracted from the captured image (step S604). A visible landmark is a landmark that is visible to the driver.
  • Specifically, for the landmarks included in the map data, the navigation device 300 holds shape information and name text according to their type, together with trademark information (logos, figures, and so on); it analyzes the image and, if a matching shape, text, or trademark is found, extracts it as a landmark.
  • Next, the navigation device 300 collates the extracted landmarks with the map data and acquires their position information and related information (step S605).
  • Related information is more detailed information about the location, name, and so on of the landmark.
  • The position information may be calculated from the relative distance between the landmark's position in the image and the position of the vehicle. Then, until the vehicle comes within a predetermined distance of the guide point set in the traveling direction (step S606: No), the process returns to step S603 and capturing of the scenery in the traveling direction continues.
  • When the vehicle comes within the predetermined distance of the guide point (step S606: Yes), the navigation device 300 outputs guidance information using the latest visible landmark information (step S607).
  • The content of the guidance information is the same as that described for the preceding flowchart. Until the destination point is reached (step S608: No), the process returns to step S603 and the subsequent processing is repeated. When the destination point is reached (step S608: Yes), the processing in this flowchart ends.
  • In this way, route guidance is performed based on landmarks visible to the driver, which makes the route guidance easier for the driver to understand.
  • In urban areas with many obstacles and on mountain roads with many ups and downs and curves, the features used for guidance are often not visible. For this reason, the travel route may be taken wrongly even though guidance information has been output. Moreover, once the route is mistaken in such an urban area or on such a mountain road, it is often difficult to correct course.
  • With the navigation device 300, since route guidance is performed using features that are actually visible, the possibility of taking the wrong route can be reduced. In addition, since the driver can easily understand the content of the guidance, the driver can drive with composure, which encourages safe driving.
  • The route guidance method described in the present embodiment can be realized by executing a program prepared in advance on a computer such as a personal computer or a workstation.
  • This program is recorded on a computer-readable recording medium such as a hard disk, a flexible disk, a CD-ROM, an MO, or a DVD, and is executed by being read from the recording medium by the computer.
  • The program may also be a transmission medium that can be distributed through a network such as the Internet.
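To make the flow of FIG. 6 concrete, the following is a minimal Python sketch of steps S603 to S607: analyze the captured view, keep only the landmarks that actually appear in it, and build a guidance message from the visible landmark nearest the guide point. The patent does not specify an implementation, so every function name, data structure, and message string here is an illustrative assumption (name matching stands in for the shape/text/trademark analysis described above).

```python
# Illustrative sketch of FIG. 6, steps S603-S607. All names are invented.

def extract_visible_landmarks(image_text, map_landmarks):
    """S604: a landmark counts as visible if its name appears in the image.

    image_text stands in for the output of analyzing the camera frame
    (e.g. text recognized on signboards); a real system would also match
    shapes and trademark logos, as the description above notes.
    """
    return [lm for lm in map_landmarks if lm["name"] in image_text]

def make_guidance(visible, guide_point):
    """S607: use the visible landmark closest to the guide point."""
    if not visible:
        return "Turn right at the next intersection"  # fallback, no landmark
    best = min(visible, key=lambda lm: abs(lm["position"] - guide_point))
    return f"Turn right at the intersection just past {best['name']}"

# Toy map data: positions as one-dimensional distances along the route.
landmarks = [
    {"name": "Hypothetical Gas Station", "position": 180},
    {"name": "Hypothetical Bank", "position": 240},
]
frame = "signboard text: Hypothetical Bank"   # only the bank is in view
visible = extract_visible_landmarks(frame, landmarks)
print(make_guidance(visible, guide_point=250))
```

Because only the bank's name is found in the (mocked) frame, the gas station in the map data is never used for guidance, which is exactly the point of steps S604 and S607.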


Abstract

A route guidance device (100) comprises an image acquisition section (101), a map acquisition section (102), an identification section (103), a creation section (104), and an output section (105). The image acquisition section (101) acquires an image of the direction of movement of a mobile body, taken from the vicinity of the driver's seat of the mobile body. The map acquisition section (102) acquires map information about the periphery of the current position of the mobile body. The identification section (103) identifies a ground object visually recognizable by the driver of the mobile body (hereinafter referred to as a visible ground object) based on the image acquired by the image acquisition section (101) and on the map information acquired by the map acquisition section (102). The creation section (104) creates, using information on the visible ground object identified by the identification section (103), guidance information on the route along which the mobile body moves. The output section (105) outputs the guidance information created by the creation section (104).

Description

Specification

Route guidance device, route guidance method, route guidance program, and recording medium

Technical Field

[0001] The present invention relates to a route guidance device, a route guidance method, a route guidance program, and a recording medium for guiding a route along which a moving body moves, using information on features along the route. However, use of the present invention is not limited to the above-described route guidance device, route guidance method, route guidance program, and recording medium.
Background Art

[0002] Conventionally, a navigation device mounted on a vehicle or the like searches for a route to a destination point and performs route guidance along the searched route. Route guidance is performed mainly by outputting guidance voice; at major points on the route, such as intersections, the guidance voice may be generated with reference to landmarks (features) around the point so that the user can easily recognize it.

[0003] Such a navigation device includes, for example: a landmark detection unit that detects landmarks near a branch point; a landmark position determination unit that determines where, as seen from the driver, a detected landmark is located relative to the branch point; a landmark priority/valid-time table in which landmarks that can be confirmed within the time available for voice guidance are tabulated with priorities; an optimal landmark determination unit that determines, based on the landmark priority/valid-time table, which of the landmarks detected by the landmark detection unit is to be used for voice guidance; and a voice guidance unit that provides route guidance to the driver while giving voice guidance on the landmark together with its position information (for example, see Patent Document 1 below).
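The prior-art "landmark priority / valid time" selection described in [0003] can be pictured with a short sketch. The Python fragment below is an illustrative reconstruction only; the field names, the priority scale, and the time values are invented and are not taken from Patent Document 1.

```python
# Illustrative sketch of the prior-art selection: among detected landmarks,
# pick the highest-priority one that can still be confirmed within the time
# the voice guidance takes to play. All field names and values are invented.

def pick_landmark(detected, guidance_seconds):
    """Return the best landmark confirmable during the guidance utterance."""
    usable = [lm for lm in detected if lm["valid_seconds"] >= guidance_seconds]
    if not usable:
        return None
    return min(usable, key=lambda lm: lm["priority"])  # 1 = highest priority

detected = [
    {"name": "tower", "priority": 2, "valid_seconds": 10},
    {"name": "sign",  "priority": 1, "valid_seconds": 2},   # out of view too soon
    {"name": "store", "priority": 3, "valid_seconds": 8},
]
best = pick_landmark(detected, guidance_seconds=5)
print(best["name"])  # the sign drops out, so the tower wins
```

Note that this prior-art scheme reasons only about time and priority tables; as the following paragraphs explain, it cannot tell whether the chosen landmark is actually within the user's field of view.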
[0004] Patent Document 1: Japanese Unexamined Patent Application Publication No. 2000-131084
Disclosure of the Invention

Problems to Be Solved by the Invention

[0005] However, with the conventional technique described above, one example of a problem is that the landmark used for the voice guidance may not be within the user's field of view when the voice guidance is output. For example, when an obstacle such as a street tree or a signboard stands between the landmark and the vehicle the user is riding in, or when the view is blocked by a slope or a curve, the user cannot see the landmark used in the voice guidance.

[0006] One example of a problem with such guidance is that the user becomes confused, not knowing which direction to proceed in, and the route cannot be guided properly. Another example of a problem is that, when the user judges the navigation device's guidance to be wrong, confidence in the information provided by the navigation device is diminished.
Means for Solving the Problems

[0007] To solve the above problems and achieve the object, a route guidance device according to the invention of claim 1 comprises: an image acquisition means that acquires an image of the direction of movement captured from the vicinity of the driver's seat of a moving body; a map acquisition means that acquires map information around the current position of the moving body; an identification means that identifies, based on the image acquired by the image acquisition means and the map information acquired by the map acquisition means, a feature that is visible to the driver of the moving body (hereinafter referred to as a visible feature); a generation means that generates guidance information for the route along which the moving body moves, using information on the visible feature identified by the identification means; and an output means that outputs the guidance information generated by the generation means.

[0008] A route guidance method according to the invention of claim 6 includes: an image acquisition step of acquiring an image around the current position of a moving body; a map acquisition step of acquiring map information around the current position of the moving body; an identification step of identifying, based on the image acquired in the image acquisition step and the map information acquired in the map acquisition step, a feature that is visible to a passenger of the moving body (hereinafter referred to as a visible feature); a generation step of generating guidance information for the route along which the moving body moves, using information on the visible feature identified in the identification step; and an output step of outputting the guidance information generated in the generation step.

[0009] A route guidance program according to the invention of claim 7 causes a computer to execute the route guidance method according to claim 6.

[0010] A recording medium according to the invention of claim 8 is a computer-readable recording medium on which the route guidance program according to claim 7 is recorded.
Brief Description of the Drawings

[0011] [FIG. 1] FIG. 1 is a block diagram showing the functional configuration of a route guidance device according to an embodiment.

[FIG. 2] FIG. 2 is a flowchart showing the procedure of route guidance processing by the route guidance device.

[FIG. 3] FIG. 3 is an explanatory diagram showing the vicinity of the dashboard of a vehicle in which a navigation device according to an example is installed.

[FIG. 4] FIG. 4 is a block diagram showing the hardware configuration of the navigation device.

[FIG. 5] FIG. 5 is a flowchart showing the procedure of route guidance processing based on landmarks extracted from map data.

[FIG. 6] FIG. 6 is a flowchart showing the procedure of route guidance processing based on landmarks extracted from an image.

Explanation of Symbols
[0012] 100 Route guidance device

101 Image acquisition unit

102 Map acquisition unit

103 Identification unit

103a Selection unit

103b Determination unit

104 Generation unit

105 Output unit

Best Mode for Carrying Out the Invention
[0013] Exemplary embodiments of a route guidance device, a route guidance method, a route guidance program, and a recording medium according to the present invention will be described below in detail with reference to the accompanying drawings.

[0014] (Embodiment)

First, the functional configuration of the route guidance device 100 according to the embodiment will be described. FIG. 1 is a block diagram showing the functional configuration of the route guidance device according to the embodiment. The route guidance device 100 is mounted on a moving body and comprises an image acquisition unit 101, a map acquisition unit 102, an identification unit 103, a generation unit 104, and an output unit 105. The identification unit 103 comprises a selection unit 103a and a determination unit 103b.
[0015] The image acquisition unit 101 acquires an image of the direction of movement captured from the vicinity of the driver's seat of the moving body. The image acquisition unit 101 is, for example, a video camera installed around the dashboard or rearview mirror of the moving body, and captures almost the same scenery as that seen by the driver of the moving body.

[0016] The map acquisition unit 102 acquires map information around the current position of the moving body. The area around the current position of the moving body is, for example, at least the range visible to the driver of the moving body. The map acquisition unit 102 extracts, for example, map information around the current position of the moving body from the map information used for route guidance to the destination point.

[0017] The identification unit 103 identifies a feature that is visible to the driver of the moving body (hereinafter referred to as a visible feature) based on the image acquired by the image acquisition unit 101 and the map information acquired by the map acquisition unit 102. More specifically, the identification unit 103 comprises the selection unit 103a and the determination unit 103b.

[0018] The selection unit 103a selects, from the features whose information is included in the map information, a feature that is a candidate for a visible feature (hereinafter referred to as a candidate feature). The determination unit 103b determines whether the candidate feature selected by the selection unit 103a appears in the image acquired by the image acquisition unit 101. The identification unit 103 then identifies a candidate feature determined by the determination unit 103b to appear in the image as a visible feature.

[0019] Here, the determination unit 103b may extract from the map information at least one of the name, trademark, and shape of the candidate feature, and determine that the candidate feature appears in the image when information on at least one of the extracted name, trademark, and shape is captured in the image.
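The determination of [0019] can be sketched as a simple membership test: a candidate feature is judged to appear in the image if at least one of its attributes taken from the map information is found among the items recognized in the frame. The recognizer itself is out of scope here; its output is mocked as sets, and all field names and values below are illustrative assumptions, not part of the patent.

```python
# Illustrative sketch of the determination unit 103b in [0019]: match the
# candidate feature's name, trademark, or shape (from the map information)
# against what was recognized in the camera image.

def appears_in_image(candidate, recognized):
    """recognized: dict of sets of names, trademarks, shapes found in the frame."""
    return (candidate.get("name") in recognized["names"]
            or candidate.get("trademark") in recognized["trademarks"]
            or candidate.get("shape") in recognized["shapes"])

recognized = {
    "names": {"Sample Tower"},
    "trademarks": {"sample-logo"},
    "shapes": {"dome"},
}
candidate_a = {"name": "Sample Tower", "trademark": "tower-logo", "shape": "spire"}
candidate_b = {"name": "Other Bldg", "trademark": "other-logo", "shape": "cube"}
print(appears_in_image(candidate_a, recognized))  # True  (name matched)
print(appears_in_image(candidate_b, recognized))  # False (nothing matched)
```

Only one of the three attributes needs to match, which mirrors the "at least one of the name, trademark, and shape" wording above.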
[0020] The generation unit 104 generates guidance information for the route along which the moving body moves, using information on the visible feature identified by the identification unit 103. Information on a visible feature is, for example, information such as the name of the visible feature or the distance to a traffic light or intersection on the route along which the moving body moves. Guidance information is information that instructs the driver of the moving body in the direction to proceed or calls attention to matters concerning movement.

[0021] The output unit 105 outputs the guidance information generated by the generation unit 104. The output unit 105 outputs the guidance information by, for example, voice output or image output (message display or display superimposed on map information).
[0022] Next, the route guidance processing of the route guidance device 100 will be described. FIG. 2 is a flowchart showing the procedure of route guidance processing by the route guidance device. In the flowchart of FIG. 2, the route guidance device 100 first acquires an image from the vicinity of the driver's seat of the moving body by the image acquisition unit 101 (step S201). Next, the map acquisition unit 102 acquires map information around the current position of the moving body (step S202).

[0023] Next, the selection unit 103a selects a candidate feature from the features whose information is included in the map information acquired in step S202 (step S203). Then, the determination unit 103b determines whether the candidate feature is a visible feature (step S204). Specifically, it determines whether the candidate feature selected by the selection unit 103a appears in the image acquired by the image acquisition unit 101.

[0024] If the candidate feature is a visible feature (step S204: Yes), the generation unit 104 generates guidance information using information on the visible feature (step S205), the output unit 105 outputs the guidance information (step S206), and the processing in this flowchart ends. On the other hand, if the candidate feature is not a visible feature (step S204: No), the process returns to step S203 and the subsequent processing is repeated. That is, a candidate feature is selected again and a visible feature is identified.

[0025] Besides selecting candidate features one at a time as above, a plurality of candidate features may be selected in step S203. In this case, step S204 determines whether each selected candidate feature is a visible feature. When a plurality of candidate features are determined to be visible features, the guidance information may be generated using, for example, the visible feature nearest to the point to be guided (such as an intersection), or the feature with the largest visible range (the feature most easily seen by the driver).

[0026] In the present embodiment, the selection unit 103a selects candidate features from the features whose information is included in the map information, and the determination unit 103b determines whether a candidate feature selected by the selection unit 103a appears in the image acquired by the image acquisition unit 101. Conversely, for example, the selection unit 103a may select candidate features from the features captured in the image, and the determination unit 103b may determine whether information on a candidate feature selected by the selection unit 103a is included in the map information acquired by the map acquisition unit 102.
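The choice among multiple visible candidates described in [0025] can be sketched as a two-level ranking: prefer the feature nearest the guide point, and break ties by the larger visible range (the feature the driver can see best). The patent names both criteria but not how to combine them, so the tuple ordering, field names, and values below are illustrative assumptions.

```python
# Illustrative sketch of [0025]: rank visible features by distance to the
# guide point, then by visible range. Positions are one-dimensional
# distances along the route; all fields and values are invented.

def choose_visible_feature(visible, guide_point):
    return min(
        visible,
        key=lambda f: (abs(f["position"] - guide_point), -f["visible_area"]),
    )

visible = [
    {"name": "store", "position": 120, "visible_area": 30},
    {"name": "bank",  "position": 100, "visible_area": 10},
    {"name": "tower", "position": 100, "visible_area": 80},
]
best = choose_visible_feature(visible, guide_point=100)
print(best["name"])  # bank and tower are equally near; the larger tower wins
```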
[0027] In this case, the determination unit 103b extracts from the image, for example, information on at least one of the name, trademark, and shape of the candidate feature, and determines that information on the candidate feature is included in the map information when information on at least one of the extracted name, trademark, and shape is included in the map information.

[0028] As described above, according to the route guidance device 100, the route along which the moving body moves is guided using information on features visible to the driver. This allows the driver to actually see the features used in the guidance information, making the route guidance easier for the driver to understand.

[0029] In particular, in urban areas with many obstacles and on mountain roads with many ups and downs and curves, the features used for guidance are often not visible. For this reason, the travel route may be taken wrongly even though guidance information has been output. Moreover, once the route is mistaken in such an urban area or on such a mountain road, it is often difficult to correct course. According to the route guidance device 100, since route guidance is performed using features that are actually visible, the possibility of taking the wrong route can be reduced. Furthermore, since the driver can easily understand the content of the guidance, the driver can drive with composure and practice safe driving.

Example
[0030] Next, an example of the route guidance device 100 according to the embodiment described above will be described. In the following example, a case where the route guidance device 100 is applied to a navigation device 300 mounted on a vehicle is described.

[0031] (Peripheral Device Configuration of the Navigation Device 300)

First, the peripheral device configuration of the navigation device 300 will be described. FIG. 3 is an explanatory diagram showing the vicinity of the dashboard of a vehicle in which the navigation device according to the example is installed. The navigation device 300 is installed on the dashboard of the vehicle. The navigation device 300 comprises a main body M and a display unit (display) D, and the display unit D displays the current position of the vehicle, map information, the current time, and the like.
[0032] An in-vehicle camera 311 installed on the dashboard is connected to the navigation device 300. The in-vehicle camera 311 can change the direction of its lens and can capture images inside and outside the vehicle. In FIG. 3, for convenience of explanation, the lens of the in-vehicle camera 311 faces the interior of the vehicle; while the vehicle is traveling, the lens faces outside the vehicle and captures images at an angle and orientation that approximate the driver's forward field of view. As a result, the in-vehicle camera 311 captures almost the same scenery as the field of view of the driver sitting in the driver's seat.
[0033] (Hardware Configuration of the Navigation Device 300)

Next, the hardware configuration of the navigation device 300 will be described. FIG. 4 is a block diagram showing the hardware configuration of the navigation device.

[0034] In FIG. 4, the navigation device 300 includes a CPU 401, a ROM 402, a RAM (memory) 403, a magnetic disk drive 404, a magnetic disk 405, an optical disk drive 406, an optical disk 407, an audio I/F (interface) 408, a microphone 409, a speaker 410, an input device 411, a video I/F 412, a camera 413, a display 414, a communication I/F 415, a GPS unit 416, and various sensors 417. The components 401 to 417 are connected to one another by a bus 420.
[0035] The CPU 401 governs overall control of the navigation device 300. The ROM 402 records programs such as a boot program, a communication program, a database creation program, and a data analysis program. The RAM 403 is used as a work area of the CPU 401.

[0036] The magnetic disk drive 404 controls reading and writing of data from and to the magnetic disk 405 under the control of the CPU 401. The magnetic disk 405 records the data written under the control of the magnetic disk drive 404. As the magnetic disk 405, for example, an HD (hard disk) or an FD (flexible disk) can be used.

[0037] The optical disk drive 406 controls reading and writing of data from and to the optical disk 407 under the control of the CPU 401. The optical disk 407 is a removable recording medium from which data is read under the control of the optical disk drive 406. A writable recording medium can also be used as the optical disk 407. Besides the optical disk 407, the removable recording medium may also be an MO, a memory card, or the like.

[0038] One example of information recorded on the magnetic disk 405 and the optical disk 407 is map data used for route search and route guidance. The map data includes background data representing features such as buildings, rivers, and the ground surface, and road shape data representing the shape of roads, and is rendered in two or three dimensions on the display screen of the display 414. When the navigation device 300 is performing route guidance, the map data and the current position of the vehicle acquired by the CPU 401 are displayed so as to overlap each other.
[0039] The background data further includes background shape data representing the shape of the background and background type data representing the type of the background. The background shape data includes, for example, representative points, polylines, polygons, and coordinates of the features. The background type data includes, for example, text data representing the names, addresses, and telephone numbers of the features, and type data of the features such as buildings, rivers, and the ground surface.

[0040] Furthermore, for major facilities on the map (POI: Point of Interest), information such as business hours and the availability of parking is held as facility information. Among the features, facilities provided for a predetermined purpose, such as gas stations and convenience stores, are classified into genres according to their type, and the representative point and coordinate data of each feature is associated with its facility information and genre information. Genres into which facilities are classified include, besides the above gas stations and convenience stores, for example, parking lots and stations.

[0041] The road shape data further includes traffic condition data. The traffic condition data includes, for example, for each node, the presence or absence of traffic lights and pedestrian crossings and the presence or absence of expressway entrances/exits and junctions, and, for each link, the length (distance), road width, traveling direction, and road type (expressway, toll road, ordinary road, and so on).

[0042] The traffic condition data stores past congestion information obtained by statistically processing historical congestion data with reference to the season, day of the week, long holidays, time of day, and the like. The navigation device 300 obtains information on currently occurring congestion from road traffic information received by the communication I/F 415 described later, while the past congestion information makes it possible to predict the congestion situation at a specified time.
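The past-congestion prediction in [0042] amounts to a statistical lookup keyed by conditions such as season, day type, and time of day. The sketch below illustrates that idea; the keying scheme, congestion scale, and sample values are invented for illustration and are not specified in the patent.

```python
# Illustrative sketch of [0042]: historical congestion aggregated by
# (season, day type, hour) and queried to predict congestion at a
# specified time. All keys and values are invented.

from statistics import mean

history = {
    # (season, day_type, hour) -> past observed congestion levels (0-100)
    ("summer", "weekday", 8): [70, 80, 75],
    ("summer", "holiday", 8): [30, 20, 25],
}

def predict_congestion(season, day_type, hour):
    samples = history.get((season, day_type, hour))
    return mean(samples) if samples else None

print(predict_congestion("summer", "weekday", 8))  # 75.0
print(predict_congestion("winter", "weekday", 8))  # None (no past data)
```

A real implementation would fall back to live road traffic information received via the communication I/F 415 when no historical entry exists.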
[0043] 音声 IZF408は、音声入力用のマイク 409および音声出力用のスピーカ 410に接 続される。マイク 409に受音された音声は、音声 IZF408内で AZD変換される。ま た、スピーカ 410からは音声が出力される。なお、マイク 409から入力された音声は、 音声データとして磁気ディスク 405あるいは光ディスク 407に記録可能である。  The audio IZF 408 is connected to a microphone 409 for audio input and a speaker 410 for audio output. The sound received by the microphone 409 is AZD converted in the sound IZF 408. In addition, sound is output from the speaker 410. Note that the voice input from the microphone 409 can be recorded on the magnetic disk 405 or the optical disk 407 as voice data.
[0044] Examples of the input device 411 include a remote controller, keyboard, mouse, and touch panel provided with a plurality of keys for inputting characters, numerical values, and various instructions. Further, the input device 411 can be connected to another information processing terminal, such as a digital camera or a mobile phone terminal, to input and output data.
[0045] The video I/F 412 is connected to a camera 413 for video input (for example, the in-vehicle camera 311 of FIG. 3) and a display 414 for video output. Specifically, the video I/F 412 comprises, for example, a graphics controller that controls the display 414 as a whole, a buffer memory such as a VRAM (Video RAM) that temporarily stores image information ready for immediate display, and a control IC that controls display on the display 414 based on image data output from the graphics controller.
[0046] The camera 413 (in-vehicle camera 311) captures video inside and outside the vehicle and outputs it as image data. Images captured by the camera 413 can be recorded on the magnetic disk 405 or the optical disk 407 as image data. The display 414 displays icons, cursors, menus, windows, and various data such as characters and images. As the display 414, for example, a CRT, a TFT liquid crystal display, or a plasma display can be adopted.
[0047] The communication I/F 415 is wirelessly connected to a network and functions as an interface between the navigation device 300 and the CPU 401. The communication I/F 415 is also wirelessly connected to a communication network such as the Internet, and functions as an interface between this communication network and the CPU 401.
[0048] Communication networks include LANs, WANs, public line networks, and mobile phone networks. Specifically, the communication I/F 415 comprises, for example, an FM tuner, a VICS (Vehicle Information and Communication System)/beacon receiver, a wireless navigation device, and other navigation devices, and acquires road traffic information, such as congestion and traffic regulations, distributed from the VICS center. VICS is a registered trademark.
[0049] The GPS unit 416 receives radio waves from GPS satellites and calculates information indicating the current position of the vehicle (the current position of the navigation device 300). The output information of the GPS unit 416 is used, together with the output values of the various sensors 417 described later, when the CPU 401 calculates the current position of the vehicle. The information indicating the current position is information that specifies one point on the map data, such as latitude, longitude, and altitude.
[0050] The various sensors 417, such as a vehicle speed sensor, an acceleration sensor, and an angular velocity sensor, output information from which the position and behavior of the vehicle can be determined. The output values of the various sensors 417 are used by the CPU 401 to calculate the current position of the vehicle and to measure the amount of change in speed and heading. The navigation device 300 may also treat the output values of the various sensors 417 as data to be recorded by a drive recorder function.
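The way speed and angular-velocity outputs can refine the position between GPS fixes can be illustrated with a minimal dead-reckoning step: the position estimate is advanced by the distance travelled along the current heading. This is only an assumed, flat-earth sketch in a local metric frame; the actual fusion performed by the CPU 401 is not specified in the embodiment.

```python
import math

def dead_reckon(x: float, y: float, heading_rad: float,
                speed_mps: float, yaw_rate_rps: float, dt: float):
    """Advance a local (x, y) position estimate [m] by one sensor interval.

    x, y          -- position in a local metric frame
    heading_rad   -- current heading (0 = +x axis)
    speed_mps     -- vehicle speed sensor output
    yaw_rate_rps  -- angular velocity sensor output
    dt            -- time step in seconds
    """
    x += speed_mps * dt * math.cos(heading_rad)
    y += speed_mps * dt * math.sin(heading_rad)
    heading_rad += yaw_rate_rps * dt
    return x, y, heading_rad

# Drive at 10 m/s straight along +x for 2 s in 0.1 s steps: expect x ≈ 20 m.
x, y, h = 0.0, 0.0, 0.0
for _ in range(20):
    x, y, h = dead_reckon(x, y, h, speed_mps=10.0, yaw_rate_rps=0.0, dt=0.1)
print(round(x, 6), round(y, 6))  # 20.0 0.0
```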
[0051] Here, in the configuration of the route guidance device 100 according to the embodiment, the functions are realized as follows: the image acquisition unit 101 by the camera 413; the map acquisition unit 102 by the magnetic disk drive 404 and magnetic disk 405, or the optical disk drive 406 and optical disk 407; the identification unit 103 and the generation unit 104 by the CPU 401; and the output unit 105 by the speaker 410 and the display 414.
[0052] (Route guidance processing of the navigation device 300)
Next, the route guidance processing of the navigation device 300 will be described. The navigation device 300 searches for a route to a destination set by the user and performs route guidance along the found route. While the vehicle travels the route, guidance information is output at appropriate points to instruct the driver on the direction of travel and the like. The guidance information may be based on the current position of the vehicle, for example, "Turn right at the intersection 200 m ahead," or based on landmarks (features) around the road, for example, "Turn left at the 〇〇 Bank intersection." This allows the user to visually grasp the route to be taken.
[0053] On the other hand, in urban areas with many obstacles, or on mountain roads with many ups and downs and curves, the feature used for guidance often cannot be visually confirmed. As a result, the driver may take the wrong route even though guidance information was output; and once the route is mistaken in such urban areas or on such mountain roads, correcting course is often difficult. For this reason, the navigation device 300 performs route guidance using landmarks that are actually visible to the driver. This allows the driver to easily understand the content of the guidance and reduces the possibility of taking the wrong route.
[0054] First, the case where the landmark used in the guidance information is determined from the map data will be described. FIG. 5 is a flowchart showing the procedure of route guidance processing based on landmarks extracted from the map data. In the flowchart of FIG. 5, the navigation device 300 first determines a travel route to the destination (step S501). The travel route is determined, for example, through user input via the input device 411: the navigation device 300 searches for routes to the destination entered by the user via the input device 411, and the user selects, from among the found routes, the one that best matches his or her preferences and needs as the travel route.
[0055] Next, the navigation device 300 waits until the vehicle on which it is mounted starts traveling (step S502: No loop). Whether traveling has started is determined, for example, by whether the current position of the device has changed. When traveling starts (step S502: Yes), the in-vehicle camera 311 captures the scenery in the direction of travel as seen from the driver's seat (step S503). The in-vehicle camera 311 thus captures the scenery that the driver of the vehicle sees.
[0056] The navigation device 300 returns to step S503 and continues capturing the scenery in the direction of travel until the vehicle comes within a predetermined distance of a guidance point set in the direction of travel (step S504: No). A guidance point is a point set on the travel route for providing route guidance (audio output or screen display); for example, it is set a predetermined distance before an intersection where a right or left turn should be made.
[0057] The order of steps S503 and S504 may be reversed. That is, capturing of the scenery in the direction of travel (step S503) may be started when the vehicle comes within the predetermined distance of the guidance point (step S504: Yes). In this case, the in-vehicle camera 311 may remain stopped until the vehicle comes within the predetermined distance of the guidance point, or may capture the vehicle interior, the area behind the vehicle (if it is a movable camera, for example), and so on.
[0058] When the vehicle comes within the predetermined distance of the guidance point (step S504: Yes), the navigation device 300 extracts landmarks from the map data recorded on the magnetic disk 405 or the optical disk 407 (step S505). A landmark is a feature that serves as a reference point on land, such as a mountain or a high-rise building; here, landmarks are extracted according to what is to be announced at the guidance point.
[0059] For example, if the matter to be announced at the guidance point is an instruction to turn left at an intersection 300 m ahead of the guidance point, landmarks in the vicinity of that intersection, such as gas stations, family restaurants, and banks, are extracted. Note that "vicinity" also includes points a predetermined distance beyond the intersection, since guidance such as "Turn left at the intersection just before 〇〇 Mart" is also possible. The number of landmarks extracted may be one or more.
[0060] The extracted landmarks are then collated with the image captured in step S503 (step S506), and it is determined whether the landmarks extracted in step S505 appear in the image (step S507). To determine whether a landmark appears in the image, for example, the type of landmark (gas station, family restaurant, bank, etc.), its name, its trademark, and its distance relative to the current position of the vehicle are extracted from the map data. If the object appearing at that relative distance in the image matches the landmark's type, name, trademark, or the like, it is determined that the landmark appears in the image.
[0061] More specifically, for the type of landmark, it is determined, for example, whether the shape matches the building shape characteristic of that type (for example, a gas station has a low, flat shape). Names and trademarks are determined by recognizing text and figures, such as those on signboards.
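The matching test of steps S506–S507 can be sketched as follows: a landmark "appears" if some object recognized in the image sits at roughly the expected relative distance and matches the landmark's type shape, name, or trademark. The data shapes, field names, and the 30 m tolerance are all assumptions made for illustration:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Landmark:          # attributes pulled from the map data
    name: str
    genre_shape: str     # characteristic building shape for its type
    trademark: str
    distance_m: float    # relative distance from the vehicle

@dataclass
class Detection:         # one object recognized in the camera image
    shape: str
    texts: List[str]     # signboard text / logos read from the image
    distance_m: float    # estimated distance to the object

def appears_in_image(lm: Landmark, detections: List[Detection],
                     tol_m: float = 30.0) -> bool:
    """Step S507 sketch: does any detected object at the expected relative
    distance match the landmark's type shape, name, or trademark?"""
    for d in detections:
        if abs(d.distance_m - lm.distance_m) > tol_m:
            continue  # not where the map says the landmark should be
        if d.shape == lm.genre_shape or lm.name in d.texts or lm.trademark in d.texts:
            return True
    return False

lm = Landmark("AA Oil", "low_flat", "AA", distance_m=180.0)
scene = [Detection("tower", ["BB Bank"], 60.0),
         Detection("low_flat", ["AA"], 175.0)]
print(appears_in_image(lm, scene))  # True
```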
[0062] If the landmark appears in the image (step S507: Yes), the position of the vehicle and the positions of the guidance point and the landmark relative to it are acquired (step S508), and guidance information based on the landmark is output (step S509). Guidance information based on a landmark is guidance that instructs a right or left turn or the like with reference to a landmark the driver can see, for example, "Turn right at the road just past 〇〇 Mart, visible on your right," or "Turn left at the intersection 50 m past the △△ Oil station visible 200 m ahead on your left."
[0063] At this time, the display on the display 414 may be changed, for example by enlarging the depiction of the landmark used for guidance. When a plurality of landmarks are matched, the landmark with, for example, the shortest distance from the intersection being announced may be determined as the optimal landmark and used for guidance.
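The selection rule just mentioned, choosing the matched landmark closest to the guided intersection, reduces to a one-line minimum; the pair representation is an assumption for illustration:

```python
def pick_optimal(landmarks):
    """Choose the matched landmark closest to the guided intersection.

    Each entry is a (name, distance_to_intersection_m) pair.
    Returns None when no landmark was matched.
    """
    if not landmarks:
        return None
    return min(landmarks, key=lambda lm: lm[1])

matched = [("AA Oil", 120.0), ("BB Mart", 35.0), ("CC Bank", 80.0)]
print(pick_optimal(matched))  # ('BB Mart', 35.0)
```

Other selection criteria (visibility duration, size in the image) could be swapped into the key function without changing the structure.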
[0064] Until the destination is reached (step S510: No), the process returns to step S503 and the subsequent processing is repeated; when the destination is reached (step S510: Yes), the processing of this flowchart ends. If, in step S507, the landmark does not appear in the image (step S507: No), the process returns to step S505 and the subsequent processing is repeated.
[0065] Note that the order of steps S507 and S508 may be reversed. Alternatively, landmarks well suited for guidance, such as those with a short distance from the intersection, may be collated with the image one at a time in that order. Furthermore, the processing of steps S505 to S508 may be performed continuously so that landmarks visible to the driver are extracted in advance, and when the guidance point is reached, guidance is given based on the most recently visible landmark.
[0066] If there is time between determining the landmark to be used in the guidance information and outputting the guidance information, whether the landmark remains in the image (i.e., remains visible to the driver) may be monitored continuously. If the determined landmark disappears from the image partway through (i.e., the driver can no longer see it), another landmark that is in the image at that time is used as the landmark for the guidance information.
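The monitoring-and-fallback behavior above can be sketched as a simple check run each frame: keep the chosen landmark while it stays visible, otherwise substitute one that is currently visible. The list-of-names representation is an assumption:

```python
def current_guidance_landmark(chosen, visible_now):
    """If the landmark chosen earlier is still in the image, keep it;
    otherwise fall back to any landmark that is visible right now."""
    if chosen in visible_now:
        return chosen
    return visible_now[0] if visible_now else None

# The chosen landmark drops out of the image mid-approach:
print(current_guidance_landmark("AA Oil", ["BB Mart", "CC Bank"]))  # 'BB Mart'
# The chosen landmark is still visible:
print(current_guidance_landmark("AA Oil", ["AA Oil", "BB Mart"]))   # 'AA Oil'
```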
[0067] Next, the case where the landmark used in the guidance information is determined from an image captured by the in-vehicle camera 311 will be described. FIG. 6 is a flowchart showing the procedure of route guidance processing based on landmarks extracted from the image. In the flowchart of FIG. 6, as in the flowchart of FIG. 5, the navigation device 300 determines a travel route to the destination (step S601) and waits until the vehicle on which it is mounted starts traveling (step S602: No loop).
[0068] When the vehicle starts traveling (step S602: Yes), the in-vehicle camera 311 captures the scenery in the direction of travel as seen from the driver's seat (step S603). Visible landmarks are then extracted from the captured image (step S604). A visible landmark is a landmark the driver can see. For the landmarks included in the map data, the navigation device 300 holds, in advance, shape information corresponding to each type, text information for names, and trademark information (logos, figures, etc.). The image is analyzed, and anything matching these shapes, texts, or trademarks is extracted as a landmark.
[0069] Subsequently, the navigation device 300 collates the extracted landmarks with the map data to acquire position information and related information (step S605). Related information is more detailed information about the position and name of a landmark. As for the position information, even when the corresponding landmark is not recorded in the map data, its position on the map data may be calculated from the relative distance between the landmark's position in the image and the position of the vehicle. Then, until the vehicle comes within a predetermined distance of a guidance point set in the direction of travel (step S606: No), the process returns to step S603 and capturing of the scenery in the direction of travel continues.
[0070] When the vehicle comes within the predetermined distance of the guidance point (step S606: Yes), the navigation device 300 outputs guidance information using the latest visible-landmark information (step S607). The content of the guidance information is the same as that described for the flowchart of FIG. 5. Until the destination is reached (step S608: No), the process returns to step S603 and the subsequent processing is repeated; when the destination is reached (step S608: Yes), the processing of this flowchart ends.
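The image-first flow of FIG. 6 (extract visible landmarks from the image by matching held templates, then collate them with the map data for position information) can be sketched as two small functions. The template and detection structures, and the dictionary-backed map lookup, are illustrative assumptions:

```python
from dataclasses import dataclass
from typing import Dict, List, Optional, Tuple

@dataclass
class Template:          # per-landmark data the device is assumed to hold
    name: str
    shape: str
    trademark: str

def extract_visible(detections: List[Dict], templates: List[Template]) -> List[str]:
    """Step S604 sketch: keep landmark names whose shape, name text, or
    trademark matches something recognized in the image."""
    found = []
    for d in detections:
        for t in templates:
            if (d["shape"] == t.shape or t.name in d["texts"]
                    or t.trademark in d["texts"]):
                found.append(t.name)
                break
    return found

def collate_with_map(names: List[str],
                     map_db: Dict[str, Tuple[float, float]]
                     ) -> Dict[str, Optional[Tuple[float, float]]]:
    """Step S605 sketch: look up position information for each visible
    landmark; None marks a landmark absent from the map data."""
    return {n: map_db.get(n) for n in names}

templates = [Template("AA Oil", "low_flat", "AA"),
             Template("BB Mart", "box", "BB")]
detections = [{"shape": "box", "texts": ["BB"]},
              {"shape": "tower", "texts": ["unknown"]}]
visible = extract_visible(detections, templates)
print(visible)                                          # ['BB Mart']
print(collate_with_map(visible, {"BB Mart": (35.0, 135.0)}))
```

A landmark mapped to None would be the case [0069] covers, where its map position must instead be estimated from the relative distance in the image.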
[0071] As described above, the navigation device 300 performs route guidance based on landmarks visible to the driver. This enables route guidance that is easier for the driver to understand.
[0072] In particular, in urban areas with many obstacles, or on mountain roads with many ups and downs and curves, the feature used for guidance often cannot be visually confirmed, so the driver may take the wrong route even though guidance information was output; and once the route is mistaken in such urban areas or on such mountain roads, correcting course is often difficult. According to the navigation device 300, since route guidance is performed using features that are actually visible, the possibility of taking the wrong route can be reduced. Furthermore, since the driver can easily understand the content of the guidance, the driver can drive with composure and practice safe driving.
[0073] In addition, by collating images captured by the in-vehicle camera 311 with the map data, landmarks visible to the driver can be identified with higher accuracy. Furthermore, by selecting, from among the landmarks the driver can identify, the landmark best suited for guidance at the guidance point, route guidance that is even easier to understand can be performed.
[0074] Note that the route guidance method described in this embodiment can be realized by executing a program prepared in advance on a computer such as a personal computer or a workstation. The program is recorded on a computer-readable recording medium such as a hard disk, flexible disk, CD-ROM, MO, or DVD, and is executed by being read from the recording medium by the computer. The program may also be a transmission medium that can be distributed via a network such as the Internet.

Claims

[1] A route guidance device comprising:
image acquisition means for acquiring an image capturing the direction of movement from the vicinity of the driver's seat of a mobile body;
map acquisition means for acquiring map information around the current position of the mobile body;
identification means for identifying, based on the image acquired by the image acquisition means and the map information acquired by the map acquisition means, a feature visible to the driver of the mobile body (hereinafter, "visible feature");
generation means for generating, using information on the visible feature identified by the identification means, guidance information for a route along which the mobile body moves; and
output means for outputting the guidance information generated by the generation means.
[2] The route guidance device according to claim 1, wherein the identification means comprises:
selection means for selecting, from among features whose information is included in the map information, a feature that is a candidate for the visible feature (hereinafter, "candidate feature"); and
determination means for determining whether the candidate feature selected by the selection means is captured in the image acquired by the image acquisition means,
and identifies, as the visible feature, a candidate feature determined by the determination means to be captured in the image.
[3] The route guidance device according to claim 2, wherein the determination means extracts, from the map information, information on at least one of the name, trademark, and shape of the candidate feature, and determines that the candidate feature is captured in the image when the extracted information on at least one of the name, trademark, and shape of the candidate feature is captured in the image.
[4] The route guidance device according to claim 1, wherein the identification means comprises:
selection means for selecting, from among features captured in the image, a feature that is a candidate for the visible feature (hereinafter, "candidate feature"); and
determination means for determining whether the map information includes information on the candidate feature selected by the selection means,
and identifies, as the visible feature, a candidate feature for which the determination means determines that the map information includes the information.
[5] The route guidance device according to claim 4, wherein the determination means extracts, from the image, information on at least one of the name, trademark, and shape of the candidate feature, and determines that the map information includes information on the candidate feature when the extracted information on at least one of the name, trademark, and shape of the candidate feature is included in the map information.
[6] A route guidance method comprising:
an image acquisition step of acquiring an image around the current position of a mobile body;
a map acquisition step of acquiring map information around the current position of the mobile body;
an identification step of identifying, based on the image acquired in the image acquisition step and the map information acquired in the map acquisition step, a feature visible to an occupant of the mobile body (hereinafter, "visible feature");
a generation step of generating, using information on the visible feature identified in the identification step, guidance information for a route along which the mobile body moves; and
an output step of outputting the guidance information generated in the generation step.
[7] A route guidance program causing a computer to execute the route guidance method according to claim 6.
[8] A computer-readable recording medium on which the route guidance program according to claim 7 is recorded.
PCT/JP2007/051658 2006-02-02 2007-02-01 Route guidance device, route guidance method, route guidance program, and recording medium WO2007088915A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2006025931 2006-02-02
JP2006-025931 2006-02-02

Publications (1)

Publication Number Publication Date
WO2007088915A1 true WO2007088915A1 (en) 2007-08-09

Family

ID=38327486

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2007/051658 WO2007088915A1 (en) 2006-02-02 2007-02-01 Route guidance device, route guidance method, route guidance program, and recording medium

Country Status (1)

Country Link
WO (1) WO2007088915A1 (en)


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09178497A (en) * 1995-12-26 1997-07-11 Aisin Aw Co Ltd Navigation device for vehicle
JPH11271074A (en) * 1998-03-20 1999-10-05 Fujitsu Ltd Device and method for comparing mark image and program storage medium
JP2005249504A (en) * 2004-03-03 2005-09-15 Fujitsu Ten Ltd Route guidance system, route guidance method and program for route guidance


Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009162722A (en) * 2008-01-10 2009-07-23 Pioneer Electronic Corp Guidance device, guidance technique, and guidance program
WO2014097347A1 (en) * 2012-12-18 2014-06-26 三菱電機株式会社 Visibility estimation device, visibility estimation method, and safe driving support system
CN104854638A (en) * 2012-12-18 2015-08-19 三菱电机株式会社 Visibility estimation device, visibility estimation method, and safe driving support system
JP5930067B2 (en) * 2012-12-18 2016-06-08 三菱電機株式会社 Visibility estimation device and safe driving support system
JP2017173154A (en) * 2016-03-24 2017-09-28 株式会社ゼンリンデータコム Map display system, map display method and program, as well as route display device
JP2018031647A (en) * 2016-08-24 2018-03-01 株式会社ゼンリンデータコム Map display system, map display method, program and route display device
JP2020149575A (en) * 2019-03-15 2020-09-17 トヨタ自動車株式会社 Server device and information processing method
JP7207045B2 (en) 2019-03-15 2023-01-18 トヨタ自動車株式会社 Server device and information processing method

Similar Documents

Publication Publication Date Title
US8600655B2 (en) Road marking recognition system
JP4936710B2 (en) Map display in navigation system
US20100026804A1 (en) Route guidance systems, methods, and programs
US8532917B2 (en) Method, system, and recording medium for navigating vehicle
US20090198443A1 (en) In-vehicle navigation device and parking space guiding method
WO2007122927A1 (en) Position registering device, position registering method, position registering program, and recording medium
US20080033638A1 (en) Information providing apparatus, information providing method, and computer product
EP2549232A2 (en) Travel guidance system, travel guidance method, and computer program product
JP2007240198A (en) Navigation apparatus
JP4550926B2 (en) Route search device, route search method, route search program, and recording medium
WO2007088915A1 (en) Route guidance device, route guidance method, route guidance program, and recording medium
JP4922637B2 (en) Route search device, route search method, route search program, and recording medium
JP2003279363A (en) On-board navigation apparatus
WO2008041283A1 (en) Route guidance device, information recording method, route guidance method, information recording program, route guidance program, and recording medium
JP2023138609A (en) Congestion display device, congestion display method, and congestion display program
JP6572654B2 (en) Route guidance system, route guidance method and computer program
JP4798549B2 (en) Car navigation system
JP5032592B2 (en) Route search device, route search method, route search program, and recording medium
JPWO2008047449A1 (en) Image display device, display control method, display control program, and recording medium
JP2007263580A (en) Route search device, route search method, route search program, and recording medium
JPWO2009057207A1 (en) Point guide device, point guide method, point guide program, and recording medium
JP2008134140A (en) On-vehicle navigation device
WO2008041338A1 (en) Map display, map display method, map display program, and recording medium
JP2007132724A (en) Navigation device and facility search method
WO2010058482A1 (en) Information display device, information display method, information display program, and recording medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 07707844

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP