WO2011135660A1 - Navigation system, navigation method, navigation program, and storage medium - Google Patents

Navigation system, navigation method, navigation program, and storage medium

Info

Publication number
WO2011135660A1
Authority
WO
WIPO (PCT)
Prior art keywords
route
current position
display image
information
road surface
Prior art date
Application number
PCT/JP2010/057392
Other languages
French (fr)
Japanese (ja)
Inventor
Tomohiro Hirose
Original Assignee
Pioneer Corporation
Priority date
Filing date
Publication date
Application filed by Pioneer Corporation
Priority to PCT/JP2010/057392 priority Critical patent/WO2011135660A1/en
Priority to JP2011530191A priority patent/JP4833384B1/en
Publication of WO2011135660A1 publication Critical patent/WO2011135660A1/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/36 Input/output arrangements for on-board computers
    • G01C21/3626 Details of the output of route guidance instructions
    • G01C21/3647 Guidance involving output of stored or live camera images or video streams
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/36 Input/output arrangements for on-board computers
    • G01C21/3626 Details of the output of route guidance instructions
    • G01C21/365 Guidance using head up displays or projectors, e.g. virtual vehicles or arrows projected on the windscreen or on the road itself

Definitions

  • the present invention relates to a navigation device, a navigation method, a navigation program, and a recording medium that display a route to a destination.
  • the use of the present invention is not limited to the above-described navigation device, navigation method, navigation program, and recording medium.
  • in Patent Document 1, since the guidance display is continuously superimposed on the road surface of the road, the guidance display is drawn on the road surface when the positioning accuracy of the self-position is good, which makes it easy to understand where to turn (see FIG. 16). However, if the guidance display is drawn on the road surface when the positioning accuracy of the self-position is poor, the road in the video and the guidance display become misaligned (see FIG. 20).
  • a navigation device includes current position information acquisition means for measuring the current position of a mobile body and acquiring current position information; route information acquisition means for acquiring route information relating to a route to a destination; video acquisition means for acquiring a video of the road surface of the road corresponding to the route; and display means for displaying a route display image, generated based on the current position information and the route information, superimposed on the video at a predetermined height from the position of the road surface, the display means displaying the route display image with the predetermined height changed according to the reliability of positioning of the current position.
  • the navigation device includes a current position information acquisition unit that measures the current position of a mobile body and acquires current position information; a route information acquisition unit that acquires route information regarding a route to a destination; and projection means for projecting a route display image, generated based on the current position information and the route information, onto a transmissive member located between the viewpoint of the moving person of the moving body and the road surface corresponding to the route, at a predetermined height from the position of the road surface on the transmissive member as seen by the moving person, the projection means projecting the route display image with the predetermined height changed according to the reliability of positioning of the current position.
  • a navigation method is a navigation method in a navigation device, and includes a current position information acquisition step of acquiring current position information by measuring the current position of a moving body; a route information acquisition step of acquiring route information relating to a route to a destination; a video acquisition step of acquiring a video of the road surface of the road corresponding to the route; and a display step of displaying a route display image, generated based on the current position information and the route information, superimposed on the video at a predetermined height from the position of the road surface, wherein in the display step the route display image with the predetermined height changed according to the reliability of positioning of the current position is displayed.
  • a navigation method is a navigation method in a navigation device, and includes a current position information acquisition step of acquiring current position information by measuring the current position of a moving body; a route information acquisition step of acquiring route information relating to a route to a destination; and a projection step of projecting a route display image, generated based on the current position information and the route information, onto a transmissive member located between the viewpoint of the moving person of the moving body and the road surface of the road corresponding to the route, at a predetermined height from the position of the road surface on the transmissive member as seen by the moving person, wherein in the projection step the route display image with the predetermined height changed according to the positioning accuracy of the current position is projected.
  • a navigation program according to the invention of claim 11 causes a computer to execute the navigation method according to claim 9 or 10.
  • the recording medium according to the invention of claim 12 is a computer-readable recording medium on which the navigation program according to claim 11 is recorded.
  • FIG. 1 is a block diagram of a functional configuration of the navigation device according to the first embodiment.
  • FIG. 2 is a flowchart of a navigation process performed by the navigation device according to the first embodiment.
  • FIG. 3 is a block diagram of a functional configuration of the navigation device according to the second embodiment.
  • FIG. 4 is a flowchart of a navigation process performed by the navigation device according to the second embodiment.
  • FIG. 5 is a block diagram of a functional configuration of the information display apparatus according to the third embodiment.
  • FIG. 6 is a block diagram illustrating a hardware configuration of the navigation device.
  • FIG. 7 is a block diagram illustrating a functional configuration of the generation unit of the navigation device.
  • FIG. 8 is a flowchart illustrating a processing procedure of the generation unit of the navigation device.
  • FIG. 9 is an explanatory diagram showing map data (node information, link information) at intersections of crossroads.
  • FIG. 10A is an explanatory diagram of the contents of the placement process in the placement processing unit of the generation unit.
  • FIG. 10B is an explanatory diagram of the contents of the placement process in the placement processing unit of the generation unit.
  • FIG. 11A is an explanatory diagram of the contents of the placement process in the placement processing unit of the generation unit.
  • FIG. 11B is an explanatory diagram of the contents of the placement processing in the placement processing unit of the generation unit.
  • FIG. 12A is an explanatory diagram of the contents of the movement process in the movement processing unit of the generation unit.
  • FIG. 12B is an explanatory diagram of the contents of the movement process in the movement processing unit of the generation unit.
  • FIG. 13A is an explanatory diagram of the contents of setting processing in the setting processing unit of the generation unit.
  • FIG. 13B is an explanatory diagram of the contents of setting processing in the setting processing unit of the generation unit.
  • FIG. 14 is an explanatory diagram showing a state rendered by the rendering processing unit of the generation unit.
  • FIG. 15 is an explanatory diagram showing a state in which an aerial route guidance display image is superimposed on the front video.
  • FIG. 16 is an explanatory diagram showing a state in which a road surface route guidance display image is displayed superimposed on the front video.
  • FIG. 17 is a flowchart illustrating a procedure of switching generation processing in the generation unit of the navigation device.
  • FIG. 18 is a flowchart showing a procedure (part 1) of determination processing as to whether or not the output value of each sensor is normal.
  • FIG. 19 is a flowchart showing a procedure (part 2) of determination processing as to whether or not the output value of each sensor is normal.
  • FIG. 20 is an explanatory diagram illustrating a state in which a road surface route guidance display image is displayed over the front video.
  • FIG. 21 is an explanatory diagram showing a state in which an aerial route guidance display image is displayed superimposed on the front video.
  • FIG. 22 is an explanatory diagram showing a state in which the sky route guidance display image is projected on the front window.
  • FIG. 1 is a block diagram of a functional configuration of the navigation device according to the first embodiment.
  • the navigation apparatus 100 according to the first embodiment includes a current position information acquisition unit 101, a route information acquisition unit 102, a storage unit 103, a generation unit 104, a video acquisition unit 105, a display unit 106, an input unit 107, a determination unit 108, and a sensor 109.
  • the current position information acquisition unit 101 measures the current position of the moving body and acquires current position information.
  • the information related to the current position of the mobile object may include information such as the speed and travel direction of the mobile object in addition to the latitude and longitude information.
  • Information relating to the current position of the moving object can be acquired by, for example, a sensor 109 described later.
  • the route information acquisition unit 102 acquires route information related to the route to the destination.
  • the route information may be registered in advance, may be acquired by being received by communication means, or may be acquired by input of an operator (driver or passenger). In that case, only information on the destination may be input by the operator, a route may be searched based on the current position of the moving body (for example, the own vehicle) and the input destination, and the determined route may be acquired as route information.
  • the route information may include link information of each road corresponding to the route to the destination and node information indicating an intersection.
  • the storage unit 103 stores so-called map information, specifically, for example, link information including the shape of a road and node information indicating an intersection.
  • the map information may further include traffic lights, sign information, building information (for example, building shape and height), and the like.
  • the link information may include information on road width information and inclination.
  • the storage unit 103 may be provided in the navigation device 100, or an external server accessible by communication may realize the function of the storage unit 103.
  • the generation unit 104 generates a route display image based on the current position information acquired by the current position information acquisition unit 101 and the route information acquired by the route information acquisition unit 102. Specifically, for example, it generates, switching at a predetermined timing, either a road surface route guidance display image, which is a belt-like route display image continuous along the route and displayed so as to overlap the road surface, or a sky route guidance display image, which is a belt-like route display image continuous along the route and displayed at a predetermined height from the road surface. At that time, if the reliability of positioning of the current position is high, a road surface route guidance display image with the predetermined height at the position directly above the road surface is generated; if the reliability of positioning of the current position is low, a sky route guidance display image with the predetermined height at a position higher than the viewpoint from which the video is captured is generated. A specific method for generating the road surface route guidance display image and the sky route guidance display image, the switching timing, and the like will be described later.
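  • As a rough, non-authoritative sketch of the selection just described, the following Python fragment (names and numeric values are assumptions for illustration, not taken from this publication) chooses the display height of the belt-like route display image from the positioning reliability:

      # Hypothetical sketch: choose the height of the route display image
      # according to the reliability of positioning of the current position.

      ROAD_SURFACE_HEIGHT_M = 0.0        # road surface route guidance display image
      SKY_HEIGHT_M = 20.0                # sky route guidance display image (assumed value)
      CAMERA_VIEWPOINT_HEIGHT_M = 1.5    # assumed height of the imaging viewpoint

      def select_route_image_height(positioning_is_reliable: bool) -> float:
          """Return the height (m above the road surface) of the route display image."""
          if positioning_is_reliable:
              # High reliability: draw the image directly on the road surface.
              return ROAD_SURFACE_HEIGHT_M
          # Low reliability: draw the image above the viewpoint that captures
          # the video, so that it is seen by looking up at the sky.
          assert SKY_HEIGHT_M > CAMERA_VIEWPOINT_HEIGHT_M
          return SKY_HEIGHT_M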
  • the generation unit 104 switches from one of the road surface route guidance display image and the sky route guidance display image to the other depending on whether a predetermined condition is met. Specific contents of the predetermined condition will be described later.
  • the video acquisition unit 105 acquires a video of the road surface corresponding to the route (for example, from the imaging unit 110).
  • the imaging unit 110 may be included in the navigation device 100.
  • the display unit 106 displays the route display image, generated (for example, by the generation unit 104) based on the current position information and the route information, superimposed at a predetermined height from the position of the road surface on the video (for example, acquired by the video acquisition unit 105).
  • the display unit 106 displays the route display image with the predetermined height changed according to the reliability of positioning of the current position. Further, when the display unit 106 switches from one of the road surface route guidance display image and the sky route guidance display image to the other, it may maintain the display of the route display image after the switching until a predetermined condition is satisfied.
  • the input unit 107 accepts an input of an instruction to switch to the other image while either the road surface route guidance display image or the sky route guidance display image is displayed. When the input unit 107 receives such an input, the display unit 106 may switch to the other display regardless of the reliability of positioning determined by the determination unit 108 described later.
  • the determination unit 108 determines the reliability of positioning of the current position (for example, acquired by the current position information acquisition unit 101), and passes the determination result to the generation unit 104. Details of specific determination processing will be described later.
  • the generation unit 104 may generate a road surface route guidance display image with the predetermined height at the position directly above the road surface when the reliability of positioning of the current position is high, and may generate a sky route guidance display image with the predetermined height at a position higher than the viewpoint from which the video is captured when the reliability of positioning of the current position is low.
  • the sensor 109 includes a plurality of self-contained navigation sensors that measure the current position and direction. Then, the determination unit 108 can determine the reliability depending on whether or not the positioning results of the independent navigation sensors 109 are consistent.
  • the sensor 109 includes a self-contained navigation sensor that measures the current position and direction and a GPS sensor. The determination unit 108 can determine the reliability based on whether or not the positioning results of the self-contained navigation sensor and the GPS sensor match.
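  • For illustration, the determination unit could, under the assumptions sketched below (the tolerance value is hypothetical), treat positioning as reliable when the self-contained navigation (dead reckoning) result and the GPS result agree:

      # Hypothetical sketch: judge positioning reliability from the agreement
      # between the dead-reckoned position and the GPS position.
      import math

      AGREEMENT_TOLERANCE_M = 30.0   # placeholder tolerance, not from the patent

      def positioning_is_reliable(dead_reckoned_xy, gps_xy) -> bool:
          """Consistent sensor results are treated as reliable positioning."""
          dx = dead_reckoned_xy[0] - gps_xy[0]
          dy = dead_reckoned_xy[1] - gps_xy[1]
          return math.hypot(dx, dy) <= AGREEMENT_TOLERANCE_M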
  • FIG. 2 is a flowchart showing a procedure of navigation processing by the navigation device.
  • the navigation device 100 first acquires the current position information by measuring the current position of the moving body by the current position information acquisition unit 101 (step S201). At this time, information regarding the reliability of positioning of the current position may be acquired.
  • the route information acquisition unit 102 acquires route information regarding the route to the destination (step S202). At that time, the navigation device 100 may also acquire road information.
  • the navigation device 100 generates a route display image (road surface route guidance display image or sky route guidance display image) according to the reliability of positioning of the current position, based on the current position information and the route information (step S204).
  • the navigation device 100 acquires a video image of the road surface corresponding to the route (step S205). Then, the navigation device 100 displays the route display image generated based on the current position information and the route information so as to overlap the predetermined height from the position of the road surface on the video (step S206).
  • the navigation apparatus 100 then waits for the current position to change as the moving body (for example, the own vehicle) travels (step S207: No), and when the current position has changed (step S207: Yes), determines whether or not the destination has been reached (step S208). If the destination has not yet been reached (step S208: No), the process returns to step S201, and the processes in steps S201 to S207 are repeated. When the destination has been reached (step S208: Yes), the series of processing ends.
  • a route display image (either a road route guidance display image or a sky route guidance display image) according to the reliability of positioning of the current position is displayed on the video until the moving body (for example, the own vehicle) arrives at the destination. It continues to be displayed repeatedly. The operator can recognize the route to the destination without making a mistake by viewing the route display image displayed together with the video on the display screen while traveling.
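  • The repetition described above (steps S201 to S208 of FIG. 2) can be summarised by the following loop sketch; the helper methods are hypothetical stand-ins for the units of FIG. 1:

      # Hypothetical outline of the navigation processing loop of FIG. 2.

      def navigation_loop(navi):
          while True:
              position, reliability = navi.acquire_current_position()   # step S201
              route = navi.acquire_route_information()                  # step S202
              road_info = navi.acquire_road_information()               # optional
              image = navi.generate_route_display_image(                # step S204
                  position, route, road_info, reliability)
              video = navi.acquire_front_video()                        # step S205
              navi.display_superimposed(video, image)                   # step S206
              navi.wait_until_position_changes()                        # step S207
              if navi.has_arrived_at_destination():                     # step S208
                  break                                                 # end of processing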
  • the navigation device 100 may be configured as a single device in which the component units are integrated, or each component unit may be a separate device and the functions of the navigation device 100 may be realized by a plurality of devices.
  • FIG. 3 is a block diagram of a functional configuration of the navigation device according to the second embodiment.
  • the navigation apparatus 300 according to the second embodiment includes a current position information acquisition unit 301, a route information acquisition unit 302, a storage unit 303, a generation unit 304, a projection unit 305, an input unit 306, a determination unit 307, and a sensor 308.
  • the current position information acquisition unit 301, the route information acquisition unit 302, the storage unit 303, the input unit 306, the determination unit 307, and the sensor 308 have the same configurations as the current position information acquisition unit 101, the route information acquisition unit 102, the storage unit 103, the input unit 107, the determination unit 108, and the sensor 109 in the first embodiment, and a description thereof is omitted.
  • the generation unit 304 generates a route display image based on the current position information acquired by the current position information acquisition unit 301, the route information acquired by the route information acquisition unit 302, and either only the link information stored in the storage unit 303 or both the node information and the link information. Specifically, for example, it generates a belt-like route display image that is continuous along the route and displayed at a predetermined height from the road surface. At that time, when the reliability of positioning of the current position is high, a road surface route guidance display image with the predetermined height at the position directly above the road surface is generated; when the reliability of positioning of the current position is low, a sky route guidance display image with the predetermined height at a position higher than the viewpoint of the moving person is generated.
  • the generation unit 304 is different from the generation unit 104 of the first embodiment in a specific generation method of the road surface route guidance display image and the sky route guidance display image.
  • a specific method for generating the road surface route guidance display image and the sky route guidance display image will be described later.
  • the difference between the navigation device 300 according to the second embodiment and the navigation device 100 according to the first embodiment is that the navigation device 100 according to the first embodiment includes a video acquisition unit 105 and a display unit 106.
  • the navigation device 300 according to the second embodiment is not provided with the video acquisition unit 105 and the display unit 106 but is provided with a projection unit 305 instead.
  • the projection unit 305 projects a route display image, generated based on the current position information and the route information, onto a transmissive member located between the viewpoint of the moving person of the moving body and the road surface corresponding to the route, at a predetermined height from the position of the road surface on the transmissive member as seen by the moving person.
  • the road surface route guidance display image or the sky route guidance display image generated by the generation unit 304 is projected onto the window of the corresponding moving body (for example, the own vehicle).
  • the image is projected onto a window (usually a front window) in which the road corresponding to the route can be seen from inside the vehicle.
  • the projection unit 305 projects a route display image whose predetermined height has been changed according to the reliability of positioning at the current position.
  • FIG. 4 is a flowchart showing a procedure of navigation processing by the navigation device.
  • the navigation device 300 first acquires current position information by measuring the current position of the moving body with the current position information acquisition unit 301 (step S401). At that time, the reliability of positioning of the current position may also be determined.
  • the route information acquisition unit 302 acquires route information regarding the route to the destination (step S402). At that time, the navigation apparatus 300 may also acquire road information (node information / link information) stored in the storage unit 303 (step S403).
  • the navigation device 300 generates a route display image (road surface route guidance display image or sky route guidance display image) according to the reliability of positioning of the current position, based on the current position information and the route information (step S404), and projects the generated route display image at a predetermined height from the position of the road surface on the transmissive member as seen by the moving person. Specifically, for example, the generated road surface route guidance display image or sky route guidance display image is projected onto the front window of the moving body (for example, the own vehicle) (step S405).
  • the navigation apparatus 300 then waits for the current position to change as the moving body (for example, the own vehicle) travels (step S406: No), and when the current position has changed (step S406: Yes), determines whether or not the destination has been reached (step S407). If the destination has not yet been reached (step S407: No), the process returns to step S401, and the processes of steps S401 to S406 are repeated. When the destination has been reached (step S407: Yes), the series of processing ends.
  • the road surface route guidance display image or the sky route guidance display image continues to be projected on the front window of the moving body (for example, the own vehicle) until the moving body (for example, the own vehicle) arrives at the destination. While traveling, the operator can recognize the route to the destination by viewing the actual scenery ahead through the front window and viewing the projected road route guidance display image or the sky route guidance display image.
  • the navigation device 300 may be configured as a single device in which the component units are integrated, or each component unit may be a separate device and the functions of the navigation device 300 may be realized by a plurality of devices.
  • the navigation device is mounted on a moving body such as an automobile.
  • the navigation apparatus is not limited to the one mounted on the moving body.
  • FIG. 5 is a block diagram of a functional configuration of the information display apparatus according to the third embodiment.
  • the information display apparatus 500 according to the third embodiment includes a current position information acquisition unit 501, a route information acquisition unit 502, a storage unit 503, a generation unit 504, a video acquisition unit 505, a display unit 506, an input unit 507, a determination unit 508, a sensor 509, and an imaging unit 510.
  • the current position information acquisition unit 501 measures the current position of the own device and acquires current position information.
  • the information about the current position of the device itself may include, in addition to latitude and longitude, information such as the direction in which the device (particularly its display screen) is facing, the height of the device from the ground, and the moving speed of the device (or of the operator carrying the device). Information relating to the current position of the device itself can be acquired, for example, by a sensor 509 described later.
  • the route information acquisition unit 502 acquires route information related to the route to the destination.
  • the route information may be registered in advance, may be acquired by being received by a communication unit, or may be acquired by input of an operator of the device. In that case, only information related to the destination may be input by the operator, a route may be searched based on the current position of the device and the input destination, and the determined route may be acquired as route information.
  • the storage unit 503 stores so-called map information, specifically, link information including, for example, a road shape, and node information indicating an intersection.
  • the map information may further include traffic lights, sign information, building information (for example, building shape and height), and the like.
  • the link information may include information on road width information and inclination.
  • the storage unit 503 may be provided in the information display device 500, or an external server accessible by communication may realize the function of the storage unit 503.
  • the generation unit 504 generates a route display image based on the current position information acquired by the current position information acquisition unit 501 and the route information acquired by the route information acquisition unit 502. Specifically, for example, it generates, switching at a predetermined timing, either a road surface route guidance display image, which is a belt-like route display image continuous along the route and displayed so as to overlap the road surface, or a sky route guidance display image, which is a belt-like route display image continuous along the route and displayed at a predetermined height from the road surface. At that time, if the reliability of positioning of the current position is high, a road surface route guidance display image with the predetermined height at the position directly above the road surface is generated; if the reliability of positioning of the current position is low, a sky route guidance display image with the predetermined height at a position higher than the viewpoint from which the video is captured is generated. A specific method for generating the road surface route guidance display image and the sky route guidance display image, the switching timing, and the like will be described later.
  • the video acquisition unit 505 acquires the video from the imaging unit 510 that captures the video on the route of the device itself.
  • the imaging unit 510 is configured to be included in the information display device 500. However, it is only necessary that the imaging unit 510 is connected to the information display device 500 by wire or wireless and the information display device 500 can acquire the captured video.
  • the display unit 506 displays the route display image, generated (for example, by the generation unit 504) based on the current position information and the route information, superimposed at a predetermined height from the position of the road surface on the video (for example, acquired by the video acquisition unit 505). The display unit 506 displays the route display image with the predetermined height changed according to the reliability of positioning of the current position. Further, when the display unit 506 switches from one of the road surface route guidance display image and the sky route guidance display image to the other, it may maintain the display of the route display image after the switching until a predetermined condition is satisfied.
  • the input unit 507 receives an input of an instruction to switch to the other when either the road surface route guidance display image or the sky route guidance display image is displayed. Then, when the input unit 507 receives an input, the display unit 506 may switch to the other display without depending on the reliability of positioning determined by the determination unit 508 described later.
  • the determination unit 508 determines the reliability of positioning of the current position (for example, acquired by the current position information acquisition unit 501), and passes the determination result to the generation unit 504. Details of specific determination processing will be described later.
  • the generation unit 504 may generate a road surface route guidance display image with the predetermined height at the position directly above the road surface when the reliability of positioning of the current position is high, and may generate a sky route guidance display image with the predetermined height at a position higher than the viewpoint from which the video is captured when the reliability of positioning of the current position is low.
  • the sensor 509 includes a plurality of self-contained navigation sensors that measure the current position and direction. Then, the determination unit 508 can determine the reliability based on whether or not the positioning results of the respective independent navigation sensors 509 match.
  • the sensor 509 includes a self-contained navigation sensor that measures the current position and direction and a GPS sensor. The determination unit 508 can determine the reliability based on whether or not the positioning results of the self-contained navigation sensor and the GPS sensor match.
  • the information display device 500 may be configured as a single device in which the component units are integrated, or each component unit may be a separate device and the functions of the information display device 500 may be realized by a plurality of devices.
  • Information display device 500 may be, for example, a portable information terminal device (more specifically, for example, a mobile phone, a PDA, a mobile personal computer, etc.). Further, the information display device 500 can be mounted on a moving body to provide the navigation device 100 according to the first embodiment.
  • examples of the navigation device 100 according to the first embodiment, the navigation device 300 according to the second embodiment, and the information display device 500 according to the third embodiment will now be described, taking as an example the case where they are applied to a navigation device mounted on a vehicle.
  • FIG. 6 is a block diagram illustrating a hardware configuration of the navigation device 100, the navigation device 300, and the information display device 500 (hereinafter simply referred to as “navigation device”).
  • the navigation device includes a CPU 601, a ROM 602, a RAM (memory) 603, a magnetic disk drive 604, a magnetic disk 605, an optical disk drive 606, an optical disk 607, an audio I/F (interface) 608, a microphone 609, a speaker 610, an input device 611, a video I/F 612, a camera 613, a display 614, a projector 615, a communication I/F 616, a GPS unit 617, and various sensors 618. The constituent units 601 to 618 are connected to one another by a bus 620.
  • the CPU 601 governs overall control of the navigation device.
  • the ROM 602 records various programs such as a boot program, a communication program, a data display program, and a data analysis program.
  • the RAM 603 is used as a work area for the CPU 601.
  • the magnetic disk drive 604 controls reading/writing of data with respect to the magnetic disk 605 according to the control of the CPU 601.
  • the magnetic disk 605 records data written under the control of the magnetic disk drive 604.
  • as the magnetic disk 605, for example, an HD (hard disk) or an FD (flexible disk) can be used.
  • the optical disc drive 606 controls reading / writing of data with respect to the optical disc 607 according to the control of the CPU 601.
  • the optical disc 607 is a detachable recording medium from which data is read according to the control of the optical disc drive 606, and includes, for example, a Blu-ray disc, DVD, CD, and the like.
  • a writable recording medium can be used as the optical disk 607.
  • the removable recording medium may be an MO, a memory card, or the like.
  • examples of information recorded on the magnetic disk 605 and the optical disk 607 include map data used for route search and route guidance.
  • the map data includes background data representing features (features) such as buildings, rivers, and the ground surface, and road shape data representing the shape of the road.
  • the map data is drawn two-dimensionally or three-dimensionally on the display screen of the display 614. While the navigation device is guiding the route, the map data and the current position of the moving body (for example, the own vehicle) acquired by the GPS unit 617 described later are displayed so as to overlap each other.
  • the voice I / F 608 is connected to a microphone 609 for voice input and a speaker 610 for voice output.
  • the sound received by the microphone 609 is A / D converted in the sound I / F 608.
  • sound is output from the speaker 610. Note that the sound input from the microphone 609 can be recorded on the magnetic disk 605 or the optical disk 607 as sound data.
  • the input device 611 includes a remote controller, a keyboard, a mouse, a touch panel, and the like that are provided with a plurality of keys for inputting characters, numerical values, various instructions, and the like. Further, the input device 611 can connect other information processing terminals such as a digital camera and a mobile phone terminal to input / output data.
  • the video I / F 612 is connected to a camera 613 for video input, a display 614 for video output, and a projector 615 for video output.
  • the video I/F 612 includes, for example, a graphic controller that controls the display 614 and the projector 615 as a whole, a buffer memory such as a VRAM (Video RAM) that temporarily records image information that can be displayed immediately, and a control IC that controls display on the display 614 and the projector 615 based on image data output from the graphic controller.
  • the camera 613 captures images inside and outside the vehicle and outputs them as image data.
  • An image captured by the camera 613 can be recorded on the magnetic disk 605 or the optical disk 607 as image data.
  • the display 614 displays icons, cursors, menus, windows, or various data such as characters and images.
  • as the display 614, for example, a CRT, a TFT liquid crystal display, a plasma display, or the like can be adopted.
  • the display 614 can realize the functions of the display screen controlled by the display unit 106 in Embodiment 1 and the display unit 506 in Embodiment 3.
  • the projector 615 displays icons, cursors, menus, windows, or various data such as characters and images.
  • the projector 615 projects various data on the front window using, for example, a CRT or a liquid crystal.
  • the projector 615 is installed on the ceiling or the upper part of the seat in the vehicle.
  • the projector 615 can realize the function of the projection unit 305 of the second embodiment.
  • the communication I / F 616 is connected to the network via wireless and functions as an interface between the navigation device and the CPU 601.
  • the communication I / F 616 is further connected to a communication network such as the Internet via wireless, and also functions as an interface between the communication network and the CPU 601.
  • Communication networks include LANs, WANs, public line networks and mobile phone networks.
  • the GPS unit 617 receives radio waves from GPS satellites, and calculates information indicating the current position of the vehicle (current position of the navigation device).
  • the output information of the GPS unit 617 is used when the current position of the vehicle is calculated by the CPU 601 together with output values of various sensors described later.
  • the information indicating the current location is information for specifying one point on the map data, such as latitude / longitude and altitude.
  • the various sensors 618 are, for example, a gyro sensor, an acceleration sensor, a vehicle speed sensor, and the like, and detect the moving state of the vehicle. Output signals from the various sensors 618 are used for the calculation of the current location by the CPU 601 and the measurement of changes in speed and direction.
  • in the first embodiment, the current position information acquisition unit 101, the determination unit 108, and the sensor 109 realize their functions by the GPS unit 617, the various sensors 618, and the CPU 601 (using programs recorded in the ROM 602, the RAM 603, the magnetic disk 605, or the optical disk 607); the route information acquisition unit 102 and the storage unit 103 by the CPU 601, the magnetic disk drive 604 and the magnetic disk 605, or the optical disk drive 606 and the optical disk 607; the generation unit 104 by the CPU 601; the video acquisition unit 105 by the CPU 601, the video I/F 612, the camera 613, or the communication I/F 616; the display unit 106 by the CPU 601, the video I/F 612, and the display 614; the imaging unit 110 by the camera 613 and the communication I/F 616; and the input unit 107 by the input device 611.
  • in the second embodiment, the current position information acquisition unit 301, the determination unit 307, and the sensor 308 realize their functions by the GPS unit 617, the various sensors 618, and the CPU 601 (using programs recorded in the ROM 602, the RAM 603, the magnetic disk 605, or the optical disk 607); the route information acquisition unit 302 and the storage unit 303 by the CPU 601, the magnetic disk drive 604 and the magnetic disk 605, or the optical disk drive 606 and the optical disk 607; the generation unit 304 by the CPU 601; the projection unit 305 by the CPU 601, the video I/F 612, and the projector 615; and the input unit 306 by the input device 611.
  • in the third embodiment, the current position information acquisition unit 501, the determination unit 508, and the sensor 509 realize their functions by the GPS unit 617, the various sensors 618, and the CPU 601 (using programs recorded in the ROM 602, the RAM 603, the magnetic disk 605, or the optical disk 607); the route information acquisition unit 502 and the storage unit 503 by the CPU 601, the magnetic disk drive 604 and the magnetic disk 605, or the optical disk drive 606 and the optical disk 607; the generation unit 504 by the CPU 601; the video acquisition unit 505 by the CPU 601, the video I/F 612, the camera 613, or the communication I/F 616; and the display unit 506 by the CPU 601, the video I/F 612, and the display 614.
  • FIG. 7 is a block diagram illustrating a functional configuration of the generation unit of the navigation device.
  • FIG. 8 is a flowchart showing a processing procedure of the generation unit of the navigation device.
  • the generation unit includes an arrangement processing unit (arrangement unit) 701, a movement processing unit (movement unit) 702, a setting processing unit (setting unit) 703, a rendering processing unit (rendering unit) 704, and an adjustment unit 705.
  • the arrangement processing unit 701 arranges a band-like object corresponding to a planar road in the space for three-dimensional calculation based only on link information or based on node information and link information (step S801). At that time, based on the route information, only the link information corresponding to the route is extracted, and a portion other than the route is deleted (step S802).
  • the movement processing unit 702 moves the band-like object arranged by the arrangement processing unit 701 by a predetermined amount (predetermined height) in the height direction orthogonal to the plane in the space for three-dimensional calculation (step S803).
  • the setting processing unit 703 sets, for the band-shaped object moved by the movement processing unit 702, the viewpoint position, viewpoint direction, and height for the three-dimensional calculation to the position, direction, and height of the imaging unit, or to the viewpoint position, viewpoint direction, and height of the operator (driver or passenger) (step S804).
  • the rendering processing unit 704 renders the band-like object viewed from the viewpoint position, the viewpoint direction, and the height set by the setting processing unit 703 as a sky route guidance display image (step S805).
  • the adjustment unit 705 can adjust the predetermined amount (predetermined height) from the road surface (the ground), and in step S803 the movement processing unit 702 moves the band-like object in the height direction perpendicular to the plane in the space for three-dimensional calculation by the height adjusted by the adjustment unit 705.
  • the adjustment unit 705 may perform the adjustment according to the route information (for example, road inclination, presence or absence of an intersection (node), presence or absence of a traffic light, etc.). The height may also be adjusted to the operator's desired height by an input from the operator.
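  • A compact sketch of steps S801 to S804 follows (the rendering of step S805 is sketched separately after the description of FIG. 14); the data structures and helper names are assumptions for illustration only:

      # Hypothetical sketch of the generation pipeline (steps S801 to S804).
      from dataclasses import dataclass
      from typing import List, Tuple

      XY = Tuple[float, float]
      XYZ = Tuple[float, float, float]

      @dataclass
      class Band:
          vertices: List[XYZ]   # outline of one belt-like object

      def arrange(route_link_outlines: List[List[XY]]) -> List[Band]:
          # S801/S802: place planar (z = 0) band objects for the links on the
          # route only; links that are not on the route are simply not passed in.
          return [Band([(x, y, 0.0) for (x, y) in outline])
                  for outline in route_link_outlines]

      def move(bands: List[Band], height_m: float) -> List[Band]:
          # S803: raise every band by the predetermined height (e.g. about 20 m
          # for the sky image; 0 m, i.e. effectively skipped, for the road
          # surface image). The height may come from the adjustment unit 705.
          return [Band([(x, y, z + height_m) for (x, y, z) in b.vertices])
                  for b in bands]

      def set_viewpoint(position: XYZ, direction_deg: float) -> dict:
          # S804: use the imaging unit's (or the driver's/passenger's) position,
          # direction and height, e.g. about 1.5 m above the ground.
          return {"position": position, "direction_deg": direction_deg}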
  • FIG. 9 is an explanatory diagram showing map data (node information, link information) at an intersection of crossroads, and FIGS. 10A to 11B are explanatory diagrams showing the contents of the arrangement processing in the arrangement processing unit 701 of the generation unit.
  • the map data includes a node 900 indicating an intersection and links 901 to 904 indicating four roads with respect to the node 900.
  • each link has information such as the shape and width of the road. FIG. 10A shows, from a top-down viewpoint, the links arranged as planar roads in the space for three-dimensional calculation based on that information.
  • the arrangement processing unit 701 arranges the link 901 shown in FIG. 9 as a planar road in the space for three-dimensional calculation in consideration of the shape and width of the road, and sets it as a link 1001.
  • the arrangement processing unit 701 arranges the links 902 to 904 shown in FIG. 9 as planar roads in the space for three-dimensional calculation to form links (band images) 1002 to 1004.
  • FIG. 10B is a view from the side (viewpoint from the lower side toward the upper side of FIG. 10A) of what is arranged as planar roads in the space for three-dimensional calculation in FIG. 10A. At this time, the height (Z-axis direction) of the links 1001 to 1004 is the reference height, that is, height 0. The reference height can be set arbitrarily; for example, the ground surface can be used as the reference.
  • next, only the links corresponding to the route are extracted from the four links 1001 to 1004 in FIG. 10A. FIG. 11A also shows a plan view in which planar roads are arranged in the space for three-dimensional calculation. Since the route runs from the link 901 in FIG. 9 to the node 900, turns right at the node 900, and continues to the link 902, in FIG. 11A the placement processing unit 701 extracts only the links 1001 and 1002 of FIG. 10A and deletes the links other than the route (the links 1003 and 1004).
  • the placement processing unit 701 may omit placing the links other than the route (the links 1003 and 1004) from the beginning (that is, FIG. 10A may be skipped and the arrangement of FIG. 11A may be used in the space for three-dimensional calculation from the start).
  • FIG. 11B, like FIG. 10B, shows what is arranged as planar roads in the space for three-dimensional calculation in FIG. 11A viewed from the side (viewpoint from the lower side toward the upper side of FIG. 11A). At this point, the height (Z-axis direction) of the links 1001 and 1002 is still zero.
  • FIGS. 12A and 12B are explanatory diagrams showing the contents of the movement processing in the movement processing unit 702 of the generation unit.
  • the movement processing unit 702 moves the links 1001 and 1002 arranged by the arrangement processing unit 701 upward (Z-axis direction) by a predetermined height (here, about 20 m above the ground). Thereby, the arrangement of the objects (band-like objects) in the space for three-dimensional calculation is completed.
  • since FIG. 12A is drawn from the same top-down viewpoint as FIGS. 10A and 11A, it looks the same as FIG. 11A (that is, the movement in the Z-axis direction cannot be seen). FIG. 12B is a view from the side (viewpoint from the lower side toward the upper side of FIG. 12A), as in FIGS. 10B and 11B; compared with FIG. 11B, it can be seen that the links 1001 and 1002 are 20 m higher than the original height (height 0) in the Z-axis direction.
  • FIGS. 13A and 13B are explanatory diagrams illustrating the contents of the setting process in the setting processing unit 703 of the generation unit.
  • the setting processing unit 703 sets, for the belt-like images moved by the movement processing unit 702, a camera 1301 indicating the viewpoint position, direction, and height for the three-dimensional calculation to the position, direction, and height of the imaging unit mounted on the moving body (for example, the own vehicle), or to the viewpoint position, direction, and height of the driver or passenger.
  • FIG. 13B shows a view from the side (viewpoint from the lower side toward the upper side of FIG. 13A), as in FIGS. 10B, 11B, and 12B. It can be seen that the camera 1301 is positioned lower than the links 1001 and 1002 in the Z-axis direction (for example, about 1.5 m above the ground) and almost directly below the link 1001. This indicates that the own vehicle is traveling on the road of the link 1001, and that the camera 1301 is mounted at a predetermined position of the own vehicle (for example, behind the rearview mirror), or that the viewpoint position of the driver or passenger riding in the own vehicle is assumed.
  • FIG. 14 is an explanatory diagram showing a state rendered by the rendering processing unit 704 of the generation unit.
  • FIG. 14 illustrates a state in which the links 1001 and 1002 viewed from the camera 1301 shown in FIGS. 13A and 13B are rendered. Since the camera 1301 and the link 1001 are at different heights, the link 1001 is seen from the camera 1301 by looking up from below, and when this state is rendered in the space for three-dimensional calculation it becomes a shape close to an inverted trapezoid, as shown in FIG. 14. The link 1002 likewise becomes a band-like image with a width as seen by the camera 1301 looking up from below. This completes the generation (drawing) of the band-like images (links) by the generation unit.
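  • The rendering of FIG. 14 amounts to a perspective projection of the raised band vertices onto the image plane of the camera 1301. A minimal pinhole-camera sketch (camera parameters are assumed, not specified in this publication) is:

      # Hypothetical pinhole projection for step S805: project a 3D band vertex
      # into image coordinates as seen from the camera 1301. Because the sky
      # band is about 20 m high and the camera about 1.5 m high, projected
      # points fall in the upper ("looking up") part of the image, which gives
      # the inverted-trapezoid appearance of FIG. 14.
      from typing import Optional, Tuple

      def project(point_xyz: Tuple[float, float, float],
                  cam_xyz: Tuple[float, float, float] = (0.0, 0.0, 1.5),
                  focal_px: float = 800.0,
                  image_size: Tuple[int, int] = (1280, 720)) -> Optional[Tuple[float, float]]:
          # The camera looks along +y; x is lateral and z is height.
          x = point_xyz[0] - cam_xyz[0]
          y = point_xyz[1] - cam_xyz[1]   # depth in front of the camera
          z = point_xyz[2] - cam_xyz[2]
          if y <= 0:
              return None                 # behind the camera, not drawn
          u = image_size[0] / 2 + focal_px * x / y
          v = image_size[1] / 2 - focal_px * z / y   # higher points appear nearer the top
          return (u, v)

      # Example: a band point 20 m high and 50 m ahead projects well above centre.
      print(project((0.0, 50.0, 20.0)))   # approximately (640.0, 64.0)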
  • FIG. 15 is an explanatory diagram showing a state in which an aerial route guidance display image is displayed over the front video.
  • the band images 1001 and 1002 generated in FIG. 14 provide route guidance on the front video. Since the route guidance constituted by the belt-like images 1001 and 1002 continuous along the route is displayed above the road surface, the operator (driver or passenger) can follow it even when it is superimposed on the front video, and even if there is an obstacle ahead in the front video, for example a car traveling in front, the obstacle is not hidden by the belt-like images and the front video is not misread. Furthermore, as shown in FIG. 15, since the strip images 1001 and 1002 are positioned higher than the imaging position (photographing viewpoint) of the camera 1301 and are displayed in the front video at a position where the sky is looked up, there is little possibility that any object in the video overlaps the strip images 1001 and 1002.
  • the road surface route guidance display image is generated by substantially the same procedure as described above for generating the sky route guidance display image.
  • the difference between the two is only the presence or absence of the process in which the movement processing unit 702 shown in FIG. 7 moves the band-shaped image arranged by the arrangement processing unit 701 by a predetermined amount (predetermined height) in the height direction orthogonal to the plane in the space for three-dimensional calculation (step S803 in FIG. 8). That is, in generating the sky route guidance display image, step S803 in FIG. 8 is executed, whereas in generating the road surface route guidance display image, step S803 is skipped and the process proceeds directly to step S804. The two processes are otherwise the same.
  • FIG. 16 is an explanatory diagram showing a state in which a road surface route guidance display image is displayed superimposed on the front video. As shown in FIG. 16, the strip images 1001 and 1002 are displayed so as to overlap the road surface in the front video. In FIG. 16, since the position of the own vehicle is recognized correctly, the turning position is drawn accurately. In such a case, as shown in FIG. 16, when there is no obstacle ahead, this is considered effective as route guidance.
  • the high accuracy of the position is regarded as high reliability of positioning, and either the road surface route guidance display image or the sky route guidance display image is generated according to the reliability determination result.
  • the case where the position accuracy is poor may be, for example, a case where positioning is performed only by GPS and information such as vehicle speed, G sensor, gyro sensor, and electronic compass is not obtained or cannot be obtained. For example, this is the case where the navigation device itself has only a GPS built-in.
  • FIG. 17 is a flowchart showing the procedure of the switching generation processing in the generation unit of the navigation device, and FIGS. 18 and 19 are flowcharts showing procedures of the determination processing as to whether or not the output value of each sensor is normal. The determination processing as to whether or not the output value of each sensor is normal is performed by the determination unit.
  • first, it is determined whether or not information can be acquired from predetermined positioning sensors (self-contained sensors) (step S1701). Specifically, the determination is made based on whether or not the vehicle speed sensor is connected and whether or not the gyro sensor and the G sensor are present in the apparatus and usable. If it is determined in step S1701 that information cannot be acquired from the predetermined positioning sensors (step S1701: No), it is determined that the accuracy of the current position is poor, and the sky route guidance display image is generated (step S1705).
  • in step S1701, when information can be acquired from the predetermined positioning sensors (step S1701: Yes), the output value of each sensor is checked (step S1702). Whether the output value of each sensor is normal or abnormal is determined according to the flowchart shown in FIG. 18 (determination process based on comparison between the gyro sensor and the G sensor) and the flowchart shown in FIG. 19 (determination process based on comparison between GPS and vehicle speed pulses). That is, if the determination result of FIG. 18 is normal or the determination result of FIG. 19 is normal, the output values are treated as normal.
  • if the output value of each sensor is normal (step S1703: Yes), it is determined that the accuracy of the current position is good, and the road surface route guidance display image is generated (step S1704). If the output value of each sensor is abnormal (step S1703: No), it is determined that the accuracy of the current position is poor, and the sky route guidance display image is generated (step S1705).
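  • In outline, the switching generation process of FIG. 17 can be summarised as follows (the two consistency checks of FIGS. 18 and 19 are sketched after their respective descriptions below; the boolean inputs here stand in for those checks):

      # Hypothetical summary of FIG. 17 (steps S1701 to S1705).

      def select_generated_image(sensors_available: bool,
                                 gyro_g_consistent: bool,
                                 gps_pulse_consistent: bool) -> str:
          if not sensors_available:                        # S1701: No
              return "sky route guidance display image"    # S1705 (accuracy poor)
          # S1702/S1703: output values are treated as normal if either the
          # FIG. 18 check or the FIG. 19 check reports consistency.
          output_normal = gyro_g_consistent or gps_pulse_consistent
          if output_normal:                                # S1703: Yes
              return "road surface route guidance display image"   # S1704
          return "sky route guidance display image"        # S1705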
  • in the determination process of FIG. 18, it is first determined whether or not the angular velocity of the gyro sensor is equal to or greater than a predetermined threshold value (step S1801). If the angular velocity is equal to or greater than the threshold value (step S1801: Yes), the rotational direction of the gyro sensor is stored (step S1802). Next, it is determined whether or not the G sensor outputs, by a certain value or more, acceleration in the direction opposite to the rotation direction of the gyro sensor (that is, in the left-right direction) (step S1803). If so (step S1803: Yes), the output values of the sensors are consistent, and the output values are determined to be normal (step S1804). If not (step S1803: No), the output values of the sensors are not consistent, and the output values are determined to be abnormal (step S1805).
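  • A minimal sketch of the FIG. 18 check follows; the threshold values and sign conventions are assumptions, not values given in this publication:

      # Hypothetical sketch of FIG. 18: compare the gyro sensor and G sensor.
      ANGULAR_VELOCITY_THRESHOLD = 0.2   # rad/s, placeholder (step S1801)
      LATERAL_G_THRESHOLD = 0.5          # m/s^2, placeholder (step S1803)

      def gyro_g_outputs_consistent(gyro_angular_velocity: float,
                                    lateral_acceleration: float) -> bool:
          # S1801: only check while a clear turn is detected; treat "no turn"
          # as "no contradiction" (an assumption, since the flowchart does not
          # describe this branch).
          if abs(gyro_angular_velocity) < ANGULAR_VELOCITY_THRESHOLD:
              return True
          # S1802/S1803: the G sensor should report a lateral acceleration of at
          # least a certain magnitude on the side opposite to the gyro rotation
          # direction (the sign convention used here is assumed).
          opposite_side = -1.0 if gyro_angular_velocity > 0.0 else 1.0
          return opposite_side * lateral_acceleration >= LATERAL_G_THRESHOLD  # S1804/S1805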
  • FIG. 19 shows the procedure of the determination process (determination process based on comparison between GPS and vehicle speed pulses) for determining whether or not the output value of each sensor is normal. In this case as well, whether or not the output value of each sensor is normal is determined based on whether or not the output values of the GPS and the vehicle speed pulses are consistent. First, the first position information obtained by GPS is stored (step S1901). Next, the number of vehicle speed pulses is counted (step S1902). Then, it is determined whether or not a predetermined period has elapsed (step S1903); if it has not (step S1903: No), the process returns to step S1902 and the counting of vehicle speed pulses continues. In this way, the number of vehicle speed pulses is counted until the predetermined period elapses. When the predetermined period has elapsed (step S1903: Yes), the second position information obtained by GPS is stored (step S1904).
  • the movement distance during a predetermined period is calculated by subtracting the first position information from the second position information (step S1905). Further, the movement distance during the predetermined period calculated in step S1905 is divided by the number of vehicle speed pulses counted in step S1902, and the movement amount per vehicle speed pulse is calculated (step S1906).
  • steps S1901 to S1906 are repeated a predetermined number of times. Then, it is determined whether or not a series of processing has reached a predetermined number of trials X (step S1907). If the predetermined number of trials X has not been reached (step S1907: NO), the process returns to step S1901, and the processing of steps S1901 to S1906 is repeated. If the predetermined number of trials X has been reached in step S1907 (step S1907: Yes), then whether the variation in the movement amount per vehicle speed pulse of the predetermined number of trials X is within a predetermined range. It is determined whether or not (step S1908).
  • step S1908 when the variation is within a predetermined range (step S1908: Yes), it is determined that the output values of the sensors are consistent and the output values are normal (step S1909). On the other hand, if the variation is outside the predetermined range (step S1908: No), it is determined that the output values of the sensors are not matched and the output values are abnormal (step S1910).
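The FIG. 19 comparison can be sketched as follows. The trial count, the use of a relative spread as the "variation within a predetermined range" criterion, and the great-circle distance between GPS fixes are illustrative assumptions.

```python
import math
from typing import List, Tuple

# Sketch of the FIG. 19 comparison. The trial count, the relative-spread
# criterion used for "variation within a predetermined range" and the
# great-circle distance between GPS fixes are illustrative assumptions.

NUM_TRIALS_X = 5
MAX_RELATIVE_SPREAD = 0.1   # assumed: up to 10 % spread counts as "within range"

def gps_distance_m(p1: Tuple[float, float], p2: Tuple[float, float]) -> float:
    """Great-circle distance in metres between two (lat, lon) GPS fixes."""
    r = 6371000.0
    lat1, lon1, lat2, lon2 = map(math.radians, (*p1, *p2))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def gps_pulse_consistent(trials: List[Tuple[Tuple[float, float],
                                            Tuple[float, float], int]]) -> bool:
    """Each trial is (first GPS fix, second GPS fix, pulse count) over one period."""
    per_pulse = []
    for first_fix, second_fix, pulses in trials[:NUM_TRIALS_X]:   # steps S1901-S1906
        if pulses == 0:
            return False
        per_pulse.append(gps_distance_m(first_fix, second_fix) / pulses)
    if not per_pulse or max(per_pulse) == 0:
        return False
    spread = (max(per_pulse) - min(per_pulse)) / max(per_pulse)   # step S1908
    return spread <= MAX_RELATIVE_SPREAD      # step S1909 (normal) / S1910 (abnormal)
```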
In this way, when the output value of each sensor is normal, that is, when the accuracy of the current position is good, the generation unit generates the road surface route guidance display image (switching from the sky route guidance display image if that is currently displayed); when the output values are abnormal and the accuracy of the current position is poor, it generates the sky route guidance display image (switching from the road surface route guidance display image).
FIG. 20 is an explanatory diagram illustrating a state in which a road surface route guidance display image is superimposed on the front video, and FIG. 21 is an explanatory diagram illustrating a state in which a sky route guidance display image is superimposed on the front video.
FIG. 20 shows a case where, because the position of the own vehicle is not recognized correctly, the position is recognized about 10 m short of the actual position, and the turning position is therefore drawn farther ahead than it really is.

When the position accuracy of the own vehicle is poor in this way and the belt-like images 1001 and 1002 are displayed so as to overlap the road surface, the gap between the actual road to be turned into and the belt-like image 1002 after the right turn appears large, making it difficult to understand where to turn.
FIG. 21 shows a case in which, as in FIG. 20, the position of the own vehicle is recognized about 10 m short of the actual position and the turning position is drawn farther ahead because the position of the own vehicle is not recognized correctly.

In FIG. 21, however, the belt-like images 1001 and 1002 are drawn in the sky, so the operator (driver or passenger) associates the belt-like image 1002 after the right turn with the actual right-turn road displayed below it and can estimate the turning point (road). Therefore, in either FIG. 15 or FIG. 21, the operator (driver or passenger) can easily recognize at which point (road) the turn should be made.
In addition, when the accuracy of the map data is poor (for example, depending on the detail of the map used when the map data was created), the display may be switched to the sky route guidance display image. That is, instead of, or in addition to, the reliability of positioning of the vehicle position, the height of the route display image from the road surface may be determined according to the reliability of the map data.

Conversely, when the accuracy of the map data is good, the display may be switched to the road surface route guidance display image.

Similarly, when the road has undulations, the road surface route guidance display image may not match the shape of the undulations, so the display may be switched to the sky route guidance display image; when the road is flat, it may be switched to the road surface route guidance display image.
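These additional criteria could be folded into the same decision as the positioning reliability, for example as below. The function and its default arguments are purely illustrative, since the patent only says that such factors may also be used.

```python
# Illustrative only: folding map-data reliability and road shape into the same
# decision as positioning reliability. The parameter names and defaults are
# hypothetical; the patent merely says that these factors may also be used.

def select_display_mode(position_reliable: bool,
                        map_data_reliable: bool = True,
                        road_is_flat: bool = True) -> str:
    if position_reliable and map_data_reliable and road_is_flat:
        return "road-surface"   # drawn directly on the road surface
    return "sky"                # lifted above the viewpoint

print(select_display_mode(True, map_data_reliable=False))  # sky
```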
Note that the position accuracy determination may be suspended while the vehicle is close to a guidance point so that the display does not switch frequently. Further, once it has been determined that the position accuracy is poor, the sky route guidance display image may continue to be displayed unless there is an operation instruction or the like, or until the apparatus is turned off. By maintaining the display of either the road surface route guidance display image or the sky route guidance display image until such a predetermined condition is satisfied, frequent switching can be prevented.
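One way to realize this switching suppression is a small state holder that freezes the decision near guidance points and keeps the sky display once accuracy has been judged poor, as sketched below; the distance threshold and mode names are assumptions.

```python
from typing import Optional

# Sketch of the switching suppression described above: the decision is frozen
# near a guidance point, and once the accuracy has been judged poor the sky
# display is kept until an operation instruction (or power-off, i.e. a fresh
# object) releases it. The distance threshold and mode names are assumptions.

class DisplayModeHolder:
    NEAR_GUIDE_POINT_M = 100.0           # assumed freeze distance

    def __init__(self) -> None:
        self.mode = "road-surface"
        self.locked_to_sky = False       # set once accuracy is judged poor

    def update(self, position_reliable: bool,
               dist_to_guide_point_m: float,
               user_override: Optional[str] = None) -> str:
        if user_override is not None:                      # operation instruction wins
            self.mode = user_override
            self.locked_to_sky = (user_override == "sky")
        elif dist_to_guide_point_m < self.NEAR_GUIDE_POINT_M:
            pass                                           # no re-evaluation near the guide point
        elif self.locked_to_sky:
            pass                                           # keep the sky display until released
        elif position_reliable:
            self.mode = "road-surface"
        else:
            self.mode = "sky"
            self.locked_to_sky = True
        return self.mode
```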
The form of the route display (color, transparency, and the like) may also be varied according to the accuracy of the position.
FIG. 22 is an explanatory diagram showing a state in which the sky route guidance display image is projected onto the front window.

In FIG. 22, a sky route guidance display image (belt-like images 1001 and 1002) is displayed on the front window 2201. Switching between the road surface route guidance display image and the sky route guidance display image according to the accuracy of the position of the own vehicle is effective even when projecting onto the front window 2201.

The target onto which a route display image such as the road surface route guidance display image or the sky route guidance display image is projected is not limited to the front window. It may be a window on the side or rear of the vehicle, or a transparent member may be arranged inside the vehicle and the route display image projected onto that member. That is, as long as the route display image is projected onto a transparent member positioned between the user's viewpoint and the road surface corresponding to the route, the user can recognize the route simply by looking outside the vehicle.
In this way, the current position of the moving body is measured to acquire current position information, route information regarding the route to the destination is acquired, a video obtained by photographing the road surface of the road corresponding to the route is acquired, and a route display image generated based on the current position information and the route information is displayed superimposed at a predetermined height from the position of the road surface in the video.

The reliability of positioning of the current position is judged; when the positioning of the current position is highly reliable, a road surface route guidance display image whose predetermined height is the position directly above the road surface is generated, and when the positioning of the current position has low reliability, a sky route guidance display image whose predetermined height is a position higher than the viewpoint from which the video is shot is generated.
Alternatively, the current position of the moving body is measured to acquire current position information, route information regarding the route to the destination is acquired, and a route display image generated based on the current position information and the route information is projected onto a transparent member located between the viewpoint of the person moving with the moving body and the road surface of the road corresponding to the route, at a predetermined height from the position of the road surface on the transparent member as seen by that person. At that time, a route display image whose predetermined height is changed according to the reliability of positioning of the current position can be projected.

In this case as well, the reliability of positioning of the current position is judged; when the positioning of the current position is highly reliable, a road surface route guidance display image whose predetermined height is the position directly above the road surface is generated, and when the positioning has low reliability, a sky route guidance display image whose predetermined height is a position higher than the viewpoint of the moving person is generated.
When the display has been switched from one route display image to the other, the display of the switched route display image is maintained until a predetermined condition is satisfied, so that route guidance can continue to be displayed without frequent switching.
The navigation method described in the present embodiment can be realized by executing a program prepared in advance on a computer such as a personal computer or a workstation. This program is recorded on a computer-readable recording medium such as a hard disk, a flexible disk, a CD-ROM, an MO, or a DVD, and is executed by being read from the recording medium by the computer. The program may also be a transmission medium that can be distributed via a network such as the Internet.

Abstract

Disclosed is a navigation system (100) which includes a current location information acquiring section (101) for positioning the current location of a mobile entity to acquire current location information and a route information acquiring section (102) for acquiring route information on a route to a destination. On the basis of the current location information and the route information, a display section (106) overlays a created road route guide display image or an aerial route guide display image on a video image on the route of the vehicle acquired by a video image acquiring section (105) or projects same onto the front windshield.

Description

Navigation device, navigation method, navigation program, and recording medium
The present invention relates to a navigation device, a navigation method, a navigation program, and a recording medium that display a route to a destination. However, the use of the present invention is not limited to the navigation device, navigation method, navigation program, and recording medium described above.

Conventionally, there are techniques for providing a guidance display using live-action video in a navigation device or the like (see, for example, Patent Document 1 below).

Japanese Unexamined Patent Application Publication No. 2008-20288 (JP 2008-20288 A)
However, in the above-described prior art (Patent Document 1), the guidance display is continuously superimposed on the road surface of the road. When the positioning accuracy of the own position is good, drawing the guidance display on the road surface makes it easy to understand where to turn (see FIG. 16); however, when the positioning accuracy of the own position is poor, the road surface and the guidance display become misaligned, which makes the guidance rather harder to understand (see FIG. 20).

In order to solve the above problems and achieve the object, a navigation device according to the invention of claim 1 includes: current position information acquisition means for measuring the current position of a moving body and acquiring current position information; route information acquisition means for acquiring route information relating to a route to a destination; video acquisition means for acquiring a video obtained by photographing the road surface of a road corresponding to the route; and display means for displaying a route display image, generated based on the current position information and the route information, superimposed at a predetermined height from the position of the road surface in the video, wherein the display means displays the route display image with the predetermined height changed according to the reliability of positioning of the current position.
A navigation device according to the invention of claim 3 includes: current position information acquisition means for measuring the current position of a moving body and acquiring current position information; route information acquisition means for acquiring route information relating to a route to a destination; and projection means for projecting a route display image, generated based on the current position information and the route information, onto a transparent member located between the viewpoint of a person moving with the moving body and the road surface of the road corresponding to the route, at a predetermined height from the position of the road surface on the transparent member as seen by that person, wherein the projection means projects the route display image with the predetermined height changed according to the reliability of positioning of the current position.

A navigation method according to the invention of claim 9 is a navigation method in a navigation device and includes: a current position information acquisition step of measuring the current position of a moving body and acquiring current position information; a route information acquisition step of acquiring route information relating to a route to a destination; a video acquisition step of acquiring a video obtained by photographing the road surface of a road corresponding to the route; and a display step of displaying a route display image, generated based on the current position information and the route information, superimposed at a predetermined height from the position of the road surface in the video, wherein in the display step the route display image is displayed with the predetermined height changed according to the positioning accuracy of the current position.

A navigation method according to the invention of claim 10 is a navigation method in a navigation device and includes: a current position information acquisition step of measuring the current position of a moving body and acquiring current position information; a route information acquisition step of acquiring route information relating to a route to a destination; and a projection step of projecting a route display image, generated based on the current position information and the route information, onto a transparent member located between the viewpoint of a person moving with the moving body and the road surface of the road corresponding to the route, at a predetermined height from the position of the road surface on the transparent member as seen by that person, wherein in the projection step the route display image is projected with the predetermined height changed according to the positioning accuracy of the current position.

A navigation program according to the invention of claim 11 causes a computer to execute the navigation method according to claim 9 or 10.

A recording medium according to the invention of claim 12 is a computer-readable recording medium on which the navigation program according to claim 11 is recorded.
FIG. 1 is a block diagram showing a functional configuration of the navigation device according to the first embodiment.
FIG. 2 is a flowchart showing a procedure of navigation processing by the navigation device according to the first embodiment.
FIG. 3 is a block diagram showing a functional configuration of the navigation device according to the second embodiment.
FIG. 4 is a flowchart showing a procedure of navigation processing by the navigation device according to the second embodiment.
FIG. 5 is a block diagram showing a functional configuration of the information display device according to the third embodiment.
FIG. 6 is a block diagram showing a hardware configuration of the navigation device.
FIG. 7 is a block diagram showing a functional configuration of the generation unit of the navigation device.
FIG. 8 is a flowchart showing a processing procedure of the generation unit of the navigation device.
FIG. 9 is an explanatory diagram showing map data (node information and link information) of a crossroads intersection.
FIG. 10-1 is an explanatory diagram showing the contents of the placement processing in the placement processing unit of the generation unit.
FIG. 10-2 is an explanatory diagram showing the contents of the placement processing in the placement processing unit of the generation unit.
FIG. 11-1 is an explanatory diagram showing the contents of the placement processing in the placement processing unit of the generation unit.
FIG. 11-2 is an explanatory diagram showing the contents of the placement processing in the placement processing unit of the generation unit.
FIG. 12-1 is an explanatory diagram showing the contents of the movement processing in the movement processing unit of the generation unit.
FIG. 12-2 is an explanatory diagram showing the contents of the movement processing in the movement processing unit of the generation unit.
FIG. 13-1 is an explanatory diagram showing the contents of the setting processing in the setting processing unit of the generation unit.
FIG. 13-2 is an explanatory diagram showing the contents of the setting processing in the setting processing unit of the generation unit.
FIG. 14 is an explanatory diagram showing a state rendered by the rendering processing unit of the generation unit.
FIG. 15 is an explanatory diagram showing a state in which a sky route guidance display image is displayed superimposed on the front video.
FIG. 16 is an explanatory diagram showing a state in which a road surface route guidance display image is displayed superimposed on the front video.
FIG. 17 is a flowchart showing a procedure of switching generation processing in the generation unit of the navigation device.
FIG. 18 is a flowchart showing a procedure (part 1) of the process for determining whether the output value of each sensor is normal.
FIG. 19 is a flowchart showing a procedure (part 2) of the process for determining whether the output value of each sensor is normal.
FIG. 20 is an explanatory diagram showing a state in which a road surface route guidance display image is displayed superimposed on the front video.
FIG. 21 is an explanatory diagram showing a state in which a sky route guidance display image is displayed superimposed on the front video.
FIG. 22 is an explanatory diagram showing a state in which a sky route guidance display image is projected onto the front window.
Hereinafter, preferred embodiments of a navigation device, a navigation method, a navigation program, and a recording medium according to the present invention will be described in detail with reference to the accompanying drawings.
(Embodiment 1)
A functional configuration of the navigation device 100 according to the first embodiment will be described. FIG. 1 is a block diagram showing the functional configuration of the navigation device according to the first embodiment. In FIG. 1, the navigation device 100 according to the first embodiment includes a current position information acquisition unit 101, a route information acquisition unit 102, a storage unit 103, a generation unit 104, a video acquisition unit 105, a display unit 106, an input unit 107, a determination unit 108, and a sensor 109.
The current position information acquisition unit 101 measures the current position of the moving body and acquires current position information. In addition to latitude and longitude information, the information on the current position of the moving body may include information such as its speed and travel direction. The information on the current position of the moving body can be acquired, for example, by the sensor 109 described later.

The route information acquisition unit 102 acquires route information regarding the route to the destination. The route information may be registered in advance, may be acquired by being received via communication means, or may be acquired through input by the operator (driver or passenger). In that case, only information on the destination may be input by the operator, a route may be searched for based on the current position of the moving body (for example, the own vehicle) and the input destination, and the determined route may be acquired as the route information. The route information may include link information of each road corresponding to the route to the destination and node information indicating intersections.

The storage unit 103 stores so-called map information, specifically, for example, link information including road shapes and node information indicating intersections. The map information may further include traffic signals, sign information, building information (for example, building shapes and heights), and the like. The link information may include information on road width, inclination, and the like. The storage unit 103 may be provided in the navigation device 100, or an external server accessible via communication may realize the function of the storage unit 103.

The generation unit 104 generates a route display image based on the information on the current position acquired by the current position information acquisition unit 101 and the route information acquired by the route information acquisition unit 102. Specifically, for example, it generates, switching between them at predetermined timing, a road surface route guidance display image, which is a belt-like route display image continuous along the route and displayed so as to overlap the road surface, and a sky route guidance display image, which is a belt-like route display image continuous along the route and displayed at a predetermined height above the road surface. At that time, when the reliability of positioning of the current position is high, a road surface route guidance display image whose predetermined height is the position directly above the road surface is generated; when the reliability is low, a sky route guidance display image whose predetermined height is a position higher than the viewpoint from which the video is shot is generated. Specific methods of generating the road surface route guidance display image and the sky route guidance display image, and the timing of switching, will be described later.
The generation unit 104 may generate the image by switching from the road surface route guidance display image to the sky route guidance display image when a predetermined condition is met. Likewise, the generation unit 104 generates the image by switching from the sky route guidance display image to the road surface route guidance display image when a predetermined condition is met. The specific contents of the predetermined conditions will be described later.

The video acquisition unit 105 acquires a video obtained by photographing the road surface of the road corresponding to the route (for example, from the imaging unit 110). The imaging unit 110 may be included in the navigation device 100. The display unit 106 displays the route display image generated based on the current position information and the route information (for example, by the generation unit 104) superimposed at a predetermined height from the position of the road surface in the video (for example, acquired by the video acquisition unit 105). The display unit 106 displays the route display image with the predetermined height changed according to the reliability of positioning of the current position. Further, when the display unit 106 has switched from one of the road surface route guidance display image and the sky route guidance display image to the other, it may maintain the display of the route display image after switching until a predetermined condition is satisfied.
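As a purely geometric illustration of superimposing the route display "at a predetermined height from the position of the road surface" in the video, the sketch below lifts a route point by the chosen height and projects it with an assumed pinhole camera model; the intrinsics and coordinate conventions are not taken from the patent.

```python
import numpy as np
from typing import Optional, Tuple

# Geometric illustration only: a route point on the road is lifted by the
# chosen height and projected into the camera image with a pinhole model.
# The intrinsics and the coordinate convention (x right, y up, z forward,
# in metres, camera at the origin) are assumptions, not taken from the patent.

FX = FY = 800.0           # focal length in pixels (assumed)
CX, CY = 640.0, 360.0     # principal point of a 1280x720 image (assumed)

def project_route_point(point_on_road: np.ndarray,
                        height_m: float) -> Optional[Tuple[float, float]]:
    lifted = point_on_road + np.array([0.0, height_m, 0.0])  # raise above the road
    x, y, z = lifted
    if z <= 0:
        return None                   # behind the camera
    u = CX + FX * x / z
    v = CY - FY * y / z               # image v grows downwards
    return (u, v)

p = np.array([2.0, -1.2, 20.0])       # a point 20 m ahead; the road is about 1.2 m below the camera
print(project_route_point(p, 0.0))    # road surface ribbon: lands low in the image
print(project_route_point(p, 4.0))    # sky ribbon: lands above the image centre
```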
The input unit 107 accepts input of an instruction to switch to the other image while either the road surface route guidance display image or the sky route guidance display image is being displayed. When the input unit 107 accepts such an input, the display unit 106 may switch to the other image regardless of the positioning reliability determined by the determination unit 108 described later.

The determination unit 108 determines the reliability of positioning of the current position (for example, acquired by the current position information acquisition unit 101) and passes the determination result to the generation unit 104. Details of the specific determination processing will be described later. The generation unit 104 may generate a road surface route guidance display image whose predetermined height is the position directly above the road surface when the reliability of positioning of the current position is high, and a sky route guidance display image whose predetermined height is a position higher than the viewpoint from which the video is shot when the reliability is low.

The sensor 109 may be composed of a plurality of self-contained navigation (dead-reckoning) sensors that measure the current position and heading, in which case the determination unit 108 can determine the reliability based on whether the positioning results of the individual self-contained navigation sensors are consistent with each other. Alternatively, the sensor 109 may be composed of a self-contained navigation sensor that measures the current position and heading and a GPS sensor, in which case the determination unit 108 can determine the reliability based on whether the positioning results of the self-contained navigation sensor and the GPS sensor are consistent with each other.
Next, a procedure of navigation processing by the navigation device 100 will be described. FIG. 2 is a flowchart showing the procedure of navigation processing by the navigation device. In the flowchart of FIG. 2, the navigation device 100 first measures the current position of the moving body with the current position information acquisition unit 101 and acquires current position information (step S201). At that time, information on the reliability of positioning of the current position may also be acquired. The route information acquisition unit 102 then acquires route information regarding the route to the destination (step S202). At that time, the navigation device 100 may also acquire road information.

Next, based on the current position information and the route information, the navigation device 100 generates a route display image (a road surface route guidance display image or a sky route guidance display image) according to the reliability of positioning of the current position (step S204).

Further, the navigation device 100 acquires a video obtained by photographing the road surface of the road corresponding to the route (step S205), and displays the route display image generated based on the current position information and the route information superimposed at a predetermined height from the position of the road surface in the video (step S206).

Next, the navigation device 100 waits for the current position to change as the moving body (for example, the own vehicle) travels (step S207: No). When the current position has changed (step S207: Yes), it determines whether the destination has been reached (step S208). If the destination has not yet been reached (step S208: No), the process returns to step S201 and the processing of steps S201 to S207 is repeated. If the destination has been reached (step S208: Yes), the series of processing ends.
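The control flow of FIG. 2 can be summarized in a short, simulated sketch; the stub positioning object and the image-generation helper are hypothetical stand-ins for the functional blocks described above.

```python
# A minimal, simulated sketch of the control flow of FIG. 2. The stub class and
# the image-generation helper are hypothetical stand-ins for the functional
# blocks 101 to 110; only the loop structure (S201 to S208) follows the text.

class StubPositioning:
    def __init__(self, path):
        self.path = list(path)
    def current_fix(self):
        return self.path[0]
    def advance(self):                 # simulates the vehicle moving to the next fix
        if len(self.path) > 1:
            self.path.pop(0)

def make_route_display_image(fix, destination, reliable=True):
    height = 0.0 if reliable else 4.0  # road-surface vs. sky height in metres (assumed)
    return f"ribbon from {fix} toward {destination} at {height} m"

def navigation_loop(positioning, destination):
    while True:
        fix = positioning.current_fix()                      # S201: current position
        image = make_route_display_image(fix, destination)   # S202-S204
        print("display:", image)                             # S205-S206 (video omitted)
        positioning.advance()                                # S207: wait for the position to change
        if positioning.current_fix() == destination:         # S208: arrived?
            print("arrived at", destination)
            break

navigation_loop(StubPositioning([(0, 0), (0, 1), (0, 2)]), (0, 2))
```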
In this way, until the moving body (for example, the own vehicle) arrives at the destination, a route display image corresponding to the reliability of positioning of the current position (either the road surface route guidance display image or the sky route guidance display image) continues to be displayed superimposed on the video. By looking at the route display image shown together with the video on the display screen while travelling, the operator can recognize the route to the destination without mistake.

The navigation device 100 may be configured as a single device in which the components are integrated, or each component may be a separate device and the functions of the navigation device 100 may be realized by a plurality of devices.
(Embodiment 2)
Next, a functional configuration of the navigation device 300 according to the second embodiment will be described. FIG. 3 is a block diagram showing the functional configuration of the navigation device according to the second embodiment. In FIG. 3, the navigation device 300 according to the second embodiment includes a current position information acquisition unit 301, a route information acquisition unit 302, a storage unit 303, a generation unit 304, a projection unit 305, an input unit 306, a determination unit 307, and a sensor 308.
Here, the current position information acquisition unit 301, the route information acquisition unit 302, the storage unit 303, the input unit 306, the determination unit 307, and the sensor 308 have the same configurations as the current position information acquisition unit 101, the route information acquisition unit 102, the storage unit 103, the input unit 107, the determination unit 108, and the sensor 109 of the first embodiment, and their description is therefore omitted.

The generation unit 304 generates a route display image based on the information on the current position acquired by the current position information acquisition unit 301, the route information acquired by the route information acquisition unit 302, and either the link information alone or the node information and the link information stored in the storage unit 303. Specifically, for example, it generates a belt-like route display image continuous along the route that is displayed at a predetermined height from the road surface. At that time, when the reliability of positioning of the current position is high, a road surface route guidance display image whose predetermined height is the position directly above the road surface is generated, and when the reliability is low, a sky route guidance display image whose predetermined height is a position higher than the viewpoint of the moving person is generated. The generation unit 304 differs from the generation unit 104 of the first embodiment in the specific methods of generating the road surface route guidance display image and the sky route guidance display image; these methods and the timing of switching will be described later.

The difference between the navigation device 300 according to the second embodiment and the navigation device 100 according to the first embodiment is that, whereas the navigation device 100 includes the video acquisition unit 105 and the display unit 106, the navigation device 300 does not include them and instead includes the projection unit 305.

The projection unit 305 projects the route display image generated based on the current position information and the route information onto a transparent member located between the viewpoint of the person moving with the moving body and the road surface of the road corresponding to the route, at a predetermined height from the position of the road surface on the transparent member as seen by that person. For example, it projects the road surface route guidance display image or the sky route guidance display image generated by the generation unit 304 onto a window of the moving body (for example, the own vehicle), that is, a window through which the road corresponding to the route can be seen from inside the vehicle (usually the front window). As a result, the road surface route guidance display image or the sky route guidance display image appears superimposed on the actual scenery seen through the window. The projection unit 305 projects the route display image with the predetermined height changed according to the reliability of positioning of the current position.
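As a geometric illustration, not specified in the patent, of where such a projection would have to draw the route display, the drawing position on the windshield can be taken as the intersection of the line of sight from the driver's eye to the (possibly lifted) route point with the windshield plane; the eye position, the plane, and the lift heights below are assumed values.

```python
import numpy as np
from typing import Optional

# Illustrative geometry, not specified in the patent: the drawing position on
# the windshield is where the line of sight from the driver's eye to the
# (possibly lifted) route point crosses the windshield plane. The eye position,
# the plane and the lift heights are assumed values.

EYE = np.array([0.0, 1.2, 0.0])                 # eye in vehicle coords (x right, y up, z forward)
WINDSHIELD_POINT = np.array([0.0, 1.2, 0.8])    # a point on the windshield plane (assumed)
WINDSHIELD_NORMAL = np.array([0.0, 0.3, -1.0])  # slanted glass (assumed)

def windshield_position(route_point: np.ndarray, lift_m: float) -> Optional[np.ndarray]:
    """Intersect the eye-to-route-point ray with the windshield plane."""
    target = route_point + np.array([0.0, lift_m, 0.0])
    direction = target - EYE
    denom = float(np.dot(WINDSHIELD_NORMAL, direction))
    if abs(denom) < 1e-9:
        return None                               # line of sight parallel to the glass
    t = float(np.dot(WINDSHIELD_NORMAL, WINDSHIELD_POINT - EYE)) / denom
    if t <= 0:
        return None                               # intersection behind the eye
    return EYE + t * direction

road_point = np.array([1.5, 0.0, 15.0])           # a route point on the road, 15 m ahead
print(windshield_position(road_point, 0.0))       # road surface display position
print(windshield_position(road_point, 3.0))       # sky display position (lifted 3 m)
```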
Next, a procedure of navigation processing by the navigation device 300 will be described. FIG. 4 is a flowchart showing the procedure of navigation processing by the navigation device. In the flowchart of FIG. 4, the navigation device 300 first measures the current position of the moving body with the current position information acquisition unit 301 and acquires current position information (step S401). At that time, the reliability of the current position may also be determined. The route information acquisition unit 302 then acquires route information regarding the route to the destination (step S402). At that time, the navigation device 300 may also acquire the road information (node information and link information) stored in the storage unit 303 (step S403).

Then, based on the current position information and the route information, the navigation device 300 generates a route display image (a road surface route guidance display image or a sky route guidance display image) according to the reliability of positioning of the current position (step S404), and projects the generated route display image at a predetermined height from the position of the road surface on the transparent member as seen by the moving person. Specifically, for example, the generated road surface route guidance display image or sky route guidance display image is projected onto the front window of the moving body (for example, the own vehicle) (step S405).

Next, the navigation device 300 waits for the current position to change as the moving body (for example, the own vehicle) travels (step S406: No). When the current position has changed (step S406: Yes), it determines whether the destination has been reached (step S407). If the destination has not yet been reached (step S407: No), the process returns to step S401 and the processing of steps S401 to S406 is repeated. If the destination has been reached (step S407: Yes), the series of processing ends.
In this way, until the moving body (for example, the own vehicle) arrives at the destination, the road surface route guidance display image or the sky route guidance display image continues to be projected onto the front window of the moving body. While travelling, the operator can recognize the route to the destination by looking at the actual scenery ahead through the front window together with the projected road surface route guidance display image or sky route guidance display image.

The navigation device 300 may be configured as a single device in which the components are integrated, or each component may be a separate device and the functions of the navigation device 300 may be realized by a plurality of devices.
(Embodiment 3)
A functional configuration of the information display device 500 according to the third embodiment will be described. Whereas the first and second embodiments concern a navigation device mounted on a moving body such as an automobile, the third embodiment is not limited to a device mounted on a moving body.
FIG. 5 is a block diagram showing a functional configuration of the information display device according to the third embodiment. In FIG. 5, the information display device 500 according to the third embodiment includes a current position information acquisition unit 501, a route information acquisition unit 502, a storage unit 503, a generation unit 504, a video acquisition unit 505, a display unit 506, an input unit 507, a determination unit 508, a sensor 509, and an imaging unit 510.

The current position information acquisition unit 501 measures the current position of the device itself and acquires current position information. In addition to latitude and longitude information, the information on the current position of the device may include the direction in which the device (in particular, its display screen) is facing, the height of the device above the ground, the moving speed of the device (that is, of the operator carrying it), and so on. The information on the current position of the device can be acquired, for example, by the sensor 509 described later.

The route information acquisition unit 502 acquires route information regarding the route to the destination. The route information may be registered in advance, may be acquired by being received via communication means, or may be acquired through input by the operator of the device. In that case, only information on the destination may be input by the operator, a route may be searched for based on the current position of the device and the input destination, and the determined route may be acquired as the route information.

The storage unit 503 stores so-called map information, specifically, for example, link information including road shapes and node information indicating intersections. The map information may further include traffic signals, sign information, building information (for example, building shapes and heights), and the like. The link information may include information on road width, inclination, and the like. The storage unit 503 may be provided in the information display device 500, or an external server accessible via communication may realize the function of the storage unit 503.
The generation unit 504 generates a route display image based on the information on the current position acquired by the current position information acquisition unit 501 and the route information acquired by the route information acquisition unit 502. Specifically, for example, it generates, switching between them at predetermined timing, a road surface route guidance display image, which is a belt-like route display image continuous along the route and displayed so as to overlap the road surface, and a sky route guidance display image, which is a belt-like route display image continuous along the route and displayed at a predetermined height above the road surface. At that time, when the reliability of positioning of the current position is high, a road surface route guidance display image whose predetermined height is the position directly above the road surface is generated, and when the reliability is low, a sky route guidance display image whose predetermined height is a position higher than the viewpoint from which the video is shot is generated. The specific generation methods and the timing of switching will be described later.

The video acquisition unit 505 acquires a video from the imaging unit 510, which shoots the scene along the route of the device. The imaging unit 510 is described here as part of the information display device 500, but it suffices that the imaging unit is connected to the information display device 500 by wire or wirelessly so that the information display device 500 can acquire the captured video. The display unit 506 displays the route display image generated based on the current position information and the route information (for example, by the generation unit 504) superimposed at a predetermined height from the position of the road surface in the video (for example, acquired by the video acquisition unit 505). The display unit 506 displays the route display image with the predetermined height changed according to the reliability of positioning of the current position. Further, when the display unit 506 has switched from one of the road surface route guidance display image and the sky route guidance display image to the other, it may maintain the display of the route display image after switching until a predetermined condition is satisfied.

The input unit 507 accepts input of an instruction to switch to the other image while either the road surface route guidance display image or the sky route guidance display image is being displayed. When the input unit 507 accepts such an input, the display unit 506 may switch to the other image regardless of the positioning reliability determined by the determination unit 508 described later.

The determination unit 508 determines the reliability of positioning of the current position (for example, acquired by the current position information acquisition unit 501) and passes the determination result to the generation unit 504. Details of the specific determination processing will be described later. The generation unit 504 may generate a road surface route guidance display image whose predetermined height is the position directly above the road surface when the reliability of positioning of the current position is high, and a sky route guidance display image whose predetermined height is a position higher than the viewpoint from which the video is shot when the reliability is low.
The sensor 509 may be composed of a plurality of self-contained navigation sensors that measure the current position and heading, in which case the determination unit 508 can determine the reliability based on whether the positioning results of the individual self-contained navigation sensors are consistent with each other. Alternatively, the sensor 509 may be composed of a self-contained navigation sensor that measures the current position and heading and a GPS sensor, in which case the determination unit 508 can determine the reliability based on whether the positioning results of the self-contained navigation sensor and the GPS sensor are consistent with each other.

The information display device 500 may be configured as a single device in which the components are integrated, or each component may be a separate device and the functions of the information display device 500 may be realized by a plurality of devices.

The procedure of information display processing by the information display device 500 is the same as the procedure of navigation processing by the navigation device according to the first embodiment shown in FIG. 2, and its description is therefore omitted. The information display device 500 may be, for example, a portable information terminal (more specifically, for example, a mobile phone, a PDA, or a mobile personal computer). The information display device 500 can also serve as the navigation device 100 according to the first embodiment by being mounted on a moving body.
Next, examples of the navigation device 100 according to the first embodiment, the navigation device 300 according to the second embodiment, and the information display device 500 according to the third embodiment described above will be given. In the following examples, the navigation device 100 according to the first embodiment, the navigation device 300 according to the second embodiment, and the information display device 500 according to the third embodiment are applied to a navigation device mounted on a vehicle.
(Hardware configuration of the navigation device)
FIG. 6 is a block diagram showing the hardware configuration of the navigation device 100, the navigation device 300, and the information display device 500 (hereinafter simply referred to as the "navigation device").
In FIG. 6, the navigation device includes a CPU 601, a ROM 602, a RAM (memory) 603, a magnetic disk drive 604, a magnetic disk 605, an optical disc drive 606, an optical disc 607, an audio I/F (interface) 608, a microphone 609, a speaker 610, an input device 611, a video I/F 612, a camera 613, a display 614, a projector 615, a communication I/F 616, a GPS unit 617, and various sensors 618. The components 601 to 618 are connected to one another by a bus 620.

The CPU 601 governs overall control of the navigation device. The ROM 602 stores various programs such as a boot program, a communication program, a data display program, and a data analysis program. The RAM 603 is used as a work area of the CPU 601.
The magnetic disk drive 604 controls reading and writing of data from and to the magnetic disk 605 under the control of the CPU 601. The magnetic disk 605 records data written under the control of the magnetic disk drive 604. As the magnetic disk 605, for example, an HD (hard disk) or an FD (flexible disk) can be used.

The optical disc drive 606 controls reading and writing of data from and to the optical disc 607 under the control of the CPU 601. The optical disc 607 is a removable recording medium from which data is read under the control of the optical disc drive 606, and includes, for example, a Blu-ray Disc, a DVD, and a CD. A writable recording medium can also be used as the optical disc 607. Besides the optical disc 607, the removable recording medium may be an MO, a memory card, or the like.
An example of the information recorded on the magnetic disk 605 or the optical disc 607 is the map data used for route search and route guidance. The map data includes background data representing features such as buildings, rivers, and the ground surface, and road shape data representing the shapes of roads, and is drawn in two or three dimensions on the display screen of the display 614. When the navigation device is performing route guidance, the map data and the current position of the moving body (for example, the own vehicle) acquired by the GPS unit 617 described later are displayed superimposed on each other.
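For illustration, the map data described above might be organized along the following lines; the field names are assumptions, as the patent only states that background data and road shape data (node information and link information) are included.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

# Illustrative layout for the map data described above. The field names are
# assumptions; the patent only states that background data and road shape data
# (node information and link information) are included.

@dataclass
class Node:                   # an intersection
    node_id: int
    lat: float
    lon: float

@dataclass
class Link:                   # a road segment between two nodes
    link_id: int
    start_node: int
    end_node: int
    width_m: float = 3.5      # optional road width, assumed default
    shape: List[Tuple[float, float]] = field(default_factory=list)

@dataclass
class MapData:
    nodes: List[Node]
    links: List[Link]
    background: List[str] = field(default_factory=list)   # buildings, rivers, ground surface

# A crossroads like the one in FIG. 9 could then be stored as four links
# meeting at one central node, plus the nodes at the far ends of each road.
```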
The audio I/F 608 is connected to the microphone 609 for audio input and the speaker 610 for audio output. Sound received by the microphone 609 is A/D-converted in the audio I/F 608, and sound is output from the speaker 610. Sound input from the microphone 609 can be recorded on the magnetic disk 605 or the optical disk 607 as audio data.
Examples of the input device 611 include a remote controller, a keyboard, a mouse, and a touch panel, each provided with a plurality of keys for inputting characters, numerical values, and various instructions. Furthermore, the input device 611 can be connected to another information processing terminal, such as a digital camera or a mobile phone terminal, to input and output data.
The video I/F 612 is connected to the camera 613 for video input, the display 614 for video output, and the projector 615 for video output. Specifically, the video I/F 612 includes, for example, a graphic controller that controls the display 614 and the projector 615 as a whole, a buffer memory such as a VRAM (Video RAM) that temporarily stores image information that can be displayed immediately, and a control IC that controls display on the display 614 and the projector 615 based on image data output from the graphic controller.
The camera 613 captures video inside and outside the vehicle and outputs it as image data. Images captured by the camera 613 can be recorded on the magnetic disk 605 or the optical disk 607 as image data.
The display 614 displays icons, cursors, menus, windows, and various data such as characters and images. For example, a CRT, a TFT liquid crystal display, or a plasma display can be employed as the display 614. The display 614 implements the display screens controlled by the display unit 106 of the first embodiment and the display unit 506 of the third embodiment.
Like the display 614, the projector 615 displays icons, cursors, menus, windows, and various data such as characters and images. The projector 615 projects the various data onto the front window using, for example, a CRT or liquid crystal, and is installed, for example, on the ceiling of the vehicle or above a seat. The projector 615 implements the function of the projection unit 305 of the second embodiment.
The communication I/F 616 is wirelessly connected to a network and functions as an interface between the navigation device and the CPU 601. The communication I/F 616 is also wirelessly connected to a communication network such as the Internet and functions as an interface between that communication network and the CPU 601. Examples of the communication network include a LAN, a WAN, a public line network, and a mobile phone network.
The GPS unit 617 receives radio waves from GPS satellites and calculates information indicating the current position of the vehicle (the current position of the navigation device). The output information of the GPS unit 617 is used, together with the output values of the various sensors described later, when the CPU 601 calculates the current position of the vehicle. The information indicating the current position is information that specifies one point on the map data, such as latitude, longitude, and altitude.
The various sensors 618 are, for example, a gyro sensor, an acceleration sensor, and a vehicle speed sensor, and detect the moving state of the vehicle. Output signals from the various sensors 618 are used by the CPU 601 to calculate the current position and to measure changes in speed and heading.
In the configuration of the navigation device 100 according to the first embodiment, the current position information acquisition unit 101, the determination unit 108, and the sensor 109 are implemented by the GPS unit 617, the various sensors 618, and the CPU 601 (that is, by the CPU 601 executing programs stored in the ROM 602, the RAM 603, the magnetic disk 605, the optical disk 607, and the like); the route information acquisition unit 102 and the storage unit 103 by the CPU 601 together with the magnetic disk drive 604 and the magnetic disk 605, or the optical disk drive 606 and the optical disk 607; the generation unit 104 by the CPU 601; the video acquisition unit 105 by the CPU 601, the video I/F 612, the camera 613, or the communication I/F 616; the display unit 106 by the CPU 601, the video I/F 612, and the display 614; the imaging unit 110 by the camera 613 and the communication I/F 616; and the input unit 107 by the input device 611.
In the configuration of the navigation device 300 according to the second embodiment, the current position information acquisition unit 301, the determination unit 307, and the sensor 308 are implemented by the GPS unit 617, the various sensors 618, and the CPU 601 (executing programs stored in the ROM 602, the RAM 603, the magnetic disk 605, the optical disk 607, and the like); the route information acquisition unit 302 and the storage unit 303 by the CPU 601 together with the magnetic disk drive 604 and the magnetic disk 605, or the optical disk drive 606 and the optical disk 607; the generation unit 304 by the CPU 601; the projection unit 305 by the CPU 601, the video I/F 612, and the projector 615; and the input unit 306 by the input device 611.
In the configuration of the information display device 500 according to the third embodiment, the current position information acquisition unit 501, the determination unit 508, and the sensor 509 are implemented by the GPS unit 617, the various sensors 618, and the CPU 601 (executing programs stored in the ROM 602, the RAM 603, the magnetic disk 605, the optical disk 607, and the like); the route information acquisition unit 502 and the storage unit 503 by the CPU 601 together with the magnetic disk drive 604 and the magnetic disk 605, or the optical disk drive 606 and the optical disk 607; the generation unit 504 by the CPU 601; the video acquisition unit 505 by the CPU 601, the video I/F 612, the camera 613, or the communication I/F 616; the display unit 506 by the CPU 601, the video I/F 612, and the display 614; the input unit 507 by the input device 611; and the imaging unit 510 by the camera 613 and the communication I/F 616.
Next, the procedure by which the generation units (the generation units 104, 304, and 504) of the navigation device generate the sky route guidance display image will be described in detail. FIG. 7 is a block diagram illustrating a functional configuration of the generation unit of the navigation device. FIG. 8 is a flowchart illustrating the processing procedure of the generation unit.
As shown in FIG. 7, the generation unit includes an arrangement processing unit (arrangement unit) 701, a movement processing unit (movement unit) 702, a setting processing unit (setting unit) 703, a rendering processing unit (rendering unit) 704, and an adjustment unit 705. In the flowchart of FIG. 8, the arrangement processing unit 701 arranges band-shaped objects corresponding to planar roads in a space for three-dimensional calculation, based only on link information or based on node information and link information (step S801). At this time, based on the route information, only the link information corresponding to the route is extracted, and the portions other than the route are deleted (step S802).
The movement processing unit 702 moves the band-shaped objects arranged by the arrangement processing unit 701 by a predetermined amount (a predetermined height) in the height direction orthogonal to the plane in the space for three-dimensional calculation (step S803). The setting processing unit 703 sets the viewpoint position, viewpoint direction, and height for three-dimensional calculation with respect to the band-shaped objects moved by the movement processing unit 702 to the position, direction, and height of the imaging unit, or to the viewpoint position, viewpoint direction, and height of the operator (the driver or a passenger) (step S804). The rendering processing unit 704 renders the band-shaped objects as viewed from the viewpoint position, viewpoint direction, and height set by the setting processing unit 703, and outputs the result as the sky route guidance display image (step S805).
The adjustment unit 705 can adjust the predetermined amount (the predetermined height) from the road surface (the ground), and in step S803 the movement processing unit 702 moves the objects in the height direction orthogonal to the plane by the height adjusted by the adjustment unit 705 in the space for three-dimensional calculation. The adjustment unit 705 may perform the adjustment according to the route information (for example, the inclination of the road, the presence or absence of an intersection (node), or the presence or absence of a traffic light). The height may also be adjusted to a height desired by the operator in response to an input from the operator.
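To make the flow of FIG. 8 concrete, the following is a minimal sketch in Python of the arrange, extract, lift, and set-viewpoint stages (steps S801 to S804). The helper types `Link` and `Camera` are hypothetical and do not appear in the specification; the rasterization of step S805 is treated separately after FIG. 14. Under this sketch, calling the function with `height_m = 0` corresponds to the road surface route guidance display image described later, while a positive value (for example, 20) corresponds to the sky route guidance display image.

```python
from dataclasses import dataclass
from typing import List, Tuple

Point3 = Tuple[float, float, float]      # (x, y, z) in the 3D calculation space

@dataclass
class Link:
    band: List[Point3]                   # corners of the band-shaped object, z = 0 when arranged
    on_route: bool                       # True if the link belongs to the guided route

@dataclass
class Camera:
    position: Point3                     # imaging-unit or operator viewpoint (e.g. about 1.5 m high)
    yaw_deg: float                       # viewpoint direction

def prepare_route_objects(links: List[Link], camera: Camera,
                          height_m: float) -> Tuple[List[List[Point3]], Camera]:
    # S801/S802: arrange band-shaped objects and keep only the links on the route
    bands = [list(link.band) for link in links if link.on_route]
    # S803: lift every band by the predetermined height (effectively skipped when height_m == 0)
    bands = [[(x, y, z + height_m) for (x, y, z) in band] for band in bands]
    # S804: the viewpoint used for rendering is simply the camera / operator pose
    return bands, camera
```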
FIG. 9 is an explanatory diagram illustrating map data (node information and link information) for a crossroad intersection, and FIGS. 10-1 to 11-2 are explanatory diagrams illustrating the arrangement processing performed by the arrangement processing unit 701 of the generation unit. In FIG. 9, the map data includes a node 900 representing the intersection and links 901 to 904 representing the four roads connected to the node 900. Each link carries information such as the shape and width of the road, and FIG. 10-1 shows, from a viewpoint directly above, the planar roads arranged in the space for three-dimensional calculation based on this information.
In FIG. 10-1, the arrangement processing unit 701 arranges the link 901 shown in FIG. 9 as a planar road in the space for three-dimensional calculation, taking the shape and width of the road into consideration, to obtain a link 1001. Similarly, the arrangement processing unit 701 arranges the links 902 to 904 shown in FIG. 9 as planar roads in the space for three-dimensional calculation to obtain links (band-shaped images) 1002 to 1004.
FIG. 10-1 shows that the link 1001 is a wide street and the link 1002 is a road narrower than the link 1001. The arrangement processing unit 701 may instead arrange all links with a fixed width, without considering the road width information of the links. FIG. 10-2 shows the planar roads arranged in the space for three-dimensional calculation in FIG. 10-1 from a side viewpoint (looking from the lower side of FIG. 10-1 toward the upper side). At this point, the height (Z-axis direction) of each of the links 1001 to 1004 is 0. The reference height (that is, height 0) can be set arbitrarily; for example, the ground surface can be used as the reference.
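The band-shaped object for one link can be built from the link's centerline and road width. The sketch below shows one possible construction, assuming the centerline is available as a 2-D polyline and offsetting it by half the width on each side; the example widths (12 m and 6 m) and coordinates are illustrative values, not taken from the specification, which also allows a fixed width for all links.

```python
import math
from typing import List, Tuple

Point2 = Tuple[float, float]

def link_to_band(centerline: List[Point2], width_m: float) -> List[Point2]:
    """Return a closed polygon (left edge out, right edge back) for a road band of the given width."""
    half = width_m / 2.0
    left, right = [], []
    for i, (x, y) in enumerate(centerline):
        # direction of the adjacent segment (forward difference, backward difference at the last vertex)
        if i < len(centerline) - 1:
            dx, dy = centerline[i + 1][0] - x, centerline[i + 1][1] - y
        else:
            dx, dy = x - centerline[i - 1][0], y - centerline[i - 1][1]
        length = math.hypot(dx, dy) or 1.0
        nx, ny = -dy / length, dx / length        # unit normal pointing to the left of travel
        left.append((x + nx * half, y + ny * half))
        right.append((x - nx * half, y - ny * half))
    return left + right[::-1]

# Illustrative example: a wide link like 1001 and a narrower link like 1002 in FIG. 10-1
band_1001 = link_to_band([(0.0, -50.0), (0.0, 0.0)], 12.0)
band_1002 = link_to_band([(0.0, 0.0), (40.0, 0.0)], 6.0)
```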
FIG. 11-1 shows only the links corresponding to the route, extracted from the four links 1001 to 1004 of FIG. 10-1. Like FIG. 10-1, FIG. 11-1 shows the planar roads arranged in the space for three-dimensional calculation from a viewpoint directly above. Since the route runs from the link 901 in FIG. 9 to the node 900 and then turns right at the node 900 onto the link 902, FIG. 11-1 shows the state in which the arrangement processing unit 701 has extracted only the links 1001 and 1002 of FIG. 10-1 and deleted the links other than the route (the links 1003 and 1004).
The arrangement processing unit 701 may be configured not to arrange the links other than the route (the links 1003 and 1004) in the first place (that is, FIG. 10-1 may be omitted in the space for three-dimensional calculation, starting directly from FIG. 11-1).
Like FIG. 10-2, FIG. 11-2 shows the planar roads arranged in the space for three-dimensional calculation in FIG. 11-1 from a side viewpoint (looking from the lower side of FIG. 11-1 toward the upper side). At this point as well, the height (Z-axis direction) of the links 1001 and 1002 is 0.
FIGS. 12-1 and 12-2 are explanatory diagrams illustrating the movement processing performed by the movement processing unit 702 of the generation unit. The movement processing unit 702 moves the links 1001 and 1002 arranged by the arrangement processing unit 701 upward (in the Z-axis direction) by the predetermined height (here, about 20 m above the ground). This completes the arrangement of the objects (band-shaped objects) in the space for three-dimensional calculation.
Since FIG. 12-1 is drawn from the viewpoint directly above, as in FIGS. 10-1 and 11-1, it looks the same as FIG. 11-1 (that is, the movement in the Z-axis direction is not visible). FIG. 12-2 is drawn from the side viewpoint (looking from the lower side of FIG. 12-1 toward the upper side), as in FIGS. 10-2 and 11-2; compared with FIG. 11-2, it can be seen that the links 1001 and 1002 are 20 m higher in the Z-axis direction than their original height (height 0).
FIGS. 13-1 and 13-2 are explanatory diagrams illustrating the setting processing performed by the setting processing unit 703 of the generation unit. For the band-shaped images moved by the movement processing unit 702, the setting processing unit 703 sets a camera 1301, which represents the viewpoint position, direction, and height for three-dimensional calculation, to the position, direction, and height of the imaging unit mounted on the mobile body (for example, the host vehicle), or to the viewpoint position, direction, and height of the driver or a passenger.
FIG. 13-2 is drawn from the side viewpoint (looking from the lower side of FIG. 13-1 toward the upper side), as in FIGS. 10-2, 11-2, and 12-2. It can be seen that the camera 1301 is at a position lower than the links 1001 and 1002 in the Z-axis direction (for example, about 1.5 m above the ground) and almost directly below the link 1001. This indicates that the host vehicle is traveling on the road of the link 1001, and that the camera 1301 is mounted at a predetermined position on the host vehicle (for example, behind the rearview mirror), or that the viewpoint position of the driver or a passenger in the host vehicle is being taken into consideration.
FIG. 14 is an explanatory diagram illustrating the result of rendering by the rendering processing unit 704 of the generation unit, namely the links 1001 and 1002 as seen from the camera 1301 shown in FIGS. 13-1 and 13-2. Because the camera 1301 and the link 1001 differ in height, the link 1001 is seen by the camera 1301 from below; when this state is expressed in the space for three-dimensional calculation, the link 1001 takes a shape close to an inverted trapezoid, as shown in FIG. 14. The link 1002 likewise becomes a band-shaped image whose apparent width reflects the fact that the camera 1301 looks up at it from below. This completes the generation (drawing) of the band-shaped images (links) by the generation unit.
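The inverted-trapezoid appearance in FIG. 14 follows from ordinary perspective projection of a band that lies above the camera. The sketch below assumes a simple pinhole camera model with an illustrative focal length (the specification does not prescribe any particular camera model) and coordinates centered on the camera, with x lateral and y forward; it projects the four corners of a band 20 m above the ground as seen from a camera about 1.5 m above the ground, and the near edge comes out wider and higher in the image than the far edge.

```python
from typing import Tuple

def project(point: Tuple[float, float, float],
            cam_height: float = 1.5, focal: float = 800.0) -> Tuple[float, float]:
    """Pinhole projection: camera at (0, 0, cam_height) looking along +y."""
    x, y, z = point
    u = focal * x / y                   # horizontal image coordinate
    v = focal * (z - cam_height) / y    # vertical image coordinate (up is positive)
    return (u, v)

# Band 20 m above the ground, 8 m wide, stretching from 10 m to 100 m ahead of the camera
corners = [(-4.0, 10.0, 20.0), (4.0, 10.0, 20.0),     # near edge
           (-4.0, 100.0, 20.0), (4.0, 100.0, 20.0)]   # far edge
for c in corners:
    print(c, "->", project(c))
# The near corners map to u = ±320, v = +1480; the far corners to u = ±32, v = +148,
# i.e. a shape that is wide and high in the image near the vehicle and narrows toward the horizon.
```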
FIG. 15 is an explanatory diagram illustrating the sky route guidance display image superimposed on the forward video. The band-shaped images 1001 and 1002 generated in FIG. 14 indicate the route guidance on the forward video. Because the route guidance formed by the band-shaped images 1001 and 1002, which are continuous along the route, is displayed above the road surface, even when it is superimposed on the forward video an obstacle in the foreground, such as a vehicle traveling ahead, is not hidden by the band-shaped images, and the operator (driver or passenger) does not misread the forward video. Furthermore, as shown in FIG. 13-2, when the band-shaped images 1001 and 1002 are positioned higher than the imaging position (shooting viewpoint) of the camera 1301 and superimposed on the forward video, they appear at a position looking up toward the sky in the forward video, so the possibility that some object overlaps the band-shaped images 1001 and 1002 is low.
Next, the procedure by which the generation units (the generation units 104, 304, and 504) of the navigation device generate the road surface route guidance display image will be described in detail. The road surface route guidance display image is generated by substantially the same procedure as the sky route guidance display image described above. The only difference is whether the movement processing unit 702 shown in FIG. 7 moves the band-shaped images arranged by the arrangement processing unit 701 by the predetermined amount (predetermined height) in the height direction orthogonal to the plane in the space for three-dimensional calculation (step S803 in FIG. 8). That is, step S803 in FIG. 8 is executed when generating the sky route guidance display image, whereas when generating the road surface route guidance display image, step S803 is omitted and step S804 is executed after step S802. The rest of the processing is the same in both cases.
FIG. 16 is an explanatory diagram illustrating the road surface route guidance display image superimposed on the forward video. As shown in FIG. 16, the band-shaped images 1001 and 1002 are displayed so as to overlap the road surface in the forward video. In FIG. 16, the host vehicle position is recognized correctly, so the turning position is drawn accurately. In this case, as shown in FIG. 16, when there is no obstacle ahead, this display is considered effective as route guidance.
Next, the processing procedure by which the generation unit switches between generating the road surface route guidance display image and the sky route guidance display image will be described. The accuracy of the position is treated as the reliability of positioning, and either the road surface route guidance display image or the sky route guidance display image is generated according to the result of the reliability determination. The position accuracy may be poor, for example, when positioning is performed only by GPS and information from a vehicle speed sensor, a G sensor, a gyro sensor, an electronic compass, or the like is not obtained or cannot be obtained, such as when the navigation device body incorporates only a GPS receiver. Other examples include cases where the surroundings are enclosed by tall buildings and the GPS radio waves are obstructed, where there is no mechanism for improving positioning accuracy via a communication network (D-GPS, A-GPS), where the mounting orientation is poor and the gyro sensor cannot be used effectively, where the navigation device is not mounted correctly, or where the data from multiple sensors is inconsistent (for example, where the gyro sensor detects a rightward angular velocity but the G sensor does not detect a leftward acceleration). The position accuracy may also be poor immediately after the navigation device is first used and sensor learning has not yet finished, or immediately after the tires have been replaced.
FIG. 17 is a flowchart illustrating the procedure of the switching generation processing in the generation unit of the navigation device, and FIGS. 18 and 19 are flowcharts illustrating the procedures for determining whether the output values of the sensors are normal. The determination of whether the output values of the sensors are normal is performed by the determination unit.
In the flowchart of FIG. 17, it is first determined whether information can be acquired from predetermined positioning sensors (self-contained sensors) (step S1701). Specifically, the determination is made based on, for example, whether a vehicle speed sensor is connected and whether a gyro sensor and a G sensor are present in the device and usable. If information cannot be acquired from the predetermined positioning sensors (step S1701: No), it is determined that the accuracy of the current position is poor, and the sky route guidance display image is generated (step S1705).
On the other hand, if information can be acquired from the predetermined positioning sensors in step S1701 (step S1701: Yes), the output value of each sensor is checked (step S1702). Whether the output value of each sensor is normal or abnormal is determined according to the flowchart shown in FIG. 18 (determination by comparing the gyro sensor and the G sensor) and the flowchart shown in FIG. 19 (determination by comparing the GPS and the vehicle speed pulses). That is, the output values of the sensors may be determined to be normal if the determination result of FIG. 18 is normal or the determination result of FIG. 19 is normal (if either one is normal). Alternatively, the output values of the sensors may be determined to be normal only when both the determination result of FIG. 18 and the determination result of FIG. 19 are normal (if both are normal).
If the result of the above determination is that the output value of each sensor is normal (step S1703: Yes), it is determined that the accuracy of the current position is good, and the road surface route guidance display image is generated (step S1704). If the output value of any sensor is abnormal (step S1703: No), it is determined that the accuracy of the current position is poor, and the sky route guidance display image is generated (step S1705).
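The decision of FIG. 17 can be expressed compactly. The sketch below uses hypothetical boolean inputs standing in for steps S1701 to S1703 (`dead_reckoning_available`, `gyro_g_consistent`, `gps_pulse_consistent`); the combination rule is left as a parameter because the specification allows either variant (one check normal, or both required).

```python
def choose_route_image(dead_reckoning_available: bool,
                       gyro_g_consistent: bool,
                       gps_pulse_consistent: bool,
                       require_both: bool = False) -> str:
    # S1701: no usable self-contained sensors -> position accuracy is judged poor
    if not dead_reckoning_available:
        return "sky"                                        # S1705
    # S1702/S1703: combine the consistency checks of FIG. 18 and FIG. 19
    if require_both:
        sensors_normal = gyro_g_consistent and gps_pulse_consistent
    else:
        sensors_normal = gyro_g_consistent or gps_pulse_consistent
    return "road_surface" if sensors_normal else "sky"      # S1704 / S1705
```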
If an instruction to switch to the other image is input while either the road surface route guidance display image or the sky route guidance display image is being displayed, the display may be switched to the other image regardless of the determination processing of FIG. 17. In addition, when the display is switched from one of the road surface route guidance display image and the sky route guidance display image to the other, the display of the route display image after the switching may be maintained until a predetermined condition (for example, the lapse of a predetermined time) is satisfied.
Next, the procedure for determining whether the output values of the sensors are normal (determination by comparing the gyro sensor and the G sensor) will be described. Whether the output values of the sensors are normal is determined by whether the output values of the gyro sensor and the G sensor are consistent. In the flowchart of FIG. 18, it is determined whether the angular velocity of the gyro sensor is equal to or greater than a predetermined threshold (step S1801). If it is not equal to or greater than the predetermined threshold (step S1801: No), the output values are determined to be abnormal (step S1805).
On the other hand, if the angular velocity is equal to or greater than the predetermined threshold in step S1801 (step S1801: Yes), the rotation direction of the gyro sensor is stored (step S1802). Next, it is determined whether the G sensor is outputting, at or above a certain level, an acceleration in the direction opposite to the rotation direction of the gyro sensor (that is, in the left-right direction) (step S1803). If it is outputting at or above the certain level (step S1803: Yes), the output values of the sensors are consistent and are determined to be normal (step S1804). If it is not (step S1803: No), the output values of the sensors are not consistent and are determined to be abnormal (step S1805).
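One way to code the consistency check of FIG. 18 is sketched below. The threshold values and the sign convention (the G sensor is expected to report a lateral component opposite to the gyro's turning direction, as stated in the description) are illustrative assumptions rather than values taken from the specification.

```python
def gyro_g_consistent(gyro_yaw_rate: float, lateral_g: float,
                      yaw_threshold: float = 0.1, g_threshold: float = 0.05) -> bool:
    """FIG. 18: compare the gyro sensor and the G sensor.

    gyro_yaw_rate: signed angular velocity (rad/s), positive = turning right (assumed convention)
    lateral_g:     signed lateral acceleration (g), positive = rightward (assumed convention)
    """
    # S1801: without a sufficiently large turn there is nothing to confirm -> treated as abnormal
    if abs(gyro_yaw_rate) < yaw_threshold:
        return False                                        # S1805
    # S1802/S1803: the G sensor should output a certain level in the opposite direction
    opposite_component = -lateral_g if gyro_yaw_rate > 0 else lateral_g
    return opposite_component >= g_threshold                # S1804 / S1805
```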
Next, the procedure for determining whether the output values of the sensors are normal (determination by comparing the GPS and the vehicle speed pulses) will be described. In this case as well, whether the output values of the sensors are normal is determined by whether the output values of the GPS and the vehicle speed pulses are consistent. In the flowchart of FIG. 19, first position information obtained by the GPS is stored (step S1901). Next, the number of vehicle speed pulses is counted (step S1902), and it is determined whether a predetermined period has elapsed (step S1903). If the predetermined period has not elapsed (step S1903: No), the process returns to step S1902 and the counting continues; the vehicle speed pulses are counted until the predetermined period elapses. When the predetermined period has elapsed (step S1903: Yes), second position information obtained by the GPS is stored (step S1904).
Next, the moving distance during the predetermined period is calculated by subtracting the first position information from the second position information (step S1905). The moving distance calculated in step S1905 is then divided by the number of vehicle speed pulses counted in step S1902 to calculate the moving amount per vehicle speed pulse (step S1906).
The series of processing in steps S1901 to S1906 is repeated a predetermined number of times, and it is determined whether the series of processing has reached a predetermined number of trials X (step S1907). If the predetermined number of trials X has not been reached (step S1907: No), the process returns to step S1901 and steps S1901 to S1906 are repeated. When the predetermined number of trials X has been reached (step S1907: Yes), it is then determined whether the variation in the moving amount per vehicle speed pulse across the X trials is within a predetermined range (step S1908).
If the variation is within the predetermined range in step S1908 (step S1908: Yes), the output values of the sensors are consistent and are determined to be normal (step S1909). If the variation is outside the predetermined range (step S1908: No), the output values of the sensors are not consistent and are determined to be abnormal (step S1910).
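The check of FIG. 19 amounts to estimating the travel distance per vehicle speed pulse over X trials and testing whether the estimates are stable. A minimal sketch is shown below; the spread measure (maximum minus minimum against a tolerance) and the flat-Earth distance approximation are illustrative assumptions, since the specification only requires that the variation fall within a predetermined range.

```python
import math
from typing import List, Tuple

LatLon = Tuple[float, float]

def distance_m(p1: LatLon, p2: LatLon) -> float:
    """Approximate ground distance between two GPS fixes (equirectangular, adequate for short spans)."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p1, *p2))
    x = (lon2 - lon1) * math.cos((lat1 + lat2) / 2.0)
    y = lat2 - lat1
    return 6371000.0 * math.hypot(x, y)

def gps_pulse_consistent(trials: List[Tuple[LatLon, LatLon, int]],
                         tolerance_m: float = 0.05) -> bool:
    """FIG. 19: each trial is (first GPS fix, second GPS fix, pulse count over the period)."""
    per_pulse = []
    for first_fix, second_fix, pulses in trials:            # S1901-S1907 over X trials
        if pulses == 0:
            return False                                    # no movement information -> abnormal
        per_pulse.append(distance_m(first_fix, second_fix) / pulses)   # S1905/S1906
    spread = max(per_pulse) - min(per_pulse)                # S1908: variation across the X trials
    return spread <= tolerance_m                            # S1909 / S1910
```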
In this way, when the predetermined condition described above is satisfied, that is, when the output values of the sensors are normal and the accuracy of the current position is therefore good, the generation unit switches to generating the road surface route guidance display image; when the predetermined condition is not satisfied, the generation unit switches to generating the sky route guidance display image.
FIG. 20 is an explanatory diagram illustrating the road surface route guidance display image superimposed on the forward video, and FIG. 21 is an explanatory diagram illustrating the sky route guidance display image superimposed on the forward video. FIG. 20 shows a case where, because the position of the host vehicle is not recognized correctly, the host vehicle position is recognized about 10 m away from its actual position and the turning position is consequently drawn farther ahead than the actual turn. When the accuracy of the host vehicle position is poor in this way, if the band-shaped images 1001 and 1002 are displayed so as to overlap the road surface, the displacement between the actual right-turn road and the band-shaped image 1002 after the right turn appears large, making it difficult to identify the point (road) at which to turn.
FIG. 21 shows the same situation as FIG. 20: because the position of the host vehicle is not recognized correctly, the host vehicle position is recognized about 10 m away from its actual position and the turning position is drawn farther ahead. Although the accuracy of the host vehicle position is actually poor, the band-shaped images 1001 and 1002 are drawn in the sky, so the operator (driver or passenger) associates the band-shaped image 1002 after the right turn with the actual right-turn road displayed below it and estimates the point (road) at which to turn. Therefore, in both FIG. 15 and FIG. 21, the operator (driver or passenger) can easily recognize at which point (road) to turn.
Comparing the road surface route guidance display image with the sky route guidance display image in this way, when the reliability of positioning of the host vehicle position is high (the positioning accuracy is good), the road surface route guidance display image is easier to understand, but when the reliability of positioning is low (the positioning accuracy is poor), the road surface route guidance display image instead becomes harder to understand. It is therefore effective to switch between the two according to the accuracy of the host vehicle position.
Furthermore, if an attribute relating to surveying (for example, an attribute indicating the survey method or survey accuracy) is stored for each road on the map, the sky route guidance display image may be generated when the accuracy of the map data is poor (for example, when detailed measurement was not performed in creating the map data). That is, instead of, or in addition to, the reliability of positioning of the host vehicle position, the height of the route display image from the road surface may be determined according to the reliability of the accuracy of the map data. When there is a succession of nearby roads onto which the vehicle could turn, the sky route guidance display image may make it difficult to see where to turn, so the road surface route guidance display image may be generated instead. When there are few roads onto which to turn, or when the road undulates (including slopes), the road surface route guidance display image may not match the shape of the undulations, so the sky route guidance display image may be generated instead; conversely, when the road is flat, the road surface route guidance display image may be generated.
When switching between the road surface route guidance display image and the sky route guidance display image based on the position accuracy, the switching may occur frequently. To prevent this, the position accuracy determination may be suppressed near a guidance point so that switching does not occur there. Alternatively, once the position accuracy has been determined to be poor even once, the sky route guidance display image may continue to be displayed unless an operation instruction or the like is given, or until the device is powered off. By maintaining the display of either the road surface route guidance display image or the sky route guidance display image until such a predetermined condition is satisfied, frequent switching can be prevented. One way such measures could be combined is sketched after this paragraph.
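The sketch below combines two of the anti-chattering measures described above (skipping the determination near a guidance point, and latching onto the sky image once accuracy has been judged poor) into a small state holder. The distance threshold and the latch-until-power-off behavior are assumptions chosen for illustration; the specification leaves the concrete condition open.

```python
class RouteImageSelector:
    """Keeps the current choice stable instead of re-deciding on every positioning update."""

    def __init__(self, near_guidance_point_m: float = 100.0):
        self.near_guidance_point_m = near_guidance_point_m
        self.current = "road_surface"
        self.latched_to_sky = False      # set once accuracy is judged poor; cleared only at power-off

    def update(self, accuracy_good: bool, distance_to_guidance_point_m: float) -> str:
        # Near a guidance point, skip the accuracy judgement so the image does not flip mid-turn
        if distance_to_guidance_point_m < self.near_guidance_point_m:
            return self.current
        if not accuracy_good:
            self.latched_to_sky = True
        self.current = "sky" if self.latched_to_sky else "road_surface"
        return self.current
```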
Instead of switching between the road surface route guidance display image and the sky route guidance display image, both may be displayed simultaneously. In that case, the display form of the route (color, transparency, and so on) may be varied according to the position accuracy.
FIG. 22 is an explanatory diagram illustrating the sky route guidance display image projected onto the front window. In FIG. 22, the sky route guidance display image (the links 1001 and 1002) is displayed on the front window 2201. Switching between the road surface route guidance display image and the sky route guidance display image according to the accuracy of the host vehicle position is equally effective when projecting onto the front window 2201. The target onto which a route display image such as the road surface route guidance display image or the sky route guidance display image is projected is not limited to the front window; it may be a side or rear window of the vehicle, or a transparent member arranged inside the vehicle onto which the route display image is projected. That is, if the route display image is projected onto a transparent member positioned between the user's viewpoint and the road surface of the road corresponding to the route, the user can recognize the route simply by looking outside the vehicle.
As described above, according to the present example, the current position of the mobile body is measured to acquire current position information, route information relating to the route to the destination is acquired, a video of the road surface of the road corresponding to the route is acquired, and a route display image generated based on the current position information and the route information is displayed so as to be superimposed at a predetermined height from the position of the road surface in the video. At that time, a route display image whose predetermined height has been changed according to the reliability of positioning of the current position can be displayed. Furthermore, the reliability of positioning of the current position is determined; when the reliability of positioning of the current position is high, a road surface route guidance display image whose predetermined height is a position directly above the road surface is generated, and when the reliability of positioning of the current position is low, a sky route guidance display image whose predetermined height is a position higher than the viewpoint from which the video is captured is generated.
Also according to the present example, the current position of the mobile body is measured to acquire current position information, route information relating to the route to the destination is acquired, and a route display image generated based on the current position information and the route information is projected onto a transparent member positioned between the viewpoint of the person moving with the mobile body and the road surface of the road corresponding to the route, at a predetermined height from the position of the road surface on the transparent member as seen by that person. At that time, a route display image whose predetermined height has been changed according to the reliability of positioning of the current position can be projected. Furthermore, the reliability of positioning of the current position is determined; when the reliability of positioning of the current position is high, a road surface route guidance display image whose predetermined height is a position directly above the road surface is generated, and when the reliability of positioning of the current position is low, a sky route guidance display image whose predetermined height is a position higher than the viewpoint of the moving person is generated.
By providing continuous route guidance along the route in this manner, more reliable route guidance can be realized, and misrecognition of the route caused by the displacement between the actual road and the route guidance, which occurs when the position accuracy is poor, can be reduced.
Furthermore, according to the present example, an input of an instruction to switch to the other image is accepted while either the road surface route guidance display image or the sky route guidance display image is displayed, and when the input is accepted, the display can be switched to the other image regardless of the reliability of positioning. In addition, when the display is switched from one of the road surface route guidance display image and the sky route guidance display image to the other, the display of the route display image after the switching is maintained until a predetermined condition is satisfied, so the route guidance desired by the operator can continue to be displayed.
The navigation method described in the present embodiment can be realized by executing a program prepared in advance on a computer such as a personal computer or a workstation. The program is recorded on a computer-readable recording medium such as a hard disk, a flexible disk, a CD-ROM, an MO, or a DVD, and is executed by being read out from the recording medium by the computer. The program may also be a transmission medium that can be distributed via a network such as the Internet.
DESCRIPTION OF REFERENCE NUMERALS
100 Navigation device
101 Current position information acquisition unit
102 Route information acquisition unit
103 Storage unit
104 Generation unit
105 Video acquisition unit
106 Display unit
107 Input unit
108 Determination unit
109 Sensor
300 Navigation device
301 Current position information acquisition unit
302 Route information acquisition unit
303 Storage unit
304 Generation unit
305 Projection unit
306 Input unit
307 Determination unit
308 Sensor
500 Information display device
501 Current position information acquisition unit
502 Route information acquisition unit
503 Storage unit
504 Generation unit
505 Video acquisition unit
506 Display unit
507 Input unit
508 Determination unit
509 Sensor
510 Imaging unit

Claims (12)

  1.  A navigation device comprising:
      a current position information acquisition means for measuring a current position of a mobile body and acquiring current position information;
      a route information acquisition means for acquiring route information relating to a route to a destination;
      a video acquisition means for acquiring a video in which a road surface of a road corresponding to the route is captured; and
      a display means for displaying a route display image generated based on the current position information and the route information so as to be superimposed at a predetermined height from a position of the road surface in the video,
      wherein the display means displays the route display image with the predetermined height changed according to a reliability of positioning of the current position.
  2.  The navigation device according to claim 1, further comprising:
      a determination means for determining the reliability of positioning of the current position; and
      a generation means for generating, when the reliability of positioning of the current position is high, the route display image with the predetermined height set to a position directly above the road surface (hereinafter referred to as a "road surface route guidance display image"), and for generating, when the reliability of positioning of the current position is low, the route display image with the predetermined height set to a position higher than a viewpoint from which the video is captured (hereinafter referred to as a "sky route guidance display image").
  3.  A navigation device comprising:
      a current position information acquisition means for measuring a current position of a mobile body and acquiring current position information;
      a route information acquisition means for acquiring route information relating to a route to a destination; and
      a projection means for projecting a route display image generated based on the current position information and the route information onto a transparent member positioned between a viewpoint of a person moving with the mobile body and a road surface of a road corresponding to the route, at a predetermined height from a position of the road surface on the transparent member as seen from the moving person,
      wherein the projection means projects the route display image with the predetermined height changed according to a reliability of positioning of the current position.
  4.  The navigation device according to claim 3, further comprising:
      a determination means for determining the reliability of positioning of the current position; and
      a generation means for generating, when the reliability of positioning of the current position is high, the route display image with the predetermined height set to a position directly above the road surface (hereinafter referred to as a "road surface route guidance display image"), and for generating, when the reliability of positioning of the current position is low, the route display image with the predetermined height set to a position higher than the viewpoint of the moving person (hereinafter referred to as a "sky route guidance display image").
  5.  The navigation device according to claim 2 or 4, further comprising an input means for receiving an input of an instruction to switch to the other of the road surface route guidance display image and the sky route guidance display image while either one of them is being displayed,
      wherein, when the input means receives the input, the display means switches to and displays the other image regardless of the reliability of the positioning.
  6.  The navigation device according to claim 2 or 4, wherein, when the display means switches the display from one of the road surface route guidance display image and the sky route guidance display image to the other, the display means maintains the display of the route display image after the switching until a predetermined condition is satisfied.
  7.  The navigation device according to claim 2 or 4, further comprising a plurality of self-contained navigation sensors for measuring the current position and a heading,
      wherein the determination means determines the reliability according to whether positioning results of the self-contained navigation sensors are consistent with one another.
  8.  The navigation device according to claim 2 or 4, further comprising a self-contained navigation sensor for measuring the current position and a heading, and a GPS sensor,
      wherein the determination means determines the reliability according to whether positioning results of the self-contained navigation sensor and the GPS sensor are consistent with each other.
  9.  A navigation method in a navigation device, comprising:
      a current position information acquisition step of measuring a current position of a mobile body and acquiring current position information;
      a route information acquisition step of acquiring route information relating to a route to a destination;
      a video acquisition step of acquiring a video in which a road surface of a road corresponding to the route is captured; and
      a display step of displaying a route display image generated based on the current position information and the route information so as to be superimposed at a predetermined height from a position of the road surface in the video,
      wherein, in the display step, the route display image with the predetermined height changed according to a positioning accuracy of the current position is displayed.
10. A navigation method in a navigation device, comprising:
a current position information acquisition step of measuring a current position of a mobile body and acquiring current position information;
a route information acquisition step of acquiring route information relating to a route to a destination; and
a projection step of projecting a route display image generated based on the current position information and the route information onto a transmissive member positioned between the viewpoint of a moving person of the mobile body and the road surface of the road corresponding to the route, at a predetermined height from the position of the road surface on the transmissive member as seen from the moving person,
wherein, in the projection step, the route display image whose predetermined height has been changed according to the positioning accuracy of the current position is projected.
11. A navigation program that causes a computer to execute the navigation method according to claim 9 or 10.
12. A computer-readable recording medium on which the navigation program according to claim 11 is recorded.
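The device claims above (claims 4 to 6 in particular) describe a small piece of display logic: draw the route display image directly above the road surface when positioning is reliable, draw it above the viewpoint when it is not, allow a manual toggle, and hold the chosen display after any switch until a predetermined condition is met. The Python sketch below only illustrates that behavior; the class names, the frame-count interpretation of the "predetermined condition", and the threshold value are assumptions and are not taken from the patent.

```python
# Minimal sketch of the display-selection behavior described in claims 4-6.
# All identifiers and thresholds are hypothetical.
from dataclasses import dataclass
from enum import Enum, auto
from typing import ClassVar


class DisplayMode(Enum):
    ROAD_SURFACE = auto()   # route image drawn directly above the road surface
    AERIAL = auto()         # route image drawn above the moving person's viewpoint


@dataclass
class GuidanceDisplay:
    HOLD_AFTER_SWITCH: ClassVar[int] = 150  # stand-in for claim 6's "predetermined condition"
    mode: DisplayMode = DisplayMode.ROAD_SURFACE
    hold_frames: int = 0                    # frames left before automatic switching resumes

    def update(self, positioning_reliable: bool, manual_switch: bool = False) -> DisplayMode:
        """Choose the route display image for the current frame."""
        if manual_switch:
            # Claim 5: a user instruction toggles the mode regardless of reliability.
            self.mode = (DisplayMode.AERIAL if self.mode is DisplayMode.ROAD_SURFACE
                         else DisplayMode.ROAD_SURFACE)
            self.hold_frames = self.HOLD_AFTER_SWITCH
        elif self.hold_frames > 0:
            # Claim 6: keep the display after a switch until the condition is satisfied.
            self.hold_frames -= 1
        else:
            # Claim 4: reliable positioning -> road-surface image; otherwise aerial image.
            desired = (DisplayMode.ROAD_SURFACE if positioning_reliable
                       else DisplayMode.AERIAL)
            if desired is not self.mode:
                self.mode = desired
                self.hold_frames = self.HOLD_AFTER_SWITCH
        return self.mode
```

For example, `GuidanceDisplay().update(positioning_reliable=False)` switches to `DisplayMode.AERIAL` and, because of the hold counter, keeps that mode for the next 150 frames even if reliability recovers sooner.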
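Claims 7 and 8 determine that reliability by checking whether independent position fixes agree: among several self-contained navigation (dead reckoning) sensors in claim 7, or between a self-contained navigation sensor and a GPS sensor in claim 8. A minimal sketch of such a consistency check follows, assuming fixes expressed as (x, y) coordinates in a local metric frame and an invented 15 m agreement threshold; neither assumption appears in the patent.

```python
import math

MAX_DISAGREEMENT_M = 15.0  # hypothetical threshold; the patent does not specify one


def positions_consistent(fix_a, fix_b, threshold=MAX_DISAGREEMENT_M):
    """True when two (x, y) fixes in a local metric frame lie within the threshold."""
    return math.dist(fix_a, fix_b) <= threshold


def positioning_reliable(dead_reckoning_fixes, gps_fix=None):
    """Claim 7: the dead-reckoning fixes must agree with one another.
    Claim 8: a dead-reckoning fix must also (or instead) agree with the GPS fix."""
    fixes = list(dead_reckoning_fixes)
    if gps_fix is not None:
        fixes.append(gps_fix)
    reference = fixes[0]
    return all(positions_consistent(reference, f) for f in fixes[1:])
```

With these assumptions, `positioning_reliable([(0, 0), (3, 4)], gps_fix=(1, 1))` returns `True`, while a GPS fix 100 m away from the dead-reckoning estimate would return `False` and push the guidance toward the aerial route guidance display image.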
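In method claims 9 and 10, the "predetermined height" decides where the route display image ends up in the captured video frame or on the head-up display. Under a simple pinhole-projection assumption, a road-surface overlay (height 0) appears below the horizon line, while an overlay placed above the viewpoint appears above it. The sketch below illustrates only that geometric point; the eye height, image size, and field of view are invented example values.

```python
import math


def overlay_image_row(distance_m, overlay_height_m, eye_height_m=1.2,
                      image_rows=480, vertical_fov_deg=40.0):
    """Approximate image row (0 = top) of a point 'distance_m' ahead of the viewpoint
    and 'overlay_height_m' above the road, for a level pinhole camera at eye height."""
    # Positive angle means the point lies below the horizon line.
    angle = math.atan2(eye_height_m - overlay_height_m, distance_m)
    rows_per_radian = image_rows / math.radians(vertical_fov_deg)
    return image_rows / 2 + angle * rows_per_radian


# A road-surface overlay 20 m ahead lands below the image center...
print(overlay_image_row(20.0, 0.0))   # ~281 (below row 240)
# ...while an "aerial" overlay above the 1.2 m viewpoint lands above it.
print(overlay_image_row(20.0, 2.5))   # ~195 (above row 240)
```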
PCT/JP2010/057392 2010-04-26 2010-04-26 Navigation system, navigation method, navigation program, and storage medium WO2011135660A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2010/057392 WO2011135660A1 (en) 2010-04-26 2010-04-26 Navigation system, navigation method, navigation program, and storage medium
JP2011530191A JP4833384B1 (en) 2010-04-26 2010-04-26 Navigation device, navigation method, navigation program, and recording medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2010/057392 WO2011135660A1 (en) 2010-04-26 2010-04-26 Navigation system, navigation method, navigation program, and storage medium

Publications (1)

Publication Number Publication Date
WO2011135660A1 (en)

Family

ID=44861007

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2010/057392 WO2011135660A1 (en) 2010-04-26 2010-04-26 Navigation system, navigation method, navigation program, and storage medium

Country Status (2)

Country Link
JP (1) JP4833384B1 (en)
WO (1) WO2011135660A1 (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0525476U (en) * 1991-02-22 1993-04-02 株式会社ケンウツド In-vehicle navigation device
JP2003344062A (en) * 2002-05-30 2003-12-03 Alpine Electronics Inc Navigation apparatus
JP2008501956A (en) * 2004-06-03 2008-01-24 メイキング バーチャル ソリッド,エル.エル.シー. Navigation display method and apparatus using head-up display
WO2009084129A1 (en) * 2007-12-28 2009-07-09 Mitsubishi Electric Corporation Navigation device

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013114617A1 (en) * 2012-02-03 2013-08-08 パイオニア株式会社 Image-display device, method for displaying image, and image-display program
CN103245345A (en) * 2013-04-24 2013-08-14 浙江大学 Indoor navigation system, indoor navigation method and indoor searching method based on image sensing technology
KR20160104825A (en) * 2015-02-26 2016-09-06 엘지전자 주식회사 Apparatus for guiding traffic lane and method thereof
KR101698104B1 (en) * 2015-02-26 2017-01-20 엘지전자 주식회사 Apparatus for guiding traffic lane and method thereof
JP2018020779A (en) * 2017-09-29 2018-02-08 日本精機株式会社 Vehicle information projection system
WO2020121810A1 (en) * 2018-12-14 2020-06-18 株式会社デンソー Display control device, display control program, and tangible, non-transitory computer-readable recording medium
JP2020097399A (en) * 2018-12-14 2020-06-25 株式会社デンソー Display control device and display control program
US20210223058A1 (en) * 2018-12-14 2021-07-22 Denso Corporation Display control device and non-transitory computer-readable storage medium for the same
JP7052786B2 (en) 2018-12-14 2022-04-12 株式会社デンソー Display control device and display control program
JP2022079590A (en) * 2018-12-14 2022-05-26 株式会社デンソー Display control device and display control program
JP7416114B2 (en) 2018-12-14 2024-01-17 株式会社デンソー Display control device and display control program

Also Published As

Publication number Publication date
JPWO2011135660A1 (en) 2013-07-18
JP4833384B1 (en) 2011-12-07

Similar Documents

Publication Publication Date Title
US20230029160A1 (en) Apparatus and Methods of Displaying Navigation Instructions
US8180567B2 (en) Navigation device with camera-info
US8423292B2 (en) Navigation device with camera-info
JP4705170B2 (en) Navigation device and method for scrolling map data displayed on navigation device
JP2006084208A (en) Navigation system and travelling direction guidance method
JP4833384B1 (en) Navigation device, navigation method, navigation program, and recording medium
JP2015105903A (en) Navigation device, head-up display, control method, program, and storage medium
RU2375756C2 (en) Navigation device with information received from camera
JP5702476B2 (en) Display device, control method, program, storage medium
WO2011121788A1 (en) Navigation device, information display device, navigation method, navigation program, and recording medium
JP5438172B2 (en) Information display device, information display method, information display program, and recording medium
KR20080019690A (en) Navigation device with camera-info
JP5356483B2 (en) Navigation device and navigation method
JP2011022152A (en) Navigation device

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase (Ref document number: 2011530191; Country of ref document: JP)
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 10850679; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 10850679; Country of ref document: EP; Kind code of ref document: A1)