WO2011135660A1 - Navigation system, navigation method, navigation program, and storage medium - Google Patents

Navigation system, navigation method, navigation program, and storage medium

Info

Publication number
WO2011135660A1
WO2011135660A1 (application PCT/JP2010/057392)
Authority
WO
WIPO (PCT)
Prior art keywords
route
current position
display image
information
road surface
Prior art date
Application number
PCT/JP2010/057392
Other languages
English (en)
Japanese (ja)
Inventor
Tomohiro Hirose (智博 廣瀬)
Original Assignee
Pioneer Corporation (パイオニア株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Pioneer Corporation
Priority to PCT/JP2010/057392
Priority to JP2011530191A (JP4833384B1)
Publication of WO2011135660A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/36 Input/output arrangements for on-board computers
    • G01C21/3626 Details of the output of route guidance instructions
    • G01C21/3647 Guidance involving output of stored or live camera images or video streams
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/36 Input/output arrangements for on-board computers
    • G01C21/3626 Details of the output of route guidance instructions
    • G01C21/365 Guidance using head up displays or projectors, e.g. virtual vehicles or arrows projected on the windscreen or on the road itself

Definitions

  • The present invention relates to a navigation device, a navigation method, a navigation program, and a recording medium that display a route to a destination.
  • The use of the present invention is not limited to the above-described navigation device, navigation method, navigation program, and recording medium.
  • In Patent Document 1, the guidance display is continuously superimposed on the road surface of the road. When the positioning accuracy of the vehicle's own position is good, drawing the guidance display on the road surface makes it easy to understand where to turn (see FIG. 16). However, when the positioning accuracy of the vehicle's own position is poor, the road in the video and the guidance display become misaligned (see FIG. 20).
  • A navigation device according to the invention includes: current position information acquisition means for measuring the current position of a mobile body and acquiring current position information; route information acquisition means for acquiring route information relating to a route to a destination; video acquisition means for acquiring video of the road surface of the road corresponding to the route; and display means for displaying a route display image, generated based on the current position information and the route information, superimposed on the video at a predetermined height from the position of the road surface. The display means displays the route display image with the predetermined height changed according to the reliability of positioning of the current position.
  • A navigation device according to another aspect includes: current position information acquisition means for measuring the current position of a mobile body and acquiring current position information; route information acquisition means for acquiring route information regarding a route to a destination; and projection means for projecting a route display image, generated based on the current position information and the route information, onto a transmissive member located between the viewpoint of an occupant of the mobile body and the road surface corresponding to the route, at a predetermined height from the position of the road surface on the transmissive member as seen by the occupant. The projection means projects the route display image with the predetermined height changed according to the reliability of positioning of the current position.
  • A navigation method according to the invention is a navigation method in a navigation device, and includes: a current position information acquisition step of measuring the current position of a mobile body and acquiring current position information; a route information acquisition step of acquiring route information relating to a route to a destination; a video acquisition step of acquiring video of the road surface of the road corresponding to the route; and a display step of displaying a route display image, generated based on the current position information and the route information, superimposed on the video at a predetermined height from the position of the road surface. In the display step, the route display image with the predetermined height changed according to the reliability of positioning of the current position is displayed.
  • A navigation method according to another aspect is a navigation method in a navigation device, and includes: a current position information acquisition step of measuring the current position of a mobile body and acquiring current position information; a route information acquisition step of acquiring route information relating to a route to a destination; and a projection step of projecting a route display image, generated based on the current position information and the route information, onto a transmissive member located between the viewpoint of an occupant of the mobile body and the road surface of the road corresponding to the route, at a predetermined height from the position of the road surface on the transmissive member as seen by the occupant. In the projection step, the route display image with the predetermined height changed according to the reliability of positioning of the current position is projected.
  • A navigation program according to the invention of claim 11 causes a computer to execute the navigation method according to claim 9 or 10.
  • The recording medium according to the invention of claim 12 is a computer-readable recording medium on which the navigation program according to claim 11 is recorded.
  • FIG. 1 is a block diagram of a functional configuration of the navigation device according to the first embodiment.
  • FIG. 2 is a flowchart of a navigation process performed by the navigation device according to the first embodiment.
  • FIG. 3 is a block diagram of a functional configuration of the navigation device according to the second embodiment.
  • FIG. 4 is a flowchart of a navigation process performed by the navigation device according to the second embodiment.
  • FIG. 5 is a block diagram of a functional configuration of the information display apparatus according to the third embodiment.
  • FIG. 6 is a block diagram illustrating a hardware configuration of the navigation device.
  • FIG. 7 is a block diagram illustrating a functional configuration of the generation unit of the navigation device.
  • FIG. 8 is a flowchart illustrating a processing procedure of the generation unit of the navigation device.
  • FIG. 9 is an explanatory diagram showing map data (node information, link information) at intersections of crossroads.
  • FIG. 10A is an explanatory diagram of the contents of the placement process in the placement processing unit of the generation unit.
  • FIG. 10B is an explanatory diagram of the contents of the placement process in the placement processing unit of the generation unit.
  • FIG. 11A is an explanatory diagram of the contents of the placement process in the placement processing unit of the generation unit.
  • FIG. 11B is an explanatory diagram of the contents of the placement processing in the placement processing unit of the generation unit.
  • FIG. 12A is an explanatory diagram of the contents of the movement process in the movement processing unit of the generation unit.
  • FIG. 12B is an explanatory diagram of the contents of the movement process in the movement processing unit of the generation unit.
  • FIG. 13A is an explanatory diagram of the contents of setting processing in the setting processing unit of the generation unit.
  • FIG. 13B is an explanatory diagram of the contents of setting processing in the setting processing unit of the generation unit.
  • FIG. 14 is an explanatory diagram showing a state rendered by the rendering processing unit of the generation unit.
  • FIG. 15 is an explanatory diagram showing a state in which an aerial route guidance display image is superimposed on the front video.
  • FIG. 16 is an explanatory diagram showing a state in which a road surface route guidance display image is displayed superimposed on the front video.
  • FIG. 17 is a flowchart illustrating a procedure of switching generation processing in the generation unit of the navigation device.
  • FIG. 18 is a flowchart showing a further procedure of the switching generation processing in the generation unit of the navigation device.
  • FIG. 20 is an explanatory diagram illustrating a state in which a road surface route guidance display image is displayed over the front video.
  • FIG. 21 is an explanatory diagram showing a state in which an aerial route guidance display image is displayed superimposed on the front video.
  • FIG. 22 is an explanatory diagram showing a state in which the sky route guidance display image is projected on the front window.
  • FIG. 1 is a block diagram of a functional configuration of the navigation device according to the first embodiment.
  • The navigation apparatus 100 according to the first embodiment includes a current position information acquisition unit 101, a route information acquisition unit 102, a storage unit 103, a generation unit 104, a video acquisition unit 105, a display unit 106, an input unit 107, a determination unit 108, and a sensor 109.
  • The current position information acquisition unit 101 measures the current position of the mobile body and acquires current position information.
  • The information related to the current position of the mobile body may include, in addition to latitude and longitude, information such as the speed and travel direction of the mobile body.
  • Information relating to the current position of the mobile body can be acquired by, for example, a sensor 109 described later.
  • The route information acquisition unit 102 acquires route information relating to the route to the destination.
  • The route information may be registered in advance, received via communication means, or entered by an operator (driver or passenger). Alternatively, only the destination may be input by the operator; a route is then searched based on the current position of the mobile body (for example, the host vehicle) and the input destination, and the determined route is acquired as route information.
  • The route information may include link information of each road corresponding to the route to the destination and node information indicating intersections.
  • The storage unit 103 stores so-called map information, specifically, for example, link information including the shape of each road and node information indicating intersections.
  • The map information may further include traffic light and sign information, building information (for example, building shape and height), and the like.
  • The link information may include road width and inclination information.
  • The storage unit 103 may be provided in the navigation device 100, or an external server accessible by communication may realize the function of the storage unit 103.
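For illustration, the node/link map structure described in the bullets above might be modeled as follows. This is a minimal sketch; the class and field names (`Node`, `Link`, `width_m`, `incline_deg`, `RouteInfo`) are assumptions for illustration, not taken from the patent:

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    """An intersection in the map data (cf. FIG. 9)."""
    node_id: int
    lat: float
    lon: float

@dataclass
class Link:
    """A road segment connecting two nodes."""
    link_id: int
    start_node: int
    end_node: int
    width_m: float = 0.0      # optional road width, per the description
    incline_deg: float = 0.0  # optional inclination, per the description

@dataclass
class RouteInfo:
    """Route to the destination as an ordered list of links."""
    links: list = field(default_factory=list)  # list[Link], in travel order

    def node_sequence(self):
        # Nodes (intersections) visited along the route, in order.
        if not self.links:
            return []
        seq = [self.links[0].start_node]
        seq += [ln.end_node for ln in self.links]
        return seq
```

A route is then simply an ordered list of links, from which the sequence of intersections along the route can be recovered.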
  • The generation unit 104 generates a route display image based on the current position information acquired by the current position information acquisition unit 101 and the route information acquired by the route information acquisition unit 102. Specifically, for example, it generates either a road surface route guidance display image, a belt-like image that is continuous along the route and displayed so as to overlap the road surface, or a sky route guidance display image, a belt-like image that is continuous along the route and displayed at a predetermined height above the road surface, switching between the two at a predetermined timing. When the reliability of positioning of the current position is high, a road surface route guidance display image whose predetermined height is the position directly above the road surface is generated; when the reliability is low, a sky route guidance display image whose predetermined height is above the viewpoint from which the video is shot is generated. Specific methods for generating the road surface route guidance display image and the sky route guidance display image, the switching timing, and so on are described later.
  • The generation unit 104 may switch from the road surface route guidance display image to the sky route guidance display image when a predetermined condition is met, and likewise may switch from the sky route guidance display image back to the road surface route guidance display image when a predetermined condition is met. The specific contents of the predetermined conditions are described later.
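The reliability-dependent choice between the two images, together with a "predetermined condition" gating the switch, can be sketched as follows. This is a hedged illustration: the hold-time condition and its value are assumptions, since the patent defers the specific switching conditions to the later description.

```python
ROAD_SURFACE = "road_surface"  # image drawn directly on the road surface
SKY = "sky"                    # image drawn above the camera viewpoint

class GuidanceImageSelector:
    """Chooses which route guidance display image to generate, with a
    minimum hold time (standing in for the 'predetermined condition')
    so the display does not flicker between modes."""

    def __init__(self, min_hold_s=5.0):
        self.mode = ROAD_SURFACE
        self.min_hold_s = min_hold_s
        self._last_switch_t = None

    def update(self, reliable: bool, now_s: float) -> str:
        desired = ROAD_SURFACE if reliable else SKY
        if desired != self.mode:
            # Only switch when the hold time has elapsed since the
            # previous switch (or on the very first decision).
            if (self._last_switch_t is None
                    or now_s - self._last_switch_t >= self.min_hold_s):
                self.mode = desired
                self._last_switch_t = now_s
        return self.mode
```

With a 5-second hold, a brief return of good positioning shortly after a switch to the sky image does not immediately flip the display back.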
  • The video acquisition unit 105 acquires video of the road surface corresponding to the route (for example, from the imaging unit 110).
  • The imaging unit 110 may be included in the navigation device 100.
  • The display unit 106 displays the route display image generated based on the current position information and the route information (for example, by the generation unit 104) superimposed at a predetermined height from the position of the road surface on the video (for example, acquired by the video acquisition unit 105).
  • The display unit 106 displays a route display image whose predetermined height has been changed according to the reliability of positioning of the current position. Further, when switching from one of the road surface route guidance display image and the sky route guidance display image to the other, the display unit 106 may maintain the display of the post-switch route display image until a predetermined condition is satisfied.
  • The input unit 107 accepts an input of an instruction to switch to the other image while either the road surface route guidance display image or the sky route guidance display image is displayed. When the input unit 107 receives such an input, the display unit 106 may switch to the other display regardless of the reliability of positioning determined by the determination unit 108 described later.
  • The determination unit 108 determines the reliability of positioning of the current position (for example, as acquired by the current position information acquisition unit 101) and passes the determination result to the generation unit 104. Details of the specific determination processing are described later.
  • The generation unit 104 generates a road surface route guidance display image with the predetermined height at the position directly above the road surface when the reliability of positioning of the current position is high, and may generate a sky route guidance display image with the predetermined height above the viewpoint when the reliability is low.
  • The sensor 109 may include a plurality of self-contained (dead-reckoning) navigation sensors that measure the current position and direction. The determination unit 108 can then determine the reliability based on whether the positioning results of the individual self-contained navigation sensors are consistent with one another.
  • Alternatively, the sensor 109 may include a self-contained navigation sensor that measures the current position and direction and a GPS sensor. The determination unit 108 can then determine the reliability based on whether the positioning results of the self-contained navigation sensor and the GPS sensor match.
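As a sketch of the second reliability check (dead-reckoning fix versus GPS fix), the two positions can be compared against a distance threshold. The threshold value and function names here are assumptions for illustration, since the patent only requires that the two results "match":

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Approximate great-circle distance in metres between two fixes."""
    r = 6_371_000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = p2 - p1
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def positioning_is_reliable(dr_fix, gps_fix, max_gap_m=30.0):
    """Reliability is judged high when the dead-reckoning fix and the
    GPS fix agree to within max_gap_m (threshold chosen for
    illustration). Fixes are (lat, lon) tuples in degrees."""
    gap = haversine_m(dr_fix[0], dr_fix[1], gps_fix[0], gps_fix[1])
    return gap <= max_gap_m
```

The multi-sensor variant of the check is analogous: compute pairwise gaps between the self-contained sensors' fixes and require all of them to fall under the threshold.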
  • FIG. 2 is a flowchart showing a procedure of navigation processing by the navigation device.
  • The navigation device 100 first measures the current position of the mobile body with the current position information acquisition unit 101 and acquires the current position information (step S201). At this time, information regarding the reliability of positioning of the current position may also be acquired.
  • The route information acquisition unit 102 acquires route information regarding the route to the destination (step S202). At that time, the navigation device 100 may also acquire road information (step S203).
  • The navigation device 100 generates a route display image (a road surface route guidance display image or a sky route guidance display image) according to the reliability of positioning of the current position, based on the current position information and the route information (step S204).
  • The navigation device 100 acquires a video image of the road surface corresponding to the route (step S205). Then, the navigation device 100 displays the route display image generated based on the current position information and the route information superimposed at a predetermined height from the position of the road surface on the video (step S206).
  • The navigation apparatus 100 waits for the current position to change as the mobile body (for example, the host vehicle) travels (step S207: No). When the current position changes (step S207: Yes), it determines whether the destination has been reached (step S208). If the destination has not yet been reached (step S208: No), the process returns to step S201, and steps S201 to S207 are repeated. When the destination is reached (step S208: Yes), the series of processes ends.
  • In this way, a route display image (either the road surface route guidance display image or the sky route guidance display image, according to the reliability of positioning of the current position) continues to be displayed on the video until the mobile body (for example, the host vehicle) arrives at the destination. By viewing the route display image displayed together with the video on the display screen while traveling, the operator can recognize the route to the destination without error.
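The repeated processing of steps S201 to S208 can be sketched as a loop like the following. The callables (`get_fix`, `render`, and so on) are placeholders standing in for the units described above, not APIs defined by the patent:

```python
def navigation_loop(get_fix, get_route, get_video, render, at_destination,
                    position_changed):
    """One possible shape of the S201-S208 loop: acquire position and
    route, generate and display the guidance image, then repeat until
    the destination is reached."""
    while True:
        pos, reliable = get_fix()          # S201 (with reliability)
        route = get_route()                # S202 (road info: S203)
        # S204: choose the image type from the positioning reliability.
        image = "road_surface" if reliable else "sky"
        frame = get_video()                # S205
        render(frame, image, pos, route)   # S206: superimpose and show
        while not position_changed():      # S207: wait for movement
            pass
        if at_destination(pos):            # S208
            break
```

In a real device the image-type decision would be the generation unit's switching logic and `render` would place the belt-like image at the chosen height over the frame; here both are collapsed to keep the control flow visible.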
  • The navigation device 100 may be configured as a single device integrating the component units, or each component unit may be a separate device, with the function of the navigation device 100 realized by a plurality of devices.
  • FIG. 3 is a block diagram of a functional configuration of the navigation device according to the second embodiment.
  • The navigation apparatus 300 according to the second embodiment includes a current position information acquisition unit 301, a route information acquisition unit 302, a storage unit 303, a generation unit 304, a projection unit 305, an input unit 306, a determination unit 307, and a sensor 308.
  • The current position information acquisition unit 301, route information acquisition unit 302, storage unit 303, input unit 306, determination unit 307, and sensor 308 have the same configuration as the current position information acquisition unit 101, route information acquisition unit 102, storage unit 103, input unit 107, determination unit 108, and sensor 109 of the first embodiment, respectively, and their description is omitted.
  • The generation unit 304 generates a route display image based on the current position information acquired by the current position information acquisition unit 301, the route information acquired by the route information acquisition unit 302, and the link information (or the node information and link information) stored in the storage unit 303. Specifically, for example, it generates a belt-like route display image that is continuous along the route and displayed at a predetermined height from the road surface. When the reliability of positioning of the current position is high, a road surface route guidance display image whose predetermined height is the position directly above the road surface is generated; when the reliability is low, a sky route guidance display image whose predetermined height is above the viewpoint of the occupant is generated.
  • The generation unit 304 differs from the generation unit 104 of the first embodiment in the specific method of generating the road surface route guidance display image and the sky route guidance display image; that method is described later.
  • The difference between the navigation device 300 according to the second embodiment and the navigation device 100 according to the first embodiment is that the navigation device 100 includes the video acquisition unit 105 and the display unit 106, whereas the navigation device 300 does not and instead includes the projection unit 305.
  • The projection unit 305 projects the route display image generated based on the current position information and the route information onto a transmissive member located between the viewpoint of the occupant of the mobile body and the road surface corresponding to the route, at a predetermined height from the position of the road surface on the transmissive member as seen by the occupant.
  • The road surface route guidance display image or the sky route guidance display image generated by the generation unit 304 is projected onto a window of the mobile body (for example, the host vehicle).
  • The image is projected onto a window (usually the front window) through which the road corresponding to the route can be seen from inside the vehicle.
  • The projection unit 305 projects a route display image whose predetermined height has been changed according to the reliability of positioning of the current position.
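Geometrically, placing the image at a given height "as seen by the occupant" amounts to intersecting the ray from the occupant's eye through each 3D guidance point with the plane of the transmissive member. A simplified sketch, assuming a vertical windshield plane a fixed distance ahead of the eye (the coordinate convention and function name are illustrative, not from the patent):

```python
def project_to_windshield(eye, point, plane_x):
    """Intersect the ray from the occupant's eye through a 3D point
    (e.g. a guidance-image vertex hovering at some height above the
    road) with a vertical windshield plane at x = plane_x.
    Coordinates: x forward, y left, z up, in metres."""
    ex, ey, ez = eye
    px, py, pz = point
    if px <= ex:
        raise ValueError("point must lie ahead of the eye")
    t = (plane_x - ex) / (px - ex)  # ray parameter at the plane
    return (plane_x, ey + t * (py - ey), ez + t * (pz - ez))
```

For example, with the eye at height 1.2 m and a guidance point 20 m ahead hovering 5 m above the road, the point lands on the windshield slightly above eye level, which is how the sky route guidance display image ends up drawn higher than the road when positioning reliability is low.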
  • FIG. 4 is a flowchart showing a procedure of navigation processing by the navigation device.
  • The navigation device 300 first measures the current position of the mobile body with the current position information acquisition unit 301 and acquires the current position information (step S401). At that time, the reliability of positioning of the current position may also be determined.
  • The route information acquisition unit 302 acquires route information regarding the route to the destination (step S402). At that time, the navigation device 300 may also acquire road information (node information and link information) stored in the storage unit 303 (step S403).
  • The navigation device 300 generates a route display image (a road surface route guidance display image or a sky route guidance display image) according to the reliability of positioning of the current position, based on the current position information and the route information (step S404), and projects the generated route display image at a predetermined height from the position of the road surface on the transmissive member as seen by the occupant. Specifically, for example, the generated road surface route guidance display image or sky route guidance display image is projected onto the front window of the mobile body (for example, the host vehicle) (step S405).
  • The navigation apparatus 300 waits for the current position to change as the mobile body (for example, the host vehicle) travels (step S406: No). When the current position changes (step S406: Yes), it determines whether the destination has been reached (step S407). If the destination has not yet been reached (step S407: No), the process returns to step S401, and steps S401 to S406 are repeated. When the destination is reached (step S407: Yes), the series of processes ends.
  • In this way, the road surface route guidance display image or the sky route guidance display image continues to be projected onto the front window until the mobile body (for example, the host vehicle) arrives at the destination. While traveling, the operator can recognize the route to the destination by viewing the actual scenery ahead through the front window together with the projected road surface route guidance display image or sky route guidance display image.
  • The navigation device 300 may be configured as a single device integrating the component units, or each component unit may be a separate device, with the function of the navigation device 300 realized by a plurality of devices.
  • The navigation device described above is mounted on a moving body such as an automobile.
  • However, the navigation device is not limited to one mounted on a moving body.
  • FIG. 5 is a block diagram of a functional configuration of the information display apparatus according to the third embodiment.
  • The information display apparatus 500 according to the third embodiment includes a current position information acquisition unit 501, a route information acquisition unit 502, a storage unit 503, a generation unit 504, a video acquisition unit 505, a display unit 506, an input unit 507, a determination unit 508, a sensor 509, and an imaging unit 510.
  • The current position information acquisition unit 501 measures the current position of the device itself and acquires current position information.
  • The information about the current position of the device itself may include, in addition to latitude and longitude, the direction in which the device (in particular its display screen) is facing, the height of the device from the ground, and the moving speed of the device (that is, of the operator carrying it). Information relating to the current position of the device can be acquired, for example, by a sensor 509 described later.
  • The route information acquisition unit 502 acquires route information related to the route to the destination.
  • The route information may be registered in advance, received via a communication unit, or entered by the operator of the device. Alternatively, only the destination may be input by the operator; a route is then searched based on the current position of the device and the input destination, and the determined route is acquired as route information.
  • The storage unit 503 stores so-called map information, specifically, for example, link information including the shape of each road and node information indicating intersections.
  • The map information may further include traffic light and sign information, building information (for example, building shape and height), and the like.
  • The link information may include road width and inclination information.
  • The storage unit 503 may be provided in the information display device 500, or an external server accessible by communication may realize the function of the storage unit 503.
  • The generation unit 504 generates a route display image based on the current position information acquired by the current position information acquisition unit 501 and the route information acquired by the route information acquisition unit 502. Specifically, for example, it generates either a road surface route guidance display image, a belt-like image that is continuous along the route and displayed so as to overlap the road surface, or a sky route guidance display image, a belt-like image that is continuous along the route and displayed at a predetermined height above the road surface, switching between the two at a predetermined timing. When the reliability of positioning of the current position is high, a road surface route guidance display image whose predetermined height is the position directly above the road surface is generated; when the reliability is low, a sky route guidance display image whose predetermined height is above the viewpoint from which the video is shot is generated. Specific methods for generating the road surface route guidance display image and the sky route guidance display image, the switching timing, and so on are described later.
  • The video acquisition unit 505 acquires video from the imaging unit 510, which captures video along the route of the device itself.
  • In this example, the imaging unit 510 is included in the information display device 500; however, it suffices that the imaging unit 510 is connected to the information display device 500 by wire or wirelessly so that the information display device 500 can acquire the captured video.
  • The display unit 506 displays the route display image generated based on the current position information and the route information (for example, by the generation unit 504) superimposed at a predetermined height from the position of the road surface on the video (for example, acquired by the video acquisition unit 505). The display unit 506 displays a route display image whose predetermined height has been changed according to the reliability of positioning of the current position. Further, when switching from one of the road surface route guidance display image and the sky route guidance display image to the other, the display unit 506 may maintain the display of the post-switch route display image until a predetermined condition is satisfied.
  • The input unit 507 accepts an input of an instruction to switch to the other image while either the road surface route guidance display image or the sky route guidance display image is displayed. When the input unit 507 receives such an input, the display unit 506 may switch to the other display regardless of the reliability of positioning determined by the determination unit 508 described later.
  • The determination unit 508 determines the reliability of positioning of the current position (for example, as acquired by the current position information acquisition unit 501) and passes the determination result to the generation unit 504. Details of the specific determination processing are described later.
  • The generation unit 504 generates a road surface route guidance display image with the predetermined height at the position directly above the road surface when the reliability of positioning of the current position is high, and may generate a sky route guidance display image with the predetermined height above the viewpoint when the reliability is low.
  • The sensor 509 may include a plurality of self-contained (dead-reckoning) navigation sensors that measure the current position and direction. The determination unit 508 can then determine the reliability based on whether the positioning results of the individual self-contained navigation sensors are consistent with one another.
  • Alternatively, the sensor 509 may include a self-contained navigation sensor that measures the current position and direction and a GPS sensor. The determination unit 508 can then determine the reliability based on whether the positioning results of the self-contained navigation sensor and the GPS sensor match.
  • the information display device 500 may be configured as a single device with each component unit integrated, or each component unit is a single device, and the function of the information display device 500 is realized by a plurality of devices. May be.
  • Information display device 500 may be, for example, a portable information terminal device (more specifically, for example, a mobile phone, a PDA, a mobile personal computer, etc.). Further, the information display device 500 can be mounted on a moving body to provide the navigation device 100 according to the first embodiment.
  • examples of the navigation device 100 according to the first embodiment, the navigation device 300 according to the second embodiment, and the information display device 500 according to the third embodiment will be described.
  • the example of the navigation device 100 according to the first embodiment, the example of the navigation device 300 according to the second embodiment, and the information display device 500 according to the third embodiment are applied to a navigation device mounted on a vehicle. An example will be described.
  • FIG. 6 is a block diagram illustrating a hardware configuration of the navigation device 100, the navigation device 300, and the information display device 500 (hereinafter simply referred to as “navigation device”).
  • the navigation device includes a CPU 601, a ROM 602, a RAM (memory) 603, a magnetic disk drive 604, a magnetic disk 605, an optical disk drive 606, an optical disk 607, an audio I/F (interface) 608, a microphone 609, a speaker 610, an input device 611, a video I/F 612, a camera 613, a display 614, a projector 615, a communication I/F 616, a GPS unit 617, and various sensors 618.
  • the constituent units 601 to 618 are connected to one another by a bus 620.
  • the CPU 601 governs overall control of the navigation device.
  • the ROM 602 records various programs such as a boot program, a communication program, a data display program, and a data analysis program.
  • the RAM 603 is used as a work area for the CPU 601.
  • the magnetic disk drive 604 controls reading/writing of data with respect to the magnetic disk 605 according to the control of the CPU 601.
  • the magnetic disk 605 records data written under the control of the magnetic disk drive 604.
  • as the magnetic disk 605, for example, an HD (hard disk) or an FD (flexible disk) can be used.
  • the optical disc drive 606 controls reading / writing of data with respect to the optical disc 607 according to the control of the CPU 601.
  • the optical disc 607 is a detachable recording medium from which data is read according to the control of the optical disc drive 606, and includes, for example, a Blu-ray disc, DVD, CD, and the like.
  • a writable recording medium can be used as the optical disk 607.
  • the removable recording medium may be an MO, a memory card, or the like.
  • an example of the information recorded on the magnetic disk 605 is map data used for route search and route guidance.
  • the map data includes background data representing features such as buildings, rivers, and the ground surface, and road shape data representing the shape of roads.
  • the map data is drawn in two dimensions or three dimensions on the display screen of the display 614. When the navigation device is guiding a route, the map data and the current position of the moving body (for example, the own vehicle) acquired by the GPS unit 617 described later are displayed in an overlapping manner.
  • the voice I / F 608 is connected to a microphone 609 for voice input and a speaker 610 for voice output.
  • the sound received by the microphone 609 is A/D converted by the audio I/F 608, and sound is output from the speaker 610. Note that sound input from the microphone 609 can be recorded on the magnetic disk 605 or the optical disk 607 as sound data.
  • the input device 611 includes a remote controller, a keyboard, a mouse, a touch panel, and the like that are provided with a plurality of keys for inputting characters, numerical values, various instructions, and the like. Further, the input device 611 can connect other information processing terminals such as a digital camera and a mobile phone terminal to input / output data.
  • the video I / F 612 is connected to a camera 613 for video input, a display 614 for video output, and a projector 615 for video output.
  • the video I/F 612 includes, for example, a graphic controller that controls the entire display 614 and projector 615, a buffer memory such as a VRAM (Video RAM) that temporarily records image information that can be displayed immediately, and a control IC that controls the display of the display 614 and the projector 615 based on image data output from the graphic controller.
  • the camera 613 captures images inside and outside the vehicle and outputs them as image data.
  • An image captured by the camera 613 can be recorded on the magnetic disk 605 or the optical disk 607 as image data.
  • the display 614 displays icons, cursors, menus, windows, or various data such as characters and images.
  • as the display 614, a CRT, a TFT liquid crystal display, a plasma display, or the like can be adopted.
  • the display 614 can realize the functions of the display screen controlled by the display unit 106 in Embodiment 1 and the display unit 506 in Embodiment 3.
  • the projector 615 displays icons, cursors, menus, windows, or various data such as characters and images.
  • the projector 615 projects various data on the front window using, for example, a CRT or a liquid crystal.
  • the projector 615 is installed on the ceiling or the upper part of the seat in the vehicle.
  • the projector 615 can realize the function of the projection unit 305 of the second embodiment.
  • the communication I/F 616 is wirelessly connected to a network and functions as an interface between the navigation device and the CPU 601.
  • the communication I / F 616 is further connected to a communication network such as the Internet via wireless, and also functions as an interface between the communication network and the CPU 601.
  • Communication networks include LANs, WANs, public line networks and mobile phone networks.
  • the GPS unit 617 receives radio waves from GPS satellites, and calculates information indicating the current position of the vehicle (current position of the navigation device).
  • the output information of the GPS unit 617 is used when the current position of the vehicle is calculated by the CPU 601 together with output values of various sensors described later.
  • the information indicating the current location is information for specifying one point on the map data, such as latitude / longitude and altitude.
  • the various sensors 618 are, for example, a gyro sensor, an acceleration sensor, a vehicle speed sensor, and the like, and detect the moving state of the vehicle. Output signals from the various sensors 618 are used for the calculation of the current location by the CPU 601 and the measurement of changes in speed and direction.
  • in Embodiment 1, the current position information acquisition unit 101, the determination unit 108, and the sensor 109 realize their functions by the GPS unit 617, the various sensors 618, and the CPU 601 executing a program recorded in a recording medium such as the ROM 602, the RAM 603, the magnetic disk 605, or the optical disk 607; the route information acquisition unit 102 and the storage unit 103 by the CPU 601 together with the magnetic disk drive 604 and the magnetic disk 605, or the optical disk drive 606 and the optical disk 607; the video acquisition unit 105 by the CPU 601, the video I/F 612, the camera 613, or the communication I/F 616; the display unit 106 by the CPU 601, the video I/F 612, and the display 614; the imaging unit 110 by the camera 613 and the communication I/F 616; and the input unit 107 by the input device 611.
  • in Embodiment 2, the current position information acquisition unit 301, the determination unit 307, and the sensor 308 realize their functions by the GPS unit 617, the various sensors 618, and the CPU 601 executing a program recorded in a recording medium such as the ROM 602, the RAM 603, the magnetic disk 605, or the optical disk 607; the route information acquisition unit 302 and the storage unit 303 by the CPU 601 together with the magnetic disk drive 604 and the magnetic disk 605, or the optical disk drive 606 and the optical disk 607; the projection unit 305 by the CPU 601, the video I/F 612, and the projector 615; and the input unit 306 by the input device 611.
  • in Embodiment 3, the current position information acquisition unit 501, the determination unit 508, and the sensor 509 realize their functions by the GPS unit 617, the various sensors 618, and the CPU 601 executing a program recorded in a recording medium such as the ROM 602, the RAM 603, the magnetic disk 605, or the optical disk 607; the route information acquisition unit 502 and the storage unit 503 by the CPU 601 together with the magnetic disk drive 604 and the magnetic disk 605, or the optical disk drive 606 and the optical disk 607; the generation unit 504 by the CPU 601; the video acquisition unit 505 by the CPU 601, the video I/F 612, the camera 613, or the communication I/F 616; and the display unit 506 by the CPU 601, the video I/F 612, and the display 614.
  • FIG. 7 is a block diagram illustrating a functional configuration of the generation unit of the navigation device.
  • FIG. 8 is a flowchart showing a processing procedure of the generation unit of the navigation device.
  • the generation unit includes an arrangement processing unit (arrangement unit) 701, a movement processing unit (movement unit) 702, a setting processing unit (setting unit) 703, a rendering processing unit (rendering unit) 704, and an adjustment unit 705.
  • the arrangement processing unit 701 arranges band-like objects corresponding to planar roads in the space for three-dimensional calculation, based on link information alone or on node information and link information (step S801). At that time, based on the route information, only the link information corresponding to the route is extracted, and portions other than the route are deleted (step S802).
  • the movement processing unit 702 moves the band-like object arranged by the arrangement processing unit 701 by a predetermined amount (predetermined height) in the height direction orthogonal to the plane in the space for three-dimensional calculation (step S803).
  • the setting processing unit 703 sets, for the band-shaped object moved by the movement processing unit 702, the viewpoint position, viewpoint direction, and height for the three-dimensional calculation: either the position, direction, and height of the imaging unit, or the viewpoint position, viewpoint direction, and height of an operator (driver or passenger) (step S804).
  • the rendering processing unit 704 renders the band-like object viewed from the viewpoint position, the viewpoint direction, and the height set by the setting processing unit 703 as a sky route guidance display image (step S805).
  • the adjustment unit 705 can adjust the predetermined amount (predetermined height) from the road surface (the ground), and in step S803 the movement processing unit 702 moves the band-like object in the height direction orthogonal to the plane in the three-dimensional calculation space by the height adjusted by the adjustment unit 705.
  • the adjustment unit 705 may perform the adjustment according to route information (for example, road inclination, presence/absence of an intersection (node), presence/absence of a traffic light, etc.). The height may also be adjusted to an operator's desired height by input from the operator.
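The arrangement, extraction, lifting, and rendering steps (S801 to S805) above can be sketched as follows. All names, the data shapes, and the trivial pinhole projection are illustrative assumptions, not the patent's implementation:

```python
# Hypothetical sketch of the generation-unit pipeline (S801-S805):
# arrange route links as band objects, lift them by a configurable
# height, and project them from a camera viewpoint.
from dataclasses import dataclass

@dataclass
class Link:
    start: tuple  # (x, y) in metres
    end: tuple
    width: float
    on_route: bool

def arrange_and_extract(links):
    """S801/S802: keep only the links that belong to the guided route."""
    return [l for l in links if l.on_route]

def lift(links, height):
    """S803: move each band object up by `height` (0 => road-surface image)."""
    return [((l.start[0], l.start[1], height),
             (l.end[0], l.end[1], height), l.width) for l in links]

def project(point, cam_pos, focal=1.0):
    """S804/S805: trivial pinhole projection from the camera position."""
    x, y, z = (p - c for p, c in zip(point, cam_pos))
    return (focal * x / y, focal * z / y)  # y is the forward axis

links = [Link((0, 0), (0, 40), 6.0, True), Link((0, 40), (30, 40), 6.0, True),
         Link((0, 40), (-30, 40), 6.0, False)]   # third link is off-route
route = lift(arrange_and_extract(links), height=20.0)  # sky image: ~20 m up
cam = (0.0, -5.0, 1.5)                                 # camera ~1.5 m above ground
screen = [(project(a, cam), project(b, cam)) for a, b, _ in route]
```

Passing `height=0.0` instead would correspond to the road surface route guidance display image, since only step S803 differs between the two images.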
  • FIG. 9 is an explanatory diagram showing map data (node information, link information) at an intersection of a crossroads.
  • FIGS. 10-1 to 11-2 are explanatory diagrams showing the contents of arrangement processing in the arrangement processing unit 701 of the generation unit.
  • the map data includes a node 900 indicating an intersection and links 901 to 904 indicating four roads with respect to the node 900.
  • each link has information such as the shape and width of the road. Based on this information, FIG. 10-1 shows, from the top, the links arranged as planar roads in the space for three-dimensional calculation.
  • the arrangement processing unit 701 arranges the link 901 shown in FIG. 9 as a planar road in a space for three-dimensional calculation in consideration of the shape and width of the road, and sets it as a link 1001. .
  • the arrangement processing unit 701 arranges the links 902 to 904 shown in FIG. 9 as planar roads in the space for three-dimensional calculation to form links (band images) 1002 to 1004.
  • FIG. 10-2 is a view from the side (viewed from the lower side toward the upper side of FIG. 10-1) of the links arranged as planar roads in the space for three-dimensional calculation in FIG. 10-1.
  • at this time, the height (Z-axis direction) of the links is the reference height, that is, height 0. The reference height can be set arbitrarily; for example, the ground surface can be used as the reference.
  • FIG. 11-1 also shows a plan view in which planar roads are arranged in the space for three-dimensional calculation, but with only the links corresponding to the route extracted from the four links 1001 to 1004 in FIG. 10-1. Since the route runs from the link 901 in FIG. 9 to the node 900, turns right at the node 900, and continues on the link 902, in FIG. 11-1 the placement processing unit 701 extracts only the links 1001 and 1002 of FIG. 10-1 and deletes the links other than the route (the links 1003 and 1004).
  • the placement processing unit 701 may instead refrain from placing the links other than the route (the links 1003 and 1004) from the beginning (that is, FIG. 10-1 may be omitted and the arrangement in the space for three-dimensional calculation may start from FIG. 11-1).
  • FIG. 11-2, like FIG. 10-2, shows the links arranged as planar roads in the space for three-dimensional calculation in FIG. 11-1, viewed from the side (from the lower side toward the upper side of FIG. 11-1). At this point, too, the height (Z-axis direction) of the links 1001 and 1002 is 0.
  • FIGS. 12A and 12B are explanatory diagrams showing the contents of the movement processing in the movement processing unit 702 of the generation unit.
  • the movement processing unit 702 moves the links 1001 and 1002 arranged by the arrangement processing unit 701 upward (Z-axis direction) by a predetermined height (here, about 20 m above the ground). Thereby, the arrangement of the objects (band-like objects) in the space for three-dimensional calculation is completed.
  • since FIG. 12-1 is drawn from the top-down viewpoint like FIGS. 10-1 and 11-1, it looks the same as FIG. 11-1 (that is, the movement in the Z-axis direction is not visible).
  • FIG. 12-2 is a view from the side (from the lower side toward the upper side of FIG. 12-1), as in FIGS. 10-2 and 11-2. Compared with FIG. 11-2, it can be seen that the links 1001 and 1002 are 20 m higher than the original height (height 0) in the Z-axis direction.
  • FIGS. 13A and 13B are explanatory diagrams illustrating the contents of the setting process in the setting processing unit 703 of the generation unit.
  • for the band-shaped images moved by the movement processing unit 702, the setting processing unit 703 sets a camera 1301 indicating the viewpoint position, direction, and height for the three-dimensional calculation: either the position, direction, and height of an imaging unit mounted on the moving body (for example, the own vehicle), or the viewpoint position, direction, and height of the driver or a passenger.
  • FIG. 13-2 shows the view from the side (from the lower side toward the upper side of FIG. 13-1), as in FIGS. 10-2, 11-2, and 12-2. It can be seen that the camera 1301 is lower than the links 1001 and 1002 in the Z-axis direction (for example, about 1.5 m above the ground) and is almost directly below the link 1001. This indicates that the vehicle is traveling on the road of the link 1001, and that the camera 1301 is mounted at a predetermined position of the own vehicle (for example, behind the rearview mirror), or that the viewpoint position of the driver or a passenger in the own vehicle is taken into consideration.
  • FIG. 14 is an explanatory diagram showing a state rendered by the rendering processing unit 704 of the generation unit.
  • FIG. 14 illustrates a state in which the links 1001 and 1002 viewed from the camera 1301 illustrated in FIGS. 13-1 and 13-2 are rendered. Since the camera 1301 and the link 1001 differ in height, the link 1001 is seen from the camera 1301 by looking up from below, and when rendered it takes a shape close to an inverted trapezoid in the space for three-dimensional calculation, as shown in FIG. 14.
  • similarly, the link 1002 becomes a band-like image with a width seen by the camera 1301 looking up from below. This completes the generation (drawing) of the band-like images (links) by the generation unit.
  • FIG. 15 is an explanatory diagram showing a state in which an aerial route guidance display image is displayed over the front video.
  • the band-like images 1001 and 1002 generated in FIG. 14 indicate the route guidance on the front video. Because the route guidance formed by the belt-like images 1001 and 1002 continuing along the route is displayed above the road surface, even when superimposed on the front image it does not hide the road surface from the operator (driver or passenger); and even if there is an obstacle ahead in the front image, for example a car traveling in front, the obstacle is not blocked by the belt-like images, so the user does not misread the front image. Furthermore, as shown in FIG. 15, since the strip images 1001 and 1002 are positioned higher than the imaging position (shooting viewpoint) of the camera 1301 and are superimposed on the front video, they appear in the front video at a position where the sky is looked up, so there is little possibility that any object overlaps the strip images 1001 and 1002.
  • the road surface route guidance display image is generated by substantially the same procedure as described above for generating the sky route guidance display image.
  • the difference between the two is only the presence or absence of the process in which the movement processing unit 702 shown in FIG. 7 moves the band-shaped images arranged by the arrangement processing unit 701 by a predetermined amount (predetermined height) in the height direction orthogonal to the plane in the space for three-dimensional calculation (step S803 in FIG. 8). That is, step S803 in FIG. 8 is executed when generating the sky route guidance display image, whereas when generating the road surface route guidance display image the execution of step S803 is omitted and step S804 follows directly. The two processes are otherwise identical.
  • FIG. 16 is an explanatory diagram showing a state in which a road surface route guidance display image is superimposed on the front video. As shown in FIG. 16, strip images 1001 and 1002 are displayed so as to overlap the road surface of the front image. In FIG. 16, since the vehicle is recognized correctly, the turning position can be accurately drawn. In this case, as shown in FIG. 16, when there is no obstacle ahead, it is considered effective as route guidance.
  • in this example, high position accuracy is regarded as high positioning reliability, and either the road surface route guidance display image or the sky route guidance display image is generated according to the result of the reliability determination.
  • position accuracy may be poor, for example, when positioning is performed only by GPS and information such as vehicle speed, G sensor, gyro sensor, and electronic compass outputs is not or cannot be obtained; for instance, when the navigation device itself has only GPS built in.
  • FIG. 17 is a flowchart showing the procedure of the switching generation processing in the generation unit of the navigation device, and FIGS. 18 and 19 are flowcharts showing procedures for determining whether the output value of each sensor is normal. The determination of whether the output value of each sensor is normal is performed by the determination unit.
  • first, it is determined whether or not information can be acquired from a predetermined positioning sensor (self-contained sensor) (step S1701). Specifically, the determination is based on whether the vehicle speed sensor is connected and whether the gyro sensor and the G sensor are present in the apparatus and usable. If it is determined in step S1701 that information cannot be acquired from the predetermined positioning sensor (step S1701: No), the accuracy of the current position is determined to be poor, and the sky route guidance display image is generated (step S1705).
  • if information can be acquired from the predetermined positioning sensor (step S1701: Yes), the output value of each sensor is checked (step S1702). Whether the output value of each sensor is normal or abnormal is determined according to the procedures of the flowchart shown in FIG. 18 (determination based on comparison between the gyro sensor and the G sensor) and the flowchart shown in FIG. 19 (determination based on comparison between GPS and vehicle speed pulses). That is, if the determination result shown in FIG. 18 is normal or the determination result shown in FIG. 19 is normal, the output value of each sensor is determined to be normal.
  • if the output value of each sensor is normal (step S1703: Yes), the accuracy of the current position is determined to be good, and the road surface route guidance display image is generated (step S1704). If the output value of each sensor is abnormal (step S1703: No), the accuracy of the current position is determined to be poor, and the sky route guidance display image is generated (step S1705).
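The branch structure of steps S1701 to S1705 can be summarized in a short sketch; the function names are hypothetical and `sensor_outputs_normal` stands in for the FIG. 18/19 checks:

```python
# Hedged sketch of the FIG. 17 switching logic: fall back to the sky
# image whenever dead-reckoning sensors are missing or their outputs
# fail the cross-checks.
def choose_guidance_image(sensors_available, sensor_outputs_normal):
    """Return which route display image to generate."""
    if not sensors_available:        # S1701: No -> position accuracy poor
        return "sky"                 # S1705
    if sensor_outputs_normal():      # S1702/S1703: cross-check outputs
        return "road_surface"        # S1704
    return "sky"                     # S1705
```

For example, `choose_guidance_image(True, lambda: False)` models a vehicle whose gyro and G sensor disagree, yielding the sky image.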
  • first, it is determined whether or not the angular velocity of the gyro sensor is equal to or greater than a predetermined threshold value (step S1801).
  • if the angular velocity is equal to or greater than the threshold value (step S1801: Yes), the rotation direction of the gyro sensor is stored (step S1802).
  • next, it is determined whether or not the G sensor outputs, by more than a certain value, the direction (that is, in the left-right direction) opposite to the rotation direction of the gyro sensor (step S1803).
  • if so (step S1803: Yes), the output values of the sensors are consistent, so the output values are determined to be normal (step S1804).
  • if not (step S1803: No), the output values of the sensors are inconsistent, so the output values are determined to be abnormal (step S1805).
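The gyro/G-sensor cross-check of FIG. 18 (steps S1801 to S1805) might look roughly like this; the thresholds and the sign convention (lateral acceleration positive to the left) are assumptions, not taken from the specification:

```python
# Minimal sketch of the FIG. 18 cross-check: during a turn reported by
# the gyro, the lateral G sensor should register a force opposite the
# rotation direction beyond some threshold; otherwise the outputs are
# treated as inconsistent (abnormal).
def outputs_consistent(gyro_rate, lateral_g, rate_threshold=0.2, g_threshold=0.5):
    if abs(gyro_rate) < rate_threshold:   # S1801: not turning, nothing to test
        return True
    turning_left = gyro_rate > 0          # S1802: remember rotation direction
    # S1803: expect lateral acceleration opposite the turn direction
    if turning_left:
        return lateral_g <= -g_threshold  # S1804/S1805
    return lateral_g >= g_threshold
```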
  • FIG. 19 shows the procedure of the determination process (determination based on comparison between GPS and vehicle speed pulses) for determining whether the output value of each sensor is normal. In this case, whether the output value of each sensor is normal is determined based on whether the outputs of the GPS and the vehicle speed pulses match.
  • first, the first position information obtained by GPS is stored (step S1901).
  • next, the number of vehicle speed pulses is counted (step S1902).
  • then, it is determined whether or not a predetermined period has elapsed (step S1903).
  • if the predetermined period has not elapsed (step S1903: No), the process returns to step S1902, and the counting of vehicle speed pulses continues until the predetermined period elapses. When the predetermined period has elapsed (step S1903: Yes), the second position information obtained by GPS is stored (step S1904).
  • the movement distance during a predetermined period is calculated by subtracting the first position information from the second position information (step S1905). Further, the movement distance during the predetermined period calculated in step S1905 is divided by the number of vehicle speed pulses counted in step S1902, and the movement amount per vehicle speed pulse is calculated (step S1906).
  • the processing of steps S1901 to S1906 is repeated a predetermined number of times. It is then determined whether the series of processing has reached a predetermined number of trials X (step S1907). If the number of trials X has not been reached (step S1907: No), the process returns to step S1901 and steps S1901 to S1906 are repeated. If the number of trials X has been reached (step S1907: Yes), it is determined whether the variation in the movement amount per vehicle speed pulse over the X trials is within a predetermined range (step S1908).
  • step S1908 when the variation is within a predetermined range (step S1908: Yes), it is determined that the output values of the sensors are consistent and the output values are normal (step S1909). On the other hand, if the variation is outside the predetermined range (step S1908: No), it is determined that the output values of the sensors are not matched and the output values are abnormal (step S1910).
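The FIG. 19 procedure (steps S1901 to S1910) reduces to testing whether the distance travelled per vehicle-speed pulse stays nearly constant across X trials; the data shape and the spread threshold below are assumptions for illustration:

```python
# Rough sketch of the GPS vs vehicle-speed-pulse consistency check:
# over several fixed-period trials, GPS-derived distance divided by the
# pulse count should be nearly constant; large variation suggests the
# two positioning sources disagree (abnormal output).
def pulses_consistent(trials, max_spread=0.05):
    """trials: list of (gps_distance_m, pulse_count) per fixed period."""
    per_pulse = [d / n for d, n in trials if n > 0]   # S1905/S1906
    if not per_pulse:
        return False
    spread = max(per_pulse) - min(per_pulse)          # S1908: variation
    return spread <= max_spread                       # S1909/S1910

good = [(100.0, 250), (99.0, 248), (101.0, 252)]      # ~0.40 m per pulse
bad = [(100.0, 250), (60.0, 248), (140.0, 252)]       # wildly varying
```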
  • that is, when the output value of each sensor is normal and the predetermined condition that the accuracy of the current position is good is met, the generation unit switches from the sky route guidance display image to the road surface route guidance display image. Conversely, when the predetermined condition is not met, the generation unit switches from the road surface route guidance display image to the sky route guidance display image.
  • FIG. 20 is an explanatory diagram illustrating a state in which the road surface route guidance display image is displayed over the front image, and FIG. 21 is an explanatory diagram illustrating a state in which the sky route guidance display image is displayed over the front image.
  • FIG. 20 shows a case where, because the position of the own vehicle is not recognized correctly, the vehicle position is recognized about 10 m short of its actual position and the turning position is drawn farther ahead. When the position accuracy of the own vehicle is poor and the belt-like images 1001 and 1002 are displayed so as to overlap the road surface, the gap between the actual right-turn road and the belt-like image 1002 after the right turn appears large, making the guidance difficult to understand.
  • FIG. 21 shows a case in which, as in FIG. 20, the position of the own vehicle is recognized about 10 m short of its actual position and the turning position is drawn farther ahead because the position recognition of the own vehicle is not correct. However, since the band images 1001 and 1002 are drawn in the sky, the operator (driver or passenger) sees the band image 1002 after the right turn, recognizes it in association with the actual right-turn road displayed below it, and thereby infers the turning point (road). Therefore, in either FIG. 15 or FIG. 21, the operator (driver or passenger) can easily recognize at which point (road) the turn should be made.
  • when the accuracy of the map data is poor (for example, depending on the level of detail of the map used when creating the map data), the sky route guidance display image may be generated instead. That is, instead of, or in addition to, the reliability of the positioning of the vehicle position, the height of the route display image from the road surface may be determined according to the reliability of the accuracy of the map data.
  • conversely, when the accuracy of the map data is good, switching to the road surface route guidance display image may be performed.
  • in addition, when the road undulates, the road surface route guidance display image may not match the shape of the undulations, so switching to the sky route guidance display image may be performed.
  • conversely, when the road is flat, switching to the road surface route guidance display image may be performed.
  • the position accuracy determination may be suppressed when the vehicle is close to a guide point, so that switching does not occur frequently. Further, once it is determined that the position accuracy is poor, the sky route guidance display image may be kept displayed unless there is an operation instruction or the like. Alternatively, once it is determined that the position accuracy is poor, the sky route guidance display image may be kept displayed until the apparatus is turned off. By maintaining the display of either the road surface route guidance display image or the sky route guidance display image until such a predetermined condition is satisfied, frequent switching can be prevented.
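The anti-flicker behaviour described here could be implemented as a simple latch; the class and method names below are illustrative assumptions, not from the specification:

```python
# Hedged sketch of display-switching hysteresis: once the sky image is
# selected because position accuracy is poor, hold it until an explicit
# user instruction (or power-off) rather than flickering back and forth.
class GuidanceImageSelector:
    def __init__(self):
        self.mode = "road_surface"
        self.latched = False

    def update(self, accuracy_good, near_guide_point=False):
        if self.latched or near_guide_point:   # hold the current image
            return self.mode
        self.mode = "road_surface" if accuracy_good else "sky"
        if self.mode == "sky":
            self.latched = True                # keep sky until user override
        return self.mode

    def user_override(self, mode):
        """Explicit operation instruction from the input unit."""
        self.mode, self.latched = mode, False
```

Suppressing updates near guide points (`near_guide_point=True`) models the first suggestion above: no switching right before a turn.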
  • the route display form (color, transparency, etc.) may be varied according to the accuracy of the position.
  • FIG. 22 is an explanatory diagram showing a state in which the sky route guidance display image is projected on the front window.
  • a sky route guidance display image (links 1001 and 1002) is displayed on the front window 2201. Switching between the road surface route guidance display image and the sky route guidance display image according to the accuracy of the position of the own vehicle is effective even when projecting on the front window 2201.
  • a target for projecting a route display image such as a road surface route guidance display image or an aerial route guidance display image is not limited to the front window. It may be a window on the side or rear of the vehicle, or a transparent member may be arranged in the vehicle and a route display image projected onto the member. That is, if a route display image is projected onto a transparent member positioned between the user's viewpoint and the road surface corresponding to the route, the user can recognize the route only by looking outside the vehicle.
  • the current position of the moving body is measured to acquire current position information, route information regarding the route to the destination is acquired, a video in which the road surface of the road corresponding to the route is photographed is acquired, and the route display image generated based on the current position information and the route information is displayed superimposed at a predetermined height above the position of the road surface in the video.
  • the reliability of positioning of the current position is determined; when the reliability is high, a road surface route guidance display image in which the position directly above the road surface is set to the predetermined height is generated, and when the reliability is low, a sky route guidance display image in which a position higher than the viewpoint for shooting the video is set to the predetermined height is generated.
  • similarly, the current position of the moving body is measured to obtain current position information, route information about the route to the destination is obtained, and the route display image generated based on the current position information and the route information is projected onto a transparent member located between the viewpoint of the person in the moving body and the road surface of the road corresponding to the route, superimposed at a predetermined height above the position of the road surface as seen through the transparent member by that person. At that time, the predetermined height of the projected route display image can be changed according to the reliability of positioning of the current position.
  • the reliability of positioning of the current position is judged; when the positioning of the current position is highly reliable, a road surface route guidance display image is generated whose predetermined height is the position directly above the road surface.
  • when the positioning of the current position is less reliable, an aerial route guidance display image is generated whose predetermined height is a position higher than the viewpoint of the moving person.
  • once the route display image has been switched, the display of the switched route display image is maintained until a predetermined condition is satisfied, so route guidance can continue to be displayed without frequent switching.
  • the navigation method described in this embodiment can be realized by executing a program prepared in advance on a computer such as a personal computer or a workstation.
  • this program is recorded on a computer-readable recording medium such as a hard disk, a flexible disk, a CD-ROM, an MO, or a DVD, and is executed by being read from the recording medium by the computer.
  • the program may also be distributed via a transmission medium such as a network like the Internet.
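As a concrete illustration of the projection described in the bullets above, the point at which a route display image should be drawn on a transparent member (for example, the front window) so that it appears at a predetermined height above the road surface can be found by intersecting the line from the viewer's eye through the elevated route point with the member's plane. The following Python sketch is purely illustrative: the planar-windshield model, the coordinate frame, and all names (`project_to_member`, `eye`, `plane_normal`, and so on) are assumptions, not taken from the embodiment.

```python
def project_to_member(eye, road_point, height, plane_point, plane_normal):
    """Intersect the ray from `eye` through the elevated route point with
    the plane of the transparent member.

    eye, road_point, plane_point: (x, y, z) tuples in a vehicle-fixed frame.
    height: offset added to the road point's z (the "predetermined height").
    plane_normal: normal vector of the member's plane (need not be unit length).
    Returns the (x, y, z) intersection, or None if the ray is parallel to the plane.
    """
    # Route point lifted to the predetermined height above the road surface.
    target = (road_point[0], road_point[1], road_point[2] + height)
    d = tuple(t - e for t, e in zip(target, eye))           # ray direction
    denom = sum(n * di for n, di in zip(plane_normal, d))
    if abs(denom) < 1e-9:
        return None                                         # ray parallel to plane
    num = sum(n * (p - e) for n, p, e in zip(plane_normal, plane_point, eye))
    t = num / denom
    return tuple(e + t * di for e, di in zip(eye, d))
```

For example, with the eye at (0, 0, 1.2), a road point 10 m ahead, a height of 0.5 m, and a vertical member plane at x = 1, the image point lands on that plane slightly below eye level, which is consistent with a road-surface guidance image drawn just above the road.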
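The reliability-dependent switching described above, including maintaining the switched image until a predetermined condition is satisfied, can be sketched as a small state machine. This Python sketch is a hypothetical reading of the embodiment: the 0-to-1 reliability score, the threshold value, and the frame-count hold condition are all illustrative assumptions.

```python
ROAD_SURFACE, AERIAL = "road_surface", "aerial"

class GuidanceImageSelector:
    """Choose between a road surface and an aerial route guidance display
    image based on positioning reliability, with a hold period that keeps
    the switched image from flickering back and forth."""

    def __init__(self, high_threshold=0.8, hold_frames=30):
        self.high_threshold = high_threshold  # reliability needed for road-surface mode
        self.hold_frames = hold_frames        # keep a switched mode for this many updates
        self.mode = ROAD_SURFACE
        self.frames_since_switch = self.hold_frames

    def update(self, reliability):
        """Return the display mode for this frame, given a 0..1 reliability score."""
        self.frames_since_switch += 1
        desired = ROAD_SURFACE if reliability >= self.high_threshold else AERIAL
        # Maintain the current image until the hold condition is satisfied.
        if desired != self.mode and self.frames_since_switch >= self.hold_frames:
            self.mode = desired
            self.frames_since_switch = 0
        return self.mode
```

With `hold_frames=3`, a single drop in reliability switches the display to the aerial image, and the road surface image is not restored until three further updates have passed, mirroring the "continue to display route guidance" behavior of the embodiment.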

Abstract

The present invention relates to a navigation system (100) comprising: a current position information acquisition section (101) that measures the current position of a moving body to acquire current position information; and a route information acquisition section (102) that acquires route information on a route to a destination. Based on the current position information and the route information, a display section (106) superimposes the generated road surface route guidance display image or aerial route guidance display image on a video of the vehicle's route acquired by a video acquisition section (105), or projects it onto the windshield.
PCT/JP2010/057392 2010-04-26 2010-04-26 Navigation system, navigation method, navigation program, and storage medium WO2011135660A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2010/057392 WO2011135660A1 (fr) 2010-04-26 2010-04-26 Navigation system, navigation method, navigation program, and storage medium
JP2011530191A JP4833384B1 (ja) 2010-04-26 2010-04-26 Navigation device, navigation method, navigation program, and recording medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2010/057392 WO2011135660A1 (fr) 2010-04-26 2010-04-26 Navigation system, navigation method, navigation program, and storage medium

Publications (1)

Publication Number Publication Date
WO2011135660A1 true WO2011135660A1 (fr) 2011-11-03

Family

ID=44861007

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2010/057392 WO2011135660A1 (fr) 2010-04-26 2010-04-26 Navigation system, navigation method, navigation program, and storage medium

Country Status (2)

Country Link
JP (1) JP4833384B1 (fr)
WO (1) WO2011135660A1 (fr)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0525476U (ja) * 1991-02-22 1993-04-02 株式会社ケンウツド In-vehicle navigation device
JP2003344062A (ja) * 2002-05-30 2003-12-03 Alpine Electronics Inc Navigation device
JP2008501956A (ja) * 2004-06-03 2008-01-24 メイキング バーチャル ソリッド,エル.エル.シー. En-route navigation display method and apparatus using a head-up display
WO2009084129A1 (fr) * 2007-12-28 2009-07-09 Mitsubishi Electric Corporation Navigation device


Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013114617A1 (fr) * 2012-02-03 2013-08-08 パイオニア株式会社 Image display device, image display method, and image display program
CN103245345A (zh) * 2013-04-24 2013-08-14 浙江大学 Indoor navigation system based on image-sensing technology, and navigation and search methods
KR20160104825A (ko) * 2015-02-26 2016-09-06 엘지전자 주식회사 Lane guidance device and method therefor
KR101698104B1 (ko) * 2015-02-26 2017-01-20 엘지전자 주식회사 Lane guidance device and method therefor
JP2018020779A (ja) * 2017-09-29 2018-02-08 日本精機株式会社 Vehicle information projection system
WO2020121810A1 (fr) * 2018-12-14 2020-06-18 株式会社デンソー Display control device, display control program, and computer-readable non-transitory tangible recording medium
JP2020097399A (ja) * 2018-12-14 2020-06-25 株式会社デンソー Display control device and display control program
US20210223058A1 (en) * 2018-12-14 2021-07-22 Denso Corporation Display control device and non-transitory computer-readable storage medium for the same
JP7052786B2 (ja) 2018-12-14 2022-04-12 株式会社デンソー Display control device and display control program
JP2022079590A (ja) * 2018-12-14 2022-05-26 株式会社デンソー Display control device and display control program
JP7416114B2 (ja) 2018-12-14 2024-01-17 株式会社デンソー Display control device and display control program

Also Published As

Publication number Publication date
JP4833384B1 (ja) 2011-12-07
JPWO2011135660A1 (ja) 2013-07-18

Similar Documents

Publication Publication Date Title
US20230029160A1 (en) Apparatus and Methods of Displaying Navigation Instructions
US8180567B2 (en) Navigation device with camera-info
US8423292B2 (en) Navigation device with camera-info
JP4705170B2 (ja) Navigation device and method for scrolling map data displayed on a navigation device
JP2006084208A (ja) Navigation device and traveling direction guidance method
JP4833384B1 (ja) Navigation device, navigation method, navigation program, and recording medium
JP2015105903A (ja) Navigation device, head-up display, control method, program, and storage medium
RU2375756C2 (ru) Navigation device with information obtained from a camera
JP5702476B2 (ja) Display device, control method, program, and storage medium
WO2011121788A1 (fr) Navigation device, information display device, navigation method, navigation program, and recording medium
JP5438172B2 (ja) Information display device, information display method, information display program, and recording medium
KR20080019690A (ko) Navigation device with camera information
JP5356483B2 (ja) Navigation device and navigation method
JP2011022152A (ja) Navigation device

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 2011530191

Country of ref document: JP

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10850679

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 10850679

Country of ref document: EP

Kind code of ref document: A1