WO2014050172A1 - 交差点案内システム、方法およびプログラム (Intersection guidance system, method, and program) - Google Patents

交差点案内システム、方法およびプログラム (Intersection guidance system, method, and program)

Info

Publication number
WO2014050172A1
Authority
WO
WIPO (PCT)
Prior art keywords
intersection
image
vehicle
approach
degree
Prior art date
Application number
PCT/JP2013/057643
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
佑美枝 荒井
石川 健
Original Assignee
アイシン・エィ・ダブリュ株式会社 (Aisin AW Co., Ltd.)
Priority date
Filing date
Publication date
Application filed by アイシン・エィ・ダブリュ株式会社 (Aisin AW Co., Ltd.)
Priority to EP13842313.2A (EP2863181B1)
Priority to US14/424,681 (US9508258B2)
Priority to CN201380044530.XA (CN104603578B)
Publication of WO2014050172A1

Classifications

    • G08G 1/09 - Arrangements for giving variable traffic instructions (traffic control systems for road vehicles)
    • G01C 21/3602 - Input other than that of destination using image analysis, e.g. detection of road signs, lanes, buildings, real preceding vehicles using a camera
    • B60Q 1/00 - Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
    • G01C 21/365 - Guidance using head-up displays or projectors, e.g. virtual vehicles or arrows projected on the windscreen or on the road itself
    • G01C 21/3655 - Timing of guidance instructions
    • G06F 16/00 - Information retrieval; Database structures therefor; File system structures therefor
    • G09G 5/36 - Control arrangements or circuits for visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G 2340/0464 - Positioning (changes in size, position or resolution of an image)
    • G09G 2340/12 - Overlay of images, i.e. displayed pixel being the result of switching between the corresponding input pixels
    • G09G 2340/14 - Solving problems related to the presentation of information to be displayed
    • G09G 5/00 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators

Definitions

  • The present invention relates to an intersection guidance system, method, and program for guiding the position of an intersection.
  • Conventionally, a technique is known in which a landscape image ahead of the vehicle is captured by a video camera installed in the vehicle and the landscape image is projected on the screen of a display device (see Patent Document 1).
  • In Patent Document 1, the location of a guidance target intersection on the landscape image is specified based on an intersection node recorded in map data, and an arrow graphic is synthesized at the specified location. Thereby, the position of the guidance target intersection can be visually recognized based on the position where the arrow graphic is projected on the landscape image.
  • The present invention has been made in view of the above problems, and an object of the present invention is to provide a technique capable of guiding the position of an intersection without causing a sense of incongruity even when the vehicle approaches the intersection.
  • In the intersection guidance system of the present invention, the approach degree acquisition means acquires the degree of approach of the vehicle to an intersection existing ahead of the vehicle.
  • The display control means superimposes a guide image for guiding the position of the intersection on the front landscape of the vehicle and displays it on the display unit.
  • When the degree of approach is less than the threshold, the display control means sets, as the superimposed position of the guide image, the position in the front landscape corresponding to the registered position registered in the map information as the position of the intersection.
  • When the degree of approach is equal to or greater than the threshold, the display control means sets, as the superimposed position of the guide image, the position in the front landscape corresponding to a position on the straight line extending from the vehicle in the traveling direction of the vehicle.
  • The greater the degree of approach of the vehicle to the intersection, the more clearly the driver can visually recognize the image of the intersection in the front landscape. Therefore, when the degree of approach to the intersection is equal to or greater than the threshold, it can be estimated that the driver has already determined, based on the clearly visible image of the intersection, a target position to aim for when entering the intersection and is driving the vehicle toward that target position. That is, when the degree of approach to the intersection is equal to or greater than the threshold, it can be estimated that the driver has already steered the vehicle so that the target position lies in the traveling direction of the vehicle.
  • Therefore, when the degree of approach is equal to or greater than the threshold, the display control means acquires the position in the front landscape corresponding to a position on the straight line extending from the vehicle in the traveling direction of the vehicle, and sets the acquired position as the superimposed position of the guide image. Thereby, the position of the intersection can be guided without causing the driver a sense of incongruity.
  • On the other hand, when the degree of approach is less than the threshold, the display control means acquires the position in the front landscape corresponding to the registered position registered in the map information as the position of the intersection, and sets the acquired position as the superimposed position of the guide image. Thereby, the driver can recognize the position of the intersection in the front landscape. That is, when the degree of approach is less than the threshold, the image of the intersection in the front landscape is difficult to see clearly, but the driver can recognize the location of the intersection from the position of the guide image in the front landscape.
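  • As an illustration only (not part of the patent disclosure), the switching of the superimposed position described above can be sketched as follows; the two position arguments stand for the results of the map lookup and of the straight-line projection described later, and all names are hypothetical.

```python
def choose_guide_anchor(approach_degree, threshold,
                        registered_position_in_scene,
                        straight_line_position_in_scene):
    """Select the superimposed position of the guide image.

    registered_position_in_scene: position in the front landscape corresponding
        to the intersection position registered in the map information.
    straight_line_position_in_scene: position in the front landscape corresponding
        to a point on the straight line extending in the vehicle's traveling direction.
    """
    if approach_degree < threshold:
        # Far from the intersection: point at the registered map position.
        return registered_position_in_scene
    # Near the intersection: point along the vehicle's own heading, where the
    # driver is assumed to already be aiming.
    return straight_line_position_in_scene
```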
  • the degree of approach is an index that increases as the vehicle approaches the intersection, and an index that increases as the image of the intersection can be clearly recognized.
  • the approach degree acquisition unit may acquire the approach degree based on the positional relationship between the vehicle and the intersection, or may acquire the approach degree based on the traveling state of the vehicle such as the vehicle speed and the traveling direction.
  • the intersection existing in front of the vehicle may be an intersection that is reached when the vehicle travels forward on the road, and may be an intersection that is reached when the vehicle travels forward on the planned travel route.
  • the guide image may be an image that guides the position of the intersection, and may be an image that indicates that the image of the intersection exists at the superimposed position where the guide image is superimposed.
  • the guide image may be a dot image that exists at the set superposition position, or may be a linear image that has a tip or a bending point at the set superposition position. It may be a polygonal image having a vertex at the superimposed position.
  • the guidance image may include not only a part indicating the position of the intersection but also a part for guiding information on the intersection other than the position of the intersection.
  • the display control means may project the front landscape onto the display unit by causing the display unit to display an image representing the front landscape (hereinafter referred to as a front image). That is, the display control unit may superimpose the guide image on the front image and display the front image on which the guide image is superimposed on the display unit.
  • the forward image may be obtained by photographing the forward landscape with a camera, or may be obtained by drawing the forward landscape based on the map information.
  • the display unit may be configured as a transflective head-up display that displays a guide image so as to be superimposed on an actual landscape visually recognized by the driver through the windshield of the vehicle. In this case, a part of the front scenery visually recognized by the driver is transmitted through the display unit, so that a part of the transmitted scenery is projected on the display unit as the front scenery.
  • the threshold may be set to a degree of approach that allows the driver to determine the target position in the intersection so that the image of the intersection can be clearly visually recognized in the front image.
  • the threshold value may be set to a degree of approach where the size of the intersection image in the forward landscape is a predetermined size.
  • The threshold may be set to a degree of approach at which the size of the image of the intersection in the front landscape is such that the images of the connecting roads connected to the intersection can be clearly seen. This is because, if the image of a connecting road can be clearly seen, a position in the intersection from which the vehicle can smoothly exit onto that connecting road can be determined as the target position.
  • Furthermore, the threshold may be set to a degree of approach at which the size of the image of the intersection in the front landscape is such that the lane configuration of the road entering the intersection (which connecting roads can be exited from each lane) is clearly visible. This is because, if the lane configuration can be clearly recognized, the lane to be traveled when entering the intersection can be specified, and a position on the specified lane can be determined as the target position.
  • the size of the intersection image may be the area of the intersection image in the forward landscape, the length of the intersection image in the longitudinal direction of the forward landscape, or in the lateral direction of the forward landscape. It may be the length of the image of the intersection.
  • the registered position is a position registered in the map information as a representative position on the road surface in the intersection, and may be a geometric center position of the road surface in the intersection. However, the registered position does not necessarily have to be a position registered as an intersection position in the map information, and may be a position that can be derived based only on data registered in the map information.
  • the position in the front scenery corresponding to the registered position means a position in the front scenery where the image of the object existing at the registered position is projected on the display unit.
  • the superimposition position of the guide image may be an overlap position of at least a part of the guide image, and may be a superimposition position of a portion indicating the position of the intersection in the guide image.
  • the traveling direction of the vehicle means the front in the longitudinal direction of the vehicle (the direction orthogonal to the axle).
  • the straight line is a straight line extending from the vehicle in the traveling direction of the vehicle, and means a trajectory of the vehicle when it is assumed that the vehicle goes straight with the steering angle maintained at 0 °.
  • the position in the forward scenery corresponding to the position on the straight line means the position in the forward scenery where the image of the object existing at the position on the straight line is projected on the display unit.
  • Furthermore, the display control means may obtain a projection space corresponding to the front landscape projected on the display unit from among the space ahead of the vehicle and, when the degree of approach is less than the threshold and the registered position exists to the left or right of the projection space, set the position in the front landscape corresponding to the left end position or the right end position of the projection space as the superimposed position of the guide image. That is, the position of the edge of the front landscape on the side where the registered position exists may be set as the superimposed position of the guide image.
  • When the front image is obtained by photographing the front landscape with a camera, the projection space means the space within the field of view of the camera that captures the front image.
  • When a part of the image captured by the camera is cut out and used as the front image, the projection space means the space, within the field of view of the camera, that corresponds to the cut-out front image.
  • When the front image is drawn based on the map information, the projection space means the space within the field of view set when the front image is drawn.
  • When the display unit is a head-up display, the projection space means the space that the driver can see through the display unit.
  • the display control means may determine that the approach degree is less than the threshold when the remaining distance from the vehicle to the intersection is larger than a predetermined reference distance. Since the degree of approach increases as the remaining distance from the vehicle to the intersection decreases, it can be determined that the degree of approach is less than the threshold when the remaining distance is greater than the reference distance.
  • the reference distance may be set to a remaining distance at which the driver has already determined the target position when entering the intersection and can determine that the vehicle is moving toward the determined target position.
  • the reference distance may be a remaining distance at which the size of the image of the intersection in the forward landscape becomes a size that can be clearly recognized to such an extent that the target position can be determined.
  • The display control means may set the reference distance for each intersection; for example, it may specify the size of the intersection based on the map information and set a larger reference distance for a larger intersection. Furthermore, the reference distance may be a remaining distance at which the vehicle can be considered to have entered an extension section in which an extension lane such as a right-turn lane is added before the intersection. This is because, in the section before the lane extension section, the driver may not yet be able to travel in the extension lane even if the driver has determined a position on the extension lane as the target position, and the target position determined on the extension lane may not exist on the straight line.
  • the reference distance may be a remaining distance that can be considered that the vehicle has entered a section in which lane change is prohibited before the intersection. Accordingly, it is possible to guide the driver that the current traveling direction should be maintained as it is without changing the lane in the section where the lane change is prohibited.
  • the display control means may determine that the degree of approach is less than a threshold when the angle formed by the straight line extending from the vehicle to the registered position and the straight line in the horizontal plane is greater than a predetermined reference angle.
  • a predetermined reference angle may be set to an angle at which an intersection can be considered to exist on a straight line.
  • the case where there is no intersection on the straight line means the case where the intersection cannot be reached when the vehicle goes straight in the traveling direction. Therefore, by setting the angle at which the intersection can be considered to exist on the straight line as the reference angle, it is possible to guide the vehicle to go straight in the traveling direction even if the vehicle cannot reach the intersection even if going straight in the traveling direction. This can prevent the driver from feeling uncomfortable.
  • the display control means may determine that the degree of approach is less than the threshold when the required period until the vehicle reaches the intersection is longer than a predetermined reference period. As a result, the timing at which the vehicle reaches the intersection is imminent, and when the possibility that the target position when entering the intersection has been determined is high, it is directed toward the target position existing in the traveling direction of the vehicle. Can guide you to run.
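  • The three alternative criteria above (remaining distance, angle to the registered position, and required period) all express the same comparison of the degree of approach against the threshold. A minimal sketch combining them, with invented parameter names and an illustrative or-combination that is not prescribed by the patent:

```python
def approach_degree_below_threshold(remaining_distance_m, reference_distance_m,
                                    angle_to_registered_deg, reference_angle_deg,
                                    time_to_intersection_s, reference_period_s):
    """Return True while the vehicle should still be guided to the registered position.

    Each test corresponds to one of the alternatives described above;
    a real implementation might rely on only one of them.
    """
    far_by_distance = remaining_distance_m > reference_distance_m
    far_by_angle = abs(angle_to_registered_deg) > reference_angle_deg
    far_by_time = time_to_intersection_s > reference_period_s
    return far_by_distance or far_by_angle or far_by_time
```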
  • Furthermore, the display control means may display on the display unit a front image obtained by photographing the front landscape so that the optical axis is positioned on the straight line, and, when the degree of approach is equal to or greater than the threshold, set the position in the front image corresponding to the position on the straight line as the superimposed position of the guide image.
  • In a front image photographed so that the optical axis is positioned on the straight line, the position on the straight line always corresponds to a position on the horizontal bisector of the front image. Accordingly, when the degree of approach is equal to or greater than the threshold, the superimposed position can be kept from moving laterally away from the horizontal bisector of the front image, which improves the visibility of the guide image.
  • Furthermore, the display control means may determine that the registered position exists on the left side or the right side of the projection space when the angle formed in the horizontal plane by the straight line extending from the vehicle to the registered position and the straight line is larger than a predetermined determination angle. Thereby, it can easily be determined whether or not the registered position exists on the left side or the right side of the projection space.
  • The determination angle may be set to the same angle or to different angles for the case where the registered position is on the left side of the straight line and the case where it is on the right side of the straight line.
  • The technique of guiding the position of an intersection by means of a guide image, as in the present invention, can also be embodied as a program or a method.
  • The system, program, and method described above may be realized as a single device or by a plurality of devices, and may be realized using components shared with other units provided in the vehicle; they include various aspects. For example, it is possible to provide a navigation system, method, and program that include the devices described above. Further, appropriate modifications are possible, such as implementing part in software and part in hardware.
  • the invention can be realized as a recording medium for a program for controlling the system.
  • the software recording medium may be a magnetic recording medium, a magneto-optical recording medium, or any recording medium to be developed in the future.
  • FIG. 2A is a diagram showing the relationship between the registered position and the current position
  • FIG. 2B is a diagram showing a front image
  • FIG. 2C is a plan view of a road
  • FIG. 2D is a diagram showing an arrow image
  • FIG. 2E is a diagram showing a lower image.
  • 3A and 3C are diagrams illustrating the relationship between the registered position and the current position
  • 3B and 3D are diagrams illustrating the front image
  • FIG. 3E is a diagram illustrating the reference angle. FIG. 4 is a flowchart of the intersection guidance process.
  • FIG. 1 is a block diagram showing a configuration of a navigation device 10 as an intersection guidance system according to an embodiment of the present invention.
  • the navigation device 10 is provided in a vehicle.
  • the navigation device 10 includes a control unit 20 and a recording medium 30.
  • the control unit 20 includes a CPU, a RAM, a ROM, and the like, and executes a program stored in the recording medium 30 or the ROM.
  • the recording medium 30 records map information 30a, display setting data 30b, and a position conversion table 30c.
  • the map information 30a includes node data indicating the position of nodes set on the road, shape interpolation point data indicating the position of shape interpolation points set on the center line in the width direction of the road between the nodes, and the nodes. It contains link data indicating information about links to be linked. A node to which three or more links connect corresponds to an intersection. The position of the node corresponding to the intersection means a registered position registered in the map information 30a as the position of the intersection.
  • the link data includes information indicating the width of the road between the nodes, the number of lanes constituting the road between the nodes, the position of the lane marking of each lane, and the road that can be exited from each lane at the node.
  • the display setting data 30b is data in which various setting information for displaying a guidance image for guiding an intersection is recorded. Details of the display setting data 30b will be described later.
  • the position conversion table 30c is a table that defines the correspondence between the position in the projection space ahead of the vehicle and the projection position in the front image.
  • the projection space is a space in the field of view of the camera 44 when the camera 44 captures a front landscape of the vehicle, among the spaces in front of the vehicle.
  • the front image is an image generated when the camera 44 captures a front landscape, and is an image representing the front landscape.
  • the position conversion table 30c is created based on the optical specifications of the camera 44 (view angle, optical axis direction, imaging magnification, etc.), and is recorded in the recording medium 30 in advance.
  • the control unit 20 converts the arbitrary position in the projection space using the position conversion table 30c, thereby acquiring the projection position at which the image of the object existing at the arbitrary position is projected in the front image. On the other hand, the control unit 20 acquires a position in the projection space of the object on which the image is projected at the arbitrary position by converting an arbitrary position in the front image by the position conversion table 30c.
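  • The patent describes the position conversion table 30c only as a precomputed mapping derived from the optical specifications of the camera 44. Purely as an illustration of the kind of mapping such a table encodes, the following sketch uses an ideal pinhole model for points on a flat road surface; the focal length, image size, and camera height are hypothetical values, and a real table would also reflect lens distortion and the actual mounting geometry.

```python
def project_to_front_image(x_m, y_m, camera_height_m=1.2,
                           focal_px=800.0, image_w=1280, image_h=720):
    """Map a road-surface point in the viewpoint coordinate system to a pixel.

    x_m: lateral offset from the optical axis (left positive, as in the text).
    y_m: distance ahead of the camera along the optical axis.
    Returns (u, v) pixel coordinates, or None for points not ahead of the camera.
    """
    if y_m <= 0:
        return None
    u = image_w / 2.0 - focal_px * (x_m / y_m)               # leftward offsets move left in the image
    v = image_h / 2.0 + focal_px * (camera_height_m / y_m)   # farther points appear higher (smaller v)
    return (u, v)


def back_project_from_front_image(u, v, camera_height_m=1.2,
                                  focal_px=800.0, image_w=1280, image_h=720):
    """Inverse mapping: a pixel assumed to lie on the road surface back to (x, y)."""
    dv = v - image_h / 2.0
    if dv <= 0:
        return None                                          # at or above the horizon line
    y_m = focal_px * camera_height_m / dv
    x_m = (image_w / 2.0 - u) * y_m / focal_px
    return (x_m, y_m)
```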
  • the vehicle includes a GPS receiver 41, a vehicle speed sensor 42, a gyro sensor 43, a camera 44, and a display 45.
  • the GPS receiver 41 receives radio waves from GPS satellites and outputs a signal for calculating the position of the vehicle via an interface (not shown).
  • the vehicle speed sensor 42 outputs a signal corresponding to the rotational speed of the wheels provided in the vehicle.
  • the gyro sensor 43 outputs a signal corresponding to the angular acceleration acting on the vehicle.
  • the camera 44 is an image sensor that captures a front landscape of the vehicle and generates a front image representing the front landscape.
  • the front image generated by the camera 44 is output to the control unit 20 via an interface (not shown).
  • FIG. 2A is a plan view showing a state in which the camera 44 captures a front landscape.
  • The camera 44 has an optical system that is symmetrical about the optical axis in the horizontal direction, and the left and right viewing angles in the horizontal direction are each equal to the determination angle Ath.
  • the camera 44 is provided at the center position in the width direction of the vehicle, and the optical axis coincides with the traveling direction of the vehicle.
  • the optical axis of the camera 44 coincides with the straight line F in plan view.
  • the projection space K projected onto the front image is a space having a symmetrical shape with respect to the straight line F in plan view.
  • The projection space K is the space sandwiched between the left end line l, which is the straight line obtained by tilting the straight line F to the left by the determination angle Ath about the current position P, and the right end line r, which is the straight line obtained by tilting the straight line F to the right by the determination angle Ath about the current position P.
  • The determination angle Ath is half of the horizontal viewing angle of the camera 44 and is recorded in the display setting data 30b.
  • FIG. 2B is a diagram showing a front image.
  • the horizontal direction in the front image corresponds to the width direction of the road and lane, and the midpoint of the lower side of the front image corresponds to the current position P of the vehicle (the position of the camera 44).
  • the position on the horizontal bisector of the front image corresponds to the position on the straight line F in the projection space K.
  • The farther an object on the road surface is ahead of the vehicle in real space, the higher its image is located in the vertical direction of the front image.
  • The position on the left edge of the front image corresponds to a position on the left end line l in the projection space K, and the position on the right edge of the front image corresponds to a position on the right end line r in the projection space K.
  • In the following, a viewpoint coordinate system is used in which the vehicle width direction is the X axis (the left direction is positive and the right direction is negative), the traveling direction of the vehicle is the Y axis, and the current position P is the origin. When a coordinate (0, Y) on the straight line F in the projection space K is converted by the position conversion table 30c, it is converted to a position on the horizontal bisector of the front image.
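  • For reference, a point given in a planar map frame can be expressed in this viewpoint coordinate system by translating it to the current position P and rotating it by the vehicle heading. The east/north axes and the compass-heading convention in the sketch below are illustrative assumptions, not specified in the patent.

```python
import math

def to_viewpoint_coords(point_east_m, point_north_m,
                        p_east_m, p_north_m, heading_deg):
    """Express a map point in the viewpoint coordinate system.

    X axis: vehicle width direction, left positive.
    Y axis: vehicle traveling direction, forward positive.
    heading_deg: compass heading of the vehicle, clockwise from north (assumed).
    """
    de = point_east_m - p_east_m
    dn = point_north_m - p_north_m
    h = math.radians(heading_deg)
    y = de * math.sin(h) + dn * math.cos(h)    # component along the traveling direction
    x = -de * math.cos(h) + dn * math.sin(h)   # component toward the vehicle's left
    return (x, y)
```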
  • The control unit 20 performs known map matching based on the signals output from the GPS receiver 41, the vehicle speed sensor 42, the gyro sensor 43, and the like and on the map information 30a, thereby identifying the traveling road on which the vehicle is currently traveling and specifying the current position P of the vehicle on the center line of the traveling road in the width direction.
  • the control unit 20 corrects the current position P in the width direction of the traveling road based on the image recognition of the front image.
  • the control unit 20 recognizes the position in the front image of the image of the lane markings BL constituting the traveling road by performing a known Hough transform or the like on the front image.
  • the control unit 20 specifies the traveling lane in which the vehicle is traveling based on the position of the image of the lane marking line BL in the front image. For example, the control unit 20 acquires the number of lane markings BL images located on the left side of the bisector in the horizontal direction of the front image, and the number of lanes counted from the left end of the traveling road is the acquired number. Identified as a driving lane. Furthermore, the control unit 20 specifies the distance from the lane marking BL of the travel lane to the vehicle in the width direction of the travel road by converting the position of the image of the lane marking BL of the travel lane in the front image by the position conversion table 30c. To do.
  • the control unit 20 determines the current position P based on the position of the lane marking BL of the travel lane based on the link data of the map information 30a and the distance from the lane marking BL of the travel lane to the vehicle in the width direction of the travel road. Is corrected in the width direction of the traveling road.
  • the current position P means a position after correction.
  • the control unit 20 specifies the traveling direction of the vehicle based on the output signal from the gyro sensor 43 and the like.
  • the display 45 is a video output device that outputs various images including a front image and a guide image based on the video signal output from the control unit 20.
  • the front image may be displayed on the entire display 45 or may be displayed on a part of the display 45.
  • the control unit 20 executes the intersection guide program 21.
  • the intersection guide program 21 includes an approach degree acquisition unit 21a and a display control unit 21b.
  • the approach degree acquisition unit 21a is a module that causes the control unit 20 to execute a function of acquiring the degree of approach of the vehicle with respect to an intersection existing ahead of the vehicle.
  • the control part 20 acquires the approach degree of the vehicle with respect to the guidance intersection which is an intersection where a vehicle travels next among the guidance target intersections existing on the planned travel route searched in advance.
  • the scheduled travel route is composed of a series of roads on which the vehicle should travel in order to reach the destination point. Note that the planned travel route may be a route acquired by the control unit 20 from an external device or a removable memory via communication or the like.
  • The control unit 20 treats as guidance target intersections those intersections at which the absolute value of the turning angle (a left turn is positive and a right turn is negative) formed by the direction (entry direction) of the approach road, which is the road on the planned travel route traveled immediately before entering the intersection, and the direction (exit direction) of the exit road, which is the road on the planned travel route traveled immediately after exiting the intersection, is equal to or greater than a threshold value.
  • The control unit 20 specifies the entry direction based on the vector from the shape interpolation point on the approach road closest to the node corresponding to the intersection toward that node, and specifies the exit direction based on the vector from the node corresponding to the intersection toward the shape interpolation point on the exit road closest to that node.
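  • A signed turning angle with the sign convention used here (left turn positive, right turn negative) can be computed from the entry and exit direction vectors as in the sketch below; the vectors are assumed to be given as planar components in the same map frame as the node and shape interpolation point coordinates. For example, an entry direction due north and an exit direction due west (a left turn) yield +90°.

```python
import math

def turning_angle_deg(entry_dir, exit_dir):
    """Signed angle from the entry direction to the exit direction.

    entry_dir, exit_dir: 2D direction vectors (east, north components).
    Left turn positive, right turn negative, straight ahead roughly 0.
    """
    (ex, ey), (ox, oy) = entry_dir, exit_dir
    cross = ex * oy - ey * ox     # positive when exit_dir lies counter-clockwise of entry_dir
    dot = ex * ox + ey * oy
    return math.degrees(math.atan2(cross, dot))
```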
  • FIG. 2C is a plan view showing a state in which the vehicle approaches the guidance intersection C.
  • In FIG. 2C, it is assumed that the approach road RI and the exit road RO, for which the turning angle at the guidance intersection C is 90°, are roads on the planned travel route.
  • the lane markings BL are indicated by white broken lines or solid lines, and the central separation band M that separates roads whose traveling directions are opposite to each other is indicated by solid black lines.
  • the control unit 20 acquires the registered position Q registered in the map information 30a as the position of the guidance intersection C from the map information 30a by the function of the approach degree acquisition unit 21a. That is, the control unit 20 sets a node that is set in common on the approach road R I and the exit road R O (the node to which the link corresponding to the entry road R I and the link corresponding to the exit road R O are connected). ) As a registered position Q from the node data of the map information 30a.
  • The area on the road surface where the approach road RI and the exit road RO overlap is the guidance intersection C (within the broken-line frame), and the registered position Q is the geometric center position (center of gravity) of the guidance intersection C.
  • That is, the guidance intersection C is the area surrounded by the extension lines B1 and B2 of both edges in the width direction of the approach road RI and the extension lines B3 and B4 of both edges in the width direction of the exit road RO.
  • the control unit 20 acquires the linear distance between the registered position Q of the guidance intersection C and the current position P as the remaining distance S by the function of the approach level acquisition unit 21a.
  • the remaining distance S is an indicator of the degree of approach of the vehicle to the guidance intersection C. The smaller the remaining distance S, the greater the degree of approach of the vehicle to the guidance intersection C.
  • Alternatively, the control unit 20 may acquire the remaining distance S as the length along the road from the current position of the vehicle to the registered position Q.
  • the display control unit 21b is a module that causes the display 45 to display a guidance image that guides the position of the guidance intersection C on a front image that represents a front landscape of the vehicle.
  • The control unit 20 generates a guide image G composed of a lower image G1, an upper image G2, and an arrow image G3.
  • The control unit 20 superimposes the arrow image G3 on the upper image G2, and generates the guide image G by further joining the upper end of the lower image G1 to the lower end of the upper image G2.
  • The control unit 20 obtains the turning angle at the guidance intersection C and generates an arrow image G3 corresponding to that turning angle.
  • FIG. 2D shows the arrow image G3.
  • The arrow image G3 includes an entry portion I and an exit portion O connected to the upper end of the entry portion I.
  • The entry portion I is a part representing the traveling direction (entry direction) of the vehicle on the approach road RI, and the center line i that bisects the entry portion I in the width direction is always a vertical line of the front image. The entry direction immediately before entering the guidance intersection C on the approach road RI is always forward, which corresponds to upward in the vertical direction of the front image.
  • Therefore, the entry portion I, whose width-direction center line i is a vertical line of the front image, represents the approach direction of the vehicle with respect to the guidance intersection C.
  • The exit portion O is a part representing the traveling direction (exit direction) of the vehicle on the exit road RO, and the center line o that bisects the exit portion O in the width direction is a line inclined by the turning angle with respect to the width-direction center line i of the entry portion I.
  • An arrow head is provided at the tip of the exit portion O.
  • The control unit 20 generates a rectangular upper image G2 whose height and width are larger than those of the arrow image G3 by a predetermined amount, and superimposes the arrow image G3 on the upper image G2. Then, the control unit 20 acquires the size of the upper image G2 associated with the remaining distance S in the display setting data 30b, and converts (enlarges or reduces) the upper image G2 on which the arrow image G3 is superimposed to the size corresponding to the remaining distance S. In the display setting data 30b, the size of the upper image G2 is defined so as to become smaller as the remaining distance S becomes larger.
  • Figure 2E is a diagram showing the lower image G 1.
  • The lower image G1 is an isosceles-triangle-shaped image whose upper side is a horizontal line in the front image and whose apex (lower end point) lies below that upper side. That is, the lower image G1 is an image whose width becomes narrower toward the lower end point.
  • The control unit 20 acquires the length Z of the lower image G1 associated with the remaining distance S in the display setting data 30b, and generates a lower image G1 having the acquired length Z.
  • The length Z of the lower image G1 is the length of the lower image G1 in the vertical direction of the front image.
  • In the display setting data 30b, the length Z of the lower image G1 is defined so as to become smaller as the remaining distance S becomes larger.
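  • The display setting data 30b thus associates the scale of the upper image G2 and the length Z of the lower image G1 with the remaining distance S, both shrinking as S grows. One simple realisation, shown purely as an illustration with invented break points and values, is a stepwise lookup table:

```python
import bisect

# (remaining distance S in metres, scale of upper image G2, length Z of lower image G1 in pixels)
SIZE_TABLE = [(50.0, 1.00, 120.0), (100.0, 0.75, 90.0), (200.0, 0.50, 60.0), (400.0, 0.30, 30.0)]

def guide_image_sizes(remaining_distance_m):
    """Look up (upper-image scale, lower-image length Z) for a remaining distance S."""
    distances = [row[0] for row in SIZE_TABLE]
    i = bisect.bisect_left(distances, remaining_distance_m)
    i = min(i, len(SIZE_TABLE) - 1)     # clamp beyond the last entry
    _, scale, length_z = SIZE_TABLE[i]
    return scale, length_z
```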
  • The control unit 20 generates the guide image G by joining the upper end (upper side) of the lower image G1 to the lower end of the upper image G2 on which the arrow image G3 is superimposed, such that the horizontal center line c of the lower image G1 lies on the extension line U of the horizontal center line i of the entry portion I of the arrow image G3.
  • As a result, the lower end point of the lower image G1 is also positioned on the extension line U of the horizontal center line i of the entry portion I of the arrow image G3.
  • the control unit 20 sets the superimposition position of the lower end point of the lower image G 1 of the guide image G in the front image.
  • The control unit 20 sets the superimposed position of the lower end point of the lower image G1 of the guide image G by different methods depending on the state (first to third states) of the positional relationship between the guidance intersection C and the vehicle.
  • The reference distance Sth is set to the remaining distance S from the vehicle to an average-shaped intersection at which the area of the image of the average-shaped intersection becomes a predetermined area in the front image, and is recorded in the display setting data 30b.
  • The predetermined area is an area, obtained by experiment, at which, when the image of the average-shaped intersection occupies that area in the front image, the shape of the connecting roads connected to the average-shaped intersection and the lane configuration of the road entering the average-shaped intersection can be recognized.
  • the average-shaped intersection is an intersection having a shape obtained by averaging the shapes of the intersections existing on the road.
  • the average-shaped intersection may be a square in which the length of each of the four sides is the average width of the road.
  • The control unit 20 determines whether or not the registered position Q of the guidance intersection C exists within the projection space K as follows. That is, the control unit 20 determines that the registered position Q of the guidance intersection C exists within the projection space K when the intersection angle A, which is the angle formed in the horizontal plane by the straight line extending from the current position P to the registered position Q of the guidance intersection C and the straight line F, is equal to or less than the determination angle Ath.
  • For example, as shown in FIG. 2A, the coordinates (XQ, YQ) of the registered position Q in the viewpoint coordinate system can be acquired, and the intersection angle A can be obtained from them.
  • When the current position P is P2, the remaining distance S is determined to be greater than the reference distance Sth, and the intersection angle A is determined to be equal to or less than the determination angle Ath. In this case, it is determined that the positional relationship between the guidance intersection C and the vehicle is in the first state.
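  • The classification into the first to third states therefore reduces to two comparisons once the registered position Q is expressed in the viewpoint coordinate system. The following is a minimal sketch; the function and parameter names are illustrative.

```python
import math

def classify_state(x_q, y_q, remaining_distance_m,
                   reference_distance_m, determination_angle_deg):
    """Classify the positional relationship between the vehicle and the guidance intersection C.

    (x_q, y_q): registered position Q in the viewpoint coordinate system.
    Returns "first", "second" or "third".
    """
    intersection_angle_deg = abs(math.degrees(math.atan2(x_q, y_q)))
    if remaining_distance_m <= reference_distance_m:
        return "third"        # near the intersection: guide the straight traveling position V
    if intersection_angle_deg <= determination_angle_deg:
        return "first"        # far, and Q lies within the projection space K: guide Q itself
    return "second"           # far, and Q lies outside K: guide the edge of K
```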
  • In the first state, the control unit 20 sets the position in the front image corresponding to the registered position Q, which is registered in the map information as the position of the guidance intersection C, as the superimposed position of the lower end point of the lower image G1 of the guide image G.
  • That is, the control unit 20 acquires the position in the front image corresponding to the registered position Q by converting the coordinates (XQ, YQ) of the registered position Q in the viewpoint coordinate system using the position conversion table 30c, and sets the acquired position as the superimposed position of the lower end point of the lower image G1 of the guide image G.
  • The control unit 20 superimposes the guide image G on the front image such that the lower end point of the lower image G1 of the guide image G is superimposed on the position in the front image corresponding to the registered position Q, and causes the display 45 to display the front image on which the guide image G is superimposed.
  • FIG. 2B shows a front image in the first state.
  • Consequently, the driver can recognize the position of the guidance intersection C by relying on the position of the lower end point of the lower image G1 of the guide image G.
  • FIG. 3A is a diagram illustrating a positional relationship between the guidance intersection C and the vehicle in the second state in the viewpoint coordinate system.
  • As shown in FIGS. 2C and 3A, when the current position P is P1, the remaining distance S is determined to be greater than the reference distance Sth, and the intersection angle A is determined to be greater than the determination angle Ath.
  • the control unit 20 determines whether the registration position Q of the guidance intersection C exists on the left side of the projection space K or the registration position Q of the guidance intersection C exists on the right side of the projection space K. . In the second state, the control unit 20 determines that the registration position Q of the guidance intersection C exists on the left side of the projection space K if the X coordinate (X Q ) of the registration position Q in the viewpoint coordinate system is positive. If the X coordinate (X Q ) of the position Q is negative, it is determined that the registered position Q of the guidance intersection C exists on the right side of the projection space K. As shown in FIG. 3A, when the current position P is P 1 , it is determined that the registration position Q of the guidance intersection C exists on the left side of the projection space K.
  • When the registered position Q of the guidance intersection C exists on the left side of the projection space K, the control unit 20 sets the position in the front image corresponding to the left end position (end position N) of the projection space K as the superimposed position of the lower end point of the lower image G1 of the guide image G. That is, as illustrated in FIG. 3A, the control unit 20 acquires the end position N on the left end line l at the left edge of the projection space K and sets the position in the front image corresponding to the end position N as the superimposed position of the lower end point of the lower image G1 of the guide image G.
  • When the registered position Q of the guidance intersection C exists on the right side of the projection space K, the control unit 20 sets the position in the front image corresponding to the right end position (end position N) of the projection space K as the superimposed position of the lower end point of the lower image G1 of the guide image G. That is, the control unit 20 acquires the end position N on the right end line r at the right edge of the projection space K and sets the position in the front image corresponding to the end position N as the superimposed position of the lower end point of the lower image G1 of the guide image G.
  • By the function of the display control unit 21b, the control unit 20 acquires the position in the front image corresponding to the end position N of the projection space K by converting the coordinates (XN, YN) of the end position N in the viewpoint coordinate system using the position conversion table 30c, and sets the acquired position as the superimposed position of the lower end point of the lower image G1 of the guide image G. Then, the control unit 20 superimposes the guide image G on the front image such that the lower end point of the lower image G1 coincides with that position, and causes the display 45 to display the front image on which the guide image G is superimposed.
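  • In the second state, the end lines l and r make the determination angle Ath with the straight line F, so a point on the appropriate end line can be parameterised by a forward distance. The sketch below takes that forward distance equal to YQ for illustration; the patent only states that the end position N lies on the end line at the edge of the projection space K, so this particular choice is an assumption.

```python
import math

def end_position_on_projection_edge(x_q, y_q, determination_angle_deg):
    """Return a point N on the edge of the projection space K, in viewpoint coordinates.

    The edge on the same side as the registered position Q is used
    (left end line l when Q is to the left, right end line r otherwise).
    """
    side = 1.0 if x_q > 0 else -1.0   # X axis: left positive, as in the viewpoint coordinate system
    y_n = y_q                          # illustrative choice: same forward distance as Q
    x_n = side * y_n * math.tan(math.radians(determination_angle_deg))
    return (x_n, y_n)
```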
  • In the second state, the superimposed position of the lower end point of the lower image G1 of the guide image G is always set at the left or right edge of the front image. Consequently, if nothing were done, the guide image G would protrude from the left or right side of the front image.
  • When the lateral distance e1 (FIG. 2B) from the center line i of the entry portion I of the arrow image G3 to the left end of the guide image G is larger than the lateral distance from the position in the front image corresponding to the registered position Q to the left edge of the front image, the guide image G protrudes to the left from the front image. In this case, the control unit 20 moves the superimposed position of the upper image G2 on which the arrow image G3 is superimposed to the right so that the left end of the upper image G2 of the guide image G coincides with the left end of the front image, while maintaining the superimposed position of the lower end point of the lower image G1 of the guide image G.
  • At this time, the control unit 20 may tilt the lower image G1 so that the joining position of the upper end of the lower image G1 with respect to the lower end of the upper image G2 does not change. In this case, the lower image G1 is no longer an isosceles triangular image.
  • The case where the guide image G protrudes to the right from the front image is handled in the same manner using the lateral distance e2.
  • By determining whether or not the intersection angle A, which is the angle formed in the horizontal plane by the straight line extending from the current position P to the registered position Q and the straight line F, is larger than the predetermined determination angle Ath, the control unit 20 can easily determine whether or not the registered position Q exists outside the projection space K.
  • FIG. 3C is a diagram illustrating, in the viewpoint coordinate system, how to set the overlapping position of the lower end point of the lower image G 1 of the guide image G in the third state.
  • In the third state, by the function of the display control unit 21b, the control unit 20 sets the position in the front image corresponding to the straight traveling position V, which is a position on the straight line F extending from the vehicle in the traveling direction of the vehicle, as the superimposed position of the lower end point of the lower image G1 of the guide image G.
  • The control unit 20 places the straight traveling position V on the straight line F by setting the X coordinate (XV) of the straight traveling position V in the viewpoint coordinate system to 0.
  • Further, the control unit 20 makes the distance in the vehicle traveling direction between the current position P and the straight traveling position V equal to the distance in the vehicle traveling direction between the current position P and the registered position Q of the guidance intersection C. That is, the control unit 20 makes the Y coordinate (YV) of the straight traveling position V in the viewpoint coordinate system equal to the Y coordinate (YQ) of the registered position Q.
  • The control unit 20 acquires the position in the front image corresponding to the straight traveling position V by converting the coordinates (XV, YV) of the straight traveling position V on the straight line F in the viewpoint coordinate system using the position conversion table 30c, and sets the acquired position as the superimposed position of the lower end point of the lower image G1 of the guide image G.
  • The control unit 20 superimposes the guide image G on the front image such that the lower end point of the lower image G1 of the guide image G is superimposed on the position in the front image corresponding to the straight traveling position V on the straight line F, and causes the display 45 to display the front image on which the guide image G is superimposed. As shown in FIG. 3D, the lower end point of the lower image G1 of the guide image G is positioned on the horizontal bisector of the front image, and the horizontal center line i of the entry portion I of the arrow image G3 is also positioned on the horizontal bisector of the front image.
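  • In the third state the corresponding computation is simpler: the straight traveling position V has zero lateral offset and the forward coordinate of Q, so it always projects onto the horizontal bisector of the front image. A one-line sketch, assuming the hypothetical project_to_front_image helper from the earlier illustration is in scope:

```python
def straight_travel_anchor(y_q):
    """Superimposed position of the lower end point of G1 in the third state.

    V = (0, YQ) lies on the straight line F, so the returned pixel is always
    on the horizontal bisector of the front image.
    """
    return project_to_front_image(0.0, y_q)
```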
  • Thus, when the degree of approach is equal to or greater than the threshold, the control unit 20 guides the straight traveling position V on the straight line F extending from the vehicle in the traveling direction of the vehicle as the position of the guidance intersection C.
  • By guiding the straight traveling position V on the straight line F, the driver can be informed that the vehicle only needs to keep traveling in the current traveling direction toward the target position that the driver has already determined. For example, as shown in FIG. 3D, the driver can be guided to simply continue traveling in the left-turn lane. That is, it is possible to prevent the sense of incongruity that would arise if the registered position Q, which is the center position of the guidance intersection C, were guided and the driver were thereby led to believe that the guidance intersection C must be turned left via its center position, as shown by the broken line in FIG. 3D.
  • Since the straight traveling position V on the straight line F always corresponds to a position on the horizontal bisector of the front image, the superimposed position of the guide image G can be prevented from wavering in the lateral direction, and the visibility of the guide image G can be improved.
  • The guide image G is generated such that the horizontal center line c of the lower image G1 lies on the extension line U of the horizontal center line i of the entry portion I of the arrow image G3. That is, on the horizontal bisector of the front image corresponding to the straight line F, the horizontal center line i of the entry portion I of the arrow image G3 and the horizontal center line c of the lower image G1 are aligned on a single straight line.
  • In every state, the lower end point of the lower image G1 of the guide image G is superimposed at a position obtained by conversion with the position conversion table 30c. Therefore, even when the state of the positional relationship between the guidance intersection C and the vehicle changes, the lower end point of the lower image G1 of the guide image G can be prevented from moving abruptly, and the resulting sense of discomfort can be prevented.
  • FIG. 4 is a flowchart of the intersection guidance process.
  • the intersection guidance process is a process of updating the display on the display 45 every time the latest forward image is taken.
  • First, the control unit 20 acquires the guidance intersection C through which the vehicle will travel next, by the function of the approach degree acquisition unit 21a.
  • the control part 20 acquires a front image by the function of the approach degree acquisition part 21a (step S105). Further, the control unit 20 acquires the current position P and the traveling direction of the vehicle by the function of the approach degree acquisition unit 21a (step S110).
  • control unit 20 acquires the remaining distance S to the guidance intersection C by the function of the approach degree acquisition unit 21a (step S115). That is, the control unit 20 acquires, as the remaining distance S, the linear distance from the current position P to the registered position Q set at the center position of the guidance intersection C.
  • Next, the control unit 20 determines whether or not the remaining distance S to the registered position Q is equal to or less than the reference distance Sth (step S120). That is, the control unit 20 determines whether or not the degree of approach to the guidance intersection C is equal to or greater than the threshold.
  • When it is determined that the remaining distance S is greater than the reference distance Sth (step S120: N), the control unit 20 determines, by the function of the display control unit 21b, whether or not the intersection angle A formed in the horizontal plane by the straight line extending from the current position P to the registered position Q and the straight line F is equal to or less than the determination angle Ath (step S125).
  • When it is determined that the intersection angle A is equal to or less than the determination angle Ath, the control unit 20, by the function of the display control unit 21b, causes the display 45 to display a front image on which a guide image G guiding the registered position Q is superimposed (step S130). That is, when the positional relationship between the guidance intersection C and the vehicle is in the first state, the control unit 20 sets the superimposed position of the lower end point of the lower image G1 of the guide image G to the position in the front image corresponding to the registered position Q of the guidance intersection C.
  • The control unit 20 acquires the position in the front image corresponding to the registered position Q by converting the coordinates (XQ, YQ) of the registered position Q in the viewpoint coordinate system using the position conversion table 30c, and sets the acquired position as the superimposed position of the lower end point of the lower image G1 of the guide image G.
  • The control unit 20 superimposes the arrow image G3 representing the turning angle T at the guidance intersection C on the upper image G2, and converts the upper image G2 on which the arrow image G3 is superimposed to the size corresponding to the remaining distance S. Furthermore, the control unit 20 generates an isosceles triangular lower image G1 whose length Z in the vertical direction of the front image corresponds to the remaining distance S. Then, the control unit 20 generates the guide image G by joining the upper end (upper side) of the lower image G1 to the lower end of the upper image G2 on which the arrow image G3 is superimposed, such that the horizontal center line c of the lower image G1 lies on the extension line U of the horizontal center line i of the entry portion I of the arrow image G3.
  • When the guide image G would protrude to the left or right from the front image, the control unit 20 moves the superimposed position of the upper image G2 on which the arrow image G3 is superimposed in the horizontal direction so that the left or right end of the upper image G2 of the guide image G coincides with the left or right edge of the front image.
  • The control unit 20 superimposes the guide image G on the front image such that the lower end point of the lower image G1 is superimposed on the position in the front image corresponding to the registered position Q of the guidance intersection C, and displays the result on the display 45 (FIG. 2B).
  • When the intersection angle A is greater than the determination angle Ath (step S125: N), the control unit 20, by the function of the display control unit 21b, displays on the display 45 the front image on which the guide image G guiding the position of the end of the projection space K is superimposed (step S135). That is, when the positional relationship between the guidance intersection C and the vehicle is in the second state, the control unit 20 sets the superimposed position of the lower end point of the lower image G1 of the guide image G to the position in the front image corresponding to the end position N at the end of the projection space K.
  • For example, the control unit 20 determines, by the function of the display control unit 21b, the end position N on the right end line r existing at the right end of the projection space K; the coordinates of the end position N are set so that its Y coordinate (YN) coincides with the Y coordinate (YQ) of the registered position Q.
  • The control unit 20 sets the position obtained by converting the coordinates (XN, YN) of the end position N of the projection space K with the position conversion table 30c as the superimposed position of the lower end point of the lower image G1 of the guide image G. Further, the control unit 20 moves the superimposed position of the upper image G2 on which the arrow image G3 is superimposed in the horizontal direction so that the left or right edge of the upper image G2 of the guide image G coincides with the left or right edge of the front image. The control unit 20 then superimposes the guide image G on the front image and displays it on the display 45 such that the lower end point of the lower image G1 is superimposed on the position in the front image corresponding to the end position N at the end of the projection space K (FIG. 3B).
  • When it is determined that the remaining distance S to the registered position Q is equal to or less than the reference distance Sth (step S120: Y), the control unit 20, by the function of the display control unit 21b, displays on the display 45 the front image on which the guide image G guiding the straight-ahead position V on the straight line F extending from the vehicle in its traveling direction is superimposed (step S140). That is, when the positional relationship between the guidance intersection C and the vehicle is in the third state, the control unit 20 sets the superimposed position of the lower end point of the lower image G1 of the guide image G to the position in the front image corresponding to the straight-ahead position V on the straight line F.
  • Specifically, the control unit 20 acquires, as the straight-ahead position V, the point of coordinates (0, YQ) that lies on the straight line F and whose Y coordinate (YV) coincides with the Y coordinate (YQ) of the registered position Q in the viewpoint coordinate system. The control unit 20 then acquires the position in the front image corresponding to the straight-ahead position V by converting the coordinates (0, YQ) of the straight-ahead position V with the position conversion table 30c, and sets the acquired position as the superimposed position of the lower end point of the lower image G1 of the guide image G.
  • Since the straight-ahead position V always has an X coordinate of 0, the position in the front image corresponding to the straight-ahead position V always lies on the line bisecting the front image in the horizontal direction.
  • The control unit 20 superimposes the guide image G on the front image and displays it on the display 45 such that the lower end point of the lower image G1 is superimposed on the position in the front image corresponding to the straight-ahead position V (FIG. 3D).
  • In any of the first to third states, the superimposed position of the lower end point of the lower image G1 of the guide image G is thus set so that the Y coordinate (YQ) of the registered position Q is maintained.
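  • The choice of the point that receives the lower end point of the lower image G1 in each of the three states can be sketched as follows in the viewpoint coordinate system. Modelling the left and right end lines of the projection space K as rays at the determination angle Ath from the straight line F, and keeping YQ in every state as described above, are assumptions of this sketch.

    import math

    def guide_anchor_viewpoint(state, x_q, y_q, a_th):
        # Return the viewpoint-coordinate point whose converted front-image
        # position receives the lower end point of the lower image G1.
        if state == "first":
            return (x_q, y_q)                      # registered position Q itself
        if state == "second":
            x_edge = y_q * math.tan(a_th)          # end position N on the end line of the projection space K
            return (math.copysign(x_edge, x_q), y_q)
        return (0.0, y_q)                          # third state: straight-ahead position V on the straight line F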
  • The control unit 20 may identify the size of the guidance intersection C based on the widths of the approach road RI and the exit road RO indicated by the link data of the map information 30a, correct the reference distance Sth upward if the guidance intersection C is larger than an average-shaped intersection, and correct the reference distance Sth downward if the guidance intersection C is smaller than the average-shaped intersection. Furthermore, the reference distance Sth may be set to the remaining distance S at which the horizontal length of the image of an average-shaped intersection reaches a predetermined length in the front image. If the horizontal length of the image of the intersection is sufficiently large, the lane configuration of the approach road, whose lanes are arranged in the horizontal direction of the front image, can be clearly recognized.
  • Likewise, the reference distance Sth may be set to the remaining distance S at which the vertical length of the image of an average-shaped intersection reaches a predetermined length in the front image. If the vertical length of the image of the intersection is sufficiently large, the image of the connecting road other than the straight-ahead road (the exit road RO), whose width appears in the vertical direction of the front image, can be seen clearly. That is, the transition to the third state can be made at a stage where the shape of the exit road RO and the exit direction can be clearly recognized. Of course, the visibility of the guidance intersection C changes with the time of day (daytime, nighttime) and the weather, so the reference distance Sth may also be set according to the time of day and the weather such that the guidance intersection C can be seen clearly.
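  • Two of the ways of choosing the reference distance Sth described above can be made concrete as follows: scaling a base value by the ratio of the guidance intersection's size to an average-shaped intersection, and picking the distance at which the intersection's image reaches a target length in the front image under a simple pinhole relation (length_px ≈ focal_px × width_m / distance_m). All constants and names here are placeholders, not values from the disclosure.

    def reference_distance_candidates(intersection_width_m, target_len_px, focal_px,
                                      base_s_th_m=100.0, avg_width_m=20.0):
        # Candidate 1: correct a base reference distance up or down according to
        # whether the guidance intersection is larger or smaller than average.
        s_th_by_size = base_s_th_m * (intersection_width_m / avg_width_m)

        # Candidate 2: the remaining distance at which the intersection's image
        # reaches the target length (in pixels) in the front image.
        s_th_by_screen = focal_px * intersection_width_m / target_len_px

        return s_th_by_size, s_th_by_screen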
  • By the function of the display control unit 21b, the control unit 20 may determine that the degree of approach is less than the threshold when the intersection angle A formed in the horizontal plane between the line extending from the vehicle to the registered position Q and the straight line F is larger than a predetermined reference angle.
  • The reference angle may be set to an angle at which the guidance intersection C can be considered to exist on the straight line F.
  • For example, as shown in FIG. 3E, it may be assumed that an average-shaped intersection exists at the position where the distance to the registered position Q equals the remaining distance S, and the maximum angle at which the straight line F still passes through that average-shaped intersection may be defined as the reference angle H. In the example of FIG. 3E, the reference angle H becomes larger as the remaining distance S becomes smaller.
  • Alternatively, the reference angle H may be set based on the size of the guidance intersection C rather than the size of the average-shaped intersection.
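  • The geometry behind the reference angle H admits a simple model: treating the intersection as an area of a given width centred at the remaining distance S ahead, the largest angle at which the straight line F still passes through it is roughly the half-width over the distance. The half-width/distance model and the default width are assumptions of this sketch; it also exhibits the property noted above that H grows as S shrinks.

    import math

    def reference_angle_h(remaining_s_m, intersection_width_m=20.0):
        # Largest angle (radians) at which the straight line F still passes through
        # an intersection of the given width located remaining_s_m ahead (cf. FIG. 3E).
        half_width = intersection_width_m / 2.0
        return math.atan2(half_width, max(remaining_s_m, 1e-6))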
  • By the function of the approach degree acquisition unit 21a, the control unit 20 may acquire a degree of approach that becomes larger as the time required to reach the intersection becomes shorter. For example, the control unit 20 may acquire, as the required time to reach the intersection, the value obtained by dividing the remaining distance S by the vehicle speed, and may determine that the degree of approach is less than the threshold when the required time is longer than a reference time. That is, the control unit 20 may determine that the degree of approach is greater than or equal to the threshold when the required time is equal to or less than the reference time. In this way, when the vehicle is about to reach the intersection and it is highly likely that the target position for entering the intersection has already been decided, the driver can be guided to travel toward the target position existing in the traveling direction of the vehicle.
  • The control unit 20 may obtain the required time based on the vehicle speed derived from the output signal of the vehicle speed sensor 42, or may obtain the required time based on the speed limit of the approach road RI indicated by the link data.
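  • A minimal sketch of this time-based criterion follows; the reference time value and the handling of a stationary vehicle are placeholders chosen for the example.

    def approach_at_or_above_threshold(remaining_s_m, speed_mps, reference_time_s=8.0):
        # Degree of approach judged via the time needed to reach the intersection:
        # required time = remaining distance / vehicle speed (or speed limit).
        if speed_mps <= 0.0:
            return False  # vehicle not moving forward: treat as not yet approaching
        required_time_s = remaining_s_m / speed_mps
        return required_time_s <= reference_time_s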
  • The guide image G may be a dot-like image that is entirely superimposed on the set superimposition position, or may be a polygonal-line-like image whose bending point is superimposed on the set superimposition position.
  • The guide image G superimposed on the front scenery when the degree of approach is less than the threshold may be the same image as the guide image G superimposed on the front scenery when the degree of approach is greater than or equal to the threshold, or the two images need not be similar. By changing the color or shape of the guide image G, the driver may be enabled to recognize the degree of approach to the guidance intersection C. Furthermore, by changing the color or shape of the guide image G when the second state changes to the first state, the driver may be enabled to recognize that the image of the guidance intersection C has become visible in the front image.
  • The registered position Q does not necessarily have to be a position registered as an intersection position in the map information 30a, and may be any position that can be derived based only on data registered in the map information 30a.
  • For example, the control unit 20 may acquire the width of the connection road connected to the intersection based on the link data of the map information 30a, and derive the registered position Q based on the width of the connection road.
  • The registered position Q also need not be the center position of the area where the approach road RI and the exit road RO intersect. For example, the center position D (FIG. 2C) of the area where the roads opposing the approach road RI and the exit road RO on which the vehicle travels intersect may be used as the registered position Q.
  • In the third state, the control unit 20 may set the position in the front scenery corresponding to any position on the straight line F as the superimposed position of the guide image G. For example, the control unit 20 may set a position that is on the straight line F and before the intersection as the superimposed position of the guide image G.
  • The reference distance Sth may be set to the remaining distance S at which the vehicle can be considered to have entered an extension section in which an added lane such as a right-turn lane begins before the intersection. This is because, in the section before the lane extension section, the vehicle cannot yet travel on the added lane, so a case may arise in which the target position does not exist on the straight line F even though the driver recognizes that the target position can be reached by going straight on the added lane. Further, the reference distance Sth may be set to the remaining distance S at which the vehicle can be considered to have entered a section before the intersection in which lane changes are prohibited. In this way, the driver can be guided to maintain the current traveling direction without changing lanes in the section where lane changes are prohibited. Further, the reference distance Sth may be set to the remaining distance S at which the road marking (arrow) drawn on the road surface before the intersection so as to indicate the permitted exit directions for each lane becomes visible.
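  • The section-based choices of the reference distance Sth listed above could be combined along the following lines; which sections are available in the map information, and the rule of taking the largest known candidate, are assumptions of this sketch rather than part of the disclosure.

    def reference_distance_from_sections(dist_to_lane_extension_m=None,
                                         dist_to_no_lane_change_m=None,
                                         dist_marking_visible_m=None,
                                         default_m=100.0):
        # Candidates: the remaining distance at which the vehicle enters the lane
        # extension section, the no-lane-change section, or the point where the
        # per-lane arrow markings on the road surface become visible.
        candidates = [d for d in (dist_to_lane_extension_m,
                                  dist_to_no_lane_change_m,
                                  dist_marking_visible_m) if d is not None]
        return max(candidates) if candidates else default_m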
  • The front image may be obtained by drawing the front scenery based on the map information 30a.
  • In that case, the guide image G may be drawn simultaneously with the front image.
  • Displaying the guide image G superimposed on the front scenery on the display 45 may also be realized by displaying the guide image G on the display 45 so that, as a result, the guide image G is superimposed on the front scenery.
  • The display 45 only needs to display at least the guide image G, and need not display the front image. In other words, the display 45 may superimpose the guide image G on the actual front scenery viewed by the driver through the windshield of the vehicle.
  • For example, the guide image G may be superimposed on the actual front scenery by making the display 45 a transflective type so that the actual front scenery can be seen through the display 45.
  • In this case, the projection space K means the space that the driver can see through the display 45.
  • The angle formed between the direction of the driver's line of sight when looking straight ahead and the direction of the line of sight when the driver looks at the left end of the display 45 is the determination angle Ath in the left direction.
  • Similarly, the angle formed between the direction of the driver's line of sight when looking straight ahead and the direction of the line of sight when the driver looks at the right end of the display 45 is the determination angle Ath in the right direction.
  • In this case, the determination angle Ath may have different magnitudes on the left and right.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Navigation (AREA)
  • Traffic Control Systems (AREA)
  • Instructional Devices (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
  • Controls And Circuits For Display Device (AREA)
PCT/JP2013/057643 2012-09-28 2013-03-18 交差点案内システム、方法およびプログラム WO2014050172A1 (ja)

Priority Applications (3)

Application Number Priority Date Filing Date Title
EP13842313.2A EP2863181B1 (en) 2012-09-28 2013-03-18 Intersection navigation system, method, and program
US14/424,681 US9508258B2 (en) 2012-09-28 2013-03-18 Intersection guide system, method, and program
CN201380044530.XA CN104603578B (zh) 2012-09-28 2013-03-18 交叉路口引导系统、方法

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2012-217016 2012-09-28
JP2012217016A JP5935636B2 (ja) 2012-09-28 2012-09-28 交差点案内システム、方法およびプログラム

Publications (1)

Publication Number Publication Date
WO2014050172A1 true WO2014050172A1 (ja) 2014-04-03

Family

ID=50387594

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2013/057643 WO2014050172A1 (ja) 2012-09-28 2013-03-18 交差点案内システム、方法およびプログラム

Country Status (5)

Country Link
US (1) US9508258B2 (zh)
EP (1) EP2863181B1 (zh)
JP (1) JP5935636B2 (zh)
CN (1) CN104603578B (zh)
WO (1) WO2014050172A1 (zh)

Families Citing this family (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5935636B2 (ja) * 2012-09-28 2016-06-15 アイシン・エィ・ダブリュ株式会社 交差点案内システム、方法およびプログラム
KR101750876B1 (ko) * 2015-05-28 2017-06-26 엘지전자 주식회사 차량용 디스플레이 장치 및 차량
CN105588576B (zh) * 2015-12-15 2019-02-05 招商局重庆交通科研设计院有限公司 一种车道级导航方法及系统
CN107492060B (zh) * 2016-06-12 2021-11-19 北京嘀嘀无限科技发展有限公司 事件信息的显示方法和装置
US11156473B2 (en) * 2016-08-18 2021-10-26 Sony Corporation Information processing apparatus, information processing system, and information processing method
CN107525516B (zh) * 2016-10-09 2019-04-09 腾讯科技(深圳)有限公司 用于导航的车道线显示方法和装置
KR102466737B1 (ko) * 2016-11-26 2022-11-14 팅크웨어(주) 경로 안내를 위한 장치, 방법, 컴퓨터 프로그램 및 컴퓨터 판독 가능한 기록 매체
WO2018097609A1 (ko) 2016-11-26 2018-05-31 팅크웨어(주) 경로 안내를 위한 장치, 방법, 컴퓨터 프로그램 및 컴퓨터 판독 가능한 기록 매체
WO2018147066A1 (ja) * 2017-02-08 2018-08-16 株式会社デンソー 車両用表示制御装置
CN110383214B (zh) * 2017-03-09 2022-05-10 索尼公司 信息处理装置、信息处理方法和记录介质
US10527449B2 (en) * 2017-04-10 2020-01-07 Microsoft Technology Licensing, Llc Using major route decision points to select traffic cameras for display
JP6838522B2 (ja) * 2017-08-10 2021-03-03 トヨタ自動車株式会社 画像収集システム、画像収集方法、画像収集装置、および記録媒体
CN107963077B (zh) * 2017-10-26 2020-02-21 东软集团股份有限公司 一种车辆通过路口的控制方法、装置及系统
JP6626069B2 (ja) * 2017-11-10 2019-12-25 矢崎総業株式会社 車両用表示装置
KR102547823B1 (ko) * 2017-12-13 2023-06-26 삼성전자주식회사 컨텐츠 시각화 장치 및 방법
CN110579222B (zh) * 2018-06-07 2022-03-15 百度在线网络技术(北京)有限公司 导航路线处理方法、装置及设备
CN109297502A (zh) * 2018-08-01 2019-02-01 广州大学 基于图像处理与gps导航技术的激光投影指向方法及装置
CN110047301B (zh) * 2019-04-19 2021-07-27 山东科技大学 一种城市快速路智能交叉口左转车辆检测及信号控制系统和方法
WO2021091039A1 (ko) * 2019-11-06 2021-05-14 엘지전자 주식회사 차량용 디스플레이 장치 및 그 제어 방법
CN113758490A (zh) * 2020-06-01 2021-12-07 南宁富桂精密工业有限公司 进入匝道判断方法及导航系统
JP2022138782A (ja) * 2021-03-11 2022-09-26 トヨタ自動車株式会社 交差点管制システム、交差点管制方法、及び、プログラム
CN113899380A (zh) * 2021-09-29 2022-01-07 北京百度网讯科技有限公司 路口转向提醒方法、装置、电子设备及存储介质


Family Cites Families (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002318277A (ja) * 2001-04-23 2002-10-31 Yupiteru Ind Co Ltd 車載用目標物検出装置及びマイクロ波検出器
JP4722433B2 (ja) * 2004-08-25 2011-07-13 アルパイン株式会社 車載用ナビゲーション装置
JP2007121001A (ja) * 2005-10-26 2007-05-17 Matsushita Electric Ind Co Ltd ナビゲーション装置
JP4550927B2 (ja) * 2006-03-28 2010-09-22 パナソニック株式会社 ナビゲーション装置
WO2007145190A1 (ja) * 2006-06-12 2007-12-21 Panasonic Corporation ナビゲーション装置及びナビゲーション方法
JP4776476B2 (ja) * 2006-09-01 2011-09-21 アルパイン株式会社 ナビゲーション装置および交差点拡大図の描画方法
JP4948944B2 (ja) * 2006-09-06 2012-06-06 アルパイン株式会社 ナビゲーション装置および交差点案内図の描画方法
DE102007030345A1 (de) * 2007-02-28 2008-09-04 Navigon Ag Navigationseinrichtung und Verfahren zur grafischen Ausgabe von Navigationsanweisungen
JP2008309529A (ja) * 2007-06-12 2008-12-25 Panasonic Corp ナビゲーション装置、ナビゲーション方法、及びナビゲーション用プログラム
WO2009084135A1 (ja) * 2007-12-28 2009-07-09 Mitsubishi Electric Corporation ナビゲーション装置
DE112008003481T5 (de) * 2007-12-28 2010-12-30 Mitsubishi Electric Corp. Navigationsgerät
DE102008025053B4 (de) * 2008-01-18 2023-07-06 Garmin Switzerland Gmbh Navigationseinrichtung
US20110288766A1 (en) * 2008-01-29 2011-11-24 Increment P Corporation Navigation device, navigation method, navigation program, and recording medium
JP2011515717A (ja) * 2008-03-24 2011-05-19 グーグル インコーポレイテッド 運転指図内のパノラマ画像
AU2008355643A1 (en) * 2008-05-02 2009-11-05 Tomtom International B.V. A navigation device and method for displaying a static image of an upcoming location along a route of travel
JP2011529569A (ja) * 2008-07-31 2011-12-08 テレ アトラス ベスローテン フエンノートシャップ ナビゲーションデータを三次元で表示するコンピュータ装置および方法
CA2725800A1 (en) * 2008-07-31 2010-02-04 Tele Atlas B.V. Method of displaying navigation data in 3d
DE102008045994A1 (de) * 2008-09-05 2010-03-11 Volkswagen Ag Verfahren und Vorrichtung zum Anzeigen von Informationen in einem Fahrzeug
JP2010127685A (ja) * 2008-11-26 2010-06-10 Honda Motor Co Ltd ナビゲーション装置
JP5393195B2 (ja) * 2009-02-26 2014-01-22 アルパイン株式会社 ナビゲーション装置および経路探索方法
US8358224B2 (en) * 2009-04-02 2013-01-22 GM Global Technology Operations LLC Point of interest location marking on full windshield head-up display
JP5387544B2 (ja) * 2009-12-18 2014-01-15 株式会社デンソー ナビゲーション装置
JP5810842B2 (ja) * 2011-11-02 2015-11-11 アイシン・エィ・ダブリュ株式会社 レーン案内表示システム、方法およびプログラム
JP2013117515A (ja) * 2011-11-02 2013-06-13 Aisin Aw Co Ltd レーン案内表示システム、方法およびプログラム
JP5810843B2 (ja) * 2011-11-02 2015-11-11 アイシン・エィ・ダブリュ株式会社 レーン案内表示システム、方法およびプログラム
JP6015227B2 (ja) * 2012-08-10 2016-10-26 アイシン・エィ・ダブリュ株式会社 交差点案内システム、方法およびプログラム
JP6015228B2 (ja) * 2012-08-10 2016-10-26 アイシン・エィ・ダブリュ株式会社 交差点案内システム、方法およびプログラム
JP5492962B2 (ja) * 2012-09-28 2014-05-14 富士重工業株式会社 視線誘導システム
JP5935636B2 (ja) * 2012-09-28 2016-06-15 アイシン・エィ・ダブリュ株式会社 交差点案内システム、方法およびプログラム

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0763572A (ja) 1993-08-31 1995-03-10 Alpine Electron Inc 車載用ナビゲーション装置の走行案内画像表示方法
JPH08184452A (ja) * 1994-12-27 1996-07-16 Nissan Motor Co Ltd 車両用経路誘導装置
JP2001082969A (ja) * 1999-09-14 2001-03-30 Alpine Electronics Inc ナビゲーション装置
JP2005265573A (ja) * 2004-03-18 2005-09-29 Xanavi Informatics Corp 車載ナビゲーション装置、ナビゲーションシステム
WO2007129382A1 (ja) * 2006-04-28 2007-11-15 Panasonic Corporation ナビゲーション装置およびその方法
WO2007142084A1 (ja) * 2006-06-05 2007-12-13 Panasonic Corporation ナビゲーション装置
JP2008122150A (ja) * 2006-11-09 2008-05-29 Nissan Motor Co Ltd ナビゲーション装置
JP2010181363A (ja) * 2009-02-09 2010-08-19 Nissan Motor Co Ltd 車両用走行情報提供装置
JP2011149835A (ja) * 2010-01-22 2011-08-04 Clarion Co Ltd カーナビゲーション装置

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP2863181A4

Also Published As

Publication number Publication date
CN104603578A (zh) 2015-05-06
JP2014070999A (ja) 2014-04-21
US20150221220A1 (en) 2015-08-06
JP5935636B2 (ja) 2016-06-15
EP2863181A4 (en) 2015-08-12
CN104603578B (zh) 2017-11-10
EP2863181B1 (en) 2017-06-21
US9508258B2 (en) 2016-11-29
EP2863181A1 (en) 2015-04-22

Similar Documents

Publication Publication Date Title
JP5935636B2 (ja) 交差点案内システム、方法およびプログラム
JP5810842B2 (ja) レーン案内表示システム、方法およびプログラム
EP2848895B1 (en) Intersection guidance system, method and program
JP5810843B2 (ja) レーン案内表示システム、方法およびプログラム
JP5708449B2 (ja) レーン案内表示システム、方法およびプログラム
WO2013065256A1 (en) Lane guidance display system, lane guidance display method, and lane guidance display program
EP2848896B1 (en) Intersection guidance system, method and program
JP5994574B2 (ja) 位置案内システム、方法およびプログラム
JP2015049221A (ja) 進路案内表示システム、方法およびプログラム
JP5906988B2 (ja) 道路形状案内システム、方法およびプログラム
JP5983498B2 (ja) 交差点案内システム、方法およびプログラム
JP5772571B2 (ja) レーン案内表示システム、方法およびプログラム
JP2014071001A (ja) 推奨レーン案内システム、方法およびプログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13842313

Country of ref document: EP

Kind code of ref document: A1

REEP Request for entry into the european phase

Ref document number: 2013842313

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2013842313

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 14424681

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE