WO2021090668A1 - Vehicle driving assistance system, road surface drawing device, and road - Google Patents

Vehicle driving assistance system, road surface drawing device, and road

Info

Publication number
WO2021090668A1
WO2021090668A1 (PCT/JP2020/039290)
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
unit
driving support
course
support system
Prior art date
Application number
PCT/JP2020/039290
Other languages
French (fr)
Japanese (ja)
Inventor
例人 田村
祐貴 高橋
金子 進
新 竹田
柴田 裕一
浩一 田辺
裕介 仲田
Original Assignee
株式会社小糸製作所
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2019201392A external-priority patent/JP2021076956A/en
Priority claimed from JP2019203610A external-priority patent/JP2021077125A/en
Priority claimed from JP2019205095A external-priority patent/JP2021075952A/en
Priority claimed from JP2019209198A external-priority patent/JP7403288B2/en
Priority claimed from JP2019211412A external-priority patent/JP7348819B2/en
Application filed by 株式会社小糸製作所
Publication of WO2021090668A1 publication Critical patent/WO2021090668A1/en

Classifications

    • B60Q 1/00: Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor (B60Q: arrangement of signalling or lighting devices for vehicles in general)
    • B60Q 1/26: ... the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic
    • B60Q 1/34: ... for indicating change of drive direction
    • B60Q 1/507: ... for indicating other intentions or conditions, e.g. request for waiting or overtaking, specific to autonomous vehicles
    • B60Q 1/543: ... for indicating other states or conditions of the vehicle
    • B60W 30/18: Propelling the vehicle (purposes of road vehicle drive control systems not related to the control of a particular sub-unit)
    • E01F 9/506: Road surface markings; kerbs or road edgings, specially adapted for alerting road users, characterised by the road surface marking material; methods of forming, installing or applying markings in, on or to road surfaces
    • E01F 9/524: Reflecting elements specially adapted for incorporation in or application to road surface markings
    • F21V 7/00: Reflectors for light sources
    • F21V 14/04: Controlling the distribution of the light emitted by adjustment of elements by movement of reflectors
    • G08G 1/09: Arrangements for giving variable traffic instructions
    • G08G 1/16: Anti-collision systems

Definitions

  • This disclosure relates to a vehicle driving support system, a road surface drawing device, and a road.
  • When a vehicle is driven using driving support technology or automated driving technology (hereinafter referred to as a driving support vehicle), the occupants do not always pay close attention to the driving situation. It therefore becomes difficult for people outside the vehicle to predict its behavior by observing the driver, which was possible with conventional manual driving. Accordingly, it is important for a vehicle using driving support technology to notify its surroundings of the intended vehicle behavior by some means.
  • As a method of notifying the vehicle movement to the outside of the vehicle, a method of irradiating the road area around the vehicle with light to draw an image has been proposed (see, for example, Patent Document 1).
  • Drawing an image on the road surface allows a vehicle traveling under driving support technology to convey its current and planned movement to the drivers of vehicles without driving support technology (hereinafter referred to as manually driven vehicles), to pedestrians, and to other road users, so that they can predict the vehicle's movement.
  • The road surface drawing device draws a predetermined image on the road surface using a display element such as an LED array or a DMD (digital micromirror device) provided on the front of the vehicle.
  • For example, the drawing apparatus for a vehicle described in Patent Document 2 displays, as a marker on the road surface in front of the vehicle, an arrow pointing in the turning direction when the vehicle turns right or left at an intersection.
  • The course display image (road surface display image) displayed as a marker on the road surface allows other vehicles and pedestrians in the vicinity to recognize the traveling direction of the vehicle.
  • Right turns and left turns are signaled with turn signal (blinker) lamps and the like from a predetermined distance before the intersection, as specified by the Road Traffic Act, or from a predetermined time before entering the intersection.
  • The course display image is displayed with the same timing as the turn signal lamps.
  • The road surface drawing device continues to project the specific image, such as a right turn arrow or a left turn arrow, onto the road surface until the vehicle finishes turning. That is, when the steering angle of the vehicle's steering device is turned in the right turn direction, the road surface drawing device continues to display the right turn arrow until the steering angle returns to the straight-ahead direction.
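  • As a rough illustration only, the Python sketch below mimics the prior-art behavior just described: the arrow is displayed when the turn signal is operated and continues to be displayed until the steering angle returns to approximately straight. The threshold value and all names are illustrative assumptions and are not taken from Patent Document 2.

```python
# Prior-art style arrow selection: keyed only on the turn signal and the
# steering angle, never on the vehicle's position within the intersection.
STRAIGHT_TOLERANCE_DEG = 5.0  # assumed tolerance for "steering returned to straight"

def prior_art_image(turn_signal: str | None, steering_angle_deg: float) -> str | None:
    """turn_signal is "right", "left", or None; positive angles mean steering to the right."""
    if turn_signal == "right" or steering_angle_deg > STRAIGHT_TOLERANCE_DEG:
        return "right_turn_arrow"
    if turn_signal == "left" or steering_angle_deg < -STRAIGHT_TOLERANCE_DEG:
        return "left_turn_arrow"
    return None  # turn finished and steering back to straight: stop projecting
```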
  • However, the technique of Patent Document 1 only presents the driving state and driving schedule of each individual driving support vehicle, even when a plurality of driving support vehicles travel on a public road. As the amount of information displayed across the entire traffic scene increases, it becomes difficult to grasp the overall situation.
  • Accordingly, an object of the present disclosure is to provide a vehicle driving support system capable of supporting smooth traffic even when a plurality of vehicles merge at an intersection.
  • Further, in Patent Document 1, since information is transmitted by visually recognizing an image drawn on the road surface, a pedestrian or the driver of another vehicle must recognize the image and then choose an action, so the behavior of the vehicle equipped with the drawing device tends to be prioritized.
  • Accordingly, another object of the present disclosure is to provide a vehicle driving support system that can actively act on a vehicle traveling using driving support technology.
  • In the situation assumed by Patent Document 2, in particular at an intersection on a highway, a plurality of vehicles may stop in a line, one behind another, while waiting for a traffic light.
  • The vehicle at the head of the line can display the course display image on the road surface in front of it, but the second and subsequent vehicles cannot secure sufficient space between themselves and the vehicle in front to display the course display image. As a result, the light of the course display image of a second or subsequent vehicle may be reflected by the body, bumper, or other parts of the vehicle in front, and the image may not be drawn accurately on the road surface.
  • Accordingly, it is an object of the present disclosure to provide a vehicle driving support system that controls the display of the course display image so that the course display image of the own vehicle does not optically affect a preceding vehicle or another vehicle located in front of the own vehicle.
  • A conventional road surface drawing device always projects the same image regardless of the position of the vehicle 10E within the intersection. For example, when the vehicle 10E turns right, the road surface drawing device displays an image A of a right-curving arrow on the road surface in front of the vehicle 10E before the vehicle 10E enters the intersection (see FIG. 33A).
  • In this case, the road surface display image A may be far from a pedestrian B standing at a position facing the vehicle 10E across the intersection, or may be hidden behind another vehicle and therefore not visible.
  • Furthermore, the same road surface display image A continues to be displayed even after the vehicle 10E has made most of its right turn and is facing almost in the rightward direction. As a result, although the vehicle 10E actually only needs to bend slightly further to the right, the displayed image is still an arrow making a large turn to the right. That is, as shown in FIGS. 33C and 33D, the traveling direction (course) of the vehicle 10E may not match the direction of the arrow in the road surface display image A.
  • In such a case, the road surface display image A may give the pedestrian B at the opposite position the mistaken impression that the vehicle 10E will turn further to the right.
  • In this way, a discrepancy between the traveling direction (course) of the vehicle and the direction of the arrow in the road surface display image A may mislead other vehicles and pedestrians B in the vicinity.
  • Accordingly, an object of the present disclosure is to provide a road surface drawing device that keeps the content of the course display image on the road surface consistent with the course of the vehicle as the direction of the turning vehicle changes during a right or left turn.
  • Patent Document 1 draws an image by irradiating the road surface with light, and other drivers and pedestrians recognize the drawn image by visually perceiving the light reflected from the road surface.
  • However, situations can occur in which the light of the road surface drawing is difficult to see, such as when the road surface is exposed to direct sunlight in fine weather, when the road surface is wet in rainy weather, or when the road surface is illuminated by the headlights of other vehicles at night.
  • Accordingly, an object of the present disclosure is to provide a road and a vehicle driving support system that improve the visibility of road surface drawing.
  • In order to achieve the above object, the vehicle driving support system of the present disclosure includes a vehicle motion grasping unit that acquires the motion of a first vehicle traveling on a road as first motion information and acquires the motion of a second vehicle traveling on the road as second motion information, a guidance information creation unit that creates guidance information indicating a route on which the first vehicle or the second vehicle is scheduled to travel based on the first motion information and the second motion information, and a guidance information presentation unit that presents the guidance information to the first vehicle or the second vehicle.
  • The guidance information creation unit creates the guidance information when the first motion information and the second motion information include course changes in the same direction, one of which is a right turn motion and the other a left turn motion.
  • In the vehicle driving support system of the present disclosure, the vehicle motion grasping unit acquires the motions of the first vehicle and the second vehicle, and when the first motion information and the second motion information indicate a left turn and a right turn that change course in the same direction, that is, when the vehicles merge, the guidance information creation unit creates guidance information and the guidance information presentation unit presents it. The present disclosure therefore makes it possible to support smooth traffic while ensuring safety even when a plurality of vehicles traveling under driving assistance technology merge at an intersection.
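  • A minimal Python sketch of this merge check is shown below. The class, function names, and the tolerance value are hypothetical and do not come from the disclosure; it only illustrates the stated condition (one left turn, one right turn, resulting courses in the same direction).

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class MotionInfo:
    turn: str                   # "left", "right", or "straight"
    heading_after_turn: float   # planned heading after the turn, in degrees

def is_merge(first: MotionInfo, second: MotionInfo, tol_deg: float = 20.0) -> bool:
    """True when one vehicle plans a left turn, the other a right turn,
    and both end up heading in approximately the same direction."""
    diff = abs((first.heading_after_turn - second.heading_after_turn + 180.0) % 360.0 - 180.0)
    return {first.turn, second.turn} == {"left", "right"} and diff <= tol_deg

def create_guidance(first: MotionInfo, second: MotionInfo) -> Optional[dict]:
    """Create guidance information only when the two motions describe a merge."""
    if not is_merge(first, second):
        return None
    # e.g. give the left-turning vehicle priority (see the operation example later)
    return {"priority": "first" if first.turn == "left" else "second"}
```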
  • In order to achieve the above object, the vehicle driving support system of the present disclosure includes a receiving side vehicle having a detection unit that detects light from outside the vehicle and a driving support unit that supports driving according to the result of the detection unit, and a light irradiation unit that irradiates the detection unit with a predetermined optical signal. When the detection unit detects the optical signal, the driving support unit causes the receiving side vehicle to execute a stop operation or a deceleration operation.
  • In the vehicle driving support system of the present disclosure, the receiving side vehicle executes a stop or deceleration operation when its detection unit is irradiated with the optical signal from the light irradiation unit, which makes it possible to actively act on a vehicle traveling under driving support technology.
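  • The following sketch shows, under assumed names and an assumed signal encoding, how a receiving side vehicle's driving support unit might react once the detection unit reports the predetermined optical signal; it is illustrative only and not the patent's implementation.

```python
class DrivingSupportUnit:
    def __init__(self, motion_controller):
        self.motion_controller = motion_controller  # assumed vehicle motion control interface

    def on_light_detected(self, signal_pattern: str) -> None:
        """Called by the detection unit when incoming light is decoded."""
        if signal_pattern == "STOP":
            self.motion_controller.brake(full_stop=True)    # stop operation
        elif signal_pattern == "SLOW":
            self.motion_controller.brake(full_stop=False)   # deceleration operation
        # other light (sunlight, ordinary headlights) carries no recognized pattern and is ignored
```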
  • In order to achieve the above object, the vehicle driving support system of the present disclosure includes a vehicle course acquisition unit that acquires course information of the vehicle, a vehicle state information acquisition unit that acquires the position and speed of the vehicle as vehicle state information, an other vehicle detection unit that detects the presence of another vehicle in the vicinity of the vehicle and acquires detection information, an image selection unit that selects a course display image to be projected onto the road surface around the vehicle based on the course information, the vehicle state information, and the detection information, and a road surface drawing unit that projects the course display image onto the road surface.
  • Based on the detection information and the vehicle state information, the image selection unit does not select a course forward display image when it determines that another vehicle exists within a first distance in front of the vehicle, and selects the course forward display image when it determines that no other vehicle exists within the first distance. The road surface drawing unit projects the course forward display image selected by the image selection unit onto the road surface in front of the vehicle.
  • In the vehicle driving support system of the present disclosure, the road surface drawing unit projects the course forward display image onto the road surface ahead only when there is no other vehicle within the first distance in front of the vehicle. Therefore, even when a plurality of vehicles are lined up at an intersection or the like, the display of the course display image can be controlled so that it does not optically affect a preceding vehicle or another vehicle located in front of the own vehicle.
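  • A small sketch of this gating rule is given below, assuming a hypothetical value for the first distance and illustrative function names; the patent itself does not fix a numeric distance.

```python
FIRST_DISTANCE_M = 10.0  # assumed value of the "first distance"

def select_course_image(course: str, dist_to_vehicle_ahead: float | None) -> str | None:
    """Return an image identifier for the road surface drawing unit, or None to suppress drawing."""
    if dist_to_vehicle_ahead is not None and dist_to_vehicle_ahead <= FIRST_DISTANCE_M:
        return None                       # another vehicle within the first distance: do not select
    return f"course_forward_{course}"     # e.g. "course_forward_right_turn"

def draw(road_surface_unit, image_id: str | None) -> None:
    """Project only when a course forward display image was actually selected."""
    if image_id is not None:
        road_surface_unit.project(image_id)
```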
  • In order to achieve the above object, the road surface drawing device of the present disclosure includes a vehicle course acquisition unit that acquires course information of the vehicle, a vehicle state information acquisition unit that acquires the position and direction of the vehicle as vehicle state information, an image selection unit that selects a course display image to be projected on the road surface in front of the vehicle based on the course information and the vehicle state information, and a road surface drawing unit that projects the course display image onto the road surface in front of the vehicle.
  • The image selection unit selects a course display image including a line indicating a second direction from the vehicle, and reselects the course display image as the vehicle state information changes.
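  • As an illustration of how the course display image could be reselected while the vehicle state information changes, the sketch below chooses an arrow variant from the remaining turn angle; the thresholds, sign convention, and image names are assumptions, not taken from the disclosure.

```python
def remaining_turn(current_heading_deg: float, target_heading_deg: float) -> float:
    """Signed remaining rotation in degrees, wrapped to [-180, 180); positive = right turn remaining."""
    return (target_heading_deg - current_heading_deg + 180.0) % 360.0 - 180.0

def select_arrow(current_heading_deg: float, target_heading_deg: float) -> str:
    rem = remaining_turn(current_heading_deg, target_heading_deg)
    if abs(rem) < 15.0:
        return "arrow_straight"           # turn almost finished: show a nearly straight arrow
    if abs(rem) < 60.0:
        return "arrow_right_gentle" if rem > 0 else "arrow_left_gentle"
    return "arrow_right_sharp" if rem > 0 else "arrow_left_sharp"

# Call select_arrow() every time the vehicle state information (position, direction) is updated,
# so that the projected image stays consistent with the vehicle's actual course.
```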
  • In order to achieve the above object, the road of the present disclosure includes a pavement surface, a phosphor-containing layer formed on the pavement surface and containing a phosphor material that is excited by primary light and emits secondary light different from the primary light, and a coating layer formed on the phosphor-containing layer and transmitting the primary light and the secondary light.
  • In the road of the present disclosure, an image is drawn by irradiating the phosphor-containing layer with primary light.
  • The wavelength of the primary light is converted in the phosphor-containing layer and secondary light is emitted, so that the visibility of the road surface drawing is improved.
  • Further, the vehicle driving support system of the present disclosure includes the road described above and a light irradiation unit that irradiates the road with the primary light.
  • According to the present disclosure, it is also possible to provide a vehicle driving support system that controls the display of the course display image so that the course display image of the own vehicle does not optically affect another vehicle located in front of the own vehicle.
  • FIG. 1 is a schematic view showing a road 1 and a vehicle driving support system 100 according to the first embodiment.
  • FIG. 2 is a schematic cross-sectional view showing the structure of the road 1 according to the first embodiment.
  • FIG. 3 is a block diagram showing the configuration of the vehicle driving support system 100.
  • FIG. 4A is an example of the presentation of guidance information in the vehicle driving support system 100, and is a schematic diagram showing the irradiation of light from the infrastructure side device 30.
  • FIG. 4B is an example of guidance information presentation in the vehicle driving support system 100, and is a schematic view showing light irradiation from the first vehicle 10 or the second vehicle 20.
  • FIG. 5 is a schematic cross-sectional view showing the structure of a modified example of the road 1.
  • FIG. 6 is a flowchart showing an operation example of the vehicle driving support system 100 according to the first embodiment.
  • FIG. 7 is a schematic view showing an operation example of the vehicle driving support system 200 according to the second embodiment on the road.
  • FIG. 8 is a schematic view showing an example of operation of the vehicle driving support system 300 according to the third embodiment on the road.
  • FIG. 9 is a schematic view showing an example of operation of the vehicle driving support system 400 according to the fourth embodiment on the road.
  • FIG. 10 is a schematic view showing an example of operation of the vehicle driving support system 500 according to the fifth embodiment on the road.
  • FIG. 11 is a schematic view showing an operation example of the vehicle driving support system 600 according to the sixth embodiment on the road.
  • FIG. 12 is a block diagram showing the configuration of the vehicle driving support system 600.
  • FIG. 13A is an example of light irradiation in the vehicle driving support system 600, and is a schematic view showing light irradiation from the transmitting side vehicle 20C.
  • FIG. 13B is an example of light irradiation in the vehicle driving support system 600, and is a schematic view showing light irradiation from the infrastructure side device 30.
  • FIG. 13C is an example of light irradiation in the vehicle driving support system 600, and is a schematic view showing light irradiation from a portable electronic device 40 held by a pedestrian.
  • FIG. 14 is a flowchart showing an operation example of the vehicle driving support system 600.
  • FIG. 15 is a schematic view showing an operation example of the vehicle driving support system 700 according to the seventh embodiment on the road.
  • FIG. 16 is a flowchart showing an operation example of the vehicle driving support system 700.
  • FIG. 17A is a front view of a vehicle traveling by using the vehicle driving support system 800 according to the eighth embodiment.
  • FIG. 17B is a rear view of a vehicle traveling by using the vehicle driving support system 800 according to the eighth embodiment.
  • FIG. 18 is a block diagram showing a configuration of the vehicle driving support system 800 according to the eighth embodiment.
  • FIG. 19 is a flowchart showing a flow of processing in which the vehicle driving support system 800 according to the eighth embodiment displays a course display image.
  • FIG. 20 is a schematic view showing a course and a course display image at an intersection of a plurality of vehicles according to the eighth embodiment.
  • FIG. 21 is a schematic view showing a course display image at an intersection of a vehicle of a ninth embodiment and a two-wheeled vehicle (another vehicle).
  • FIG. 22 is a schematic view showing the relationship between the vehicle of the tenth embodiment, the two-wheeled vehicle (other vehicle), and the infrastructure side device at the intersection.
  • FIG. 23 is a block diagram of the road surface drawing device 900 according to the eleventh embodiment.
  • FIG. 24 is a flowchart showing a flow of processing in which the road surface drawing device 900 according to the eleventh embodiment displays the course display image 6E.
  • FIG. 25 is a schematic view showing a course 7E and a course display image 6E when turning right at an intersection of the vehicle 10E in the eleventh embodiment.
  • FIG. 26 is a schematic view showing a method of selecting the course display image 6E in the eleventh embodiment.
  • FIG. 27A is a schematic view showing a course display image 6E at the position of the vehicle 10E in the intersection at the time of turning right in the eleventh embodiment, and shows a state before the vehicle 10E enters the intersection.
  • FIG. 27B is a schematic view showing a course display image 6E at the vehicle 10E position in the intersection when turning right in the eleventh embodiment, and shows a state in which the vehicle 10E is located near the center of the intersection.
  • FIG. 27C is a schematic view showing a course display image 6E at the position of the vehicle 10E in the intersection at the time of turning right in the eleventh embodiment, and shows a state in which the vehicle 10E has finished turning through the intersection.
  • FIG. 28A is a schematic view showing a course display image 6E at the time of a right turn in the twelfth embodiment according to the position of the vehicle 10E, and shows a state before the vehicle 10E enters the intersection.
  • FIG. 28B is a schematic view showing a course display image 6E at the time of a right turn in the twelfth embodiment according to the position of the vehicle 10E, and shows a state in which the vehicle 10E is located near the center of the intersection.
  • FIG. 28C is a schematic view showing a course display image 6E at the time of a right turn in the twelfth embodiment according to the position of the vehicle 10E, and shows a state in which the vehicle 10E is located near the center of the intersection.
  • FIG. 28D is a schematic view showing the course display image 6E at the time of turning right in the twelfth embodiment according to the position of the vehicle 10E, and shows the vehicle 10E at a position where it has completely turned through the intersection.
  • FIG. 29A shows a modified example of the course display image 6E in the twelfth embodiment, and is a schematic view showing the course display image 6A at the position where the vehicle 10E enters the intersection.
  • FIG. 29B shows a modified example of the course display image 6E in the twelfth embodiment, and is a schematic view showing the course display image 6EB when the vehicle 10E is located near the center of the intersection.
  • FIG. 29C shows a modified example of the course display image 6E in the twelfth embodiment, and is a schematic view showing the course display image 6EC when the vehicle 10E is located near the center of the intersection.
  • FIG. 29D shows a modified example of the course display image 6E in the twelfth embodiment, and is a schematic view showing the course display image 6ED at a position where the vehicle 10E has completely turned through the intersection.
  • FIG. 30A shows the course display image 6E of the thirteenth embodiment, and is a schematic view of the intersection showing the course display image 6E at the intersection as time passes.
  • FIG. 30B shows the course display image 6E of the thirteenth embodiment, and is a schematic perspective view showing the vehicle 10E and the projection of the course display image 6E at the intersection as time passes.
  • FIG. 30C shows the course display image 6E of the thirteenth embodiment, and is a schematic perspective view showing a state of the vehicle 10E and the projection of the course display image 6E at the intersection as time passes.
  • FIG. 30D shows the course display image 6E of the thirteenth embodiment, and is a schematic perspective view showing a state of the vehicle 10E and the projection of the course display image 6E at the intersection as time passes.
  • FIG. 31A is a schematic view showing a course display image at an intersection in the fourteenth embodiment, and shows the course display image 61E at the time when the vehicle 10E, which is about to turn right, enters the intersection.
  • FIG. 31B is a schematic view showing a course display image at an intersection in the fourteenth embodiment, and shows the course display image 62E after the vehicle 10E detects a pedestrian.
  • FIG. 32A is a schematic view showing a course display image 6E when traveling straight in the fifteenth embodiment.
  • FIG. 32B is a schematic view showing a course display image 6E at the time of turning right in the fifteenth embodiment.
  • FIG. 32C is a schematic view showing a course display image 6E at the time of a left turn in the fifteenth embodiment.
  • FIG. 33A is a schematic view showing a course display image of the vehicle position in the intersection at the time of a right turn in the conventional example, and shows a state before the vehicle 10E enters the intersection.
  • FIG. 33B is a schematic view showing a course display image of the vehicle position in the intersection at the time of a right turn in the conventional example, and shows a state in which the vehicle 10E is located near the center front of the intersection.
  • FIG. 33C is a schematic view showing a course display image of the vehicle position in the intersection at the time of a right turn in the conventional example, and shows a state in which the vehicle 10E is located near the center of the intersection.
  • FIG. 33D is a schematic view showing a course display image according to the vehicle position in the intersection when turning right in the conventional example, and shows the vehicle 10E at a position where it has completely turned through the intersection.
  • FIG. 1 is a schematic view showing a road 1 and a vehicle driving support system 100 according to the present embodiment.
  • FIG. 1 shows an intersection where roads 1 cross and a sidewalk 2 is provided.
  • The first vehicle 10 and the second vehicle 20 are traveling on the road 1 facing each other.
  • The infrastructure side device 30 is arranged in the area of the sidewalk 2.
  • The first vehicle 10 travels straight on the road 1 from the lower part to the upper part of the figure and then turns left, while the second vehicle 20 travels straight from the upper part to the lower part of the figure and then turns right.
  • Normally, the left turn of the first vehicle 10 has priority, but depending on the situation of the first vehicle 10, the number of lanes, the speed and distance on the road 1 at the merging point, the congestion of straight-ahead traffic, and the like, the second vehicle 20 may cross in front of the first vehicle 10 and turn right.
  • FIG. 1 shows driving conditions in a region where traffic is regulated to keep to the left; in a region where traffic is regulated to keep to the right, the second vehicle 20 would turn left and the first vehicle 10 would turn right.
  • A phosphor-coated region R is formed at the intersection and within a predetermined distance L from the intersection.
  • The predetermined distance L is, for example, a range extending 30 m from the intersection, corresponding to the distance at which a course change is indicated with a turn signal.
  • In FIG. 1, the phosphor-coated region R is shown with hatching, but as will be described later, the phosphor-coated region R is uncolored.
  • On the road 1 within the phosphor-coated region R, guidance information M1 is drawn on the road surface in front of the first vehicle 10 and guidance information M2 is drawn on the road surface in front of the second vehicle 20, so that the guidance information M1 and M2 are presented. Further, road surface information M3 such as a pedestrian crossing and a stop line is drawn on the road 1 in the phosphor-coated region R, and guidance information M4 is also drawn on the sidewalk 2.
  • The guidance information M1 has the shape of an arrow bending to the left from the traveling direction of the first vehicle 10, and the guidance information M2 has the shape of an arrow bending to the right from the traveling direction of the second vehicle 20.
  • FIG. 1 shows an example of drawing an arrow image on a road surface as a method of presenting guidance information, but the shape of the image is not limited, and the drawing is not limited to the road surface.
  • the image may contain characters and icons.
  • the method of presenting the guidance information may be an image display using an image display device or a head-up display mounted on the vehicle, or may be voice guidance.
  • Road 1 is a route on which vehicles travel, and may be a paved road, an unpaved road, a public road, or a private road. Further, FIG. 1 shows an example of a crossroad having one lane on each side and an oncoming lane, but the number of lanes on one side and the shape of the intersection are not limited.
  • the road 1 and the vehicle driving support system 100 are not limited to the inside of the intersection.
  • The vehicle driving support system 100 is also used where roads intersect in other configurations, such as a three-way junction or a five-way junction.
  • the vehicle driving support system 100 is not limited to turning left or right on the road, and is also used when turning left or right to cross an oncoming lane and enter private land such as a parking lot or a store.
  • The sidewalk 2 is a space provided along the road 1 on which vehicles travel; vehicles do not travel on the sidewalk 2, and it is an area through which pedestrians pass. If the sidewalk 2 is not clearly separated from the road 1, the sidewalk 2 may be a roadside strip of the road 1. Further, the vehicle driving support system 100 does not necessarily have to include the sidewalk 2.
  • FIG. 1 only shows the sidewalk 2 as an example of an area where the infrastructure side device 30 is arranged, an area where a pedestrian walks, and an area where a portable electronic device possessed by a pedestrian is arranged.
  • the first vehicle 10 and the second vehicle 20 are vehicles traveling on the road 1.
  • the first vehicle 10 and the second vehicle 20 are preferably driving support vehicles in which a part of steering control and acceleration / deceleration control is performed by a computer or the like.
  • the vehicle driving support system 100 of the present embodiment can provide driving support even for a manually driven vehicle that does not have the driving support technology.
  • each vehicle includes a situation grasping unit, a driving support unit, a vehicle motion control unit, and an information / communication unit.
  • the situation grasping unit acquires information on the running condition and surrounding conditions of the vehicle.
  • The situation grasping unit may include various sensor devices that realize the driving support function, such as a vehicle speed sensor, a position sensor, an imaging device, a laser ranging device, and LIDAR (Light Detection and Ranging).
  • The traveling state acquired by the situation grasping unit includes the traveling speed, the position, the direction of the vehicle body, the steering angle, brake operation, the traveling route from the car navigation system, direction instructions recognized from conversation in the vehicle, and the like.
  • The surrounding conditions acquired by the situation grasping unit include road surface conditions, ambient temperature, road maps from the car navigation system, road gradients, detection of surrounding objects by image recognition, distances to and behavior prediction of preceding vehicles, oncoming vehicles, and following vehicles, detection of pedestrians by image recognition, and the like.
  • the driving support unit processes the information acquired by the situation grasping unit and outputs a driving control signal for supporting the driving of the vehicle to the vehicle motion control unit.
  • the vehicle motion control unit executes vehicle steering control and acceleration / deceleration control based on the driving control signal output from the driving support unit.
  • the vehicle motion control unit has a driving support function.
  • The vehicle motion control unit supports the driving of the first vehicle 10 or the second vehicle 20 by adjusting the output of the power source, operating the brakes, changing the steering angle, displaying driving guidance, controlling the lighting of the turn signals and the stop lamps, and so on.
  • the information and communication unit is connected to the driving support unit and the situation grasping unit, and performs information communication with the communication unit provided outside the vehicle.
  • the information and communication unit performs vehicle-to-vehicle communication between vehicles and road-to-vehicle communication with equipment provided on the road.
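  • The rough sketch below shows one way the four on-board units described above could be wired together. All class names and method signatures are illustrative assumptions and do not appear in the disclosure.

```python
class SituationGraspingUnit:
    def traveling_state(self) -> dict: ...   # speed, position, heading, steering angle, route, ...
    def surroundings(self) -> dict: ...      # road surface, nearby vehicles, pedestrians, ...

class VehicleMotionControlUnit:
    def apply(self, control_signal: dict) -> None: ...   # throttle, brake, steering, lamps

class InformationCommunicationUnit:
    def exchange(self, payload: dict) -> dict: ...       # vehicle-to-vehicle / road-to-vehicle

class DrivingSupportUnit:
    """Processes sensed information and outputs a driving control signal."""
    def __init__(self, sensing: SituationGraspingUnit,
                 motion: VehicleMotionControlUnit,
                 comms: InformationCommunicationUnit):
        self.sensing, self.motion, self.comms = sensing, motion, comms

    def step(self) -> None:
        state = self.sensing.traveling_state()
        env = self.sensing.surroundings()
        # a trivial placeholder policy: brake when an obstacle ahead was reported
        control_signal = {"brake": env.get("obstacle_ahead", False), "state": state}
        self.motion.apply(control_signal)
```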
  • the infrastructure side device 30 is provided on the road 1 or the sidewalk 2. As will be described later, the infrastructure side device 30 may irradiate the road surface of the road 1 with light to draw an image.
  • the specific configuration of the infrastructure side device 30 is not limited, and a dedicated lighting device may be arranged.
  • Alternatively, existing infrastructure equipment such as street lights, traffic signals, and electronic bulletin boards may be given the function of irradiating light.
  • the infrastructure side device 30 may include an information communication unit for information communication with the first vehicle 10 or the second vehicle 20, and may enable road-to-vehicle communication with the vehicle.
  • the infrastructure side device 30 may include a situation grasping unit and a driving support unit in the same manner as the driving support vehicle.
  • FIG. 2 is a schematic cross-sectional view showing the structure of the road 1 according to the first embodiment.
  • a pavement surface 3 is formed on the ground of the road 1, and a phosphor-containing layer 4 and a coating layer 5 are laminated on the pavement surface 3.
  • In FIG. 2, the road 1 and the pavement surface 3 are schematically shown as a two-layer structure for simplicity, but the road 1 and the pavement surface 3 may have a laminated structure including a road body, a roadbed, a constructed roadbed, a lower roadbed layer, an upper roadbed layer, a base layer, a surface layer, and the like.
  • the phosphor-containing layer 4 and the coating layer 5 may be formed by being laminated on the sidewalk 2.
  • the pavement surface 3 is a layer corresponding to the surface layer of the road 1 in a region other than the phosphor-containing region R, and is exposed to the surface and comes into contact with the tires of the first vehicle 10 and the second vehicle 20.
  • the material of the pavement surface 3 is not limited, and may be asphalt, concrete, interlocking, wood, brick, or the like.
  • the phosphor-containing layer 4 contains fine particles of a phosphor material that are excited by primary light and emit secondary light having a wavelength different from that of the primary light. Further, the phosphor-containing layer 4 contains a dispersion medium for dispersing the fluorescent material fine particles, and the fluorescent material fine particles are uniformly dispersed in the dispersion medium.
  • The phosphor-containing layer 4 is colored in a color corresponding to the absorption band of the phosphor material, but by reducing the concentration of the phosphor fine particles in the dispersion medium, the phosphor-containing layer 4 as a whole can also appear uncolored when viewed from a distance.
  • the material of the dispersion medium is not limited, but is a material that transmits primary light and secondary light, and includes acrylic resin, epoxy resin, silicone resin, polycarbonate, and the like.
  • the phosphor-containing layer 4 may contain a light scattering material for scattering primary light and secondary light.
  • The light scattering material contains fine particles having a refractive index different from that of the dispersion medium, and may be, for example, SiO2 or TiO2.
  • the fluorescent material contained in the fluorescent material-containing layer 4 is not limited, and a plurality of types of fluorescent materials may be included.
  • For example, the phosphor-containing layer 4 may emit amber light as the secondary light.
  • Alternatively, the phosphor-containing layer 4 may contain a plurality of phosphor materials that are excited by green light and emit blue light and red light so that the mixed color of the primary light and the secondary light becomes white.
  • Further, the phosphor-containing layer 4 may contain a plurality of phosphor materials that are excited by purple light and emit green light, blue light, or red light.
  • Examples of phosphor materials include a YAG-based phosphor that emits yellow light ((Y,Gd)3(Al,Ga)5O12:Ce), a CASN-based phosphor that emits red light (CaAlSiN3:Eu, etc.), and a β-SiAlON-based phosphor that emits green light (Si6-zAlzOzN8-z, etc.).
  • The coating layer 5 is a layer formed so as to cover the surface of the phosphor-containing layer 4, and is made of a material that transmits the primary light and the secondary light.
  • The material of the coating layer 5 is not limited, and may include acrylic resin, epoxy resin, silicone resin, polycarbonate, and the like. Further, the coating layer 5 is not limited to one made entirely of a light-transmitting material, and may partially include a light-shielding material. Since the coating layer 5 covers the surface of the phosphor-containing layer 4 and comes into contact with the tires of vehicles traveling on the road 1, it protects the phosphor-containing layer 4, improves the durability of the road 1, and secures frictional force.
  • FIG. 3 is a block diagram showing the configuration of the vehicle driving support system 100.
  • The vehicle driving support system 100 includes the first vehicle 10, the second vehicle 20, a vehicle motion grasping unit 110, a guidance information creation unit 120, and a guidance information presentation unit 130.
  • The vehicle motion grasping unit 110, the guidance information creation unit 120, the driving support unit, and the information grasping unit may execute predetermined information processing by means of a program recorded in advance on a computer equipped with a central processing unit (CPU), a memory, an external storage device, and the like.
  • The vehicle motion grasping unit 110 grasps the motions of the first vehicle 10 and the second vehicle 20 in the area of the road 1, and acquires their respective situations and future operation schedules as the first motion information and the second motion information.
  • The vehicle motion grasping unit 110 is composed of the situation grasping units, driving support units, and vehicle motion control units provided in the first vehicle 10, the second vehicle 20, and the infrastructure side device 30 described above, or a combination thereof.
  • When the vehicle motion grasping unit 110 is configured in a vehicle, it grasps the running state of the own vehicle from the operation of the direction indicators, the route information of the car navigation system, map information, conversation in the vehicle, and the like. Further, the vehicle motion grasping unit 110 grasps the surrounding conditions by recognizing images captured by the imaging unit, for example the direction indicator operation of an oncoming vehicle, the content of images drawn on the road surface, and the presence or absence of pedestrians and two-wheeled vehicles.
  • When the vehicle motion grasping unit 110 is configured in the infrastructure side device 30, it predicts the motions of the first vehicle 10 and the second vehicle 20 by recognizing images captured by the imaging unit provided in the infrastructure side device 30.
  • Alternatively, the vehicle motion grasping unit 110 predicts the motions of the first vehicle 10 and the second vehicle 20 by acquiring, through the information and communication unit, the information held by the situation grasping unit, the driving support unit, and the vehicle motion control unit provided in each vehicle.
  • The guidance information creation unit 120 creates guidance information M1 and M2 indicating the routes on which the first vehicle 10 and the second vehicle 20 are scheduled to travel, based on the first motion information and the second motion information acquired by the vehicle motion grasping unit 110.
  • the guidance information may include the timing at which the first vehicle 10 and the second vehicle 20 make a left turn and a right turn, respectively, and the lane position in the merging lane after the left turn and the right turn.
  • the guidance information presentation unit 130 is a road surface drawing device that draws the guidance information M1 and M2 and the road surface information M3 created by the guidance information creation unit 120 in the phosphor-containing region R of the road 1.
  • the guidance information presentation unit 130 may draw road surface information M3 such as a pedestrian crossing or a stop line, or guidance information M4 on the sidewalk 2.
  • The guidance information presentation unit 130 presents the guidance information to the first vehicle 10 and the second vehicle 20 by drawing the guidance information M1 and M2 on the road surface, but the content of the image is not limited.
  • the presentation of the guidance information may include a display using an image display device in the vehicle and voice guidance.
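  • Based on the contents mentioned above (turn timing, post-turn lane, and the possible presentation channels), one possible shape of a guidance information record is sketched below; the field names are assumptions.

```python
from dataclasses import dataclass

@dataclass
class GuidanceInfo:
    vehicle_id: str        # "first" or "second"
    turn: str              # "left" or "right"
    turn_timing_s: float   # when the turn should be executed, in seconds from now
    target_lane: str       # lane to use after the turn, e.g. "left" or "right"
    presentation: str      # "road_surface", "in_vehicle_display", "head_up_display", or "voice"

# Example: guide the first vehicle to turn left in 3 s and use the left lane afterwards,
# presenting the guidance as a road surface drawing (corresponding to guidance information M1).
m1 = GuidanceInfo("first", "left", 3.0, "left", "road_surface")
```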
  • FIGS. 4A and 4B are schematic views showing an example of guidance information presentation in the vehicle driving support system 100.
  • FIG. 4A shows the irradiation of light from the infrastructure side device 30.
  • FIG. 4B shows the irradiation of light from the first vehicle 10 or the second vehicle 20.
  • In the example of FIG. 4A, the infrastructure side device 30 has a light irradiation unit 31, and in the example of FIG. 4B, the first vehicle 10 and the second vehicle 20 have light irradiation units 11 and 21.
  • the light irradiation units 11, 21, and 31 are examples of the guidance information presentation unit 130.
  • the light irradiation units 11, 21, and 31 project and draw the guidance information M1 and M2 on the road surface of the road 1 based on the guidance information created by the guidance information creation unit 120.
  • The light irradiation units 11, 21, and 31 also project and draw the guidance information M4 and the road surface information M3 onto the road surface of the road 1 or the sidewalk 2 based on the guidance information created by the guidance information creation unit 120.
  • When the primary light is irradiated from the light irradiation units 11, 21, and 31 onto the phosphor-coated region R, the primary light passes through the coating layer 5 and reaches the phosphor-containing layer 4. Since the phosphor-containing layer 4 contains a phosphor material, at least a part of the primary light is wavelength-converted into the secondary light. The secondary light and the primary light that has not been wavelength-converted pass through the coating layer 5 and are emitted to the outside of the road 1. Therefore, in the phosphor-coated region R, the shapes of the guidance information M1 and M2 and the road surface information M3 are displayed in the color obtained by mixing the primary light and the secondary light. At this time, since the primary light and the secondary light are scattered by the phosphor fine particles and the light scattering material contained in the phosphor-containing layer 4, the light distribution characteristics become isotropic, and visibility is improved at various positions on the road 1 and the sidewalk 2.
  • The secondary light is produced when at least a part of the primary light is wavelength-converted by the phosphor material contained in the phosphor-containing layer 4.
  • The guidance information M1 and M2 and the road surface information M3 are displayed in a color different from the light emitted onto the road 1 from the light irradiation units 11, 21, and 31, from the sun, or from the surrounding environment, and the surface of the road 1 appears to be self-luminous, so that visibility is improved.
  • The primary light is preferably green light or purple light having a short wavelength.
  • A white light emitting device that combines a blue LED with a yellow phosphor emits light containing blue wavelengths. Therefore, if the phosphor material contained in the phosphor-containing layer 4 were excited by blue light as the primary light, secondary light could also be emitted in response to the blue light contained in headlights or illumination lamps. For this reason, blue light is not preferable as the wavelength of the primary light.
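  • The wavelength preference above can be summarized as a simple check: the primary light should lie outside the blue band emitted by common white LED headlights so that the phosphor is not excited unintentionally. The numeric band below is a rough illustrative value, not taken from the disclosure.

```python
HEADLIGHT_BLUE_NM = (440.0, 470.0)  # assumed blue peak range of blue-LED plus yellow-phosphor lamps

def acceptable_primary(wavelength_nm: float) -> bool:
    """True if the candidate primary wavelength avoids the headlight blue band."""
    lo, hi = HEADLIGHT_BLUE_NM
    return not (lo <= wavelength_nm <= hi)

# acceptable_primary(405.0) -> True  (purple), acceptable_primary(530.0) -> True (green),
# acceptable_primary(450.0) -> False (blue would also be supplied by headlights)
```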
  • As described above, the road 1 and the vehicle driving support system 100 of the present embodiment include the phosphor-containing layer 4 and the coating layer 5 covering the pavement surface 3, and draw an image by irradiating the phosphor-containing layer 4 with primary light. The wavelength of the primary light is converted in the phosphor-containing layer 4 to emit secondary light, and the visibility of the road surface drawing is improved.
  • FIG. 5 is a schematic cross-sectional view showing the structure of the road 1 according to the present embodiment.
  • As shown in FIG. 5, the pavement surface 3 is formed on the ground of the road 1, and the phosphor-containing layer 4, the adhesive layer 6, and the coating layer 7 are laminated on the pavement surface 3.
  • the adhesive layer 6 is a member that is interposed between the phosphor-containing layer 4 and the coating layer 7 to bond the two, and may be an adhesive.
  • The coating layer 7 is a plate-shaped member formed on the adhesive layer 6, and includes a cover member having fine unevenness formed on the front surface and the back surface of the coating layer 7.
  • FIG. 5 shows an example in which a concavo-convex shape is formed on the front and back surfaces of the coating layer 7, but the concavo-convex shape may be formed on either the front surface side or the back surface side.
  • Both the adhesive layer 6 and the coating layer 7 are made of a material that transmits primary light and secondary light.
  • an uneven shape may be formed in advance when the plate-shaped member is formed, or an uneven shape may be formed on a flat surface by sandblasting or the like.
  • An adhesive is applied on the phosphor-containing layer 4, the plate-shaped coating layer 7 is placed on it, and the adhesive is cured to form the adhesive layer 6, so that the phosphor-containing layer 4 and the coating layer 7 are bonded to each other by the adhesive layer 6.
  • The shape and size of the unevenness formed on the coating layer 7 are not limited, but the unevenness should be of a size that scatters the primary light and the secondary light, and the width and height of the unevenness are preferably larger than the wavelengths of the primary light and the secondary light.
  • As a result, the light distribution characteristics of the primary light and the secondary light extracted from the phosphor-containing layer 4 become more isotropic, and visibility is improved at various positions on the road 1 and the sidewalk 2. Further, forming the unevenness on the surface of the coating layer 7 secures friction with the tires of traveling vehicles.
  • As described above, the road 1 and the vehicle driving support system 100 of the present embodiment include the phosphor-containing layer 4, the adhesive layer 6, and the coating layer 7 covering the pavement surface 3, and draw an image by irradiating the phosphor-containing layer 4 with primary light.
  • The wavelength of the primary light is converted in the phosphor-containing layer 4 to emit secondary light, and the visibility of the road surface drawing is improved.
  • Since the coating layer 7 is formed with unevenness, the light distribution characteristics of the primary light and the secondary light emitted from the phosphor-containing layer 4 to the outside become more isotropic, and visibility is further improved.
  • the first vehicle 10 or the second vehicle 20 is a driving support vehicle.
  • The first vehicle 10 or the second vehicle 20 draws on the road surface by irradiating the primary light from the light irradiation units 11 and 21 only while traveling in the phosphor-coated region R, and does not irradiate the primary light while traveling in other regions.
  • the first vehicle 10 or the second vehicle 20 is a driving support vehicle, and has a situation grasping unit as in the first embodiment.
  • the surrounding conditions acquired by the situation grasping unit include the position information of the phosphor coating region R on the road 1.
  • For example, the map information of the car navigation system includes the position information of the phosphor-coated region R, and the situation grasping unit detects the phosphor-coated region R at the current position or on the traveling route by collating the vehicle position with the map information.
  • the infrastructure-side device 30 arranged in the vicinity of the phosphor-coated region R may transmit the presence of the phosphor-coated region R to the first vehicle 10 or the second vehicle 20 by road-to-vehicle communication.
  • When the situation grasping unit detects the presence of the phosphor-coated region R, the light irradiation units 11 and 21 irradiate the road surface with the primary light. Since the phosphor-containing layer 4 is laminated in the phosphor-coated region R, the primary light is wavelength-converted into the secondary light by the phosphor material contained in the phosphor-containing layer 4, and the guidance information M1 and M2 are drawn on the road surface by the color mixture of the primary light and the secondary light.
  • In this way, the road surface is irradiated with the primary light from the light irradiation units 11 and 21 only in the phosphor-coated region R, and the primary light is not emitted in the other regions where the phosphor-containing layer 4 is not formed, so that power consumption can be reduced.
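  • The sketch below illustrates this gating of the primary-light irradiation to the phosphor-coated region R; the region geometry API and helper names are assumptions, while the idea of switching off outside the region follows the description above.

```python
def in_phosphor_region(vehicle_pos, region_polygons) -> bool:
    """True if the vehicle position lies inside any registered region R
    (e.g. an intersection plus its approach section)."""
    return any(poly.contains(vehicle_pos) for poly in region_polygons)

def update_irradiation(light_unit, vehicle_pos, region_polygons, guidance_image) -> None:
    if in_phosphor_region(vehicle_pos, region_polygons):
        light_unit.project(guidance_image)  # primary light is wavelength-converted by layer 4
    else:
        light_unit.off()                    # no drawing outside region R, saving power
```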
  • FIG. 6 is a flowchart showing an operation example of the vehicle driving support system 100 according to the first embodiment.
  • the vehicle motion grasping unit 110 acquires the first motion information of the first vehicle 10 and acquires the second motion information of the second vehicle 20.
  • the situation grasping unit provided in the first vehicle 10, the second vehicle 20, and the infrastructure side device 30 can be used for the congestion status of the road 1, the presence / absence and operation of other traveling vehicles, pedestrians, and two-wheeled vehicles, and the road at the confluence. Acquire information such as road conditions and driving conditions for 1.
• The vehicle motion grasping unit 110 determines whether the first motion information includes a left turn of the first vehicle 10, whether the second motion information includes a right turn of the second vehicle 20, and whether the traveling directions after the left turn of the first vehicle 10 and the right turn of the second vehicle 20 are the same, that is, whether the course changes merge in the same direction.
• When the vehicle motion grasping unit 110 determines that the vehicles are merging, the vehicle driving support system 100 proceeds to step 3; when it determines that the vehicles are not merging, the vehicle driving support system 100 returns to the operation information acquisition step of step 1.
  • the vehicle motion grasping unit 110 may consider the overlap of the left turn timing of the first vehicle 10 and the right turn timing of the second vehicle 20.
• For example, the vehicle motion grasping unit 110 determines that the vehicles are merging when the left turn of the first vehicle 10 and the right turn of the second vehicle 20 are executed within a predetermined time, and determines that they are not merging when there is an interval of the predetermined time or more between them.
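The merge determination above can be sketched as a simple timing and heading check. The following is only an illustration under assumed inputs (turn events with timestamps and post-turn headings, and the `window_s` and `heading_tol_deg` thresholds are placeholders standing in for the predetermined time and "same direction" test); it is not the patented implementation.

```python
from dataclasses import dataclass

@dataclass
class TurnEvent:
    kind: str                 # "left" or "right"
    time_s: float             # time at which the turn is executed
    heading_after_deg: float  # traveling direction after the turn

def is_merging(first: TurnEvent, second: TurnEvent,
               window_s: float = 5.0, heading_tol_deg: float = 15.0) -> bool:
    """Vehicles are treated as merging when the first vehicle turns left, the
    second turns right, both turns fall within the predetermined time window,
    and the post-turn traveling directions are (almost) the same."""
    same_direction = abs(first.heading_after_deg - second.heading_after_deg) <= heading_tol_deg
    within_window = abs(first.time_s - second.time_s) <= window_s
    return first.kind == "left" and second.kind == "right" and same_direction and within_window

# Left turn of the first vehicle and right turn of the second vehicle two seconds apart, same heading.
print(is_merging(TurnEvent("left", 10.0, 90.0), TurnEvent("right", 12.0, 92.0)))  # True
print(is_merging(TurnEvent("left", 10.0, 90.0), TurnEvent("right", 30.0, 92.0)))  # False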
• In the guidance information creation step of step 3, the guidance information creation unit 120 creates, based on the first motion information and the second motion information, the first guidance information M1 that guides the movement of the first vehicle 10 and the second guidance information M2 that guides the movement of the second vehicle 20.
• The first guidance information and the second guidance information each correspond to the guidance information in the present disclosure.
• When creating the guidance information, the guidance information creation unit 120 may also consider the information on the road 1 acquired in the operation information acquisition step. Specific examples of the first guidance information M1 and the second guidance information M2 will be described later.
• The guidance information presentation unit 130 determines the presentation method of the guidance information based on the first guidance information M1 and the second guidance information M2, and presents the guidance information to the first vehicle 10 and the second vehicle 20.
  • the method of presenting the guidance information may include drawing on the road surface, displaying on the image display device in the vehicle, displaying on the head-up display, and voice guidance.
  • Guidance information may be presented by combining these methods.
  • Examples of operations of the vehicle motion grasping unit 110, the guidance information creating unit 120, and the guidance information presenting unit 130 include the following.
• For example, the vehicle driving support system 100 presents the left turn information of the first vehicle 10 first, and presents the right turn information of the second vehicle 20 after the left turn of the first vehicle 10 is completed.
• As another example, the vehicle driving support system 100 presents the lane after the left turn of the first vehicle 10 and the lane after the right turn of the second vehicle 20 so that they differ. Specifically, the vehicle driving support system 100 presents the first guidance information M1 to the first vehicle 10 so that it drives in the left lane after turning left, and presents the second guidance information M2 to the second vehicle 20 so that it drives in the right lane after turning right. Alternatively, the first guidance information M1 may be presented to the first vehicle 10 so that it drives in the right lane after turning left, and the second guidance information M2 may be presented to the second vehicle 20 so that it drives in the left lane after turning right.
  • a lane that is easy to shift to the route after turning left or right may be selected depending on whether the vehicle goes straight after turning left or right or turns left or right again. Further, since the turning radius becomes large when the first vehicle 10 is a large vehicle, the first guidance information M1 may be presented so as to drive in the right lane after turning left.
  • the presentation timing of the first guidance information M1 and the presentation timing of the second guidance information M2 may overlap.
• Since the traveling lanes of the first vehicle 10 and the second vehicle 20 after merging are different, the two vehicles do not collide even if the timings at which they enter the merged road 1 overlap, and the vehicle driving support system 100 supports smooth traffic by allowing the left and right turns of the two vehicles to be completed in a short time.
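As an illustration of the two presentation strategies above (sequencing the turns, or assigning different post-merge lanes), the sketch below chooses between them under assumed conditions; the function name, the use of the number of lanes after merging, and the large-vehicle rule from the text are illustrative assumptions rather than the disclosed logic.

```python
def plan_guidance(lanes_after_merge: int, first_is_large_vehicle: bool) -> dict:
    """Return guidance M1/M2 for the left-turning first vehicle and the
    right-turning second vehicle, following the two strategies in the text."""
    if lanes_after_merge >= 2:
        # Different post-merge lanes: both turns may proceed at overlapping timings.
        first_lane = "right" if first_is_large_vehicle else "left"   # large vehicles need a wider turn
        second_lane = "left" if first_lane == "right" else "right"
        return {"M1": f"turn left into the {first_lane} lane",
                "M2": f"turn right into the {second_lane} lane"}
    # Single lane after merging: present the left turn first, the right turn afterwards.
    return {"M1": "turn left now",
            "M2": "turn right after the first vehicle completes its turn"}

print(plan_guidance(lanes_after_merge=2, first_is_large_vehicle=False))
print(plan_guidance(lanes_after_merge=1, first_is_large_vehicle=True))
```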
• As described above, the vehicle motion grasping unit 110 acquires the first motion information and the second motion information, the guidance information creation unit 120 creates, as guidance information, the route along which the first vehicle 10 or the second vehicle 20 should travel, and the guidance information presentation unit 130 presents the guidance information.
  • FIG. 7 is a schematic view showing an operation example of the vehicle driving support system 200 according to the second embodiment on the road.
• As in the first embodiment, FIG. 7 shows a case where the first vehicle 10 turns left and the second vehicle 20 turns right at an intersection where the roads 1 intersect and the sidewalks 2 are provided, and the two vehicles merge and travel in the same direction.
  • This embodiment shows a case where the left turn of the first vehicle 10 is prioritized and the right turn timing of the second vehicle 20 is delayed.
  • the guidance information is selected by the guidance information creation unit 120 in the guidance information creation step of step 3 (FIG. 6).
• The conditions for stopping the second vehicle 20 include, for example, the case where the road 1 after merging has only one lane, the case where the first vehicle 10 is a large vehicle and takes time to turn left, and the case where the road 1 after merging is congested.
  • the guidance information presenting unit 130 presents the first guidance information M1 for turning left to the first vehicle 10 and the second guidance information M2 for stopping to the second vehicle 20.
• The presentation of the first guidance information M1 and the second guidance information M2 continues until the above condition for stopping the second vehicle 20 is resolved; after the condition is resolved, second guidance information M2 indicating a right turn is presented.
  • FIG. 7 shows an example in which the characters “STOP” are drawn on the road surface as the second guidance information M2 indicating the stop of the second vehicle 20, but a figure such as an icon may be presented in order to improve visibility. Voice guidance may be used.
• Since the vehicle driving support system 200 of the present embodiment explicitly instructs the second vehicle 20 to stop and then turn right, smooth traffic can be supported while ensuring safety even when a plurality of vehicles merge at an intersection.
  • FIG. 8 is a schematic view showing an example of operation of the vehicle driving support system 300 according to the third embodiment on the road.
• As in the first embodiment, FIG. 8 shows a case where the first vehicle 10 turns left and the second vehicle 20 turns right at an intersection where the roads 1 intersect and the sidewalks 2 are provided, and the two vehicles merge and travel in the same direction.
  • This embodiment shows a case where the left turn schedule of the first vehicle 10 is preliminarily presented from the front of the intersection.
• When the first vehicle 10 comes within the predetermined distance L1 of the intersection, the first guidance information M1 for turning left is presented in the same manner as in the first embodiment.
• The predetermined distance L2 is longer than the predetermined distance L1, and while the first vehicle 10 is traveling between the predetermined distances L2 and L1, that is, before reaching the predetermined distance L1, a left turn preparation indication is presented as the first guidance information M1.
  • Guidance information (first guidance information M1) is selected by the guidance information creation unit 120 in the guidance information creation step of step 3 based on the position information and map information of the first vehicle 10 (FIG. 6).
• For example, the predetermined distance L1 is a distance of 5 m from the intersection, and the predetermined distance L2 is a distance of 30 m from the intersection.
• When the first guidance information M1 for turning left is a character or an image indicating a left turn, an image different from the first guidance information M1 for turning left is used as the first guidance information M1 for left turn preparation.
• Examples of the different image include images that differ in color, brightness, line width, solid versus dashed lines, and blinking versus steady lighting.
  • the guidance information creation unit 120 may calculate the time required for the first vehicle 10 to reach the intersection and start a left turn based on the speed of the first vehicle 10, the distance to the intersection, and the deceleration.
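The staged presentation above (left turn preparation between L2 and L1, the left turn indication within L1) and the arrival-time estimate can be sketched as follows. The values L1 = 5 m and L2 = 30 m are taken from the example in the text; the uniform-deceleration model is a simplifying assumption for illustration only.

```python
def select_first_guidance(distance_to_intersection_m: float,
                          l1_m: float = 5.0, l2_m: float = 30.0) -> str:
    """Stage the first guidance information M1 by distance to the intersection."""
    if distance_to_intersection_m <= l1_m:
        return "left turn"              # first guidance information M1 for turning left
    if distance_to_intersection_m <= l2_m:
        return "left turn preparation"  # presented with a different image
    return "none"

def time_to_left_turn(speed_mps: float, distance_m: float, decel_mps2: float) -> float:
    """Rough time for the first vehicle to reach the intersection while decelerating
    uniformly; falls back to constant speed when the deceleration is negligible."""
    if decel_mps2 <= 1e-6:
        return distance_m / speed_mps
    # Solve d = v*t - 0.5*a*t^2 for the first positive root.
    disc = speed_mps ** 2 - 2.0 * decel_mps2 * distance_m
    if disc < 0.0:
        # The vehicle would stop before the intersection under this deceleration.
        return float("inf")
    return (speed_mps - disc ** 0.5) / decel_mps2

print(select_first_guidance(20.0))                   # 'left turn preparation'
print(select_first_guidance(4.0))                    # 'left turn'
print(round(time_to_left_turn(10.0, 25.0, 1.0), 2))  # ~2.93 s to reach the intersection
```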
• The guidance information presentation unit 130 may, before presenting the first guidance information M1 for the left turn, prompt the second vehicle 20 to make its right turn as the second guidance information M2.
• By presenting the left turn of the first vehicle 10 from a point before the intersection, the vehicle driving support system 300 of the present embodiment can communicate the planned left turn to the second vehicle 20, other surrounding vehicles, pedestrians, two-wheeled vehicles, and the like to call their attention. Even when a plurality of vehicles merge at an intersection, the vehicle driving support system 300 can support smooth traffic while ensuring safety. Further, by informing the second vehicle 20 of the planned left turn, the second vehicle 20 can be encouraged to make its right turn first, so that even smoother traffic can be supported.
  • FIG. 9 is a schematic view showing an example of operation of the vehicle driving support system 400 according to the fourth embodiment on the road.
• As in the first embodiment, FIG. 9 shows a case where the first vehicle 10 turns left and the second vehicle 20 turns right at an intersection where the roads 1 intersect and the sidewalks 2 are provided, and the two vehicles merge and travel in the same direction.
  • the two-wheeled vehicle 40 is running in parallel on the left side of the first vehicle 10 that turns left at the intersection.
  • An example of the two-wheeled vehicle 40 includes a bicycle and a motorcycle, but a three-wheeled vehicle, an ultra-small vehicle, a runner running on a roadside belt, or the like may be used as long as it is an object to move in parallel on the side of the first vehicle 10.
• In the present embodiment, the vehicle motion grasping unit 110 acquires, in addition to the first motion information of the first vehicle 10 and the second motion information of the second vehicle 20, the conditions of the surrounding road 1 and sidewalk 2.
• The vehicle motion grasping unit 110 determines that the first vehicle 10 and the second vehicle 20 will merge by turning left and right, that the two-wheeled vehicle 40 is traveling on the left side of the first vehicle 10, and that the two-wheeled vehicle 40 will reach the intersection at the timing when the first vehicle 10 turns left.
• The guidance information creation unit 120 creates the first guidance information M1a and M1b indicating the left turn and the stop of the first vehicle 10, and the second guidance information M2 indicating the right turn of the second vehicle 20.
• The guidance information presentation unit 130 presents the first guidance information M1a and M1b indicating a left turn and a stop to the first vehicle 10, and presents the second guidance information M2 indicating a right turn to the second vehicle 20.
• In FIG. 9, the guidance information presentation unit 130 presents the second guidance information M2 for turning right to the second vehicle 20; however, when the two-wheeled vehicle 40 goes straight through the intersection, second guidance information M2 indicating a stop may be presented instead.
• The guidance information presentation unit 130 may present the first guidance information M1a and M1b at the same time, or may first present the first guidance information M1b indicating a stop and then present the first guidance information M1a indicating a left turn after the parallel running state of the two-wheeled vehicle 40 is resolved.
• The guidance information presentation unit 130 may use different images for the first guidance information M1a and M1b, for example images that differ in color, brightness, line width, solid versus dashed lines, and blinking versus steady lighting.
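The sequencing described above (present the stop first, then the left turn once the two-wheeled vehicle 40 is no longer alongside) can be sketched as a small decision function; the boolean inputs are assumed, simplified detections rather than the disclosed detection logic.

```python
from __future__ import annotations

def first_vehicle_guidance(bicycle_alongside_left: bool,
                           bicycle_reaches_intersection_with_turn: bool) -> list[str]:
    """Return the ordered first guidance information for the left-turning first vehicle."""
    if bicycle_alongside_left and bicycle_reaches_intersection_with_turn:
        # Present the stop indication M1b first; the left-turn indication M1a follows
        # only after the parallel running state is resolved.
        return ["M1b: stop", "M1a: left turn (after the two-wheeled vehicle has passed)"]
    return ["M1a: left turn"]

print(first_vehicle_guidance(True, True))    # stop first, then left turn
print(first_vehicle_guidance(False, False))  # left turn only
```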
• The vehicle driving support system 400 of the present embodiment detects the parallel running of the two-wheeled vehicle 40 or the like and suspends the left turn of the first vehicle 10, thereby preventing an accident involving the two-wheeled vehicle 40 and supporting smooth traffic while ensuring safety even when a plurality of vehicles merge at an intersection.
  • FIG. 10 is a schematic view showing an example of operation of the vehicle driving support system 500 according to the fifth embodiment on the road.
• As in the first embodiment, FIG. 10 shows a case where the first vehicle 10 turns left and the second vehicle 20 turns right at an intersection where the roads 1 intersect and the sidewalks 2 are provided, and the two vehicles merge and travel in the same direction.
• In the present embodiment, a pedestrian crossing is provided in the lane into which the first vehicle 10 and the second vehicle 20 merge.
  • FIG. 10 shows a case where the pedestrian 50 is alerted when the first vehicle 10 starts to turn left or when the second vehicle 20 starts to turn right.
  • the crossing of the pedestrian 50 is prioritized over the left turn of the first vehicle 10 and the right turn of the second vehicle 20.
• When the first vehicle 10 starts to turn left or the second vehicle 20 starts to turn right, the vehicle driving support system 500 presents alert information around the pedestrian crossing to give a preliminary warning.
  • the method of presenting the alert information around the pedestrian crossing may be the road surface drawing from the infrastructure side device 30 as shown in FIG. 4A.
  • the infrastructure side device 30 may be provided with a speaker, and the speaker may indicate a left turn of the first vehicle 10 and a right turn of the second vehicle 20 by voice guidance, and make an announcement calling attention.
• Alternatively, the alert information may be presented as an image or sound on a portable electronic device, using the information communication units provided in the first vehicle 10, the second vehicle 20, and the infrastructure side device 30.
• Thus, even when the left turn of the first vehicle 10 and the right turn of the second vehicle 20 are permitted and the turning operations have already started, attention can be drawn to a pedestrian 50 who suddenly tries to cross the pedestrian crossing, and smooth traffic can be supported while ensuring safety even when a plurality of vehicles merge at an intersection.
  • FIG. 11 is a schematic view showing an example of operation of the vehicle driving support system 600 according to the sixth embodiment on the road.
  • FIG. 11 shows an intersection where the road 1 intersects and the sidewalk 2 is provided.
  • the receiving side vehicle 10C as the first vehicle and the transmitting side vehicle 20C as the second vehicle are running on the road 1 facing each other.
• An infrastructure side device 30 and a portable electronic device 60 held by a pedestrian are arranged in the area of the sidewalk 2.
  • the receiving side vehicle 10C is trying to go straight on the road 1 from the lower side to the upper side in the figure, and the transmitting side vehicle 20C is trying to turn right from the upper side in the figure to the left side in the figure.
  • the receiving side vehicle 10C is a vehicle traveling on the road 1 and is a driving support vehicle in which a part of steering control and acceleration / deceleration control is performed by a computer or the like.
• The receiving side vehicle 10C takes a route going straight on the road 1, but when the transmitting side vehicle 20C tries to make a right turn crossing in front of it, the receiving side vehicle 10C detects an optical signal described later and performs a stop operation or a deceleration operation.
  • the transmitting side vehicle 20C is a vehicle traveling on the road 1.
  • the transmitting vehicle 20C may be a driving support vehicle, but may be a manually driven vehicle that does not have a driving support function.
• In FIG. 11, when the transmitting side vehicle 20C turns right on the road 1 and tries to cross in front of the receiving side vehicle 10C, it irradiates the receiving side vehicle 10C with an optical signal described later to cause the receiving side vehicle 10C to perform a stop operation or a deceleration operation.
  • the infrastructure side device 30 is a device that irradiates the receiving side vehicle 10C with an optical signal.
• The infrastructure side device 30 has a function of grasping the situation of vehicles on the road 1 as described later, irradiates the receiving side vehicle 10C with an optical signal described later according to the situation, and causes the receiving side vehicle 10C to perform a stop operation or a deceleration operation.
• The portable electronic device 60 is, for example, an electronic device or a lighting device that can be carried by a pedestrian on the sidewalk 2.
  • the specific configuration of the portable electronic device 60 is not limited.
  • the portable electronic device 60 has at least a function of irradiating light, and may be in the form of a flashlight or a portable communication device.
  • the portable electronic device 60 irradiates the receiving side vehicle 10C with an optical signal by being operated by a pedestrian, and causes the receiving side vehicle 10C to perform a stop operation or a deceleration operation.
  • FIG. 12 is a block diagram showing the configuration of the vehicle driving support system 600.
  • the receiving vehicle 10C includes a detection unit 11C, a driving support unit 12, a vehicle motion control unit 13, a situation grasping unit 14, and an information communication unit 15.
  • the transmitting side vehicle 20C, the infrastructure side device 30, and the portable electronic device 60 include light irradiation units 21, 31, and 61, situation grasping units 22, 32, 62, and information and communication units 23, 33, 63, respectively.
• The driving support unit 12, the vehicle motion control unit 13, and the situation grasping units 14, 22, 32, and 62 execute predetermined information processing by a program recorded in advance on a computer equipped with a central processing unit (CPU), a memory, an external storage device, and the like.
  • the detection unit 11C detects light from the outside of the vehicle, converts it into an electrical signal, and transmits the converted signal to the driving support unit 12.
  • the specific configuration of the detection unit 11C is not limited.
  • the detection unit 11C may be an optical sensor or an image imaging device.
  • the wavelength of the light detected by the detection unit 11C is not limited, and may be infrared light, visible light, ultraviolet light, or white light.
• The driving support unit 12 processes information on the traveling state and surrounding conditions acquired from the situation grasping unit 14 and the information communication unit 15 in order to support the driving of the receiving side vehicle 10C, and outputs a driving control signal for controlling the operation of the receiving side vehicle 10C to the vehicle motion control unit 13.
• In particular, when the light detected by the detection unit 11C includes a predetermined optical signal, the driving support unit 12 outputs a driving control signal indicating a stop operation or a deceleration operation to the vehicle motion control unit 13.
  • the vehicle motion control unit 13 executes steering control and acceleration / deceleration control of the receiving side vehicle 10C based on the driving control signal output from the driving support unit 12.
• The vehicle motion control unit 13 has a driving support function, and supports the operation of the receiving side vehicle 10C by adjusting the output of the power source, operating the brakes, changing the steering angle, displaying driving guidance, controlling the lighting of the turn signals and stop lamps, and the like.
  • the situation grasping unit 14 acquires information on the running state and surrounding conditions of the receiving vehicle 10C and transmits it to the driving support unit 12.
• The situation grasping unit 14 is provided with various sensor devices such as a vehicle speed sensor, a position sensor, an image imaging device, a laser range finder, and a LIDAR (Light Detection and Ranging) device.
• The traveling state acquired by the situation grasping unit 14 includes the traveling speed, position, vehicle body orientation, steering angle, brake operation, the traveling route from the car navigation system, direction instructions obtained by in-vehicle speech recognition, and the like.
• The surrounding conditions acquired by the situation grasping unit 14 include the road surface condition, the ambient temperature, the road map from the car navigation system, the slope of the road, the detection of surrounding objects by image recognition, the inter-vehicle distances to and behavior prediction of the preceding vehicle, oncoming vehicles, and following vehicles, and the detection of pedestrians by image recognition.
  • the information and communication unit 15 is connected to the driving support unit 12 and the situation grasping unit 14, and performs information communication with a communication unit provided outside the receiving vehicle 10C.
  • the information communication unit 15 communicates by radio waves or light.
  • the information and communication unit 15 performs vehicle-to-vehicle communication between vehicles and road-to-vehicle communication with equipment provided on the road to obtain information such as a traveling state and surrounding conditions.
• The detection unit 11C may also detect light that is not related to the operation of the vehicle driving support system 600, but it is configured to detect the predetermined optical signal related to the operation of the vehicle driving support system 600.
  • the predetermined optical signal includes, for example, a pulse signal having a specific wavelength or a specific waveform. Further, the predetermined optical signal may have a light intensity exceeding the dynamic range of the detection unit 11C in the wavelength range that can be detected by the detection unit 11C.
• The information on these predetermined optical signals is recorded in the driving support unit 12, or a processing procedure for the case where light with an intensity exceeding the dynamic range is received is recorded in the driving support unit 12.
  • the light irradiation units 21, 31, and 61 are provided in the transmission side vehicle 20C, the infrastructure side device 30, and the portable electronic device 60, respectively, and irradiate the detection unit 11C with a predetermined light signal.
  • the predetermined optical signal is a specific wavelength, a specific signal waveform, a light intensity exceeding a dynamic range, or the like.
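A minimal sketch of how the driving support unit 12 might recognize the predetermined optical signal from the detection unit 11C. The wavelength band, pulse pattern, and saturation level below are placeholder values for illustration, not figures from the disclosure.

```python
from dataclasses import dataclass

@dataclass
class DetectedLight:
    wavelength_nm: float
    pulse_pattern: tuple   # on/off samples from the detection unit 11C
    intensity: float       # in the detector's arbitrary units

# Placeholder definition of the predetermined optical signal.
SIGNAL_BAND_NM = (935.0, 955.0)          # hypothetical specific wavelength band
SIGNAL_PATTERN = (1, 0, 1, 1, 0, 1)      # hypothetical specific waveform
DYNAMIC_RANGE_MAX = 4095.0               # hypothetical detector saturation level

def is_predetermined_signal(light: DetectedLight) -> bool:
    """Treat the light as the predetermined optical signal when it sits in the
    specific band with the specific pulse waveform, or when its intensity
    exceeds the detector's dynamic range."""
    in_band = SIGNAL_BAND_NM[0] <= light.wavelength_nm <= SIGNAL_BAND_NM[1]
    pattern_match = light.pulse_pattern == SIGNAL_PATTERN
    saturated = light.intensity > DYNAMIC_RANGE_MAX
    return (in_band and pattern_match) or saturated

def driving_control_signal(light: DetectedLight) -> str:
    """Output of the driving support unit 12 toward the vehicle motion control unit 13."""
    return "stop_or_decelerate" if is_predetermined_signal(light) else "no_action"

print(driving_control_signal(DetectedLight(940.0, (1, 0, 1, 1, 0, 1), 800.0)))  # stop_or_decelerate
print(driving_control_signal(DetectedLight(550.0, (1, 1, 1, 1, 1, 1), 800.0)))  # no_action
```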
• FIGS. 11 and 12 show an example in which the transmitting side vehicle 20C, the infrastructure side device 30, and the portable electronic device 60 are all arranged on the road 1 or the sidewalk 2.
• However, the vehicle driving support system 600 functions as long as any one of the transmitting side vehicle 20C, the infrastructure side device 30, and the portable electronic device 60 is on the road 1 or the sidewalk 2.
  • the situation grasping units 22, 32, and 62 are provided in the transmitting side vehicle 20C, the infrastructure side device 30, and the portable electronic device 60, respectively, and may have the same configuration as the situation grasping unit 14.
  • the situation grasping units 22, 32, and 62 acquire information on the running state and surrounding conditions of vehicles and pedestrians, respectively, and transmit the information to the outside via the information and communication units 23, 33, 63. Further, the situation grasping units 22, 32, 62 may acquire the traveling state and the surrounding situation from the outside via the information and communication units 23, 33, 63.
• The situation grasping units 22, 32, and 62 evaluate the acquired traveling state and surrounding conditions, and when the traveling state and surrounding conditions satisfy predetermined irradiation conditions, the light irradiation units 21, 31, and 61 irradiate the detection unit 11C with the optical signal.
  • the information and communication units 23, 33, and 63 are provided in the transmitting side vehicle 20C, the infrastructure side device 30, and the portable electronic device 60, respectively, and perform information communication with the communication unit provided outside the receiving side vehicle 10C.
  • the information and communication units 23, 33, and 63 communicate with each other by radio waves or light, respectively.
• FIG. 12 shows an example in which the transmitting side vehicle 20C, the infrastructure side device 30, and the portable electronic device 60 are each provided with the situation grasping units 22, 32, 62 and the information communication units 23, 33, 63; however, since these are not essential for the operation of the vehicle driving support system 600, they can be omitted.
  • FIGS. 13A to 13C are schematic views showing an example of light irradiation in the vehicle driving support system 600.
  • FIG. 13A shows the irradiation of light from the transmitting vehicle 20C.
  • FIG. 13B shows the irradiation of light from the infrastructure side device 30.
  • FIG. 13C shows the irradiation of light from the portable electronic device 60 held by the pedestrian.
• The light irradiation units 21, 31, and 61 of the transmitting side vehicle 20C, the infrastructure side device 30, and the portable electronic device 60 emit the predetermined optical signal toward the detection unit 11C of the receiving side vehicle 10C.
• When the light detected by the detection unit 11C of the receiving side vehicle 10C at least partially matches the predetermined optical signal, the driving support unit 12 outputs a driving control signal indicating a stop operation or a deceleration operation to the vehicle motion control unit 13 to decelerate or stop the receiving side vehicle 10C.
  • the transmitting side vehicle 20C may be a manually driven vehicle that does not have a driving support function.
• When the driver of the transmitting side vehicle 20C performs a passing (headlight flashing) operation or operates a dedicated switch, the light irradiation unit 21 irradiates the detection unit 11C with the predetermined optical signal, so that the vehicle driving support system 600 can activate the stop operation or deceleration operation by the driving support unit 12 of the receiving side vehicle 10C.
  • the light irradiation unit 21 of the transmitting vehicle 20C may be a headlight, a fog lamp, a decorative light, or the like, and the vehicle driving support system 600 can be used more easily than adding a driving support function to the vehicle.
  • the infrastructure side device 30 grasps the running state and surrounding conditions of the receiving side vehicle 10C and the transmitting side vehicle 20C by the situation grasping unit 32 or the information communication unit 33.
• The infrastructure side device 30 determines that the irradiation condition is satisfied when the receiving side vehicle 10C is an oncoming vehicle of the transmitting side vehicle 20C and the transmitting side vehicle 20C makes a right turn or a left turn crossing in front of the receiving side vehicle 10C, and the light irradiation unit 31 then irradiates the detection unit 11C with the optical signal.
  • the driving support unit 12 outputs a driving control signal indicating a stop operation or a deceleration operation to the vehicle operation control unit 13 to decelerate or stop the receiving side vehicle 10C.
• The irradiation conditions under which the infrastructure side device 30 decides to irradiate also include the case shown in FIG. 11, where a pedestrian crossing exists in front of the receiving side vehicle 10C, the receiving side vehicle 10C is predicted to continue going straight, and a pedestrian is determined to be crossing in front of the receiving side vehicle 10C.
  • the portable electronic device 60 possessed by the pedestrian on the sidewalk 2 is operated to irradiate the detection unit 11C with an optical signal from the light irradiation unit 61.
  • the pedestrian can positively act on the receiving side vehicle 10C, which is a driving support vehicle, to activate the stopping operation and the decelerating operation of the receiving side vehicle 10C.
  • the portable electronic device 60 may grasp the traveling state and surrounding conditions of the receiving side vehicle 10C by the situation grasping unit 62 or the information communication unit 63.
  • the irradiation of the optical signal is executed only when the situation grasping unit 62 determines that the irradiation condition is satisfied.
  • the irradiation condition includes a case where a pedestrian crossing exists in front of the receiving side vehicle 10C, the receiving side vehicle 10C is predicted to continue the straight-ahead operation, and a pedestrian crosses the pedestrian crossing.
  • FIG. 14 is a flowchart showing an operation example of the vehicle driving support system 600.
• Here, the receiving side vehicle 10C is traveling as a driving support vehicle with its driving support function active, and the transmitting side vehicle 20C also has a driving support function. The flowchart of FIG. 14 shows the irradiation conditions for the optical signal in the transmitting side vehicle 20C.
• In step 1C, the situation grasping unit 22 determines from the traveling state of the transmitting side vehicle 20C whether a right or left turn is being made; it proceeds to step 2C in the case of a right turn or a left turn, and returns to step 1C in the case of neither.
• In step 2C, the situation grasping unit 22 determines the presence or absence of an oncoming vehicle; when it predicts that the receiving side vehicle 10C, which is an oncoming vehicle, will go straight, the process proceeds to step 3C, and otherwise the process returns to step 1C.
• In step 3C, the light irradiation unit 21 irradiates the detection unit 11C with the optical signal, and the process returns to step 1C.
• Since the transmitting side vehicle 20C decides the irradiation of the optical signal according to its traveling state, the optical signal is irradiated automatically when the irradiation condition is satisfied, without any special operation by the driver of the transmitting side vehicle 20C, and the stop operation or deceleration operation of the receiving side vehicle 10C can be activated.
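A compact sketch of the FIG. 14 flow on the transmitting side, under assumed boolean inputs from the situation grasping unit 22 (the parameter names are hypothetical); the surrounding system would simply re-evaluate this condition each cycle.

```python
def fig14_should_irradiate(turning: str, oncoming_going_straight: bool) -> bool:
    """Steps 1C-3C: irradiate the optical signal only when the transmitting side
    vehicle 20C is turning right or left (step 1C) and the oncoming receiving side
    vehicle 10C is predicted to go straight (step 2C)."""
    if turning not in ("right", "left"):   # step 1C: neither a right nor a left turn
        return False
    return oncoming_going_straight          # step 2C satisfied -> step 3C (irradiate)

print(fig14_should_irradiate("right", True))   # True: proceed to step 3C and irradiate
print(fig14_should_irradiate("none", True))    # False: return to step 1C
```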
• In this way, merely by irradiating the optical signal from the light irradiation units 21, 31, and 61, a stop operation or a deceleration operation can be actively triggered in the receiving side vehicle 10C traveling under driving support technology, and smooth traffic can be supported.
  • FIG. 15 is a schematic view showing an example of operation of the vehicle driving support system 700 according to the seventh embodiment on the road.
  • the road 1 and the sidewalk 2 extend in a straight line.
  • the transmitting side vehicle 20C is traveling in front of the receiving side vehicle 10C.
  • the infrastructure side device 30 is arranged on the area of the sidewalk 2.
  • the receiving side vehicle 10C and the transmitting side vehicle 20C are trying to go straight on the road 1 from the lower side to the upper side in the figure.
• In the present embodiment, an optical signal is irradiated from the rear of the transmitting side vehicle 20C to activate the stop operation or deceleration operation of the receiving side vehicle 10C, encouraging danger avoidance and the maintenance of a comfortable inter-vehicle distance.
  • FIG. 16 is a flowchart showing an operation example of the vehicle driving support system 700.
  • the receiving side vehicle 10C and the transmitting side vehicle 20C have a driving support function. Further, the flowchart of FIG. 16 shows the irradiation conditions of the optical signal in the transmitting side vehicle 20C.
• In step 11, the situation grasping unit 22 determines from the traveling state of the transmitting side vehicle 20C whether the vehicle is going straight; if so, the process proceeds to step 12, and if not, the process returns to step 11.
• In step 12, the situation grasping unit 22 determines whether or not there is a following vehicle; if it determines that the receiving side vehicle 10C is present as a following vehicle, the process proceeds to step 13, and otherwise the process returns to step 11.
• In step 13, the situation grasping unit 22 measures the inter-vehicle distance between the transmitting side vehicle 20C and the receiving side vehicle 10C; if the distance is less than or equal to a certain distance (a constant value), the process proceeds to step 14, and otherwise the process returns to step 11.
• In step 14, the situation grasping unit 22 measures the time during which the inter-vehicle distance remains at or below the certain distance; if this continues for a certain time or more, the process proceeds to step 15, and otherwise the process returns to step 11.
• In step 15, the light irradiation unit 21 irradiates the detection unit 11C with the optical signal, and the process returns to step 11.
• Since the transmitting side vehicle 20C decides the irradiation of the optical signal according to its traveling state, the optical signal is irradiated automatically when the irradiation condition is satisfied, without any special operation by the driver of the transmitting side vehicle 20C, and the stop operation or deceleration operation of the receiving side vehicle 10C can be activated to promote danger avoidance and the maintenance of a comfortable inter-vehicle distance.
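The FIG. 16 flow can be sketched as a small stateful check that tracks how long the following receiving side vehicle 10C has stayed within the distance threshold. The class name, the 15 m distance threshold, and the 3 s duration threshold are placeholders for the "certain distance" and "certain time" in the text.

```python
from __future__ import annotations

class TailgatingMonitor:
    """Steps 11-15: decide, on the transmitting side, when to irradiate the
    rear-facing optical signal toward a following vehicle."""

    def __init__(self, distance_threshold_m: float = 15.0, duration_threshold_s: float = 3.0):
        self.distance_threshold_m = distance_threshold_m   # 'certain distance' (placeholder)
        self.duration_threshold_s = duration_threshold_s   # 'certain time' (placeholder)
        self._close_since_s = None                          # start of the close-following state

    def update(self, t_s: float, going_straight: bool,
               follower_present: bool, gap_m: float) -> bool:
        """Return True when the irradiation condition of step 15 is reached."""
        if not (going_straight and follower_present and gap_m <= self.distance_threshold_m):
            self._close_since_s = None      # condition broken: back to step 11
            return False
        if self._close_since_s is None:
            self._close_since_s = t_s       # step 13 satisfied: start timing (step 14)
        return (t_s - self._close_since_s) >= self.duration_threshold_s

monitor = TailgatingMonitor()
for t in range(6):
    fired = monitor.update(float(t), going_straight=True, follower_present=True, gap_m=10.0)
    print(t, fired)   # becomes True once the gap has stayed closed for 3 s or more
```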
• In this way, merely by irradiating the optical signal from the light irradiation units 21, 31, and 61, a stop operation or a deceleration operation can be actively triggered in the receiving side vehicle 10C traveling under driving support technology, and smooth traffic can be supported.
• FIGS. 17A and 17B are schematic views of a vehicle 10D traveling using the vehicle driving support system 800 according to the present embodiment.
  • FIG. 17A is a front view of the vehicle 10D.
  • FIG. 17B is a rear view of the vehicle 10D.
  • the vehicle 10D is an automobile equipped with a road surface drawing device, and includes a headlamp 2D and a lighting unit (irradiation unit) 52D that illuminate the front of the vehicle 10D.
  • the headlamps 2D are arranged as right side headlamps 2R and left side headlamps 2L on the front left and right sides in the traveling direction of the vehicle 10D.
  • the headlamp 2D may include a light source, a reflector, and the like in a lamp body (not shown).
  • the lighting unit (irradiation unit) 52D is arranged below the left and right headlamps 2D on the front surface of the vehicle 10D.
• In the present embodiment, the lighting unit 52D is divided into a right side lighting unit 52R and a left side lighting unit 52L arranged separately on the left and right; however, the vehicle driving support system and the road surface drawing device of the present disclosure are not limited to this, and a single unit may be arranged at the center of the front surface of the vehicle 10D.
  • the lighting unit 52D is a drawing projection unit of a road surface drawing device that displays various drawings (marks) on the road surface in the vehicle driving support system 800.
  • the structure of the illumination unit 52D is, for example, a laser scanning device (not shown) including a laser light source and a light deflector that deflects the laser light emitted from the laser light source.
  • the light deflector is, for example, a movable mirror such as a MEMS (Micro Electro Mechanical Systems) mirror or a galvano mirror.
  • the lighting unit 52D may be a liquid crystal display, an LED array, a digital mirror device (DMD), or the like, as long as it can display various predetermined drawings (marks) on the road surface in front of the vehicle 10D.
  • the operation of the lighting unit 52D such as lighting and extinguishing, is controlled in response to a command from the lighting control unit 51D of the road surface drawing unit 50D in the vehicle driving support system 800, which will be described later.
  • the rear lighting unit 52B is also provided below the back lamp 2B on the rear side of the vehicle 10D.
  • the left lighting unit 52BL is separately arranged under the left back lamp 2BL
  • the right lighting unit 52BR is separately arranged under the right back lamp 2BR.
  • the arrangement of the rear lighting unit 52B is not limited to this, and one may be arranged in the center of the rear surface of the vehicle.
  • the structure of the rear lighting unit 52B is the same as that of the front lighting unit 52D.
  • FIG. 18 is a block diagram of the vehicle driving support system 800 according to the present embodiment.
  • the vehicle driving support system 800 of the present disclosure includes a control unit 90D, a vehicle course acquisition unit 20D, a vehicle state information acquisition unit 30D, an image selection unit 40D, and a road surface drawing unit 50D.
  • the control unit 90D controls various devices of the vehicle 10D, and is composed of an electronic control unit.
  • the electronic control unit includes a microcontroller including a processor and a memory, and other electronic circuits such as transistors.
  • the processor includes a CPU (Central Processing Unit), an MPU (Micro Processing Unit), and a GPU (Graphics Processing Unit).
  • the memory also includes a ROM and a RAM. The processor executes various control programs stored in the ROM, and executes various processes in cooperation with the RAM.
• Various sensors and devices for monitoring the outside of the vehicle 10D, such as the navigation system 11D, the direction indicator 12D, the in-vehicle camera 13D, the sensor 14D, and the wireless communication unit 15D, are connected to the control unit 90D, and data is input to and output from them.
• The navigation system 11D is a system that connects to a satellite positioning system such as GPS to obtain the current position information of the vehicle 10D, indicates an appropriate course to the destination input by the driver, and guides the vehicle 10D.
  • the control unit 90D acquires the course information of the vehicle 10D from the navigation system 11D, and also acquires the current position information, orientation information, and the like of the vehicle 10D.
  • the control unit 90D is assumed to acquire the course information of the vehicle 10D by the navigation system 11D, but the present disclosure is not limited to this.
  • the control unit 90D may acquire the course information by using various control means such as sequential instruction by automatic operation control.
• The direction indicator 12D is interlocked with a lever (not shown) with which the driver inputs the traveling direction of the vehicle 10D; it inputs a signal indicating the traveling direction of the vehicle 10D to the control unit 90D and transmits the direction to the outside of the vehicle 10D through a direction indicator lamp (winker lamp) (not shown).
  • the control unit 90D also acquires the course information of the vehicle 10D by the direction indicator 12D.
  • the in-vehicle camera 13D is provided to obtain information on the outside of the vehicle 10D in front of and behind the vehicle.
  • the in-vehicle camera 13F installed on the front surface of the vehicle 10D sequentially photographs the state of the front (including the road surface), and promptly transmits forward information on the presence of other vehicles, pedestrians, etc. in front of the vehicle 10D to the control unit 90D.
  • the vehicle-mounted camera 13B installed on the rear surface of the vehicle 10D also sequentially photographs the rear view, and transmits rear information to the control unit 90D that other vehicles, motorcycles, and pedestrians are present behind the vehicle 10D.
  • the in-vehicle camera 13D includes, for example, an image sensor such as a CCD (Charge Coupled Device) or a CMOS (Complementary MOS).
  • the in-vehicle camera 13D is combined with a millimeter-wave radar, a microwave radar, a laser radar, and the like to obtain information on the surroundings outside the vehicle such as other vehicles, pedestrians, road shapes, traffic signs, and obstacles.
  • the in-vehicle camera 13D sends the captured image data to the control unit 90D.
• The control unit 90D may recognize the existence, position, and the like of pedestrians and other vehicles (including two-wheeled vehicles) from the image data using various analysis programs, or the in-vehicle camera 13D itself may have a program for recognizing pedestrians and the like.
  • the imaging range of the vehicle-mounted camera also includes the road surface in front of the vehicle 10D in the traveling direction.
  • the sensor 14D is provided to obtain information on the outside of the vehicle around the vehicle 10D.
  • an infrared sensor for detecting whether or not there is another vehicle, a pedestrian, or the like, a motion capture for detecting the movement of a pedestrian, or the like is installed in front, behind, or side of the vehicle 10D.
• When the sensor 14D detects a pedestrian or another vehicle (including a two-wheeled vehicle), it also transmits the detection data to the control unit 90D.
  • the sensor 14D may include an electronic compass and an angular velocity sensor that detect the orientation of the vehicle 10D.
  • the orientation information of the vehicle 10D may include information detected by an electronic compass and an angular velocity sensor.
  • the wireless communication unit 15D receives and transmits information by wireless communication with other devices outside the vehicle, such as predetermined devices provided at other vehicles and intersections, automatic driving instruction devices, and the like.
  • the control unit 90D also obtains information around the vehicle (states of other vehicles, pedestrians, etc.) from the information received by the wireless communication unit 15D.
  • the wireless communication unit 15D may have a form of transmitting the position and the traveling direction of the vehicle 10D to another vehicle or the like. That is, the wireless communication unit 15D performs vehicle-to-vehicle communication, which is communication between vehicles, and road-to-vehicle communication with equipment provided on the road.
  • the vehicle-mounted camera 13D, the sensor 14D, and the wireless communication unit 15D that detect the surrounding information of the vehicle 10D are collectively referred to as the surrounding information detection unit (or other vehicle detection unit) 16D.
  • the surrounding information detection unit 16D is not limited to the in-vehicle camera 13D, the sensor 14D, and the wireless communication unit 15D, and may be any one that obtains information on the surroundings of the vehicle 10D.
  • the surrounding information detection unit 16D may include a mechanism for obtaining an image captured by another device outside the vehicle.
  • the control unit 90D includes a vehicle course acquisition unit 20D, a vehicle state information acquisition unit 30D, and an image selection unit 40D.
• The vehicle course acquisition unit 20D acquires the course information of the vehicle 10D, for example going straight, turning right, or turning left at the next intersection, from the navigation system 11D, the direction indicator 12D, the wireless communication unit 15D, and the like described above.
  • the vehicle course acquisition unit 20D may periodically acquire the course information at predetermined intervals, or may acquire the course information when the course is changed from the previously obtained course.
  • the course information acquired by the vehicle course acquisition unit 20D includes not only information at intersections but also all cases where the course should be displayed for other vehicles such as changing lanes.
  • the vehicle course acquisition unit 20D also acquires the course information from the information from the wireless communication unit 15D.
  • the vehicle state information acquisition unit 30D acquires the state information of the vehicle 10D such as the position, orientation, and speed of the vehicle 10D. As described above, the vehicle state information acquisition unit 30D acquires the position information and the orientation information of the vehicle 10D from the navigation system 11D. The vehicle state information acquisition unit 30D is connected to the speed sensor 31D.
  • the speed sensor 31D detects the traveling speed of the vehicle 10D and the like.
  • the speed sensor 31D may also have an acceleration sensor or the like.
  • the vehicle state information acquisition unit 30D obtains the position of the vehicle 10D after a predetermined time from the current position of the vehicle 10D obtained from the navigation system 11D or the like and the speed or acceleration detected by the speed sensor 31D.
• The image selection unit 40D selects the course display image 6D to be projected on the road surface in front of or behind the vehicle 10D by the lighting unit 52D, as described later. That is, when the course of the vehicle 10D changes, such as when turning right or left, the vehicle driving support system 800 of the present embodiment projects on the road surface a course display image 6D showing the course direction of the vehicle 10D in order to accurately convey the course to other vehicles and pedestrians. However, the image selection unit 40D selects the display image data appropriately according to the situation of other vehicles and pedestrians around the vehicle 10D, for example by not selecting the course display image 6D or by changing the selected image.
• The image selection unit 40D is connected to the course display image data storage unit 41D, in which a plurality of pieces of data for the course display image 6D indicating courses such as a right turn and a left turn are stored.
• Based on the course information of the vehicle 10D acquired by the vehicle course acquisition unit 20D and the vehicle state information such as the orientation, position, and speed of the vehicle 10D acquired by the vehicle state information acquisition unit 30D, the image selection unit 40D selects from the course display image data storage unit 41D the data of the course display image 6D indicating the direction in which the vehicle 10D should travel.
  • the image selection unit 40D does not select the data of the course display image 6D when the other vehicle is located within a predetermined range of the vehicle 10D described later.
  • the image selection unit 40D sends a command instructing the data of the selected course display image 6D to the road surface drawing unit 50D.
  • the road surface drawing unit 50D includes a lighting control unit 51D and a lighting unit 52D.
  • the road surface drawing unit 50D takes out the data of the route display image instructed from the image selection unit 40D from the route display image data storage unit 41D and sends it to the lighting control unit 51D.
  • the lighting control unit 51D converts data into a form suitable for the lighting unit 52D, and the lighting unit 52D irradiates light from a light source according to the converted data and projects a predetermined course display image on the road surface in front of the vehicle 10D.
• The lighting control unit 51D is composed of an electronic control unit, and determines the lighting state (on/off, illumination color, emission intensity, emission area, and the like) of the lighting unit 52D according to the data of the course display image.
  • the lighting control unit 51D includes a microcontroller including a processor such as a CPU and an MPU and a memory, and other electronic circuits and the like.
  • the control unit 90D and the lighting control unit 51D have separate configurations, but they may be integrally configured.
  • the lighting unit 52D projects the selected course display image onto the road surface around the vehicle 10D (front, rear) and displays it.
• The lighting unit 52D has an adjustment function by which the position and angle at which the image is projected are adjusted according to the orientation, current position, and surrounding conditions of the vehicle obtained by the vehicle state information acquisition unit 30D.
  • the specific adjustment instruction is given based on the instruction signal accompanying the image data selected by the image selection unit 40D.
  • FIG. 19 is a flowchart showing the flow of processing in which the vehicle driving support system 800 displays the course display image 6D.
• FIG. 20 is a schematic view showing the courses 7D and course display images 6D of a plurality of vehicles 10Da to 10Dc at an intersection. As shown in FIG. 19, the vehicle driving support system 800 of the present embodiment displays the course display image 6D in the vicinity of an intersection according to the following flow.
• The vehicle course acquisition unit 20D recognizes and determines that, as indicated by the course 7D, the course information of the vehicle 10D changes from the first direction, which is the straight-ahead direction, to the second direction, which is the right turn direction (step 1D).
  • the vehicle state information acquisition unit 30D acquires the ambient information detected by the ambient information detection unit 16D such as the vehicle-mounted camera 13D, the sensor 14D, and the wireless communication unit 15D (step 2D).
• The vehicle state information acquisition unit 30D acquires the position and orientation information of the vehicle 10D from the navigation system 11D, and acquires the traveling speed of the vehicle 10D from the speed sensor 31D. Then, from the surrounding information and the position, orientation, and traveling speed of the vehicle 10D, it acquires information such as whether other vehicles exist in front of and behind the vehicle 10D and how far away they are (step 3D).
• Based on the position and orientation information and the course information of the vehicle 10D, the control unit 90D determines whether the vehicle 10D is at a position where the course display image 6D should be displayed, such as when the vehicle 10D enters an intersection or changes its course (within the third distance L3 described later). When the control unit 90D determines that the vehicle is at such a position, the process proceeds to step 5D; otherwise, the process returns to step 1D (step 4D).
• The control unit 90D then determines whether, at the position where the vehicle 10D should display the course display image 6D, there is no other vehicle in front of the vehicle 10D or the distance between the vehicle 10D and the other vehicle in front is the first distance L1 or more (step 5D). If there is no other vehicle within the first distance L1, the process proceeds to step 6D; if there is another vehicle, the process proceeds to step 7D.
• The image selection unit 40D selects from the course display image data storage unit 41D the image data of the optimum course forward display image 6Da to be displayed, based on the current position and orientation of the vehicle 10D acquired by the vehicle state information acquisition unit 30D, the course acquired by the vehicle course acquisition unit 20D, and the surrounding information acquired by the surrounding information detection unit 16D (step 6D).
• If another vehicle exists in front of the vehicle 10D and the distance to the other vehicle is within the first distance L1, the control unit 90D determines that the course forward display image is not to be displayed and does not send an image selection command to the image selection unit 40D.
• Next, the control unit 90D checks behind the vehicle 10D and determines whether another vehicle is present within the second distance L2 (step 7D). If the control unit 90D determines that there is no other vehicle within the second distance L2, the process proceeds to step 9D; if it determines that another vehicle is present, the process proceeds to step 8D.
• When the vehicle 10D is located within the third distance L3 specified by the Road Traffic Act, for example within a predetermined distance (30 m) of the intersection, the vehicle 10D must turn on its blinker lamp when changing course. Since the course display image 6D has the same role as the lighting of the blinker lamp, it is projected on the road surface at the position where the blinker lamp must be turned on. However, when another vehicle is located within the first distance L1 or the second distance L2, the control unit 90D determines, in consideration of the visual influence on that other vehicle, that the course display image 6D (the course forward display image 6Da in the case of the front, the course rearward display image 6Db in the case of the rear) is not to be displayed.
• The image selection unit 40D selects from the course display image data storage unit 41D the image data of the optimum course forward display image 6Da or course rearward display image 6Db to be displayed, based on the current position and orientation of the vehicle 10D acquired by the vehicle state information acquisition unit 30D, the course acquired by the vehicle course acquisition unit 20D, and the surrounding information acquired by the surrounding information detection unit 16D (step 8D).
  • the image selection unit 40D outputs a display command to the road surface drawing unit 50D according to the image data of the selected course front display image 6Da or the course rear display image 6Db.
• The road surface drawing unit 50D retrieves from the course display image data storage unit 41D the image data of the course display image 6D selected by the image selection unit 40D, and uses the lighting control unit 51D and the lighting unit 52D to display the course forward display image 6Da on the road surface in front of the vehicle 10D or the course rearward display image 6Db on the road surface behind the vehicle 10D (step 9D).
  • steps 1D to 9D are repeated at predetermined intervals until the vehicle 10D arrives at the destination.
  • image data is sequentially reselected by the image selection unit 40D according to the position of the vehicle 10D, and the optimum course display image 6D is always displayed at the optimum position (step 10D).
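The display decision above can be condensed into the sketch below. The distances correspond to the first distance L1, second distance L2, and third distance L3 from the text; the data structure, the example threshold values, and the return strings are assumptions for illustration only.

```python
from __future__ import annotations
from dataclasses import dataclass
from typing import Optional

@dataclass
class Surroundings:
    dist_to_intersection_m: float
    front_gap_m: Optional[float]   # None when no vehicle ahead
    rear_gap_m: Optional[float]    # None when no vehicle behind

def course_images_to_display(s: Surroundings, l1_m: float, l2_m: float,
                             l3_m: float = 30.0) -> list[str]:
    """Return which course display images 6D should be projected this cycle."""
    images: list[str] = []
    if s.dist_to_intersection_m > l3_m:                    # step 4D: not yet at a display position
        return images
    if s.front_gap_m is None or s.front_gap_m >= l1_m:     # step 5D: front gap at least L1
        images.append("6Da: course forward display image")
    if s.rear_gap_m is None or s.rear_gap_m >= l2_m:       # step 7D: rear gap at least L2
        images.append("6Db: course rearward display image")
    return images

# Vehicle 25 m before the intersection, nothing ahead, a follower 8 m behind (L1 = 12 m, L2 = 20 m assumed).
print(course_images_to_display(Surroundings(25.0, None, 8.0), l1_m=12.0, l2_m=20.0))
```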
• In the above description, the control unit 90D determines whether or not the course display image 6D (the course forward display image 6Da and the course rearward display image 6Db) needs to be displayed from the course information, vehicle state information, and detection information of the vehicle 10D; however, the image selection unit 40D may make this determination. In that case, the image selection unit 40D first determines whether the course display image 6D needs to be displayed and then selects a suitable course display image.
  • the second direction may be a left turn direction.
  • Vehicle 10Da changes its course from the straight direction, which is the first direction, to the right-turn direction, which is the second direction. Since there is no other vehicle in front of the vehicle 10Da, the image selection unit 40D selects the course forward display image 6Da, and the road surface drawing unit 50D uses the front lighting units 52R and 52L to project the course forward display image 6Da, an arrow indicating a right turn, onto the road surface in front of the vehicle 10Da. Further, since the vehicle 10Db is located behind the vehicle 10Da, the vehicle 10Da does not display a course display image on the road surface behind it.
  • The vehicle 10Db does not display the course forward display image 6Da because the vehicle 10Da exists within the first distance L1 ahead of it. Further, since the vehicle 10Dc exists within the second distance L2 behind it, the course rear display image 6Db is not displayed either. Once there is no other vehicle in front of the vehicle 10Db, for example when the vehicle 10Da enters the intersection and the distance to it becomes the first distance L1 or more, the course forward display image 6Da is displayed.
  • Vehicle 10Dc does not display the course forward display image 6Da because the vehicle 10Db exists within the first distance L1 ahead of it. Further, since there is no other vehicle within the second distance L2 behind it, the course rear display image 6Db indicating the course direction is projected onto the rear road surface by the lighting units 52BR and 52BL on the rear of the vehicle 10Dc. By displaying the course rear display image 6Db on the rear road surface in this way, the course can be shown to surrounding pedestrians and the like.
  • In the above example, the course forward display image 6Da is projected on the road surface in front of the vehicle 10D and the course rear display image 6Db is projected on the road surface behind it; that is, one vehicle 10D displays the course display image 6D on each of the front and rear road surfaces. However, the present disclosure is not limited to this, and the course display image 6D may be displayed only on the road surface in front.
  • the course front display image 6Da is not displayed when another vehicle exists within the first front distance L1.
  • the first distance L1 in the present embodiment is "braking distance + length for one vehicle".
  • the braking distance is the distance traveled from when the brake of the vehicle 10D starts to work until it stops, is proportional to the square of the speed and the vehicle weight, and is inversely proportional to the braking force.
  • The braking distance for each speed of the vehicle 10D is obtained (set) in advance, and the control unit 90D sequentially obtains the braking distance from the detected speed of the vehicle 10D and calculates the first distance L1. In the present embodiment, the length of one vehicle 10D is set to 5 m.
  • the control unit 90D compares the first distance L1 with the distance to another vehicle in front, and determines the necessity of displaying the course forward display image 6Da.
  • The second distance L2, which is the criterion for displaying the course rear display image 6Db, is "stopping distance + length for one vehicle".
  • the stopping distance is the free running distance + the braking distance.
  • The free-running distance is the distance the vehicle travels from the moment the driver feels it necessary to stop until the driver presses the brake and it begins to take effect. The stopping distance is therefore the distance traveled from the moment the driver decides to stop the vehicle 10D until the vehicle 10D actually stops.
  • The reaction time required for the driver to step on the brake varies from person to person, but in the present embodiment it is set to 0.75 seconds.
  • The free-running distance is thus obtained as the detected speed × 0.75 seconds, and the stopping distance is calculated from the free-running distance and the braking distance obtained as described above.
  • The control unit 90D compares the second distance L2 obtained in this way with the distance to another vehicle behind, and determines whether the course rear display image 6Db needs to be displayed. A rough numeric sketch of the L1 and L2 calculations is given below.
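The following rough sketch illustrates the L1 and L2 calculations described above. The vehicle length (5 m) and reaction time (0.75 s) are the values given in the embodiment; the proportionality constant for the braking distance is purely an illustrative assumption, since the embodiment obtains the braking distance from a pre-set table per speed.

```python
REACTION_TIME_S = 0.75    # driver reaction time assumed in the embodiment
VEHICLE_LENGTH_M = 5.0    # length for one vehicle assumed in the embodiment


def braking_distance_m(speed_mps: float, k: float = 0.065) -> float:
    """Braking distance grows with the square of the speed; k bundles the
    vehicle-weight / braking-force terms and is an illustrative value only."""
    return k * speed_mps ** 2


def first_distance_l1(speed_mps: float) -> float:
    """L1 = braking distance + one vehicle length."""
    return braking_distance_m(speed_mps) + VEHICLE_LENGTH_M


def second_distance_l2(speed_mps: float) -> float:
    """L2 = stopping distance + one vehicle length, where
    stopping distance = free-running distance + braking distance."""
    free_running = speed_mps * REACTION_TIME_S
    return free_running + braking_distance_m(speed_mps) + VEHICLE_LENGTH_M


speed = 50 / 3.6  # 50 km/h expressed in m/s
print(round(first_distance_l1(speed), 1), round(second_distance_l2(speed), 1))
```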
  • The third distance L3 is determined as described above. That is, the Road Traffic Act Enforcement Ordinance stipulates that, when performing an action such as a left turn, a signal such as turning on the blinker lamp must be given upon reaching a point 30 m before the near-side edge of the intersection. The distance from the intersection to the point where this signal must be given is therefore defined as the third distance L3.
  • Even if another vehicle is located within the second distance L2 behind it, the rearmost vehicle 10D within the third distance L3 may be controlled to display the course rear display image 6Db. Whether the vehicle 10D is the rearmost vehicle within the third distance L3 is judged from conditions such as whether the distance from the position of the vehicle 10D back to the end of the third distance L3 (the point 30 m before the intersection where a signal must be given) is shorter than the distance to the other vehicle behind. One way to read this check is sketched below.
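A hedged sketch of the rearmost-vehicle check: the vehicle is treated as the last one inside the L3 zone if the gap back to the start of that zone is shorter than the gap to the vehicle detected behind it.

```python
def is_rearmost_in_l3_zone(dist_to_zone_start: float,
                           dist_to_rear_vehicle: float | None) -> bool:
    """dist_to_zone_start: distance from this vehicle back to the point 30 m
    before the intersection (the end of the third distance L3).
    dist_to_rear_vehicle: distance to the nearest vehicle behind, or None."""
    if dist_to_rear_vehicle is None:
        return True  # nothing detected behind at all
    return dist_to_zone_start < dist_to_rear_vehicle
```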
  • The leading vehicle 10Da shows its course on the front road surface, and the rearmost vehicle 10Dc shows its course by projecting the course display image 6D onto the rear road surface, so that surrounding traffic can predict their movements and be alerted.
  • The following vehicle 10Db, which continues at a relatively short distance from the leading vehicle 10Da, does not project the course display image 6D, so the projected light is not diffusely reflected off the rear of the vehicle 10Da ahead and such diffuse reflection does not obstruct the view of the driver of the vehicle 10Db.
  • FIG. 21 is a schematic view showing the course display images 6D of the vehicles 10Da and 10Db and the two-wheeled vehicle 10Dd at an intersection in the ninth embodiment. Description of contents overlapping with the eighth embodiment is omitted.
  • Since the vehicle 10Da exists within the first distance L1 ahead of the vehicle 10Db, the vehicle 10Db does not project the course forward display image 6Da.
  • Behind the vehicle 10Db, the motorcycle 10Dd exists within the second distance L2.
  • In the eighth embodiment the course rear display image 6Db would not be displayed in this situation, but in the present embodiment the course rear display image 6Db is displayed.
  • To prevent the two-wheeled vehicle 10Dd from being caught in the turn, the control unit 90D determines that the course rear display image 6Db is to be displayed and sends an image display command to the image selection unit 40D.
  • The image selection unit 40D selects, from the course display image data storage unit 41D, the data of the image showing the course of the vehicle 10Db, which in the example shown in FIG. 21 is the left-turn image, and outputs a command to the road surface drawing unit 50D.
  • the road surface drawing unit 50D projects a course rear display image 6Db indicating a left turn onto the rear road surface of the vehicle 10Db.
  • The lighting control unit 51D controls the lighting unit 52D so that the image is projected at a position visible to the driver of the two-wheeled vehicle 10Dd without causing glare in the driver's field of view.
  • For example, the road surface drawing unit 50D may project using only the right lighting unit 52BR instead of both lighting units 52BR and 52BL, or may adjust the projection angle.
  • When the vehicle 10Db turns left with the motorcycle 10Dd located behind it, there is a high risk that the motorcycle 10Dd will be caught in the turn. Therefore, the road surface drawing unit 50D projects the course rear display image 6Db only when the motorcycle 10Dd exists within the second distance L2 behind the vehicle 10Db and the course of the vehicle 10Db is a left turn; even if the motorcycle 10Dd is at the same position, if the course of the vehicle 10Db is straight ahead or a right turn, the road surface drawing unit 50D does not have to display the course rear display image 6Db. A sketch of this exception is shown below.
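A sketch of the exception introduced in this embodiment (the names and categories are illustrative): the rear image that would otherwise be suppressed is shown when a two-wheeled vehicle is detected within L2 behind and the planned course is a left turn, and optionally when a pedestrian is detected, as described in the next item.

```python
def show_rear_image(course: str,
                    rear_object: str | None,
                    dist_to_rear_object: float | None,
                    l2: float) -> bool:
    """course: 'left', 'right' or 'straight'.
    rear_object: 'car', 'motorcycle', 'pedestrian' or None."""
    if (rear_object is None or dist_to_rear_object is None
            or dist_to_rear_object > l2):
        return True       # nothing close behind: normal rear display
    if rear_object == "motorcycle" and course == "left":
        return True       # warn the motorcycle about the left turn
    if rear_object == "pedestrian":
        return True       # optional extra caution for pedestrians
    return False          # otherwise keep the rear image off
```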
  • When a pedestrian is detected within the second distance L2, the course rear display image 6Db may also be displayed. That is, from the line of sight of a pedestrian located within the second distance L2, the blinker lamp of the vehicle 10Db may be difficult to notice, and the road surface drawing unit 50D may display the course rear display image 6Db to call further attention. Even when a pedestrian is detected, the lighting control unit 51D controls the lighting unit 52D so as not to cause glare in the pedestrian's field of view and to project the course rear display image 6Db at a position where it can be accurately recognized.
  • In this way, the road surface drawing unit 50D displays the course rear display image 6Db in order to prevent, from another viewpoint, the danger of the motorcycle 10Dd being caught in the turn.
  • The course of the vehicle 10Db can thereby be accurately indicated to the two-wheeled vehicle 10Dd in its vicinity, and the driver of the two-wheeled vehicle 10Dd can be alerted.
  • In the above embodiments, the projection of the course display image 6D is controlled based on the surrounding information detected by the surrounding information detection unit 16D included in the vehicle 10D.
  • In the tenth embodiment, not only the detection members of the vehicle 10D but also a device installed in advance on the road is used to control the projection of the course display image 6D.
  • The tenth embodiment will be described with reference to FIG. 4A. Description of contents overlapping with the eighth and ninth embodiments is omitted.
  • the infrastructure side device 30 may be provided with an infrastructure side surrounding information detection unit (another vehicle detection unit or a situation grasping unit) that detects the surrounding state of the infrastructure side device 30 such as an imaging camera.
  • In the present embodiment, the infrastructure-side surrounding information detection unit is provided together with the light irradiation unit 31 and may detect surrounding information by photographing the road surface from above.
  • the present disclosure is not limited to this, and the infrastructure side surrounding information detection unit may be provided at another place such as a support column of the infrastructure side device 30.
  • FIG. 22 is a schematic view showing the relationship between the vehicle 10Da, the motorcycle 10Dd, and the infrastructure side device 30 of the present embodiment at an intersection.
  • The infrastructure side device 30 detects surrounding information, for example by imaging the road surface, and detects that the vehicle 10Da is located within the range in front of the intersection where the course must be indicated and that the motorcycle 10Dd is located within the second distance L2 behind the vehicle 10Da.
  • the infrastructure side device 30 is provided with a control unit that determines whether or not to display the course display image 6D from the detected image.
  • The control unit determines, from the surrounding information detected by the infrastructure-side surrounding information detection unit, that no other vehicle exists within the first distance L1 ahead of the vehicle 10Da and that the course forward display image 6Da is to be displayed. Further, since the motorcycle 10Dd is located behind the vehicle 10Da, it determines that the course rear display image 6Db is also to be displayed.
  • The information and communication unit of the infrastructure side device 30 sends the vehicle 10Da a command to display the course forward display image 6Da and the course rear display image 6Db by road-to-vehicle communication.
  • the wireless communication unit 15D receives the command from the infrastructure side device 30 and sends it to the vehicle status information acquisition unit 30D.
  • The control unit 90D receives the command via the vehicle state information acquisition unit 30D and the route information indicating a left turn from the vehicle course acquisition unit 20D, and commands the image selection unit 40D to display the course forward display image 6Da and the course rear display image 6Db for a left turn.
  • The predetermined course display image 6D is then selected in the same manner as in the eighth and ninth embodiments, and the road surface drawing unit 50D displays the course forward display image 6Da on the road surface in front of the vehicle 10Da and the course rear display image 6Db on the road surface behind the vehicle 10Da.
  • the present embodiment determines the display of the course display image 6D based on the command from the infrastructure side device 30.
  • For the image display on the road surface near the intersection, that is, near the infrastructure side device 30, the image may be projected not only by the lighting unit 52D of the vehicle 10Da but also by the light irradiation unit 31 of the infrastructure side device 30.
  • Alternatively, the vehicle course acquisition unit 20D of the vehicle 10Da may notify the infrastructure side device 30 of the course information via the wireless communication unit 15D, the control unit of the infrastructure side device 30 may determine the appropriate course display image 6D based on the received course information, and the infrastructure side device 30 may command the vehicle 10Da to display the determined course display image 6D. That is, the infrastructure side device 30 handles the processing up to the selection of the course display image 6D, and the vehicle 10Da simply displays the data of the course display image 6D sent from the infrastructure side device 30. In this case, the vehicle 10Da does not need to store a plurality of display image data.
  • In the present embodiment, since the display of the course display image 6D of each vehicle 10D is controlled using the infrastructure side device 30, the display of a plurality of vehicles 10D in the vicinity of the infrastructure side device 30 can be controlled collectively, enabling integrated display control across the plurality of vehicles 10D. That is, each vehicle 10D can display an integrated course display image 6D regardless of the control capability of its individual control unit 90D.
  • In the above embodiments, configurations using the detection information of the surrounding information detection unit 16D provided on the vehicle 10D and the detection information of the infrastructure-side surrounding information detection unit of the infrastructure side device 30 have been described.
  • the present disclosure is not limited to these, and other vehicles may be detected by vehicle-to-vehicle communication between vehicles 10D.
  • the wireless communication unit 15D may be used for vehicle-to-vehicle communication.
  • The vehicle 10D may also receive the position of other vehicles, the position of the own vehicle, and the like using information from the satellite positioning system connected to the navigation system 11D, and may determine from the received information whether another vehicle exists within the predetermined distance.
  • In the above description, a right turn and a left turn at an intersection are given as examples, but the present disclosure is not limited to these and can also be applied to a lane change when there are a plurality of lanes.
  • The vehicle 10D is not limited to a general five-seater passenger car; the present disclosure can be applied to a wide variety of vehicles such as trucks, trailers, and motorcycles.
  • FIG. 23 is a block diagram of the road surface drawing device 900 according to the eleventh embodiment.
  • the road surface drawing device 900 of the present disclosure includes a control unit 90E, a vehicle course acquisition unit 20E, a vehicle state information acquisition unit 30E, an image selection unit 40E, and an image drawing unit 50E.
  • the control unit 90E controls various devices of the vehicle 10E.
  • The control unit 90E is connected to various sensors and to external devices for monitoring the outside of the vehicle 10E, such as the navigation system 11E, the direction indicator 12E, the in-vehicle camera 13E, the sensor 14E, and the wireless communication unit 15E, and inputs and outputs various signals and data.
  • the vehicle-mounted camera 13E, the sensor 14E, and the wireless communication unit 15E that detect the front information of the vehicle 10E are collectively referred to as the front information detection unit 16E.
  • the front information detection unit 16E is not limited to the in-vehicle camera 13E, the sensor 14E, and the wireless communication unit 15E, and may be any one that obtains information around the vehicle 10E, particularly in front.
  • the front information detection unit 16E may include a mechanism for obtaining an image captured by another device outside the vehicle.
  • the vehicle state information acquisition unit 30E acquires the state information of the vehicle 10E such as the position, orientation, and speed of the vehicle 10E.
  • the vehicle state information acquisition unit 30E acquires the position information and orientation information of the vehicle 10E from the navigation system 11E.
  • the vehicle state information acquisition unit 30E is connected to the steering angle detection unit 31E and the speed sensor 32E.
  • The steering angle detection unit 31E detects the steering angle of the steering device 31Ea (not shown) of the vehicle 10E.
  • the steering angle detection unit 31E is attached to the steering wheel of the vehicle 10E and detects the steering angle from the reference position of the steering wheel.
  • the speed sensor 32E detects the traveling speed of the vehicle 10E and the like.
  • The vehicle state information acquisition unit 30E obtains the direction of the vehicle 10E from the steering angle detected by the steering angle detection unit 31E, and obtains the position and orientation of the vehicle 10E after a predetermined time from the speed and acceleration detected by the speed sensor 32E; a rough sketch of such a prediction is given after the next item.
  • the vehicle state information acquisition unit 30E also obtains the arrival time (required time) of the vehicle 10E to a predetermined position in the course.
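As a rough sketch of the pose prediction mentioned above, the following uses a simple kinematic bicycle model; the wheelbase value and the model itself are illustrative assumptions, not taken from the embodiment.

```python
import math


def predict_pose(x: float, y: float, heading: float,
                 speed: float, accel: float, steering_angle: float,
                 dt: float, wheelbase: float = 2.7) -> tuple[float, float, float]:
    """Dead-reckon the pose (x, y, heading) of the vehicle after dt seconds.

    heading and steering_angle are in radians; wheelbase (in metres) is an
    assumed value for a typical passenger car.
    """
    v = speed + accel * dt / 2.0                  # mean speed over the step
    yaw_rate = v * math.tan(steering_angle) / wheelbase
    new_heading = heading + yaw_rate * dt
    mean_heading = heading + yaw_rate * dt / 2.0  # mid-step heading
    new_x = x + v * dt * math.cos(mean_heading)
    new_y = y + v * dt * math.sin(mean_heading)
    return new_x, new_y, new_heading
```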
  • The image selection unit 40E selects the course display image 6E to be projected onto the road surface in front of the vehicle 10E by the lighting unit 52E, as described later. That is, in order to accurately convey the course of the vehicle 10E to other vehicles and pedestrians when the course changes, such as when turning right or left, the road surface drawing device 900 of the present embodiment projects the course display image 6E showing the course direction onto the road surface. The image selection unit 40E therefore selects the display image data so that an accurate course display image 6E can always be displayed according to the position and orientation of the vehicle 10E.
  • the image selection unit 40E is connected to the course display image data storage unit 41E in which a plurality of data of the course display image 6E indicating a right turn, a left turn, a course, etc. are stocked.
  • The image selection unit 40E selects, from the course display image data storage unit 41E, the data of the course display image 6E indicating the direction in which the vehicle 10E is to proceed, based on the course information of the vehicle 10E acquired by the vehicle course acquisition unit 20E and the vehicle state information, such as the direction, position, and speed of the vehicle 10E, acquired by the vehicle state information acquisition unit 30E.
  • the image selection unit 40E sends a command for instructing the data of the selected course display image 6E to the image drawing unit 50E.
  • the image drawing unit 50E includes a lighting control unit 51E and a lighting unit 52E.
  • The image drawing unit 50E takes the data of the course display image instructed by the image selection unit 40E out of the course display image data storage unit 41E and sends it to the lighting control unit 51E.
  • the lighting control unit 51E converts data into a form suitable for the lighting unit 52E, and the lighting unit 52E irradiates light from a light source according to the converted data and projects a predetermined course display image onto the road surface in front of the vehicle 10E.
  • FIG. 24 is a flowchart showing the flow of processing by which the road surface drawing device 900 displays the course display image 6E.
  • FIG. 25 is a schematic view showing a course 7E and a course display image 6E when turning right at an intersection of the vehicle 10E. As shown in FIG. 24, the road surface drawing device 900 of the eleventh embodiment displays the course display image 6E near the intersection in the following flow.
  • the vehicle course acquisition unit 20E recognizes and determines that the vehicle 10E turns right as shown by the course 7E (step 1E).
  • the vehicle state information acquisition unit 30E acquires the position and orientation information of the vehicle 10E from the navigation system 11E. Further, the steering angle of the vehicle 10E is acquired from the steering angle detection unit 31E, and the traveling speed of the vehicle 10E is acquired from the speed sensor 32E (step 2E).
  • The control unit 90E determines, based on the position and orientation information and the route information of the vehicle 10E, whether the vehicle 10E is located at a position where the course display image 6E should be displayed, such as when entering an intersection or changing course (step 3E). When the control unit 90E determines that the vehicle is at such a position, it sends the acquired information to the image selection unit 40E. The image selection unit 40E then selects the image data of the optimum course display image 6E to be displayed from the course display image data storage unit 41E, based on the current position, orientation, and course of the vehicle 10E.
  • the vehicle state information acquisition unit 30E acquires that the vehicle 10E is located 30 m before the intersection in a straight-ahead state, and the vehicle course acquisition unit 20E acquires the course 7E.
  • the image selection unit 40E selects the image data of the course display image 6E indicating that the vehicle 10E turns right from the acquired information, and issues a command to the image drawing unit 50E (step 4E).
  • The image drawing unit 50E takes the image data of the course display image 6E selected by the image selection unit 40E out of the course display image data storage unit 41E and, using the lighting control unit 51E and the lighting unit 52E, displays the course display image 6E on the road surface in front of the vehicle 10E (step 5E). At the position of the vehicle 10E shown in FIG. 25, the vehicle has not yet entered the intersection, so the course display image 6E indicating a right turn is displayed to express the intention to turn right.
  • step 1E to step 5E are repeated at predetermined intervals until the vehicle 10E arrives at the destination (step 6E).
  • the image selection unit 40E reselects image data sequentially according to the position of the vehicle 10E, and the optimum course display image 6E is always displayed.
  • FIG. 26 is a schematic view showing a method of selecting the course display image 6E in the present embodiment.
  • FIG. 26 shows a stage in which the vehicle 10E turns to the right and makes a right turn.
  • the course display image 6E in the present embodiment includes a branch line 6Ea parallel to the direction of the current vehicle 10E and a direction line 6Eb extending from the branch line 6Ea in the course direction.
  • The angle between the branch line 6Ea and the direction line 6Eb (hereinafter also referred to as the "traveling angle γ") changes sequentially according to the steering angle and the direction of the vehicle obtained from the navigation system 11E. That is, the angle of the direction line 6Eb of the course display image 6E changes sequentially as the vehicle 10E proceeds through the intersection.
  • course direction t can be known from the course information acquired by the vehicle course acquisition unit 20E.
  • The angle α from the straight direction s to the course direction t (hereinafter referred to as the "course angle α") is calculated from the straight direction s of the vehicle 10E, indicated by the broken arrow, and the course direction t of the vehicle 10E.
  • The steering angle of the vehicle 10E is detected by the steering angle detection unit 31E. The vehicle state information acquisition unit 30E then calculates the cumulative movement amount of the vehicle 10E from the displacement of the position information from the navigation system 11E up to a given position after the steering angle is detected, and calculates the inclination angle β of the vehicle 10E at that position from the steering angle and the cumulative movement amount. The direction u of the vehicle 10E (indicated by the broken arrow u) can then be known from this inclination angle β. The traveling angle γ is the sum of the course angle α and the inclination angle β. Therefore, in the course display image 6E, the branch line 6Ea is determined so as to lie along the direction u, and the direction line 6Eb is determined so as to lie along the course direction t of the vehicle 10E at each point.
  • The inclination angle β arises from the cumulative movement amount. That is, if the steering angle is constant, the vehicle 10E advances along the arc of its turning radius, so the direction of the vehicle, and hence the inclination angle, changes according to the amount the vehicle 10E has moved. The inclination angle β is therefore calculated from the relationship between the steering angle and the cumulative movement amount. For example, FIG. 27B shows the vehicle 10E partway through a turn at the intersection.
  • At this point, the course display image 6E has a shape along the course 7E shown in FIG. 25.
  • Once the direction u of the vehicle 10E comes to coincide with the course direction t, the traveling angle γ is 180 degrees. That is, when the vehicle 10E changes its course at the intersection from the straight direction, which is the first direction, to the right-turn direction, which is the second direction, the course display image 6E becomes a line indicating the second direction from the vehicle 10E.
  • In this way, the inclination angle β of the vehicle 10E is obtained from the steering angle and the cumulative movement amount of the vehicle 10E.
  • The direction u and the traveling angle γ of the vehicle 10E are then obtained using the inclination angle β, and the course display image 6E is generated (selected) from the relationship between the direction u and the course direction t. The course display image 6E is therefore regenerated (reselected) as the steering angle, the cumulative movement amount of the vehicle 10E (recalculated from the vehicle position and the like), and the vehicle state information change.
  • The image selection unit 40E may select, from the plurality of course display images 6E stored in memory, an image close to the traveling angle γ obtained as described above, or may generate the course display image 6E from the obtained traveling angle γ. Since the position and steering angle of the vehicle 10E are acquired at predetermined intervals, the course display image 6E is reselected (or regenerated) and displayed every time the direction of the vehicle 10E changes. As a result, after the vehicle 10E enters the intersection, the course display image 6E is displayed along the course 7E. A hedged sketch of the angle calculation is given below.
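A hedged sketch of the angle calculation described above: the inclination angle β is derived from the steering angle and the cumulative movement amount by assuming the vehicle follows the arc of a bicycle-model turning radius (the wheelbase is an assumed value), and the traveling angle γ then follows the relation γ = α + β stated in the text.

```python
import math

WHEELBASE_M = 2.7  # assumed wheelbase; the embodiment does not specify one


def inclination_angle_beta(steering_angle: float, cumulative_distance: float) -> float:
    """Change of vehicle heading (radians) after travelling cumulative_distance
    metres with a constant steering angle: beta = s / R, where the turning
    radius is R = wheelbase / tan(steering_angle)."""
    if abs(steering_angle) < 1e-6:
        return 0.0  # effectively straight ahead: heading unchanged
    turning_radius = WHEELBASE_M / math.tan(steering_angle)
    return cumulative_distance / turning_radius


def traveling_angle_gamma(course_angle_alpha: float,
                          steering_angle: float,
                          cumulative_distance: float) -> float:
    """gamma = alpha + beta: the angle between the branch line 6Ea (along the
    vehicle direction u) and the direction line 6Eb (along the course direction t)."""
    return course_angle_alpha + inclination_angle_beta(steering_angle,
                                                       cumulative_distance)
```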
  • the branch line 6Ea and the direction line 6Eb are directly connected, but they may be connected through a gently curved portion (curve line).
  • The curvature of the curved portion (curve line) is determined according to the traveling angle γ, that is, the angle from the direction u of the vehicle to the course direction t.
  • In the eleventh embodiment, the course display image 6E is obtained using the steering angle together with the vehicle state information such as the position and orientation of the vehicle.
  • In contrast, the road surface drawing device 900 of the twelfth embodiment obtains the course display image 6E from the position of the vehicle 10E detected by the GPS of the navigation system 11E and the direction of the vehicle 10E.
  • Before entering the intersection, the traveling angle γ of the course display image 6E is the course angle α formed by the straight direction s and the course direction t, so that the turning direction is clearly indicated.
  • As the vehicle 10E turns, the traveling angle γ becomes more obtuse and the line of the course display image 6E changes toward a straight line.
  • In the present embodiment, the traveling angle γ is not determined with reference to the course angle α formed by the straight direction s and the course direction t; instead, the course display image 6E is displayed according to the change in the direction of the vehicle 10E, so the traveling angle γ, that is, the inclination of the line of the course display image 6E, changes accordingly.
  • the vehicle course acquisition unit 20E obtains the course 7E at the intersection indicated by the broken line arrow based on the information from the navigation system 11E. Further, the vehicle state information acquisition unit 30E acquires the position and orientation information of the vehicle 10E from the position information from the GPS (satellite positioning system) of the navigation system 11E and the progress state of the vehicle 10E.
  • The image selection unit 40E selects the course display image 6E that best follows the course, based on the direction (inclination) of the course 7E at the acquired position of the vehicle 10E, and the image drawing unit 50E displays the selected course display image 6E. Specifically, the course display image 6E is displayed as follows.
  • FIGS. 28A to 28D are schematic views showing the course display image 6E of the vehicle 10E when turning right in the present embodiment, according to the position of the vehicle 10E.
  • As shown in FIG. 28A, when the vehicle 10E enters the intersection, it turns gradually from the straight-ahead direction, so a course display image 6EA composed of a gently inclined curve is displayed.
  • the course display image 6EA is relatively long so that the direction of the turn can be seen.
  • As the vehicle 10E approaches the center of the intersection, as shown in FIGS. 28B and 28C, the course display images 6EB and 6EC, composed of inclined curves, are displayed.
  • The course display images 6EB and 6EC are shorter than the preceding course display image 6EA, while still making the turning direction clear.
  • After the vehicle 10E has completed the turn, as shown in FIG. 28D, the course display image 6ED, consisting of a straight line extending in the course direction (right-turn direction) with no inclination, is displayed.
  • The length of the course display image 6ED is such that other vehicles and pedestrians can recognize that the vehicle is now in the straight-ahead direction.
  • the course display images 6EA, 6EB, 6EC, and 6ED along the course 7E are selected and displayed by the image selection unit 40E according to the direction of the vehicle 10E at each position at the intersection.
  • The course display image 6EA, a relatively long curve, is displayed at the position where the vehicle 10E enters the intersection so that oncoming vehicles and pedestrians can easily recognize the course of the vehicle 10E, and near the center of the intersection the course display images 6EB and 6EC, which are relatively short curves, are displayed. That is, when the vehicle 10E changes course at an intersection from the straight direction, which is the first direction, to the course direction, which is the second direction, the course display image 6E becomes a curve extending from the vehicle in the second direction.
  • FIGS. 29A to 29D are schematic views showing a modified example of the course display image 6E in the present embodiment.
  • FIG. 29A shows a course display image 6EA at a position where the vehicle 10E enters the intersection.
  • FIG. 29B shows the course display image at a position just before the center of the intersection, and FIG. 29C shows the course display images 6EB and 6EC when the vehicle 10E is located near the center of the intersection.
  • FIG. 29D shows the course display image 6ED at the position after the vehicle 10E has turned the intersection.
  • the course display image 6E is a straight line connecting the vehicle 10E and the course 7E.
  • As the vehicle 10E proceeds through the turn, the straight line of the course display image 6E becomes shorter, so that the course direction is displayed in an easy-to-understand manner. The length of the straight line of the course display image 6E is therefore predetermined in relation to the position and direction (that is, the inclination toward the course direction) of the vehicle 10E.
  • the course display image 6E is a straight line extending from the vehicle in the second direction when the vehicle 10E changes course from the straight direction which is the first direction to the right direction which is the second direction. Therefore, the length of the course display image 6E is determined according to the direction of the vehicle 10E.
  • the course 7E is determined by the information from the navigation system 11E, the position of the vehicle 10E is specified by GPS, and the line connecting the vehicle 10E to the course direction is set as the course display image 6E.
  • In this way, the course display image 6E is displayed along the course 7E; a hedged sketch of this line selection is given below.
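A hedged sketch of the line selection in this embodiment: the image is taken as a straight segment from the GPS position of the vehicle toward the course direction, and its length is shortened as the vehicle's heading approaches the course direction. The length scaling and the numeric limits are illustrative choices, not values from the embodiment.

```python
import math


def course_line_segment(vehicle_xy: tuple[float, float],
                        vehicle_heading: float,
                        course_direction: float,
                        max_len: float = 8.0,
                        min_len: float = 2.0):
    """Return the two endpoints of the straight course display line.

    vehicle_heading and course_direction are absolute angles in radians;
    max_len and min_len are illustrative lengths in metres."""
    remaining = abs(math.remainder(course_direction - vehicle_heading, 2 * math.pi))
    frac = min(remaining / (math.pi / 2), 1.0)  # 0 when aligned, 1 at 90 deg or more
    length = min_len + (max_len - min_len) * frac
    x, y = vehicle_xy
    end = (x + length * math.cos(course_direction),
           y + length * math.sin(course_direction))
    return (x, y), end
```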
  • Even when the position of the vehicle 10E changes, for example in the automatic driving mode, the image selection unit 40E can always quickly reselect the appropriate course display image 6E and draw it on the road surface.
  • FIG. 30A is a schematic view of the intersection showing the time transition of the course display image 6E at the intersection.
  • FIGS. 30B to 30D are schematic perspective views showing the vehicle 10E and the projection of the course display image 6E as time passes at the intersection.
  • the course display image 6E displayed according to the position of the vehicle 10E is projected so that the drawing position on the road surface overlaps the drawing position projected immediately before. Specifically, it is as follows.
  • the course display image 6EA is projected on the road surface when the vehicle 10E is in the position of entering the intersection.
  • The irradiation angle of the light from the lighting unit 52E is set to an angle θ with respect to the road surface G.
  • the road surface G is photographed by the vehicle-mounted camera 13E, and the vehicle state information acquisition unit 30E confirms the position of the course display image 6EA on the road surface.
  • As the vehicle 10E advances, the image selection unit 40E reselects the course display image according to the position and orientation of the vehicle 10E, and the reselected course display image 6EB is to be projected onto the road surface.
  • At this time, the road surface G at the scheduled display position is imaged by the in-vehicle camera 13E, and it is confirmed whether the new image will overlap the position where the previous course display image 6EA was projected and whether the road surface has unevenness or inclination.
  • The irradiation direction of the lighting unit 52E is adjusted so that the irradiation angle θ is constant with respect to the road surface G. Then, as shown in FIG. 30C, the irradiation direction of the lighting unit 52E is adjusted so that the reselected course display image 6EB partially overlaps the previously projected course display image 6EA (the image shown by a broken line). In this way, the lighting unit 52E adjusts the irradiation direction according to the state of the road surface G and projects the course display images 6EA, 6EB, 6EC, and 6ED so that the irradiation angle θ is constant with respect to the road surface G; a small geometric sketch is given below.
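A small geometric sketch of the constant-angle projection, under the simplifying assumption that the road ahead is a plane tilted by a measured slope: the lamp pitch relative to the horizontal is chosen so that the beam meets the road surface at the constant angle θ, and the drawing position is then compared with the previous one to check for overlap.

```python
import math


def lamp_pitch_for_constant_angle(theta_to_road: float, road_slope: float) -> float:
    """Pitch of the beam below the horizontal (radians) so that the angle
    between the beam and the tilted road plane stays at theta_to_road.
    road_slope > 0 means the road rises ahead of the vehicle."""
    return theta_to_road - road_slope


def drawing_distance_ahead(lamp_height: float, theta_to_road: float) -> float:
    """Distance along the road surface from the point below the lamp to the
    drawn image, with lamp_height measured perpendicular to the road plane."""
    return lamp_height / math.tan(theta_to_road)


def overlaps_previous(prev_center_along_road: float, advance_since_prev: float,
                      new_center_along_road: float, image_length: float) -> bool:
    """True if the newly drawn image still overlaps the previous drawing,
    with all positions measured along the road from the current vehicle."""
    prev_center_now = prev_center_along_road - advance_since_prev
    return abs(new_center_along_road - prev_center_now) < image_length
```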
  • the driver of the vehicle 10E can always recognize the course display image 6E in a constant field of view. Further, since a constant projection state can always be maintained with respect to the road surface G regardless of the unevenness or inclination of the road surface G, other vehicles or pedestrians on the opposite side can easily recognize the course display image 6E.
  • As a result, the course display images 6EA, 6EB, 6EC, and 6ED can be projected continuously along the course 7E.
  • The course display image 6E is displayed continuously up to the lane after the right turn, as if a line had been drawn, and combined with the viewer's visual afterimage effect it becomes very easy for surrounding pedestrians and other vehicles to recognize the course of the vehicle 10E at the intersection.
  • FIGS. 31A and 31B are schematic views showing a course display image at an intersection in the fourteenth embodiment.
  • FIG. 31A is a schematic view showing a course display image 61E at the time when the vehicle 10E trying to turn right enters an intersection.
  • FIG. 31B is a schematic view showing a course display image 62E after the vehicle 10E detects a pedestrian.
  • the course display image 61E is a linear image.
  • the front information detection unit 16E including the vehicle-mounted camera 13E and the sensor 14E of the present embodiment detects the front of the vehicle 10E and detects the presence or absence of a pedestrian B in front of the vehicle 10E. Then, the forward information detection unit 16E sends the detection result to the vehicle state information acquisition unit 30E.
  • the vehicle state information acquisition unit 30E detects that a pedestrian B is in front of the vehicle 10E from the information of the vehicle-mounted camera 13E and the sensor 14E, and sends the information to the image selection unit 40E.
  • When no pedestrian is detected, the image selection unit 40E selects the course display image 61E, which is displayed with a normal thickness.
  • When the pedestrian B is detected, the course display image 62E, which is displayed with a line thicker than that of the course display image 61E, is selected so that the pedestrian B can easily notice it.
  • the course display image 61E when there is no pedestrian B near the vehicle 10E is projected with a normal thickness.
  • the course display image 62E when the pedestrian B is detected in front of the vehicle 10E is thicker than usual and is projected prominently.
  • When the forward information detection unit 16E detects the pedestrian B and the pedestrian B approaches, the course display image 62E, which has a thicker line width, is selected and projected.
  • In this way, the attention of the pedestrian B can be drawn. That is, the pedestrian B can be made to accurately recognize the course of the vehicle 10E and can correctly grasp the movement of the vehicle 10E when walking on a pedestrian crossing or the like.
  • In the present embodiment, the line width of the course display image 62E is increased, but the present disclosure is not limited to this; a process for blinking the course display image 62E may be added instead. A sketch of this emphasis logic is shown below.
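A sketch of the emphasis logic (the distance thresholds and width factors are illustrative assumptions): the line width grows as a detected pedestrian gets closer, and the optional blinking mentioned above is enabled at close range.

```python
def emphasis_for_pedestrian(distance_m: float | None,
                            base_width_px: int = 20) -> tuple[int, bool]:
    """Return (line_width_px, blink) for the course display image.

    distance_m is the distance to the nearest detected pedestrian, or None
    if no pedestrian was detected."""
    if distance_m is None:
        return base_width_px, False        # normal thickness, no blinking
    if distance_m < 10.0:
        return base_width_px * 3, True     # very close: thick and blinking
    if distance_m < 25.0:
        return base_width_px * 2, False    # approaching: thicker than normal
    return base_width_px, False
```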
  • one course display image 6E is displayed on the road surface in front of the vehicle 10E.
  • the present disclosure is not limited to these embodiments, and a plurality of course display images 6E may be displayed.
  • The road surface drawing device 900 in the fifteenth embodiment projects the course display image 6E from each of the left and right lighting units 52L and 52R.
  • FIGS. 32A to 32C are schematic views showing the course display image 6E in the fifteenth embodiment.
  • FIG. 32A shows a course display image 6E when traveling straight.
  • FIG. 32B shows a course display image 6E when turning right.
  • FIG. 32C shows a course display image 6E when turning left.
  • When going straight, the image selection unit 40E selects image data so that a course display image 6S, which is a straight arrow, is displayed by each of the left lighting unit 52L and the right lighting unit 52R.
  • The image drawing unit 50E causes the left lighting unit 52L and the right lighting unit 52R to project so that the two course display images 6S are displayed on the road surface in front of the vehicle 10E according to the image data selected by the image selection unit 40E.
  • The image drawing unit 50E color-codes the course display image 6E between the manual driving mode and the automatic driving mode so that the surroundings of the vehicle 10E, that is, surrounding pedestrians and other vehicles (automobiles, bicycles), can identify whether the vehicle 10E is driving automatically.
  • In the manual driving mode, the color of the course display image 6S when traveling straight is white, while in the automatic driving mode the color of the course display image 6S is turquoise (blue-green).
  • The straight-line course display images 6S are projected from the left and right lighting units 52R and 52L, respectively, so that two straight arrows appear in front of the vehicle 10E and the course display image 6S is easy to see even for pedestrians and the like on the left side of the vehicle 10E.
  • When turning right, the image selection unit 40E selects the display image data 41Er so that only the right lighting unit 52R is used to display the course display image 6R for a right turn. The image drawing unit 50E then causes only the right lighting unit 52R to project so that the course display image 6R is displayed on the road surface in front of the vehicle 10E according to the display image data 41Er selected by the image selection unit 40E. The projection by the right lighting unit 52R is performed at the same timing as the vehicle 10E turns on the blinker lamp by means of the direction indicator 12E before the intersection.
  • the display color of the course display image 6R is amber so that the manifestation of intention that the vehicle 10E will turn is clearly communicated to the surroundings.
  • the display of the course display image 6R after the vehicle 10E enters the intersection may be in a form conforming to the above-mentioned 11th to 14th embodiments. Further, the display after the vehicle 10E has entered the intersection may be performed only by the right lighting unit 52R, or may be performed by using both the left and right lighting units 52R and 52L.
  • When turning left, the image selection unit 40E selects the display image data 41El so that only the left lighting unit 52L is used to display the course display image 6L for a left turn.
  • the image drawing unit 50E projects the left lighting unit 52L based on the display image data 41El selected by the image selection unit 40E.
  • the display method and the display contents after the vehicle 10E enters the intersection are the same as when turning right.
  • Since the course display images 6R and 6L are displayed on the side of the turn, it becomes easier for oncoming vehicles and surrounding pedestrians to notice them.
  • In the present embodiment, the image selection unit 40E selects the display image data 41Er or 41El together with the instruction to project using only one of the left and right lighting units 52L and 52R, and instructs the image drawing unit 50E accordingly. However, the present disclosure is not limited to this; the image selection unit 40E may always select the display image data 41Er (or 41El) when turning right (or left), and the image drawing unit 50E may perform the data processing that decides whether to project using only one lighting unit 52E or both the left and right lighting units 52E. A sketch of the lamp and color selection described in this embodiment is given below.
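The following sketch bundles the lamp selection and colour coding described in this embodiment; the mapping of white to manual driving and turquoise to automatic driving follows the reading above, and the exact colour names and unit identifiers are assumptions.

```python
def lamps_and_color(course: str, automatic_mode: bool) -> tuple[list[str], str]:
    """course: 'straight', 'right' or 'left'.
    Returns the lighting units to use and the display colour."""
    if course == "straight":
        # Both front units project a straight arrow; colour marks the driving mode.
        return ["52L", "52R"], ("turquoise" if automatic_mode else "white")
    if course == "right":
        return ["52R"], "amber"  # only the right unit, synchronised with the blinker
    if course == "left":
        return ["52L"], "amber"
    raise ValueError(f"unknown course: {course}")
```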

Abstract

This vehicle driving assistance system (100) comprises: a vehicle operation ascertainment unit (110) that acquires the operation of a first vehicle (10) traveling on a road as first operation information, and acquires the operation of a second vehicle (20) traveling on the road as second operation information; a guidance information creation unit (120) that, on the basis of the first operation information and the second operation information, creates guidance information showing the planned route on which the first vehicle (10) or the second vehicle (20) will travel; and a guidance information presentation unit (130) that presents guidance information to the first vehicle (10) or the second vehicle (20). The guidance information creation unit (120) creates guidance information when the first operation information and the second operation information include a course change in the same direction, and include a right turn operation and a left turn operation.

Description

Vehicle driving support system, road surface drawing device, and road

The present disclosure relates to a vehicle driving support system, a road surface drawing device, and a road.

Driving support technologies in which a computer or the like performs part of a vehicle's steering control and acceleration/deceleration control are currently being developed in many countries. Building on such driving support technologies and increasing the proportion of assistance in driving, automatic driving technologies that drive a vehicle automatically are also being developed.
When a vehicle that uses driving support technology or automatic driving technology (hereinafter, a driving support vehicle) is traveling, its occupants do not necessarily watch the driving situation. It therefore becomes difficult to predict the vehicle's behavior by observing the driver from outside the vehicle, which was possible with conventional manual driving. Accordingly, in a vehicle using driving support technology, it is important to notify the vehicle's surroundings of the vehicle's behavior by some means.

As a method of notifying the outside of the vehicle of the vehicle's behavior, a method of drawing an image by irradiating the road area around the vehicle with light has been proposed (see, for example, Patent Document 1). Such image drawing on the road surface can convey to other drivers, pedestrians, and the like of vehicles without driving support technology (hereinafter, manually driven vehicles) that the vehicle is traveling under driving support technology and what the vehicle is doing now and will do next, so that other drivers and pedestrians can predict the vehicle's behavior.
In addition, vehicles such as automobiles blink a direction indicator (turn signal) to indicate their traveling direction. However, the blinking of a vehicle-mounted direction indicator alone is difficult for pedestrians and others to notice. There are therefore road surface drawing devices that project an image onto the road surface from a light irradiation unit provided at the front of the vehicle to indicate the course to pedestrians and oncoming vehicles.

A road surface drawing device draws a predetermined image on the road surface using a display element such as an LED array or a DMD (digital micromirror device) provided at the front of the vehicle. For example, the vehicle drawing device described in Patent Document 2 displays, as a marker on the road surface in front of the vehicle, an arrow indicating the turning direction when turning right or left at an intersection. The course display image (road surface display image) displayed as a marker on the road surface allows other nearby vehicles and pedestrians to recognize the traveling direction of the vehicle.
At an intersection, a right-turn or left-turn indication by blinker lamps or the like is given from a position a predetermined distance before the intersection specified by the Road Traffic Act, or from a predetermined time before entering the intersection. The course display image is displayed in the same manner as the blinker lamp.

The road surface drawing device also continues to project a specific image, such as a right-turn arrow or a left-turn arrow, onto the road surface until the vehicle has finished turning. That is, once the steering angle of the vehicle's steering device corresponds to a right turn, the road surface drawing device keeps displaying the right-turn arrow until the steering angle returns to its original, straight-ahead position.
Japanese Patent Application Laid-Open No. 2015-164828; Japanese Patent Application Laid-Open No. 2016-193689
With Patent Document 1, even when a plurality of driving support vehicles travel on a public road, each driving support vehicle merely presents its own traveling state and traveling plan by itself. Since the amount of information displayed over the entire traffic scene increases, it is difficult to grasp the overall situation.

In particular, when vehicles traveling in opposing directions change course to travel in the same direction, such as when merging at an intersection, the optimum guidance method differs depending on conditions such as the number of lanes, the vehicle sizes, and the presence of motorcycles, pedestrians, and the like. Guidance by individual driving support vehicles has not been sufficient for smooth traffic support.

The present disclosure has been made in view of the above problems, and an object thereof is to provide a vehicle driving support system capable of supporting smooth traffic even when a plurality of vehicles merge at an intersection.
In Patent Document 1, information is conveyed by having people visually recognize an image drawn on the road surface, so pedestrians and drivers of other vehicles must recognize the image and choose their own actions, and the behavior of the vehicle equipped with driving support technology tends to be given priority.

If driving support technology develops and becomes widespread, the proportion of driving support vehicles traveling on public roads will increase, so smooth traffic support that can clearly convey the actions of pedestrians and manually driven vehicles to driving support vehicles will be necessary.

The present disclosure has been made in view of the above problems, and an object thereof is to provide a vehicle driving support system that actively works on vehicles traveling under driving support technology.
Further, regarding Patent Document 2, particularly at intersections of arterial roads, a plurality of vehicles may stop in a line while waiting for a traffic light. In this case, when each vehicle attempts to display a course display image and irradiates the road surface with light from its own drawing unit, the following phenomenon occurs. The vehicle at the head of the line at the intersection can display a course display image on the road surface in front of it, but the second and subsequent vehicles cannot secure enough space between themselves and the vehicle ahead to display a course display image. As a result, the light of the course display images of the second and subsequent vehicles is reflected by the body, bumper, and the like of the vehicle ahead, and accurate road surface drawing may not be possible.

The present disclosure therefore aims to provide a vehicle driving support system that controls the display of the course display image so that, even when a plurality of vehicles line up at an intersection or the like, the course display image of the own vehicle does not optically affect the own vehicle or the vehicle ahead of it.
Further, as shown in FIGS. 33A to 33D, a typical road surface drawing device always projects the same image regardless of the position of the vehicle 10E in the intersection. For example, when the vehicle 10E turns right, the road surface drawing device displays an image A of an arrow curving to the right on the road surface in front of the vehicle 10E before the vehicle 10E enters the intersection (see FIG. 33A). For a pedestrian B at a position facing the vehicle 10E across the intersection, this road surface display image A is far away and may be hidden behind other vehicles and not visible.

Even when the vehicle 10E has proceeded through the right turn and, as shown in FIG. 33C, is facing almost to the right, the same road surface display image A is still displayed; although the vehicle 10E actually has only a small amount of turning left to complete, the image remains an arrow curving sharply to the right. That is, as shown in FIGS. 33C and 33D, the traveling direction (course) of the vehicle 10E and the direction of the arrow in the road surface display image A may not match. The road surface display image A may thus give the pedestrian B at the facing position the mistaken impression that the vehicle 10E will turn further to the right. Particularly at intersections with a plurality of lanes, such a mismatch between the traveling direction (course) of the vehicle and the direction of the arrow in the road surface display image A may mislead other nearby vehicles and the pedestrian B.

The present disclosure therefore aims to provide a road surface drawing device that keeps the content of the course display image on the road surface consistent with the course of the vehicle as the orientation of the traveling vehicle changes, such as when turning right or left.
 また、特許文献1は路面に対して光を照射して画像を描画しているため、路面で反射された光を視認することで、他の運転者や歩行者は描かれた画像を認識することになる。しかし現実の路面では、晴天時に直射日光が照射された状態や、雨天時に路面が濡れた状態、夜間に他の車両からの前照灯が照射されている状態など、路面描画の光照射を視認しにくい状況が起こりえる。 Further, since Patent Document 1 irradiates the road surface with light to draw an image, other drivers and pedestrians recognize the drawn image by visually recognizing the light reflected on the road surface. It will be. However, on the actual road surface, the light irradiation of the road surface drawing can be visually recognized, such as when the road surface is exposed to direct sunlight in fine weather, when the road surface is wet in rainy weather, and when the headlights from other vehicles are illuminated at night. Difficult situations can occur.
 Therefore, the present disclosure has been made in view of the above problems, and an object thereof is to provide a road and a vehicle driving support system that improve the visibility of road surface drawing.
 In order to solve the above problems, a vehicle driving support system of the present disclosure includes: a vehicle motion grasping unit that acquires the motion of a first vehicle traveling on a road as first motion information and acquires the motion of a second vehicle traveling on the road as second motion information; a guidance information creation unit that creates, based on the first motion information and the second motion information, guidance information indicating a route on which the first vehicle or the second vehicle is scheduled to travel; and a guidance information presentation unit that presents the guidance information to the first vehicle or the second vehicle. The guidance information creation unit creates the guidance information when the first motion information and the second motion information include course changes in the same direction and include a right-turn motion and a left-turn motion.
 In such a vehicle driving support system of the present disclosure, the vehicle motion grasping unit acquires the motions of the first vehicle and the second vehicle, and when the first motion information and the second motion information represent course changes into the same direction including a left turn and a right turn, the guidance information creation unit creates and presents the guidance information. The present disclosure therefore makes it possible to support smooth traffic while ensuring safety even when a plurality of vehicles traveling under driving assistance technology merge at an intersection.
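 To make the condition concrete, the following minimal Python sketch checks the merge condition described above; the MotionInfo fields, turn labels, and heading encoding are assumptions for illustration and are not taken from the disclosure.

```python
from dataclasses import dataclass

@dataclass
class MotionInfo:
    turn: str            # "right", "left", or "straight" (assumed labels)
    merge_heading: str   # heading of the lane the vehicle will enter, e.g. "north"

def should_create_guidance(first: MotionInfo, second: MotionInfo) -> bool:
    """Create guidance only when both vehicles change course into the same
    direction and the pair consists of one right turn and one left turn."""
    same_direction = first.merge_heading == second.merge_heading
    return same_direction and {first.turn, second.turn} == {"right", "left"}
```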
 In order to solve the above problems, a vehicle driving support system of the present disclosure includes: a receiving-side vehicle having a detection unit that detects light from outside the vehicle and a driving support unit that supports driving according to the result of the detection unit; and a light irradiation unit that irradiates the detection unit with a predetermined optical signal. When the detection unit detects the optical signal, the driving support unit causes the receiving-side vehicle to execute a stop operation or a deceleration operation.
 In such a vehicle driving support system of the present disclosure, the receiving-side vehicle executes a stop operation or a deceleration operation when the light irradiation unit irradiates the detection unit with the optical signal, so that the system can actively act on a vehicle traveling under driving assistance technology.
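 A rough sketch of how a receiving-side driving support unit might react to the predetermined optical signal is given below; the on/off signal encoding, the speed threshold, and all names are assumptions, since the disclosure only states that a predetermined optical signal triggers a stop or deceleration operation.

```python
from enum import Enum, auto

class Action(Enum):
    NONE = auto()
    DECELERATE = auto()
    STOP = auto()

# Hypothetical encoding of the "predetermined optical signal" as an on/off pattern;
# the disclosure does not specify how the signal is modulated.
PREDETERMINED_PATTERN = (1, 0, 1, 1, 0)

def on_light_detected(sampled_pattern: tuple, speed_kmh: float) -> Action:
    """If the sampled light matches the predetermined signal, request a stop at
    low speed and a deceleration otherwise (the 20 km/h split is an assumption)."""
    if tuple(sampled_pattern) != PREDETERMINED_PATTERN:
        return Action.NONE
    return Action.STOP if speed_kmh < 20.0 else Action.DECELERATE
```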
 In order to solve the above problems, a vehicle driving support system of the present disclosure includes: a vehicle course acquisition unit that acquires course information of a vehicle; a vehicle state information acquisition unit that acquires the position and speed of the vehicle as vehicle state information; an other-vehicle detection unit that detects whether another vehicle is present in the vicinity of the vehicle and acquires detection information; an image selection unit that selects, based on the course information, the vehicle state information, and the detection information, a course display image to be projected onto the road surface around the vehicle; and a road surface drawing unit that projects the course display image onto the road surface. The image selection unit does not select the course display image when it determines, based on the detection information and the vehicle state information, that another vehicle is present within a first distance ahead of the vehicle, and selects a forward course display image when it determines that no other vehicle is present within the first distance. The road surface drawing unit projects the forward course display image selected by the image selection unit onto the road surface ahead of the vehicle.
 As a result, the road surface drawing unit projects the forward course display image onto the road surface ahead only when no other vehicle is present within the first distance ahead of the vehicle. Therefore, even when a plurality of vehicles line up in a row at an intersection or the like, the display of the course display image can be controlled so that the course display image of the vehicle does not optically affect a preceding vehicle located ahead of the vehicle or the vehicle itself.
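 The first-distance check can be illustrated with the short sketch below; the 15 m value and the image identifier are placeholders, as the disclosure does not specify the first distance.

```python
from typing import Optional

FIRST_DISTANCE_M = 15.0  # placeholder; the disclosure only calls this the "first distance"

def select_forward_image(nearest_vehicle_ahead_m: Optional[float]) -> Optional[str]:
    """Suppress the forward course display image when another vehicle is detected
    within the first distance ahead of the vehicle."""
    if nearest_vehicle_ahead_m is not None and nearest_vehicle_ahead_m <= FIRST_DISTANCE_M:
        return None                     # do not project onto the preceding vehicle
    return "forward_course_image"       # placeholder identifier for the selected image
```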
 In order to solve the above problems, a road surface drawing device of the present disclosure includes: a vehicle course acquisition unit that acquires course information of a vehicle; a vehicle state information acquisition unit that acquires the position and orientation of the vehicle as vehicle state information; an image selection unit that selects, based on the course information and the vehicle state information, a course display image to be projected onto the road surface ahead of the vehicle; and a road surface drawing unit that projects the course display image onto the road surface ahead. When the course information changes from a first direction to a second direction, the image selection unit selects a course display image including a line pointing from the vehicle toward the second direction, and re-selects the course display image as the vehicle state information changes.
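 A possible reading of the re-selection rule is sketched below: the arrow image is re-chosen from the remaining angle between the vehicle's current orientation and the second direction. The angle thresholds and image names are assumptions for illustration.

```python
def select_turn_arrow(target_heading_deg: float, current_heading_deg: float) -> str:
    """Re-select the projected arrow from the remaining turn angle toward the
    second direction; the thresholds and image names are illustrative only."""
    remaining = (target_heading_deg - current_heading_deg) % 360.0
    if remaining > 180.0:
        remaining -= 360.0              # signed remaining turn: right positive, left negative
    magnitude = abs(remaining)
    if magnitude < 10.0:
        return "straight_arrow"
    if magnitude < 45.0:
        return "gentle_right_arrow" if remaining > 0 else "gentle_left_arrow"
    return "sharp_right_arrow" if remaining > 0 else "sharp_left_arrow"
```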
 In order to solve the above problems, a road of the present disclosure includes: a pavement surface; a phosphor-containing layer formed on the pavement surface and containing a phosphor material that is excited by primary light and emits secondary light different from the primary light; and a coating layer formed on the phosphor-containing layer and transmitting the primary light and the secondary light.
 In such a road of the present disclosure, an image is drawn by irradiating the phosphor-containing layer with the primary light. The primary light is wavelength-converted in the phosphor-containing layer and the secondary light is emitted, so that the visibility of road surface drawing is improved.
 In order to solve the above problems, a vehicle driving support system of the present disclosure includes the road described above and a light irradiation unit that irradiates the road with the primary light.
 The present disclosure can provide a vehicle driving support system capable of supporting smooth traffic even when a plurality of vehicles merge at an intersection.
 The present disclosure can provide a vehicle driving support system that actively acts on a vehicle traveling under driving assistance technology.
 The present disclosure can provide a vehicle driving support system that controls the display of a course display image so that, even when a plurality of vehicles line up in a row at an intersection or the like, the course display image of the own vehicle does not optically affect another vehicle located ahead of the own vehicle or the own vehicle itself.
 The present disclosure can provide a road surface drawing device that keeps the content of the road surface display image consistent with the course of the vehicle as the orientation of the traveling vehicle changes during a right turn, a left turn, and the like.
 The present disclosure can provide a road and a vehicle driving support system that improve the visibility of road surface drawing.
FIG. 1 is a schematic view showing a road 1 and a vehicle driving support system 100 according to the first embodiment.
FIG. 2 is a schematic cross-sectional view showing the structure of the road 1 according to the first embodiment.
FIG. 3 is a block diagram showing the configuration of the vehicle driving support system 100.
FIG. 4A is an example of guidance information presentation in the vehicle driving support system 100, and is a schematic view showing the irradiation of light from an infrastructure-side device 30.
FIG. 4B is an example of guidance information presentation in the vehicle driving support system 100, and is a schematic view showing the irradiation of light from a first vehicle 10 or a second vehicle 20.
FIG. 5 is a schematic cross-sectional view showing the structure of a modified example of the road 1.
FIG. 6 is a flowchart showing an operation example of the vehicle driving support system 100 according to the first embodiment.
FIG. 7 is a schematic view showing an operation example of a vehicle driving support system 200 according to the second embodiment on a road.
FIG. 8 is a schematic view showing an operation example of a vehicle driving support system 300 according to the third embodiment on a road.
FIG. 9 is a schematic view showing an operation example of a vehicle driving support system 400 according to the fourth embodiment on a road.
FIG. 10 is a schematic view showing an operation example of a vehicle driving support system 500 according to the fifth embodiment on a road.
FIG. 11 is a schematic view showing an operation example of a vehicle driving support system 600 according to the sixth embodiment on a road.
FIG. 12 is a block diagram showing the configuration of the vehicle driving support system 600.
FIG. 13A is an example of light irradiation in the vehicle driving support system 600, and is a schematic view showing the irradiation of light from a transmitting-side vehicle 20C.
FIG. 13B is an example of light irradiation in the vehicle driving support system 600, and is a schematic view showing the irradiation of light from the infrastructure-side device 30.
FIG. 13C is an example of light irradiation in the vehicle driving support system 600, and is a schematic view showing the irradiation of light from a portable electronic device 40 held by a pedestrian.
FIG. 14 is a flowchart showing an operation example of the vehicle driving support system 600.
FIG. 15 is a schematic view showing an operation example of a vehicle driving support system 700 according to the seventh embodiment on a road.
FIG. 16 is a flowchart showing an operation example of the vehicle driving support system 700.
FIG. 17A is a front view of a vehicle traveling using a vehicle driving support system 800 according to the eighth embodiment.
FIG. 17B is a rear view of a vehicle traveling using the vehicle driving support system 800 according to the eighth embodiment.
FIG. 18 is a block diagram showing the configuration of the vehicle driving support system 800 according to the eighth embodiment.
FIG. 19 is a flowchart showing the flow of processing in which the vehicle driving support system 800 according to the eighth embodiment displays a course display image.
FIG. 20 is a schematic view showing the courses and course display images of a plurality of vehicles at an intersection in the eighth embodiment.
FIG. 21 is a schematic view showing course display images at an intersection of a vehicle and a two-wheeled vehicle (another vehicle) in the ninth embodiment.
FIG. 22 is a schematic view showing the relationship among a vehicle, a two-wheeled vehicle (another vehicle), and an infrastructure-side device at an intersection in the tenth embodiment.
FIG. 23 is a block diagram of a road surface drawing device 900 according to the eleventh embodiment.
FIG. 24 is a flowchart showing the flow of processing in which the road surface drawing device 900 according to the eleventh embodiment displays a course display image 6E.
FIG. 25 is a schematic view showing a course 7E and the course display image 6E when a vehicle 10E turns right at an intersection in the eleventh embodiment.
FIG. 26 is a schematic view showing a method of selecting the course display image 6E in the eleventh embodiment.
FIG. 27A shows the course display image 6E at the position of the vehicle 10E in the intersection during a right turn in the eleventh embodiment, and is a schematic view showing the state before the vehicle 10E enters the intersection.
FIG. 27B shows the course display image 6E at the position of the vehicle 10E in the intersection during a right turn in the eleventh embodiment, and is a schematic view showing the state in which the vehicle 10E is located near the center of the intersection.
FIG. 27C shows the course display image 6E at the position of the vehicle 10E in the intersection during a right turn in the eleventh embodiment, and is a schematic view showing the state at the position where the vehicle 10E has finished turning through the intersection.
FIG. 28A shows the course display image 6E during a right turn in the twelfth embodiment according to the position of the vehicle 10E, and is a schematic view showing the state before the vehicle 10E enters the intersection.
FIG. 28B shows the course display image 6E during a right turn in the twelfth embodiment according to the position of the vehicle 10E, and is a schematic view showing the state in which the vehicle 10E is located just before the center of the intersection.
FIG. 28C shows the course display image 6E during a right turn in the twelfth embodiment according to the position of the vehicle 10E, and is a schematic view showing the state in which the vehicle 10E is located near the center of the intersection.
FIG. 28D shows the course display image 6E during a right turn in the twelfth embodiment according to the position of the vehicle 10E, and is a schematic view showing the state at the position where the vehicle 10E has finished turning through the intersection.
FIG. 29A shows a modified example of the course display image 6E in the twelfth embodiment, and is a schematic view showing a course display image 6A at the position where the vehicle 10E enters the intersection.
FIG. 29B shows a modified example of the course display image 6E in the twelfth embodiment, and is a schematic view showing a course display image 6EB when the vehicle 10E is located just before the center of the intersection.
FIG. 29C shows a modified example of the course display image 6E in the twelfth embodiment, and is a schematic view showing a course display image 6EC when the vehicle 10E is located near the center of the intersection.
FIG. 29D shows a modified example of the course display image 6E in the twelfth embodiment, and is a schematic view showing a course display image 6ED at the position where the vehicle 10E has finished turning through the intersection.
FIG. 30A shows the course display image 6E of the thirteenth embodiment, and is a schematic view of an intersection showing the course display image 6E at the intersection over time.
FIG. 30B shows the course display image 6E of the thirteenth embodiment, and is a schematic perspective view showing the projection of the vehicle 10E and the course display image 6E over time at the intersection.
FIG. 30C shows the course display image 6E of the thirteenth embodiment, and is a schematic perspective view showing the projection of the vehicle 10E and the course display image 6E over time at the intersection.
FIG. 30D shows the course display image 6E of the thirteenth embodiment, and is a schematic perspective view showing the projection of the vehicle 10E and the course display image 6E over time at the intersection.
FIG. 31A shows a course display image at an intersection in the fourteenth embodiment, and is a schematic view showing a course display image 61E at the time when the vehicle 10E attempting to turn right enters the intersection.
FIG. 31B shows a course display image at an intersection in the fourteenth embodiment, and is a schematic view showing a course display image 62E after the vehicle 10E detects a pedestrian.
FIG. 32A is a schematic view showing the course display image 6E when traveling straight in the fifteenth embodiment.
FIG. 32B is a schematic view showing the course display image 6E during a right turn in the fifteenth embodiment.
FIG. 32C is a schematic view showing the course display image 6E during a left turn in the fifteenth embodiment.
FIG. 33A shows a course display image of the vehicle position in an intersection during a right turn in a conventional example, and is a schematic view showing the state before the vehicle 10E enters the intersection.
FIG. 33B shows a course display image of the vehicle position in an intersection during a right turn in a conventional example, and is a schematic view showing the state in which the vehicle 10E is located just before the center of the intersection.
FIG. 33C shows a course display image of the vehicle position in an intersection during a right turn in a conventional example, and is a schematic view showing the state in which the vehicle 10E is located near the center of the intersection.
FIG. 33D shows a course display image of the vehicle position in an intersection during a right turn in a conventional example, and is a schematic view showing the state at the position where the vehicle 10E has finished turning through the intersection.
 (First Embodiment)
 Hereinafter, embodiments of the present disclosure will be described in detail with reference to the drawings. Identical or equivalent components, members, and processes shown in the drawings are given the same reference numerals, and redundant description is omitted as appropriate. FIG. 1 is a schematic view showing a road 1 and a vehicle driving support system 100 according to the present embodiment.
 As an example of the road 1 and the vehicle driving support system 100, FIG. 1 shows an intersection where roads 1 cross and sidewalks 2 are provided. On the road 1, a first vehicle 10 and a second vehicle 20 are traveling toward each other. An infrastructure-side device 30 is arranged in the area of the sidewalk 2. In FIG. 1, the first vehicle 10 is about to travel straight on the road 1 from the bottom toward the top of the figure and then turn left, and the second vehicle 20 is about to travel straight from the top toward the bottom of the figure and then turn right.
 In the present embodiment, the left turn of the first vehicle 10 has priority, but depending on the situation of the first vehicle 10, the number of lanes of the road 1 at the merging destination, speeds and distances, the congestion of straight-traveling vehicles, and the like, the second vehicle 20 may cross in front of the first vehicle 10 and turn right. FIG. 1 shows a traveling state in a region where the law prescribes driving on the left side; in a region where the law prescribes driving on the right side, the second vehicle 20 may turn left and the first vehicle 10 may turn right.
 As indicated by hatching in FIG. 1, a phosphor coating region R is formed in the intersection and within a predetermined distance L from the intersection. The range of the predetermined distance L is, for example, a range extending 30 m from the intersection, corresponding to the distance at which a course change is indicated with the turn signal. Although the phosphor coating region R is hatched in FIG. 1, the phosphor coating region R is uncolored, as described later. On the road 1 within the phosphor coating region R, guidance information M1 is drawn on the road surface in front of the first vehicle 10 and guidance information M2 is drawn on the road surface in front of the second vehicle 20, so that the guidance information M1 and M2 is presented by road surface drawing. Further, road surface information M3 such as a pedestrian crossing and a stop line is drawn on the road 1 within the phosphor coating region R, and guidance information M4 is also drawn on the sidewalk 2.
 In FIG. 1, the guidance information M1 has the shape of an arrow curving from straight ahead to the left in the traveling direction of the first vehicle 10, and the guidance information M2 has the shape of an arrow curving from straight ahead to the right in the traveling direction of the second vehicle. Although FIG. 1 shows an example in which an arrow image is drawn on the road surface as the method of presenting the guidance information, the shape of the image is not limited, nor is the presentation limited to drawing on the road surface. The image may include characters and icons. The guidance information may also be presented as an image display using an image display device or head-up display mounted on the vehicle, or as voice guidance.
 The road 1 is a route on which vehicles travel, and may be a paved road, an unpaved road, a public road, or a private road. Although FIG. 1 shows an example of a crossroads with one lane on each side and an oncoming lane, the number of lanes on each side and the shape of the intersection are not limited. The road 1 and the vehicle driving support system 100 are not limited to intersections. The vehicle driving support system 100 is also used where more roads intersect, such as at a three-way or five-way junction. The vehicle driving support system 100 is not limited to right and left turns onto roads, and is also used when a vehicle turns right or left across an oncoming lane to enter private property such as a parking lot or a store.
 The sidewalk 2 is a space provided along the road 1 on which vehicles travel, where vehicles do not travel and pedestrians pass. When the sidewalk 2 is not clearly separated from the road 1, the sidewalk 2 may be a roadside strip of the road 1. The vehicle driving support system 100 does not necessarily have to include the sidewalk 2; FIG. 1 merely shows the sidewalk 2 as an example of an area where the infrastructure-side device 30 is arranged, an area where pedestrians walk, and an area where portable electronic devices carried by pedestrians are located.
 The first vehicle 10 and the second vehicle 20 are vehicles traveling on the road 1. The first vehicle 10 and the second vehicle 20 are preferably driving-assisted vehicles in which part of the steering control and acceleration/deceleration control is performed by a computer or the like. However, the vehicle driving support system 100 of the present embodiment can also support the driving of manually driven vehicles that are not equipped with driving assistance technology. When the first vehicle 10 or the second vehicle 20 is a driving-assisted vehicle, each vehicle includes a situation grasping unit, a driving support unit, a vehicle motion control unit, and an information communication unit.
 The situation grasping unit acquires information on the traveling state of the vehicle and the surrounding conditions. The situation grasping unit may be composed of various sensor devices that realize driving assistance functions, such as a vehicle speed sensor, a position sensor, an imaging device, a laser rangefinder, and LIDAR (Light Detection and Ranging). The traveling state acquired by the situation grasping unit includes the traveling speed, the position, the orientation of the vehicle body, the steering angle, the brake operation, the travel route from the car navigation system, direction instructions recognized from conversation inside the vehicle, and the like. The surrounding conditions acquired by the situation grasping unit include the road surface condition, the ambient temperature, the road map from the car navigation system, the road gradient, the detection of surrounding objects by image recognition, the inter-vehicle distances to and behavior prediction of preceding, oncoming, and following vehicles, the detection of pedestrians by image recognition, and the like.
 The driving support unit processes the information acquired by the situation grasping unit and outputs a driving control signal for supporting the driving of the vehicle to the vehicle motion control unit. The vehicle motion control unit executes steering control and acceleration/deceleration control of the vehicle based on the driving control signal output from the driving support unit. The vehicle motion control unit has a driving assistance function: it supports the driving of the first vehicle 10 or the second vehicle 20 by adjusting the output of the power source, operating the brakes, changing the steering angle, displaying driving guidance, and controlling the turn signals and stop lamps. The information communication unit is connected to the driving support unit and the situation grasping unit, and communicates with communication units provided outside the vehicle. The information communication unit performs vehicle-to-vehicle communication between vehicles and road-to-vehicle communication with equipment provided on the road.
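 As a loose illustration of the data flow from the situation grasping unit through the driving support unit to the vehicle motion control unit, the following toy sketch shows one possible shape of the exchanged information; the field names, threshold values, and the simplistic control rule are assumptions, not the disclosed method.

```python
from dataclasses import dataclass

@dataclass
class TravelState:          # traveling state from the situation grasping unit
    speed_kmh: float
    heading_deg: float
    steering_deg: float

@dataclass
class Surroundings:         # surrounding conditions from the situation grasping unit
    gap_to_lead_vehicle_m: float
    pedestrian_detected: bool

@dataclass
class ControlSignal:        # driving control signal sent to the vehicle motion control unit
    target_speed_kmh: float
    target_steering_deg: float

def driving_support_step(state: TravelState, env: Surroundings) -> ControlSignal:
    """Toy rule: cap the speed when a pedestrian is detected or the gap ahead is
    short; otherwise keep the current speed and steering angle."""
    target_speed = state.speed_kmh
    if env.pedestrian_detected or env.gap_to_lead_vehicle_m < 10.0:
        target_speed = min(target_speed, 10.0)
    return ControlSignal(target_speed_kmh=target_speed,
                         target_steering_deg=state.steering_deg)
```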
 The infrastructure-side device 30 is provided on the road 1 or the sidewalk 2. As described later, the infrastructure-side device 30 may draw an image by irradiating the road surface of the road 1 with light. The specific configuration of the infrastructure-side device 30 is not limited; a dedicated lighting device may be arranged, or infrastructure equipment such as street lights, traffic signals, and electronic message boards may have the function of emitting the light.
 The infrastructure-side device 30 may include an information communication unit for communicating with the first vehicle 10 or the second vehicle 20, enabling road-to-vehicle communication with the vehicles. The infrastructure-side device 30 may also include a situation grasping unit and a driving support unit in the same manner as a driving-assisted vehicle.
 FIG. 2 is a schematic cross-sectional view showing the structure of the road 1 according to the first embodiment. As shown in FIG. 2, a pavement surface 3 is formed on the ground of the road 1, and a phosphor-containing layer 4 and a coating layer 5 are laminated on the pavement surface 3. For simplicity, FIG. 2 schematically shows the road 1 and the pavement surface 3 as a two-layer structure, but the road 1 and the pavement surface 3 may have a laminated structure including a road body, a subgrade, a constructed subgrade, a lower base course, an upper base course, a binder course, a surface course, and the like. The phosphor-containing layer 4 and the coating layer 5 may also be laminated on the sidewalk 2.
 The pavement surface 3 is the layer corresponding to the surface course of the road 1 in regions other than the phosphor-containing region R, and is exposed at the surface to come into contact with the tires of the first vehicle 10 and the second vehicle 20. The material of the pavement surface 3 is not limited, and may be asphalt, concrete, interlocking blocks, wood, brick, or the like.
 The phosphor-containing layer 4 contains fine particles of a phosphor material that are excited by the primary light and emit secondary light of a wavelength different from that of the primary light. The phosphor-containing layer 4 also contains a dispersion medium in which the phosphor fine particles are uniformly dispersed. The phosphor-containing layer 4 is colored in a color corresponding to the absorption band of the phosphor material, but by reducing the concentration of the phosphor fine particles in the dispersion medium, the phosphor-containing layer 4 as a whole can appear uncolored when viewed from a distance. The material of the dispersion medium is not limited as long as it transmits the primary light and the secondary light, and includes acrylic resin, epoxy resin, silicone resin, polycarbonate, and the like. The phosphor-containing layer 4 may further contain a light scattering material for scattering the primary light and the secondary light. The light scattering material contains fine particles having a refractive index different from that of the dispersion medium, and may be, for example, SiO2 or TiO2.
 The phosphor material contained in the phosphor-containing layer 4 is not limited, and a plurality of types of phosphor materials may be included. When drawing an image indicating a right or left turn as the guidance information M1, M2, the phosphor-containing layer 4 preferably emits amber secondary light. When drawing road surface information M3 such as a pedestrian crossing or a stop line, the phosphor-containing layer 4 may contain a plurality of phosphor materials that are excited by green light and emit blue light and red light so that the mixture of the primary light and the secondary light becomes white. Similarly, the phosphor-containing layer 4 may contain a plurality of phosphor materials that are excited by violet light and emit green light, blue light, and red light. Examples of the phosphor materials include YAG phosphors that emit yellow light ((Y,Gd)3(Al,Ga)5O12:Ce), CASN phosphors that emit red light (CaAlSiN3:Eu and the like), and β-SiAlON phosphors that emit green light (Si6-zAlzOzN8-z and the like).
 The coating layer 5 is a layer formed so as to cover the surface of the phosphor-containing layer 4, and is made of a material that transmits the primary light and the secondary light. The material of the coating layer 5 is not limited, and may include acrylic resin, epoxy resin, silicone resin, polycarbonate, and the like. The coating layer 5 is not limited to being made entirely of a light-transmitting material, and may partially include a light-shielding material. Since the coating layer 5 covers the surface of the phosphor-containing layer 4 and comes into contact with the tires of vehicles traveling on the road 1, it protects the phosphor-containing layer 4, improves the durability of the road 1, and secures frictional force.
 FIG. 3 is a block diagram showing the configuration of the vehicle driving support system 100. As shown in FIG. 3, the vehicle driving support system 100 includes the first vehicle 10, the second vehicle 20, a vehicle motion grasping unit 110, a guidance information creation unit 120, and a guidance information presentation unit 130. Here, the vehicle motion grasping unit 110, the guidance information creation unit 120, the driving support unit, and the situation grasping unit may execute predetermined information processing by means of a pre-recorded program on a computer provided with a central processing unit (CPU), a memory, an external storage device, and the like.
 The vehicle motion grasping unit 110 grasps the motions of the first vehicle 10 and the second vehicle 20 within the area of the road 1, and acquires their respective situations and planned future motions as the first motion information and the second motion information. The vehicle motion grasping unit 110 is composed of the situation grasping units, driving support units, and vehicle motion control units provided in the first vehicle 10, the second vehicle 20, and the infrastructure-side device 30 described above, or a combination thereof.
 For example, when the vehicle motion grasping unit 110 is composed of the situation grasping unit provided in the first vehicle 10 or the second vehicle 20, the vehicle motion grasping unit 110 grasps the traveling state of its own vehicle from the operation of the turn signal, the route information and map information of the car navigation system, conversation inside the vehicle, and the like. The vehicle motion grasping unit 110 also grasps, as the surrounding conditions, the turn signal operation of oncoming vehicles, the content of images drawn on the road surface, and the presence and motion of pedestrians, two-wheeled vehicles, and the like by recognizing images captured by the imaging unit.
 For example, when the vehicle motion grasping unit 110 is composed of the infrastructure-side device 30, the vehicle motion grasping unit 110 predicts the motions of the first vehicle 10 and the second vehicle 20 by recognizing images captured by the imaging unit provided in the infrastructure-side device 30. Alternatively, when the first vehicle 10 or the second vehicle 20 is a driving-assisted vehicle, the vehicle motion grasping unit 110 predicts the motions of the first vehicle 10 and the second vehicle 20 by acquiring, via the information communication unit, the information held by the situation grasping unit, the driving support unit, and the vehicle motion control unit provided in the vehicle.
 The guidance information creation unit 120 creates, based on the first motion information and the second motion information acquired by the vehicle motion grasping unit 110, the guidance information M1, M2 indicating the routes on which the first vehicle 10 and the second vehicle 20 are scheduled to travel. The guidance information may include the timings at which the first vehicle 10 and the second vehicle 20 make the left turn and the right turn, respectively, and the lane positions in the merging lane after the left turn and the right turn.
 The guidance information presentation unit 130 is a road surface drawing device that draws the guidance information M1, M2 and the road surface information M3 created by the guidance information creation unit 120 in the phosphor-containing region R of the road 1. When the infrastructure-side device 30 includes the guidance information presentation unit 130, the guidance information presentation unit 130 may draw the road surface information M3 such as a pedestrian crossing or a stop line, and the guidance information M4 on the sidewalk 2. In FIG. 1, the guidance information presentation unit 130 presents the guidance information to the first vehicle 10 and the second vehicle 20 by drawing the guidance information M1 and M2 on the road surface, but the content of the images is not limited. The presentation of the guidance information may include display on an in-vehicle image display device and voice guidance.
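 One way to picture how the three units of FIG. 3 interact is the cycle sketched below; the method names on the unit objects are hypothetical, and the merge condition is the same one sketched earlier.

```python
def run_intersection_cycle(grasping_unit, creation_unit, presentation_unit) -> None:
    """One update cycle of the system in FIG. 3: grasp both vehicles' motions,
    create guidance information when a same-direction right/left-turn merge is
    expected, and present it by road surface drawing."""
    first = grasping_unit.acquire_first_motion()      # first motion information
    second = grasping_unit.acquire_second_motion()    # second motion information
    merging = (first["merge_heading"] == second["merge_heading"]
               and {first["turn"], second["turn"]} == {"right", "left"})
    if merging:
        m1, m2 = creation_unit.create(first, second)  # guidance information M1, M2
        presentation_unit.draw_on_road(m1)            # e.g. via light irradiation units
        presentation_unit.draw_on_road(m2)
```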
 FIGS. 4A and 4B are schematic views showing examples of guidance information presentation in the vehicle driving support system 100. FIG. 4A shows the irradiation of light from the infrastructure-side device 30, and FIG. 4B shows the irradiation of light from the first vehicle 10 or the second vehicle 20. In the example of FIG. 4A, the infrastructure-side device 30 has a light irradiation unit 31, and in the example of FIG. 4B, the first vehicle 10 or the second vehicle 20 has a light irradiation unit 11, 21. The light irradiation units 11, 21, 31 are examples of the guidance information presentation unit 130. Based on the guidance information created by the guidance information creation unit 120, the light irradiation units 11, 21, 31 project and draw the guidance information M1, M2 onto the road surface of the road 1. The light irradiation units 11, 21, 31 also project and draw the guidance information M4 and the road surface information M3 onto the road surface of the road 1 or the sidewalk 2 based on the guidance information created by the guidance information creation unit 120 (FIG. 1).
 When the primary light is emitted from the light irradiation units 11, 21, 31 onto the phosphor coating region R, the primary light passes through the coating layer 5 and reaches the phosphor-containing layer 4. Since the phosphor-containing layer 4 contains the phosphor material, at least part of the primary light is wavelength-converted into the secondary light. The secondary light and the non-converted primary light are emitted to the outside of the road 1 through the coating layer 5. Therefore, in the phosphor coating region R, the shapes of the guidance information M1, M2 and the road surface information M3 are displayed in the color obtained by mixing the primary light and the secondary light. At this time, the primary light and the secondary light are scattered by the phosphor fine particles and the light scattering material contained in the phosphor-containing layer 4, so that the light distribution becomes isotropic and visibility improves at various positions on the road 1 and the sidewalk 2.
 The secondary light is obtained by wavelength-converting at least part of the primary light with the phosphor material contained in the phosphor-containing layer 4. The guidance information M1, M2 and the road surface information M3 are displayed in a color different from that of the light reaching the road 1 from the light irradiation units 11, 21, 31, the sun, or the surrounding environment, and the surface of the road 1 appears to be self-luminous, so that visibility improves.
 Here, since at least part of the primary light is wavelength-converted to a longer wavelength in the phosphor-containing layer 4, the primary light preferably has a short wavelength, that is, it is preferably green light or violet light. Further, white light emitting devices combining a blue LED with a yellow phosphor, which are used as the light sources of the headlights of the first vehicle 10 or the second vehicle 20 and of the illumination lamps of the infrastructure-side device 30, emit light of blue wavelengths. Therefore, if the phosphor material contained in the phosphor-containing layer 4 were excited by blue light as the primary light, the blue component of headlights and illumination lamps could also cause the secondary light to be emitted. For this reason, blue light is not preferable as the wavelength of the primary light.
 As described above, the road 1 and the vehicle driving support system 100 of the present embodiment include the phosphor-containing layer 4 and the coating layer 5 covering the pavement surface 3, and an image is drawn by irradiating the phosphor-containing layer 4 with the primary light. The primary light is wavelength-converted in the phosphor-containing layer 4 and the secondary light is emitted, so that the visibility of road surface drawing is improved.
 (Modified Example of the Road)
 Next, a modified example of the road of the present disclosure will be described with reference to FIG. 5. Description of contents overlapping with the first embodiment is omitted. FIG. 5 is a schematic cross-sectional view showing the structure of the road 1 according to the present embodiment. As shown in FIG. 5, in the present embodiment, the pavement surface 3 is formed on the ground of the road 1, and the phosphor-containing layer 4, an adhesive layer 6, and a coating layer 7 are laminated on the pavement surface 3.
 The adhesive layer 6 is a member interposed between the phosphor-containing layer 4 and the coating layer 7 to bond them together, and may be an adhesive. As shown in FIG. 5, the coating layer 7 is a plate-shaped member formed on the adhesive layer 6, and includes a cover member in which fine uneven shapes are formed on the front and back surfaces of the coating layer 7. Although FIG. 5 shows an example in which the uneven shapes are formed on both the front and back surfaces of the coating layer 7, the uneven shape may be formed on either the front side or the back side. Both the adhesive layer 6 and the coating layer 7 are made of materials that transmit the primary light and the secondary light.
 As a method of forming the coating layer 7, the uneven shape may be formed in advance when the plate-shaped member is molded, or the uneven shape may be formed on a flat surface by sandblasting or the like. An adhesive is applied onto the phosphor-containing layer 4, the resulting plate-shaped coating layer 7 is placed on it, and the adhesive is cured to form the adhesive layer 6, which bonds the phosphor-containing layer 4 and the coating layer 7 together.
 The shape and size of the unevenness formed on the coating layer 7 are not limited, but the unevenness is sized to scatter the primary light and the secondary light, and its width and height are preferably larger than the wavelengths of the primary light and the secondary light. Since the primary light and the secondary light are scattered by the unevenness formed on the coating layer 7, the light distribution of the primary light and the secondary light extracted from the phosphor-containing layer 4 to the outside becomes even more isotropic, and visibility improves at various positions on the road 1 and the sidewalk 2. In addition, the unevenness formed on the surface of the coating layer 7 secures friction with the tires of traveling vehicles.
 The road 1 and the vehicle driving support system 100 of the present embodiment include the phosphor-containing layer 4, the adhesive layer 6, and the coating layer 7 covering the pavement surface 3, and an image is drawn by irradiating the phosphor-containing layer 4 with the primary light. The primary light is wavelength-converted in the phosphor-containing layer 4 and the secondary light is emitted, so that the visibility of road surface drawing is improved. Moreover, since the unevenness is formed on the coating layer 7, the light distribution of the primary light and the secondary light emitted from the phosphor-containing layer 4 to the outside becomes even more isotropic and visibility is further improved.
 (Modified Example of the First Embodiment)
 Next, a modified example of the first embodiment of the present disclosure will be described. Description of contents overlapping with the first embodiment is omitted. In the present embodiment, the first vehicle 10 or the second vehicle 20 is a driving-assisted vehicle. The first vehicle 10 or the second vehicle 20 performs road surface drawing by irradiating the primary light from the light irradiation units 11, 21 only while traveling in the phosphor coating region R, and does not emit the primary light while traveling in other regions.
 The first vehicle 10 or the second vehicle 20 is a driving-assisted vehicle and includes a situation grasping unit as in the first embodiment. The surrounding conditions acquired by the situation grasping unit include the position information of the phosphor coating region R on the road 1. As a specific example, the map information of the car navigation system includes the position information of the phosphor coating region R, and the situation grasping unit detects the phosphor coating region R at the current position or on the travel route by comparing the vehicle position with the map information. Alternatively, the infrastructure-side device 30 arranged in the vicinity of the phosphor coating region R may notify the first vehicle 10 or the second vehicle 20 of the presence of the phosphor coating region R by road-to-vehicle communication.
 In the present embodiment, when the first vehicle 10 or the second vehicle 20 travels on the road 1 and approaches the phosphor coating region R, the situation grasping unit detects the presence of the phosphor coating region R, and the light irradiation units 11, 21 irradiate the road surface with the primary light. Since the phosphor-containing layer 4 is laminated within the phosphor coating region R, the primary light is wavelength-converted into the secondary light by the phosphor material contained in the phosphor-containing layer 4, and the guidance information M1, M2 is drawn on the road surface by the mixture of the primary light and the secondary light.
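 The region-gated irradiation can be pictured with the small sketch below; representing each phosphor coating region R as a circle taken from map data is an assumption made only for illustration.

```python
def primary_light_on(vehicle_pos: tuple, phosphor_regions: list) -> bool:
    """Return True only while the vehicle is inside a phosphor coating region R.
    Each region is approximated as ((center_x, center_y), radius_m) from map data."""
    x, y = vehicle_pos
    for (cx, cy), radius_m in phosphor_regions:
        if (x - cx) ** 2 + (y - cy) ** 2 <= radius_m ** 2:
            return True
    return False
```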
In the present embodiment, the road surface is irradiated with primary light from the light irradiation units 11 and 21 only in the phosphor-coated region R, and primary light is not irradiated in other regions where the phosphor-containing layer 4 is not formed, so that power consumption can be reduced.
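For illustration, a minimal Python sketch of this region-gated irradiation control is shown below; the rectangular region model and the helper names (vehicle_xy, light_unit and its methods) are assumptions introduced here and are not defined in the present disclosure.

# Sketch of region-gated primary-light irradiation (assumed helper names).
from dataclasses import dataclass

@dataclass
class Region:
    x_min: float
    y_min: float
    x_max: float
    y_max: float

    def contains(self, x: float, y: float) -> bool:
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max

def update_irradiation(vehicle_xy, phosphor_regions, light_unit):
    """Irradiate primary light only while the vehicle is inside a phosphor-coated region R."""
    x, y = vehicle_xy
    if any(region.contains(x, y) for region in phosphor_regions):
        light_unit.irradiate_primary_light()   # draw guidance on the coated road surface
    else:
        light_unit.stop_irradiation()          # save power outside the region R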
(Operation example of the first embodiment)
Next, an operation example of the first embodiment of the present disclosure will be described with reference to FIGS. 1, 3, and 6.
FIG. 6 is a flowchart showing an operation example of the vehicle driving support system 100 according to the first embodiment. First, in the motion information acquisition step of step 1, the vehicle motion grasping unit 110 acquires the first motion information of the first vehicle 10 and the second motion information of the second vehicle 20. In addition, the situation grasping units provided in the first vehicle 10, the second vehicle 20, and the infrastructure-side device 30 acquire information such as the congestion status of the road 1, the presence and movements of other traveling vehicles, pedestrians, and two-wheeled vehicles, and the road and traffic conditions of the road 1 at the merging destination.
Next, in the merging determination step of step 2, the vehicle motion grasping unit 110 determines whether the first motion information includes a left turn of the first vehicle 10, the second motion information includes a right turn of the second vehicle 20, and the traveling direction after the left turn of the first vehicle 10 is the same as that after the right turn of the second vehicle 20, that is, whether the two course changes constitute a merging in the same direction. When the vehicle motion grasping unit 110 determines that the vehicles are merging, the vehicle driving support system 100 proceeds to step 3, and when it determines that they are not merging, the system returns to the motion information acquisition step of step 1. In determining the merging, the vehicle motion grasping unit 110 may take into account the overlap between the left turn timing of the first vehicle 10 and the right turn timing of the second vehicle 20. For example, the vehicle motion grasping unit 110 determines that the vehicles are merging when the left turn of the first vehicle 10 and the right turn of the second vehicle 20 are executed within a predetermined time, and determines that they are not merging when there is an interval of the predetermined time or more between them.
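For illustration, the merging determination of step 2 may be sketched as the following Python predicate; the motion-information fields (turn direction, post-turn heading, and expected turn time) and the numeric tolerances are assumptions introduced here and are not specified in the present disclosure.

# Sketch of the step-2 merging determination (assumed data fields and tolerances).
from dataclasses import dataclass

@dataclass
class MotionInfo:
    turn: str             # "left", "right", or "straight"
    heading_after: float  # planned heading after the turn, in degrees
    turn_time: float      # expected turn start time, in seconds

def is_merging(first: MotionInfo, second: MotionInfo,
               max_gap_s: float = 10.0, heading_tol_deg: float = 15.0) -> bool:
    """True when a left turn of the first vehicle and a right turn of the second
    vehicle lead to the same direction within a predetermined time window."""
    same_direction = abs(first.heading_after - second.heading_after) <= heading_tol_deg
    within_window = abs(first.turn_time - second.turn_time) <= max_gap_s
    return first.turn == "left" and second.turn == "right" and same_direction and within_window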
In the guidance information creation step of step 3, based on the first motion information and the second motion information, the guidance information creation unit 120 creates the first guidance information M1 that guides the movement of the first vehicle 10 and the second guidance information M2 that guides the movement of the second vehicle 20. The first guidance information and the second guidance information each correspond to the guidance information in the present disclosure. In creating the first guidance information M1 and the second guidance information M2, the guidance information creation unit 120 may also take into account the information on the road 1 acquired in the motion information acquisition step. Specific examples of the first guidance information M1 and the second guidance information M2 will be described later. When the creation of the guidance information is completed, the vehicle driving support system 100 proceeds to step 4.
Next, in the guidance information presentation step of step 4, based on the first guidance information M1 and the second guidance information M2, the guidance information presentation unit 130 determines the presentation method of the guidance information and presents it to the first vehicle 10 and the second vehicle 20. As described above, the presentation method of the guidance information may include drawing on the road surface, display on an in-vehicle image display device, display on a head-up display, and voice guidance. The guidance information may also be presented by combining these methods. When the presentation of the guidance information is completed, the vehicle driving support system 100 returns to the motion information acquisition step of step 1.
Examples of the operation of the vehicle motion grasping unit 110, the guidance information creation unit 120, and the guidance information presentation unit 130 include the following. As one example, when the merging destination has a single lane, the vehicle driving support system 100 presents the left turn information to the first vehicle 10 first, and presents the right turn information to the second vehicle 20 after the left turn of the first vehicle 10 is completed.
As another example, when the merging destination has two or more lanes, the vehicle driving support system 100 presents different lanes for the first vehicle 10 after its left turn and the second vehicle 20 after its right turn. Specifically, the vehicle driving support system 100 presents the first guidance information M1 to the first vehicle 10 so as to travel in the left lane after the left turn, and presents the second guidance information M2 to the second vehicle 20 so as to travel in the right lane after the right turn. Alternatively, the first guidance information M1 may be presented to the first vehicle 10 so as to travel in the right lane after the left turn, and the second guidance information M2 may be presented to the second vehicle 20 so as to travel in the left lane after the right turn. For such an instruction of the traveling lane after the turn, a lane that makes it easy to transition to the subsequent route may be selected, for example depending on whether the vehicle goes straight or turns again after the turn. Further, when the first vehicle 10 is a large vehicle, its turning radius is large, and therefore the first guidance information M1 may be presented so that it travels in the right lane after the left turn.
When the first vehicle 10 and the second vehicle 20 are guided to different traveling lanes after merging, the presentation timing of the first guidance information M1 and the presentation timing of the second guidance information M2 may overlap. When the traveling lanes of the first vehicle 10 and the second vehicle 20 after merging are different, even if the timings at which the first vehicle 10 and the second vehicle 20 enter the merging road 1 overlap, the left turn and the right turn of the two vehicles can be completed in a short time while avoiding a collision between them, and the vehicle driving support system 100 supports smooth traffic.
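For illustration, the lane-assignment examples above may be sketched as follows in Python; the guidance labels and the simplified inputs (number of lanes, large-vehicle flag) are assumptions introduced here.

# Sketch of guidance selection for the merging destination (assumed labels).
def create_guidance(num_lanes: int, first_is_large_vehicle: bool = False):
    """Return (first guidance M1, second guidance M2) for a left-turning first
    vehicle and a right-turning second vehicle merging onto the same road."""
    if num_lanes == 1:
        # Single lane: the first vehicle turns left first; the second vehicle
        # receives its right-turn guidance after the left turn is completed.
        return ("turn left now", "wait, then turn right")
    # Two or more lanes: assign different lanes so that both turns may overlap in time.
    if first_is_large_vehicle:
        return ("turn left into the right lane", "turn right into the left lane")
    return ("turn left into the left lane", "turn right into the right lane")

# Example: a two-lane merging destination with an ordinary first vehicle.
print(create_guidance(2))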
As described above, in the vehicle driving support system 100 of the present embodiment, the vehicle motion grasping unit 110 acquires the first motion information and the second motion information. When the first motion information and the second motion information indicate course changes in the same direction and include a right turn and a left turn, the guidance information creation unit 120 creates, as guidance information, the route that the first vehicle 10 or the second vehicle 20 is scheduled to travel, and the guidance information presentation unit 130 presents the guidance information. As a result, even when a plurality of vehicles merge at an intersection, the vehicle driving support system 100 can support smooth traffic while ensuring safety.
(Second Embodiment)
Next, a second embodiment of the present disclosure will be described with reference to FIG. 7. Description of the contents overlapping with the first embodiment is omitted. FIG. 7 is a schematic view showing an operation example of the vehicle driving support system 200 according to the second embodiment on the road. As an example of the vehicle driving support system 200, FIG. 7 shows a case where, as in the first embodiment, the first vehicle 10 turns left and the second vehicle 20 turns right at an intersection where roads 1 cross and a sidewalk 2 is provided, and the two vehicles merge and travel in the same direction.
This embodiment shows a case where the left turn of the first vehicle 10 is given priority and the right turn timing of the second vehicle 20 is delayed. When a condition for stopping the second vehicle 20 exists, the corresponding guidance information is selected by the guidance information creation unit 120 in the guidance information creation step of step 3 (FIG. 6). The conditions for stopping the second vehicle 20 include, for example, a case where the road 1 after merging has a single lane, a case where the first vehicle 10 is a large vehicle and requires time for its left turn, and a case where the road 1 after merging is congested.
As shown in FIG. 7, the guidance information presentation unit 130 presents the first guidance information M1 indicating a left turn to the first vehicle 10 and the second guidance information M2 indicating a stop to the second vehicle 20. The presentation of the first guidance information M1 and the second guidance information M2 continues until the above-described condition for stopping the second vehicle 20 is resolved, and after the condition is resolved, the second guidance information M2 indicating a right turn is presented as shown in FIG. 1. FIG. 7 shows an example in which the characters "STOP" are drawn on the road surface as the second guidance information M2 indicating the stop of the second vehicle 20; however, a figure such as an icon may be presented to improve visibility, and voice guidance may also be used.
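For illustration, the selection between the stop guidance and the right-turn guidance may be sketched as follows in Python; the condition flags are assumptions introduced here, since the disclosure lists the stop conditions but not how they are detected.

# Sketch of the second-embodiment guidance for the right-turning second vehicle
# (assumed condition flags).
def second_vehicle_guidance(single_lane: bool, first_is_large: bool, congested: bool) -> str:
    """Present "STOP" while any stop condition holds, then switch to the right turn."""
    stop_condition = single_lane or first_is_large or congested
    return "STOP" if stop_condition else "turn right"

# Example: congestion on the merging road keeps the second vehicle stopped.
print(second_vehicle_guidance(single_lane=False, first_is_large=False, congested=True))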
The vehicle driving support system 200 of the present embodiment explicitly presents the stop and the right turn to the second vehicle 20, so that even when a plurality of vehicles merge at an intersection, it can support smooth traffic while ensuring safety.
(Third Embodiment)
Next, a third embodiment of the present disclosure will be described with reference to FIG. 8. Description of the contents overlapping with the first embodiment is omitted. FIG. 8 is a schematic view showing an operation example of the vehicle driving support system 300 according to the third embodiment on the road. As an example of the vehicle driving support system 300, FIG. 8 shows a case where, as in the first embodiment, the first vehicle 10 turns left and the second vehicle 20 turns right at an intersection where roads 1 cross and a sidewalk 2 is provided, and the two vehicles merge and travel in the same direction.
This embodiment shows a case where the planned left turn of the first vehicle 10 is presented preliminarily before the intersection. As shown in FIG. 8, when the first vehicle 10 travels on the road 1 and reaches a predetermined distance L1 from the intersection at which it turns left, the first guidance information M1 indicating a left turn is presented as in the first embodiment. When the predetermined distance L2 is longer than the predetermined distance L1 and the first vehicle is traveling between the predetermined distances L2 and L1, that is, before reaching the predetermined distance L1, left-turn preparation is presented as the first guidance information M1. Based on the position information and the map information of the first vehicle 10, the guidance information (first guidance information M1) is selected by the guidance information creation unit 120 in the guidance information creation step of step 3 (FIG. 6). As an example of the predetermined distances L1 and L2, the predetermined distance L1 is 5 m from the intersection and the predetermined distance L2 is 30 m from the intersection.
As an example of the first guidance information M1 indicating a left turn and the first guidance information M1 indicating left-turn preparation, when the former is a character or image indicating turning to the left, an image different from the left-turn first guidance information M1 is used for the left-turn preparation. Different images include images in which the color, brightness, line width, solid or broken lines, and blinking or steady lighting have been changed.
The guidance information creation unit 120 may calculate the time until the first vehicle 10 reaches the intersection and starts its left turn, based on the speed of the first vehicle 10, the distance to the intersection, and the deceleration. When sufficient time is secured for the second vehicle 20 to turn right safely, the guidance information presentation unit 130 may present a right turn to the second vehicle 20 as the second guidance information M2 before presenting the first guidance information M1 indicating the left turn.
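For illustration, the distance-banded presentation and the time estimate may be sketched as follows in Python, using the example values L1 = 5 m and L2 = 30 m; the constant-deceleration model is an assumption introduced here, since the disclosure names only the inputs (speed, distance, deceleration).

# Sketch of the third-embodiment guidance selection and time estimate
# (example distances L1 = 5 m, L2 = 30 m; assumed constant-deceleration model).
import math

def left_turn_guidance(distance_m: float, L1: float = 5.0, L2: float = 30.0) -> str:
    """Present left-turn preparation between L2 and L1, and the left turn within L1."""
    if distance_m <= L1:
        return "turn left"
    if distance_m <= L2:
        return "prepare to turn left"   # drawn with a different color or line style
    return "no guidance"

def time_to_left_turn(speed_mps: float, distance_m: float, decel_mps2: float) -> float:
    """Estimated time until the first vehicle reaches the intersection and starts its left turn."""
    if decel_mps2 <= 0:
        return distance_m / speed_mps
    # Solve distance = v*t - 0.5*a*t**2 for the earliest positive t.
    disc = speed_mps ** 2 - 2 * decel_mps2 * distance_m
    if disc < 0:
        return speed_mps / decel_mps2   # the vehicle would stop before the intersection
    return (speed_mps - math.sqrt(disc)) / decel_mps2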
The vehicle driving support system 300 of the present embodiment presents the left turn of the first vehicle 10 from a point before the intersection, thereby notifying the second vehicle 20, other surrounding vehicles, pedestrians, two-wheeled vehicles, and the like of the planned left turn and calling their attention. Even when a plurality of vehicles merge at an intersection, the vehicle driving support system 300 can support smooth traffic while ensuring safety. Further, notifying the second vehicle 20 of the planned left turn can also encourage the second vehicle 20 to turn right first, so that even smoother traffic can be supported.
(Fourth Embodiment)
Next, a fourth embodiment of the present disclosure will be described with reference to FIG. 9. Description of the contents overlapping with the first embodiment is omitted. FIG. 9 is a schematic view showing an operation example of the vehicle driving support system 400 according to the fourth embodiment on the road. As an example of the vehicle driving support system 400, FIG. 9 shows a case where, as in the first embodiment, the first vehicle 10 turns left and the second vehicle 20 turns right at an intersection where roads 1 cross and a sidewalk 2 is provided, and the two vehicles merge and travel in the same direction.
This embodiment shows a case where a two-wheeled vehicle 40 is running alongside the left side of the first vehicle 10 that turns left at the intersection. Examples of the two-wheeled vehicle 40 include a bicycle and a motorcycle, but it may also be a three-wheeled vehicle, an ultra-compact vehicle, a runner running along the road shoulder, or the like, as long as it is an object moving alongside the first vehicle 10.
In the motion information acquisition step of step 1 (FIG. 6), the vehicle motion grasping unit 110 acquires the conditions of the surrounding road 1 and sidewalk 2 in addition to the first motion information of the first vehicle 10 and the second motion information of the second vehicle 20. In the merging determination step of step 2, in addition to determining the merging of the first vehicle 10 and the second vehicle 20 by their turns, the vehicle motion grasping unit 110 determines that the two-wheeled vehicle 40 is traveling on the left side of the first vehicle 10 and that the two-wheeled vehicle 40 will reach the intersection at the timing when the first vehicle 10 turns left. In the guidance information creation step of step 3, the guidance information creation unit 120 creates the first guidance information M1a and M1b indicating the left turn and the stop of the first vehicle 10, and the second guidance information M2 indicating the right turn of the second vehicle 20. In the guidance information presentation step of step 4, the guidance information presentation unit 130 presents the first guidance information M1a and M1b indicating the left turn and the stop to the first vehicle 10, and presents the second guidance information M2 indicating the right turn to the second vehicle 20.
In FIG. 9, the guidance information presentation unit 130 presents the second guidance information M2 indicating a right turn to the second vehicle 20; however, when the two-wheeled vehicle 40 goes straight through the intersection, second guidance information M2 indicating a stop as shown in FIG. 7 may be presented. The guidance information presentation unit 130 may present the first guidance information M1a and M1b at the same time, or may first present the first guidance information M1b indicating the stop and then present the first guidance information M1a indicating the left turn after the parallel-running state of the two-wheeled vehicle 40 has been resolved. The guidance information presentation unit 130 may also use different images for the first guidance information M1a and M1b. Different images include images in which the color, brightness, line width, solid or broken lines, and blinking or steady lighting have been changed.
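For illustration, the sequencing of the first guidance information M1b (stop) and M1a (left turn) may be sketched as follows in Python; the detection flags for the parallel-running two-wheeled vehicle are assumptions introduced here.

# Sketch of the fourth-embodiment guidance sequence for the first vehicle
# (assumed detection flags).
def first_vehicle_guidance(two_wheeler_alongside: bool, reaches_intersection_together: bool):
    """Return the guidance sequence: M1b ("STOP") first, then M1a ("turn left")."""
    if two_wheeler_alongside and reaches_intersection_together:
        # Stop first to avoid catching the two-wheeled vehicle, and turn left
        # only after the parallel-running state has been resolved.
        return ["STOP", "turn left"]
    return ["turn left"]

print(first_vehicle_guidance(True, True))    # ['STOP', 'turn left']
print(first_vehicle_guidance(False, False))  # ['turn left']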
The vehicle driving support system 400 of the present embodiment detects the parallel running of the two-wheeled vehicle 40 or the like and stops the left turn of the first vehicle 10, thereby preventing an accident in which the two-wheeled vehicle 40 is caught. Even when a plurality of vehicles merge at an intersection, the system can support smooth traffic while ensuring safety.
(Fifth Embodiment)
Next, a fifth embodiment of the present disclosure will be described with reference to FIG. 10. Description of the contents overlapping with the first embodiment is omitted. FIG. 10 is a schematic view showing an operation example of the vehicle driving support system 500 according to the fifth embodiment on the road. As an example of the vehicle driving support system 500, FIG. 10 shows a case where, as in the first embodiment, the first vehicle 10 turns left and the second vehicle 20 turns right at an intersection where roads 1 cross and a sidewalk 2 is provided, and the two vehicles merge and travel in the same direction.
In the present embodiment, a pedestrian crossing is provided in the lane where the first vehicle 10 and the second vehicle 20 merge. FIG. 10 shows a case where a pedestrian 50 is alerted when the first vehicle 10 starts its left turn or the second vehicle 20 starts its right turn. When it is clear that the pedestrian 50 will cross the pedestrian crossing, the crossing of the pedestrian 50 takes priority over the left turn of the first vehicle 10 and the right turn of the second vehicle 20. However, when it is difficult to accurately predict the movement of the pedestrian 50, or when no pedestrian 50 is present near the pedestrian crossing, the vehicle driving support system 500 presents alert information around the pedestrian crossing as a preliminary warning.
The method of presenting the alert information around the pedestrian crossing may be road surface drawing from the infrastructure-side device 30 as shown in FIG. 4A. The infrastructure-side device 30 may include a speaker, which may announce the left turn of the first vehicle 10 and the right turn of the second vehicle 20 by voice guidance and broadcast an announcement calling for attention. When the pedestrian 50 carries a portable electronic device, the portable electronic device may present the alert information by image or sound, using the information communication units provided in the first vehicle 10, the second vehicle 20, and the infrastructure-side device 30.
In the vehicle driving support system 500 of the present embodiment, even when the left turn of the first vehicle 10 and the right turn of the second vehicle 20 are permitted and the turning operations have started, attention is called to a pedestrian 50 who suddenly tries to cross the pedestrian crossing. Even when a plurality of vehicles merge at an intersection, the system can support smooth traffic while ensuring safety.
(Sixth Embodiment)
Next, a sixth embodiment of the present disclosure will be described with reference to FIG. 11. Description of the contents overlapping with the first embodiment is omitted. FIG. 11 is a schematic view showing an operation example of the vehicle driving support system 600 according to the sixth embodiment on the road.
As an example of the vehicle driving support system 600, FIG. 11 shows an intersection where roads 1 cross and a sidewalk 2 is provided. On the road 1, a receiving-side vehicle 10C as the first vehicle and a transmitting-side vehicle 20C as the second vehicle are traveling toward each other. On the area of the sidewalk 2, an infrastructure-side device 30 and a portable electronic device 60 held by a pedestrian are arranged. In FIG. 11, the receiving-side vehicle 10C is about to travel straight on the road 1 from the lower side toward the upper side of the figure, and the transmitting-side vehicle 20C is about to turn right from the upper side toward the left side of the figure.
The receiving-side vehicle 10C is a vehicle traveling on the road 1 and is a driving support vehicle in which part of the steering control and acceleration/deceleration control is performed by a computer or the like. In FIG. 11, the receiving-side vehicle 10C takes a route going straight on the road 1, but when the transmitting-side vehicle 20C crosses in front of it to turn right, the receiving-side vehicle 10C performs a stop operation or a deceleration operation by detecting an optical signal described later.
The transmitting-side vehicle 20C is a vehicle traveling on the road 1. The transmitting-side vehicle 20C may be a driving support vehicle, or may be a manually driven vehicle that does not have a driving support function. In FIG. 11, when the transmitting-side vehicle 20C turns right on the road 1 and is about to cross in front of the receiving-side vehicle 10C, it irradiates the receiving-side vehicle 10C with an optical signal described later to cause the receiving-side vehicle 10C to execute a stop operation or a deceleration operation.
The infrastructure-side device 30 is a device that irradiates the receiving-side vehicle 10C with an optical signal. The infrastructure-side device 30 has a function of grasping the situation of vehicles on the road 1, as described later, and depending on the situation, irradiates the receiving-side vehicle 10C with the optical signal described later to cause the receiving-side vehicle 10C to execute a stop operation or a deceleration operation.
The portable electronic device 60 is, for example, a portable electronic device or lighting device that can be carried by a pedestrian on the sidewalk 2. The specific configuration of the portable electronic device 60 is not limited. The portable electronic device 60 has at least a function of emitting light, and may take the form of a flashlight or a portable communication device. The portable electronic device 60 irradiates the receiving-side vehicle 10C with an optical signal when operated by the pedestrian, and causes the receiving-side vehicle 10C to execute a stop operation or a deceleration operation.
FIG. 12 is a block diagram showing the configuration of the vehicle driving support system 600. As shown in FIG. 12, the receiving-side vehicle 10C includes a detection unit 11C, a driving support unit 12, a vehicle motion control unit 13, a situation grasping unit 14, and an information communication unit 15. The transmitting-side vehicle 20C, the infrastructure-side device 30, and the portable electronic device 60 include light irradiation units 21, 31, and 61, situation grasping units 22, 32, and 62, and information communication units 23, 33, and 63, respectively. Here, the driving support unit 12, the vehicle motion control unit 13, and the situation grasping units 14, 22, 32, and 62 execute predetermined information processing according to programs recorded in advance on a computer provided with a central processing unit (CPU), a memory, an external storage device, and the like.
The detection unit 11C detects light from outside the vehicle, converts it into an electrical signal, and transmits the converted signal to the driving support unit 12. The specific configuration of the detection unit 11C is not limited. The detection unit 11C may be an optical sensor or an image capturing device. The wavelength of the light detected by the detection unit 11C is not limited, and may be infrared light, visible light, ultraviolet light, or white light.
The driving support unit 12 processes the information on the traveling state and the surrounding conditions acquired from the situation grasping unit 14 and the information communication unit 15 in order to support the driving of the receiving-side vehicle 10C, and outputs a driving control signal for controlling the operation of the receiving-side vehicle 10C to the vehicle motion control unit 13. In addition to the driving support function, the driving support unit 12 outputs a driving control signal indicating a stop operation or a deceleration operation to the vehicle motion control unit 13 when the light detected by the detection unit 11C includes a predetermined optical signal.
The vehicle motion control unit 13 executes the steering control and the acceleration/deceleration control of the receiving-side vehicle 10C based on the driving control signal output from the driving support unit 12. The vehicle motion control unit 13 has a driving support function and supports the driving of the receiving-side vehicle 10C by adjusting the output of the power source, operating the brakes, changing the steering angle, displaying driving guidance, controlling the lighting of the turn signals and stop lamps, and the like.
The situation grasping unit 14 acquires information on the traveling state and the surrounding conditions of the receiving-side vehicle 10C and transmits it to the driving support unit 12. The situation grasping unit 14 includes various sensor devices such as a vehicle speed sensor, a position sensor, an image capturing device, a laser range finder, and LIDAR (Light Detection and Ranging). The traveling state acquired by the situation grasping unit 14 includes the traveling speed, position, vehicle body orientation, steering angle, and brake operation of the receiving-side vehicle 10C, the traveling route from the car navigation system, direction instructions obtained by recognizing conversation in the vehicle, and the like. The surrounding conditions acquired by the situation grasping unit 14 include the road surface condition, the ambient temperature, the road map from the car navigation system, the road gradient, the detection of surrounding objects by image recognition, the inter-vehicle distances to and behavior predictions of the preceding, oncoming, and following vehicles, the detection of pedestrians by image recognition, and the like.
The information communication unit 15 is connected to the driving support unit 12 and the situation grasping unit 14 and communicates information with communication units provided outside the receiving-side vehicle 10C. The information communication unit 15 communicates by radio waves or light. The information communication unit 15 performs vehicle-to-vehicle communication between vehicles and road-to-vehicle communication with facilities provided on the road to obtain information such as the traveling state and the surrounding conditions.
The detection unit 11C may detect light including information unrelated to the operation of the vehicle driving support system 600, but the detection unit 11C is configured to detect a predetermined optical signal related to the operation of the vehicle driving support system 600. The predetermined optical signal includes, for example, a specific wavelength or a pulse signal with a specific waveform. The predetermined optical signal may also be a light intensity exceeding the dynamic range of the detection unit 11C within the wavelength range detectable by the detection unit 11C. The information on these predetermined optical signals is recorded in the driving support unit 12, or the processing procedure to be followed when a light intensity exceeding the dynamic range is received is recorded in the driving support unit 12.
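For illustration, the matching of detected light against such a predetermined optical signal may be sketched as follows in Python; the numeric thresholds and the pulse pattern are assumptions introduced here, since the disclosure specifies only the criteria (wavelength, waveform, dynamic-range saturation).

# Sketch of the predetermined-optical-signal check in the driving support unit
# (assumed wavelength, pulse pattern, and intensity ceiling).
SIGNAL_WAVELENGTH_NM = 905.0             # assumed dedicated wavelength
WAVELENGTH_TOL_NM = 5.0
SIGNAL_PULSE_PATTERN = [1, 0, 1, 1, 0]   # assumed pulse waveform
SENSOR_MAX_INTENSITY = 1.0               # normalized dynamic-range ceiling of the detector

def is_predetermined_signal(wavelength_nm: float, pulse_pattern, intensity: float) -> bool:
    """True when the detected light matches the dedicated wavelength, matches the
    pulse waveform, or saturates the dynamic range of the detection unit."""
    wavelength_match = abs(wavelength_nm - SIGNAL_WAVELENGTH_NM) <= WAVELENGTH_TOL_NM
    pattern_match = list(pulse_pattern) == SIGNAL_PULSE_PATTERN
    saturated = intensity >= SENSOR_MAX_INTENSITY
    return wavelength_match or pattern_match or saturated

def driving_control_signal(signal_detected: bool) -> str:
    """The driving support unit outputs a stop or deceleration command on a match."""
    return "decelerate_or_stop" if signal_detected else "continue"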
The light irradiation units 21, 31, and 61 are provided in the transmitting-side vehicle 20C, the infrastructure-side device 30, and the portable electronic device 60, respectively, and irradiate the detection unit 11C with the predetermined optical signal. The predetermined optical signal is, for example, a specific wavelength, a specific signal waveform, a light intensity exceeding the dynamic range, or the like. FIGS. 11 and 12 show an example in which the transmitting-side vehicle 20C, the infrastructure-side device 30, and the portable electronic device 60 are all arranged on the road 1 or the sidewalk 2. However, since it is sufficient for the detection unit 11C to be irradiated with the optical signal by any one of the light irradiation units 21, 31, and 61, the vehicle driving support system 600 functions as long as any one of the transmitting-side vehicle 20C, the infrastructure-side device 30, and the portable electronic device 60 is present on the road 1 or the sidewalk 2.
The situation grasping units 22, 32, and 62 are provided in the transmitting-side vehicle 20C, the infrastructure-side device 30, and the portable electronic device 60, respectively, and may have the same configuration as the situation grasping unit 14. The situation grasping units 22, 32, and 62 acquire information on the traveling state and surrounding conditions of the vehicle or the pedestrian and transmit it to the outside via the information communication units 23, 33, and 63. The situation grasping units 22, 32, and 62 may also acquire the traveling state and the surrounding conditions from the outside via the information communication units 23, 33, and 63. The situation grasping units 22, 32, and 62 evaluate the acquired traveling state and surrounding conditions and, when they satisfy a predetermined irradiation condition, cause the light irradiation units 21, 31, and 61 to irradiate the detection unit 11C with the optical signal.
The information communication units 23, 33, and 63 are provided in the transmitting-side vehicle 20C, the infrastructure-side device 30, and the portable electronic device 60, respectively, and communicate information with communication units provided outside the receiving-side vehicle 10C. The information communication units 23, 33, and 63 each communicate by radio waves or light.
FIG. 12 shows an example in which the transmitting-side vehicle 20C, the infrastructure-side device 30, and the portable electronic device 60 are provided with the situation grasping units 22, 32, and 62 and the information communication units 23, 33, and 63; however, since these are not essential to the operation of the vehicle driving support system 600, they may be omitted.
FIGS. 13A to 13C are schematic views showing examples of light irradiation in the vehicle driving support system 600. FIG. 13A shows irradiation of light from the transmitting-side vehicle 20C. FIG. 13B shows irradiation of light from the infrastructure-side device 30. FIG. 13C shows irradiation of light from the portable electronic device 60 held by a pedestrian. As shown in FIGS. 13A to 13C, the predetermined optical signal is irradiated from the light irradiation unit 21, 31, or 61 of the transmitting-side vehicle 20C, the infrastructure-side device 30, or the portable electronic device 60 toward the detection unit 11C of the receiving-side vehicle 10C. When the light detected by the detection unit 11C of the receiving-side vehicle 10C partially matches the predetermined optical signal, the driving support unit 12 outputs a driving control signal indicating a stop operation or a deceleration operation to the vehicle motion control unit 13 to decelerate or stop the receiving-side vehicle 10C.
As shown in FIG. 13A, in the vehicle driving support system 600 of the present embodiment, the transmitting-side vehicle 20C may be a manually driven vehicle that does not have a driving support function. When the driver of the transmitting-side vehicle 20C performs a passing operation or operates a dedicated switch so that the light irradiation unit 21 irradiates the detection unit 11C with the predetermined optical signal, the vehicle driving support system 600 can activate the stop operation or the deceleration operation by the driving support unit 12 of the receiving-side vehicle 10C. The light irradiation unit 21 of the transmitting-side vehicle 20C may be a headlamp, a fog lamp, a decorative lamp, or the like, so that the vehicle driving support system 600 can be used more simply than by adding a driving support function to the vehicle.
In FIG. 13B, the infrastructure-side device 30 grasps the traveling states and surrounding conditions of the receiving-side vehicle 10C and the transmitting-side vehicle 20C by means of the situation grasping unit 32 or the information communication unit 33. When the receiving-side vehicle 10C is an oncoming vehicle of the transmitting-side vehicle 20C and the transmitting-side vehicle 20C makes a right turn or a left turn crossing in front of the receiving-side vehicle 10C, the infrastructure-side device 30 determines that the irradiation condition is satisfied and causes the light irradiation unit 31 to irradiate the detection unit 11C with the optical signal. As a result, in the receiving-side vehicle 10C, the driving support unit 12 outputs a driving control signal indicating a stop operation or a deceleration operation to the vehicle motion control unit 13 to decelerate or stop the receiving-side vehicle 10C.
Another example of the irradiation condition under which the infrastructure-side device 30 decides to irradiate light is a case where, as shown in FIG. 11, a pedestrian crossing exists in front of the receiving-side vehicle 10C, the receiving-side vehicle 10C is predicted to continue traveling straight, and it is determined that a pedestrian will cross in front of the receiving-side vehicle 10C.
In the example shown in FIG. 13C, the pedestrian on the sidewalk 2 operates the portable electronic device 60 that he or she carries so that the light irradiation unit 61 irradiates the detection unit 11C with an optical signal. This allows the pedestrian to actively act on the receiving-side vehicle 10C, which is a driving support vehicle, to activate its stop operation or deceleration operation.
The portable electronic device 60 may grasp the traveling state and surrounding conditions of the receiving-side vehicle 10C by means of the situation grasping unit 62 or the information communication unit 63. In this case, the irradiation of the optical signal is executed only when the situation grasping unit 62 determines that the irradiation condition is satisfied. The irradiation condition includes a case where a pedestrian crossing exists in front of the receiving-side vehicle 10C, the receiving-side vehicle 10C is predicted to continue traveling straight, and the pedestrian crosses the pedestrian crossing.
FIG. 14 is a flowchart showing an operation example of the vehicle driving support system 600. In FIG. 14, the receiving-side vehicle 10C is traveling as a driving support vehicle with its driving support function active, and the transmitting-side vehicle 20C is also equipped with a driving support function. The flowchart of FIG. 14 shows the irradiation conditions for the optical signal in the transmitting-side vehicle 20C.
First, the driving support function of the transmitting-side vehicle 20C starts. Next, in step 1C, the situation grasping unit 22 determines from the traveling state of the transmitting-side vehicle 20C whether the vehicle is turning right or left; in the case of a right or left turn, the process proceeds to step 2C, and otherwise it returns to step 1C. Next, in step 2C, the situation grasping unit 22 determines whether there is an oncoming vehicle; when it predicts that the oncoming receiving-side vehicle 10C will continue straight ahead, the process proceeds to step 3C, and otherwise it returns to step 1C. Finally, in step 3C, the light irradiation unit 21 irradiates the detection unit 11C with the optical signal, and the process returns to step 1C.
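For illustration, the loop of steps 1C to 3C may be sketched as follows in Python; the interfaces of the situation grasping unit and the light irradiation unit are assumptions introduced here.

# Sketch of the FIG. 14 irradiation loop on the transmitting-side vehicle
# (assumed interfaces of the situation grasping unit and the light irradiation unit).
import time

def irradiation_loop_fig14(situation, light_unit, sample_period_s: float = 0.1):
    while situation.driving_support_active():
        turning = situation.own_turn() in ("right", "left")            # step 1C
        oncoming = situation.oncoming_vehicle() if turning else None   # step 2C
        if oncoming is not None and situation.predicts_straight(oncoming):
            light_unit.emit_predetermined_signal(target=oncoming)      # step 3C
        time.sleep(sample_period_s)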
In FIG. 14, since the transmitting-side vehicle 20C decides on the irradiation of the optical signal according to its traveling state, the optical signal is irradiated automatically when the irradiation condition is satisfied, without any special action by the driver of the transmitting-side vehicle 20C, and the stop operation or the deceleration operation of the receiving-side vehicle 10C can be activated.
As described above, in the vehicle driving support system 600 of the present embodiment, simply by irradiating the receiving-side vehicle 10C, which travels using driving support technology, with an optical signal from the light irradiation unit 21, 31, or 61, it is possible to actively prompt the receiving-side vehicle 10C to perform a stop operation or a deceleration operation and thereby support smooth traffic.
(Seventh Embodiment)
Next, a seventh embodiment of the present disclosure will be described with reference to FIG. 15. Description of the contents overlapping with the sixth embodiment is omitted. FIG. 15 is a schematic view showing an operation example of the vehicle driving support system 700 according to the seventh embodiment on the road.
As an example of the vehicle driving support system 700, in FIG. 15, the road 1 and the sidewalk 2 extend in a straight line. On the road 1, the transmitting-side vehicle 20C is traveling ahead of the receiving-side vehicle 10C. The infrastructure-side device 30 is arranged on the area of the sidewalk 2. In FIG. 15, the receiving-side vehicle 10C and the transmitting-side vehicle 20C are traveling straight on the road 1 from the lower side toward the upper side of the figure. In the present embodiment, when the receiving-side vehicle 10C approaches the transmitting-side vehicle 20C too closely, an optical signal is irradiated from the rear of the transmitting-side vehicle 20C to activate the stop operation or the deceleration operation of the receiving-side vehicle 10C, thereby prompting it to avoid danger and maintain a comfortable inter-vehicle distance.
FIG. 16 is a flowchart showing an operation example of the vehicle driving support system 700. In FIG. 16, the receiving-side vehicle 10C and the transmitting-side vehicle 20C both have a driving support function. The flowchart of FIG. 16 shows the irradiation conditions for the optical signal in the transmitting-side vehicle 20C.
First, the driving support function of the transmitting-side vehicle 20C starts. Next, in step 11, the situation grasping unit 22 determines from the traveling state of the transmitting-side vehicle 20C whether it is traveling straight; if so, the process proceeds to step 12, and otherwise it returns to step 11. Next, in step 12, the situation grasping unit 22 determines whether there is a following vehicle; when it determines that the receiving-side vehicle 10C is following, the process proceeds to step 13, and otherwise it returns to step 11. Next, in step 13, the situation grasping unit 22 measures the inter-vehicle distance between the transmitting-side vehicle 20C and the receiving-side vehicle 10C; when the distance is equal to or less than a predetermined value, the process proceeds to step 14, and otherwise it returns to step 11. Next, in step 14, the situation grasping unit 22 measures the time during which the inter-vehicle distance remains equal to or less than the predetermined value; when this state continues for a predetermined time or more, the process proceeds to step 15, and otherwise it returns to step 11. Finally, in step 15, the light irradiation unit 21 irradiates the detection unit 11C with the optical signal, and the process returns to step 11.
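For illustration, the conditions of steps 11 to 15 may be sketched as follows in Python; the numeric thresholds and the sampling period are assumptions introduced here, since the disclosure refers only to a predetermined distance and a predetermined time.

# Sketch of the FIG. 16 tailgating check on the transmitting-side vehicle
# (assumed thresholds and sampling period).
import time

DISTANCE_THRESHOLD_M = 10.0   # assumed "predetermined value" for the inter-vehicle distance
DURATION_THRESHOLD_S = 3.0    # assumed "predetermined time" the short gap must persist

def tailgating_monitor(situation, light_unit, sample_period_s: float = 0.1):
    below_since = None
    while situation.driving_support_active():
        # Steps 11-12: the vehicle must be traveling straight with a following vehicle.
        if situation.traveling_straight() and situation.has_following_vehicle():
            gap = situation.distance_to_following_vehicle()   # step 13
            if gap <= DISTANCE_THRESHOLD_M:
                below_since = below_since or time.monotonic()
                # Steps 14-15: irradiate once the short gap has persisted long enough.
                if time.monotonic() - below_since >= DURATION_THRESHOLD_S:
                    light_unit.emit_predetermined_signal(direction="rear")
                    below_since = None
            else:
                below_since = None
        else:
            below_since = None
        time.sleep(sample_period_s)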
Also in FIG. 16, since the transmitting-side vehicle 20C decides on the irradiation of the optical signal according to its traveling state, the optical signal is irradiated automatically when the irradiation condition is satisfied, without any special action by the driver of the transmitting-side vehicle 20C, and the stop operation or the deceleration operation of the receiving-side vehicle 10C can be activated, prompting danger avoidance and the maintenance of a comfortable inter-vehicle distance.
As described above, in the vehicle driving support system 700 of the present embodiment as well, simply by irradiating the receiving-side vehicle 10C, which travels using driving support technology, with an optical signal from the light irradiation unit 21, 31, or 61, it is possible to actively prompt the receiving-side vehicle 10C to perform a stop operation or a deceleration operation and thereby support smooth traffic.
(Eighth Embodiment)
Next, an eighth embodiment of the present disclosure will be described with reference to FIGS. 17A and 17B. Description of the contents overlapping with the first embodiment is omitted.
FIGS. 17A and 17B are schematic views of a vehicle 10D traveling using the vehicle driving support system 800 according to the present embodiment. FIG. 17A is a front view of the vehicle 10D. FIG. 17B is a rear view of the vehicle 10D. The vehicle 10D is an automobile equipped with a road surface drawing device and includes headlamps 2D and lighting units (irradiation units) 52D that illuminate the area in front of the vehicle 10D. The headlamps 2D are arranged as a right headlamp 2R and a left headlamp 2L on the left and right of the front in the traveling direction of the vehicle 10D. Each headlamp 2D may include a light source, a reflector, and the like in a lamp body (not shown).
In the present embodiment, the lighting units (irradiation units) 52D are arranged below the left and right headlamps 2D on the front of the vehicle 10D. The lighting units 52D in the present embodiment are divided into a right lighting unit 52R and a left lighting unit 52L arranged on the left and right; however, the vehicle driving support system and the road surface drawing device of the present disclosure are not limited to this, and, for example, a single unit may be arranged at the center of the front of the vehicle 10D. The lighting unit 52D is the drawing projection unit of the road surface drawing device that displays various drawings (marks) on the road surface in the vehicle driving support system 800. The structure of the lighting unit 52D is, for example, a laser scanning device (not shown) including a laser light source and an optical deflector that deflects the laser light emitted from the laser light source. The optical deflector is, for example, a movable mirror such as a MEMS (Micro Electro Mechanical Systems) mirror or a galvanometer mirror. The lighting unit 52D may be any unit capable of displaying the predetermined drawings (marks) on the road surface in front of the vehicle 10D, and may be a liquid crystal display, an LED array, a digital mirror device (DMD), or the like. The operation of the lighting unit 52D, such as turning on and off, is controlled in accordance with commands from the lighting control unit 51D of the road surface drawing unit 50D in the vehicle driving support system 800 described later.
As shown in FIG. 17B, rear lighting units 52B are also provided below the back lamps 2B on the rear side of the vehicle 10D. In the present embodiment, a left lighting unit 52BL is arranged under the left back lamp 2BL and a right lighting unit 52BR is arranged under the right back lamp 2BR. The arrangement of the rear lighting units 52B is not limited to this, and a single unit may be arranged at the center of the rear face of the vehicle. The structure of the rear lighting unit 52B is the same as that of the front lighting unit 52D.
Next, various controls of the vehicle driving support system 800 of the present disclosure will be described. FIG. 18 is a block diagram of the vehicle driving support system 800 according to the present embodiment. The vehicle driving support system 800 of the present disclosure includes a control unit 90D, a vehicle course acquisition unit 20D, a vehicle state information acquisition unit 30D, an image selection unit 40D, and a road surface drawing unit 50D.
The control unit 90D controls various devices of the vehicle 10D and is constituted by an electronic control unit. The electronic control unit includes a microcontroller including a processor and a memory, and other electronic circuits such as transistors. The processor includes a CPU (Central Processing Unit), an MPU (Micro Processing Unit), and a GPU (Graphics Processing Unit). The memory includes a ROM and a RAM. The processor executes various control programs stored in the ROM and performs various processes in cooperation with the RAM.
Various sensors and external devices that monitor the outside of the vehicle 10D, such as a navigation system 11D, a direction indicator 12D, an in-vehicle camera 13D, a sensor 14D, and a wireless communication unit 15D, are connected to the control unit 90D, and various signals and data are input and output.
The navigation system 11D is a system that is connected to a satellite positioning system such as GPS, obtains the current position information of the vehicle 10D, indicates an appropriate course to the destination input by the driver, and guides the vehicle 10D. The control unit 90D acquires the course information of the vehicle 10D from the navigation system 11D and also acquires the current position information, orientation information, and the like of the vehicle 10D. In the present embodiment, the control unit 90D acquires the course information of the vehicle 10D from the navigation system 11D, but the present disclosure is not limited to this. The control unit 90D may acquire the course information using various other control means, such as sequential instructions from automated driving control.
The direction indicator 12D is interlocked with a lever (not shown) with which the driver inputs the traveling direction of the vehicle 10D; it inputs a signal indicating the traveling direction of the vehicle 10D to the control unit 90D and signals it to the outside of the vehicle 10D through turn signal lamps (winker lamps, not shown). The control unit 90D thus also acquires the course information of the vehicle 10D from the direction indicator 12D.
The in-vehicle camera 13D is provided to obtain information outside the vehicle, ahead of and behind the vehicle 10D. The in-vehicle camera 13F installed on the front face of the vehicle 10D sequentially captures the view ahead (including the road surface) and promptly transmits forward information, such as the presence of other vehicles or pedestrians in front of the vehicle 10D, to the control unit 90D. Similarly, the in-vehicle camera 13B installed on the rear face of the vehicle 10D sequentially captures the view behind and transmits rear information, such as the presence of other vehicles, two-wheeled vehicles, or pedestrians behind the vehicle 10D, to the control unit 90D. The in-vehicle camera 13D includes, for example, an image sensor such as a CCD (Charge Coupled Device) or CMOS (Complementary MOS) sensor. The in-vehicle camera 13D is combined with a millimeter-wave radar, a microwave radar, a laser radar, or the like to obtain information on the vehicle's surroundings such as other vehicles, pedestrians, road shapes, traffic signs, and obstacles. The in-vehicle camera 13D sends the captured image data to the control unit 90D. The control unit 90D may recognize the presence and position of pedestrians and other vehicles (including two-wheeled vehicles) from the image data using various analysis programs, or the in-vehicle camera 13D itself may include a program that recognizes pedestrians and the like. The imaging range of the in-vehicle camera also includes the road surface ahead of the vehicle 10D in the traveling direction.
Like the in-vehicle camera 13D, the sensor 14D is provided to obtain information outside the vehicle around the vehicle 10D. For example, an infrared sensor that detects whether another vehicle, a pedestrian, or the like is present in front of, behind, or to the side of the vehicle 10D, or a motion capture device that detects the movement of a pedestrian, is installed. When the sensor 14D detects a pedestrian or another vehicle (including a two-wheeled vehicle), it likewise transmits the detection data to the control unit 90D. The sensor 14D may further include an electronic compass and an angular velocity sensor that detect the orientation of the vehicle 10D, and the orientation information of the vehicle 10D may include information detected by the electronic compass and the angular velocity sensor.
The wireless communication unit 15D receives and transmits information by wireless communication with other devices outside the vehicle, such as other vehicles, predetermined devices installed at intersections, and automated driving instruction devices. The control unit 90D also obtains information about the vehicle's surroundings (the state of other vehicles, pedestrians, and the like) from the information received by the wireless communication unit 15D. The wireless communication unit 15D may also transmit the position and traveling direction of the vehicle 10D to other vehicles and the like. That is, the wireless communication unit 15D performs vehicle-to-vehicle communication between vehicles and road-to-vehicle communication with equipment provided on the road.
In the present embodiment, the in-vehicle camera 13D, the sensor 14D, and the wireless communication unit 15D, which detect information about the surroundings of the vehicle 10D, are collectively referred to as a surrounding information detection unit (or other-vehicle detection unit) 16D. The surrounding information detection unit 16D is not limited to the in-vehicle camera 13D, the sensor 14D, and the wireless communication unit 15D; it may be anything that obtains information on the surroundings of the vehicle 10D. For example, the surrounding information detection unit 16D may include a mechanism for obtaining images captured by other devices outside the vehicle.
The control unit 90D includes the vehicle course acquisition unit 20D, the vehicle state information acquisition unit 30D, and the image selection unit 40D. The vehicle course acquisition unit 20D acquires the course information of the vehicle 10D from the navigation system 11D, the direction indicator 12D, the wireless communication unit 15D, and the like described above, for example course information such as going straight, turning right, or turning left at the next intersection. The vehicle course acquisition unit 20D may acquire the course information periodically at predetermined intervals, or may acquire it when the course changes from the one previously obtained. The course information acquired by the vehicle course acquisition unit 20D includes not only information at intersections but all cases in which the course should be indicated to other vehicles, such as lane changes. When the vehicle 10D is under automated driving control, the vehicle course acquisition unit 20D also acquires the course information from information received by the wireless communication unit 15D.
The vehicle state information acquisition unit 30D acquires state information of the vehicle 10D, such as the position, orientation, and speed of the vehicle 10D. As described above, the vehicle state information acquisition unit 30D acquires the position information and orientation information of the vehicle 10D from the navigation system 11D. The vehicle state information acquisition unit 30D is connected to a speed sensor 31D.
The speed sensor 31D detects the traveling speed and the like of the vehicle 10D. The speed sensor 31D may further include an acceleration sensor or the like. The vehicle state information acquisition unit 30D obtains the position of the vehicle 10D after a predetermined time from the current position of the vehicle 10D obtained from the navigation system 11D or the like and from the speed and acceleration detected by the speed sensor 31D.
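By way of illustration only, and not as part of the original disclosure, the prediction of the position after a predetermined time could be pictured as the following minimal sketch, which assumes straight-line travel along the current heading with constant acceleration; the function and parameter names are hypothetical.

import math

def predict_position(x_m, y_m, heading_rad, speed_mps, accel_mps2, dt_s):
    """Estimate the vehicle position dt_s seconds ahead, assuming travel
    along the current heading with constant acceleration."""
    distance = speed_mps * dt_s + 0.5 * accel_mps2 * dt_s ** 2
    return (x_m + distance * math.cos(heading_rad),
            y_m + distance * math.sin(heading_rad))

# Example: about 40 km/h (11.1 m/s), no acceleration, looking 2 s ahead, heading east.
print(predict_position(0.0, 0.0, 0.0, 11.1, 0.0, 2.0))  # -> (22.2, 0.0)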
The image selection unit 40D selects the course display image 6D to be projected onto the road surface in front of or behind the vehicle 10D using the lighting units 52D, as described later. That is, when the course of the vehicle 10D changes, such as when turning right or left, the vehicle driving support system 800 of the present embodiment projects a course display image 6D indicating the course direction of the vehicle 10D onto the road surface so that the course of the vehicle 10D is accurately conveyed to other vehicles and pedestrians. Depending on the situation of other vehicles and pedestrians around the vehicle 10D, however, the image selection unit 40D adjusts its selection of display image data, for example by not selecting a course display image 6D or by changing the selected image as appropriate. Specifically, the image selection unit 40D is connected to a course display image data storage unit 41D in which a plurality of items of data of course display images 6D indicating courses such as right turns and left turns are stored. From the course information of the vehicle 10D acquired by the vehicle course acquisition unit 20D and the vehicle state information, such as the orientation, position, and speed of the vehicle 10D and the positions of other vehicles, acquired by the vehicle state information acquisition unit 30D, the image selection unit 40D selects the data of the course display image 6D indicating the direction in which the vehicle 10D should travel from the course display image data storage unit 41D. Alternatively, when another vehicle is located within a predetermined range of the vehicle 10D, described later, the image selection unit 40D does not select data of the course display image 6D. The image selection unit 40D sends a command designating the data of the selected course display image 6D to the road surface drawing unit 50D.
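Purely as an editorial illustration of this selection logic, one could think of it as a lookup keyed by the acquired course and the front/rear side, combined with a suppression check against nearby vehicles; the image identifiers and the flag name below are hypothetical and not defined in the disclosure.

# Hypothetical identifiers standing in for entries of the course display
# image data storage unit 41D.
COURSE_IMAGES = {
    ("right", "front"): "arrow_right_front",
    ("right", "rear"): "arrow_right_rear",
    ("left", "front"): "arrow_left_front",
    ("left", "rear"): "arrow_left_rear",
}

def select_course_image(course, side, other_vehicle_within_range):
    """Return an image id for the given course ('left' or 'right') and side
    ('front' or 'rear'), or None when projection is suppressed because
    another vehicle is within the predetermined range."""
    if other_vehicle_within_range:
        return None
    return COURSE_IMAGES.get((course, side))

print(select_course_image("right", "front", False))  # -> 'arrow_right_front'
print(select_course_image("right", "front", True))   # -> None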
The road surface drawing unit 50D includes the lighting control unit 51D and the lighting units 52D. The road surface drawing unit 50D takes the data of the course display image designated by the image selection unit 40D out of the course display image data storage unit 41D and sends it to the lighting control unit 51D. The lighting control unit 51D converts the data into a form suited to the lighting unit 52D, and the lighting unit 52D emits light from its light source in accordance with the converted data and projects the predetermined course display image onto the road surface in front of the vehicle 10D. The lighting control unit 51D is constituted by an electronic control unit and sets the lighting state of the lighting unit 52D (turning on and off, lighting color, emission intensity, emission area, and the like) in accordance with the data of the course display image so that the predetermined course display image can be projected. The lighting control unit 51D includes a microcontroller including a processor such as a CPU or MPU and a memory, and other electronic circuits. In the present embodiment, the control unit 90D and the lighting control unit 51D are configured separately, but they may be configured integrally.
As described above, the lighting units 52D project and display the selected course display image onto the road surface around the vehicle 10D (in front of and behind it). In the present embodiment, the lighting unit 52D has an adjustment function with which the position and angle at which the image is projected are adjusted according to the orientation and current position of the vehicle and the surrounding conditions obtained by the vehicle state information acquisition unit 30D. Specific adjustment instructions are given on the basis of an instruction signal accompanying the image data selected by the image selection unit 40D.
FIG. 19 is a flowchart showing the flow of processing in which the vehicle driving support system 800 displays the course display image 6D. FIG. 20 is a schematic view showing the course 7D and course display images 6D at an intersection for a plurality of vehicles 10Da to 10Dc. As shown in FIG. 19, the vehicle driving support system 800 of the present embodiment displays the course display image 6D near an intersection according to the following flow.
First, based on information from the navigation system 11D and information from the direction indicator 12D, the vehicle course acquisition unit 20D recognizes and determines that the vehicle 10D will turn right as indicated by the course 7D, that is, that the course information changes from a first direction, the straight-ahead direction, to a second direction, the right-turn direction (step 1D).
Subsequently, the vehicle state information acquisition unit 30D acquires the surrounding information detected by the surrounding information detection unit 16D, that is, by the in-vehicle camera 13D, the sensor 14D, and the wireless communication unit 15D (step 2D).
Likewise, the vehicle state information acquisition unit 30D acquires the position and orientation information of the vehicle 10D from the navigation system 11D and acquires the traveling speed of the vehicle 10D from the speed sensor 31D. Then, from the surrounding information and the position, orientation, and traveling speed of the vehicle 10D, it acquires information such as whether other vehicles are present in front of and behind the vehicle 10D and how far away they are (step 3D).
From the position and orientation information and the course information of the vehicle 10D described above, the control unit 90D determines whether the vehicle 10D is located at a position where the course display image 6D should be displayed (within a third distance L3, described later), for example a situation in which the vehicle 10D is entering an intersection or changing its course. When the control unit 90D determines that the vehicle is at a position where the image should be displayed, the process proceeds to step 5D; otherwise, it returns to step 1D (step 4D).
Subsequently, the control unit 90D determines whether the vehicle 10D is at a position where the course display image 6D should be displayed and whether there is no other vehicle in front of the vehicle 10D or the distance to the other vehicle ahead is at least a first distance L1 (step 5D). If there is no other vehicle within the first distance L1, the process proceeds to step 6D; if there is another vehicle, the process proceeds to step 7D.
The image selection unit 40D selects, from the course display image data storage unit 41D, the image data of the optimum course front display image 6Da to be displayed, based on the current position and orientation of the vehicle 10D acquired by the vehicle state information acquisition unit 30D, the course acquired by the vehicle course acquisition unit 20D, and the surrounding information acquired by the surrounding information detection unit 16D (step 6D).
On the other hand, when another vehicle is present in front of the vehicle 10D and the distance to that vehicle is within the first distance L1, the control unit 90D determines that the course front image is not to be displayed and does not send an image selection command to the image selection unit 40D. When the vehicle 10D is located within the third distance L3, which is the predetermined distance from the intersection to the position at which the turn signal lamp must be activated under the Road Traffic Act, the control unit 90D checks behind the vehicle 10D and determines whether another vehicle is present within a second distance L2 (step 7D). If the control unit 90D determines that no other vehicle is present within the second distance L2, the process proceeds to step 9D; if it determines that one is present, the process proceeds to step 8D.
That is, when the vehicle 10D is located within the third distance L3 specified by the Road Traffic Act, for example a predetermined distance (30 m) before the intersection, the vehicle 10D must turn on its turn signal lamp when changing course. Since the course display image 6D serves the same purpose as lighting the turn signal lamp, it is projected onto the road surface at positions where the turn signal lamp must be turned on. However, when another vehicle is located within the first distance L1 or the second distance L2, the control unit 90D determines, in consideration of the visual effect on the other vehicle, that the course display image 6D (the course front display image 6Da for the front, the course rear display image 6Db for the rear) is not to be displayed.
The image selection unit 40D selects, from the course display image data storage unit 41D, the image data of the optimum course front display image 6Da or course rear display image 6Db to be displayed, based on the current position and orientation of the vehicle 10D acquired by the vehicle state information acquisition unit 30D, the course acquired by the vehicle course acquisition unit 20D, and the surrounding information acquired by the surrounding information detection unit 16D (step 8D).
Next, the image selection unit 40D outputs a display command to the road surface drawing unit 50D according to the image data of the selected course front display image 6Da or course rear display image 6Db. The road surface drawing unit 50D takes the image data of the course display image 6D selected by the image selection unit 40D out of the course display image data storage unit 41D and, using the lighting control unit 51D and the lighting units 52D, displays the course front display image 6Da on the road surface in front of the vehicle 10D or the course rear display image 6Db on the road surface behind the vehicle 10D (step 9D).
The processing of steps 1D to 9D is repeated at predetermined intervals until the vehicle 10D arrives at its destination. At intersections and points where the course changes, the image selection unit 40D successively reselects the image data according to the position of the vehicle 10D, so that the optimum course display image 6D is always displayed at the optimum position (step 10D).
In steps 4D to 7D, the control unit 90D determines whether the course display image 6D (the course front display image 6Da and the course rear display image 6Db) needs to be displayed from the course information, vehicle state information, and detection information of the vehicle 10D, but the image selection unit 40D may make this determination instead. In that case, the image selection unit 40D determines whether the course display image 6D needs to be displayed and then selects a suitable course display image. The second direction may also be a left-turn direction.
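For reference, a condensed sketch of the display decisions in steps 4D to 9D is shown below. It is an editorial illustration only, under simplifying assumptions: the distances are assumed to be already computed, the front and rear decisions are treated independently, and the function and parameter names are hypothetical.

from typing import Optional, Tuple

def decide_projections(dist_to_intersection_m: float,
                       dist_to_front_vehicle_m: Optional[float],
                       dist_to_rear_vehicle_m: Optional[float],
                       L1_m: float, L2_m: float,
                       L3_m: float = 30.0) -> Tuple[bool, bool]:
    """Return (show_front, show_rear) for the course display images.

    Condensed reading of steps 4D-9D: nothing is shown outside the third
    distance L3; the front image is suppressed when another vehicle is
    within L1 ahead, the rear image when another vehicle is within L2 behind.
    A distance of None means no vehicle was detected on that side.
    """
    if dist_to_intersection_m > L3_m:                       # step 4D
        return (False, False)
    show_front = (dist_to_front_vehicle_m is None
                  or dist_to_front_vehicle_m >= L1_m)       # steps 5D/6D
    show_rear = (dist_to_rear_vehicle_m is None
                 or dist_to_rear_vehicle_m >= L2_m)         # steps 7D/8D
    return (show_front, show_rear)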
Next, the determination of course display image selection for each of the vehicles 10Da to 10Dc when the vehicle 10D turns right at an intersection (proceeding along the course 7D shown in FIG. 20) will be described. As shown in FIG. 20, three vehicles 10Da to 10Dc are located within the third distance L3.
The vehicle 10Da changes its course from the straight-ahead direction, which is the first direction, to the right-turn direction, which is the second direction. Since there is no other vehicle in front of the vehicle 10Da, the image selection unit 40D selects the course front display image 6Da, and in the road surface drawing unit 50D the front lighting units 52R and 52L project the course front display image 6Da, an arrow indicating a right turn, onto the road surface in front of the vehicle 10Da. Since the vehicle 10Db is located behind the vehicle 10Da, the vehicle 10Da does not display a course display image on the road surface behind it.
The vehicle 10Db does not display the course front display image 6Da because the vehicle 10Da is within the first distance L1 ahead of it. It also does not display the course rear display image 6Db because the vehicle 10Dc is within the second distance L2 behind it. When there is no longer another vehicle in front of the vehicle 10Db, for example when the vehicle 10Da has entered the intersection and the distance to it has become at least the first distance L1, the vehicle 10Db displays the course front display image 6Da.
The vehicle 10Dc does not display the course front display image 6Da because the vehicle 10Db is within the first distance L1 ahead of it. Since there is no other vehicle within the second distance L2 behind it, it projects the course rear display image 6Db indicating the course direction onto the road surface behind it; this image is projected by the lighting units 52BR and 52BL on the rear face of the vehicle 10Dc. By displaying the course rear display image 6Db on the road surface behind the vehicle in this way, the course can be shown to surrounding pedestrians and others.
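Applied to the FIG. 20 scenario, the hypothetical decide_projections sketch given above would reproduce these outcomes; the distances and threshold values below are illustrative only and assume that sketch is in scope.

L1, L2, L3 = 15.0, 25.0, 30.0   # illustrative values in metres
print(decide_projections(10.0, None, 8.0, L1, L2, L3))   # vehicle 10Da -> (True, False)
print(decide_projections(18.0, 8.0, 8.0, L1, L2, L3))    # vehicle 10Db -> (False, False)
print(decide_projections(26.0, 8.0, None, L1, L2, L3))   # vehicle 10Dc -> (False, True)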
In the present embodiment, when only one vehicle 10Da is present before the intersection, the course front display image 6Da is projected onto the road surface in front of the vehicle 10Da and the course rear display image 6Db is projected onto the road surface behind it. That is, a single vehicle 10D displays the course display images 6D on both the front and rear road surfaces; however, the present disclosure is not limited to this, and the course display image 6D may be displayed only on the road surface in front.
In the present embodiment, the course front display image 6Da is not displayed when another vehicle is present within the first distance L1 ahead. The first distance L1 in the present embodiment is defined as "braking distance + the length of one vehicle". The braking distance is the distance traveled from when the brakes of the vehicle 10D begin to take effect until the vehicle stops; it is proportional to the square of the speed and to the vehicle weight, and inversely proportional to the braking force. In the present embodiment, the braking distance for each speed of the vehicle 10D is obtained (set) in advance, and the control unit 90D successively obtains the braking distance from the detected speed of the vehicle 10D and calculates the first distance L1. In the present embodiment, the length of one vehicle 10D is set to 5 m. The control unit 90D compares the first distance L1 with the distance to the other vehicle ahead and determines whether the course front display image 6Da needs to be displayed.
Next, the second distance L2, which is the criterion for displaying the course rear display image 6Db, is defined as "stopping distance + the length of one vehicle". The stopping distance is the free-running distance plus the braking distance. The free-running distance is the distance the vehicle travels from the moment the driver feels the need to stop until the brake is pressed and begins to take effect. The stopping distance is therefore the distance traveled from the time the driver decides to stop the vehicle 10D until the vehicle 10D actually stops. The time required for the driver to react and press the brake varies between individuals, but in the present embodiment it is set to 0.75 seconds. The free-running distance can therefore be obtained as the detected speed × 0.75 seconds, and the stopping distance is calculated from the free-running distance and the braking distance thus obtained. In the present embodiment, the control unit 90D compares the obtained second distance L2 with the distance to the other vehicle behind and determines whether the course rear display image 6Db needs to be displayed.
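As a numerical illustration of these definitions (not part of the original disclosure), the sketch below computes L1 and L2 from a speed-to-braking-distance table; the table values and function names are assumed for the example, and only the 0.75 s reaction time and 5 m vehicle length come from the description above.

VEHICLE_LENGTH_M = 5.0      # one vehicle length, as set in this embodiment
REACTION_TIME_S = 0.75      # driver reaction time assumed above

# Hypothetical pre-set braking distances per speed (km/h -> m).
BRAKING_DISTANCE_M = {30: 6.0, 40: 11.0, 50: 18.0, 60: 27.0}

def first_distance_L1(speed_kmh):
    """L1 = braking distance + one vehicle length."""
    return BRAKING_DISTANCE_M[speed_kmh] + VEHICLE_LENGTH_M

def second_distance_L2(speed_kmh):
    """L2 = stopping distance + one vehicle length, where the stopping
    distance is the free-running distance plus the braking distance."""
    free_running_m = (speed_kmh / 3.6) * REACTION_TIME_S
    return free_running_m + BRAKING_DISTANCE_M[speed_kmh] + VEHICLE_LENGTH_M

print(first_distance_L1(40))                 # -> 16.0
print(round(second_distance_L2(40), 1))      # -> 24.3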
The third distance L3 is determined as described above. That is, the Road Traffic Act Enforcement Order stipulates that, when performing an action such as a left turn, the driver must give a signal, such as turning on the turn signal lamp, upon reaching a point 30 meters before the near side edge of the intersection. The distance from the intersection to the point at which this signal must be given is therefore defined as the third distance L3.
In the example above, even when the vehicle is located within the third distance L3, the course rear display image 6Db is not displayed if another vehicle is located within the second distance L2 behind the vehicle 10D. However, control may also be performed such that, among the plurality of vehicles 10D located within the third distance L3 from the intersection, the rearmost vehicle 10D displays the course rear display image 6Db even if another vehicle is located within the second distance L2 behind it. Whether a vehicle 10D is the rearmost within the third distance L3 is determined from conditions such as whether the distance from the position of the vehicle 10D to the boundary of the third distance L3 (the point 30 m before the intersection at which the signal must be given) is shorter than the distance to the following vehicle being detected.
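A minimal sketch of that rearmost-vehicle check, again with hypothetical names and not taken from the disclosure itself, could look like this:

from typing import Optional

def is_rearmost_within_L3(dist_to_L3_boundary_m: float,
                          dist_to_following_vehicle_m: Optional[float]) -> bool:
    """True when the vehicle is judged to be the last one inside L3: the L3
    boundary (30 m before the intersection) lies closer behind it than the
    nearest detected following vehicle."""
    if dist_to_following_vehicle_m is None:      # no follower detected at all
        return True
    return dist_to_L3_boundary_m < dist_to_following_vehicle_m

print(is_rearmost_within_L3(5.0, 12.0))   # -> True
print(is_rearmost_within_L3(5.0, 3.0))    # -> False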
In this way, the leading vehicle 10Da indicates its course by projecting the course display image 6D onto the road surface in front of it, and the rearmost vehicle 10Dc onto the road surface behind it, which allows other vehicles in the oncoming lane and surrounding pedestrians to anticipate their upcoming movements and be alerted. On the other hand, the following vehicle 10Db, which continues at a relatively short distance behind the leading vehicle 10Da, does not project a course display image 6D, so the projected light is not diffusely reflected off the rear surface of the vehicle 10Da ahead, and the view of the driver of the vehicle 10Db is not disturbed by diffuse reflection.
(9th Embodiment)
In the ninth embodiment, even if another vehicle is present within the second distance L2 behind the vehicle 10D, the course display image 6D is displayed on the road surface behind the vehicle when the other vehicle is one at risk of being caught in a turn, such as a motorcycle or bicycle. FIG. 21 is a schematic view showing course display images 6D at an intersection for the vehicles 10Da and 10Db and a two-wheeled vehicle 10Dd according to the ninth embodiment. Descriptions of content overlapping with the eighth embodiment are omitted.
As shown in FIG. 21, the vehicle 10Db does not project the course front display image 6Da because the vehicle 10Da is within the first distance L1 ahead of it. On the other hand, the two-wheeled vehicle 10Dd is present within the second distance L2 behind it. In the eighth embodiment the course rear display image 6Db was not displayed in this case, but in the present embodiment the course rear display image 6Db is displayed.
As shown in the processing flow of FIG. 19, when a two-wheeled vehicle 10Dd such as a motorcycle or bicycle is detected behind the vehicle 10Db by the in-vehicle camera 13D and the sensor 14D, the control unit 90D decides to display the course rear display image 6Db in order to prevent the two-wheeled vehicle 10Dd from being caught in the turn, and sends an image display command to the image selection unit 40D.
The image selection unit 40D selects, from the course display image data storage unit 41D, the data of an image indicating the course of the vehicle 10Db, in the example shown in FIG. 21 a left-turn image, and outputs a command to the road surface drawing unit 50D.
The road surface drawing unit 50D projects the course rear display image 6Db indicating a left turn onto the road surface behind the vehicle 10Db. At this time, because the two-wheeled vehicle 10Dd is close behind the vehicle 10Db, the lighting control unit 51D controls the lighting unit 52D so that the image is projected at a position where the rider of the two-wheeled vehicle 10Dd can see it without being dazzled. Specifically, the road surface drawing unit 50D projects the image using only the right lighting unit 52BR rather than both lighting units 52BR and 52BL, or adjusts the projection angle.
In particular, when the vehicle 10Db turns left with the two-wheeled vehicle 10Dd positioned behind it, there is a high risk of an accident in which the two-wheeled vehicle 10Dd is caught in the turn. Therefore, the road surface drawing unit 50D may project the course rear display image 6Db only when the two-wheeled vehicle 10Dd is present within the second distance L2 behind the vehicle 10Db and the course of the vehicle 10Db is a left turn; even if the two-wheeled vehicle 10Dd is present at the same position, the road surface drawing unit 50D need not display the course rear display image 6Db when the course of the vehicle 10Db is straight ahead or a right turn.
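A compact, purely illustrative way to express this ninth-embodiment exception on top of the earlier display decision is sketched below; the function and parameter names are hypothetical, and left-hand traffic is assumed as in the description.

def show_rear_image_ninth(base_show_rear, two_wheeler_within_L2, course):
    """Ninth-embodiment override: even if the base rule suppresses the rear
    image, show it when a motorcycle or bicycle is within L2 behind and the
    vehicle is about to turn left (anti-entrapment warning)."""
    if two_wheeler_within_L2 and course == "left":
        return True
    return base_show_rear

print(show_rear_image_ninth(False, True, "left"))    # -> True
print(show_rear_image_ninth(False, True, "right"))   # -> False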
The course rear display image 6Db may also be displayed when a pedestrian is located within the second distance L2 behind the vehicle 10Db. That is, from the line of sight of a pedestrian located within the second distance L2, the turn signal lamp of the vehicle 10Db may be difficult to notice, so the road surface drawing unit 50D may display the course rear display image 6Db to call further attention. When a pedestrian is detected as well, the lighting control unit 51D controls the lighting unit 52D so that the course rear display image 6Db is projected at a position where it can be properly recognized without dazzling the pedestrian.
In this way, when the two-wheeled vehicle 10Dd is located within the second distance L2 behind the vehicle 10Db, the road surface drawing unit 50D displays the course rear display image 6Db to prevent the distinct danger of the two-wheeled vehicle 10Dd being caught in the turn. By performing such control, the course of the vehicle 10Db can be accurately shown to the two-wheeled vehicle 10Dd near the vehicle 10Db, and the rider of the two-wheeled vehicle 10Dd can be alerted.
(10th Embodiment)
In the eighth and ninth embodiments, the projection of the course display image 6D was controlled based on the surrounding information detected by the surrounding information detection unit 16D of the vehicle 10D. In the tenth embodiment, not only the detection members of the vehicle 10D but also a device installed in advance on the road is used to control the projection of the course display image 6D. The tenth embodiment will be described with reference to FIG. 4A. Descriptions of content overlapping with the eighth and ninth embodiments are omitted.
The infrastructure-side device 30 may be provided with an infrastructure-side surrounding information detection unit (an other-vehicle detection unit or situation grasping unit), such as an imaging camera, that detects the state of the surroundings of the infrastructure-side device 30. In the present embodiment, the infrastructure-side surrounding information detection unit is provided so as to also serve as the light irradiation unit 31, and it may detect surrounding information by, for example, imaging the road surface from above. The present disclosure is not limited to this, and the infrastructure-side surrounding information detection unit may be provided at another location, such as on a support column of the infrastructure-side device 30.
FIG. 22 is a schematic view showing the relationship among the vehicle 10Da, the two-wheeled vehicle 10Dd, and the infrastructure-side device 30 at an intersection in the present embodiment.
The infrastructure-side device 30 detects surrounding information, for example by imaging the road surface, and detects that the vehicle 10Da is located within the range before the intersection in which its course must be indicated and that the two-wheeled vehicle 10Dd is located within the second distance L2 behind the vehicle 10Da. The infrastructure-side device 30 is provided with a control unit that determines from the detected images whether the course display image 6D should be displayed. From the surrounding information detected by the infrastructure-side surrounding information detection unit, this control unit determines that no other vehicle is present within the first distance L1 in front of the vehicle 10Da and that the course front display image 6Da should therefore be displayed. It also determines that the course rear display image 6Db should be displayed because the two-wheeled vehicle 10Dd is located behind the vehicle 10Da.
Next, the information communication unit of the infrastructure-side device 30 sends a command to the vehicle 10Da by road-to-vehicle communication instructing it to display the course front display image 6Da and the course rear display image 6Db.
In the vehicle 10Da, the wireless communication unit 15D receives the command from the infrastructure-side device 30 and sends it to the vehicle state information acquisition unit 30D. The control unit 90D receives the command from the vehicle state information acquisition unit 30D and the course information indicating a left turn from the vehicle course acquisition unit 20D, and sends the image selection unit 40D a command to display the left-turn course front display image 6Da and course rear display image 6Db. From the image selection unit 40D onward, as in the eighth and ninth embodiments, the predetermined course display images 6D are selected, and the road surface drawing unit 50D displays the course front display image 6Da on the road surface in front of the vehicle 10Da and the course rear display image 6Db on the road surface behind the vehicle 10Da.
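The vehicle-side handling of such a road-to-vehicle command could be pictured roughly as in the sketch below; the message format and function names are invented for illustration and are not defined in the disclosure.

# Hypothetical road-to-vehicle message as it might arrive via the wireless
# communication unit 15D.
command = {"show_front": True, "show_rear": True}

def handle_infrastructure_command(command, course, draw):
    """Combine the infrastructure request with the vehicle's own course
    (e.g. 'left') and ask the road surface drawing side to project."""
    if command.get("show_front"):
        draw(side="front", course=course)
    if command.get("show_rear"):
        draw(side="rear", course=course)

# Example with a stand-in for the road surface drawing unit 50D.
handle_infrastructure_command(
    command, "left",
    draw=lambda side, course: print(f"project {course}-turn image to the {side}"))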
In this way, the present embodiment determines the display of the course display image 6D based on commands from the infrastructure-side device 30. For image display on the road surface near the intersection, that is, near the infrastructure-side device 30, not only may the lighting unit 52D of the vehicle 10Da project the image, but the light irradiation unit 31 of the infrastructure-side device 30 may also project it.
Regarding the course of the vehicle 10Da, the vehicle course acquisition unit 20D of the vehicle 10Da may also notify the infrastructure-side device 30 of the course information via the wireless communication unit 15D; the control unit of the infrastructure-side device 30 may determine an appropriate course display image 6D based on the obtained course information, and the infrastructure-side device 30 may then instruct the vehicle 10Da with the determined course display image 6D. That is, the infrastructure-side device 30 handles processing up to the selection of the course display image 6D, and the vehicle 10Da simply displays the data of the course display image 6D sent from the infrastructure-side device 30. In this case, there is no need to store a plurality of display image data items in the vehicle 10Da.
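Sketched from the infrastructure side, that variant amounts to selecting images for each reported vehicle and pushing out the commands; the data structure below is hypothetical and serves only to illustrate the idea.

def infrastructure_assign_images(vehicles):
    """For each vehicle reported over road-to-vehicle communication, decide
    which course images it should project and return the commands to send.
    Each entry carries hypothetical keys: 'id', 'course', 'front_clear'
    (no other vehicle within L1) and 'rear_clear' (none within L2)."""
    commands = []
    for v in vehicles:
        commands.append({
            "vehicle_id": v["id"],
            "course": v["course"],
            "show_front": v["front_clear"],
            "show_rear": v["rear_clear"],
        })
    return commands

print(infrastructure_assign_images(
    [{"id": "10Da", "course": "left", "front_clear": True, "rear_clear": True}]))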
In this way, by using the infrastructure-side device 30 to control the display of the course display images 6D of each vehicle 10D, the present embodiment can collectively control the display of a plurality of vehicles 10D in the vicinity of the infrastructure-side device 30 and can perform display control that is integrated across the plurality of vehicles 10D. That is, each vehicle 10D can display an integrated course display image 6D regardless of the control capability of its individual control unit 90D.
(Other)
In determining whether another vehicle is present within the predetermined distances (the first distance L1 and the second distance L2) in front of and behind the vehicle 10D, configurations using the detection information of the surrounding information detection unit 16D provided on the vehicle 10D and the detection information of the infrastructure-side surrounding information detection unit of the infrastructure-side device 30 have been described. However, the present disclosure is not limited to these, and other vehicles may also be detected by vehicle-to-vehicle communication between vehicles 10D. The wireless communication unit 15D may be used for the vehicle-to-vehicle communication.
The vehicle 10D may also receive the positions of other vehicles, its own position, and the like using information from the satellite positioning system connected to the navigation system 11D, and may use the received information to determine whether another vehicle is present within the predetermined distance.
In the embodiments described above, a right turn and a left turn at an intersection were given as examples, but the present disclosure is not limited to these and is also applicable to, for example, lane changes on roads with multiple lanes. The vehicle 10D is also applicable not only to a typical five-seat passenger car but to a wide variety of vehicles such as trucks, trailers, and two-wheeled vehicles.
(11th Embodiment)
Next, a road surface drawing device 900 included in a vehicle 10E will be described. Descriptions of content overlapping with the eighth embodiment are omitted. FIG. 23 is a block diagram of the road surface drawing device 900 according to the eleventh embodiment. The road surface drawing device 900 of the present disclosure includes a control unit 90E, a vehicle course acquisition unit 20E, a vehicle state information acquisition unit 30E, an image selection unit 40E, and an image drawing unit 50E.
The control unit 90E controls various devices of the vehicle 10E. Various sensors and external devices that monitor the outside of the vehicle 10E, such as a navigation system 11E, a direction indicator 12E, an in-vehicle camera 13E, a sensor 14E, and a wireless communication unit 15E, are connected to the control unit 90E, and various signals and data are input and output.
In the present embodiment, the in-vehicle camera 13E, the sensor 14E, and the wireless communication unit 15E, which detect information ahead of the vehicle 10E, are collectively referred to as a front information detection unit 16E. The front information detection unit 16E is not limited to the in-vehicle camera 13E, the sensor 14E, and the wireless communication unit 15E; it may be anything that obtains information on the surroundings of the vehicle 10E, particularly ahead of it. For example, the front information detection unit 16E may include a mechanism for obtaining images captured by other devices outside the vehicle.
The vehicle state information acquisition unit 30E acquires state information of the vehicle 10E, such as the position, orientation, and speed of the vehicle 10E. The vehicle state information acquisition unit 30E acquires the position information and orientation information of the vehicle 10E from the navigation system 11E. The vehicle state information acquisition unit 30E is connected to a steering angle detection unit 31E and a speed sensor 32E.
The steering angle detection unit 31E detects the steering angle of a steering device 31Ea (not shown) of the vehicle 10E. The steering angle detection unit 31E is attached to the steering wheel of the vehicle 10E and detects the steering angle from a reference position of the steering wheel. The speed sensor 32E detects the traveling speed and the like of the vehicle 10E.
In this way, the vehicle state information acquisition unit 30E obtains the orientation of the vehicle 10E from the steering angle detected by the steering angle detection unit 31E, and obtains the position and orientation of the vehicle 10E after a predetermined time from the speed and acceleration detected by the speed sensor 32E. The vehicle state information acquisition unit 30E also obtains the arrival time (required time) of the vehicle 10E at a predetermined position on the course, and the like.
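One simple way to picture how steering angle and speed could be combined into a predicted pose is a kinematic bicycle-type update, shown below purely as an editorial illustration; the wheelbase value and function names are assumptions and are not taken from the disclosure.

import math

def predict_pose(x_m, y_m, heading_rad, speed_mps, steer_rad, dt_s,
                 wheelbase_m=2.7):
    """Advance (x, y, heading) by dt_s using a kinematic bicycle model:
    yaw rate = speed * tan(steering angle) / wheelbase."""
    yaw_rate = speed_mps * math.tan(steer_rad) / wheelbase_m
    x_new = x_m + speed_mps * math.cos(heading_rad) * dt_s
    y_new = y_m + speed_mps * math.sin(heading_rad) * dt_s
    return x_new, y_new, heading_rad + yaw_rate * dt_s

# Example: 8 m/s with 5 degrees of steering, predicted 1 s ahead.
print(predict_pose(0.0, 0.0, 0.0, 8.0, math.radians(5.0), 1.0))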
The image selection unit 40E selects the course display image 6E to be projected onto the road surface in front of the vehicle 10E using the lighting unit 52E, as described later. That is, when the course of the vehicle 10E changes, such as when turning right or left, the road surface drawing device 900 of the present embodiment projects a course display image 6E indicating the course direction of the vehicle 10E onto the road surface so that the course of the vehicle 10E is accurately conveyed to other vehicles and pedestrians. The image selection unit 40E therefore selects the display image data so that an accurate course display image 6E can always be displayed according to the position and orientation of the vehicle 10E. Specifically, the image selection unit 40E is connected to a course display image data storage unit 41E in which a plurality of items of data of course display images 6E indicating right turns, left turns, courses, and the like are stored. From the course information of the vehicle 10E acquired by the vehicle course acquisition unit 20E and the vehicle state information, such as the orientation, position, and speed of the vehicle 10E, acquired by the vehicle state information acquisition unit 30E, the image selection unit 40E selects the data of the course display image 6E indicating the direction in which the vehicle 10E should travel from the course display image data storage unit 41E. The image selection unit 40E then sends a command designating the data of the selected course display image 6E to the image drawing unit 50E. The image drawing unit 50E includes a lighting control unit 51E and a lighting unit 52E. The image drawing unit 50E takes the data of the course display image designated by the image selection unit 40E out of the course display image data storage unit 41E and sends it to the lighting control unit 51E. The lighting control unit 51E converts the data into a form suited to the lighting unit 52E, and the lighting unit 52E emits light from its light source in accordance with the converted data and projects the predetermined course display image onto the road surface in front of the vehicle 10E.
FIG. 24 is a flowchart showing the flow of processing in which the road surface drawing device 900 displays the course display image 6E. FIG. 25 is a schematic view showing the course 7E and the course display image 6E when the vehicle 10E turns right at an intersection. As shown in FIG. 24, the road surface drawing device 900 of the eleventh embodiment displays the course display image 6E near an intersection according to the following flow.
First, based on information from the navigation system 11E and information from the direction indicator 12E, the vehicle course acquisition unit 20E recognizes and determines that the vehicle 10E will turn right as indicated by the course 7E (step 1E).
 Meanwhile, the vehicle state information acquisition unit 30E acquires the position and orientation of the vehicle 10E from the navigation system 11E. It also acquires the steering angle of the vehicle 10E from the steering angle detection unit 31E and the traveling speed of the vehicle 10E from the speed sensor 32E (step 2E).
 From the position and orientation information and the course information of the vehicle 10E, the control unit 90E determines whether the vehicle 10E is at a position where the course display image 6E should be displayed, for example where the vehicle 10E is entering an intersection or changing course (step 3E). When the control unit 90E determines that the vehicle is at such a position, it sends the acquired information to the image selection unit 40E. From the current position, orientation, and course of the vehicle 10E, the image selection unit 40E selects the image data of the most appropriate course display image 6E to be displayed from the course display image data storage unit 41E, in which the image data are stored.
 As shown in FIG. 25, the vehicle state information acquisition unit 30E detects that the vehicle 10E is traveling straight and is located 30 m before the intersection, and the vehicle course acquisition unit 20E acquires the course 7E. From the acquired information, the image selection unit 40E selects the image data of the course display image 6E indicating that the vehicle 10E will turn right, and issues a command to the image drawing unit 50E (step 4E).
 The image drawing unit 50E retrieves the image data of the course display image 6E selected by the image selection unit 40E from the course display image data storage unit 41E and, using the lighting control unit 51E and the lighting unit 52E, displays the course display image 6E on the road surface ahead of the vehicle 10E (step 5E). At the position of the vehicle 10E shown in FIG. 25, the vehicle has not yet entered the intersection, so the course display image 6E indicating a right turn is displayed as a declaration of the intention to turn right.
 The processing from step 1E to step 5E is repeated at predetermined intervals until the vehicle 10E arrives at its destination (step 6E). At intersections and points where the course changes, the image selection unit 40E reselects the image data according to the position of the vehicle 10E, so that the most appropriate course display image 6E is always displayed.
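 A compact sketch of this loop is shown below. It mirrors only the control flow of steps 1E to 6E; the helper objects (navigation, indicator, state_source, control, selector, drawer) and their one-method interfaces are assumptions made for illustration, not part of the described device.

```python
import time

def run_course_display(navigation, indicator, state_source, control, selector, drawer,
                       interval_s: float = 0.1) -> None:
    """Repeat the display cycle of FIG. 24 until the destination is reached."""
    while not navigation.arrived():                                   # step 6E
        course = navigation.planned_course() or indicator.signal()    # step 1E
        state = state_source.current_state()                          # step 2E
        if control.should_display(course, state):                     # step 3E
            image = selector.select(course, state)                    # step 4E
            drawer.draw(image)                                        # step 5E
        time.sleep(interval_s)   # repeated at a predetermined interval
```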
 Next, the course display image selected when the vehicle 10E turns right at the intersection (following the course 7E shown in FIG. 25) will be described in detail. FIG. 26 is a schematic view showing how the course display image 6E is selected in the present embodiment. FIG. 26 shows an intermediate stage in which the vehicle 10E is turning to the right.
 The course display image 6E in the present embodiment includes a branch line 6Ea parallel to the current orientation of the vehicle 10E and a direction line 6Eb extending from the branch line 6Ea toward the course direction. The angle between the branch line 6Ea and the direction line 6Eb (hereinafter also referred to as the "travel angle γ") changes continuously according to the steering angle and the vehicle orientation obtained from the navigation system 11E. In other words, the angle of the direction line 6Eb of the course display image 6E changes progressively as the vehicle 10E proceeds through the intersection.
 Specifically, the course direction t is obtained from the course information acquired by the vehicle course acquisition unit 20E. Next, from the intersection of the straight-ahead direction s of the vehicle 10E, shown by a broken arrow, and the course direction t of the vehicle 10E, the angle from the straight-ahead direction s to the course direction t (hereinafter referred to as the "course angle") α is calculated.
 Meanwhile, the steering angle of the vehicle 10E is detected by the steering angle detection unit 31E. In addition, from the displacement of the position information of the vehicle 10E provided by the navigation system 11E, the vehicle state information acquisition unit 30E calculates the cumulative travel distance of the vehicle 10E from the time the steering angle was detected to a given position. The tilt angle β of the vehicle 10E at that position is then calculated from the steering angle and the cumulative travel distance, and the orientation direction u of the vehicle 10E (shown by the broken arrow u) is obtained from this tilt angle β. The travel angle γ is the sum of the course angle α and the tilt angle β. In the course display image 6E, therefore, the branch line 6Ea is determined so as to follow the orientation direction u, and the direction line 6Eb is determined so as to follow the course direction t of the vehicle 10E at each point.
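 The relationship γ = α + β can be sketched in code. The following Python fragment is a minimal illustration only: the description does not specify how β is derived from the steering angle and the cumulative travel distance, so a simple kinematic (bicycle-model) relationship with an assumed wheelbase is used here as a stand-in.

```python
import math

def tilt_angle_beta(steering_angle_rad: float, cumulative_distance_m: float,
                    wheelbase_m: float = 2.7) -> float:
    """Heading change (tilt angle beta) accumulated since the steering angle was
    detected, assuming the angle stays constant and a simple kinematic bicycle
    model: turning radius R = wheelbase / tan(steering angle), beta = distance / R."""
    if abs(steering_angle_rad) < 1e-6:
        return 0.0
    turning_radius_m = wheelbase_m / math.tan(abs(steering_angle_rad))
    beta = cumulative_distance_m / turning_radius_m        # arc length / radius
    return math.copysign(beta, steering_angle_rad)

def travel_angle_gamma(course_angle_alpha_rad: float, tilt_angle_beta_rad: float) -> float:
    """Travel angle gamma between branch line 6Ea and direction line 6Eb."""
    return course_angle_alpha_rad + tilt_angle_beta_rad

# Example: right turn (alpha = 90 deg), partway through the intersection.
beta = tilt_angle_beta(math.radians(20.0), cumulative_distance_m=6.0)
gamma = travel_angle_gamma(math.radians(90.0), beta)
print(math.degrees(beta), math.degrees(gamma))   # roughly 46 and 136 degrees
```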
 FIGS. 27A to 27C show the course display image 6E at different positions of the vehicle 10E in the intersection during a right turn. In FIG. 27A, the vehicle 10E has not yet entered the intersection, so the course display image 6E indicating a right turn is displayed to show the course direction of the vehicle 10E. Specifically, since the vehicle 10E has not yet begun to turn (its orientation remains the straight-ahead direction), the tilt angle β obtained from the steering angle and the cumulative travel distance is 0 degrees and the orientation direction u of the vehicle 10E coincides with the straight-ahead direction s, so the travel angle γ = (α + β) between the branch line 6Ea and the direction line 6Eb is 90 degrees.
 Next, when the vehicle 10E enters the intersection and the driver turns the steering wheel so that a steering angle arises, the orientation of the vehicle 10E departs from the straight-ahead direction and a tilt angle β is produced in accordance with the cumulative travel distance. That is, if the steering angle is constant, the vehicle 10E travels along an arc of the corresponding turning radius, so its orientation changes according to its travel distance, and the tilt angle changes accordingly. The tilt angle β is therefore calculated from the relationship between the steering angle and the cumulative travel distance. For example, FIG. 27B shows the vehicle 10E partway through the turn at the intersection. The orientation of the vehicle 10E is tilted by the angle β from the straight-ahead direction s, and the travel angle γ, that is, the angle γ = (α + β) between the branch line 6Ea and the direction line 6Eb, becomes larger than 90 degrees. As a result, the course display image 6E follows the course 7E shown in FIG. 25.
 Further, when the vehicle 10E has completed the turn through the intersection, the tilt angle β of the vehicle 10E is 90 degrees, as shown in FIG. 27C. The travel angle γ is therefore 180 degrees, and the orientation direction u of the vehicle 10E coincides with the course direction t. In other words, when the vehicle 10E changes course at the intersection from the straight-ahead direction, which is the first direction, to the right-turn direction, which is the second direction, the course display image 6E becomes a line extending from the vehicle 10E in the second direction.
 In the present embodiment, the tilt angle β of the vehicle 10E is thus obtained from the steering angle and the cumulative travel distance of the vehicle 10E. Using the tilt angle β, the orientation direction u and the travel angle γ of the vehicle 10E are obtained, and the course display image 6E is generated (selected) from the relationship between the orientation direction u and the course direction t. The course display image 6E is therefore regenerated (reselected) as the steering angle, the cumulative travel distance of the vehicle 10E (recalculated from the vehicle position and the like), and the vehicle state information change.
 The image selection unit 40E may select, from among the plurality of course display images 6E stored in memory, the image closest to the travel angle γ obtained in the manner described above, or it may generate (selectively generate) the course display image 6E from the obtained travel angle γ. Since the position and steering angle of the vehicle 10E are acquired at predetermined intervals as described above, the course display image 6E is reselected (regenerated) and displayed every time the orientation of the vehicle 10E changes. As a result, once the vehicle 10E has entered the intersection, the course display image 6E is displayed along the course 7E.
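 The "closest stored image" reading can be sketched as follows; the keyed store of pre-rendered images and its ten-degree granularity are hypothetical details added for the example.

```python
from dataclasses import dataclass

@dataclass
class CourseImage:
    gamma_deg: float      # travel angle the image was prepared for
    image_id: str         # identifier of the stocked image data

def select_nearest_image(stock: list[CourseImage], gamma_deg: float) -> CourseImage:
    """Return the stocked course display image whose travel angle is closest to
    the currently computed gamma."""
    return min(stock, key=lambda img: abs(img.gamma_deg - gamma_deg))

stock = [CourseImage(g, f"right_turn_{g}") for g in range(90, 181, 10)]
print(select_nearest_image(stock, 136.0).image_id)   # -> right_turn_140
```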
 In FIGS. 26 and 27A to 27C, the branch line 6Ea and the direction line 6Eb are connected directly, but they may be joined by a gently curved portion (curve line). The curvature of the curved portion (curve line) is determined according to the travel angle γ, that is, the angle from the orientation direction u of the vehicle to the course direction t.
 As described above, in the present embodiment, when the vehicle 10E changes its traveling direction, for example within an intersection, the course display image 6E whose travel angle γ is updated in accordance with changes in the vehicle state information, such as the position and orientation of the vehicle, is reselected and displayed in succession. Surrounding vehicles and pedestrians can therefore accurately grasp the traveling direction of the vehicle 10E in a manner that matches its actual movement, and misinterpretation of the traveling direction of the vehicle 10E can be greatly reduced.
 (12th Embodiment)
 The eleventh embodiment described a configuration in which the course display image 6E is selected using the detected steering angle and other information. The road surface drawing device 900 of the twelfth embodiment instead obtains the course display image 6E from the GPS position of the vehicle 10E detected through the navigation system 11E and from the orientation of the vehicle 10E.
 In the eleventh embodiment, the travel angle γ of the course display image 6E before the vehicle enters the intersection is the course angle α formed by the straight-ahead direction s and the course direction t, so that the turning direction is clearly indicated, and as the vehicle 10E enters the intersection and turns, the travel angle γ becomes progressively more obtuse. In other words, as the vehicle 10E turns, the line of the course display image 6E changes toward a straight line.
 In the twelfth embodiment, by contrast, the travel angle γ is not varied with reference to the course angle α formed by the straight-ahead direction s and the course direction t; instead, the travel angle γ of the course display image 6E changes in accordance with the change in the orientation of the vehicle 10E. That is, the inclination of the line of the course display image 6E changes.
 In the twelfth embodiment, the vehicle course acquisition unit 20E obtains the course 7E at the intersection, indicated by a broken arrow, from information provided by the navigation system 11E. The vehicle state information acquisition unit 30E likewise acquires the position and orientation of the vehicle 10E from the position information provided by the GPS (satellite positioning system) of the navigation system 11E and from the traveling state of the vehicle 10E.
 The image selection unit 40E selects the course display image 6E that best follows the course, based on the direction (degree of inclination) of the course 7E at the acquired position of the vehicle 10E, and the image drawing unit 50E displays the selected course display image 6E. Specifically, the course display image 6E is displayed as follows.
 FIGS. 28A to 28D are schematic views showing the course display image 6E for a right turn of the vehicle 10E in the present embodiment, according to the position of the vehicle 10E. As shown in FIG. 28A, when the vehicle 10E enters the intersection it turns gradually from the straight-ahead direction, so a course display image 6EA consisting of a gently inclined curve is displayed. The course display image 6EA is relatively long so that the turning direction can be recognized.
 Next, as shown in FIGS. 28B and 28C, when the vehicle 10E is near the center of the intersection, course display images 6EB and 6EC consisting of more steeply inclined curves are displayed. The course display images 6EB and 6EC are shorter than the earlier course display image 6EA so that the turning direction remains clear.
 Further, as shown in FIG. 28D, at the position where the vehicle 10E has completed the turn through the intersection, a course display image 6ED consisting of a straight line with no inclination, extending in the course direction (the right-turn direction), is displayed. The course display image 6ED is long enough for other vehicles and pedestrians to recognize that the vehicle is now heading straight.
 In this way, the course display images 6EA, 6EB, 6EC, and 6ED along the course 7E are selected and displayed by the image selection unit 40E according to the orientation of the vehicle 10E at each position in the intersection. As described above, so that oncoming vehicles and pedestrians can easily recognize the course of the vehicle 10E, the relatively long curved course display image 6EA is displayed at the position where the vehicle 10E enters the intersection, and the relatively short curved course display images 6EB and 6EC are displayed near the center of the intersection. That is, when the vehicle 10E changes course at the intersection from the straight-ahead direction, which is the first direction, to the course direction, which is the second direction, the course display image 6E becomes a curve extending from the vehicle in the second direction.
 FIGS. 29A to 29D are schematic views showing a modified example of the course display image 6E in the present embodiment. FIG. 29A shows the course display image 6EA at the position where the vehicle 10E enters the intersection. FIG. 29B shows the course display image 6EB when the vehicle 10E is just before the center of the intersection, and FIG. 29C shows the course display image 6EC when the vehicle 10E is near the center of the intersection. FIG. 29D shows the course display image 6ED at the position after the vehicle 10E has completed the turn through the intersection.
 In the modified example, the course display image 6E is a straight line connecting the vehicle 10E to the course 7E. The more the orientation of the vehicle 10E tilts to the right, the shorter the straight line of the course display image 6E becomes, so that the course direction is displayed clearly. The length of the straight line of the course display image 6E is therefore determined in advance according to the relationship between the position and orientation of the vehicle 10E (that is, its inclination toward the course direction).
 In this way, when the vehicle 10E changes course from the straight-ahead direction, which is the first direction, to the rightward direction, which is the second direction, the course display image 6E becomes a straight line extending from the vehicle toward the course direction, which is the second direction, and the length of the course display image 6E is determined according to the orientation of the vehicle 10E.
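 The predetermined relationship between orientation and line length might look like the table-driven sketch below; the specific distances and the linear interpolation are invented for illustration, since the description only states that the line becomes shorter as the vehicle turns further toward the course direction.

```python
import bisect

HEADING_DEG = [0.0, 30.0, 60.0, 90.0]   # rotation toward the course direction
LENGTH_M    = [8.0, 6.0, 4.0, 3.0]      # straight-line length of image 6E

def line_length_m(heading_deg: float) -> float:
    """Shorter course display line as the vehicle turns further toward the
    course direction (linear interpolation between table entries)."""
    heading_deg = max(0.0, min(90.0, heading_deg))
    i = bisect.bisect_left(HEADING_DEG, heading_deg)
    if i == 0:
        return LENGTH_M[0]
    h0, h1 = HEADING_DEG[i - 1], HEADING_DEG[i]
    l0, l1 = LENGTH_M[i - 1], LENGTH_M[i]
    return l0 + (l1 - l0) * (heading_deg - h0) / (h1 - h0)

print(line_length_m(45.0))   # -> 5.0
```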
 In the present embodiment, the course 7E is determined from information provided by the navigation system 11E, the position of the vehicle 10E is identified by GPS, and the line connecting the vehicle 10E to the course direction is used as the course display image 6E, so that the course display image 6E is displayed along the course 7E.
 By combining information from the navigation system 11E, in particular GPS position information, with the orientation information and the course information of the vehicle 10E, the image selection unit 40E can always quickly reselect an appropriate course display image 6E and draw it on the road surface in response to changes in the position of the vehicle 10E, for example in the automatic driving mode.
 (13th Embodiment)
 In the road surface drawing device 900 according to the thirteenth embodiment, when the vehicle 10E changes its course direction at an intersection or the like, the drawing positions of the course display images 6E that are reselected at predetermined time intervals according to the orientation of the vehicle 10E are controlled so that the images before and after each reselection are projected at overlapping positions. FIGS. 30A to 30D are schematic views showing the course display image 6E of the thirteenth embodiment. FIG. 30A is a schematic view of an intersection showing the time transition of the course display image 6E in the intersection. FIGS. 30B to 30D are schematic perspective views showing how the vehicle 10E and the course display image 6E are projected as time passes at the intersection.
 As shown in FIGS. 30A to 30D, each course display image 6E displayed according to the position of the vehicle 10E is projected so that its drawing position on the road surface overlaps the drawing position projected immediately before. Specifically, this is done as follows.
 As shown in FIGS. 30A and 30B, the course display image 6EA is projected onto the road surface when the vehicle 10E is at the position of entering the intersection. At this time, the irradiation angle of the light from the lighting unit 52E is set to an angle θ with respect to the road surface G. Simultaneously with this projection, the road surface G is photographed by the vehicle-mounted camera 13E, and the vehicle state information acquisition unit 30E confirms the position of the course display image 6EA on the road surface.
 After a predetermined time has elapsed, as in the eleventh and twelfth embodiments, the image selection unit 40E reselects the course display image in accordance with the position and orientation of the vehicle 10E. At the stage of projecting the reselected course display image 6EB onto the road surface, the road surface G at the intended position (planned display position) is imaged by the vehicle-mounted camera 13E, and it is confirmed whether that position overlaps the position where the previous course display image 6EA was projected and whether the road surface has any unevenness or inclination.
 If the road surface is inclined, the irradiation direction of the lighting unit 52E is adjusted so that the irradiation angle θ with respect to the road surface G remains constant. Then, as shown in FIG. 30C, the irradiation direction of the lighting unit 52E is adjusted so that the reselected course display image 6EB partially overlaps the previously projected course display image 6EA (the image shown by the broken line). Because the lighting unit 52E adjusts its irradiation direction according to the state of the road surface G and projects the course display images 6EA, 6EB, 6EC, and 6ED so that the irradiation angle θ with respect to the road surface G stays constant, the driver of the vehicle 10E can always recognize the course display image 6E within a constant field of view. In addition, since a constant projection state is always maintained with respect to the road surface G regardless of its unevenness or inclination, oncoming vehicles and pedestrians can easily recognize the course display image 6E.
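 One way to keep the irradiation angle θ constant on an inclined surface is sketched below. The sign convention, the slope estimate, and the actuator call set_pitch_deg are all assumptions for illustration; the description states only that the irradiation direction is adjusted so that θ stays constant with respect to the road surface G.

```python
def required_pitch_deg(theta_deg: float, road_slope_deg: float) -> float:
    """Depression angle of the beam below horizontal that keeps the incidence
    angle on the road equal to theta. A positive road_slope_deg means the road
    ahead rises (uphill), which tilts the surface toward the beam, so the beam
    can be raised by the same amount."""
    return theta_deg - road_slope_deg

class LightingUnit52E:
    def set_pitch_deg(self, pitch_deg: float) -> None:   # hypothetical actuator call
        print(f"beam pitched {pitch_deg:.1f} deg below horizontal")

unit = LightingUnit52E()
theta_deg = 25.0                      # desired constant irradiation angle
for slope_deg in (0.0, 3.0, -2.0):    # flat, uphill, downhill patch ahead
    unit.set_pitch_deg(required_pitch_deg(theta_deg, slope_deg))
```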
 Furthermore, by projecting each reselected course display image 6EB so that it partially overlaps the previously projected course display image 6EA, the course display images 6EA, 6EB, 6EC, and 6ED are projected continuously along the course 7E. Combined with the visual afterimage effect of the viewer, the course display image 6E is thus displayed with continuity up to the lane after the right turn, as if a line had been drawn, which makes it very easy for surrounding pedestrians and other vehicles to recognize the course of the vehicle 10E through the intersection.
 (14th Embodiment)
 When the road surface drawing device 900 according to the fourteenth embodiment detects a pedestrian on the road, it changes the course display image 6E so that the pedestrian can see it more easily. Specifically, this is done as follows. FIGS. 31A and 31B are schematic views showing the course display image at an intersection in the fourteenth embodiment. FIG. 31A is a schematic view showing a course display image 61E at the time the vehicle 10E, which is about to turn right, enters the intersection. FIG. 31B is a schematic view showing a course display image 62E after the vehicle 10E has detected a pedestrian. In the fourteenth embodiment, the course display image 61E is a linear image.
 The front information detection unit 16E of the present embodiment, which includes the vehicle-mounted camera 13E and the sensor 14E, monitors the area ahead of the vehicle 10E and detects whether a pedestrian B is present there. The front information detection unit 16E then sends the detection result to the vehicle state information acquisition unit 30E. The vehicle state information acquisition unit 30E detects, from the information of the vehicle-mounted camera 13E and the sensor 14E, that a pedestrian B is ahead of the vehicle 10E and sends this information to the image selection unit 40E. When there is no pedestrian B ahead of the vehicle 10E, the image selection unit 40E selects the course display image 61E displayed with a normal line width. When a pedestrian B is ahead of the vehicle 10E, the image selection unit 40E selects the course display image 62E, which is displayed with a line thicker than that of the course display image 61E so that the pedestrian can identify it more easily.
 As shown in FIG. 31A, the course display image 61E projected when there is no pedestrian B near the vehicle 10E has a normal line width. As shown in FIG. 31B, the course display image 62E projected when a pedestrian B is detected ahead of the vehicle 10E is thicker than normal and stands out. In this way, when the front information detection unit 16E detects a pedestrian B, the course display image 62E with a thicker line width is selected and projected as the vehicle approaches the pedestrian B. Displaying the thick-lined course display image 62E near the pedestrian B draws the pedestrian's attention. That is, the pedestrian B can accurately recognize the course of the vehicle 10E and correctly grasp the movement of the vehicle 10E when, for example, walking across a pedestrian crossing.
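 A minimal sketch of this selection rule follows; the concrete widths, the distance threshold, and the optional blinking (mentioned in the next paragraph) are illustrative values, not figures taken from the description.

```python
from dataclasses import dataclass
from typing import Optional

NORMAL_WIDTH_PX = 20   # course display image 61E
THICK_WIDTH_PX = 40    # course display image 62E

@dataclass
class CourseImageStyle:
    width_px: int
    blink: bool = False

def select_style(pedestrian_distance_m: Optional[float]) -> CourseImageStyle:
    """Normal width when no pedestrian is detected ahead, a thicker line when
    one is detected, and blinking as well once the vehicle is close."""
    if pedestrian_distance_m is None:
        return CourseImageStyle(NORMAL_WIDTH_PX)
    if pedestrian_distance_m > 15.0:
        return CourseImageStyle(THICK_WIDTH_PX)
    return CourseImageStyle(THICK_WIDTH_PX, blink=True)

print(select_style(None).width_px, select_style(10.0).blink)   # -> 20 True
```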
 In the present embodiment, the line width of the course display image 62E is increased when the pedestrian B is detected, but the present disclosure is not limited to this; processing that causes the course display image 62E to blink may also be added.
 (15th Embodiment)
 The eleventh to fourteenth embodiments were described with a single course display image 6E displayed on the road surface ahead of the vehicle 10E. The present disclosure is not limited to these embodiments, and a plurality of course display images 6E may be displayed. The road surface drawing device 900 of the fifteenth embodiment projects course display images 6E from left and right lighting units 52L and 52R, respectively.
 FIGS. 32A to 32C are schematic views showing the course display image 6E in the fifteenth embodiment. FIG. 32A shows the course display image 6E when traveling straight. FIG. 32B shows the course display image 6E when turning right. FIG. 32C shows the course display image 6E when turning left.
 As shown in FIG. 32A, in the present embodiment, when traveling straight, the image selection unit 40E selects display image data such that each of the left lighting unit 52L and the right lighting unit 52R displays a course display image 6S in the form of a straight-ahead arrow. In accordance with the image data selected by the image selection unit 40E, the image drawing unit 50E causes the left lighting unit 52L and the right lighting unit 52R to project, so that two course display images 6S are displayed on the road surface ahead of the vehicle 10E.
 The image drawing unit 50E uses different colors for the course display image 6E in the manual driving mode and the automatic driving mode, so that the surroundings of the vehicle 10E, that is, nearby pedestrians and other vehicles (automobiles and bicycles), can tell whether or not the vehicle 10E is driving automatically. Specifically, in the manual driving mode the course display image 6S for straight travel is displayed in white, and in the automatic driving mode the course display image 6S is displayed in turquoise (blue-green). By color-coding the display in this way, those around the vehicle 10E can identify whether the vehicle 10E is driving automatically. In addition, since straight-line course display images 6S are projected from the left and right lighting units 52R and 52L and two straight arrows appear ahead of the vehicle 10E, the course display images 6S are also easy to see for pedestrians and others on the left side of the vehicle 10E.
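 The color-coding rule can be summarized in code as follows; the RGB triples are illustrative, since the description only names the colors (white for manual driving, turquoise for automatic driving, and amber for turn indications, as described further below).

```python
from enum import Enum

class DrivingMode(Enum):
    MANUAL = "manual"
    AUTOMATIC = "automatic"

COLORS = {
    "white": (255, 255, 255),
    "turquoise": (64, 224, 208),
    "amber": (255, 191, 0),
}

def image_color(mode: DrivingMode, turning: bool) -> tuple[int, int, int]:
    """Amber whenever a turn is being indicated; otherwise white in manual
    driving mode and turquoise in automatic driving mode."""
    if turning:
        return COLORS["amber"]
    return COLORS["turquoise"] if mode is DrivingMode.AUTOMATIC else COLORS["white"]

print(image_color(DrivingMode.AUTOMATIC, turning=False))   # -> (64, 224, 208)
```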
 Changing the color of the course display image 6E between the automatic driving mode and the manual driving mode is not limited to the present embodiment and can also be applied to the other embodiments described above.
 Next, as shown in FIG. 32B, when turning right, before the vehicle 10E enters the intersection, the image selection unit 40E selects display image data 41Er such that only the right lighting unit 52R is used to display a course display image 6R for a right turn. The image drawing unit 50E then causes only the right lighting unit 52R to project, in accordance with the display image data 41Er selected by the image selection unit 40E, so that the course display image 6R is displayed on the road surface ahead of the vehicle 10E. The projection from the right lighting unit 52R is performed at the same time that the vehicle 10E turns on its turn signal lamp by means of the direction indicator 12E before the intersection.
 The course display image 6R is displayed in amber so that the vehicle 10E's intention to turn is clearly conveyed to the surroundings.
 After the vehicle 10E has entered the intersection, the course display image 6R may be displayed in a manner conforming to any of the eleventh to fourteenth embodiments described above. The display after the vehicle 10E has entered the intersection may be performed by the right lighting unit 52R alone or by using both the left and right lighting units 52R and 52L.
 As shown in FIG. 32C, when turning left, before the vehicle 10E enters the intersection, the image selection unit 40E selects display image data 41El such that only the left lighting unit 52L is used to display a course display image 6L for a left turn. The image drawing unit 50E causes the left lighting unit 52L to project, based on the display image data 41El selected by the image selection unit 40E. The display method and the content displayed after the vehicle 10E enters the intersection are the same as for a right turn.
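 The left/right unit selection before the intersection reduces to a small rule; the enum and the string identifiers below are assumptions made for the sketch.

```python
from enum import Enum

class Course(Enum):
    STRAIGHT = "straight"
    RIGHT = "right"
    LEFT = "left"

def active_units(course: Course) -> list[str]:
    """Both lighting units (52L and 52R) for straight travel; only the unit on
    the turning side before a right or left turn."""
    if course is Course.RIGHT:
        return ["52R"]
    if course is Course.LEFT:
        return ["52L"]
    return ["52L", "52R"]

print(active_units(Course.RIGHT))   # -> ['52R']
```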
 As described above, in the present embodiment, when turning right or left, only one of the lighting units 52R and 52L is used for the display, so the course display image 6R or 6L can be shifted toward the turning direction, making it easier for nearby oncoming vehicles and pedestrians to notice.
 Displaying the course display image 6E in amber when turning right or left is not limited to the present embodiment and can also be applied to the other embodiments described above.
 In the present embodiment, the image selection unit 40E instructs the image drawing unit 50E to use only one of the left and right lighting units 52L and 52R by selecting display image data 41Er or 41El that is projected by a single lighting unit 52. However, the present disclosure is not limited to this; the image selection unit 40E may simply always select the right-turn (or left-turn) display image data 41Er (or 41El) when turning right (or left), and the image drawing unit 50E may then process the data to decide whether the projection uses only one lighting unit 52E or both the left and right lighting units 52E.
 The present disclosure is not limited to the embodiments described above; various modifications are possible within the scope of the claims, and embodiments obtained by appropriately combining the technical means disclosed in different embodiments are also included within the technical scope of the present disclosure.
 This application claims priority based on Japanese Patent Application No. 2019-203610 filed on November 9, 2019, Japanese Patent Application No. 2019-201392 filed on November 6, 2019, Japanese Patent Application No. 2019-211412 filed on November 22, 2019, Japanese Patent Application No. 2019-209198 filed on November 19, 2019, and Japanese Patent Application No. 2019-205095 filed on November 12, 2019, the entire contents of which are incorporated herein by reference.

Claims (47)

  1.  A vehicle driving support system comprising:
     a vehicle motion grasping unit that acquires the motion of a first vehicle traveling on a road as first motion information and acquires the motion of a second vehicle traveling on the road as second motion information;
     a guidance information creation unit that creates, based on the first motion information and the second motion information, guidance information indicating a route on which the first vehicle or the second vehicle is scheduled to travel; and
     a guidance information presentation unit that presents the guidance information to the first vehicle or the second vehicle,
     wherein the guidance information creation unit creates the guidance information when the first motion information and the second motion information include course changes in the same direction and include a right-turn motion and a left-turn motion.
  2.  The vehicle driving support system according to claim 1, wherein the vehicle motion grasping unit is provided in the first vehicle or the second vehicle.
  3.  The vehicle driving support system according to claim 1, wherein the vehicle motion grasping unit is provided in an infrastructure-side device arranged near the road.
  4.  The vehicle driving support system according to any one of claims 1 to 3, wherein the guidance information presentation unit is a road surface drawing device that projects an image onto the road.
  5.  The vehicle driving support system according to claim 4, wherein the road surface drawing device is provided in the first vehicle or the second vehicle.
  6.  The vehicle driving support system according to claim 4, wherein the road surface drawing device is provided in an infrastructure-side device arranged near the road.
  7.  The vehicle driving support system according to any one of claims 1 to 3, wherein the guidance information presentation unit is an image display device provided in the first vehicle or the second vehicle.
  8.  A vehicle driving support system comprising:
     a receiving-side vehicle having a detection unit that detects light from outside the vehicle and a driving support unit that supports driving according to the detection result of the detection unit; and
     a light irradiation unit that irradiates the detection unit with a predetermined optical signal,
     wherein, when the detection unit detects the optical signal, the driving support unit causes the receiving-side vehicle to perform a stopping operation or a decelerating operation.
  9.  The vehicle driving support system according to claim 8, wherein the optical signal has a signal waveform recorded in advance in the driving support unit.
  10.  The vehicle driving support system according to claim 8 or 9, wherein the optical signal includes light of a wavelength recorded in advance in the driving support unit.
  11.  The vehicle driving support system according to any one of claims 8 to 10, wherein the optical signal has a light intensity exceeding the dynamic range of the detection unit.
  12.  The vehicle driving support system according to any one of claims 8 to 11, further comprising a situation grasping unit that determines the traveling state and the surrounding conditions of a transmitting-side vehicle different from the receiving-side vehicle,
     wherein the situation grasping unit causes the light irradiation unit to irradiate the detection unit with the optical signal when the traveling state or the surrounding conditions satisfy an irradiation condition.
  13.  The vehicle driving support system according to claim 12, wherein the irradiation condition is that the receiving-side vehicle is an oncoming vehicle of the transmitting-side vehicle and the motion of the transmitting-side vehicle is a right-turn motion or a left-turn motion.
  14.  The vehicle driving support system according to claim 12, wherein the irradiation condition is that the receiving-side vehicle is a vehicle following the transmitting-side vehicle and the inter-vehicle distance between the transmitting-side vehicle and the receiving-side vehicle is equal to or less than a certain value.
  15.  The vehicle driving support system according to any one of claims 12 to 14, wherein the light irradiation unit is mounted on the transmitting-side vehicle.
  16.  The vehicle driving support system according to any one of claims 8 to 14, wherein the light irradiation unit is arranged on the road on which the receiving-side vehicle travels.
  17.  The vehicle driving support system according to any one of claims 8 to 14, wherein the light irradiation unit is mounted on a portable electronic device.
  18.  A vehicle driving support system comprising:
     a vehicle course acquisition unit that acquires course information of a vehicle;
     a vehicle state information acquisition unit that acquires the position and speed of the vehicle as vehicle state information;
     an other-vehicle detection unit that detects whether another vehicle is present in the vicinity of the vehicle and acquires detection information;
     an image selection unit that selects, based on the course information, the vehicle state information, and the detection information, a course display image to be projected onto the road surface around the vehicle; and
     a road surface drawing unit that projects the course display image onto the road surface,
     wherein the image selection unit does not select the course display image when it determines, based on the detection information and the vehicle state information, that the other vehicle is present within a first distance ahead of the vehicle, and selects a course forward display image when it determines that the other vehicle is not present within the first distance, and
     the road surface drawing unit projects the course forward display image selected by the image selection unit onto the road surface ahead of the vehicle.
  19.  The vehicle driving support system according to claim 18, wherein the image selection unit selects a course rearward display image when it determines, based on the detection information and the vehicle state information, that the other vehicle is not present within a second distance behind the vehicle, and
     the road surface drawing unit projects the course rearward display image selected by the image selection unit onto the road surface behind the vehicle.
  20.  The vehicle driving support system according to claim 19, wherein, when the vehicle is traveling toward an intersection and is located within a third distance from the intersection, the image selection unit selects the course rearward display image even when it determines that the other vehicle is present within the second distance behind the vehicle.
  21.  The vehicle driving support system according to any one of claims 18 to 20, wherein the first distance ahead of the vehicle is determined based on a braking distance obtained from the speed acquired by the vehicle state information acquisition unit.
  22.  The vehicle driving support system according to any one of claims 18 to 21, wherein the image selection unit determines the image selection when the course information changes from a first direction to a second direction.
  23.  The vehicle driving support system according to claim 22, wherein the first direction is the straight-ahead direction of the vehicle and the second direction is the right-turn direction or the left-turn direction of the vehicle, and
     the image selection unit determines the selection of the course display image before the position of the vehicle in the vehicle state information reaches a point at which a right-turn or left-turn signal must be given.
  24.  The vehicle driving support system according to any one of claims 18 to 23, wherein the other-vehicle detection unit captures an image of the area ahead of or behind the vehicle with an imaging device and detects the other vehicle by image recognition.
  25.  The vehicle driving support system according to any one of claims 18 to 24, wherein the other-vehicle detection unit detects the other vehicle by wireless communication with the other vehicle.
  26.  The vehicle driving support system according to any one of claims 18 to 25, wherein the vehicle state information acquisition unit acquires the vehicle state information using position information from a satellite positioning system.
  27.  車両の進路情報を取得する車両進路取得部と、
     前記車両の位置と向きを車両状態情報として取得する車両状態情報取得部と、
     前記進路情報と前記車両状態情報に基づいて、前記車両の前方路面に投影する進路表示画像を選択する画像選択部と、
     前記前方路面に前記進路表示画像を投影する路面描画部と、を備え、
     前記画像選択部は、前記進路情報が第1方向から第2方向に変化する場合、前記車両から前記第2方向を示す線を含む前記進路表示画像を選択し、前記車両状態情報の変化に伴って、前記進路表示画像を再選択する路面描画装置。
    The vehicle course acquisition unit that acquires the course information of the vehicle,
    A vehicle state information acquisition unit that acquires the position and orientation of the vehicle as vehicle state information,
    An image selection unit that selects a course display image to be projected on the road surface in front of the vehicle based on the course information and the vehicle state information.
    A road surface drawing unit that projects the course display image onto the front road surface is provided.
    When the course information changes from the first direction to the second direction, the image selection unit selects the course display image including a line indicating the second direction from the vehicle, and accompanies the change in the vehicle state information. A road surface drawing device that reselects the course display image.
  28.  請求項27に記載の路面描画装置であって、
     前記第1方向は、前記車両の直進方向であり、前記第2方向は、前記車両の右折方向、または左折方向であり、
     前記車両状態情報取得部は、所定間隔で前記車両状態情報を取得し、
     前記車両状態情報における前記車両の向きが前記第2方向に一致するまで、前記画像選択部は、前記進路表示画像を再選択する路面描画装置。
    The road surface drawing device according to claim 27.
    The first direction is the straight direction of the vehicle, and the second direction is the right turn direction or the left turn direction of the vehicle.
    The vehicle state information acquisition unit acquires the vehicle state information at predetermined intervals, and obtains the vehicle state information.
    The image selection unit is a road surface drawing device that reselects the course display image until the direction of the vehicle in the vehicle state information matches the second direction.
  29.  請求項28に記載の路面描画装置であって、
     前記路面描画部は、路面上における描画位置が再選択された前記進路表示画像の前後で重なる位置に、前記進路表示画像を投影する路面描画装置。
    The road surface drawing apparatus according to claim 28.
    The road surface drawing unit is a road surface drawing device that projects the course display image onto a position where the drawing position on the road surface overlaps before and after the course display image whose drawing position has been reselected.
  30.  請求項27から29の何れか一つに記載の路面描画装置であって、
     前記画像選択部は、前記車両の向きから前記第2方向までの角度に応じたカーブ線を含む前記進路表示画像を選択する路面描画装置。
    The road surface drawing apparatus according to any one of claims 27 to 29.
    The image selection unit is a road surface drawing device that selects a course display image including a curve line corresponding to an angle from the direction of the vehicle to the second direction.
  31.  請求項30に記載の路面描画装置であって、
     前記カーブ線は、前記車両の変位に合わせて曲率が変化し、前記車両の進路に沿っている路面描画装置。
    The road surface drawing device according to claim 30.
    The curve line is a road surface drawing device whose curvature changes according to the displacement of the vehicle and is along the course of the vehicle.
  32.  請求項27から31の何れか一つに記載の路面描画装置であって、
     前記車両前方の歩行者を検知する前方情報検知部をさらに備え、
     前記進路表示画像は、前記車両の進路に沿う線状画像であり、
     前記画像選択部は、前記車両状態情報における前記車両の位置が、前記前方情報検知部が検知した前記歩行者に近づくにつれて、前記線状画像の線幅を太くした前記進路表示画像を選択する路面描画装置。
    The road surface drawing apparatus according to any one of claims 27 to 31.
    Further equipped with a front information detection unit that detects a pedestrian in front of the vehicle,
    The course display image is a linear image along the course of the vehicle.
    The image selection unit selects the course display image in which the line width of the linear image is increased as the position of the vehicle in the vehicle state information approaches the pedestrian detected by the front information detection unit. Drawing device.
  33.  請求項27から32の何れか一つに記載の路面描画装置であって、
     前記車両進路取得部は、カーナビゲーションシステムから前記進路情報を取得する路面描画装置。
    The road surface drawing apparatus according to any one of claims 27 to 32.
    The vehicle course acquisition unit is a road surface drawing device that acquires the course information from the car navigation system.
  34.  請求項27から33の何れか一つに記載の路面描画装置であって、
     前記車両進路取得部は、前記車両の運転手が入力した方向指示器から前記進路情報を取得する路面描画装置。
    The road surface drawing apparatus according to any one of claims 27 to 33.
    The vehicle course acquisition unit is a road surface drawing device that acquires the course information from a direction indicator input by the driver of the vehicle.
  35.  請求項27から34の何れか一つに記載の路面描画装置であって、
     前記車両状態情報取得部は、前記車両の操舵装置からの操舵角度によって前記車両の向きを取得する路面描画装置。
    The road surface drawing apparatus according to any one of claims 27 to 34.
    The vehicle state information acquisition unit is a road surface drawing device that acquires the direction of the vehicle based on the steering angle of the vehicle from the steering device.
  36.  請求項27から35の何れか一つに記載の路面描画装置であって、
     前記車両状態情報取得部は、衛星測位システムからの位置情報を用いて前記車両状態情報を取得する路面描画装置。
    The road surface drawing apparatus according to any one of claims 27 to 35.
    The vehicle state information acquisition unit is a road surface drawing device that acquires the vehicle state information using position information from a satellite positioning system.
  37.  The road surface drawing device according to any one of claims 27 to 36, wherein the road surface drawing unit includes irradiation units at the front left and right of the vehicle in its traveling direction, and the image selection unit selects the course display image so that it is drawn from both the left and right irradiation units when the course information indicates the first direction, and so that it is drawn only from the irradiation unit closer to the second direction when the course information indicates a change from the first direction to the second direction.
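The selection rule of claim 37 can be summarized as a small decision function; the labels used below for the course information and for the two irradiation units are assumptions made for this Python sketch.

def select_irradiation_units(course_info: str) -> set[str]:
    """Return which of the two front irradiation units should draw the image.
    'straight' stands for the first direction; 'left' and 'right' stand for a
    change toward the second direction (these labels are illustrative only)."""
    if course_info == "straight":
        return {"left_unit", "right_unit"}   # draw from both units
    if course_info == "left":
        return {"left_unit"}                 # only the unit closer to the turn
    if course_info == "right":
        return {"right_unit"}
    raise ValueError(f"unknown course information: {course_info}")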
  38.  A road comprising: a paved surface; a phosphor-containing layer formed on the paved surface and containing a phosphor material that is excited by primary light and emits secondary light different from the primary light; and a coating layer formed on the phosphor-containing layer and transmitting the primary light and the secondary light.
  39.  The road according to claim 38, wherein the primary light has a green or purple wavelength.
  40.  The road according to claim 38 or 39, wherein the coating layer includes a cover material having, on its front or back surface, an uneven shape that scatters the secondary light.
  41.  The road according to any one of claims 38 to 40, wherein the phosphor-containing layer is formed on the paved surface within an intersection and on the paved surface within a predetermined range from the intersection.
  42.  The road according to any one of claims 38 to 41, wherein the secondary light has an amber wavelength.
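Claims 38, 39 and 42 together imply the usual down-conversion relation: the primary (excitation) light must have a shorter wavelength, and hence higher photon energy, than the amber secondary light. The Python sketch below checks that relation for representative wavelengths; the numeric values are assumptions, not taken from the claims.

# Illustrative check of the wavelength relation implied by claims 38, 39 and 42.
PRIMARY_NM = {"purple": 405.0, "green": 530.0}  # assumed excitation wavelengths
SECONDARY_AMBER_NM = 590.0                      # assumed amber emission wavelength

def can_excite(primary_nm: float, secondary_nm: float) -> bool:
    """True when the primary photon is more energetic than the secondary one."""
    return primary_nm < secondary_nm

for name, nm in PRIMARY_NM.items():
    assert can_excite(nm, SECONDARY_AMBER_NM), name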
  43.  A vehicle driving support system comprising: the road according to any one of claims 38 to 42; and a light irradiation unit that irradiates the road with the primary light.
  44.  The vehicle driving support system according to claim 43, wherein the light irradiation unit is mounted on an infrastructure-side device on the road or an infrastructure-side device arranged near the road.
  45.  The vehicle driving support system according to claim 43, wherein the light irradiation unit is mounted on a vehicle traveling on the road.
  46.  The vehicle driving support system according to claim 45, wherein the vehicle includes a situation grasping unit that detects a phosphor-coated region in which the phosphor-containing layer is formed, and the light irradiation unit irradiates the primary light when the situation grasping unit detects the phosphor-coated region.
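The trigger logic of claim 46 amounts to gating the light irradiation unit on the output of the situation grasping unit; the class and flag names in the Python sketch below are assumptions made for this illustration.

class PrimaryLightController:
    """Illustrative controller for claim 46: the irradiation unit is switched on
    only while the situation grasping unit reports a phosphor-coated region."""

    def __init__(self) -> None:
        self.light_on = False

    def update(self, phosphor_region_detected: bool) -> None:
        # Drive the light irradiation unit directly from the detection flag.
        self.light_on = phosphor_region_detected

controller = PrimaryLightController()
controller.update(phosphor_region_detected=True)   # entering a coated area
assert controller.light_on
controller.update(phosphor_region_detected=False)  # leaving the coated area
assert not controller.light_on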
  47.  The vehicle driving support system according to any one of claims 43 to 46, wherein the light irradiation unit is a road surface drawing device that projects an image onto the road.
PCT/JP2020/039290 2019-11-06 2020-10-19 Vehicle driving assistance system, road surface drawing device, and road WO2021090668A1 (en)

Applications Claiming Priority (10)

Application Number Priority Date Filing Date Title
JP2019-201392 2019-11-06
JP2019201392A JP2021076956A (en) 2019-11-06 2019-11-06 Vehicle operation support system
JP2019-203610 2019-11-09
JP2019203610A JP2021077125A (en) 2019-11-09 2019-11-09 Vehicle driving assist system
JP2019-205095 2019-11-12
JP2019205095A JP2021075952A (en) 2019-11-12 2019-11-12 Road and vehicle driving support system
JP2019-209198 2019-11-19
JP2019209198A JP7403288B2 (en) 2019-11-19 2019-11-19 road surface drawing device
JP2019-211412 2019-11-22
JP2019211412A JP7348819B2 (en) 2019-11-22 2019-11-22 Vehicle driving support system

Publications (1)

Publication Number Publication Date
WO2021090668A1 true WO2021090668A1 (en) 2021-05-14

Family

ID=75849751

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/039290 WO2021090668A1 (en) 2019-11-06 2020-10-19 Vehicle driving assistance system, road surface drawing device, and road

Country Status (1)

Country Link
WO (1) WO2021090668A1 (en)

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS508428Y1 (en) * 1969-04-02 1975-03-13
JPH07197407A (en) * 1993-12-28 1995-08-01 Yoshida Doro Kk Brilliant and clear asphalt paving surface and execution method thereof
JP2005006152A (en) * 2003-06-13 2005-01-06 Auto Network Gijutsu Kenkyusho:Kk Approach notification system
JP2006190187A (en) * 2005-01-07 2006-07-20 Toyota Motor Corp Traffic controller and traffic control system
JP2007168727A (en) * 2005-12-26 2007-07-05 Aisin Aw Co Ltd Operation support device, operation support system, and operation support program
JP2008143505A (en) * 2006-11-16 2008-06-26 Denso Corp Headlight control device
JP2011178257A (en) * 2010-03-01 2011-09-15 Koito Mfg Co Ltd Vehicle lamp system
WO2016163294A1 * 2015-04-10 2016-10-13 Hitachi Maxell, Ltd. Video image projection device
JP2016205103A (en) * 2015-04-28 2016-12-08 株式会社デンソー Road surface indication structure and road surface indication system
JP2017010463A (en) * 2015-06-25 2017-01-12 株式会社デンソー Vehicle information provision device
WO2017126250A1 * 2016-01-22 2017-07-27 Nissan Motor Co., Ltd. Driving assistance method and device
JP2017138766A * 2016-02-03 2017-08-10 Mitsubishi Electric Corporation Vehicle approach detection device
JP2018084955A * 2016-11-24 2018-05-31 株式会社小糸製作所 (Koito Manufacturing Co., Ltd.) Unmanned aircraft

Similar Documents

Publication Publication Date Title
US10232713B2 (en) Lamp for a vehicle
US10816982B2 (en) Vehicle control device mounted on vehicle and method for controlling the vehicle
KR101908308B1 (en) Lamp for Vehicle
US9952051B2 (en) Advanced driver assistance apparatus, display apparatus for vehicle and vehicle
KR101982774B1 (en) Autonomous Vehicle
CN109720267B (en) Vehicle lamp system
JP4720764B2 (en) Headlight control device
CN109070891B (en) Intent signaling for autonomous vehicle
US20090135024A1 (en) Display control system of traffic light and display method
US20180056858A1 (en) Vehicle signaling system
CN112752678A (en) Vehicle start notification display device
CN110126719B (en) Lighting system for vehicle and vehicle
JP7348819B2 (en) Vehicle driving support system
JP7370805B2 Vehicle road surface drawing device
WO2021090668A1 (en) Vehicle driving assistance system, road surface drawing device, and road
CN113044054A (en) Vehicle control device, vehicle control method, and program
WO2020085505A1 (en) Road surface drawing device for vehicle
US11731554B2 (en) Vehicle departure notification display device
CN111519553A (en) Traffic guidance system and method for intersection under urban viaduct
JP2021075952A (en) Road and vehicle driving support system
KR101908310B1 (en) Lamp for Vehicle
KR20210004186A (en) Induction signal displaying system and method
WO2020230523A1 (en) Transportation system and transportation infrastructure
WO2020162455A1 (en) Streetlight
KR102652239B1 (en) Traffic light systems and methods tailored to individual vehicles according to distance

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20883876

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20883876

Country of ref document: EP

Kind code of ref document: A1