WO2021090668A1 - Vehicle driving support system, road surface drawing device, and road - Google Patents

Vehicle driving support system, road surface drawing device, and road

Info

Publication number
WO2021090668A1
WO2021090668A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
unit
driving support
course
support system
Prior art date
Application number
PCT/JP2020/039290
Other languages
English (en)
Japanese (ja)
Inventor
例人 田村
祐貴 高橋
金子 進
新 竹田
柴田 裕一
浩一 田辺
裕介 仲田
Original Assignee
株式会社小糸製作所
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2019201392A external-priority patent/JP2021076956A/ja
Priority claimed from JP2019203610A external-priority patent/JP2021077125A/ja
Priority claimed from JP2019205095A external-priority patent/JP2021075952A/ja
Priority claimed from JP2019209198A external-priority patent/JP7403288B2/ja
Priority claimed from JP2019211412A external-priority patent/JP7348819B2/ja
Application filed by 株式会社小糸製作所
Publication of WO2021090668A1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60Q ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q1/00 Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
    • B60Q1/26 Arrangement of optical signalling or lighting devices, the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic
    • B60Q1/34 Arrangement of optical signalling or lighting devices for indicating change of drive direction
    • B60Q1/50 Arrangement of optical signalling or lighting devices for indicating other intentions or conditions, e.g. request for waiting or overtaking
    • B60Q1/507 Arrangement of optical signalling or lighting devices for indicating other intentions or conditions, specific to autonomous vehicles
    • B60Q1/543 Arrangement of optical signalling or lighting devices for indicating other states or conditions of the vehicle
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00 Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/18 Propelling the vehicle
    • E FIXED CONSTRUCTIONS
    • E01 CONSTRUCTION OF ROADS, RAILWAYS, OR BRIDGES
    • E01F ADDITIONAL WORK, SUCH AS EQUIPPING ROADS OR THE CONSTRUCTION OF PLATFORMS, HELICOPTER LANDING STAGES, SIGNS, SNOW FENCES, OR THE LIKE
    • E01F9/00 Arrangement of road signs or traffic signals; Arrangements for enforcing caution
    • E01F9/50 Road surface markings; Kerbs or road edgings, specially adapted for alerting road users
    • E01F9/506 Road surface markings characterised by the road surface marking material, e.g. comprising additives for improving friction or reflectivity; Methods of forming, installing or applying markings in, on or to road surfaces
    • E01F9/524 Reflecting elements specially adapted for incorporation in or application to road surface markings
    • F MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F21 LIGHTING
    • F21V FUNCTIONAL FEATURES OR DETAILS OF LIGHTING DEVICES OR SYSTEMS THEREOF; STRUCTURAL COMBINATIONS OF LIGHTING DEVICES WITH OTHER ARTICLES, NOT OTHERWISE PROVIDED FOR
    • F21V7/00 Reflectors for light sources
    • F21V14/00 Controlling the distribution of the light emitted by adjustment of elements
    • F21V14/04 Controlling the distribution of the light emitted by movement of reflectors
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/09 Arrangements for giving variable traffic instructions
    • G08G1/16 Anti-collision systems

Definitions

  • This disclosure relates to a vehicle driving support system, a road surface drawing device, and a road.
  • When a vehicle is driven using driving support or automatic driving technology (hereinafter referred to as a driving support vehicle), its occupants do not always pay close attention to the driving situation. It also becomes difficult for people outside the vehicle to predict the vehicle's movements by watching the driver, as was possible with conventional manual driving. In a vehicle using driving support technology, it is therefore important to notify the surroundings of the vehicle's movements by some means.
  • As a method of notifying the outside of the vehicle of its movements, irradiating the road surface around the vehicle with light to draw an image has been proposed (see, for example, Patent Document 1).
  • An image drawn on the road surface conveys to other drivers, pedestrians, and occupants of vehicles without driving support technology (hereinafter referred to as manually driven vehicles) that the vehicle is traveling under driving support technology. That is, the current and planned movements of the vehicle can be communicated, so that other drivers and pedestrians can predict the movement of the vehicle.
  • The road surface drawing device draws a predetermined image on the road surface using a display element such as an LED array or a DMD (digital micromirror device) provided on the front of the vehicle.
  • The vehicle drawing apparatus described in Patent Document 2 displays, as a marker on the road surface in front of the vehicle, an arrow pointing in the turning direction when turning right or left at an intersection.
  • the course display image (road surface display image) displayed as a marker on the road surface allows other vehicles and pedestrians in the vicinity to recognize the traveling direction of the vehicle.
  • Right and left turns are indicated with blinker lamps or the like from a predetermined distance before the intersection, as specified by the Road Traffic Act, or from a predetermined time before entering the intersection.
  • the course display image is also displayed in the same manner as the blinker lamp.
  • The road surface drawing device continues to project a specific image, such as a right-turn or left-turn arrow, on the road surface until the vehicle finishes turning. That is, when the steering angle of the vehicle's steering device points in the right-turn direction, the road surface drawing device continues to display the right-turn arrow until the steering angle returns to the straight-ahead direction.
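The persistence rule above (keep drawing the turn arrow until the steering angle returns to straight ahead) can be sketched as follows. This is an illustrative sketch only; the sign convention and the `STRAIGHT_THRESHOLD_DEG` dead band are assumptions, not values from the patent.

```python
from typing import Optional

# Dead band below which the wheel is treated as straight ahead.
# The value is an illustrative assumption, not taken from the patent.
STRAIGHT_THRESHOLD_DEG = 5.0

def select_arrow(steering_angle_deg: float) -> Optional[str]:
    """Keep projecting the turn arrow while the steering angle points
    into the turn; clear it once the wheel returns to straight ahead."""
    if steering_angle_deg > STRAIGHT_THRESHOLD_DEG:
        return "right_turn_arrow"
    if steering_angle_deg < -STRAIGHT_THRESHOLD_DEG:
        return "left_turn_arrow"
    return None  # steering back near straight: stop projecting the arrow
```

In this convention positive angles are right turns; the arrow is selected purely from the current steering angle, so it naturally persists through the turn and disappears as the wheel straightens.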
  • However, Patent Document 1 merely presents the driving state and driving schedule of each individual driving support vehicle. When a plurality of driving support vehicles travel on a public road, the amount of information displayed across the whole traffic scene increases, making it difficult to grasp the overall situation.
  • Accordingly, an object of the present disclosure is to provide a vehicle driving support system capable of supporting smooth traffic even when a plurality of vehicles merge at an intersection.
  • Further, in Patent Document 1, information is transmitted by visually recognizing an image drawn on the road surface, so a pedestrian or the driver of another vehicle must recognize the image and choose an action. As a result, the behavior of vehicles equipped with driving support technology tends to be prioritized.
  • Accordingly, an object of the present disclosure is to provide a vehicle driving support system that can actively act on a vehicle traveling using driving support technology.
  • In Patent Document 2, particularly at a highway intersection, a plurality of vehicles may stop in a line waiting for a traffic light.
  • The vehicle at the head of the line can display the course display image on the road surface in front of it, but the second and subsequent vehicles cannot secure sufficient space between themselves and the vehicle ahead to display the course display image. The light of their course display images may therefore be reflected by the body or bumper of the vehicle ahead, and the image may not be drawn accurately on the road surface.
  • Accordingly, an object of the present disclosure is to provide a vehicle driving support system that controls the display of the course display image so that the course display image of the own vehicle does not optically affect the vehicle ahead or other vehicles located in front of the own vehicle.
  • A general road surface drawing device always projects the same image regardless of the position of the vehicle 10E within the intersection. For example, when the vehicle 10E turns right, the road surface drawing device displays an image A of a right-curving arrow on the road surface in front of the vehicle 10E before it enters the intersection (see FIG. 33A).
  • However, the road surface display image A may be far from a pedestrian B facing the vehicle 10E across the intersection, or may be hidden behind another vehicle and not visible.
  • Further, when the vehicle 10E is partway through a right turn and is already facing almost to the right, the same road surface display image A is still displayed. The image is therefore an arrow making a large turn to the right even though the vehicle 10E actually has only a slight turn remaining. That is, as shown in FIGS. 33C and 33D, the traveling direction (course) of the vehicle 10E may not match the direction of the arrow in the road surface display image A.
  • The road surface display image A may thus give the pedestrian B at the opposite position the misunderstanding that the vehicle 10E will turn further to the right.
  • The difference between the traveling direction (course) of the vehicle and the direction of the arrow in the road surface display image A may similarly mislead other vehicles and pedestrians B in the vicinity.
  • Accordingly, an object of the present disclosure is to provide a road surface drawing device that keeps the content of the course display image on the road surface matched to the course of the vehicle as the direction of the turning vehicle changes during a right or left turn.
  • Patent Document 1 irradiates the road surface with light to draw an image; other drivers and pedestrians recognize the drawn image by visually perceiving the light reflected from the road surface.
  • However, situations can occur in which the light of the road surface drawing is difficult to see, such as when the road surface is under direct sunlight in fine weather, when the road surface is wet in rainy weather, or when headlights from other vehicles illuminate it at night.
  • Accordingly, an object of the present disclosure is to provide a road and a vehicle driving support system that improve the visibility of road surface drawing.
  • In order to solve the above problems, the vehicle driving support system of the present disclosure includes: a vehicle motion grasping unit that acquires the motion of a first vehicle traveling on the road as first motion information and the motion of a second vehicle traveling on the road as second motion information; a guidance information creation unit that creates guidance information indicating a route on which the first vehicle or the second vehicle is scheduled to travel, based on the first motion information and the second motion information; and a guidance information presentation unit that presents the guidance information to the first vehicle or the second vehicle. The guidance information creation unit creates the guidance information when the first motion information and the second motion information include course changes in the same direction, one being a right-turn motion and the other a left-turn motion.
  • With this configuration, when the vehicle motion grasping unit acquires the motions of the first and second vehicles and the first and second motion information include a right turn and a left turn changing course in the same direction, the guidance information creation unit creates the guidance information and presents it. The present disclosure thus makes it possible to support smooth traffic while ensuring safety even when a plurality of vehicles traveling under driving support technology merge at an intersection.
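The creation condition described here (both vehicles change course toward the same direction, one by a right turn and one by a left turn) can be sketched as follows. The dictionary representation of motion information and its `turn` / `merges_into` keys are assumptions made for illustration, not structures from the patent.

```python
# Illustrative sketch of the guidance-creation condition: guidance is created
# only when the two course changes head into the same road and the pair
# consists of one right turn and one left turn (e.g. both vehicles merging
# into the same lane at an intersection, as in FIG. 1).

def needs_guidance(first_motion: dict, second_motion: dict) -> bool:
    same_direction = first_motion["merges_into"] == second_motion["merges_into"]
    turns = {first_motion["turn"], second_motion["turn"]}
    return same_direction and turns == {"right", "left"}
```

Two vehicles turning the same way, or turning into different roads, produce no guidance; only the potentially conflicting right-turn/left-turn merge does.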
  • Further, the vehicle driving support system of the present disclosure includes: a receiving side vehicle having a detection unit that detects light from outside the vehicle and a driving support unit that supports driving according to the result of the detection unit; and a light irradiation unit that irradiates the detection unit with a predetermined light signal. When the detection unit detects the light signal, the driving support unit causes the receiving side vehicle to execute a stop operation or a deceleration operation.
  • With this configuration, the receiving side vehicle executes a stop or deceleration operation when the light irradiation unit irradiates its detection unit with the light signal, making it possible to actively act on a vehicle traveling under driving support technology.
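A minimal sketch of this trigger: when the receiving side vehicle's detection unit recognizes the predetermined light signal, the driving support unit commands the stop (or deceleration) operation. Encoding the signal as an on/off pulse pattern is an assumption for illustration; the patent does not specify a signal format.

```python
# Assumed "predetermined light signal", encoded as an on/off pulse pattern.
EXPECTED_PATTERN = (1, 0, 1, 1, 0)

def driving_support_action(detected_pulses: tuple) -> str:
    """Action the driving support unit takes for a detected light input."""
    if detected_pulses == EXPECTED_PATTERN:
        return "stop"      # recognized signal: execute the stop operation
    return "continue"      # unrelated light: no intervention
```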
  • Further, the vehicle driving support system of the present disclosure includes: a vehicle course acquisition unit that acquires course information of the vehicle; a vehicle state information acquisition unit that acquires the position and speed of the vehicle as vehicle state information; an other vehicle detection unit that detects the presence of another vehicle in the vicinity of the vehicle and acquires detection information; an image selection unit that selects, based on the course information, the vehicle state information, and the detection information, a course display image to be projected onto the road surface around the vehicle; and a road surface drawing unit that projects the course display image onto the road surface. Based on the detection information and the vehicle state information, the image selection unit does not select the course display image when it determines that another vehicle exists within a first distance in front of the vehicle, and selects the course forward display image when it determines that no other vehicle exists within the first distance. The road surface drawing unit projects the course forward display image selected by the image selection unit onto the road surface in front of the vehicle.
  • With this configuration, the road surface drawing unit projects the course forward display image onto the front road surface only when no other vehicle is within the first distance in front of the vehicle. Even when a plurality of vehicles are lined up at an intersection or the like, the display of the course display image can therefore be controlled so that it does not optically affect the vehicle ahead or other vehicles located in front of the vehicle.
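The suppression rule above can be sketched as follows: project the course forward display image only when no other vehicle is detected within the "first distance" ahead, so the projected light cannot strike the body or bumper of a preceding vehicle. `FIRST_DISTANCE_M` is an assumed value; the patent leaves the distance unspecified.

```python
from typing import List, Optional

# Assumed "first distance" in metres; an illustrative value only.
FIRST_DISTANCE_M = 10.0

def select_course_image(distances_ahead_m: List[float]) -> Optional[str]:
    """Return the image to project, or None to suppress road surface drawing.

    distances_ahead_m: distances to other vehicles detected ahead of the
    own vehicle (from the other vehicle detection unit).
    """
    if any(d <= FIRST_DISTANCE_M for d in distances_ahead_m):
        return None                      # a preceding vehicle is too close
    return "course_forward_display"      # road ahead is clear: draw the image
```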
  • Further, the road surface drawing device of the present disclosure includes: a vehicle course acquisition unit that acquires course information of the vehicle; a vehicle state information acquisition unit that acquires the position and direction of the vehicle as vehicle state information; an image selection unit that selects, based on the course information and the vehicle state information, a course display image to be projected onto the road surface in front of the vehicle; and a road surface drawing unit that projects the course display image onto the front road surface. When the direction of the vehicle changes from a first direction to a second direction along the course, the image selection unit selects a course display image including a line indicating the second direction from the vehicle, and reselects the course display image as the vehicle state information changes.
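The reselection behavior can be sketched as follows: as the vehicle's heading (first direction) rotates toward the exit heading (second direction) during a turn, the arrow image is repeatedly reselected so that its bend matches only the turn still remaining. Representing headings in degrees and quantizing to a 15-degree image set are illustrative assumptions, not details from the patent.

```python
def remaining_turn_deg(current_heading_deg: float, exit_heading_deg: float) -> float:
    """Signed rotation still needed, normalized to (-180, 180]."""
    d = (exit_heading_deg - current_heading_deg) % 360.0
    return d - 360.0 if d > 180.0 else d

def reselect_image(current_heading_deg: float, exit_heading_deg: float) -> str:
    """Pick an arrow image whose bend matches the turn still remaining."""
    step = 15.0  # assumed granularity of the prepared arrow image set
    bend = step * round(remaining_turn_deg(current_heading_deg, exit_heading_deg) / step)
    return f"arrow_bend_{int(bend):+d}deg"
```

Called once per update of the vehicle state information, this makes the arrow straighten out as the turn completes, avoiding the FIG. 33C/33D mismatch described above.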
  • Further, the road of the present disclosure includes: a pavement surface; a phosphor-containing layer formed on the pavement surface and containing a phosphor material that is excited by primary light and emits secondary light different from the primary light; and a coating layer formed on the phosphor-containing layer and transmitting the primary light and the secondary light.
  • With this configuration, when the phosphor-containing layer is irradiated with the primary light to draw an image, the wavelength of the primary light is converted in the phosphor-containing layer and the secondary light is emitted, so that the visibility of the road surface drawing is improved.
  • the vehicle driving support system of the present disclosure includes the road described above and a light irradiation unit that irradiates the road with the primary light.
  • According to the present disclosure, it is possible to provide a vehicle driving support system that controls the display of the course display image so that the course display image of the own vehicle does not optically affect other vehicles, particularly those located in front of the own vehicle.
  • FIG. 1 is a schematic view showing a road 1 and a vehicle driving support system 100 according to the first embodiment.
  • FIG. 2 is a schematic cross-sectional view showing the structure of the road 1 according to the first embodiment.
  • FIG. 3 is a block diagram showing the configuration of the vehicle driving support system 100.
  • FIG. 4A is an example of the presentation of guidance information in the vehicle driving support system 100, and is a schematic diagram showing the irradiation of light from the infrastructure side device 30.
  • FIG. 4B is an example of guidance information presentation in the vehicle driving support system 100, and is a schematic view showing light irradiation from the first vehicle 10 or the second vehicle 20.
  • FIG. 5 is a schematic cross-sectional view showing the structure of a modified example of the road 1.
  • FIG. 6 is a flowchart showing an operation example of the vehicle driving support system 100 according to the first embodiment.
  • FIG. 7 is a schematic view showing an operation example of the vehicle driving support system 200 according to the second embodiment on the road.
  • FIG. 8 is a schematic view showing an example of operation of the vehicle driving support system 300 according to the third embodiment on the road.
  • FIG. 9 is a schematic view showing an example of operation of the vehicle driving support system 400 according to the fourth embodiment on the road.
  • FIG. 10 is a schematic view showing an example of operation of the vehicle driving support system 500 according to the fifth embodiment on the road.
  • FIG. 11 is a schematic view showing an operation example of the vehicle driving support system 600 according to the sixth embodiment on the road.
  • FIG. 12 is a block diagram showing the configuration of the vehicle driving support system 600.
  • FIG. 13A is an example of light irradiation in the vehicle driving support system 600, and is a schematic view showing light irradiation from the transmitting side vehicle 20C.
  • FIG. 13B is an example of light irradiation in the vehicle driving support system 600, and is a schematic view showing light irradiation from the infrastructure side device 30.
  • FIG. 13C is an example of light irradiation in the vehicle driving support system 600, and is a schematic view showing light irradiation from a portable electronic device 40 held by a pedestrian.
  • FIG. 14 is a flowchart showing an operation example of the vehicle driving support system 600.
  • FIG. 15 is a schematic view showing an operation example of the vehicle driving support system 700 according to the seventh embodiment on the road.
  • FIG. 16 is a flowchart showing an operation example of the vehicle driving support system 700.
  • FIG. 17A is a front view of a vehicle traveling by using the vehicle driving support system 800 according to the eighth embodiment.
  • FIG. 17B is a rear view of a vehicle traveling by using the vehicle driving support system 800 according to the eighth embodiment.
  • FIG. 18 is a block diagram showing a configuration of the vehicle driving support system 800 according to the eighth embodiment.
  • FIG. 19 is a flowchart showing a flow of processing in which the vehicle driving support system 800 according to the eighth embodiment displays a course display image.
  • FIG. 20 is a schematic view showing a course and a course display image at an intersection of a plurality of vehicles according to the eighth embodiment.
  • FIG. 21 is a schematic view showing a course display image at an intersection of a vehicle of a ninth embodiment and a two-wheeled vehicle (another vehicle).
  • FIG. 22 is a schematic view showing the relationship between the vehicle of the tenth embodiment, the two-wheeled vehicle (other vehicle), and the infrastructure side device at the intersection.
  • FIG. 23 is a block diagram of the road surface drawing device 900 according to the eleventh embodiment.
  • FIG. 24 is a flowchart showing a flow of processing in which the road surface drawing device 900 according to the eleventh embodiment displays the course display image 6E.
  • FIG. 25 is a schematic view showing a course 7E and a course display image 6E when turning right at an intersection of the vehicle 10E in the eleventh embodiment.
  • FIG. 26 is a schematic view showing a method of selecting the course display image 6E in the eleventh embodiment.
  • FIG. 27A is a schematic view showing a course display image 6E at the position of the vehicle 10E in the intersection at the time of turning right in the eleventh embodiment, and shows a state before the vehicle 10E enters the intersection.
  • FIG. 27B is a schematic view showing a course display image 6E at the vehicle 10E position in the intersection when turning right in the eleventh embodiment, and shows a state in which the vehicle 10E is located near the center of the intersection.
  • FIG. 27C is a schematic view showing a course display image 6E at the position of the vehicle 10E in the intersection at the time of turning right in the eleventh embodiment, and shows a state in which the vehicle 10E has turned the intersection.
  • FIG. 28A is a schematic view showing a course display image 6E at the time of a right turn in the twelfth embodiment according to the position of the vehicle 10E, and shows a state before the vehicle 10E enters the intersection.
  • FIG. 28B is a schematic view showing a course display image 6E at the time of a right turn in the twelfth embodiment according to the position of the vehicle 10E, and shows a state in which the vehicle 10E is located near the center of the intersection.
  • FIG. 28C is a schematic view showing a course display image 6E at the time of a right turn in the twelfth embodiment according to the position of the vehicle 10E, and shows a state in which the vehicle 10E is located near the center of the intersection.
  • FIG. 28D is a schematic view showing the course display image 6E at the time of turning right in the twelfth embodiment according to the position of the vehicle 10E, and shows the state of the vehicle 10E at the position where the intersection is completely turned.
  • FIG. 29A shows a modified example of the course display image 6E in the twelfth embodiment, and is a schematic view showing the course display image 6A at the position where the vehicle 10E enters the intersection.
  • FIG. 29B shows a modified example of the course display image 6E in the twelfth embodiment, and is a schematic view showing the course display image 6EB when the vehicle 10E is located near the center of the intersection.
  • FIG. 29C shows a modified example of the course display image 6E in the twelfth embodiment, and is a schematic view showing the course display image 6EC when the vehicle 10E is located near the center of the intersection.
  • FIG. 29D shows a modified example of the course display image 6E in the twelfth embodiment, and is a schematic view showing the course display image 6ED at the position where the vehicle 10E has completely turned the intersection.
  • FIG. 30A shows the course display image 6E of the thirteenth embodiment, and is a schematic view of the intersection showing the course display image 6E at the intersection with the passage of time.
  • FIG. 30B shows the course display image 6E of the thirteenth embodiment, and is a schematic perspective view showing the projection of the vehicle 10E and the course display image 6E according to the time transition at the intersection.
  • FIG. 30C shows the course display image 6E of the thirteenth embodiment, and is a schematic perspective view showing a state of projection of the vehicle 10E and the course display image 6E according to the time transition at the intersection.
  • FIG. 30D shows the course display image 6E of the thirteenth embodiment, and is a schematic perspective view showing a state of projection of the vehicle 10E and the course display image 6E according to the time transition at the intersection.
  • FIG. 31A is a schematic view showing a course display image at an intersection in the 14th embodiment, and shows a course display image 61E at the time when the vehicle 10E trying to turn right enters the intersection.
  • FIG. 31B is a schematic view showing a course display image at an intersection in the 14th embodiment and showing a course display image 62E after the vehicle 10E detects a pedestrian.
  • FIG. 32A is a schematic view showing a course display image 6E when traveling straight in the fifteenth embodiment.
  • FIG. 32B is a schematic view showing a course display image 6E at the time of turning right in the fifteenth embodiment.
  • FIG. 32C is a schematic view showing a course display image 6E at the time of a left turn in the fifteenth embodiment.
  • FIG. 33A is a schematic view showing a course display image of the vehicle position in the intersection at the time of a right turn in the conventional example, and shows a state before the vehicle 10E enters the intersection.
  • FIG. 33B is a schematic view showing a course display image of the vehicle position in the intersection at the time of a right turn in the conventional example, and shows a state in which the vehicle 10E is located just before the center of the intersection.
  • FIG. 33C is a schematic view showing a course display image of the vehicle position in the intersection at the time of a right turn in the conventional example, and shows a state in which the vehicle 10E is located near the center of the intersection.
  • FIG. 33D is a schematic view showing a course display image of the vehicle position in the intersection when turning right in the conventional example, and showing the state of the vehicle 10E at the position where the intersection is completely turned.
  • FIG. 1 is a schematic view showing a road 1 and a vehicle driving support system 100 according to the present embodiment.
  • FIG. 1 shows an intersection where the road 1 intersects and the sidewalk 2 is provided.
  • the first vehicle 10 and the second vehicle 20 are running on the road 1 facing each other.
  • the infrastructure side device 30 is arranged on the area of the sidewalk 2.
  • The first vehicle 10 goes straight on the road 1 from the lower part to the upper part in the figure and then turns left, while the second vehicle 20 goes straight from the upper part to the lower part in the figure and then turns right.
  • In general, the left turn of the first vehicle 10 is prioritized, but depending on the situation of the first vehicle 10, the number of lanes, the speeds and distances on the road 1 at the confluence, the congestion of straight-ahead traffic, and the like, the second vehicle 20 may cross in front of the first vehicle 10 and turn right.
  • FIG. 1 shows driving in a region regulated for left-hand traffic; in a region regulated for right-hand traffic, the second vehicle 20 would turn left and the first vehicle 10 would turn right.
  • a phosphor coating region R is formed at an intersection and within a predetermined distance L from the intersection.
  • The predetermined distance L is, for example, a range extending 30 m from the intersection, corresponding to the distance at which a course change should be indicated with a turn signal.
  • In FIG. 1, the phosphor-coated region R is indicated by hatching, but as will be described later, the phosphor-coated region R itself may be uncolored.
  • In the phosphor-coated region R, guidance information M1 is drawn on the road surface of the road 1 in front of the first vehicle 10, and guidance information M2 is drawn on the road surface in front of the second vehicle 20. Further, road surface information M3 such as a pedestrian crossing and a stop line is drawn on the road 1 in the phosphor-coated region R, and guidance information M4 is also drawn on the sidewalk 2.
  • The guidance information M1 has an arrow shape that bends leftward from straight ahead in the traveling direction of the first vehicle 10, and the guidance information M2 has an arrow shape that bends rightward from straight ahead in the traveling direction of the second vehicle 20.
  • FIG. 1 shows an example of drawing an arrow image on a road surface as a method of presenting guidance information, but the shape of the image is not limited, and the drawing is not limited to the road surface.
  • the image may contain characters and icons.
  • the method of presenting the guidance information may be an image display using an image display device or a head-up display mounted on the vehicle, or may be voice guidance.
  • Road 1 is a route on which vehicles travel, and may be a paved road, an unpaved road, a public road, or a private road. Further, FIG. 1 shows an example of a crossroad having one lane on each side and an oncoming lane, but the number of lanes on one side and the shape of the intersection are not limited.
  • the road 1 and the vehicle driving support system 100 are not limited to the inside of the intersection.
  • The vehicle driving support system 100 is also used where more roads intersect, such as at a three-way junction or a five-way junction.
  • the vehicle driving support system 100 is not limited to turning left or right on the road, and is also used when turning left or right to cross an oncoming lane and enter private land such as a parking lot or a store.
  • The sidewalk 2 is a space provided along the road 1, on which vehicles do not travel, and is an area through which pedestrians pass. If the sidewalk 2 is not clearly separated from the road 1, the sidewalk 2 may be a roadside zone of the road 1. Further, the vehicle driving support system 100 does not necessarily have to include the sidewalk 2.
  • FIG. 1 only shows the sidewalk 2 as an example of an area where the infrastructure side device 30 is arranged, an area where a pedestrian walks, and an area where a portable electronic device possessed by a pedestrian is arranged.
  • the first vehicle 10 and the second vehicle 20 are vehicles traveling on the road 1.
  • the first vehicle 10 and the second vehicle 20 are preferably driving support vehicles in which a part of steering control and acceleration / deceleration control is performed by a computer or the like.
  • the vehicle driving support system 100 of the present embodiment can provide driving support even for a manually driven vehicle that does not have the driving support technology.
  • each vehicle includes a situation grasping unit, a driving support unit, a vehicle motion control unit, and an information / communication unit.
  • the situation grasping unit acquires information on the running condition and surrounding conditions of the vehicle.
  • The situation grasping unit may be any of various sensor devices that realize a driving support function, such as a vehicle speed sensor, a position sensor, an image capturing device, a laser ranging device, and a LIDAR (Light Detection and Ranging) device.
  • the traveling state acquired by the situation grasping unit includes the traveling speed, the position, the direction of the vehicle body, the steering angle, the brake operation, the traveling route by the car navigation system, the direction instruction by the conversation recognition in the vehicle, and the like.
  • The surrounding conditions acquired by the situation grasping unit include road surface conditions, ambient temperature, road maps from the car navigation system, road gradients, detection of surrounding objects by image recognition, the distances to and predicted behavior of preceding vehicles, oncoming vehicles, and following vehicles, detection of pedestrians by image recognition, and the like.
  • the driving support unit processes the information acquired by the situation grasping unit and outputs a driving control signal for supporting the driving of the vehicle to the vehicle motion control unit.
  • the vehicle motion control unit executes vehicle steering control and acceleration / deceleration control based on the driving control signal output from the driving support unit.
  • the vehicle motion control unit has a driving support function.
  • The vehicle motion control unit supports the driving of the first vehicle 10 or the second vehicle 20 by adjusting the output of the power source, operating the brakes, changing the steering angle, displaying driving guidance, and controlling the lighting of the turn signals and stop lamps.
  • the information and communication unit is connected to the driving support unit and the situation grasping unit, and performs information communication with the communication unit provided outside the vehicle.
  • the information and communication unit performs vehicle-to-vehicle communication between vehicles and road-to-vehicle communication with equipment provided on the road.
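The flow among the four units described above can be illustrated with a minimal sketch. All class and method names below are hypothetical stand-ins, not part of the disclosed system, and the sample sensor values and control rule are invented for illustration only.

```python
from dataclasses import dataclass

@dataclass
class ControlSignal:
    """Driving control signal passed from the driving support unit
    to the vehicle motion control unit."""
    steering_angle_deg: float
    acceleration_mps2: float

class SituationGraspingUnit:
    """Acquires the running condition and surrounding conditions."""
    def acquire(self):
        # A real vehicle would read sensors here (speed, position,
        # cameras, LIDAR); this returns a fixed sample state instead.
        return {"speed_kmh": 40.0, "pedestrian_ahead": False}

class DrivingSupportUnit:
    """Processes the grasped situation and outputs a control signal."""
    def process(self, state) -> ControlSignal:
        # Decelerate if a pedestrian is detected, otherwise keep course.
        accel = -3.0 if state["pedestrian_ahead"] else 0.0
        return ControlSignal(steering_angle_deg=0.0, acceleration_mps2=accel)

class VehicleMotionControlUnit:
    """Executes steering and acceleration/deceleration control."""
    def execute(self, signal: ControlSignal) -> str:
        return "braking" if signal.acceleration_mps2 < 0 else "cruising"

state = SituationGraspingUnit().acquire()
signal = DrivingSupportUnit().process(state)
action = VehicleMotionControlUnit().execute(signal)
```

The information and communication unit, which exchanges these states between vehicles and roadside equipment, is omitted from the sketch.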
  • the infrastructure side device 30 is provided on the road 1 or the sidewalk 2. As will be described later, the infrastructure side device 30 may irradiate the road surface of the road 1 with light to draw an image.
  • the specific configuration of the infrastructure side device 30 is not limited, and a dedicated lighting device may be arranged.
  • Infrastructure equipment such as street lights, traffic signals, and electronic bulletin boards may be given a light-irradiation function.
  • the infrastructure side device 30 may include an information communication unit for information communication with the first vehicle 10 or the second vehicle 20, and may enable road-to-vehicle communication with the vehicle.
  • the infrastructure side device 30 may include a situation grasping unit and a driving support unit in the same manner as the driving support vehicle.
  • FIG. 2 is a schematic cross-sectional view showing the structure of the road 1 according to the first embodiment.
  • a pavement surface 3 is formed on the ground of the road 1, and a phosphor-containing layer 4 and a coating layer 5 are laminated on the pavement surface 3.
  • The road 1 and the pavement surface 3 are schematically shown as a two-layer structure for simplification, but they may actually have a laminated structure including a road body, a roadbed, a constructed roadbed, a lower layer roadbed, an upper layer roadbed, a base layer, a surface layer, and the like.
  • the phosphor-containing layer 4 and the coating layer 5 may be formed by being laminated on the sidewalk 2.
  • The pavement surface 3 is the layer corresponding to the surface layer of the road 1 in regions other than the phosphor-coated region R; it is exposed at the surface and comes into contact with the tires of the first vehicle 10 and the second vehicle 20.
  • the material of the pavement surface 3 is not limited, and may be asphalt, concrete, interlocking, wood, brick, or the like.
  • the phosphor-containing layer 4 contains fine particles of a phosphor material that are excited by primary light and emit secondary light having a wavelength different from that of the primary light. Further, the phosphor-containing layer 4 contains a dispersion medium for dispersing the fluorescent material fine particles, and the fluorescent material fine particles are uniformly dispersed in the dispersion medium.
  • The phosphor-containing layer 4 is colored in a color corresponding to the absorption band of the phosphor material, but by reducing the concentration of the phosphor fine particles in the dispersion medium, the layer can also appear uncolored when viewed as a whole from a distance.
  • the material of the dispersion medium is not limited, but is a material that transmits primary light and secondary light, and includes acrylic resin, epoxy resin, silicone resin, polycarbonate, and the like.
  • the phosphor-containing layer 4 may contain a light scattering material for scattering primary light and secondary light.
  • The light scattering material contains fine particles having a refractive index different from that of the dispersion medium, and may be, for example, SiO2 or TiO2.
  • the fluorescent material contained in the fluorescent material-containing layer 4 is not limited, and a plurality of types of fluorescent materials may be included.
  • For example, the phosphor-containing layer 4 may emit amber light as the secondary light. Alternatively, the phosphor-containing layer 4 may contain a plurality of phosphor materials that are excited by green light and emit blue light and red light so that the mixed color of the primary light and the secondary light becomes white.
  • the phosphor-containing layer 4 may contain a plurality of phosphor materials that are excited by purple light and emit green light, blue light, or red light.
  • Examples of phosphor materials include a YAG system that emits yellow light ((Y,Gd)3(Al,Ga)5O12:Ce), a CASN system that emits red light (CaAlSiN3:Eu and the like), and a β-SiAlON system that emits green light (Si6-zAlzOzN8-z:Eu and the like).
  • the coating layer 5 is a layer formed by covering the surface of the phosphor-containing layer 4, and is made of a material that transmits primary light and secondary light.
  • The material of the coating layer 5 is not limited, and may include acrylic resin, epoxy resin, silicone resin, polycarbonate, and the like. Further, the coating layer 5 is not limited to one made entirely of a light-transmitting material, and may partially include a light-shielding material. Since the coating layer 5 covers the surface of the phosphor-containing layer 4 and comes into contact with the tires of vehicles traveling on the road 1, it protects the phosphor-containing layer 4, improves the durability of the road 1, and secures frictional force.
  • FIG. 3 is a block diagram showing the configuration of the vehicle driving support system 100.
  • The vehicle driving support system 100 includes a first vehicle 10, a second vehicle 20, a vehicle motion grasping unit 110, a guidance information creation unit 120, and a guidance information presentation unit 130.
  • The vehicle motion grasping unit 110, the guidance information creation unit 120, the driving support unit, and the situation grasping unit may be realized by a computer equipped with a central processing unit (CPU), a memory, an external storage device, and the like, executing predetermined information processing according to a program recorded in advance.
  • The vehicle motion grasping unit 110 grasps the motions of the first vehicle 10 and the second vehicle 20 in the area of the road 1, and acquires their respective situations and future operation schedules as first motion information and second motion information.
  • the vehicle motion grasping unit 110 is composed of a situation grasping unit, a driving support unit, a vehicle motion control unit, and a combination thereof provided in the first vehicle 10, the second vehicle 20, and the infrastructure side device 30 described above.
  • The vehicle motion grasping unit 110 grasps the running state of the own vehicle's operation using the operation of the direction indicators, the route information and map information of the car navigation system, conversation in the vehicle, and the like. Further, the vehicle motion grasping unit 110 grasps, as the surrounding conditions, the direction indicator operation of oncoming vehicles, the content of images drawn on the road surface, and the presence or absence of pedestrians, two-wheeled vehicles, and the like, by recognizing the image captured by the image capturing unit.
  • When the vehicle motion grasping unit 110 is configured by the infrastructure side device 30, it predicts the operation of the first vehicle 10 and the second vehicle 20 by recognizing the image captured by the image capturing unit provided in the infrastructure side device 30.
  • Alternatively, the vehicle motion grasping unit 110 predicts the operation of the first vehicle 10 and the second vehicle 20 by acquiring the information of the situation grasping unit, the driving support unit, and the vehicle motion control unit provided in each vehicle through the information communication unit.
  • The guidance information creation unit 120 creates guidance information M1 and M2 indicating the routes on which the first vehicle 10 and the second vehicle 20 are scheduled to travel, based on the first motion information and the second motion information acquired by the vehicle motion grasping unit 110.
  • the guidance information may include the timing at which the first vehicle 10 and the second vehicle 20 make a left turn and a right turn, respectively, and the lane position in the merging lane after the left turn and the right turn.
  • the guidance information presentation unit 130 is a road surface drawing device that draws the guidance information M1 and M2 and the road surface information M3 created by the guidance information creation unit 120 in the phosphor-containing region R of the road 1.
  • the guidance information presentation unit 130 may draw road surface information M3 such as a pedestrian crossing or a stop line, or guidance information M4 on the sidewalk 2.
  • The guidance information presentation unit 130 presents guidance information to the first vehicle 10 and the second vehicle 20 by drawing the guidance information M1 and M2 on the road surface, but the content of the image is not limited.
  • the presentation of the guidance information may include a display using an image display device in the vehicle and voice guidance.
  • FIGS. 4A and 4B are schematic views showing examples of guidance information presentation in the vehicle driving support system 100.
  • FIG. 4A shows the irradiation of light from the infrastructure side device 30.
  • FIG. 4B shows the irradiation of light from the first vehicle 10 or the second vehicle 20.
  • In the example of FIG. 4A, the infrastructure side device 30 has the light irradiation unit 31; in the example of FIG. 4B, the first vehicle 10 or the second vehicle 20 has the light irradiation units 11 and 21.
  • the light irradiation units 11, 21, and 31 are examples of the guidance information presentation unit 130.
  • the light irradiation units 11, 21, and 31 project and draw the guidance information M1 and M2 on the road surface of the road 1 based on the guidance information created by the guidance information creation unit 120.
  • The light irradiation units 11, 21, and 31 project and draw the guidance information M4 and the road surface information M3 on the road surface of the road 1 or the sidewalk 2 based on the guidance information created by the guidance information creation unit 120 (FIGS. 4A and 4B).
  • When the primary light is irradiated from the light irradiation units 11, 21, and 31 onto the phosphor-coated region R, the primary light passes through the coating layer 5 and reaches the phosphor-containing layer 4. Since the phosphor-containing layer 4 contains a phosphor material, at least a part of the primary light is wavelength-converted into the secondary light. The secondary light and the unconverted primary light are emitted to the outside of the road 1 through the coating layer 5. Therefore, in the phosphor-coated region R, the shapes of the guidance information M1 and M2 and the road surface information M3 are displayed in the color obtained by mixing the primary light and the secondary light. At this time, since the primary light and the secondary light are scattered by the phosphor fine particles and the light scattering material contained in the phosphor-containing layer 4, the light distribution becomes isotropic, and visibility is improved at various positions on the road 1 and the sidewalk 2.
  • The guidance information M1 and M2 and the road surface information M3 are displayed in a color different from the light emitted onto the road 1 by the light irradiation units 11, 21, and 31 and by the sun or the surrounding environment, and visibility is improved because the surface of the road 1 appears to be self-luminous.
  • the wavelength of the primary light is preferably green light or purple light having a short wavelength.
  • A white light emitting device combining a blue LED and a yellow phosphor emits light containing blue wavelengths. If the phosphor material contained in the phosphor-containing layer 4 were excited by blue light as the primary light, the secondary light would also be emitted unintentionally by the blue light contained in headlights and illumination lamps. Therefore, blue light is not preferable as the wavelength of the primary light.
  • As described above, the road 1 and the vehicle driving support system 100 of the present embodiment include the phosphor-containing layer 4 and the coating layer 5 covering the pavement surface 3, and draw an image by irradiating the phosphor-containing layer 4 with primary light. The wavelength of the primary light is converted in the phosphor-containing layer 4 to emit the secondary light, which improves the visibility of the road surface drawing.
  • FIG. 5 is a schematic cross-sectional view showing the structure of the road 1 according to the present embodiment.
  • The pavement surface 3 is formed on the ground of the road 1, and the phosphor-containing layer 4, the adhesive layer 6, and the coating layer 7 are laminated on the pavement surface 3.
  • the adhesive layer 6 is a member that is interposed between the phosphor-containing layer 4 and the coating layer 7 to bond the two, and may be an adhesive.
  • The coating layer 7 is a plate-shaped cover member formed on the adhesive layer 6, and has a fine uneven shape formed on its front surface and back surface.
  • FIG. 5 shows an example in which a concavo-convex shape is formed on the front and back surfaces of the coating layer 7, but the concavo-convex shape may be formed on either the front surface side or the back surface side.
  • Both the adhesive layer 6 and the coating layer 7 are made of a material that transmits primary light and secondary light.
  • an uneven shape may be formed in advance when the plate-shaped member is formed, or an uneven shape may be formed on a flat surface by sandblasting or the like.
  • An adhesive is applied onto the phosphor-containing layer 4, the plate-shaped coating layer 7 is arranged on it, and the adhesive is cured to form the adhesive layer 6; the adhesive layer 6 bonds the phosphor-containing layer 4 and the coating layer 7 to each other.
  • The shape and size of the unevenness formed on the coating layer 7 are not limited, but it is preferable that the width and height of the unevenness be larger than the wavelengths of the primary light and the secondary light so that both are scattered.
  • Since irregularities are formed on the coating layer 7, the light distribution characteristics of the primary light and the secondary light extracted from the phosphor-containing layer 4 become more isotropic, and visibility is improved at various positions on the road 1 and the sidewalk 2. Further, the unevenness on the surface of the coating layer 7 secures friction with the tires of a traveling vehicle.
  • the road 1 and the vehicle driving support system 100 of the present embodiment include a phosphor-containing layer 4, an adhesive layer 6, and a coating layer 7 that cover the pavement surface 3, and irradiate the phosphor-containing layer 4 with primary light.
  • the wavelength of the primary light is converted in the phosphor-containing layer 4 to emit the secondary light, and the visibility of road surface drawing is improved.
  • Since the coating layer 7 is formed with irregularities, the light distribution characteristics of the primary light and the secondary light emitted from the phosphor-containing layer 4 to the outside become more isotropic, and visibility is further improved.
  • the first vehicle 10 or the second vehicle 20 is a driving support vehicle.
  • The first vehicle 10 or the second vehicle 20 draws on the road surface by irradiating the primary light from the light irradiation units 11 and 21 only while traveling in the phosphor-coated region R, and does not irradiate the primary light while traveling in other regions.
  • the first vehicle 10 or the second vehicle 20 is a driving support vehicle, and has a situation grasping unit as in the first embodiment.
  • the surrounding conditions acquired by the situation grasping unit include the position information of the phosphor coating region R on the road 1.
  • The map information of the car navigation system includes the position information of the phosphor-coated region R, and the situation grasping unit detects the phosphor-coated region R at the current position or on the traveling route by collating the vehicle position with the map information.
  • the infrastructure-side device 30 arranged in the vicinity of the phosphor-coated region R may transmit the presence of the phosphor-coated region R to the first vehicle 10 or the second vehicle 20 by road-to-vehicle communication.
  • The situation grasping unit detects the existence of the phosphor-coated region R, and the light irradiation units 11 and 21 irradiate the road surface with the primary light. Since the phosphor-containing layer 4 is laminated in the phosphor-coated region R, the primary light is wavelength-converted into the secondary light by the phosphor material contained in the phosphor-containing layer 4, and the guidance information M1 and M2 are drawn on the road surface by the mixed color of the primary light and the secondary light.
  • Since the road surface is irradiated with the primary light from the light irradiation units 11 and 21 only in the phosphor-coated region R, and the primary light is not emitted in the other regions where the phosphor-containing layer 4 is not formed, power consumption can be reduced.
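The region-gated irradiation described above can be sketched as follows. The circular region model, coordinates, and function names are assumptions made for illustration; an actual system would obtain the region from car navigation map data or road-to-vehicle communication as described in the text.

```python
import math

# Phosphor-coated regions, as they might appear in (hypothetical) map
# data: each entry is (intersection x, intersection y, radius L in meters).
PHOSPHOR_REGIONS = [(0.0, 0.0, 30.0), (500.0, 120.0, 30.0)]

def in_phosphor_region(x, y, regions=PHOSPHOR_REGIONS):
    """Return True if the vehicle position lies inside any coated region R."""
    return any(math.hypot(x - cx, y - cy) <= r for cx, cy, r in regions)

def irradiation_on(x, y):
    """Light irradiation units 11/21 emit primary light only inside R,
    which is what saves power elsewhere."""
    return in_phosphor_region(x, y)

print(irradiation_on(10.0, 5.0))     # inside the first region
print(irradiation_on(100.0, 100.0))  # outside all regions
```

A production system would of course track the vehicle position continuously and hysteresis the on/off switching near the region boundary.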
  • FIG. 6 is a flowchart showing an operation example of the vehicle driving support system 100 according to the first embodiment.
  • the vehicle motion grasping unit 110 acquires the first motion information of the first vehicle 10 and acquires the second motion information of the second vehicle 20.
  • The situation grasping units provided in the first vehicle 10, the second vehicle 20, and the infrastructure side device 30 acquire information such as the congestion status of the road 1, the presence and movement of other traveling vehicles, pedestrians, and two-wheeled vehicles, and the road conditions and traffic conditions of the road 1 at the confluence.
  • The vehicle motion grasping unit 110 determines whether the first motion information includes a left turn of the first vehicle 10, the second motion information includes a right turn of the second vehicle 20, and the traveling directions after the left turn of the first vehicle 10 and the right turn of the second vehicle 20 are the same, that is, whether the two course changes merge into the same direction.
  • When the vehicle motion grasping unit 110 determines that the vehicles are merging, the vehicle driving support system 100 proceeds to step 3; when it determines that they are not merging, the system returns to the motion information acquisition step of step 1.
  • the vehicle motion grasping unit 110 may consider the overlap of the left turn timing of the first vehicle 10 and the right turn timing of the second vehicle 20.
  • The vehicle motion grasping unit 110 determines that the vehicles are merging when the left turn of the first vehicle 10 and the right turn of the second vehicle 20 are executed within a predetermined time of each other, and determines that they are not merging when there is an interval of the predetermined time or more between them.
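As a minimal sketch of this merge determination, assuming hypothetical motion-information dictionaries and an arbitrary 10-second stand-in for the "predetermined time":

```python
def is_merging(first_motion, second_motion, max_gap_s=10.0):
    """Judge whether a left turn of the first vehicle and a right turn of
    the second vehicle merge into the same direction within a time window.

    The motion dicts are illustrative only: {'turn': 'left'/'right'/
    'straight', 'heading_after_deg': heading after the turn, 'eta_s':
    estimated time of the turn}. The 15-degree heading tolerance and the
    10 s window are invented parameters, not values from the disclosure.
    """
    if first_motion["turn"] != "left" or second_motion["turn"] != "right":
        return False
    # Same traveling direction after the turns (allow small heading error).
    same_direction = abs(first_motion["heading_after_deg"]
                         - second_motion["heading_after_deg"]) < 15.0
    # Turns executed within the predetermined time of each other.
    close_in_time = abs(first_motion["eta_s"] - second_motion["eta_s"]) <= max_gap_s
    return same_direction and close_in_time

first = {"turn": "left", "heading_after_deg": 90.0, "eta_s": 4.0}
second = {"turn": "right", "heading_after_deg": 92.0, "eta_s": 7.0}
print(is_merging(first, second))  # both conditions hold
```

When either condition fails, the system simply keeps acquiring motion information, matching the loop back to step 1.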
  • The guidance information creation unit 120 creates, based on the first motion information and the second motion information, first guidance information M1 for guiding the movement of the first vehicle 10 and second guidance information M2 for guiding the movement of the second vehicle 20.
  • The first guidance information and the second guidance information each correspond to the guidance information in the present disclosure.
  • The guidance information creation unit 120 may also consider the information on the road 1 acquired in the motion information acquisition step. Specific examples of the first guidance information M1 and the second guidance information M2 will be described later.
  • The guidance information presentation unit 130 determines the presentation method based on the first guidance information M1 and the second guidance information M2, and presents them to the first vehicle 10 and the second vehicle 20.
  • the method of presenting the guidance information may include drawing on the road surface, displaying on the image display device in the vehicle, displaying on the head-up display, and voice guidance.
  • Guidance information may be presented by combining these methods.
  • Examples of operations of the vehicle motion grasping unit 110, the guidance information creating unit 120, and the guidance information presenting unit 130 include the following.
  • The vehicle driving support system 100 presents the left turn information of the first vehicle 10 first, and presents the right turn information of the second vehicle 20 after the left turn of the first vehicle 10 is completed.
  • Further, the vehicle driving support system 100 presents the lane after the left turn of the first vehicle 10 and the lane after the right turn of the second vehicle 20 so that they differ. Specifically, the vehicle driving support system 100 presents the first guidance information M1 to the first vehicle 10 so that it drives in the left lane after turning left, and presents the second guidance information M2 to the second vehicle 20 so that it drives in the right lane after turning right. Alternatively, the first vehicle 10 is presented with the first guidance information M1 so as to drive in the right lane after turning left, and the second vehicle 20 is presented with the second guidance information M2 so as to drive in the left lane after turning right.
  • A lane from which it is easy to shift to the subsequent route may be selected depending on whether the vehicle goes straight after the left or right turn or turns left or right again. Further, since the turning radius becomes large when the first vehicle 10 is a large vehicle, the first guidance information M1 may be presented so that it drives in the right lane after turning left.
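The complementary lane assignment described above might be sketched as follows; the rule set, parameters, and function name are an illustrative reading of the text, not the disclosed algorithm.

```python
def assign_merge_lanes(first_is_large=False, first_next_turn=None):
    """Assign different post-merge lanes to the left-turning first vehicle
    and the right-turning second vehicle so their paths do not conflict.

    Illustrative rules: a large first vehicle (big turning radius) or an
    upcoming right turn favors the right lane; the second vehicle always
    takes the remaining lane so both turns can complete simultaneously.
    """
    if first_is_large or first_next_turn == "right":
        first_lane = "right"
    else:
        first_lane = "left"
    second_lane = "left" if first_lane == "right" else "right"
    return first_lane, second_lane

print(assign_merge_lanes())                     # default complementary lanes
print(assign_merge_lanes(first_is_large=True))  # large vehicle swings wide
```

Because the two vehicles are always placed in different lanes, the assignment never produces a conflict regardless of the inputs.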
  • the presentation timing of the first guidance information M1 and the presentation timing of the second guidance information M2 may overlap.
  • Since the traveling lanes of the first vehicle 10 and the second vehicle 20 after merging are different, even if the timings at which the two vehicles enter the merged road 1 overlap, the vehicle driving support system 100 supports smooth traffic by completing the left and right turns of the two vehicles in a short time while avoiding a collision between them.
  • the vehicle motion grasping unit 110 acquires the first motion information and the second motion information.
  • The guidance information creation unit 120 creates, as guidance information, the route on which the first vehicle 10 or the second vehicle 20 is to travel.
  • the guidance information presentation unit 130 presents guidance information.
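The three steps recapped above (acquire motion information, create guidance, present guidance) can be strung together in a toy pipeline. Every function below is a hypothetical stand-in for the corresponding unit (110, 120, 130), with fixed sample data in place of real sensing and drawing.

```python
def acquire_motion(vehicle_id):
    # Stand-in for the vehicle motion grasping unit 110: returns the
    # planned course change of each vehicle (fixed sample data here).
    plans = {"first": "left", "second": "right"}
    return {"vehicle": vehicle_id, "turn": plans[vehicle_id]}

def create_guidance(first_motion, second_motion):
    # Stand-in for the guidance information creation unit 120: the left
    # turn is guided first and the right-turning vehicle is told to wait.
    assert first_motion["turn"] == "left" and second_motion["turn"] == "right"
    return {"M1": "turn-left arrow", "M2": "stop, then turn-right arrow"}

def present_guidance(guidance):
    # Stand-in for the guidance information presentation unit 130
    # (road surface drawing, in-vehicle display, or voice guidance).
    return [f"draw {name}: {image}" for name, image in sorted(guidance.items())]

m1 = acquire_motion("first")
m2 = acquire_motion("second")
commands = present_guidance(create_guidance(m1, m2))
```

In the real system each stage would run continuously and feed back into step 1 whenever the merge determination fails.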
  • FIG. 7 is a schematic view showing an operation example of the vehicle driving support system 200 according to the second embodiment on the road.
  • FIG. 7 shows a case where, as in the first embodiment, the first vehicle 10 turns left and the second vehicle 20 turns right at an intersection where the road 1 intersects and the sidewalk 2 is provided, merging and traveling in the same direction.
  • This embodiment shows a case where the left turn of the first vehicle 10 is prioritized and the right turn timing of the second vehicle 20 is delayed.
  • the guidance information is selected by the guidance information creation unit 120 in the guidance information creation step of step 3 (FIG. 6).
  • The conditions for stopping the second vehicle 20 include, for example, a case where the road 1 after merging has only one lane, a case where the first vehicle 10 is a large vehicle and the left turn takes time, and a case where the road 1 after merging is congested.
  • the guidance information presenting unit 130 presents the first guidance information M1 for turning left to the first vehicle 10 and the second guidance information M2 for stopping to the second vehicle 20.
  • The presentation of the first guidance information M1 and the second guidance information M2 continues until the above-described condition for stopping the second vehicle 20 is resolved; after the condition is resolved, the second guidance information M2 for a right turn is presented as shown in FIG.
  • FIG. 7 shows an example in which the characters “STOP” are drawn on the road surface as the second guidance information M2 indicating the stop of the second vehicle 20, but a figure such as an icon may be presented in order to improve visibility. Voice guidance may be used.
  • the vehicle driving support system 200 of the present embodiment explicitly instructs the second vehicle 20 to stop and then turn right, so that smooth traffic can be supported while ensuring safety even when a plurality of vehicles merge at an intersection.
  • FIG. 8 is a schematic view showing an example of operation of the vehicle driving support system 300 according to the third embodiment on the road.
  • FIG. 8 shows a case where, as in the first embodiment, the first vehicle 10 turns left and the second vehicle 20 turns right at an intersection where the roads 1 intersect and the sidewalk 2 is provided, and the two vehicles merge and travel in the same direction.
  • This embodiment shows a case where the left turn schedule of the first vehicle 10 is presented in advance, from before the intersection.
  • as in the first embodiment, the first guidance information M1 for turning left is presented.
  • the predetermined distance L2 is longer than the predetermined distance L1, and while the first vehicle 10 is traveling between the predetermined distances L2 and L1, before reaching the predetermined distance L1, a left turn preparation indication is presented as the first guidance information M1.
  • Guidance information (first guidance information M1) is selected by the guidance information creation unit 120 in the guidance information creation step of step 3 based on the position information and map information of the first vehicle 10 (FIG. 6).
  • the predetermined distance L1 is, for example, a distance of 5 m from the intersection, and the predetermined distance L2 is a distance of 30 m from the intersection.
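The selection of the first guidance information M1 according to the distances L1 and L2 can be sketched as follows. The thresholds of 5 m and 30 m are taken from the example above; the function name and labels are illustrative assumptions, not part of the disclosure.

```python
# Hypothetical sketch of distance-zone guidance selection by the
# guidance information creation unit 120. Labels are illustrative.

def select_first_guidance(distance_to_intersection_m: float,
                          l1_m: float = 5.0, l2_m: float = 30.0) -> str:
    """Pick the first guidance information M1 from the vehicle's distance."""
    if distance_to_intersection_m <= l1_m:
        return "LEFT_TURN"              # within L1: present the left turn
    if distance_to_intersection_m <= l2_m:
        return "LEFT_TURN_PREPARATION"  # between L2 and L1: preparation
    return "NONE"                       # beyond L2: nothing yet
```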
  • when the first guidance information M1 for the left turn is a character or an image indicating a left turn, an image different from the first guidance information M1 for the left turn is used as the first guidance information M1 for the left turn preparation.
  • Different images include images that differ in color, brightness, line width, solid versus dashed lines, blinking versus steady lighting, and the like.
  • the guidance information creation unit 120 may calculate the time required for the first vehicle 10 to reach the intersection and start a left turn based on the speed of the first vehicle 10, the distance to the intersection, and the deceleration.
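The time calculation mentioned above can be sketched, under the assumption of uniform deceleration, as follows. The kinematic model and all identifiers are illustrative assumptions, not part of the disclosure.

```python
import math
from typing import Optional

# Hypothetical sketch: time for the first vehicle 10 to cover the
# remaining distance to the intersection while decelerating uniformly.
# Solves d = v*t - (a/2)*t**2 for the smaller positive root.

def time_to_intersection(speed_mps: float, distance_m: float,
                         decel_mps2: float) -> Optional[float]:
    """Return the arrival time in seconds, or None if the vehicle
    would stop before reaching the intersection."""
    if decel_mps2 == 0.0:
        return distance_m / speed_mps
    disc = speed_mps ** 2 - 2.0 * decel_mps2 * distance_m
    if disc < 0.0:
        return None  # vehicle stops short of the intersection
    return (speed_mps - math.sqrt(disc)) / decel_mps2
```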
  • before presenting the first guidance information M1 for the left turn, the guidance information presenting unit 130 may present a right turn to the second vehicle 20 as the second guidance information M2.
  • by presenting the left turn of the first vehicle 10 from before the intersection, the vehicle driving support system 300 of the present embodiment can communicate the scheduled left turn to the second vehicle 20, other surrounding vehicles, pedestrians, motorcycles, and the like, calling their attention. Even when a plurality of vehicles meet at an intersection, the vehicle driving support system 300 can support smooth traffic while ensuring safety. Further, by informing the second vehicle 20 of the left turn schedule, the second vehicle 20 can be encouraged to make its right turn first, so that even smoother traffic can be supported.
  • FIG. 9 is a schematic view showing an example of operation of the vehicle driving support system 400 according to the fourth embodiment on the road.
  • FIG. 9 shows a case where, as in the first embodiment, the first vehicle 10 turns left and the second vehicle 20 turns right at an intersection where the roads 1 intersect and the sidewalk 2 is provided, and the two vehicles merge and travel in the same direction.
  • the two-wheeled vehicle 40 is running in parallel on the left side of the first vehicle 10 that turns left at the intersection.
  • Examples of the two-wheeled vehicle 40 include a bicycle and a motorcycle, but it may also be a three-wheeled vehicle, an ultra-compact vehicle, a runner on the roadside strip, or any other object moving in parallel alongside the first vehicle 10.
  • the vehicle motion grasping unit 110 acquires, in addition to the first motion information of the first vehicle 10 and the second motion information of the second vehicle 20, the status of the surrounding road 1 and sidewalk 2.
  • the vehicle motion grasping unit 110 determines that the first vehicle 10 and the second vehicle 20 will merge by turning left and right respectively, that the two-wheeled vehicle 40 is traveling on the left side of the first vehicle 10, and that the two-wheeled vehicle 40 will reach the intersection at the timing when the first vehicle 10 turns left.
  • the guidance information creation unit 120 creates the first guidance information M1a and M1b indicating the left turn and stop of the first vehicle 10, and the second guidance information M2 indicating the right turn of the second vehicle 20.
  • the guidance information presentation unit 130 presents the first guidance information M1a and M1b indicating a left turn and a stop to the first vehicle 10, and presents the second guidance information M2 indicating a right turn to the second vehicle 20.
  • the guidance information presenting unit 130 presents the second guidance information M2 for turning right to the second vehicle 20; however, when the two-wheeled vehicle 40 goes straight through the intersection, the second guidance information M2 indicating a stop may be presented as shown in FIG.
  • the guidance information presenting unit 130 may present the first guidance information M1a and M1b at the same time, but may instead first present the first guidance information M1b indicating a stop and then, after the parallel running state of the two-wheeled vehicle 40 is resolved, present the first guidance information M1a indicating a left turn.
  • the guidance information presenting unit 130 may use different images for the first guidance information M1a and M1b. Different images include images that differ in color, brightness, line width, solid versus dashed lines, blinking versus steady lighting, and the like.
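The guidance selection in this embodiment, reflecting the parallel-running two-wheeled vehicle 40, can be sketched as follows; the function name and guidance labels are illustrative assumptions, not part of the disclosure.

```python
# Hypothetical sketch of the fourth embodiment's guidance logic:
# hold the first vehicle with a stop indication while a two-wheeler
# runs alongside, and stop the second vehicle if the two-wheeler
# goes straight through the intersection.

def create_guidance(two_wheeler_alongside: bool,
                    two_wheeler_goes_straight: bool) -> dict:
    first = "M1b_STOP" if two_wheeler_alongside else "M1a_LEFT_TURN"
    second = "M2_STOP" if two_wheeler_goes_straight else "M2_RIGHT_TURN"
    return {"first_vehicle": first, "second_vehicle": second}
```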
  • the vehicle driving support system 400 of the present embodiment detects the parallel running of the two-wheeled vehicle 40 or the like and suspends the left turn of the first vehicle 10, preventing an accident involving the two-wheeled vehicle 40, and can support smooth traffic while ensuring safety even when a plurality of vehicles merge at an intersection.
  • FIG. 10 is a schematic view showing an example of operation of the vehicle driving support system 500 according to the fifth embodiment on the road.
  • FIG. 10 shows a case where, as in the first embodiment, the first vehicle 10 turns left and the second vehicle 20 turns right at an intersection where the roads 1 intersect and the sidewalk 2 is provided, and the two vehicles merge and travel in the same direction.
  • a pedestrian crossing is provided in the lane where the first vehicle 10 and the second vehicle 20 meet.
  • FIG. 10 shows a case where the pedestrian 50 is alerted when the first vehicle 10 starts to turn left or when the second vehicle 20 starts to turn right.
  • the crossing of the pedestrian 50 is prioritized over the left turn of the first vehicle 10 and the right turn of the second vehicle 20.
  • the vehicle driving support system 500 presents alert information around the pedestrian crossing to give a preliminary warning.
  • the method of presenting the alert information around the pedestrian crossing may be the road surface drawing from the infrastructure side device 30 as shown in FIG. 4A.
  • the infrastructure side device 30 may be provided with a speaker, and the speaker may indicate a left turn of the first vehicle 10 and a right turn of the second vehicle 20 by voice guidance, and make an announcement calling attention.
  • the alert information may also be presented as an image or sound on a portable electronic device by using the information communication units provided in the first vehicle 10, the second vehicle 20, and the infrastructure side device 30.
  • thus, even when the left turn of the first vehicle 10 and the right turn of the second vehicle 20 are permitted and the turning operations have been started, attention can be drawn to a pedestrian 50 who suddenly tries to cross the pedestrian crossing, so that smooth traffic can be supported while ensuring safety even when a plurality of vehicles meet at an intersection.
  • FIG. 11 is a schematic view showing an example of operation of the vehicle driving support system 600 according to the sixth embodiment on the road.
  • FIG. 11 shows an intersection where the road 1 intersects and the sidewalk 2 is provided.
  • the receiving side vehicle 10C as the first vehicle and the transmitting side vehicle 20C as the second vehicle are running on the road 1 facing each other.
  • in the area of the sidewalk 2, an infrastructure side device 30 and a portable electronic device 60 held by a pedestrian are arranged.
  • the receiving side vehicle 10C is trying to go straight on the road 1 from the lower side to the upper side in the figure, and the transmitting side vehicle 20C is trying to turn right from the upper side in the figure to the left side in the figure.
  • the receiving side vehicle 10C is a vehicle traveling on the road 1 and is a driving support vehicle in which a part of steering control and acceleration / deceleration control is performed by a computer or the like.
  • the receiving side vehicle 10C takes a route traveling straight on the road 1, but when the transmitting side vehicle 20C tries to cross in front of it to make a right turn, the receiving side vehicle 10C detects an optical signal described later and performs a stop operation or a deceleration operation.
  • the transmitting side vehicle 20C is a vehicle traveling on the road 1.
  • the transmitting vehicle 20C may be a driving support vehicle, but may be a manually driven vehicle that does not have a driving support function.
  • in FIG. 11, when the transmitting side vehicle 20C turns right on the road 1 and tries to cross in front of the receiving side vehicle 10C, it irradiates the receiving side vehicle 10C with an optical signal described later, causing the receiving side vehicle 10C to perform a stop operation or a deceleration operation.
  • the infrastructure side device 30 is a device that irradiates the receiving side vehicle 10C with an optical signal.
  • the infrastructure side device 30 has a function of grasping the situation of vehicles on the road 1 as described later, irradiates the receiving side vehicle 10C with an optical signal described later according to the situation, and causes the receiving side vehicle 10C to perform a stop operation or a deceleration operation.
  • the portable electronic device 60 is, for example, a portable electronic device or a lighting device that can be carried by a pedestrian on the sidewalk 2.
  • the specific configuration of the portable electronic device 60 is not limited.
  • the portable electronic device 60 has at least a function of irradiating light, and may be in the form of a flashlight or a portable communication device.
  • the portable electronic device 60 irradiates the receiving side vehicle 10C with an optical signal by being operated by a pedestrian, and causes the receiving side vehicle 10C to perform a stop operation or a deceleration operation.
  • FIG. 12 is a block diagram showing the configuration of the vehicle driving support system 600.
  • the receiving vehicle 10C includes a detection unit 11C, a driving support unit 12, a vehicle motion control unit 13, a situation grasping unit 14, and an information communication unit 15.
  • the transmitting side vehicle 20C, the infrastructure side device 30, and the portable electronic device 60 include light irradiation units 21, 31, and 61, situation grasping units 22, 32, 62, and information and communication units 23, 33, 63, respectively.
  • the driving support unit 12, the vehicle motion control unit 13, and the situation grasping units 14, 22, 32, and 62 are implemented on a computer equipped with a central processing unit (CPU), a memory, an external storage device, and the like, and predetermined information processing is executed by a program recorded in advance.
  • the detection unit 11C detects light from the outside of the vehicle, converts it into an electrical signal, and transmits the converted signal to the driving support unit 12.
  • the specific configuration of the detection unit 11C is not limited.
  • the detection unit 11C may be an optical sensor or an image imaging device.
  • the wavelength of the light detected by the detection unit 11C is not limited, and may be infrared light, visible light, ultraviolet light, or white light.
  • the driving support unit 12 processes information on the traveling state and surrounding conditions acquired from the situation grasping unit 14 and the information communication unit 15 in order to support the driving of the receiving side vehicle 10C, and outputs a driving control signal for controlling the operation of the receiving side vehicle 10C to the vehicle motion control unit 13.
  • when the light detected by the detection unit 11C includes a predetermined optical signal, the driving support unit 12 outputs a driving control signal indicating a stop operation or a deceleration operation to the vehicle motion control unit 13.
  • the vehicle motion control unit 13 executes steering control and acceleration / deceleration control of the receiving side vehicle 10C based on the driving control signal output from the driving support unit 12.
  • the vehicle motion control unit 13 has a driving support function and supports the operation of the receiving side vehicle 10C by adjusting the output of the power source, operating the brakes, changing the steering angle, displaying driving guidance, controlling the lighting of the turn signals and stop lamps, and the like.
  • the situation grasping unit 14 acquires information on the running state and surrounding conditions of the receiving vehicle 10C and transmits it to the driving support unit 12.
  • the situation grasping unit 14 is provided with various sensor devices such as a vehicle speed sensor, a position sensor, an image capturing device, a laser range finder, and a LiDAR (Light Detection and Ranging) sensor.
  • the traveling state acquired by the situation grasping unit 14 includes the traveling speed, the position, the orientation of the vehicle body, the steering angle, the brake operation, the traveling route from the car navigation system, direction instructions obtained by speech recognition in the vehicle, and the like.
  • the surrounding conditions acquired by the situation grasping unit 14 include the road surface condition, the ambient temperature, the road map from the car navigation system, the slope of the road, the detection of surrounding objects by image recognition, the inter-vehicle distances to and behavior predictions of the preceding vehicle, oncoming vehicles, and following vehicles, and the detection of pedestrians by image recognition.
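The traveling state and surrounding conditions handled by the situation grasping unit 14 might be held in containers such as the following; the field names and types are illustrative assumptions only, not from the disclosure.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

# Hypothetical containers for the information handled by the situation
# grasping unit 14. All field names and units are illustrative.

@dataclass
class TravelingState:
    speed_kmh: float
    position: Tuple[float, float]        # e.g. latitude / longitude
    heading_deg: float
    steering_angle_deg: float
    brake_applied: bool
    nav_route: List[str] = field(default_factory=list)

@dataclass
class SurroundingConditions:
    road_surface: str                    # e.g. "dry", "wet"
    ambient_temp_c: float
    road_slope_pct: float
    detected_objects: List[str] = field(default_factory=list)
    lead_vehicle_gap_m: Optional[float] = None
```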
  • the information communication unit 15 is connected to the driving support unit 12 and the situation grasping unit 14, and performs information communication with communication units provided outside the receiving side vehicle 10C.
  • the information communication unit 15 communicates by radio waves or light.
  • the information communication unit 15 performs vehicle-to-vehicle communication with other vehicles and road-to-vehicle communication with equipment provided on the road to obtain information such as the traveling state and surrounding conditions.
  • the detection unit 11C may detect light not related to the operation of the vehicle driving support system 600, but is configured to detect a predetermined optical signal related to the operation of the vehicle driving support system 600.
  • the predetermined optical signal includes, for example, a pulse signal having a specific wavelength or a specific waveform. Further, the predetermined optical signal may have a light intensity exceeding the dynamic range of the detection unit 11C in the wavelength range that can be detected by the detection unit 11C.
  • the information on these predetermined optical signals is recorded in the driving support unit 12, or a processing procedure for when light intensity exceeding the dynamic range is received is recorded in the driving support unit 12.
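As one hedged illustration, matching a detected sample stream against a predetermined pulse waveform could look like the following; the pattern, binarization scheme, and names are assumptions, not part of the disclosure.

```python
# Hypothetical sketch of how the driving support unit 12 might check
# whether the light detected by the detection unit 11C contains the
# predetermined pulse waveform. The pattern is illustrative only.

PREDETERMINED_PULSE = [1, 0, 1, 1, 0, 1]   # assumed pulse waveform

def contains_signal(samples, pattern=PREDETERMINED_PULSE) -> bool:
    """Return True if the binarized sample stream contains the pattern."""
    n = len(pattern)
    return any(samples[i:i + n] == pattern
               for i in range(len(samples) - n + 1))
```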
  • the light irradiation units 21, 31, and 61 are provided in the transmission side vehicle 20C, the infrastructure side device 30, and the portable electronic device 60, respectively, and irradiate the detection unit 11C with a predetermined light signal.
  • the predetermined optical signal is a specific wavelength, a specific signal waveform, a light intensity exceeding a dynamic range, or the like.
  • FIGS. 11 and 12 show an example in which the transmitting side vehicle 20C, the infrastructure side device 30, and the portable electronic device 60 are all arranged on the road 1 or the sidewalk 2. However, the vehicle driving support system 600 functions as long as any one of the transmitting side vehicle 20C, the infrastructure side device 30, or the portable electronic device 60 is on the road 1 or the sidewalk 2.
  • the situation grasping units 22, 32, and 62 are provided in the transmitting side vehicle 20C, the infrastructure side device 30, and the portable electronic device 60, respectively, and may have the same configuration as the situation grasping unit 14.
  • the situation grasping units 22, 32, and 62 acquire information on the traveling state and surrounding conditions of vehicles and pedestrians, respectively, and transmit the information to the outside via the information communication units 23, 33, and 63. Further, the situation grasping units 22, 32, and 62 may acquire the traveling state and surrounding conditions from the outside via the information communication units 23, 33, and 63.
  • the situation grasping units 22, 32, and 62 evaluate the acquired traveling state and surrounding conditions, and when predetermined irradiation conditions are satisfied, cause the light irradiation units 21, 31, and 61 to irradiate the detection unit 11C with an optical signal.
  • the information communication units 23, 33, and 63 are provided in the transmitting side vehicle 20C, the infrastructure side device 30, and the portable electronic device 60, respectively, and perform information communication with communication units provided externally. The information communication units 23, 33, and 63 each communicate by radio waves or light.
  • FIG. 12 shows an example in which the transmitting side vehicle 20C, the infrastructure side device 30, and the portable electronic device 60 are provided with the situation grasping units 22, 32, and 62 and the information communication units 23, 33, and 63; however, since these units are not essential to the operation of the vehicle driving support system 600, they can be omitted.
  • FIGS. 13A to 13C are schematic views showing an example of light irradiation in the vehicle driving support system 600.
  • FIG. 13A shows the irradiation of light from the transmitting vehicle 20C.
  • FIG. 13B shows the irradiation of light from the infrastructure side device 30.
  • FIG. 13C shows the irradiation of light from the portable electronic device 60 held by the pedestrian.
  • the light irradiation units 21, 31, and 61 of the transmitting side vehicle 20C, the infrastructure side device 30, or the portable electronic device 60 irradiate the detection unit 11C of the receiving side vehicle 10C with the predetermined optical signal.
  • when the light detected by the detection unit 11C of the receiving side vehicle 10C matches, at least in part, the predetermined optical signal, the driving support unit 12 outputs a driving control signal indicating a stop operation or a deceleration operation to the vehicle motion control unit 13, decelerating or stopping the receiving side vehicle 10C.
  • the transmitting side vehicle 20C may be a manually driven vehicle that does not have a driving support function.
  • by performing a passing (headlight flash) operation or operating a dedicated switch in the transmitting side vehicle 20C so that the light irradiation unit 21 irradiates the detection unit 11C with the predetermined optical signal, the vehicle driving support system 600 can activate the stop operation or deceleration operation by the driving support unit 12 of the receiving side vehicle 10C.
  • the light irradiation unit 21 of the transmitting vehicle 20C may be a headlight, a fog lamp, a decorative light, or the like, and the vehicle driving support system 600 can be used more easily than adding a driving support function to the vehicle.
  • the infrastructure side device 30 grasps the running state and surrounding conditions of the receiving side vehicle 10C and the transmitting side vehicle 20C by the situation grasping unit 32 or the information communication unit 33.
  • the infrastructure side device 30 determines that the irradiation condition is satisfied when the receiving side vehicle 10C is an oncoming vehicle of the transmitting side vehicle 20C and the transmitting side vehicle 20C crosses in front of the receiving side vehicle 10C in a right turn or left turn operation, and causes the light irradiation unit 31 to irradiate the detection unit 11C with the optical signal.
  • the driving support unit 12 outputs a driving control signal indicating a stop operation or a deceleration operation to the vehicle operation control unit 13 to decelerate or stop the receiving side vehicle 10C.
  • the conditions under which the infrastructure side device 30 determines to irradiate light also include the case shown in FIG. 11, where a pedestrian crossing exists in front of the receiving side vehicle 10C, the receiving side vehicle 10C is predicted to continue straight ahead, and a pedestrian is determined to be crossing in front of the receiving side vehicle 10C.
  • the portable electronic device 60 possessed by the pedestrian on the sidewalk 2 is operated to irradiate the detection unit 11C with an optical signal from the light irradiation unit 61.
  • the pedestrian can positively act on the receiving side vehicle 10C, which is a driving support vehicle, to activate the stopping operation and the decelerating operation of the receiving side vehicle 10C.
  • the portable electronic device 60 may grasp the traveling state and surrounding conditions of the receiving side vehicle 10C by the situation grasping unit 62 or the information communication unit 63.
  • the irradiation of the optical signal is executed only when the situation grasping unit 62 determines that the irradiation condition is satisfied.
  • the irradiation condition includes a case where a pedestrian crossing exists in front of the receiving side vehicle 10C, the receiving side vehicle 10C is predicted to continue the straight-ahead operation, and a pedestrian crosses the pedestrian crossing.
  • FIG. 14 is a flowchart showing an operation example of the vehicle driving support system 600.
  • the receiving side vehicle 10C executes the driving support function and is traveling as the driving support vehicle, and the transmitting side vehicle 20C has the driving support function. Further, the flowchart of FIG. 14 shows the irradiation conditions of the optical signal in the transmitting side vehicle 20C.
  • in step 1C, the situation grasping unit 22 determines from the traveling state of the transmitting side vehicle 20C whether a right or left turn is being made; in the case of a right or left turn, the process proceeds to step 2C, and otherwise returns to step 1C.
  • in step 2C, the situation grasping unit 22 determines the presence or absence of an oncoming vehicle; when it predicts that the receiving side vehicle 10C, an oncoming vehicle, is going straight, the process proceeds to step 3C, and otherwise returns to step 1C.
  • in step 3C, the light irradiation unit 21 irradiates the detection unit 11C with the optical signal, and the process returns to step 1C.
  • since the transmitting side vehicle 20C determines the irradiation of the optical signal according to its traveling state, the optical signal is irradiated automatically when the irradiation condition is satisfied, without any special operation by the driver of the transmitting side vehicle 20C, and the stop operation or deceleration operation of the receiving side vehicle 10C can be activated.
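The flowchart of FIG. 14 can be condensed into a sketch such as the following; the function name and arguments are illustrative assumptions, not part of the disclosure.

```python
# Hypothetical sketch of the FIG. 14 irradiation-condition check in
# the transmitting side vehicle 20C.

def should_irradiate(turning: bool, oncoming_going_straight: bool) -> bool:
    # Step 1C: proceed only during a right or left turn.
    if not turning:
        return False
    # Step 2C: proceed only if an oncoming vehicle (the receiving side
    # vehicle 10C) is predicted to go straight.
    if not oncoming_going_straight:
        return False
    # Step 3C: irradiate the optical signal.
    return True
```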
  • in this way, simply by irradiating the optical signal from the light irradiation units 21, 31, and 61, it is possible to act positively on the receiving side vehicle 10C traveling under driving support technology, activating its stop operation or deceleration operation and supporting smooth traffic.
  • FIG. 15 is a schematic view showing an example of operation of the vehicle driving support system 700 according to the seventh embodiment on the road.
  • the road 1 and the sidewalk 2 extend in a straight line.
  • the transmitting side vehicle 20C is traveling in front of the receiving side vehicle 10C.
  • the infrastructure side device 30 is arranged on the area of the sidewalk 2.
  • the receiving side vehicle 10C and the transmitting side vehicle 20C are trying to go straight on the road 1 from the lower side to the upper side in the figure.
  • an optical signal is irradiated from the rear of the transmitting side vehicle 20C to activate the stop operation or deceleration operation of the receiving side vehicle 10C, encouraging danger avoidance and the maintenance of a comfortable inter-vehicle distance.
  • FIG. 16 is a flowchart showing an operation example of the vehicle driving support system 700.
  • the receiving side vehicle 10C and the transmitting side vehicle 20C have a driving support function. Further, the flowchart of FIG. 16 shows the irradiation conditions of the optical signal in the transmitting side vehicle 20C.
  • in step 11, the situation grasping unit 22 determines from the traveling state of the transmitting side vehicle 20C whether it is going straight; if so, the process proceeds to step 12, and otherwise returns to step 11.
  • in step 12, the situation grasping unit 22 determines whether there is a following vehicle; if it determines that the receiving side vehicle 10C is present as a following vehicle, the process proceeds to step 13, and otherwise returns to step 11.
  • in step 13, the situation grasping unit 22 measures the inter-vehicle distance between the transmitting side vehicle 20C and the receiving side vehicle 10C; if the distance is less than or equal to a certain value, the process proceeds to step 14, and otherwise returns to step 11.
  • in step 14, the situation grasping unit 22 measures the time during which the inter-vehicle distance has remained at or below the certain value; if this has continued for a certain time or more, the process proceeds to step 15, and otherwise returns to step 11.
  • in step 15, the light irradiation unit 21 irradiates the detection unit 11C with the optical signal, and the process returns to step 11.
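The loop of FIG. 16 (steps 11 to 15) can be sketched as follows; the sampling interval, distance threshold, and hold time are illustrative constants, not values from the disclosure.

```python
# Hypothetical sketch of the FIG. 16 loop: irradiate the optical signal
# when a following vehicle has kept an inter-vehicle distance at or
# below a threshold for a minimum continuous time.

def irradiation_times(gaps_m, dt_s=1.0, max_gap_m=10.0, min_hold_s=3.0):
    """Return sample indices at which the optical signal would fire."""
    fired, held = [], 0.0
    for i, gap in enumerate(gaps_m):          # steps 12-13: following gap
        held = held + dt_s if gap <= max_gap_m else 0.0
        if held >= min_hold_s:                # step 14: duration check
            fired.append(i)                   # step 15: irradiate
    return fired
```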
  • since the transmitting side vehicle 20C determines the irradiation of the optical signal according to its traveling state, the optical signal is irradiated automatically when the irradiation condition is satisfied, without any special operation by the driver of the transmitting side vehicle 20C, and the stop operation or deceleration operation of the receiving side vehicle 10C can be activated, promoting danger avoidance and the maintenance of a comfortable inter-vehicle distance.
  • in this way, simply by irradiating the optical signal from the light irradiation units 21, 31, and 61, it is possible to act positively on the receiving side vehicle 10C traveling under driving support technology, activating its stop operation or deceleration operation and supporting smooth traffic.
  • FIGS. 17A and 17B are schematic views of a vehicle 10D traveling using the vehicle driving support system 800 according to the present embodiment.
  • FIG. 17A is a front view of the vehicle 10D.
  • FIG. 17B is a rear view of the vehicle 10D.
  • the vehicle 10D is an automobile equipped with a road surface drawing device, and includes a headlamp 2D and a lighting unit (irradiation unit) 52D that illuminate the front of the vehicle 10D.
  • the headlamps 2D are arranged as right side headlamps 2R and left side headlamps 2L on the front left and right sides in the traveling direction of the vehicle 10D.
  • the headlamp 2D may include a light source, a reflector, and the like in a lamp body (not shown).
  • the lighting unit (irradiation unit) 52D is arranged below the left and right headlamps 2D on the front surface of the vehicle 10D.
  • the lighting unit 52D in the present embodiment is arranged separately on the left and right as the right side lighting unit 52R and the left side lighting unit 52L; however, the vehicle driving support system and the road surface drawing device of the present disclosure are not limited to this, and a single unit may be arranged at the center of the front surface of the vehicle 10D.
  • the lighting unit 52D is a drawing projection unit of a road surface drawing device that displays various drawings (marks) on the road surface in the vehicle driving support system 800.
  • the structure of the illumination unit 52D is, for example, a laser scanning device (not shown) including a laser light source and a light deflector that deflects the laser light emitted from the laser light source.
  • the light deflector is, for example, a movable mirror such as a MEMS (Micro Electro Mechanical Systems) mirror or a galvano mirror.
  • the lighting unit 52D may be a liquid crystal display, an LED array, a digital mirror device (DMD), or the like, as long as it can display various predetermined drawings (marks) on the road surface in front of the vehicle 10D.
  • the operation of the lighting unit 52D, such as lighting and extinguishing, is controlled in response to commands from the lighting control unit 51D of the road surface drawing unit 50D in the vehicle driving support system 800, which will be described later.
  • the rear lighting unit 52B is also provided below the back lamp 2B on the rear side of the vehicle 10D.
  • the left lighting unit 52BL is separately arranged under the left back lamp 2BL
  • the right lighting unit 52BR is separately arranged under the right back lamp 2BR.
  • the arrangement of the rear lighting unit 52B is not limited to this, and one may be arranged in the center of the rear surface of the vehicle.
  • the structure of the rear lighting unit 52B is the same as that of the front lighting unit 52D.
  • FIG. 18 is a block diagram of the vehicle driving support system 800 according to the present embodiment.
  • the vehicle driving support system 800 of the present disclosure includes a control unit 90D, a vehicle course acquisition unit 20D, a vehicle state information acquisition unit 30D, an image selection unit 40D, and a road surface drawing unit 50D.
  • the control unit 90D controls various devices of the vehicle 10D, and is composed of an electronic control unit.
  • the electronic control unit includes a microcontroller including a processor and a memory, and other electronic circuits such as transistors.
  • the processor includes a CPU (Central Processing Unit), an MPU (Micro Processing Unit), and a GPU (Graphics Processing Unit).
  • the memory also includes a ROM and a RAM. The processor executes various control programs stored in the ROM, and executes various processes in cooperation with the RAM.
  • Various sensors and external devices for monitoring the outside of the vehicle 10D, such as the navigation system 11D, the direction indicator 12D, the in-vehicle camera 13D, the sensor 14D, and the wireless communication unit 15D, are connected to the control unit 90D, and various signals and data are input and output.
  • the navigation system 11D is a system that is connected to a satellite positioning system such as GPS to obtain the current position information of the vehicle 10D, indicate an appropriate course to the destination input by the driver, and guide the vehicle 10D.
  • the control unit 90D acquires the course information of the vehicle 10D from the navigation system 11D, and also acquires the current position information, orientation information, and the like of the vehicle 10D.
  • in the present embodiment, the control unit 90D acquires the course information of the vehicle 10D from the navigation system 11D, but the present disclosure is not limited to this.
  • the control unit 90D may acquire the course information by using various control means such as sequential instruction by automatic operation control.
  • the direction indicator 12D is interlocked with a lever (not shown) with which the driver inputs the traveling direction of the vehicle 10D; it inputs a signal indicating the traveling direction of the vehicle 10D to the control unit 90D and conveys that direction to the outside of the vehicle 10D via direction indicator lamps (blinker lamps) (not shown).
  • the control unit 90D also acquires the course information of the vehicle 10D from the direction indicator 12D.
  • the in-vehicle camera 13D is provided to obtain information on the outside of the vehicle 10D in front of and behind the vehicle.
  • the in-vehicle camera 13F installed on the front surface of the vehicle 10D sequentially photographs the state of the front (including the road surface), and promptly transmits forward information on the presence of other vehicles, pedestrians, etc. in front of the vehicle 10D to the control unit 90D.
  • the vehicle-mounted camera 13B installed on the rear surface of the vehicle 10D likewise sequentially photographs the rear view, and transmits to the control unit 90D rear information on the presence of other vehicles, motorcycles, and pedestrians behind the vehicle 10D.
  • the in-vehicle camera 13D includes, for example, an image sensor such as a CCD (Charge Coupled Device) or a CMOS (Complementary MOS).
  • the in-vehicle camera 13D is combined with a millimeter-wave radar, a microwave radar, a laser radar, and the like to obtain information on the surroundings outside the vehicle such as other vehicles, pedestrians, road shapes, traffic signs, and obstacles.
  • the in-vehicle camera 13D sends the captured image data to the control unit 90D.
  • the control unit 90D may recognize the existence, position, etc. of pedestrians and other vehicles (including motorcycles) from the image data by various analysis programs, or the in-vehicle camera 13D itself may have a program for recognizing pedestrians and the like.
  • the imaging range of the vehicle-mounted camera also includes the road surface in front of the vehicle 10D in the traveling direction.
  • the sensor 14D is provided to obtain information on the outside of the vehicle around the vehicle 10D.
  • for example, an infrared sensor for detecting the presence of another vehicle or a pedestrian, or a motion capture device for detecting the movement of a pedestrian, is installed at the front, rear, or side of the vehicle 10D.
  • the sensor 14D also detects a pedestrian or another vehicle (including a two-wheeled vehicle) and transmits the detection data to the control unit 90D.
  • the sensor 14D may include an electronic compass and an angular velocity sensor that detect the orientation of the vehicle 10D.
  • the orientation information of the vehicle 10D may include information detected by an electronic compass and an angular velocity sensor.
  • the wireless communication unit 15D receives and transmits information by wireless communication with other devices outside the vehicle, such as predetermined devices provided at other vehicles and intersections, automatic driving instruction devices, and the like.
  • the control unit 90D also obtains information around the vehicle (states of other vehicles, pedestrians, etc.) from the information received by the wireless communication unit 15D.
  • the wireless communication unit 15D may also transmit the position and traveling direction of the vehicle 10D to other vehicles and the like. That is, the wireless communication unit 15D performs vehicle-to-vehicle communication, which is communication between vehicles, and road-to-vehicle communication with equipment provided on the road.
  • the vehicle-mounted camera 13D, the sensor 14D, and the wireless communication unit 15D that detect the surrounding information of the vehicle 10D are collectively referred to as the surrounding information detection unit (or other vehicle detection unit) 16D.
  • the surrounding information detection unit 16D is not limited to the in-vehicle camera 13D, the sensor 14D, and the wireless communication unit 15D, and may be any one that obtains information on the surroundings of the vehicle 10D.
  • the surrounding information detection unit 16D may include a mechanism for obtaining an image captured by another device outside the vehicle.
  • the control unit 90D includes a vehicle course acquisition unit 20D, a vehicle state information acquisition unit 30D, and an image selection unit 40D.
  • the vehicle course acquisition unit 20D acquires the course information of the vehicle 10D (for example, going straight, turning right, or turning left at the next intersection) from the navigation system 11D, the direction indicator 12D, the wireless communication unit 15D, and the like described above.
  • the vehicle course acquisition unit 20D may periodically acquire the course information at predetermined intervals, or may acquire the course information when the course is changed from the previously obtained course.
  • the course information acquired by the vehicle course acquisition unit 20D includes not only information at intersections but also all cases where the course should be displayed for other vehicles such as changing lanes.
  • the vehicle course acquisition unit 20D also acquires the course information from the information from the wireless communication unit 15D.
  • the vehicle state information acquisition unit 30D acquires the state information of the vehicle 10D such as the position, orientation, and speed of the vehicle 10D. As described above, the vehicle state information acquisition unit 30D acquires the position information and the orientation information of the vehicle 10D from the navigation system 11D. The vehicle state information acquisition unit 30D is connected to the speed sensor 31D.
  • the speed sensor 31D detects the traveling speed of the vehicle 10D and the like.
  • the speed sensor 31D may also have an acceleration sensor or the like.
  • the vehicle state information acquisition unit 30D obtains the position of the vehicle 10D after a predetermined time from the current position of the vehicle 10D obtained from the navigation system 11D or the like and the speed or acceleration detected by the speed sensor 31D.
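  • As a rough illustration, the prediction above can be sketched as follows, assuming a simple constant-acceleration model along the course (the function name and units are hypothetical; the embodiment does not specify the model):

```python
def predict_position(pos_m: float, speed_mps: float,
                     accel_mps2: float, t_s: float) -> float:
    """Predict the distance travelled along the course after t_s seconds,
    assuming constant acceleration (a simplification; the actual unit may
    work with map-matched coordinates and the vehicle's heading)."""
    return pos_m + speed_mps * t_s + 0.5 * accel_mps2 * t_s ** 2

# e.g. a vehicle at 0 m travelling 10 m/s, accelerating at 1 m/s^2, after 2 s:
predict_position(0.0, 10.0, 1.0, 2.0)  # 22.0 m
```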
  • the image selection unit 40D selects the course display image 6D projected on the road surface in front of or behind the vehicle 10D by using the lighting unit 52D, as will be described later. That is, when the course of the vehicle 10D changes, such as when turning right or left, the vehicle driving support system 800 of the present embodiment projects the course display image 6D showing the course direction on the road surface so as to accurately convey the course of the vehicle 10D to other vehicles and pedestrians. However, the image selection unit 40D adjusts the selection of display image data according to the situation of other vehicles and pedestrians around the vehicle 10D, for example by not selecting the course display image 6D or by changing the selected image as appropriate.
  • the image selection unit 40D is connected to the route display image data storage unit 41D, in which a plurality of data of the route display image 6D indicating the route, such as a right turn and a left turn, are stored.
  • the image selection unit 40D selects, from the route display image data storage unit 41D, the data of the route display image 6D indicating the direction in which the vehicle 10D should travel, based on the course information of the vehicle 10D acquired by the vehicle course acquisition unit 20D and the vehicle state information, such as the direction, position, and speed of the vehicle 10D, acquired by the vehicle state information acquisition unit 30D.
  • the image selection unit 40D does not select the data of the course display image 6D when another vehicle is located within a predetermined range of the vehicle 10D, as described later.
  • the image selection unit 40D sends a command instructing the data of the selected course display image 6D to the road surface drawing unit 50D.
  • the road surface drawing unit 50D includes a lighting control unit 51D and a lighting unit 52D.
  • the road surface drawing unit 50D takes out the data of the route display image instructed from the image selection unit 40D from the route display image data storage unit 41D and sends it to the lighting control unit 51D.
  • the lighting control unit 51D converts data into a form suitable for the lighting unit 52D, and the lighting unit 52D irradiates light from a light source according to the converted data and projects a predetermined course display image on the road surface in front of the vehicle 10D.
  • the lighting control unit 51D is composed of an electronic control unit, and determines the lighting state (on/off, illumination color, emission intensity, emission area, etc.) of the lighting unit 52D according to the data of the course display image.
  • the lighting control unit 51D includes a microcontroller including a processor such as a CPU and an MPU and a memory, and other electronic circuits and the like.
  • the control unit 90D and the lighting control unit 51D have separate configurations, but they may be integrally configured.
  • the lighting unit 52D projects the selected course display image onto the road surface around the vehicle 10D (front, rear) and displays it.
  • the lighting unit 52D has an adjustment function such that the position and angle at which the image is projected are adjusted according to the direction, the current position, and the surrounding conditions of the vehicle obtained by the vehicle state information acquisition unit 30D.
  • the specific adjustment instruction is given based on the instruction signal accompanying the image data selected by the image selection unit 40D.
  • FIG. 19 is a flowchart showing the flow of processing in which the vehicle driving support system 800 displays the course display image 6D.
  • FIG. 20 is a schematic view showing a course 7D and a course display image 6D at an intersection of a plurality of vehicles 10Da to 10Dc. As shown in FIG. 19, the vehicle driving support system 800 of the present embodiment displays the course display image 6D in the vicinity of the intersection in the following flow.
  • the vehicle course acquisition unit 20D recognizes and judges from the course information that the vehicle 10D will turn right as indicated by the course 7D, that is, that the course changes from the first direction (straight ahead) to a second direction (the right turn direction) (step 1D).
  • the vehicle state information acquisition unit 30D acquires the surrounding information detected by the surrounding information detection unit 16D, such as the vehicle-mounted camera 13D, the sensor 14D, and the wireless communication unit 15D (step 2D).
  • the vehicle state information acquisition unit 30D acquires the position and orientation information of the vehicle 10D from the navigation system 11D, and the traveling speed of the vehicle 10D from the speed sensor 31D. Then, from the surrounding information and the position, orientation, and traveling speed of the vehicle 10D, information such as whether another vehicle exists in front of or behind the vehicle 10D and how far away that other vehicle is, is acquired (step 3D).
  • based on the position and orientation information and the route information of the vehicle 10D, the control unit 90D determines whether the vehicle 10D is at a position where the course display image 6D should be displayed, such as when the vehicle 10D is about to enter an intersection or change course (within the third distance L3, described later). When the control unit 90D determines that the vehicle is at such a position, the process proceeds to step 5D; otherwise, the process returns to step 1D (step 4D).
  • the control unit 90D determines whether the vehicle 10D is at a position where the course display image 6D should be displayed and whether there is no other vehicle in front of the vehicle 10D, or the distance between the vehicle 10D and the other vehicle in front is the first distance L1 or more (step 5D). If there is no other vehicle within the first distance L1, the process proceeds to step 6D; if there is another vehicle, the process proceeds to step 7D.
  • the image selection unit 40D selects, from the course display image data storage unit 41D, the image data of the optimum course forward display image 6Da to be displayed, based on the current position and orientation of the vehicle 10D acquired by the vehicle state information acquisition unit 30D, the route acquired by the vehicle course acquisition unit 20D, and the surrounding information acquired by the surrounding information detection unit 16D (step 6D).
  • if another vehicle exists in front of the vehicle 10D and the distance to the other vehicle is within the first distance L1, the control unit 90D determines that the course forward display image 6Da is not displayed, and does not send an image selection command to the image selection unit 40D.
  • the control unit 90D checks behind the vehicle 10D and determines whether another vehicle is present within the second distance L2 (step 7D). If the control unit 90D determines that there is no other vehicle within the second distance L2, the process proceeds to step 9D; if it determines that another vehicle exists, the process proceeds to step 8D.
  • when the vehicle 10D is located within the third distance L3 specified by the Road Traffic Act, such as within a predetermined distance (30 m) before the intersection, the vehicle 10D must turn on the blinker lamp when changing course. Since the course display image 6D serves the same purpose as the lighting of the blinker lamp, it is projected on the road surface at the position where the blinker lamp must be turned on. However, when another vehicle is located within the first distance L1 or the second distance L2, the control unit 90D determines, in consideration of the visual influence on the other vehicle, that the course display image 6D (the course front display image 6Da in the case of the front, or the course rear display image 6Db in the case of the rear) is not displayed.
  • the image selection unit 40D selects, from the course display image data storage unit 41D, the image data of the optimum course front display image 6Da or course rear display image 6Db to be displayed, based on the current position and orientation of the vehicle 10D acquired by the vehicle state information acquisition unit 30D, the course acquired by the vehicle course acquisition unit 20D, and the surrounding information acquired by the surrounding information detection unit 16D (step 8D).
  • the image selection unit 40D outputs a display command to the road surface drawing unit 50D according to the image data of the selected course front display image 6Da or the course rear display image 6Db.
  • the road surface drawing unit 50D retrieves the image data of the course display image 6D selected by the image selection unit 40D from the course display image data storage unit 41D, and uses the lighting control unit 51D and the lighting unit 52D to display the course forward display image 6Da on the road surface in front of the vehicle 10D, or the course rear display image 6Db on the road surface behind the vehicle 10D (step 9D).
  • steps 1D to 9D are repeated at predetermined intervals until the vehicle 10D arrives at the destination.
  • image data is sequentially reselected by the image selection unit 40D according to the position of the vehicle 10D, and the optimum course display image 6D is always displayed at the optimum position (step 10D).
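  • The decision flow of steps 1D to 9D can be sketched roughly as follows; the function and parameter names are hypothetical, and the gaps to other vehicles are simplified to scalar distances:

```python
from typing import Dict, Optional

def decide_display(course_changes: bool, within_L3: bool,
                   front_gap_m: Optional[float], rear_gap_m: Optional[float],
                   L1: float, L2: float) -> Dict[str, bool]:
    """Hypothetical sketch of the step 1D-9D decision flow: which course
    display images (front 6Da / rear 6Db) to project. A gap of None means
    no other vehicle was detected on that side (steps 2D-3D)."""
    show = {"front_6Da": False, "rear_6Db": False}
    # steps 1D and 4D: act only when the course changes within the third distance L3
    if not (course_changes and within_L3):
        return show
    # steps 5D-6D: front image only if no other vehicle within the first distance L1
    if front_gap_m is None or front_gap_m >= L1:
        show["front_6Da"] = True
    # steps 7D-9D: rear image only if no other vehicle within the second distance L2
    if rear_gap_m is None or rear_gap_m >= L2:
        show["rear_6Db"] = True
    return show
```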
  • in the above description, the control unit 90D determines whether or not the course display image 6D (course forward display image 6Da, course rear display image 6Db) needs to be displayed from the course information, vehicle state information, and detection information of the vehicle 10D, but the image selection unit 40D may instead make this determination. In this case, the image selection unit 40D first decides whether the course display image 6D needs to be displayed, and then selects a suitable course display image.
  • the second direction may be a left turn direction.
  • Vehicle 10Da changes its course from the straight direction, which is the first direction, to the right turn direction, which is the second direction. Since there is no other vehicle in front of the vehicle 10Da, the image selection unit 40D selects the course forward display image 6Da, and the road surface drawing unit 50D projects the course forward display image 6Da of the arrow indicating a right turn on the road surface in front of the vehicle 10Da using the front lighting units 52R and 52L. Further, since the vehicle 10Db is located behind the vehicle 10Da, the vehicle 10Da does not display the course display image on the road surface behind the vehicle.
  • the vehicle 10Db does not display the course forward display image 6Da because the vehicle 10Da exists within the first front distance L1. Further, since the vehicle 10Dc exists within the second rear distance L2, the rearward display image 6Db is not displayed either. If there is no other vehicle in front of the vehicle 10Db, such as when the vehicle 10Da enters the intersection and the distance to the vehicle 10Da is the first distance L1 or more, the course forward display image 6Da is displayed.
  • Vehicle 10Dc does not display the course front display image 6Da because the vehicle 10Db exists within the first front distance L1. Further, since there is no other vehicle within the second rear distance L2, the course rear display image 6Db indicating the course direction is projected on the rear road surface. This is projected by the lighting units 52BR and 52BL on the rear surface of the vehicle 10Dc. In this way, by displaying the course rear display image 6Db on the rear road surface, the course can be shown to surrounding pedestrians and the like.
  • the course front display image 6Da is projected on the road surface in front of the vehicle 10Da, and the course rear display image 6Db is projected on the road surface behind. That is, in the present embodiment one vehicle 10D displays the course display image 6D on each of the front and rear road surfaces, but the present disclosure is not limited to this; the course display image 6D may be displayed only on the road surface in front.
  • the course front display image 6Da is not displayed when another vehicle exists within the first front distance L1.
  • the first distance L1 in the present embodiment is "braking distance + length for one vehicle".
  • the braking distance is the distance traveled from when the brake of the vehicle 10D starts to work until it stops, is proportional to the square of the speed and the vehicle weight, and is inversely proportional to the braking force.
  • the braking distance for each speed of the vehicle 10D is obtained (set) in advance, and the control unit 90D sequentially obtains the braking distance from the detected speed of the vehicle 10D and calculates the first distance L1. In the present embodiment, the length of one vehicle 10D is set to 5 m.
  • the control unit 90D compares the first distance L1 with the distance to another vehicle in front, and determines the necessity of displaying the course forward display image 6Da.
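  • A minimal sketch of this calculation, assuming a hypothetical braking-distance table set in advance per speed (the embodiment only states that such values are preset; the table values and function names below are illustrative):

```python
# Hypothetical braking-distance table (m) keyed by speed (km/h); the
# embodiment only states that braking distances are set in advance per speed.
BRAKING_DISTANCE_M = {20: 3, 30: 6, 40: 11, 50: 18, 60: 27}
VEHICLE_LENGTH_M = 5  # one vehicle length, set to 5 m in the embodiment

def first_distance_L1(speed_kmh: int) -> float:
    """L1 = braking distance for the current speed + one vehicle length."""
    return BRAKING_DISTANCE_M[speed_kmh] + VEHICLE_LENGTH_M

def show_front_image(gap_to_front_vehicle_m: float, speed_kmh: int) -> bool:
    """Display the course forward image 6Da only when the vehicle ahead
    is at least L1 away."""
    return gap_to_front_vehicle_m >= first_distance_L1(speed_kmh)
```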
  • the second distance L2, which is the criterion for displaying the course rear display image 6Db, is "stopping distance + length for one vehicle".
  • the stopping distance is the free running distance + the braking distance.
  • the free-running distance is the distance that the car runs from the moment the driver feels it is necessary to stop the car until the driver steps on the brake and the brake begins to work. Therefore, the stopping distance is the distance traveled from the time when the driver decides to stop the vehicle 10D until the vehicle 10D actually stops.
  • the time required for the driver to react to step on the brake varies from person to person, but in the present embodiment, it is set to 0.75 seconds.
  • the free running distance can be obtained as the detected speed × 0.75 seconds, and the stopping distance is calculated from the free running distance and the braking distance thus obtained.
  • the control unit 90D compares the obtained second distance L2 with the distance to another vehicle behind, and determines the necessity of displaying the course rear display image 6Db.
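  • The calculation of the second distance L2 can be sketched as follows, using the 0.75-second reaction time and the 5 m vehicle length of the present embodiment (the function and parameter names are hypothetical):

```python
REACTION_TIME_S = 0.75   # driver reaction time assumed in the embodiment
VEHICLE_LENGTH_M = 5.0   # one vehicle length, set to 5 m

def second_distance_L2(speed_mps: float, braking_distance_m: float) -> float:
    """L2 = stopping distance (free running + braking) + one vehicle length.
    The free running distance is speed x 0.75 s, as in the embodiment."""
    free_running_m = speed_mps * REACTION_TIME_S
    stopping_m = free_running_m + braking_distance_m
    return stopping_m + VEHICLE_LENGTH_M

# e.g. at 10 m/s with an 8 m braking distance:
second_distance_L2(10.0, 8.0)  # 10*0.75 + 8 + 5 = 20.5 m
```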
  • the third distance L3 is determined as described above. That is, the Road Traffic Act Enforcement Ordinance stipulates that, when performing an act such as turning left, a signal such as turning on the blinker lamp must be given upon reaching a point 30 m before the side edge in front of the intersection. Therefore, the distance from the intersection to the point where this signal must be given is defined as the third distance L3.
  • the rearmost vehicle 10D may be controlled to display the course rear display image 6Db even if another vehicle is located within the second rear distance L2 behind it. Whether the vehicle 10D is the last vehicle within the third distance L3 is judged based on conditions such as the distance from the position of the vehicle 10D to the end of the third distance L3 (the point 30 m before the intersection where a signal must be given) being shorter than the distance to the other vehicle behind.
  • the leading vehicle 10Da shows the course on the front road surface, and the rearmost vehicle 10Dc shows the course by projecting the course display image 6D on the rear road surface, so that surrounding vehicles and pedestrians can predict their movements and be alerted.
  • the following vehicle 10Db, which travels at a relatively short distance from the leading vehicle 10Da, does not project the course display image 6D, so the projected light is not diffusely reflected on the rear surface of the leading vehicle 10Da, and diffuse reflection does not obstruct the view of the driver of the vehicle 10Db.
  • FIG. 21 is a schematic view showing a course display image 6D at an intersection of the vehicles 10Da and 10Db and the two-wheeled vehicle 10Dd of the ninth embodiment. The description of the contents overlapping with the eighth embodiment will be omitted.
  • since the vehicle 10Da exists within the first distance L1 ahead of the vehicle 10Db, the vehicle 10Db does not project the course forward display image 6Da.
  • behind the vehicle 10Db, there is a motorcycle 10Dd within the second rear distance L2.
  • under the control described in the eighth embodiment, the course rear display image 6Db would not be displayed in this situation, but in the present embodiment the course rear display image 6Db is displayed.
  • in order to prevent the two-wheeled vehicle 10Dd from being caught in the turn, the control unit 90D decides to display the course rear display image 6Db and sends an image display command to the image selection unit 40D.
  • in the example shown in FIG. 21, the image selection unit 40D selects, from the course display image data storage unit 41D, the data of the left turn image among the images showing the course of the vehicle 10Db, and outputs a command to the road surface drawing unit 50D.
  • the road surface drawing unit 50D projects a course rear display image 6Db indicating a left turn onto the rear road surface of the vehicle 10Db.
  • the lighting control unit 51D controls the lighting unit 52D so that the image is projected at a position visible to the driver of the two-wheeled vehicle 10Dd without causing glare.
  • for example, the road surface drawing unit 50D projects using only the right lighting unit 52BR instead of both lighting units 52BR and 52BL, or adjusts the projection angle.
  • when the motorcycle 10Dd is located behind the vehicle 10Db and the vehicle 10Db turns left, there is a high risk that the motorcycle 10Dd will be involved in an accident. Therefore, only when the motorcycle 10Dd exists within the second rear distance L2 of the vehicle 10Db and the course of the vehicle 10Db is a left turn does the road surface drawing unit 50D project the course rear display image 6Db; even if the motorcycle 10Dd is at the same position, if the course of the vehicle 10Db is straight ahead or a right turn, the road surface drawing unit 50D does not have to display the course rear display image 6Db.
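  • This left-turn condition of the ninth embodiment can be sketched as follows (the function and parameter names are hypothetical; a rear gap of None means no two-wheeled vehicle was detected):

```python
from typing import Optional

def show_rear_image_for_motorcycle(course: str, rear_gap_m: Optional[float],
                                   L2: float) -> bool:
    """Ninth-embodiment sketch: project the course rear image 6Db only when
    a two-wheeled vehicle is within the second rear distance L2 AND the
    course is a left turn (the caught-in-turn hazard case)."""
    motorcycle_close = rear_gap_m is not None and rear_gap_m <= L2
    return motorcycle_close and course == "left"
```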
  • similarly, when a pedestrian is located within the second distance L2, the course rear display image 6Db may be displayed. That is, a pedestrian within the second distance L2 may find it difficult to notice the blinker lamp of the vehicle 10Db, so the road surface drawing unit 50D may display the course rear display image 6Db to call further attention. Even when a pedestrian is detected, the lighting control unit 51D controls the lighting unit 52D so as not to give glare to the pedestrian's field of view and to project the course rear display image 6Db at a position where it can be accurately recognized.
  • in this way, in the present embodiment, the road surface drawing unit 50D displays the course rear display image 6Db from a viewpoint different from that of the eighth embodiment, namely to prevent the danger of the motorcycle 10Dd being caught in the turn.
  • the course of the vehicle 10Db can be accurately indicated to the two-wheeled vehicle 10Dd in the vicinity of the vehicle 10Db, and the driver of the two-wheeled vehicle 10Dd can be alerted.
  • the projection of the course display image 6D is controlled based on the surrounding information detected by the surrounding information detection unit 16D included in the vehicle 10D.
  • in the tenth embodiment, not only the detection members of the vehicle 10D but also devices installed in advance on the road are used to control the projection of the course display image 6D.
  • the tenth embodiment will be described with reference to FIG. 4A. The description of the contents overlapping with the eighth embodiment and the ninth embodiment will be omitted.
  • the infrastructure side device 30 may be provided with an infrastructure side surrounding information detection unit (another vehicle detection unit or a situation grasping unit) that detects the surrounding state of the infrastructure side device 30 such as an imaging camera.
  • the infrastructure-side surrounding information detection unit in the present embodiment is provided in the light irradiation unit 31, and may detect surrounding information by photographing the road surface from above.
  • the present disclosure is not limited to this, and the infrastructure side surrounding information detection unit may be provided at another place such as a support column of the infrastructure side device 30.
  • FIG. 22 is a schematic view showing the relationship between the vehicle 10Da, the motorcycle 10Dd, and the infrastructure side device 30 of the present embodiment at an intersection.
  • the infrastructure side device 30 detects surrounding information, for example by imaging the road surface, and detects that the vehicle 10Da is located in the range before the intersection where it must display its course, and that the motorcycle 10Dd is located within the second rear distance L2 of the vehicle 10Da.
  • the infrastructure side device 30 is provided with a control unit that determines whether or not to display the course display image 6D from the detected image.
  • the control unit determines, from the surrounding information detected by the infrastructure-side surrounding information detection unit, that no other vehicle exists within the first distance L1 ahead of the vehicle 10Da and that the course forward display image 6Da should be displayed. Further, since the motorcycle 10Dd is located behind the vehicle 10Da, it determines that the course rear display image 6Db should also be displayed.
  • the information and communication unit of the infrastructure side device 30 sends a command to display the course front display image 6Da and the course rear display image 6Db to the vehicle 10Da by road-to-vehicle communication.
  • the wireless communication unit 15D receives the command from the infrastructure side device 30 and sends it to the vehicle status information acquisition unit 30D.
  • the control unit 90D receives the command via the vehicle state information acquisition unit 30D and the route information indicating a left turn from the vehicle course acquisition unit 20D, and sends the image selection unit 40D a command to display the course forward display image 6Da and the course rear display image 6Db for a left turn.
  • the predetermined course display image 6D is selected in the same manner as in the eighth embodiment and the ninth embodiment, and the road surface drawing unit 50D displays the course forward display image 6Da on the road surface in front of the vehicle 10Da. Further, the road surface drawing unit 50D displays the course rear display image 6Db on the rear road surface of the vehicle 10Da.
  • the present embodiment determines the display of the course display image 6D based on the command from the infrastructure side device 30.
  • for the image display on the road surface near the intersection, that is, near the infrastructure side device 30, not only may the lighting unit 52D of the vehicle 10Da project the image, but the light irradiation unit 31 of the infrastructure side device 30 may also project it.
  • alternatively, the vehicle course acquisition unit 20D of the vehicle 10Da may notify the infrastructure side device 30 of the course information via the wireless communication unit 15D, the control unit of the infrastructure side device 30 may determine the appropriate course display image 6D based on the obtained course information, and the infrastructure side device 30 may command the determined course display image 6D to the vehicle 10Da. That is, the infrastructure side device 30 handles the processing up to the selection of the course display image 6D, and the vehicle 10Da simply displays the data of the course display image 6D sent from the infrastructure side device 30. In this case, it is not necessary to store a plurality of display image data in the vehicle 10Da.
  • in the present embodiment, the display control of the course display image 6D of each vehicle 10D is performed using the infrastructure side device 30, so the display of a plurality of vehicles 10D in the vicinity of the infrastructure side device 30 can be controlled collectively, enabling integrated display control across the plurality of vehicles 10D. That is, each vehicle 10D can display an integrated course display image 6D regardless of the control capability of its individual control unit 90D.
  • in the embodiments described above, the configuration using the detection information of the surrounding information detection unit 16D provided on the vehicle 10D and the configuration using the detection information of the surrounding information detection unit on the infrastructure side device 30 have been described.
  • the present disclosure is not limited to these, and other vehicles may be detected by vehicle-to-vehicle communication between vehicles 10D.
  • the wireless communication unit 15D may be used for vehicle-to-vehicle communication.
  • the vehicle 10D may receive the position of another vehicle, the position of the own vehicle, and the like by using the information from the satellite positioning system connected to the navigation system 11D, and may use the received information to determine whether another vehicle is present within a predetermined distance.
  • a right turn and a left turn at an intersection are given as an example, but the present disclosure is not limited to this, and can be applied to a lane change when there are a plurality of lanes.
  • the vehicle 10D can be applied not only to a general five-seater vehicle but also to a wide variety of vehicles such as trucks, trailers, and motorcycles.
  • FIG. 23 is a block diagram of the road surface drawing device 900 according to the eleventh embodiment.
  • the road surface drawing device 900 of the present disclosure includes a control unit 90E, a vehicle course acquisition unit 20E, a vehicle state information acquisition unit 30E, an image selection unit 40E, and an image drawing unit 50E.
  • the control unit 90E controls various devices of the vehicle 10E.
  • the control unit 90E is connected to various sensors and external devices for monitoring the outside of the vehicle 10E, such as the navigation system 11E, the direction indicator 12E, the in-vehicle camera 13E, the sensor 14E, and the wireless communication unit 15E, and inputs and outputs various signals and data.
  • the vehicle-mounted camera 13E, the sensor 14E, and the wireless communication unit 15E that detect the front information of the vehicle 10E are collectively referred to as the front information detection unit 16E.
  • the front information detection unit 16E is not limited to the in-vehicle camera 13E, the sensor 14E, and the wireless communication unit 15E, and may be any one that obtains information around the vehicle 10E, particularly in front.
  • the front information detection unit 16E may include a mechanism for obtaining an image captured by another device outside the vehicle.
  • the vehicle state information acquisition unit 30E acquires the state information of the vehicle 10E such as the position, orientation, and speed of the vehicle 10E.
  • the vehicle state information acquisition unit 30E acquires the position information and orientation information of the vehicle 10E from the navigation system 11E.
  • the vehicle state information acquisition unit 30E is connected to the steering angle detection unit 31E and the speed sensor 32E.
  • the steering angle detection unit 31E detects the steering angle of the steering device 31Ea of the vehicle 10E (not shown).
  • the steering angle detection unit 31E is attached to the steering wheel of the vehicle 10E and detects the steering angle from the reference position of the steering wheel.
  • the speed sensor 32E detects the traveling speed of the vehicle 10E and the like.
  • the vehicle state information acquisition unit 30E obtains the direction of the vehicle 10E from the steering angle detected by the steering angle detection unit 31E, and obtains the position and orientation of the vehicle 10E after a predetermined time from the speed and acceleration detected by the speed sensor 32E.
  • the vehicle state information acquisition unit 30E also obtains the arrival time (required time) of the vehicle 10E to a predetermined position in the course.
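The state prediction described above can be sketched as a simple dead-reckoning step. The following is a minimal illustration only, assuming a kinematic bicycle model; the function name, the wheelbase value, and the model itself are assumptions for illustration, not details of the disclosure.

```python
import math

def predict_pose(x, y, heading, speed, steering_angle, wheelbase, dt):
    """Predict position and orientation after dt seconds (assumed
    kinematic bicycle model, small-dt approximation)."""
    # Yaw rate implied by the current steering angle.
    yaw_rate = speed * math.tan(steering_angle) / wheelbase
    new_heading = heading + yaw_rate * dt
    # Advance along the current heading.
    new_x = x + speed * math.cos(heading) * dt
    new_y = y + speed * math.sin(heading) * dt
    return new_x, new_y, new_heading

# Straight ahead at 10 m/s for 1 s: heading unchanged.
print(predict_pose(0.0, 0.0, 0.0, 10.0, 0.0, 2.7, 1.0))  # (10.0, 0.0, 0.0)
```

From such a predicted pose, the arrival time to a given point on the course can likewise be estimated as distance divided by speed.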
  • the image selection unit 40E selects the course display image 6E to be projected on the road surface in front of the vehicle 10E by the lighting unit 52E, as will be described later. That is, when the course of the vehicle 10E changes, such as when turning right or left, the road surface drawing device 900 of the present embodiment projects the course display image 6E showing the course direction on the road surface so as to accurately convey the course of the vehicle 10E to other vehicles and pedestrians. Therefore, the image selection unit 40E selects the display image data so that an accurate course display image 6E can always be displayed according to the position and orientation of the vehicle 10E.
  • the image selection unit 40E is connected to the course display image data storage unit 41E, in which a plurality of data of the course display image 6E indicating a right turn, a left turn, a course, and the like are stored.
  • the image selection unit 40E selects, from the course display image data storage unit 41E, the data of the course display image 6E indicating the direction in which the vehicle 10E should proceed, based on the course information of the vehicle 10E acquired by the vehicle course acquisition unit 20E and the vehicle state information such as the direction, position, and speed of the vehicle 10E acquired by the vehicle state information acquisition unit 30E.
  • the image selection unit 40E sends a command for instructing the data of the selected course display image 6E to the image drawing unit 50E.
  • the image drawing unit 50E includes a lighting control unit 51E and a lighting unit 52E.
  • the image drawing unit 50E takes out the data of the route display image instructed from the image selection unit 40E from the route display image data storage unit 41E and sends it to the lighting control unit 51E.
  • the lighting control unit 51E converts data into a form suitable for the lighting unit 52E, and the lighting unit 52E irradiates light from a light source according to the converted data and projects a predetermined course display image onto the road surface in front of the vehicle 10E.
  • FIG. 24 is a flowchart showing the flow of processing in which the road surface drawing device 900 displays the course display image 6E.
  • FIG. 25 is a schematic view showing the course 7E and the course display image 6E when the vehicle 10E turns right at an intersection. As shown in FIG. 24, the road surface drawing device 900 of the eleventh embodiment displays the course display image 6E near the intersection according to the following flow.
  • the vehicle course acquisition unit 20E recognizes and determines that the vehicle 10E turns right as shown by the course 7E (step 1E).
  • the vehicle state information acquisition unit 30E acquires the position and orientation information of the vehicle 10E from the navigation system 11E. Further, the steering angle of the vehicle 10E is acquired from the steering angle detection unit 31E, and the traveling speed of the vehicle 10E is acquired from the speed sensor 32E (step 2E).
  • the control unit 90E determines, based on the position and orientation information and the course information of the vehicle 10E, whether the vehicle 10E is located at a position where the course display image 6E should be displayed, such as when the vehicle 10E enters an intersection or changes course (step 3E). When the control unit 90E determines that the vehicle is at such a position, it sends the acquired information to the image selection unit 40E. The image selection unit 40E selects the image data of the optimum course display image 6E to be displayed from the course display image data storage unit 41E, in which the image data is stored, based on the current position, orientation, and course of the vehicle 10E.
  • the vehicle state information acquisition unit 30E acquires that the vehicle 10E is located 30 m before the intersection in a straight-ahead state, and the vehicle course acquisition unit 20E acquires the course 7E.
  • the image selection unit 40E selects the image data of the course display image 6E indicating that the vehicle 10E turns right from the acquired information, and issues a command to the image drawing unit 50E (step 4E).
  • the image drawing unit 50E takes the image data of the course display image 6E selected by the image selection unit 40E from the course display image data storage unit 41E, and displays the course display image 6E on the road surface in front of the vehicle 10E using the lighting control unit 51E and the lighting unit 52E (step 5E). In the case of the position of the vehicle 10E shown in FIG. 25, since the vehicle has not yet entered the intersection, the course display image 6E indicating a right turn is displayed to express the intention to turn right.
  • step 1E to step 5E are repeated at predetermined intervals until the vehicle 10E arrives at the destination (step 6E).
  • the image selection unit 40E reselects image data sequentially according to the position of the vehicle 10E, and the optimum course display image 6E is always displayed.
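The repeated flow of steps 1E to 6E amounts to one selection cycle executed at a fixed interval. The sketch below is illustrative only: the 50 m display threshold, the image names, and all function names are assumptions, not values from the disclosure.

```python
def should_display(distance_to_intersection_m):
    # Step 3E: display once the vehicle is close enough to the junction
    # (the 50 m threshold is an illustrative assumption).
    return distance_to_intersection_m <= 50.0

def select_image(course):
    # Step 4E: pick the stored image data matching the acquired course.
    images = {"right": "6E-right-arrow", "left": "6E-left-arrow",
              "straight": "6E-straight-arrow"}
    return images[course]

def drawing_cycle(distance_to_intersection_m, course):
    """One iteration of steps 1E-5E; returns the image to project, or None.
    The caller repeats this at a predetermined interval (step 6E)."""
    if should_display(distance_to_intersection_m):   # step 3E
        return select_image(course)                  # steps 4E-5E
    return None

print(drawing_cycle(30.0, "right"))   # 6E-right-arrow
print(drawing_cycle(120.0, "right"))  # None
```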
  • FIG. 26 is a schematic view showing a method of selecting the course display image 6E in the present embodiment.
  • FIG. 26 shows a stage in which the vehicle 10E turns to the right and makes a right turn.
  • the course display image 6E in the present embodiment includes a branch line 6Ea parallel to the direction of the current vehicle 10E and a direction line 6Eb extending from the branch line 6Ea in the course direction.
  • the angle between the branch line 6Ea and the direction line 6Eb (hereinafter, also referred to as “travel angle ⁇ ”) changes sequentially according to the steering angle and the direction of the vehicle obtained from the navigation system 11E. That is, the angle of the direction line 6Eb of the course display image 6E changes sequentially according to the progress of the vehicle 10E in the intersection.
  • the course direction t can be known from the course information acquired by the vehicle course acquisition unit 20E. The angle α from the straight-ahead direction s to the course direction t (hereinafter referred to as the "course angle") is calculated from the straight-ahead direction s of the vehicle 10E, indicated by the broken arrow, and the course direction t of the vehicle 10E.
  • the steering angle of the vehicle 10E is detected by the steering angle detection unit 31E. Further, the vehicle state information acquisition unit 30E calculates the cumulative movement amount of the vehicle 10E from the displacement of the position information of the vehicle 10E, obtained from the navigation system 11E, up to the predetermined position after the steering angle is detected. The tilt angle β of the vehicle 10E at the predetermined position is then calculated from the steering angle and the cumulative movement amount, and the direction u of the vehicle 10E (indicated by the broken-line arrow u) can be known from this tilt angle β. The traveling angle θ is the sum of the course angle α and the tilt angle β. Therefore, in the course display image 6E, the branch line 6Ea is determined to be along the direction u, and the direction line 6Eb is determined to be along the course direction t of the vehicle 10E at each point.
  • the tilt angle ⁇ is generated from the cumulative movement amount. That is, if the steering angle is constant, if the vehicle 10E advances with the arc of the turning radius as the track, the direction of the vehicle changes according to the amount of movement of the vehicle 10E. Therefore, the tilt angle also changes. Therefore, the tilt angle ⁇ is calculated from the relationship between the steering angle and the cumulative movement amount. For example, FIG. 27B shows vehicle 10E on the way to a turn at an intersection.
  • the course display image 6E has a shape along the course 7E shown in FIG. 25.
  • the traveling angle ⁇ is 180 degrees, and the direction u of the vehicle 10E coincides with the course direction t. That is, when the vehicle 10E changes its course at the intersection from the straight direction which is the first direction to the right turn direction which is the second direction, the course display image 6E becomes a line indicating the second direction from the vehicle 10E.
  • the inclination angle ⁇ of the vehicle 10E is obtained from the steering angle and the cumulative movement amount of the vehicle 10E in this way.
  • the direction u and the traveling angle θ of the vehicle 10E are obtained by using the tilt angle β, and the course display image 6E is generated (selected) from the relationship between the direction u and the course direction t. Therefore, the course display image 6E is regenerated (reselected) according to changes in the steering angle, the cumulative movement amount of the vehicle 10E (recalculated from the vehicle position and the like), and the vehicle state information.
  • the image selection unit 40E may select an image close to the traveling angle θ obtained as described above from the plurality of course display images 6E stored in the memory, or may generate the course display image 6E from the obtained traveling angle θ. Since the position and steering angle of the vehicle 10E are acquired at predetermined intervals as described above, the course display image 6E is reselected (or regenerated) and displayed every time the direction of the vehicle 10E changes. As a result, after the vehicle 10E enters the intersection, the course display image 6E is displayed along the course 7E.
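Selecting the stored image closest to the current traveling angle θ can be sketched as a nearest-neighbour lookup. The 15-degree angle grid and the image names are purely hypothetical.

```python
def reselect_image(traveling_angle_deg, stored_images):
    """Pick the stored course display image whose angle is closest to the
    current traveling angle (angle grid and names are assumptions)."""
    return min(stored_images, key=lambda img: abs(img["angle"] - traveling_angle_deg))

# Hypothetical stock: images prepared every 15 degrees from 90 to 180.
stock = [{"angle": a, "name": f"course_6E_{a}deg"} for a in range(90, 181, 15)]
print(reselect_image(112.0, stock)["name"])  # course_6E_105deg
```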
  • the branch line 6Ea and the direction line 6Eb are directly connected, but they may be connected through a gently curved portion (curve line).
  • the curvature of the curved portion (curve line) is determined according to the traveling angle ⁇ , that is, the angle from the direction u of the vehicle to the course direction t.
  • in the twelfth embodiment, the course display image 6E is obtained from vehicle state information such as the position and orientation of the vehicle. Specifically, the road surface drawing device 900 of the twelfth embodiment obtains the course display image 6E from the position of the vehicle 10E detected by the GPS of the navigation system 11E and the direction of the vehicle 10E.
  • the traveling angle ⁇ of the course display image 6E before entering the intersection is a course angle ⁇ composed of the straight direction s and the course direction t in order to clearly indicate the turning direction, and the vehicle 10E.
  • the traveling angle ⁇ becomes blunt.
  • the line of the course display image 6E changes in the direction of becoming straight.
  • the traveling angle ⁇ is not changed with reference to the course angle ⁇ formed by the straight direction s and the course direction t, but the course display image 6E is displayed according to the change in the direction of the vehicle 10E.
  • the traveling angle ⁇ changes. That is, the inclination of the line of the course display image 6E changes.
  • the vehicle course acquisition unit 20E obtains the course 7E at the intersection indicated by the broken line arrow based on the information from the navigation system 11E. Further, the vehicle state information acquisition unit 30E acquires the position and orientation information of the vehicle 10E from the position information from the GPS (satellite positioning system) of the navigation system 11E and the progress state of the vehicle 10E.
  • the image selection unit 40E selects the course display image 6E that best follows the course, based on the direction (inclination) of the course 7E at the acquired position of the vehicle 10E, and the image drawing unit 50E displays the selected course display image 6E. Specifically, the course display image 6E is displayed as follows.
  • FIGS. 28A to 28D are schematic views showing the course display image 6E of the vehicle 10E when turning right in the present embodiment, according to the position of the vehicle 10E.
  • as shown in FIG. 28A, when the vehicle 10E enters the intersection, it gradually turns from straight ahead, so the course display image 6EA, composed of a gently inclined curve, is displayed.
  • the course display image 6EA is relatively long so that the direction of the turn can be seen.
  • as the turn progresses, the course display images 6EB and 6EC, composed of inclined curves, are displayed.
  • the course display images 6EB and 6EC are shorter than the previous course display image 6EA so that the turning direction can be known.
  • the course display image 6ED consisting of a straight line extending in the course direction (right turn direction) and having no inclination is displayed.
  • the length of the course display image 6ED is such that other vehicles and pedestrians can recognize that it is in the straight-ahead direction.
  • the course display images 6EA, 6EB, 6EC, and 6ED along the course 7E are selected and displayed by the image selection unit 40E according to the direction of the vehicle 10E at each position at the intersection.
  • the course display image 6EA, which is a relatively long curve, is displayed at the position where the vehicle 10E enters the intersection so that oncoming vehicles and pedestrians can easily recognize the course of the vehicle 10E, and near the center of the intersection the course display images 6EB and 6EC, which are relatively short curves, are displayed. That is, when the vehicle 10E changes course at an intersection from the straight-ahead direction, which is the first direction, to the course direction, which is the second direction, the course display image 6E becomes a curve extending from the vehicle in the second direction.
  • FIGS. 29A to 29D are schematic views showing a modified example of the course display image 6E in the present embodiment.
  • FIG. 29A shows a course display image 6EA at a position where the vehicle 10E enters the intersection.
  • FIG. 29B shows the course display image at a position before the center of the intersection, and FIG. 29C shows the course display images 6EB and 6EC when the vehicle 10E is located near the center of the intersection.
  • FIG. 29D shows the course display image 6ED at the position after the vehicle 10E has turned the intersection.
  • the course display image 6E is a straight line connecting the vehicle 10E and the course 7E.
  • as the vehicle 10E turns, the length of the straight line of the course display image 6E becomes shorter, so that the course direction is displayed in an easy-to-understand manner. Therefore, the length of the straight line of the course display image 6E is predetermined in relation to the position and direction (that is, the inclination with respect to the course direction) of the vehicle 10E.
  • the course display image 6E is a straight line extending from the vehicle in the second direction when the vehicle 10E changes course from the straight direction which is the first direction to the right direction which is the second direction. Therefore, the length of the course display image 6E is determined according to the direction of the vehicle 10E.
  • the course 7E is determined from the information from the navigation system 11E, the position of the vehicle 10E is specified by GPS, and the line connecting the vehicle 10E to the course direction is set as the course display image 6E, so that the course display image 6E is displayed along the course 7E.
  • even in the automatic driving mode or the like, the image selection unit 40E can always quickly reselect an appropriate course display image 6E according to the change in the position of the vehicle 10E and draw it on the road surface.
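The straight-line display of the modified example — a segment from the vehicle toward the course, shortening as the vehicle's heading aligns with the course direction — can be sketched as follows. The shrink rule and all numeric lengths are assumptions; only the idea that length depends on the vehicle's direction comes from the text.

```python
import math

def course_display_segment(vehicle_pos, vehicle_heading, course_point, max_len=6.0):
    """Straight-line course display image from the vehicle toward a point on
    course 7E; the line shortens as heading aligns with the course
    (lengths and the linear shrink rule are illustrative assumptions)."""
    dx = course_point[0] - vehicle_pos[0]
    dy = course_point[1] - vehicle_pos[1]
    course_dir = math.atan2(dy, dx)
    # Remaining angle between heading and course direction, in [0, pi].
    misalignment = abs((course_dir - vehicle_heading + math.pi) % (2 * math.pi) - math.pi)
    length = max_len * misalignment / math.pi + 1.0  # shrink toward 1 m when aligned
    return course_dir, length

# Heading already matches the course direction: shortest segment.
_, length = course_display_segment((0, 0), 0.0, (10, 0))
print(round(length, 2))  # 1.0
```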
  • FIG. 30A is a schematic view of the intersection showing the time transition of the course display image 6E at the intersection.
  • FIGS. 30B to 30D are schematic perspective views showing the vehicle 10E and the state of projection of the course display image 6E as time passes at the intersection.
  • the course display image 6E displayed according to the position of the vehicle 10E is projected so that the drawing position on the road surface overlaps the drawing position projected immediately before. Specifically, it is as follows.
  • the course display image 6EA is projected on the road surface when the vehicle 10E is in the position of entering the intersection.
  • the irradiation angle of the light from the lighting unit 52E is set so as to be an angle ⁇ toward the road surface G.
  • the road surface G is photographed by the vehicle-mounted camera 13E, and the vehicle state information acquisition unit 30E confirms the position of the course display image 6EA on the road surface.
  • the image selection unit 40E reselects the course display image according to the position and orientation of the vehicle 10E, and the reselected course display image 6EB is projected onto the road surface.
  • the road surface G at the predetermined position (the scheduled display position) is imaged by the in-vehicle camera 13E, and it is confirmed whether the projected position overlaps the previous course display image 6EA and whether the road surface has unevenness or inclination.
  • the irradiation direction of the lighting unit 52E is adjusted so that the irradiation angle γ is constant with respect to the road surface G. Then, as shown in FIG. 30C, the irradiation direction of the lighting unit 52E is adjusted so that the reselected course display image 6EB partially overlaps the previously projected course display image 6EA (the image shown by the broken line). In this way, the lighting unit 52E adjusts the irradiation direction according to the state of the road surface G, and projects the course display images 6EA, 6EB, 6EC, and 6ED so that the irradiation angle γ is constant with respect to the road surface G.
  • the driver of the vehicle 10E can always recognize the course display image 6E in a constant field of view. Further, since a constant projection state can always be maintained with respect to the road surface G regardless of the unevenness or inclination of the road surface G, other vehicles or pedestrians on the opposite side can easily recognize the course display image 6E.
  • the course display images 6EA, 6EB, 6EC, and 6ED can be continuously projected along the course 7E.
  • the course display image 6E is thus displayed continuously up to the lane after the right turn, as if a line were drawn, which, combined with the viewer's visual afterimage effect, makes it very easy for surrounding pedestrians and other vehicles to recognize the course of the vehicle 10E at the intersection.
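The constant-irradiation-angle projection and the overlap check can be sketched with simple flat-road geometry: a lamp at height h meeting the road at angle γ draws at ground distance h / tan γ, and consecutive drawings should partially overlap along the course. The lamp height, the 1-D overlap simplification, and all names below are assumptions.

```python
import math

def projection_distance(lamp_height_m, irradiation_angle_rad):
    """Ground distance from the lamp to the drawn image when the light meets
    the road surface G at a constant angle gamma (flat-road assumption)."""
    return lamp_height_m / math.tan(irradiation_angle_rad)

def overlaps(prev_start, prev_len, new_start, new_len):
    """True if the newly projected image partially overlaps the previous one
    along the course (1-D simplification of the camera-based check)."""
    return new_start < prev_start + prev_len and prev_start < new_start + new_len

d = projection_distance(0.7, math.radians(10))
print(round(d, 2))                   # 3.97
print(overlaps(0.0, 2.0, 1.5, 2.0))  # True
```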
  • FIGS. 31A and 31B are schematic views showing a course display image at an intersection in the fourteenth embodiment.
  • FIG. 31A is a schematic view showing a course display image 61E at the time when the vehicle 10E trying to turn right enters an intersection.
  • FIG. 31B is a schematic view showing a course display image 62E after the vehicle 10E detects a pedestrian.
  • the course display image 61E is a linear image.
  • the front information detection unit 16E including the vehicle-mounted camera 13E and the sensor 14E of the present embodiment detects the front of the vehicle 10E and detects the presence or absence of a pedestrian B in front of the vehicle 10E. Then, the forward information detection unit 16E sends the detection result to the vehicle state information acquisition unit 30E.
  • the vehicle state information acquisition unit 30E detects that a pedestrian B is in front of the vehicle 10E from the information of the vehicle-mounted camera 13E and the sensor 14E, and sends the information to the image selection unit 40E.
  • when no pedestrian is detected, the image selection unit 40E selects the course display image 61E, which is displayed at a normal thickness. When the pedestrian B is detected, the course display image 62E, which is displayed thicker than the course display image 61E, is selected so that the pedestrian B can easily notice it.
  • the course display image 61E when there is no pedestrian B near the vehicle 10E is projected with a normal thickness.
  • the course display image 62E when the pedestrian B is detected in front of the vehicle 10E is thicker than usual and is projected prominently.
  • in this way, the forward information detection unit 16E detects the pedestrian B, and as the pedestrian B approaches, the course display image 62E with a thicker line width is selected and projected, so that the attention of the pedestrian B can be drawn. That is, the pedestrian B can be made to accurately recognize the course of the vehicle 10E and to correctly grasp the movement of the vehicle 10E when walking on a pedestrian crossing or the like.
  • in the present embodiment, the line width of the course display image 62E is increased, but the present disclosure is not limited to this, and a process for blinking the course display image 62E may be added.
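The pedestrian-dependent width selection can be sketched as a simple rule: normal width with no pedestrian nearby (image 61E), widening as pedestrian B approaches (image 62E). All numeric values (widths, alert distance, linear widening) are illustrative assumptions.

```python
def line_width_mm(pedestrian_distance_m, normal_width_mm=100, thick_width_mm=300,
                  alert_distance_m=20.0):
    """Select the drawn line width for the course display image: normal when
    no pedestrian is near, thicker as pedestrian B approaches
    (all numeric values are illustrative assumptions)."""
    if pedestrian_distance_m is None or pedestrian_distance_m > alert_distance_m:
        return normal_width_mm  # image 61E: no pedestrian nearby
    # Image 62E: widen linearly as the pedestrian gets closer.
    closeness = 1.0 - pedestrian_distance_m / alert_distance_m
    return normal_width_mm + (thick_width_mm - normal_width_mm) * closeness

print(line_width_mm(None))   # 100
print(line_width_mm(10.0))   # 200.0
print(line_width_mm(0.0))    # 300.0
```

A blinking variant, as mentioned above, could be layered on top of the same rule by toggling the projection at a fixed period.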
  • one course display image 6E is displayed on the road surface in front of the vehicle 10E.
  • the present disclosure is not limited to these embodiments, and a plurality of course display images 6E may be displayed.
  • the road surface drawing device 900 in the fifteenth embodiment projects the course display image 6E from each of the left and right lighting units 52L and 52R.
  • FIGS. 32A to 32C are schematic views showing the course display image 6E in the fifteenth embodiment.
  • FIG. 32A shows a course display image 6E when traveling straight.
  • FIG. 32B shows a course display image 6E when turning right.
  • FIG. 32C shows a course display image 6E when turning left.
  • when going straight, the image selection unit 40E selects image data so that the course display image 6S, which is a straight arrow, is displayed by each of the left lighting unit 52L and the right lighting unit 52R.
  • the image drawing unit 50E causes the left lighting unit 52L and the right lighting unit 52R to project so that the two course display images 6S are displayed on the road surface in front of the vehicle 10E according to the image data selected by the image selection unit 40E.
  • the image drawing unit 50E color-codes the course display image 6E between the manual driving mode and the automatic driving mode so that the surroundings of the vehicle 10E, that is, surrounding pedestrians and other vehicles (automobiles, bicycles), can identify whether the vehicle 10E is driving automatically.
  • the color of the course display image 6S when traveling straight is white in one of the driving modes, while in the other it is turquoise (blue-green), so that the two modes can be distinguished.
  • the straight course display images 6S are projected from the left and right lighting units 52L and 52R, respectively, so that two straight arrows appear in front of the vehicle 10E and the course display image 6S is easy to see even for pedestrians and the like on the left side of the vehicle 10E.
  • when turning right, the image selection unit 40E selects the display image data 41Er so that the course display image 6R for a right turn is displayed using only the right lighting unit 52R. Then, the image drawing unit 50E causes only the right lighting unit 52R to project so that the course display image 6R is displayed on the road surface in front of the vehicle 10E according to the display image data 41Er selected by the image selection unit 40E. The projection by the right lighting unit 52R starts at the same time as the vehicle 10E turns on the blinker lamp with the turn signal 12E before the intersection.
  • the display color of the course display image 6R is amber so that the vehicle 10E's intention to turn is clearly communicated to the surroundings.
  • the display of the course display image 6R after the vehicle 10E enters the intersection may be in a form conforming to the above-mentioned 11th to 14th embodiments. Further, the display after the vehicle 10E has entered the intersection may be performed only by the right lighting unit 52R, or may be performed by using both the left and right lighting units 52R and 52L.
  • when turning left, the image selection unit 40E selects the display image data 41El so that the course display image 6L for a left turn is displayed using only the left lighting unit 52L.
  • the image drawing unit 50E projects the left lighting unit 52L based on the display image data 41El selected by the image selection unit 40E.
  • the display method and the display contents after the vehicle 10E enters the intersection are the same as when turning right.
  • since the course display images 6R and 6L are displayed on the side of the turn, it becomes easier for oncoming vehicles and surrounding pedestrians to notice them.
  • in the above description, the image selection unit 40E selects the display image data 41Er or 41El and instructs the image drawing unit 50E to project using only one of the left and right lighting units 52L and 52R. However, the present disclosure is not limited to this; the image selection unit 40E may always select the display image data 41Er (or 41El) for a right turn (or a left turn), and the image drawing unit 50E may perform the data processing that determines whether to project using only one lighting unit 52E or both of the left and right lighting units 52E.
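The fifteenth-embodiment selection of lighting units and colors can be sketched as a small decision rule. The unit names and the amber turn color follow the text above; the assignment of white to manual mode and turquoise to automatic mode is an assumption where the text does not fix the mapping.

```python
def select_projection(course, auto_mode):
    """Decide which lighting units project and in what color.
    The white/turquoise mode mapping is an assumption."""
    if course == "straight":
        units = ["52L", "52R"]                      # two straight arrows 6S
        color = "turquoise" if auto_mode else "white"
    elif course == "right":
        units, color = ["52R"], "amber"             # image 6R, turn side only
    elif course == "left":
        units, color = ["52L"], "amber"             # image 6L, turn side only
    else:
        raise ValueError(f"unknown course: {course}")
    return units, color

print(select_projection("straight", auto_mode=True))   # (['52L', '52R'], 'turquoise')
print(select_projection("right", auto_mode=False))     # (['52R'], 'amber')
```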

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Architecture (AREA)
  • Civil Engineering (AREA)
  • Structural Engineering (AREA)
  • Lighting Device Outwards From Vehicle And Optical Signal (AREA)

Abstract

The present invention relates to a vehicle driving support system (100) comprising: a vehicle operation determination unit (110) that acquires the operation of a first vehicle (10) traveling on a road as first operation information, and acquires the operation of a second vehicle (20) traveling on the road as second operation information; a guidance information creation unit (120) that, based on the first operation information and the second operation information, creates guidance information showing the planned route along which the first vehicle (10) or the second vehicle (20) will travel; and a guidance information presentation unit (130) that presents the guidance information to the first vehicle (10) or the second vehicle (20). The guidance information creation unit (120) creates the guidance information when the first operation information and the second operation information include a course change in the same direction, and include a right-turn operation and a left-turn operation.
PCT/JP2020/039290 2019-11-06 2020-10-19 Système d'assistance à la conduite de véhicule, dispositif de tracé de surface de route, et route WO2021090668A1 (fr)

Applications Claiming Priority (10)

Application Number Priority Date Filing Date Title
JP2019201392A JP2021076956A (ja) 2019-11-06 2019-11-06 車両運転支援システム
JP2019-201392 2019-11-06
JP2019-203610 2019-11-09
JP2019203610A JP2021077125A (ja) 2019-11-09 2019-11-09 車両運転支援システム
JP2019205095A JP2021075952A (ja) 2019-11-12 2019-11-12 道路および車両運転支援システム
JP2019-205095 2019-11-12
JP2019-209198 2019-11-19
JP2019209198A JP7403288B2 (ja) 2019-11-19 2019-11-19 路面描画装置
JP2019-211412 2019-11-22
JP2019211412A JP7348819B2 (ja) 2019-11-22 2019-11-22 車両運転支援システム

Publications (1)

Publication Number Publication Date
WO2021090668A1 true WO2021090668A1 (fr) 2021-05-14

Family

ID=75849751

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/039290 WO2021090668A1 (fr) 2019-11-06 2020-10-19 Système d'assistance à la conduite de véhicule, dispositif de tracé de surface de route, et route

Country Status (1)

Country Link
WO (1) WO2021090668A1 (fr)

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS508428Y1 (fr) * 1969-04-02 1975-03-13
JPH07197407A * 1993-12-28 1995-08-01 Yoshida Doro Kk Bright light-colored asphalt pavement surface and construction method thereof
JP2005006152A * 2003-06-13 2005-01-06 Auto Network Gijutsu Kenkyusho:Kk Approach notification system
JP2006190187A * 2005-01-07 2006-07-20 Toyota Motor Corp Traffic control device and traffic control system
JP2007168727A * 2005-12-26 2007-07-05 Aisin Aw Co Ltd Driving support device, driving support system, and driving support program
JP2008143505A * 2006-11-16 2008-06-26 Denso Corp Headlamp control device
JP2011178257A * 2010-03-01 2011-09-15 Koito Mfg Co Ltd Vehicle lamp system
WO2016163294A1 * 2015-04-10 2016-10-13 Hitachi Maxell, Ltd. Video projection device
JP2016205103A * 2015-04-28 2016-12-08 Denso Corp Road surface marking structure and road surface marking system
JP2017010463A * 2015-06-25 2017-01-12 Denso Corp Vehicle information providing device
WO2017126250A1 * 2016-01-22 2017-07-27 Nissan Motor Co., Ltd. Driving assistance method and device
JP2017138766A * 2016-02-03 2017-08-10 Mitsubishi Electric Corp Vehicle approach detection device
JP2018084955A * 2016-11-24 2018-05-31 Koito Manufacturing Co., Ltd. Unmanned aerial vehicle

Similar Documents

Publication Publication Date Title
US10232713B2 (en) Lamp for a vehicle
US10816982B2 (en) Vehicle control device mounted on vehicle and method for controlling the vehicle
KR101908308B1 (ko) 차량용 램프
US9952051B2 (en) Advanced driver assistance apparatus, display apparatus for vehicle and vehicle
KR101982774B1 (ko) 자율 주행 차량
CN109720267B (zh) 车辆用灯具系统
JP4720764B2 (ja) 前照灯制御装置
US20090135024A1 (en) Display control system of traffic light and display method
US20180056858A1 (en) Vehicle signaling system
CN110126719B (zh) 车辆用照明系统及车辆
CN112752678A (zh) 车辆的起步通知显示装置
JP7348819B2 (ja) 車両運転支援システム
JP7370805B2 (ja) 車両の路面描画装置。
WO2021090668A1 (fr) Système d'assistance à la conduite de véhicule, dispositif de tracé de surface de route, et route
CN113044054A (zh) 车辆控制装置、车辆控制方法和程序
WO2020085505A1 (fr) Dispositif de tracé de surface routière pour véhicule
US11731554B2 (en) Vehicle departure notification display device
CN111519553A (zh) 用于城市高架桥下交叉路口的交通引导系统及方法
JP2021075952A (ja) 道路および車両運転支援システム
KR101908310B1 (ko) 차량용 램프
KR20210004186A (ko) 유도신호 표시 시스템 및 방법
WO2020230523A1 (fr) Système de transport et infrastructure de transport
WO2020162455A1 (fr) Réverbère
KR102652239B1 (ko) 거리에 따른 개별 차량 맞춤형 신호등 시스템 및 방법
WO2023187991A1 (fr) Appareil de commande de corps mobile et procédé de commande de corps mobile

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20883876

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20883876

Country of ref document: EP

Kind code of ref document: A1