WO2018029978A1 - Vehicle exterior display processing device and vehicle exterior display system - Google Patents
Vehicle exterior display processing device and vehicle exterior display system
- Publication number
- WO2018029978A1 (PCT/JP2017/022093)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- vehicle
- display
- display device
- outside
- scene
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Q—ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
- B60Q1/00—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Q—ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
- B60Q1/00—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
- B60Q1/26—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
Definitions
- the present disclosure relates to a vehicle outside display processing device that performs display toward the periphery of a vehicle, and a vehicle outside display system including the vehicle outside display processing device.
- Patent Document 1 discloses a technique in which a road surface projection device irradiates a laser beam onto the road surface around a vehicle so that other people, such as occupants of other vehicles and pedestrians, can recognize the vehicle's position at each point in time from the present up to three seconds later.
- In the technique of Patent Document 1, however, because the field of view of a person's central vision is limited, another person whose line of sight is not directed at the road surface cannot recognize the display drawn there. Since the direction of the line of sight varies with the traffic situation, for example being directed farther ahead while moving than while stopped, it is quite conceivable that people around the vehicle do not have their line of sight directed at the road surface.
- This disclosure aims to provide a vehicle exterior display processing device and a vehicle exterior display system that allow people around the host vehicle to more reliably recognize a display presenting information toward the vicinity of the host vehicle.
- A vehicle exterior display processing device according to a first aspect of this disclosure is used in a vehicle and includes: a display processing unit capable of causing at least one of a plurality of types of vehicle exterior display devices to display information, the plurality of types being a road surface display device that performs display from the vehicle toward the road surface around the vehicle, an aerial display device that performs display from the vehicle toward the air around the vehicle, and an own vehicle outer surface display device that performs display on the outer surface of the vehicle; and a display device determination unit that determines the vehicle exterior display device to be caused to display by the display processing unit.
- The display device determination unit determines the vehicle exterior display device to be caused to display by the display processing unit so that a plurality of types of vehicle exterior display devices display simultaneously, or so that display is switched among the plurality of types of vehicle exterior display devices.
- A vehicle exterior display system according to another aspect of this disclosure is used in a vehicle and includes: at least one of the plurality of types of vehicle exterior display devices, namely the road surface display device that performs display from the vehicle toward the road surface around the vehicle, the aerial display device that performs display from the vehicle toward the air around the vehicle, and the own vehicle outer surface display device that performs display on the outer surface of the vehicle; and the vehicle exterior display processing device according to the first aspect.
- In either configuration, the display device determination unit determines the vehicle exterior display device so that a plurality of types of vehicle exterior display devices display simultaneously, or so that display is switched among them. It is therefore possible to move the display to a location other than the road surface, or to display at such a location simultaneously with the road surface, which prevents the display from always being performed only on the road surface.
- FIG. 1 is a diagram illustrating an example of a schematic configuration of a driving support system.
- FIG. 2 is a diagram for explaining the road surface display device.
- FIG. 3 is a diagram for explaining the aerial display device.
- FIG. 4 is a diagram for explaining the own vehicle outer surface display device.
- FIG. 5 is a diagram for explaining the own vehicle outer surface display device.
- FIG. 6 is a diagram illustrating an example of a schematic configuration of the driving assistance ECU.
- FIG. 7 is a flowchart showing an example of the flow of the vehicle exterior display related processing in the driving support ECU.
- FIG. 8 is a diagram showing a display example by the road surface display device and the aerial display device in the caution area travel scene.
- FIG. 9 is a diagram showing a display example by the aerial display device and the own vehicle outer surface display device in the start request scene.
- A driving assistance system 1 shown in FIG. 1 is used in an automobile (hereinafter simply referred to as a vehicle) and includes a driving assistance ECU 10, an ADAS (Advanced Driver Assistance Systems) locator 20, a surrounding monitoring sensor 30, a vehicle control ECU 40, and a vehicle exterior display device 50.
- The driving assistance ECU 10, the ADAS locator 20, and the vehicle control ECU 40 are assumed to be connected to an in-vehicle LAN, for example.
- a vehicle equipped with the driving support system 1 is referred to as a host vehicle.
- the ADAS locator 20 includes a GNSS (Global Navigation Satellite System) receiver, an inertial sensor, and a map database (hereinafter referred to as DB) storing map data.
- the GNSS receiver receives positioning signals from a plurality of artificial satellites.
- the inertial sensor includes, for example, a triaxial gyro sensor and a triaxial acceleration sensor.
- the map DB is a non-volatile memory and stores map data such as link data, node data, and road shapes.
- The link data consists of data such as a link ID identifying the link, a link length indicating the length of the link, a link direction, a link travel time, link shape information, node coordinates (latitude/longitude) of the start and end of the link, and road attributes.
- the road attributes include road name, road type, road width, number of lanes, speed regulation value, and the like.
- The node data consists of data such as a node ID, node coordinates, a node name, a node type, connection link IDs listing the link IDs of the links connected to the node, and an intersection type.
- the ADAS locator 20 sequentially measures the vehicle position of the vehicle on which the ADAS locator 20 is mounted by combining the positioning signal received by the GNSS receiver and the measurement result of the inertial sensor.
- The vehicle position may also be measured using a travel distance obtained from pulse signals sequentially output from a wheel speed sensor mounted on the host vehicle. The measured vehicle position is output to the in-vehicle LAN.
- the ADAS locator 20 also reads map data from the map DB and outputs it to the in-vehicle LAN.
- the map data may be obtained from a server outside the vehicle using a vehicle-mounted communication module used for telematics communication such as DCM (Data Communication Module) mounted on the vehicle.
- The surrounding monitoring sensor 30 detects obstacles around the host vehicle, including moving objects such as pedestrians, animals other than humans, vehicles other than the host vehicle, and people driving other vehicles, and stationary objects such as fallen objects on the road, guardrails, curbstones, and trees. It may also detect road markings such as travel lane lines and stop lines around the vehicle. Other vehicles here include bicycles, motorcycles, automobiles, and the like other than the host vehicle.
- The surrounding monitoring sensor 30 is, for example, a surrounding monitoring camera that captures a predetermined range around the host vehicle, or a sensor that transmits an exploration wave toward a predetermined range around the host vehicle, such as a millimeter-wave radar, sonar, or LIDAR (Light Detection and Ranging / Laser Imaging Detection and Ranging).
- the peripheral monitoring camera sequentially outputs captured images that are sequentially captured to the driving assistance ECU 10 as sensing results.
- a sensor that transmits an exploration wave such as sonar, millimeter wave radar, or LIDAR sequentially outputs a scanning result based on a received signal obtained when a reflected wave reflected by an obstacle is received to the driving support ECU 10 as a sensing result.
- the vehicle control ECU 40 is an electronic control device that performs acceleration / deceleration control and / or steering control of the host vehicle.
- the vehicle control ECU 40 includes a steering ECU that performs steering control, a power unit control ECU that performs acceleration / deceleration control, a brake ECU, and the like.
- The vehicle control ECU 40 acquires detection signals output from sensors mounted on the host vehicle, such as an accelerator position sensor, a brake pedal force sensor, a steering angle sensor, and a wheel speed sensor, and outputs control signals to travel control devices such as an electronically controlled throttle, a brake actuator, and an EPS (Electric Power Steering) motor. The vehicle control ECU 40 can also output the detection signals of these sensors to the in-vehicle LAN.
- the outside display device 50 performs a display for presenting information toward the outside of the vehicle.
- a road surface display device 51, an aerial display device 52, and a host vehicle outer surface display device 53 are provided as the vehicle outside display device 50.
- the road surface display device 51, the aerial display device 52, and the host vehicle outer surface display device 53 will be described with reference to FIGS.
- the road surface display device 51 performs display from the own vehicle toward the road surface around the own vehicle.
- the road surface display device 51 is a device that projects light, and is mounted on the host vehicle so that the projected light is directed toward the road surface.
- the road surface display device 51 may project visible light onto the road surface, or may draw on the road surface with a visible light semiconductor laser. Further, the road surface display device 51 may be mounted on the roof, bonnet, headlamp, bumper, etc. of the own vehicle.
- Examples of display representations on the road surface by the road surface display device 51 include display of various shapes such as arrows, blinking of the display, division of the display area (see A in FIG. 2), and the like.
- When the display area is divided, the display color, the display luminance, or the size of the area may differ for each divided area.
- The road surface display device 51 can switch the distance and area over which it projects light by using an actuator to switch the angle at which light is projected toward the road surface.
- Alternatively, it may be configured to switch the distance and area over which it projects light by switching the number of light sources that project light.
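- The link between projection angle and where the display lands on the road is simple geometry, which is why steering the beam angle with an actuator also moves the display nearer or farther. The sketch below is not part of the patent; the mounting height and tilt angles are assumed purely for illustration, and a flat road is assumed.

```python
import math

def projection_distance(mount_height_m: float, tilt_deg: float) -> float:
    """Distance ahead at which a beam tilted tilt_deg below the horizon,
    from a lamp mounted mount_height_m above the road, meets a flat road."""
    return mount_height_m / math.tan(math.radians(tilt_deg))

# Example: a lamp assumed to sit 0.7 m above the road surface
for tilt in (4.0, 8.0, 16.0):
    print(f"tilt {tilt:4.1f} deg -> display lands {projection_distance(0.7, tilt):5.1f} m ahead")
```

- Steepening the tilt pulls the drawn display toward the vehicle, while a shallower tilt pushes it farther ahead.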
- the aerial display device 52 performs display from the own vehicle toward the air around the own vehicle.
- the aerial display device 52 is a device that projects light, and is mounted on the host vehicle so that the projected light travels into the air.
- the aerial display device 52 may emit light by turning the atmosphere into plasma by focusing the laser beam in the air.
- water vapor may be blown into the air in the form of a mist, and light may be projected or laser-drawn onto the mist of water vapor.
- The aerial display device 52 may be mounted on the roof, bonnet, headlamp, bumper, or the like of the host vehicle. Examples of the representation of the display into the air by the aerial display device 52 include display of various shapes such as arrows (see B in FIG. 3), blinking of the display, rotation of the display, and the like.
- the own vehicle outer surface display device 53 performs display on the outer surface of the own vehicle as shown in FIGS.
- The own vehicle outer surface display device 53 may be a light source such as LEDs arranged on the outer surface of the host vehicle, a display panel arranged on the outer surface of the host vehicle, or a projector that projects light onto the outer surface of the host vehicle.
- the location displayed by the own vehicle outer surface display device 53 may be the hood, fender, wheel, windshield, etc. of the own vehicle.
- Examples of display expression by the own vehicle outer surface display device 53 include display of various shapes such as arrows (see C and D in FIG. 4 and F in FIG. 5), display of text, blinking of the display, rotation of the display, and the like.
- For example, rotation may be expressed by blinking, in sequence around the circumference, a plurality of LED light sources arranged in the circumferential direction of a wheel (see E in FIG. 4).
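- As a rough illustration of how sequential blinking can suggest rotation, the sketch below steps one lit LED around the wheel. It is only a sketch: the set_led function is a hypothetical stand-in for whatever lamp driver the hardware actually uses, and the LED count and timing are assumed.

```python
import time

def set_led(index: int, on: bool) -> None:
    """Hypothetical hardware hook; here it only prints the commanded state."""
    print(f"LED {index:2d} {'ON' if on else 'off'}")

def rotate_wheel_leds(num_leds: int = 12, revolution_s: float = 1.2, cycles: int = 2) -> None:
    """Light one LED at a time around the wheel circumference so the lit
    spot appears to travel in the wheel's rotation direction."""
    dwell = revolution_s / num_leds
    for step in range(num_leds * cycles):
        idx = step % num_leds
        set_led(idx, True)
        time.sleep(dwell)
        set_led(idx, False)

rotate_wheel_leds()
```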
- the driving support ECU 10 includes a processor, a volatile memory, a non-volatile memory, an I / O, and a bus connecting them, and executes various processes by executing a control program stored in the non-volatile memory.
- the driving assistance ECU 10 recognizes the traveling environment of the host vehicle from the vehicle position and map data of the host vehicle acquired from the ADAS locator 20, the sensing result acquired from the surrounding monitoring sensor 30, and the like. In addition, the driving assistance ECU 10 generates a driving plan for driving the vehicle by automatic driving based on the recognized driving environment.
- For example, a recommended route for directing the host vehicle to the destination is generated.
- In addition, a short-term travel plan for traveling in accordance with the recommended route is generated.
- Specifically, execution of steering for a lane change, acceleration/deceleration for speed adjustment, steering and braking for obstacle avoidance, and the like is determined.
- the driving support ECU 10 causes the vehicle control ECU 40 to automatically perform acceleration, braking, and / or steering of the host vehicle according to the generated travel plan.
- Furthermore, the driving assistance ECU 10 determines which vehicle exterior display device 50 to use according to a scene that classifies the relationship between the host vehicle and the target around the vehicle to which information should be presented, and causes the determined vehicle exterior display device 50 to display.
- This driving assistance ECU 10 therefore corresponds to the vehicle exterior display processing device.
- a configuration including the driving support ECU 10 and the outside display device 50 corresponds to the outside display system. Details of processing in the driving support ECU 10 will be described below.
- The driving support ECU 10 includes a travel environment recognition unit 101, a travel plan generation unit 102, an automatic driving function unit 103, a vehicle information acquisition unit 104, a scene specifying unit 105, a display device determination unit 106, a display content determination unit 107, and a display processing unit 108. Some or all of the functions executed by the driving support ECU 10 may be implemented in hardware by one or more ICs.
- The travel environment recognition unit 101 recognizes the travel environment of the host vehicle, namely the position, shape, and movement state of objects around the vehicle, from the vehicle position and map data acquired from the ADAS locator 20, the sensing results acquired from the surrounding monitoring sensor 30, and the like. For example, the position, shape, and movement state of objects around the host vehicle may be recognized by a known method using the surrounding monitoring camera and/or a sensor that transmits an exploration wave.
- the traveling environment recognition unit 101 detects road markings such as traveling lane markings by well-known image recognition processing such as edge detection from captured image data acquired from the periphery monitoring camera.
- the travel environment recognition unit 101 may be configured to recognize the travel environment of the host vehicle using information on other vehicles acquired through vehicle-to-vehicle communication or road-to-vehicle communication.
- The travel environment recognition unit 101 also recognizes the surrounding person state, such as the position, gaze direction, gaze point, walking ability, intention to pass, and traffic state of people outside the vehicle, including people driving vehicles such as automobiles, bicycles, or motorcycles, as well as pedestrians.
- a person may be recognized by a known image recognition process such as template matching from captured image data acquired from a peripheral monitoring camera.
- the position of the person may be recognized by a known method using a peripheral monitoring camera and / or a sensor that transmits an exploration wave.
- The gaze direction may be recognized as the face direction detected from the captured image data acquired from the surrounding monitoring camera, using a known technique for detecting face direction from captured images of a face. If the position of the person's pupils or eyeballs can be recognized, the gaze direction may instead be recognized from the pupil or eyeball position. The gaze point may be recognized from the person's position and gaze direction.
- a configuration may be adopted in which a person with low walking ability such as an elderly person or a child is recognized by a known image recognition process such as template matching for a pedestrian.
- The intention to pass may be recognized from the person's position, gaze direction, and gaze point. For example, if a person is looking toward or gazing at a pedestrian crossing, a roadway outside a pedestrian crossing, or a stopped vehicle, an intention to pass over the pedestrian crossing, over the roadway outside the pedestrian crossing, or in front of the stopped vehicle may be recognized.
- The intention to cross a pedestrian crossing may be recognized in a similar manner.
- The traffic state may be recognized from the person's position; for example, if a person is located on a pedestrian crossing, it is recognized that the person is crossing it. The travel environment recognition unit 101 therefore corresponds to a target information detection unit.
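- One way to combine these cues, position, gaze direction, and gaze point, into a crossing-intention estimate is sketched below. This is not the patent's algorithm: the coordinate frame, the ray-based gaze approximation, and the distance thresholds are assumptions made only for illustration.

```python
import math
from dataclasses import dataclass

@dataclass
class Pedestrian:
    x: float         # position in a vehicle-fixed ground frame [m]
    y: float
    gaze_deg: float  # gaze direction, 0 deg = +x axis, counter-clockwise

def gaze_ray_distance(p: Pedestrian, target: tuple[float, float]) -> float:
    """Perpendicular distance from target to the person's gaze ray
    (infinite if the target lies behind the person)."""
    a = math.radians(p.gaze_deg)
    dx, dy = target[0] - p.x, target[1] - p.y
    along = dx * math.cos(a) + dy * math.sin(a)       # projection onto the gaze direction
    if along < 0.0:
        return math.inf                               # looking away from the target
    return abs(-dx * math.sin(a) + dy * math.cos(a))  # cross-track distance to the ray

def wants_to_cross(p: Pedestrian, crossing: tuple[float, float],
                   radius_m: float = 3.0) -> bool:
    """Crossing intention if the person stands near the crossing or is
    gazing at it (radius_m is an assumed threshold)."""
    standing_near = math.hypot(p.x - crossing[0], p.y - crossing[1]) <= radius_m
    gazing_at = gaze_ray_distance(p, crossing) <= radius_m
    return standing_near or gazing_at

ped = Pedestrian(x=8.0, y=-5.0, gaze_deg=90.0)     # at the kerb, looking across the road
print(wants_to_cross(ped, crossing=(8.0, 0.0)))    # True: the gaze ray passes over the crossing
```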
- The surrounding person state may also be recognized using a model learned from surrounding person states recognized in the past as training data.
- the travel plan generation unit 102 generates a travel plan for driving the vehicle by automatic driving.
- the travel plan generated by the travel plan generation unit 102 is output to the automatic driving function unit 103.
- the travel plan generation unit 102 uses the vehicle position and map data of the host vehicle acquired from the ADAS locator 20 to generate a recommended route for directing the host vehicle to the destination as a medium to long-term travel plan.
- The recommended route may be searched for using Dijkstra's algorithm, with link costs set so that, for example, roads suited to traveling by automatic driving are prioritized.
- the travel plan generation unit 102 generates a short-term travel plan for traveling according to the recommended route based on the travel environment of the host vehicle recognized by the travel environment recognition unit 101. As specific examples, it is determined to execute steering for lane change, acceleration / deceleration for speed adjustment, stopping at a temporary stop position, steering and braking for obstacle avoidance, and the like.
- The automatic driving function unit 103 causes the vehicle control ECU 40 to automatically perform acceleration, braking, and/or steering of the host vehicle, thereby performing the driving operation in place of the driver.
- This function of taking over the driving operation is called the automatic driving function.
- Examples of the automatic driving function include a function that controls the traveling speed of the host vehicle by adjusting driving force and braking force so as to maintain a target inter-vehicle distance from the preceding vehicle, a function that automatically moves the host vehicle to an adjacent lane, and a function that forcibly decelerates the host vehicle by generating braking force based on the forward sensing result.
- Further examples include a function that performs acceleration/deceleration and steering so that the host vehicle travels along the recommended route generated by the travel plan generation unit 102 and along a recommended travel locus, and a function that automatically stops the vehicle on a road shoulder or the like in an emergency.
- The functions described here are merely examples, and the configuration may include other automatic driving functions.
- the vehicle information acquisition unit 104 acquires a state quantity related to the behavior of the host vehicle from the detection result of each sensor output via the vehicle control ECU 40.
- For example, vehicle information such as the vehicle speed of the host vehicle is acquired.
- the vehicle information acquisition unit 104 may acquire the detection result of the sensor without using the vehicle control ECU 40.
- The scene specifying unit 105 identifies, from the travel environment recognized by the travel environment recognition unit 101 and the vehicle information acquired by the vehicle information acquisition unit 104, a scene that classifies the relationship between the host vehicle and the target around the vehicle to which information should be presented. In other words, the scene is identified from the vehicle position and map data of the host vehicle, the sensing results acquired from the surrounding monitoring sensor 30, and the vehicle information acquired by the vehicle information acquisition unit 104. As an example, scenes may be classified according to whether the host vehicle is traveling, the situation when the host vehicle is stopped, whether a pedestrian or a person driving a vehicle exists within the sensing range, the surrounding person state recognized by the travel environment recognition unit 101, the road classification on which the host vehicle is located, the arrangement of vehicles around the host vehicle, and the like.
- One scene identified by the scene specifying unit 105 is a scene in which the host vehicle is traveling in an area where attention should be paid to its surroundings (hereinafter, a caution area); this scene is hereinafter referred to as the caution area travel scene.
- the scene specifying unit 105 may determine that the host vehicle is traveling from the vehicle speed of the host vehicle acquired by the vehicle information acquiring unit 104.
- the scene specifying unit 105 may determine the caution area from the traveling environment of the host vehicle recognized by the traveling environment recognition unit 101.
- Examples of caution areas include road areas with one lane or fewer per direction, road areas where pedestrians are present on the road, road areas where pedestrians with low walking ability such as children or elderly people are present, road areas whose distance to an intersection is within a threshold range, and road areas where oncoming or parked vehicles exist.
- Examples of areas excluded from the caution area include road areas where a preceding vehicle exists within a forward threshold range, road areas corresponding to motorways, and off-road areas such as private land and parking lots.
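- These inclusion and exclusion examples can be captured as a simple predicate. The sketch below is only an illustration of that logic; the numeric thresholds and the exact set of flags are assumptions, not values stated in the patent.

```python
import math
from dataclasses import dataclass

@dataclass
class RoadContext:
    lanes_per_direction: int
    pedestrians_present: bool
    low_mobility_pedestrians: bool       # children, elderly people, etc.
    distance_to_intersection_m: float
    oncoming_or_parked_vehicles: bool
    preceding_vehicle_distance_m: float  # math.inf when no vehicle ahead
    motorway_or_off_road: bool           # motorway, private land, parking lot, ...

INTERSECTION_THRESHOLD_M = 30.0   # assumed; the text only says "within the threshold range"
PRECEDING_THRESHOLD_M = 20.0      # assumed; the text only says "within a forward threshold range"

def is_caution_area(ctx: RoadContext) -> bool:
    """Caution-area decision following the inclusion/exclusion examples above."""
    if ctx.motorway_or_off_road:                                   # excluded areas
        return False
    if ctx.preceding_vehicle_distance_m <= PRECEDING_THRESHOLD_M:  # excluded: close preceding vehicle
        return False
    return (ctx.lanes_per_direction <= 1
            or ctx.pedestrians_present
            or ctx.low_mobility_pedestrians
            or ctx.distance_to_intersection_m <= INTERSECTION_THRESHOLD_M
            or ctx.oncoming_or_parked_vehicles)

ctx = RoadContext(1, True, False, 120.0, False, math.inf, False)
print(is_caution_area(ctx))   # True: single-lane road with pedestrians present
```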
- The scene specifying unit 105 may determine that the host vehicle is stopped from the vehicle speed of the host vehicle acquired by the vehicle information acquisition unit 104. Moreover, the scene specifying unit 105 may determine that a start of the host vehicle is desired when, for example, the host vehicle has remained stopped for a predetermined time or more while a vehicle following the host vehicle exists.
- The predetermined time here can be set arbitrarily and may be, for example, one minute; the stop time of the host vehicle may be measured with a timer circuit or the like.
- the presence of the vehicle following the own vehicle may be determined from the traveling environment of the own vehicle recognized by the traveling environment recognition unit 101.
- The presence of following vehicles other than the one immediately behind the host vehicle may be recognized by the travel environment recognition unit 101 using information on other vehicles acquired through vehicle-to-vehicle or road-to-vehicle communication.
- the scene specifying unit 105 may determine that the start is not desired when the above-mentioned determination that the start is desired is not made while the host vehicle is stopped by the automatic driving function.
- It may also be determined that a start is not desired when an operation input unit that accepts an operation input giving priority to the passage of another person receives such an input.
- Whether a pedestrian or vehicle estimated to intend to pass in a direction crossing the host vehicle is located in front of the host vehicle may be determined from the travel environment of the host vehicle recognized by the travel environment recognition unit 101; this situation corresponds to the passage promotion scene described later.
- The intention of a pedestrian or vehicle to pass may be determined from the intention to pass included in the surrounding person state described above. It may also be determined based on the travel environment recognition unit 101 recognizing that the pedestrian or vehicle is waiting at a pedestrian crossing.
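- Putting the three scenes together, the scene specifying unit 105 can be thought of as a classifier over the vehicle state and the recognized surroundings. The sketch below mirrors that description: the one-minute stop threshold follows the example given above, while the other inputs are simplified to boolean flags assumed for illustration.

```python
from enum import Enum, auto

class Scene(Enum):
    CAUTION_AREA_TRAVEL = auto()   # traveling inside a caution area
    START_REQUEST = auto()         # stopped long enough with a following vehicle
    PASSAGE_PROMOTION = auto()     # stopped with a crossing pedestrian/vehicle ahead
    NONE = auto()

def specify_scene(speed_kmh: float, in_caution_area: bool,
                  stopped_time_s: float, following_vehicle: bool,
                  crossing_target_ahead: bool,
                  stop_threshold_s: float = 60.0) -> Scene:
    """Scene classification sketched from the description above."""
    if speed_kmh > 0.0:                                        # host vehicle is traveling
        return Scene.CAUTION_AREA_TRAVEL if in_caution_area else Scene.NONE
    if stopped_time_s >= stop_threshold_s and following_vehicle:
        return Scene.START_REQUEST                             # a start of the host vehicle is desired
    if crossing_target_ahead:
        return Scene.PASSAGE_PROMOTION                         # let the crossing target pass first
    return Scene.NONE

print(specify_scene(speed_kmh=25.0, in_caution_area=True,
                    stopped_time_s=0.0, following_vehicle=False,
                    crossing_target_ahead=False))              # Scene.CAUTION_AREA_TRAVEL
```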
- the display device determination unit 106 determines the outside-vehicle display device 50 that performs display according to the scene specified by the scene specification unit 105.
- When the scene specifying unit 105 identifies the caution area travel scene, the display device determination unit 106 determines at least the road surface display device 51 and the aerial display device 52 among the vehicle exterior display devices 50 as the devices that perform display.
- In the following description, it is assumed that the road surface display device 51 and the aerial display device 52 are determined as the vehicle exterior display devices 50 that perform display when the caution area travel scene is identified.
- When the start request scene is identified, the display device determination unit 106 determines at least the aerial display device 52 and the own vehicle outer surface display device 53 among the vehicle exterior display devices 50 as the devices that perform display.
- The road surface display device 51 may also be determined as a vehicle exterior display device 50 that performs display.
- In the following description, it is assumed that the aerial display device 52 and the own vehicle outer surface display device 53 are determined as the vehicle exterior display devices 50 that perform display.
- When the passage promotion scene is identified, the display device determination unit 106 determines at least the road surface display device 51 and the own vehicle outer surface display device 53 among the vehicle exterior display devices 50 as the devices that perform display.
- The aerial display device 52 may also be determined as a vehicle exterior display device 50 that performs display.
- In the following description, however, it is assumed that the road surface display device 51 and the own vehicle outer surface display device 53 are determined as the vehicle exterior display devices 50 that perform display. Since a display in the air could appear to block the other person's path and thereby hinder their passage, it is preferable to exclude the aerial display device 52 from the devices that perform display when the passage promotion scene is identified.
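- The device selection described above reduces to a small lookup from scene to the set of vehicle exterior display devices 50 that display simultaneously. The sketch below uses the combinations assumed in this description; scene names are plain strings here so the snippet stands on its own.

```python
from enum import Enum, auto

class Device(Enum):
    ROAD_SURFACE = auto()    # road surface display device 51
    AERIAL = auto()          # aerial display device 52
    OUTER_SURFACE = auto()   # own vehicle outer surface display device 53

# Scene -> devices caused to display simultaneously, per the combinations above.
DEVICES_FOR_SCENE = {
    "caution_area_travel": {Device.ROAD_SURFACE, Device.AERIAL},
    "start_request":       {Device.AERIAL, Device.OUTER_SURFACE},
    "passage_promotion":   {Device.ROAD_SURFACE, Device.OUTER_SURFACE},
}

def determine_display_devices(scene: str) -> set[Device]:
    """Display device determination unit 106, reduced to a table lookup."""
    return DEVICES_FOR_SCENE.get(scene, set())

print(determine_display_devices("start_request"))
```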
- the display content determination unit 107 determines the display content on the outside display device 50 that has been determined to be displayed by the display device determination unit 106 according to the scene specified by the scene specification unit 105. Details of the display contents will be described later.
- the display processing unit 108 causes the outside display device 50 that has been determined to be displayed by the display device determination unit 106 to display the display content determined by the display content determination unit 107.
- An example of the flow of processing in the driving support ECU 10 related to display on the vehicle exterior display device 50 according to the scene (hereinafter, the vehicle exterior display related processing) will be described using the flowchart of FIG. 7. The flowchart of FIG. 7 may be configured to start when, for example, the ignition power of the host vehicle is turned on.
- In step S1, when the scene specifying unit 105 identifies the scene as the caution area travel scene (YES in S1), the process proceeds to step S2. Otherwise (NO in S1), the process proceeds to step S5. In step S1, the condition that the host vehicle is at or below a creeping speed (for example, 10 km/h or less) may also be added.
- In step S2, the display device determination unit 106 determines the road surface display device 51 and the aerial display device 52 as the vehicle exterior display devices 50 that perform display.
- In step S3, the display content determination unit 107 determines the display content to be displayed by the road surface display device 51 and the aerial display device 52.
- the road surface display device 51 may be determined to display a range indicating that the host vehicle cannot be stopped.
- the aerial display device 52 may be determined so as to perform a display informing that the vehicle is approaching.
- the display processing unit 108 causes the road surface display device 51 and the aerial display device 52 to display simultaneously.
- the display processing unit 108 causes the road surface display device 51 to display a range indicating that the host vehicle cannot be stopped.
- The range in which the host vehicle cannot stop may be the range extending forward from the host vehicle up to the host vehicle's stopping distance.
- The stopping distance is the sum of the so-called free-running distance and the braking distance, and a distance estimated according to the vehicle speed of the host vehicle may be used.
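- The stopping distance referred to here is the familiar free-running-plus-braking formula, d = v·t_r + v²/(2a). A minimal sketch follows; the reaction time and deceleration values are assumed for illustration, not taken from the patent.

```python
def stopping_distance_m(speed_kmh: float,
                        reaction_time_s: float = 1.0,
                        decel_mps2: float = 6.0) -> float:
    """Free-running distance (speed x reaction time) plus braking distance
    (v^2 / 2a): the range the road surface display marks as 'cannot stop'."""
    v = speed_kmh / 3.6                  # km/h -> m/s
    return v * reaction_time_s + v * v / (2.0 * decel_mps2)

for kmh in (20, 40, 60):
    print(f"{kmh} km/h -> cannot stop within {stopping_distance_m(kmh):5.1f} m")
```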
- the display processing unit 108 may display the range in which the host vehicle cannot be stopped by projecting light from the road surface display device 51 to the range in which the host vehicle cannot be stopped on the road surface (see G in FIG. 8).
- The display processing unit 108 may switch the area onto which light is projected by switching the angle at which light is projected and the number of light sources that project light. The display showing the range in which the host vehicle cannot stop may also be configured with varying color and luminance.
- the display processing unit 108 causes the aerial display device 52 to perform a display informing that the vehicle is approaching.
- The display processing unit 108 may perform a display informing that the host vehicle is approaching by projecting light from the aerial display device 52 so that the display extends forward into the air along the host vehicle's course (see H in FIG. 8).
- To enable an early warning to the surroundings of the host vehicle, it is preferable that the display notifying of the host vehicle's approach shown by the aerial display device 52 extend farther than the display of the range in which the host vehicle cannot stop shown by the road surface display device 51.
- the display notifying that the host vehicle is approaching may be a display in which arrows indicating the traveling direction of the host vehicle are continued, or may be blinked to draw attention.
- the display processing unit 108 may be configured to cause the host vehicle outer surface display device 53 to display the vehicle speed and acceleration / deceleration status of the host vehicle.
- the vehicle speed and acceleration / deceleration status of the host vehicle may be represented by text such as “40 km / h” or “accelerating”.
- In step S5, when the host vehicle is stopped (YES in S5), the process proceeds to step S6. Otherwise (NO in S5), the display processing unit 108 does not perform display on the vehicle exterior display device 50, and the process proceeds to step S14.
- In step S6, when the scene specifying unit 105 identifies the scene as the start request scene (YES in S6), the process proceeds to step S7. Otherwise (NO in S6), the process proceeds to step S10.
- In step S7, the display device determination unit 106 determines the aerial display device 52 and the own vehicle outer surface display device 53 as the vehicle exterior display devices 50 that perform display.
- In step S8, the display content determination unit 107 determines the display content to be displayed by the aerial display device 52 and the own vehicle outer surface display device 53. As an example, both the aerial display device 52 and the own vehicle outer surface display device 53 may be determined to perform a display notifying of the host vehicle's start.
- the display processing unit 108 causes the aerial display device 52 and the own vehicle outer surface display device 53 to simultaneously display.
- the display processing unit 108 causes the aerial display device 52 to perform a display notifying the start of the own vehicle.
- the display processing unit 108 may perform a display informing the start of the host vehicle by projecting light from the aerial display device 52 so that the display indicating the traveling direction after the start of the host vehicle is performed in the air. (See I in FIG. 9).
- the display in the air informing the start of the host vehicle may be a display in which arrows indicating the direction of travel after the start of the host vehicle are continued, or may be blinked to draw attention.
- Since the display in the air notifying of the host vehicle's start is intended to call the attention of people near the host vehicle, it is preferable that it remain closer to the host vehicle than the display notifying of the host vehicle's approach in the caution area travel scene.
- the display processing unit 108 also causes the host vehicle outer surface display device 53 to perform a display notifying the start of the host vehicle.
- the display processing unit 108 causes the vehicle outer surface display device 53 to display a text such as “start” on the outer surface of the vehicle such as a windshield of the vehicle, thereby displaying a notification of the start of the vehicle. (See J in FIG. 9).
- The display on the outer surface of the host vehicle notifying of the start may instead show an arrow indicating the traveling direction after the start, or an arrow indicating the rotation direction of the wheels at the start. It may also rotate a display in the rotation direction of the wheels at the start, or blink several LED light sources arranged around the circumference of a wheel in sequence.
- The display processing unit 108 may also be configured to cause the road surface display device 51 to perform a display notifying of the host vehicle's start; for example, an arrow indicating the traveling direction after the start may be displayed on the road surface.
- In step S10, when the scene specifying unit 105 identifies the scene as the passage promotion scene (YES in S10), the process proceeds to step S11. Otherwise (NO in S10), the display processing unit 108 does not perform display on the vehicle exterior display device 50, and the process proceeds to step S14.
- In step S11, the display device determination unit 106 determines the road surface display device 51 and the own vehicle outer surface display device 53 as the vehicle exterior display devices 50 that perform display.
- In step S12, the display content determination unit 107 determines the display content to be displayed by the road surface display device 51 and the own vehicle outer surface display device 53. As an example, both the road surface display device 51 and the own vehicle outer surface display device 53 may be determined to perform a display prompting the target to pass.
- The target mentioned here is a pedestrian or a vehicle estimated to intend to pass in a direction crossing the host vehicle.
- the display processing unit 108 causes the road surface display device 51 and the own vehicle outer surface display device 53 to display simultaneously.
- the display processing unit 108 causes the road surface display device 51 to perform a display prompting the target to pass.
- the display processing unit 108 may perform a display that prompts the passage of the target by projecting light that causes the road surface display device 51 to display an arrow pointing in a direction intersecting with the own vehicle on the road surface.
- the display on the road surface that prompts the subject to pass may be a text display such as “Please pass first”, or may be blinked to draw attention.
- the display processing unit 108 also causes the host vehicle outer surface display device 53 to perform a display prompting the subject to pass.
- The display processing unit 108 may perform a display prompting the target to pass by causing the own vehicle outer surface display device 53 to display text such as “Please pass first” on an outer surface of the vehicle such as the windshield.
- The display on the outer surface of the host vehicle prompting the target to pass may instead show an arrow pointing toward the rear of the host vehicle, or an arrow indicating the direction opposite to the rotation direction of the wheels at the host vehicle's start.
- the display processing unit 108 may be configured to cause the aerial display device 52 to perform a display prompting the target to pass.
- In that case, it is preferable that the display in the air prompting the target to pass be shown so as to leave the target's path open and not obstruct it.
- For example, the target's passage may be encouraged by a display such as a tunnel oriented toward the direction in which the target passes, or a wall that blocks the area in front of the host vehicle.
- In step S14, when it is time to end the vehicle exterior display related processing (YES in S14), the processing ends. Otherwise (NO in S14), the process returns to S1 and repeats. An example of the end timing of the vehicle exterior display related processing is when the ignition power of the host vehicle is turned off.
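- For reference, one pass of the FIG. 7 flow can be summarized as a pure function from the identified scene and the stopped state to the devices and contents used, rather than driving real hardware. The scene names and content strings below are illustrative placeholders consistent with the sketches above, not APIs defined by the patent.

```python
def exterior_display_step(scene: str, vehicle_stopped: bool):
    """One pass of the FIG. 7 flow: returns (devices, content per device)."""
    if scene == "caution_area_travel":                          # S1 -> S2, S3
        return ({"road_surface", "aerial"},
                {"road_surface": "range within which the host vehicle cannot stop",
                 "aerial": "host vehicle approaching"})
    if vehicle_stopped:                                         # S5
        if scene == "start_request":                            # S6 -> S7, S8
            return ({"aerial", "outer_surface"},
                    {"aerial": "host vehicle about to start",
                     "outer_surface": "start"})
        if scene == "passage_promotion":                        # S10 -> S11, S12
            return ({"road_surface", "outer_surface"},
                    {"road_surface": "please pass first",
                     "outer_surface": "please pass first"})
    return (set(), {})                                          # no display this pass

print(exterior_display_step("start_request", vehicle_stopped=True))
```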
- According to the above, since a plurality of types of vehicle exterior display devices 50 display at the same time, the likelihood that the display falls within the field of view of people around the host vehicle can be increased compared with the case where only one type of vehicle exterior display device 50 displays.
- Moreover, since the set of vehicle exterior display devices 50 that display simultaneously is determined according to the scene identified by the scene specifying unit 105, display can be performed by the vehicle exterior display devices 50 that are more likely to enter the field of view of people around the vehicle in that scene.
- Furthermore, since the display content is determined for each vehicle exterior display device 50 according to the scene, display content suited to both the scene and the display location can be shown.
- In the caution area travel scene, the road surface display device 51 displays on the road surface the range in which the host vehicle cannot stop, and the aerial display device 52 displays in the air a display notifying that the host vehicle is approaching.
- The inventor has found through experiments that a walking pedestrian rarely turns their gaze toward the road surface at their feet and instead tends to look into the air in the direction of travel.
- With the configuration of the first embodiment, since display is performed both in the air and on the road surface in the caution area travel scene, even when the road surface display alone would be difficult for a pedestrian to notice, combining it with the display in the air makes the display easier to recognize.
- In addition, the display in the air makes people near the vehicle aware of its approach and more likely to look at the road surface, and the display on the road surface then lets them notice the range in which the vehicle cannot stop, which has the effect of raising their level of attention.
- In the start request scene, both the aerial display device 52 and the own vehicle outer surface display device 53 are caused to perform a display notifying of the host vehicle's start.
- The inventor has also found through experiments that a pedestrian directs their line of sight toward the vehicle body when the distance between the pedestrian and the vehicle is down to about 5 m, and that gaze movement toward the vehicle body decreases at closer distances.
- With the configuration of the first embodiment, since display is performed both in the air and on the outer surface of the host vehicle, even when the display on the outer surface alone would be difficult to recognize for a pedestrian only a few meters from the host vehicle, combining it with the display in the air makes the display easier to recognize. Moreover, displaying the host vehicle's start in the air makes it easier to deter passage in front of the vehicle.
- In the passage promotion scene, both the road surface display device 51 and the own vehicle outer surface display device 53 are caused to perform a display promoting the passage of the target.
- Since display is performed on the road surface and on the outer surface of the host vehicle in the passage promotion scene, even when the display on the outer surface alone would be difficult to recognize for a pedestrian several meters from the host vehicle, combining it with the display on the road surface makes the display easier to recognize.
- Furthermore, by performing the display that promotes passage on the road surface rather than in the air, the target can be prompted to pass without passage in front of the vehicle being obstructed.
- Modification 1 In the above-described embodiment, a configuration in which a plurality of types of vehicle exterior display devices 50 are caused to display simultaneously is shown, but the present disclosure is not necessarily limited thereto.
- For example, the single vehicle exterior display device 50 caused to display, among the plurality of types of vehicle exterior display devices 50, may be switched according to the scene identified by the scene specifying unit 105.
- For instance, the own vehicle outer surface display device 53 may be caused to display in one scene, and the display may be switched to the road surface display device 51 or the aerial display device 52 in another scene.
- The combinations of the scene identified by the scene specifying unit 105 and the vehicle exterior display devices 50 that perform display are merely examples; they are not limited to the combinations described in the above embodiment and may be other combinations.
- For example, a configuration may be adopted in which the number of types of vehicle exterior display devices 50 that perform display is increased for scenes where pedestrians with low walking ability are present, as compared with other scenes.
- As another example, a configuration may be adopted in which display by the road surface display device 51 is added for a scene where a pedestrian holding up an umbrella is present, as compared with a scene where only pedestrians not holding up umbrellas are present.
- Modification 2 In the above-described embodiment, a configuration in which the road surface display device 51, the aerial display device 52, and the own vehicle outer surface display device 53 are provided as the vehicle exterior display devices 50 is shown, but the disclosure is not necessarily limited thereto. As long as there are a plurality of types of vehicle exterior display devices 50, only two of the road surface display device 51, the aerial display device 52, and the own vehicle outer surface display device 53 may be used as the vehicle exterior display devices 50.
- Modification 3 In the first embodiment, a configuration in which display on the vehicle exterior display device 50 is not performed in some scenes is shown, but the configuration is not necessarily limited thereto; display on the vehicle exterior display device 50 may always be performed regardless of the scene. For example, simultaneous display by a plurality of types of vehicle exterior display devices 50 may always be performed regardless of the scene.
- Modification 4 In the first embodiment, an example of a vehicle having an automatic driving function is shown, but the present invention is not necessarily limited thereto. For example, it may be configured to be applied to a vehicle that does not have an automatic driving function. In this case, the scene specifying unit 105 may be configured to determine that it is not desired to start when the operation input unit that receives an operation input that gives priority to the passage of another person receives the operation input.
- In the above-described embodiment, the surrounding person state recognized by the travel environment recognition unit 101 covers people driving vehicles such as automobiles, bicycles, or motorcycles, but the disclosure is not limited thereto.
- For example, the targets to which information is presented by the vehicle exterior display device 50 may be configured to include only pedestrians.
- The first embodiment shows a configuration in which the driving support ECU 10 is responsible for the function of recognizing the travel environment of the host vehicle from the sensing results acquired from the surrounding monitoring sensor 30, the automatic driving function, and the function of executing the vehicle exterior display related processing, but the disclosure is not necessarily limited thereto.
- These functions may instead be divided among a plurality of ECUs.
- The flowchart described in this application, or the processing of the flowchart, includes a plurality of sections (or steps), and each section is expressed as, for example, S1. Each section can be divided into a plurality of subsections, while a plurality of sections can be combined into one section. Each section configured in this manner can be referred to as a device, module, or means.
Landscapes
- Engineering & Computer Science (AREA)
- Mechanical Engineering (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Traffic Control Systems (AREA)
- Lighting Device Outwards From Vehicle And Optical Signal (AREA)
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2016-155817 | 2016-08-08 | ||
| JP2016155817A JP6680136B2 (ja) | 2016-08-08 | 2016-08-08 | 車外表示処理装置及び車外表示システム |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2018029978A1 true WO2018029978A1 (ja) | 2018-02-15 |
Family
ID=61162081
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2017/022093 Ceased WO2018029978A1 (ja) | 2016-08-08 | 2017-06-15 | 車外表示処理装置及び車外表示システム |
Country Status (2)
| Country | Link |
|---|---|
| JP (1) | JP6680136B2 (en) |
| WO (1) | WO2018029978A1 (en) |
Cited By (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2020100655A1 (ja) * | 2018-11-15 | 2020-05-22 | 株式会社小糸製作所 | 車両用照明システム |
| CN111200689A (zh) * | 2018-11-19 | 2020-05-26 | 阿尔派株式会社 | 移动体用投影装置、便携终端、程序 |
| US10679530B1 (en) | 2019-02-11 | 2020-06-09 | Toyota Motor Engineering & Manufacturing North America, Inc. | Systems and methods for mobile projection in foggy conditions |
Families Citing this family (16)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP6613265B2 (ja) * | 2017-06-01 | 2019-11-27 | 本田技研工業株式会社 | 予測装置、車両、予測方法およびプログラム |
| JP6989418B2 (ja) * | 2018-03-12 | 2022-01-05 | 矢崎総業株式会社 | 車載システム |
| WO2020008560A1 (ja) * | 2018-07-04 | 2020-01-09 | 三菱電機株式会社 | 情報表示装置及び情報表示方法 |
| JP7158279B2 (ja) * | 2018-12-28 | 2022-10-21 | 株式会社小糸製作所 | 標識灯システム |
| JP7202208B2 (ja) * | 2019-02-13 | 2023-01-11 | 株式会社Subaru | 自動運転システム |
| JP7149199B2 (ja) * | 2019-02-13 | 2022-10-06 | 株式会社Subaru | 自動運転システム |
| JP2020149598A (ja) * | 2019-03-15 | 2020-09-17 | 豊田合成株式会社 | 車載警告装置 |
| JP7544701B2 (ja) * | 2019-06-28 | 2024-09-03 | 株式会社小糸製作所 | 車両用情報表示システム |
| JP7440237B2 (ja) | 2019-10-16 | 2024-02-28 | トヨタ自動車株式会社 | 表示装置 |
| JP7581254B2 (ja) * | 2020-01-23 | 2024-11-12 | 株式会社小糸製作所 | 灯具システム |
| JP7527155B2 (ja) | 2020-08-21 | 2024-08-02 | 株式会社ファルテック | スクリーングリル |
| WO2022168543A1 (ja) * | 2021-02-03 | 2022-08-11 | 株式会社小糸製作所 | 路面描画装置 |
| CN113428080A (zh) * | 2021-06-22 | 2021-09-24 | 阿波罗智联(北京)科技有限公司 | 无人车提醒行人或车辆避让的方法、装置、无人车 |
| JP7614043B2 (ja) * | 2021-07-14 | 2025-01-15 | 矢崎総業株式会社 | 車外表示装置 |
| JP2023092400A (ja) * | 2021-12-21 | 2023-07-03 | 株式会社小糸製作所 | 車両用画像表示装置、および車両用画像表示方法 |
| WO2025229754A1 (ja) * | 2024-05-01 | 2025-11-06 | Astemo株式会社 | 車両制御装置および車両制御方法 |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH05221263A (ja) * | 1992-02-17 | 1993-08-31 | Nippon Steel Corp | 車両用表示装置 |
| JP2009018711A (ja) * | 2007-07-12 | 2009-01-29 | Denso Corp | 車両用表示灯 |
| JP2014184876A (ja) * | 2013-03-25 | 2014-10-02 | Stanley Electric Co Ltd | 路面照射信号灯具 |
| JP2016055691A (ja) * | 2014-09-08 | 2016-04-21 | 株式会社小糸製作所 | 車両用表示システム |
- 2016-08-08 JP JP2016155817A patent/JP6680136B2/ja active Active
- 2017-06-15 WO PCT/JP2017/022093 patent/WO2018029978A1/ja not_active Ceased
Patent Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH05221263A (ja) * | 1992-02-17 | 1993-08-31 | Nippon Steel Corp | 車両用表示装置 |
| JP2009018711A (ja) * | 2007-07-12 | 2009-01-29 | Denso Corp | 車両用表示灯 |
| JP2014184876A (ja) * | 2013-03-25 | 2014-10-02 | Stanley Electric Co Ltd | 路面照射信号灯具 |
| JP2016055691A (ja) * | 2014-09-08 | 2016-04-21 | 株式会社小糸製作所 | 車両用表示システム |
Cited By (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2020100655A1 (ja) * | 2018-11-15 | 2020-05-22 | 株式会社小糸製作所 | 車両用照明システム |
| JPWO2020100655A1 (ja) * | 2018-11-15 | 2021-09-30 | 株式会社小糸製作所 | 車両用照明システム |
| EP3882077A4 (en) * | 2018-11-15 | 2021-12-15 | Koito Manufacturing Co., Ltd. | VEHICLE LIGHTING SYSTEM |
| JP7309751B2 (ja) | 2018-11-15 | 2023-07-18 | 株式会社小糸製作所 | 車両用照明システム |
| CN111200689A (zh) * | 2018-11-19 | 2020-05-26 | 阿尔派株式会社 | 移动体用投影装置、便携终端、程序 |
| CN111200689B (zh) * | 2018-11-19 | 2022-05-03 | 阿尔派株式会社 | 移动体用投影装置、便携终端及便携终端的显示方法 |
| US10679530B1 (en) | 2019-02-11 | 2020-06-09 | Toyota Motor Engineering & Manufacturing North America, Inc. | Systems and methods for mobile projection in foggy conditions |
Also Published As
| Publication number | Publication date |
|---|---|
| JP6680136B2 (ja) | 2020-04-15 |
| JP2018024291A (ja) | 2018-02-15 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP6680136B2 (ja) | 車外表示処理装置及び車外表示システム | |
| CN109515434B (zh) | 车辆控制装置、车辆控制方法及存储介质 | |
| JP6617534B2 (ja) | 運転支援装置 | |
| JP6515814B2 (ja) | 運転支援装置 | |
| US11900812B2 (en) | Vehicle control device | |
| CN110271544B (zh) | 车辆控制装置、车辆控制方法及存储介质 | |
| CN110036426B (zh) | 控制装置和控制方法 | |
| CN110662683A (zh) | 驾驶辅助装置以及驾驶辅助方法 | |
| JP2021175630A (ja) | 運転支援装置 | |
| US12221103B2 (en) | Vehicle control device, vehicle control method, and storage medium | |
| CN115884908A (zh) | 路径确认装置以及路径确认方法 | |
| JP2021131775A (ja) | 車両の運転支援システム | |
| CN112874513A (zh) | 驾驶支援装置 | |
| US20230399013A1 (en) | Vehicle control device and method for controlling vehicle | |
| CN113370972B (zh) | 行驶控制装置、行驶控制方法以及存储程序的计算机可读取存储介质 | |
| CN113646201A (zh) | 车辆用显示控制装置、车辆用显示控制方法、车辆用显示控制程序 | |
| CN113727895A (zh) | 车辆控制方法以及车辆控制装置 | |
| JP6471707B2 (ja) | 運転教示装置 | |
| US20190265727A1 (en) | Vehicle control device | |
| JP6894354B2 (ja) | 車両制御装置、車両制御方法、およびプログラム | |
| JP2022156078A (ja) | 交通システム | |
| CN110194155A (zh) | 车辆控制装置 | |
| JP2022060075A (ja) | 運転支援装置 | |
| JP7652164B2 (ja) | 車両用制御装置及び車両用制御方法 | |
| JP7334107B2 (ja) | 車両制御方法及び車両制御装置 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17839053 Country of ref document: EP Kind code of ref document: A1 |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 17839053 Country of ref document: EP Kind code of ref document: A1 |