CN110462704B - Driving support device - Google Patents

Driving support device

Info

Publication number
CN110462704B
CN110462704B CN201880019444.6A
Authority
CN
China
Prior art keywords
vehicle
guide mark
driving support
travel
obstacle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201880019444.6A
Other languages
Chinese (zh)
Other versions
CN110462704A (en)
Inventor
深谷直树
上坂广人
胜利平
河村真治
大泽良树
棚桥章仁
川濑望实
仙石和香
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Denso Corp
Original Assignee
Denso Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Denso Corp filed Critical Denso Corp
Priority claimed from PCT/JP2018/005627 (WO2018173581A1)
Publication of CN110462704A
Application granted
Publication of CN110462704B
Legal status: Active

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B62 LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62D MOTOR VEHICLES; TRAILERS
    • B62D15/00 Steering not otherwise provided for
    • B62D15/02 Steering position indicators; Steering position determination; Steering aids
    • B62D15/025 Active steering aids, e.g. helping the driver by actively influencing the steering system after environment evaluation
    • B62D15/0265 Automatic obstacle avoidance by steering
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B62 LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62D MOTOR VEHICLES; TRAILERS
    • B62D15/00 Steering not otherwise provided for
    • B62D15/02 Steering position indicators; Steering position determination; Steering aids
    • B62D15/029 Steering assistants using warnings or proposing actions to the driver without influencing the steering system

Landscapes

  • Engineering & Computer Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Traffic Control Systems (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)
  • Steering Control In Accordance With Driving Conditions (AREA)

Abstract

The driving support device includes: a map acquisition unit (S102) that acquires a surrounding space map (6) defining the positional relationship of objects in the surrounding space (4) of the host vehicle; a travel determination unit (S103) that determines, based on the surrounding space map, whether the host vehicle can travel in a support target scene; a guide mark generation unit (S104) that generates, based on the surrounding space map, a guide mark (7) for guiding the host vehicle, and adjusts, according to reference position information (Ib), the relative position on the surrounding space map between the guide mark and obstacles (4A, 4B) with respect to the host vehicle; a steering support unit (S105) that, when travel is determined to be possible, supports the passenger's steering of the host vehicle based on the generated guide mark; and a reference updating unit (S107, S108) that updates the reference position information based on the degree of deviation between the guide mark and the actual trajectory (8) that the host vehicle follows during the steering support.

Description

Driving support device
Cross Reference to Related Applications
This application claims priority from Japanese Patent Application No. 2017-.
Technical Field
The present disclosure relates to a driving support apparatus.
Background
As disclosed in Patent Document 1, some vehicles are equipped with a driving support device that supports the passenger's driving.
Patent document 1: japanese patent laid-open publication No. 2017-77829
In recent years, however, passengers have not been able to make satisfactory use of driving support functions because of their many restrictions. For example, even among passengers who drive daily, some lack confidence in driving their own vehicle, or have had frightening or near-miss experiences, in scenes such as the following.
In a scene of passing beside another vehicle that is waiting to turn right, or in a scene of passing an oncoming vehicle on a narrow road, the passenger may hesitate to move the host vehicle even though there is a gap wide enough to pass through or pass by. Conversely, if the passenger moves the host vehicle in such a scene when no sufficient gap exists, the host vehicle may collide with the other vehicle.
In a scene of passing beside another vehicle on a narrow road, or of passing an oncoming vehicle on a narrow road, the passenger may be unable to grasp the position of a roadside object such as a utility pole, guardrail, curb, side wall, pedestrian, bicycle, or motorcycle, or the position of a road edge such as a shoulder or sidewalk, and therefore cannot steer the host vehicle close enough to the side to pass or to let the other vehicle by. Conversely, if the passenger cannot grasp the position of a gutter along the shoulder or of a roadside object and brings the host vehicle too close to the side, the host vehicle may drop off the shoulder into the gutter or contact the roadside object.
In a scene of passing a fallen object or a parked vehicle on the road, or of passing an oncoming vehicle while avoiding such an obstacle, the passenger may fail to make the steering operation needed for avoidance because the obstacle is overlooked or the vehicle width is not accurately recognized, and the host vehicle may collide with the fallen object or the parked vehicle. Conversely, particularly in the offset scene, if the avoidance steering is excessive because the vehicle width is not accurately recognized, the maneuver may confuse the oncoming vehicle.
For the passing scene among these, the technique disclosed in Patent Document 1 sets a target steering angle that guides the host vehicle through a target passing point beside the obstacle to be passed. However, since the target passing point is a predetermined position that takes only the vehicle width of the host vehicle into consideration, the size of the space secured between the host vehicle and the obstacle may not match the passenger's feeling. Even if a region through which the host vehicle can physically pass is secured beside the obstacle, whether the space between the host vehicle and the obstacle feels large enough to actually pass or offset differs from passenger to passenger. Such a mismatch in feeling can make the passenger uneasy.
Disclosure of Invention
The present disclosure has been made in view of the above problems, and an object thereof is to provide a driving support device that ensures both the safety of the passenger and the passenger's sense of security in at least one of a passing scene and an offset scene.
A driving support device according to one aspect of the present disclosure supports driving by a passenger of a host vehicle, and includes: a map acquisition unit that acquires a surrounding space map that indicates the states of objects in the surrounding space of the host vehicle and defines the positional relationship of those objects to one another; a travel determination unit that determines, based on the surrounding space map acquired by the map acquisition unit, whether the host vehicle can travel in a support target scene that is at least one of a passing scene and an offset scene; a guide mark generation unit that generates, based on the surrounding space map acquired by the map acquisition unit, a guide mark for guiding the host vehicle in the support target scene, and adjusts, according to reference position information, the relative position on the surrounding space map between the guide mark and an obstacle with respect to the host vehicle in the support target scene; a steering support unit that, when the travel determination unit determines that the host vehicle can travel, supports the passenger's steering of the host vehicle based on the guide mark generated by the guide mark generation unit; and a reference updating unit that updates the reference position information based on the degree of deviation between the guide mark generated by the guide mark generation unit and the actual trajectory that the host vehicle follows during the steering support by the steering support unit.
According to the present disclosure, the host vehicle's traveling and steering can be supported for the passenger in the support target scene, which is at least one of the passing scene and the offset scene. Specifically, by acquiring a surrounding space map that indicates the states of objects in the surrounding space of the host vehicle and defines their positional relationship, the region of the space in which the host vehicle can travel can be identified with high accuracy. The possibility of travel in the support target scene can therefore be determined accurately from the surrounding space map. Further, by following the guide mark generated from the surrounding space map as the guide for the support target scene, the passenger's steering of the host vehicle when travel is determined to be possible can be supported accurately. Accordingly, both the safety of the passenger and the passenger's sense of security can be ensured in the passing scene and/or the offset scene serving as the support target scene.
Further, according to the present disclosure, the relative position on the surrounding space map between the obstacle and the guide mark with respect to the host vehicle in the support target scene is adjusted according to predetermined reference position information. Because this reference position information is updated based on the degree of deviation of the actual trajectory that the host vehicle follows during the steering support from the guide mark, the size of the space secured between the host vehicle and the obstacle by the guide mark can be brought close to the size under the actual trajectory, which reflects the passenger's feeling. The sense of security given to the passenger can therefore be enhanced in the passing scene and/or the offset scene serving as the support target scene.
Drawings
The above object and other objects, features, and advantages of the present disclosure will become more apparent from the following detailed description with reference to the accompanying drawings. In the drawings:
FIG. 1 is a block diagram showing a driving support device according to a first embodiment of the present disclosure,
FIG. 2 is a schematic diagram for explaining the surrounding environment recognition sensor of the driving support device according to the first embodiment,
FIG. 3 is a schematic diagram for explaining the surrounding space and the surrounding space map of the first embodiment,
FIG. 4 is a schematic diagram for explaining the surrounding space and the surrounding space map of the first embodiment,
FIG. 5 is a schematic diagram for explaining the surrounding space and the surrounding space map of the first embodiment,
FIG. 6 is a schematic diagram for explaining the operation of the driving support device of the first embodiment,
FIG. 7 is a schematic diagram for explaining the operation of the driving support device of the first embodiment,
FIG. 8 is a schematic diagram for explaining the operation of the driving support device of the first embodiment,
FIG. 9 is a schematic diagram for explaining the operation of the driving support device of the first embodiment,
FIG. 10 is a flowchart showing the driving support flow of the driving support device according to the first embodiment,
FIG. 11 is a schematic diagram for explaining the operation of the driving support device of the first embodiment,
FIG. 12 is a schematic diagram for explaining the operation of the driving support device of the first embodiment,
FIG. 13 is a schematic diagram for explaining the operation of the driving support device of the first embodiment,
FIG. 14 is a schematic diagram for explaining the operation of the driving support device of the first embodiment,
FIG. 15 is a schematic diagram for explaining the operation of the driving support device according to a second embodiment of the present disclosure,
FIG. 16 is a schematic diagram for explaining the operation of the driving support device according to the second embodiment,
FIG. 17 is a flowchart showing the driving support flow of the driving support device according to the second embodiment,
FIG. 18 is a schematic diagram for explaining the operation of the driving support device according to the second embodiment,
FIG. 19 is a schematic diagram for explaining the operation of the driving support device according to a third embodiment of the present disclosure,
FIG. 20 is a flowchart showing the driving support flow of the driving support device according to the third embodiment,
FIG. 21 is a schematic diagram for explaining the operation of the driving support device according to a fourth embodiment of the present disclosure,
FIG. 22 is a flowchart showing the driving support flow of the driving support device according to the fourth embodiment,
FIG. 23 is a schematic diagram of the surrounding environment recognition sensor for explaining a modification of the present disclosure,
FIG. 24 is a schematic diagram of the surrounding environment recognition sensor for explaining a modification,
FIG. 25 is a schematic diagram of the surrounding environment recognition sensor for explaining a modification.
Detailed Description
Hereinafter, a plurality of embodiments of the present disclosure will be described with reference to the drawings. Redundant description of corresponding components in the embodiments may be omitted. Where only part of a configuration is described in an embodiment, the previously described configuration of another embodiment can be applied to the remainder of that configuration. Beyond the combinations explicitly stated in the description of the embodiments, configurations of the embodiments may also be partially combined even if not stated, as long as the combination causes no particular problem. Unstated combinations of the configurations described in the embodiments and the modifications are likewise regarded as disclosed by the following description.
(first embodiment)
In a first embodiment of the present disclosure, the driving support device 1 shown in fig. 1 is applied to a vehicle 2 to support the passenger's driving. Hereinafter, the vehicle 2 to which the driving support device 1 is applied is referred to as the host vehicle 2.
The host vehicle 2 is equipped with a surrounding environment recognition sensor 3 so that it can recognize its surrounding environment. As shown in figs. 2 to 5, the surrounding environment recognition sensor 3 detects the states of objects present in the surrounding space 4 of the host vehicle 2 within a detection range determined by the view angle θ. Here, the object state detected by the surrounding environment recognition sensor 3 means, for example, at least one of the distance, the azimuth, the position given by distance and azimuth, and the size of an object in the surrounding space 4. Accordingly, at least one of a LIDAR (also called a laser radar), a camera (for example, a stereo camera), a radio-wave radar (for example, a millimeter-wave radar), and the like may be mounted on the host vehicle 2 as the surrounding environment recognition sensor 3.
As shown in fig. 1, the driving support device 1 mounted on the host vehicle 2 is configured of at least one ECU built mainly around a microcomputer. The driving support device 1 combines the detection information of the surrounding environment recognition sensor 3 with vehicle-related information, such as vehicle speed and steering angle, carried on an in-vehicle network 5 such as CAN (registered trademark), to acquire the surrounding space map 6 representing the states of objects in the surrounding space 4, as shown on the right side of figs. 3 to 5. That is, the surrounding space map 6 is acquired as two- or three-dimensional mapping data representing at least one of the distance, azimuth, position, and size of each object present in the surrounding space 4. The surrounding space map 6 thus defines the positional relationship among objects with the host vehicle 2 at its center (i.e., at the origin, 0 m and 0°, of the surrounding space maps of figs. 3 to 5). In particular, if the mounting position of the surrounding environment recognition sensor 3 on the host vehicle 2 is known in advance, the trajectory that the outermost edge of the host vehicle 2 will trace as it moves can be predicted on the surrounding space map 6, so the relative positional relationship with surrounding objects can be defined accurately. Based on the surrounding space map 6, the driving support device 1 can therefore accurately distinguish the region of the surrounding space 4 in which the host vehicle 2 can travel from the region in which it cannot.
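As a rough illustration of how such a vehicle-centered map might be assembled, the following sketch rasterizes range/azimuth detections into a simple occupancy grid with the host vehicle at the origin. All names, the grid dimensions, and the detection format are illustrative assumptions, not the implementation of this disclosure:

```python
import numpy as np

def build_surrounding_space_map(detections, size_m=40.0, cell_m=0.2):
    """Rasterize (distance_m, azimuth_deg) detections from the
    surrounding environment recognition sensor into a 2D grid whose
    center cell is the host vehicle (0 m, 0 deg)."""
    n = int(size_m / cell_m)
    grid = np.zeros((n, n))          # 0.0 = presumed travelable
    cx = cy = n // 2                 # host vehicle at the map origin
    for dist, azi in detections:
        x = dist * np.sin(np.radians(azi))   # lateral offset (m)
        y = dist * np.cos(np.radians(azi))   # longitudinal offset (m)
        i, j = int(cy + y / cell_m), int(cx + x / cell_m)
        if 0 <= i < n and 0 <= j < n:
            grid[i, j] = 1.0         # high existence probability: untravelable
    return grid

# Example: reflections front-left and front-right of the host vehicle
grid = build_surrounding_space_map([(6.0, -20.0), (7.5, 15.0)])
```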
In the surrounding space map of fig. 3, regions where the host vehicle 2 can travel because the existence probability of an obstructing object is low are shown in white, while regions where it cannot travel because that probability is high are shown in gray to black. The surrounding space maps of figs. 4 and 5 are drawn in the same manner as fig. 3, except that untravelable regions other than the positions of the obstacles (4A and 4B, described later) are omitted.
As shown in figs. 4 to 9, when obstacles 4A and 4B exist on the front-left and front-right of the surrounding space 4 of the host vehicle 2 and the interval D between them (see figs. 8 and 9) is relatively narrow, the passenger may be unsure whether the host vehicle 2 can pass through or offset. Even when an interval D sufficient for passing or offsetting is secured, the passenger may still be unsure how to steer the host vehicle 2 safely. For example, the passenger's judgment becomes difficult in a scene of passing beside another vehicle waiting to turn right, as in figs. 4 and 6, or in a scene of passing through, or offsetting on, a narrow road, as in figs. 5 and 7 to 9. Here, the interval D means not only the direct separation between the obstacles 4A and 4B in the vehicle-width direction of the host vehicle 2 (i.e., the left-right or lateral direction), but also the separation in the vehicle-width direction between virtual lines extending from the obstacles 4A and 4B in the vehicle-length direction (i.e., the front-rear direction) of the host vehicle 2.
In view of these concerns, the driving support device 1 determines, through a driving support flow using the surrounding space map 6, whether travel is possible based on whether an interval D allowing passing or offsetting can be secured given the relative positional relationship between the host vehicle 2 and the obstacles 4A and 4B, and thereby realizes safe passing or offsetting driving support. Here, whether travel is possible means whether the host vehicle 2 can pass through or offset without contacting or colliding with the obstacles 4A and 4B to be passed or offset.
Specifically, the driving support device 1 shown in fig. 1 functionally realizes the driving support flow shown in fig. 10 by the processor 1b executing a computer program stored in the memory 1a. The driving support flow is started by turning on a power switch provided in the host vehicle 2 and ended by turning the switch off. "S" in the driving support flow denotes each step.
In S101, it is determined whether the driving scene of the host vehicle 2 is a support target scene requiring driving support. In the first embodiment, the support target scenes are set in advance as the passing scene and the offset scene between the obstacles 4A and 4B described above. Accordingly, in S101, a driving scene in which the vehicle speed of the host vehicle 2 is low (for example, 10 km/h or less) and obstacles 4A and 4B expected to be passed or offset at an interval D are recognized, by detection or the like, on the left and right of the surrounding space 4 is determined to be a support target scene. Each of the obstacles 4A and 4B can be recognized as at least one of a roadside object such as a utility pole, guardrail, curb, side wall, another vehicle (including an oncoming vehicle in the offset scene), pedestrian, bicycle, or motorcycle, and a road edge such as a shoulder or sidewalk (including an edge with a gutter). While at least one of the obstacles 4A and 4B is not recognized, the determination in S101 is negative and S101 is repeated. When both obstacles 4A and 4B are recognized and the determination in S101 is affirmative, the flow proceeds to S102.
In S102, the surrounding space map 6 is acquired based on the detection information of the surrounding environment recognition sensor 3 and the vehicle-related information on the in-vehicle network 5, and is stored in the memory 1a. The surrounding space map 6 can be acquired from the instantaneous information at each processing timing, but it is preferable to acquire it from time-series data in which the instantaneous information is accumulated over time. Since the surrounding space 4 contains moving objects as well as stationary ones, recognition accuracy, including detection accuracy for such moving objects, is higher with time-series data than with instantaneous information. With time-series data, the detection information of the surrounding environment recognition sensor 3 is corrected using the vehicle-related information and accumulated at each processing timing, so that the same object (in the present embodiment, the same obstacle) can be identified whether it is stationary or moving. The surrounding space map 6 is therefore successively updated to reflect the identification of the same object, ensuring temporal and spatial continuity.
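A minimal sketch of this time-series accumulation follows; the motion inputs dx, dy, and dyaw stand in for the vehicle-related information on the in-vehicle network, and the bare point-list representation is an illustrative simplification (a real system would also carry timestamps and existence probabilities so the same object can be identified across frames):

```python
import math

def integrate_frame(accumulated, frame_points, dx, dy, dyaw):
    """Correct previously accumulated points for the host vehicle's
    motion since the last frame (dx, dy in meters, dyaw in radians),
    then merge the new sensor frame, all in the vehicle-centered frame."""
    c, s = math.cos(-dyaw), math.sin(-dyaw)
    corrected = []
    for x, y in accumulated:
        xs, ys = x - dx, y - dy                               # undo translation
        corrected.append((xs * c - ys * s, xs * s + ys * c))  # undo rotation
    return corrected + list(frame_points)
```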
In S103 following S102, it is determined whether the host vehicle 2 can travel in the passing scene or the offset scene determined in S101 to be the support target scene. Travel is determined to be possible when it is predicted that the host vehicle 2 will contact or collide with neither obstacle 4A nor 4B and that an interval D (see figs. 8 and 9) equal to or greater than a threshold Dth allowing the host vehicle 2 to pass or offset is secured. Conversely, when the predicted secured interval D is less than the threshold Dth, travel is determined to be impossible. This determination is made based on at least one of the surrounding space map 6 acquired in S102 and stored in the memory 1a, the detection information of the surrounding environment recognition sensor 3, and the vehicle-related information on the in-vehicle network 5. In the first embodiment, the threshold Dth serving as the determination criterion in S103 is, for example, a fixed value stored in advance in the memory 1a at shipment or the like.
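Reduced to its core, the determination compares the secured interval D with the threshold Dth; in this sketch the default Dth of vehicle width plus a margin merely stands in for the fixed value stored at shipment:

```python
def can_travel(gap_d_m, dth_m=None, vehicle_width_m=1.8, margin_m=0.5):
    """S103 in miniature: travel is possible only if the interval D
    between obstacles 4A and 4B is at least the threshold Dth."""
    dth = dth_m if dth_m is not None else vehicle_width_m + margin_m
    return gap_d_m >= dth

assert can_travel(2.6) and not can_travel(2.0)
```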
When travel is determined to be possible in S103, S104 and S105 are executed in this order. First, in S104, the guide mark 7 for guiding the host vehicle 2 in the passing scene or the offset scene is generated based on the surrounding space map 6, as in the surrounding space maps of figs. 4 and 5. The guide mark 7 is generated so as to indicate a planned trajectory along which the host vehicle 2 can travel while the interval D (see figs. 8 and 9) is secured, over the range requiring steering support on the surrounding space map 6 acquired in S102 and stored in the memory 1a. Next, in S105, the passenger's steering of the host vehicle 2 in the passing scene or the offset scene is supported based on the guide mark 7 generated in S104. As the method of presenting (i.e., outputting) the guide mark 7 to the passenger, any of the following three methods is adopted.
In the first presentation method, as shown in figs. 6 to 9, the passenger's steering of the host vehicle 2 is supported directly by an electronic rut 70 that acts like the wheel ruts followed when driving on a snowy road. In this method, the electronic rut 70 is set along the guide mark 7, the path that the host vehicle 2 should follow, and a steering-wheel reaction force F is applied as shown in figs. 8 and 9. The reaction force F is applied so as to increase as the host vehicle approaches the obstacle 4A or 4B in the surrounding space 4, with the rate of increase fixed to be either equal or different on the left and right sides. When the passenger makes a steering operation that would leave the guide mark 7, the reaction force F felt through the steering wheel thus helps keep the passenger out of a dangerous situation.
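One plausible reading of this reaction-force scheme, with gains and sign conventions chosen here purely for illustration, is a force that grows inversely with the distance to each obstacle:

```python
def steering_reaction_force(dist_left_m, dist_right_m, gain=2.0, min_d=0.3):
    """Reaction force F that grows as the host vehicle nears either
    obstacle, like the resistance of wheel ruts on a snowy road.
    Positive output pushes the wheel right, away from obstacle 4A."""
    f_left = gain / max(dist_left_m, min_d)     # resists drifting left
    f_right = gain / max(dist_right_m, min_d)   # resists drifting right
    return f_left - f_right

print(steering_reaction_force(1.0, 1.0))  # centered: forces cancel (0.0)
print(steering_reaction_force(0.4, 1.6))  # drifting left: pushed back right
```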
In the second presentation method, as shown in fig. 11, the guide mark 7 and the position of the host vehicle 2 are displayed as an image on a display device 71 such as a meter, and the passenger's steering is supported indirectly so that the host vehicle 2 follows the guide mark 7. In this method, an image 7a representing the guide mark 7, the path that the host vehicle 2 should follow, is displayed together with an image 2a representing the host vehicle 2 and images 40A and 40B representing the obstacles 4A and 4B at their relative positions. The passenger can thus steer the host vehicle 2 so as to trace the guide mark 7 and avoid falling into a dangerous situation.
In the third presentation method, a lamp 73 arranged in a meter 72 as a real image, as shown in fig. 12, or displayed as a virtual image in a display area 74 of a head-up display (HUD), as shown in fig. 13, serves as the guide mark 7. Specifically, the direction in which the steering wheel should be turned is indicated by the lighting or blinking pattern of the lamp 73, and the passenger's steering is supported indirectly so that the host vehicle 2 follows that pattern. By steering in accordance with the lighting or blinking pattern of the lamp 73, the passenger makes the host vehicle 2 travel along the guide mark 7, the path it should follow, and can thus be supported without falling into a dangerous situation.
The method of supporting the passenger's steering based on the guide mark 7 in S105 is not limited to the three methods above; it may also be realized by, for example, audio output, or by combining at least two of these methods. S105 is executed continuously until the steering support is completed over the range for which the guide mark 7 was generated (i.e., the range requiring steering support).
After S105 realizing the steering support is executed, S106 follows, as shown in fig. 10. In S106, it is determined whether the support target scene has ended. If the support target scene continues, the determination in S106 is negative and the flow returns to S101. When the support target scene has ended, the determination is affirmative and the flow proceeds to S107.
In S107, the degree of deviation from the guide mark 7 presented in the steering support of S105 is checked for the actual trajectory 8 that the host vehicle 2 actually traveled during that support, as shown in fig. 14. When, for example, the maximum or average deviation width of the actual trajectory 8 from the guide mark 7 is within an upper allowable width, or the ratio of that deviation width to the vehicle width is within an upper allowable ratio, the actual trajectory 8 is judged to match the guide mark 7, that is, the degree of deviation is judged to be substantially zero, and the flow returns to S101 as shown in fig. 10. When, on the other hand, the maximum or average deviation width exceeds the upper allowable width, or the ratio of the deviation width to the vehicle width exceeds the upper allowable ratio, the actual trajectory 8 is judged to deviate from the guide mark 7, and the flow proceeds to S108.
In S108, the reference position information Ib is updated based on the degree of deviation between the guide mark 7 and the actual trajectory 8 confirmed in S107. Here, the reference position information Ib specifies the relative position of the guide mark 7 on the surrounding space map 6 with respect to each of the obstacles 4A and 4B in the passing scene or the offset scene. The reference position information Ib is stored in the memory 1a in a predetermined data format, and after being updated in the current S108, it is read from the memory 1a when the guide mark 7 is next generated in S104. As a result, the relative position of the guide mark 7 with respect to the obstacles 4A and 4B adjusted on the surrounding space map at the next execution of S104 is the position based on the updated reference position information Ib; that is, in the next S104, the guide mark 7 is generated according to the updated reference position information Ib. Accordingly, in the current S108, a relative position that reduces or substantially eliminates the deviation of the actual trajectory 8 from the guide mark 7 confirmed in S107 is learned, and the reference position information Ib is updated so as to define the learned relative position. After S108 is executed, the flow returns to S101.
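The update can be pictured as measuring the signed lateral deviation between the two trajectories and blending it into the stored offset; sampling both trajectories into matched (x, y) points and the blending rate below are assumptions made for illustration:

```python
def update_reference_position(ib_offset_m, guide_xy, actual_xy,
                              allow_width_m=0.1, rate=0.5):
    """S107/S108 in miniature: if the mean lateral deviation of the
    actual trajectory 8 from the guide mark 7 exceeds the allowable
    width, learn part of it into the reference offset Ib
    (positive = shift the next guide mark rightward)."""
    devs = [ax - gx for (gx, _), (ax, _) in zip(guide_xy, actual_xy)]
    mean_dev = sum(devs) / len(devs)
    if abs(mean_dev) <= allow_width_m:
        return ib_offset_m                 # trajectories match: no update
    return ib_offset_m + rate * mean_dev   # move toward the driver's line
```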
The above describes S104 to S108, which present a travel-possible situation to the passenger when travel is determined to be possible in S103. Next, S109, executed when travel is determined to be impossible in S103, will be described.
In S109, generation of the guide mark 7 is stopped and/or a stop instruction is issued to the host vehicle 2, thereby presenting to the passenger that the host vehicle 2 cannot travel so that it does not proceed further. This also supports the passenger in avoiding a dangerous situation. After S109 is executed, the flow returns to S101.
In this way, in the first embodiment, the functional portion of the driving support device 1 that executes S102 corresponds to the "map acquisition unit", the portion that executes S103 to the "travel determination unit", and the portion that executes S104 to the "guide mark generation unit". Likewise, the portion that executes S105 corresponds to the "steering support unit", the portion that executes S107 and S108 to the "reference updating unit", and the portion that executes S109 to the "stop unit".
According to the first embodiment described above, the host vehicle 2's traveling and steering can be supported for the passenger in each of the passing scene and the offset scene serving as support target scenes. Specifically, by acquiring the surrounding space map 6, which indicates the states of objects in the surrounding space 4 of the host vehicle 2 and defines their positional relationship, the region of the space 4 in which the host vehicle 2 can travel can be recognized accurately. The possibility of travel of the host vehicle 2 in the support target scene can therefore be determined accurately from the surrounding space map 6. Further, by following the guide mark 7 generated from the surrounding space map 6 as the guide for the support target scene, the passenger's steering of the host vehicle 2 when travel is determined to be possible can be supported accurately.
In particular, the guide mark 7 presents the path that the host vehicle 2 should follow, and can thus support the judgment and operation of a passenger who lacks confidence in driving or has had frightening or near-miss experiences. This dispels the worry of failing to avoid an accident through carelessness and gives such a passenger reassurance. According to the first embodiment, therefore, both the safety of the passenger and the passenger's sense of security can be ensured in the passing scene and the offset scene serving as support target scenes.
Further, according to the first embodiment, the relative positions on the surrounding space map 6 of the obstacles 4A and 4B and the guide mark 7 with respect to the host vehicle 2 in the support target scene are adjusted according to the predetermined reference position information Ib. By updating the reference position information Ib based on the degree of deviation of the actual trajectory 8 that the host vehicle 2 follows during steering support from the guide mark 7, the sizes of the spaces 9A and 9B secured between the host vehicle 2 and the obstacles 4A and 4B by the guide mark 7, as shown in figs. 8 and 9, can be brought close to the sizes under the actual trajectory 8, which reflects the passenger's feeling. The sense of security given to the passenger in the passing scene and the offset scene serving as support target scenes can therefore be enhanced.
(second embodiment)
The second embodiment is a modification of the first embodiment.
As shown in fig. 15, when a fallen object is present on the road as the obstacle 4A, the passenger may be unsure whether the host vehicle 2 can pass while avoiding the fallen object, or whether it can offset, while avoiding the fallen object, from an oncoming vehicle (not shown) serving as the obstacle 4B. Likewise, as shown in fig. 16, when a parked vehicle is present on the road as the obstacle 4A, the passenger may be unsure whether the host vehicle 2 can pass while avoiding the parked vehicle, or whether it can offset, while avoiding the parked vehicle, from an oncoming vehicle (not shown) serving as the obstacle 4B.
Therefore, as shown in fig. 17, in S2101, which replaces S101 in the driving support flow of the second embodiment, the support target scenes are set in advance as the passing scene and the offset scene that avoid the fallen object or parked vehicle described above. Accordingly, in S2101, a driving scene in which the vehicle speed of the host vehicle 2 is low (for example, 10 km/h or less) and obstacles 4A and 4B expected to be passed or offset are recognized, by detection or the like, in the surrounding space 4, the obstacle 4A in particular being a fallen object or a parked vehicle, is determined to be a support target scene. The obstacle 4B can be recognized as at least one of a roadside object such as a utility pole, guardrail, curb, side wall, another vehicle (including an oncoming vehicle in the offset scene), pedestrian, bicycle, or motorcycle, and a road edge such as a shoulder or sidewalk.
After S2101 is executed, in S2104 and S2105, which replace S104 and S105 of the driving support flow, generation of the guide mark 7 and steering support are executed in sequence, as shown in figs. 15 and 16, for a passing or offset scene in which a fallen object or a parked vehicle on the road must be avoided. Generating the guide mark 7 and supporting the steering based on the surrounding space map 6 enables the passing or offsetting to be completed safely while avoiding the fallen object or the parked vehicle. In S2105, by using at least one of the electronic rut 70 and the displays of S105 as the method of presenting the guide mark 7, the passenger can likewise be supported without falling into a dangerous situation.
On the other hand, in S2109, which replaces S109 in the driving support flow and is executed after S2101, when it cannot be predicted that an interval D equal to or greater than the threshold Dth allowing passing or offsetting while avoiding the fallen object or the parked vehicle will be secured, the passenger is informed that the host vehicle cannot travel. At least one of the host-vehicle stop instruction and the stop of guide mark generation is then executed, so that the passenger can be supported without falling into a dangerous situation. Except as described above, the driving support flow of the second embodiment is substantially the same as that of the first embodiment.
In this way, in the second embodiment, the functional portion of the driving support device 1 that executes S2104 corresponds to the "guide mark generation unit", the portion that executes S2105 to the "steering support unit", and the portion that executes S2109 to the "stop unit".
According to the second embodiment described above, the same operational effects as in the first embodiment can be obtained also in the passing scene and the offset scene in which a fallen object or a parked vehicle is avoided.
(third embodiment)
The third embodiment is a modification of the first embodiment.
As shown in figs. 18 and 19, the size that the passenger feels is necessary for the spaces 9A and 9B left open between the host vehicle 2 and the obstacles 4A and 4B may also vary with the attribute characterizing the obstacles 4A and 4B (hereinafter simply the obstacle attribute) and with the environmental state outside the host vehicle 2 (hereinafter simply the external environment state). Here, the attribute means at least one of the type of the object (here, the obstacles 4A and 4B), such as pedestrian, bicycle, motorcycle, ordinary automobile, or large vehicle, and the motion state of the object, such as being stationary or its relative speed with respect to the host vehicle 2. Pedestrians as a type may be further distinguished by age or height, for example into children, the elderly, and young adults. The external environment state is at least one of the weather, such as sunny, cloudy, rainy, or snowy, and the time period, such as daytime or nighttime.
Specifically, as shown in figs. 18 and 19, when the obstacle 4A or 4B is a large vehicle such as a truck, many passengers tend to want the space 9A or 9B wider than when it is a small stationary object. Likewise, when the obstacle 4A or 4B is a pedestrian who is a child or an elderly person, many passengers tend to want the space 9A or 9B wider than when it is a small stationary object. Many passengers also tend to want wider spaces in rain or snow than in fine weather, and at night than in the daytime. These tendencies, however, do not hold for every passenger in every situation, nor is the same width necessarily required by all passengers.
Therefore, as shown in fig. 20, in S3108, which replaces S108 of the driving support flow in the third embodiment, the reference position information Ib is updated to information corresponding to the degree of deviation between the guide mark 7 and the actual trajectory 8, in association with the obstacle attribute and the external environment state of the passing scene or offset scene in which the steering support of the most recent S3105 (described in detail later) was performed. Here, the obstacle attribute correlated with the reference position information Ib is identified based on the surrounding space map 6 stored in the memory 1a. The external environment state correlated with the reference position information Ib is identified based on, for example, at least one of communication information with the outside, clock information, wiper on/off information, and illuminance detection results. The obstacle attributes and external environment states are defined in advance, and default reference position information Ib is correlated with each pair of obstacle attribute and external environment state and stored in the memory 1a in the initial state, such as at shipment.
In S3102, which replaces S102 in the driving support flow of the third embodiment, the attributes of objects including the obstacles 4A and 4B are added to the surrounding space map 6 for use in S3108 described above. The attribute of each object is associated with the object state constituting the surrounding space map 6 (i.e., the distance, azimuth, position, size, and the like exemplified in the first embodiment) and stored in the memory 1a.
In S3104, which replaces S104 of the driving support flow in the third embodiment, the reference position information Ib corresponding to the obstacle attribute and the external environment state of the passing scene or offset scene at the time of execution is read from the memory 1a. Here, the obstacle attribute corresponding to the read reference position information Ib is identified based on the surrounding space map 6 stored in the memory 1a, and the external environment state is identified based on, for example, at least one of communication information with the outside, clock information, wiper on/off information, and illuminance detection results. In this way, the relative position with respect to the obstacles 4A and 4B of the guide mark 7 generated in S3104 is adjusted according to the reference position information Ib updated in association with the obstacle attribute and the external environment state in a past S3108; that is, in S3104, the guide mark 7 is generated according to the updated reference position information Ib.
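The correlation of Ib with attribute/environment pairs amounts to a keyed lookup with learned overrides; the keys, clearance values, and default below are purely illustrative:

```python
# Reference information keyed by (obstacle attribute, external
# environment state); entries are illustrative lateral clearances in m.
reference_ib = {
    ("large_vehicle", "night"): 1.2,
    ("large_vehicle", "day"):   1.0,
    ("pedestrian",    "rain"):  1.5,
    ("pedestrian",    "day"):   1.1,
}

def read_reference_ib(attribute, environment, default_m=0.8):
    """S3104 in miniature: read the Ib learned for the current scene,
    falling back to a default for pairs not yet updated."""
    return reference_ib.get((attribute, environment), default_m)
```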
Then, in S3105, which replaces S105 of the driving support flow in the third embodiment, the reaction force F applied by the electronic rut 70 against steering operations that deviate from the guide mark 7 is adjusted according to the obstacle attribute of the passing scene or offset scene at the time of execution. In particular, in S3105 of the third embodiment, the strengths of the left and right reaction forces F that secure the spaces 9A and 9B between the host vehicle 2 and the obstacles 4A and 4B by following the guide mark 7 are variably weighted. Specifically, of the left and right sides of the host vehicle 2, the side on which the space is widened based on the reference position information Ib corresponding to the obstacle attribute, as shown in fig. 18, is weighted so that its reaction force F becomes larger than on the side where the space is narrowed, as shown in fig. 19. Since it is harder for the passenger to turn the steering wheel toward the side with the stronger reaction force F, a wide space can be secured on that side by the steering support of S3105, as shown in fig. 18. On the side where the space is narrowed according to the reference position information Ib, the reaction force F is weaker, as shown in fig. 19, but a reaction force F sufficient to limit the steering range is still applied so as to avoid contact or collision with the obstacle (4A in the figure). Except as described above, the driving support flow of the third embodiment is substantially the same as that of the first embodiment.
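The left/right weighting might be sketched as follows; the proportional rule and the floor value are our own illustrative assumptions, not the method of this disclosure:

```python
def reaction_force_weights(clearance_left_m, clearance_right_m, floor=0.2):
    """S3105 in miniature: weight the reaction force more heavily on
    the side whose Ib-derived clearance is larger (e.g., a truck on
    the left), while a floor keeps a nonzero force on the narrow side
    so the steering range near that obstacle stays limited."""
    total = clearance_left_m + clearance_right_m
    w_left = max(clearance_left_m / total, floor)
    w_right = max(clearance_right_m / total, floor)
    return w_left, w_right
```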
In this way, in the third embodiment, the functional portion of the driving support device 1 that executes S3102 corresponds to the "map acquisition unit", and the portion that executes S3104 to the "guide mark generation unit". Likewise, the portion that executes S3105 corresponds to the "steering support unit", and the portion that executes S107 and S3108 to the "reference updating unit".
According to the third embodiment described above, the relative positions of the obstacles 4A and 4B and the guide mark 7 in the support target scene are adjusted according to the updated reference position information Ib correlated with the obstacle attribute and the external environment state. The sizes of the spaces 9A and 9B secured between the host vehicle 2 and the obstacles 4A and 4B by the guide mark 7 can thereby be brought close to the sizes under the actual trajectory 8, which reflects the passenger's feeling as it depends on the obstacle attribute and the external environment state. Steering support that gives the passenger a high sense of security in the passing scene and the offset scene serving as support target scenes can therefore be provided.
(fourth embodiment)
The fourth embodiment is a modification of the third embodiment.
As shown in fig. 21, even when an interval D of at least the threshold Dth, through which the host vehicle 2 can physically pass, is secured between the obstacles 4A and 4B, whether the passing or offsetting actually feels acceptable differs from passenger to passenger, and the passing or offsetting may be abandoned despite the steering support.
Therefore, as shown in fig. 22, in S4110 following S3105 of the driving support flow of the fourth embodiment, it is determined whether the passing or offsetting was abandoned despite the steering support of S3105. If the start or completion of the passing or offsetting is confirmed within a set time from the start of the steering support in the most recent S3105, the determination is negative and the flow proceeds to S106, as in the first embodiment. If the start or completion is not confirmed within the set time, the determination is affirmative and the flow proceeds to S4111 of the fourth embodiment.
In S4111, the threshold Dth serving as the determination criterion in the subsequent S4103 (described later) is updated in association with the obstacle attribute and the external environment state of the passing scene or offset scene in which the steering support of the most recent S3105 was performed. In S4111 of the fourth embodiment, the threshold Dth is set larger than the interval D, shown in fig. 21, offered by the guide mark 7 (i.e., the planned trajectory) of the most recent S3104. The threshold Dth in the memory 1a is thus learned and updated toward the side on which a travel-possible determination is less likely to be made in the next S4103. Here, the obstacle attribute correlated with the threshold Dth is identified based on the surrounding space map 6 stored in the memory 1a, and the external environment state correlated with the threshold Dth is identified based on, for example, at least one of communication information with the outside, clock information, wiper on/off information, and illuminance detection results. In the fourth embodiment, as in the third embodiment, the obstacle attributes and external environment states are defined in advance, and a default value of the threshold Dth is correlated with each pair of obstacle attribute and external environment state and stored in the memory 1a in the initial state, such as at shipment. After S4111 is executed, the flow returns to S101.
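A sketch of this learning update, under our assumption that Dth is simply raised a fixed step above the gap offered by the rejected guide mark:

```python
def update_dth(dth_by_scene, attribute, environment, offered_gap_m,
               step_m=0.1):
    """S4111 in miniature: after the passenger declines to pass or
    offset, raise Dth for this (attribute, environment) pair above the
    interval D the rejected guide mark offered, making the next
    travel-possible determination in S4103 more conservative."""
    key = (attribute, environment)
    current = dth_by_scene.get(key, offered_gap_m)
    dth_by_scene[key] = max(current, offered_gap_m + step_m)
    return dth_by_scene[key]
```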
In S4103, which replaces S103 of the driving support flow in the fourth embodiment, the threshold Dth corresponding to the obstacle attribute and the external environment state of the passing scene or offset scene at the time of execution is read from the memory 1a. Here, the obstacle attribute corresponding to the read threshold Dth is identified based on the surrounding space map 6 stored in the memory 1a, and the external environment state is identified based on at least one of communication information with the outside, clock information, wiper on/off information, and illuminance detection results. The possibility of travel of the host vehicle 2 in S4103 is thus determined according to the threshold Dth updated in association with the obstacle attribute and the external environment state in a past S4111. Except as described above, the driving support flow of the fourth embodiment is substantially the same as that of the third embodiment.
In this way, in the fourth embodiment, the functional portion of the driving support device 1 that executes S4103 corresponds to the "travel determination unit", and the portion that executes S107, S3108, S4110, and S4111 to the "reference updating unit".
According to the fourth embodiment described above, when the passing or offsetting is abandoned despite the steering support, the threshold Dth serving as the criterion for determining whether travel is possible in the support target scene is updated toward the side on which a travel-possible determination is less likely to be made. The threshold Dth serving as the travel-possible criterion for passing or offsetting can thereby be brought close to the passenger's feeling. The sense of security given to the passenger in the passing scene and the offset scene serving as support target scenes can therefore be further enhanced.
Moreover, when the passing or offsetting is abandoned despite the steering support, the threshold Dth of the fourth embodiment is updated in association with the obstacle attribute and the external environment state. The threshold Dth serving as the travel-possible criterion can thereby be brought even closer to the passenger's feeling as it depends on the obstacle attribute and the external environment state. Steering support that gives the passenger a particularly high sense of security in the passing scene and the offset scene serving as support target scenes can therefore be provided.
(other embodiments)
While the embodiments have been described above, the present disclosure is not to be construed as limited to those embodiments, and can be applied to various embodiments and combinations without departing from its scope.
In modification 1, the support target scenes of S2101 of the driving support flow may be added to the support target scenes of S101 (first, third, and fourth embodiments) or may replace them (third and fourth embodiments). In modification 2, one of the passing scene and the offset scene may be excluded from the support target scenes determined in S101 and S2101.
In the driving support flow of modification 3, one of the obstacle attribute and the external environment state may be excluded from the items correlated with the reference position information Ib in S3108. In that case, in S3104, the reference position information Ib corresponding to the other of the obstacle attribute and the external environment state is read from the memory 1a and used to adjust the relative position between the obstacles 4A and 4B and the guide mark 7 on the surrounding space map 6.
In S3105 of the driving support flow of modification 4, the left and right reaction forces F may be weighted so as to increase the reaction force F on the side where the space defined by reference position information Ib fixed in advance in association with the obstacle attribute is wider. In modification 5, when the electronic rut 70 is used in S105 and S2105 of the driving support flow, the reaction force F may be weighted as in S3105; in this case as well, the left and right reaction forces F may be weighted so as to increase the reaction force F on the side where the space defined by the fixed reference position information Ib is wider. As a concrete example of modifications 4 and 5, a weight that increases the reaction force F may be applied for a specific obstacle, such as a pedestrian or another moving object, so as to secure a wide space 9A or 9B on that side.
In S3105 of the driving support flow of modification 6, the electronic rut 70 that applies the reaction force F with a fixed rate of increase, as in S105 and S2105, may be used as the method of presenting the guide mark 7; in that case, at least one of the displays of S105 and S2105 may also serve as the presentation method in S3105. In S3105 of the driving support flow of modification 7, at least one of the displays of S105 and S2105 may be used as the method of presenting the guide mark 7 together with, or instead of, the electronic rut 70 that applies a variably weighted reaction force F.
In the driving support flow of modification 8, a threshold Dth variably input by the passenger may be used as the criterion for determining at S103 whether or not the host vehicle can travel. In the driving support flow of modification 9, one of the obstacle attribute and the external environment state may be excluded from the items associated with the threshold Dth at S4111. In this case, at S4103, the threshold Dth corresponding to the remaining one of the obstacle attribute and the external environment state is read from the memory 1a and used for the travel availability determination.
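As a minimal sketch of the travel availability determination, one can assume the criterion compares the passable width against the vehicle width plus Dth; the decision rule and all numbers below are illustrative assumptions.

```python
# Hypothetical sketch: travel is judged possible when the passable width
# beside the obstacles covers the vehicle width plus the threshold Dth.
def can_travel(passable_width_m: float, vehicle_width_m: float,
               dth_m: float) -> bool:
    return passable_width_m >= vehicle_width_m + dth_m

# A 2.3 m gap passes for a 1.8 m-wide vehicle with Dth = 0.3 m, but fails
# once a stricter, e.g. passenger-input, Dth = 0.6 m is in effect.
print(can_travel(2.3, 1.8, 0.3))  # True:  2.3 >= 2.1
print(can_travel(2.3, 1.8, 0.6))  # False: 2.3 <  2.4
```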
In the driving support flow of modification 10, the order of S103 and S4103 and that of S104, S2104, and S3104 may be swapped. In this case, when travel is determined to be possible at S103 and S4103, S105, S2105, and S3105 are executed. In the driving support flow of modification 11, the process may return to S104, S2104, and S3104 after a negative determination at S106. In this case, at S104, S2104, and S3104 reached from the negative determination at S106, the guide mark 7 may be updated according to the actual position of the host vehicle 2 that has moved under the steering assistance.
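A toy sketch of the modification-11 loop follows, assuming a one-dimensional position, a fixed 1.0 m look-ahead, and an imperfect 0.6 tracking gain; all names and numbers are assumptions for illustration only.

```python
# Hypothetical sketch of the modification-11 loop: after a negative
# determination at S106, the guide mark is regenerated (S104) from the
# host vehicle's actual position reached under assistance (S105).
def generate_guide_mark(position: float) -> float:
    return position + 1.0                      # S104: aim one step ahead

def assist_steering(position: float, mark: float) -> float:
    return position + 0.6 * (mark - position)  # S105: imperfect tracking

def drive_through_scene(start: float, goal: float) -> float:
    position = start
    while position < goal:                     # S106 negative -> loop back
        mark = generate_guide_mark(position)   # re-entered with actual pose
        position = assist_steering(position, mark)
    return position

print(round(drive_through_scene(0.0, 5.0), 1))  # 5.4
```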
Although LIDAR, a camera, and radar were exemplified as the surrounding environment recognition sensor 3 in the first embodiment, a sonar, for example, may be added to the surrounding environment recognition sensor 3 in modification 12. This is because, when the host vehicle 2 comes so close to a detection target that the near-side end of the target falls outside the detection range of the sensors exemplified above, combining an additional sonar or the like makes it possible to warn the passenger so that the host vehicle 2 does not contact or collide with the target.
Specifically, as shown in fig. 23, the angle of view θ defining the detection range of the surrounding environment recognition sensor 3 is limited (0° < θ < 180°). Therefore, when the host vehicle 2 approaches the obstacles 4A and 4B to be detected within a distance L in the surrounding space 4, the obstacles 4A and 4B may no longer be detected in their entirety. Here, the distance L is given by L = [(W + Δ)/2]/tan(θ/2), using the vehicle width W of the host vehicle 2, the remaining width Δ, and the angle of view θ of the surrounding environment recognition sensor 3. Accordingly, when the host vehicle 2 comes within the distance L, which can be estimated in advance from this formula, it is effective in modification 12 to compensate with a sonar or the like.
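The formula can be checked numerically; the vehicle width, remaining width, and angle of view used below are illustrative assumptions, not values from the embodiments.

```python
import math

# Worked example of L = [(W + Δ)/2] / tan(θ/2) from fig. 23.
def blind_distance(vehicle_width_m: float, remaining_width_m: float,
                   view_angle_deg: float) -> float:
    """Distance below which a (W + Δ)-wide span exceeds the sensor's view."""
    half_width = (vehicle_width_m + remaining_width_m) / 2.0
    return half_width / math.tan(math.radians(view_angle_deg) / 2.0)

# W = 1.8 m, Δ = 0.4 m, θ = 120°: L = 1.1 / tan(60°) ≈ 0.64 m.
print(round(blind_distance(1.8, 0.4, 120.0), 2))  # 0.64
```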
In modification 13, instead of modification 12, a plurality of surrounding environment recognition sensors 3 are arranged side by side in the adjacent state shown in fig. 24, and the total of their angles of view θ is set to 180°, whereby the obstacles 4A and 4B can be detected in their entirety even within the distance L from the host vehicle 2. Alternatively, in modification 14, instead of modification 12, a plurality of surrounding environment recognition sensors 3 are arranged side by side in the separated state shown in fig. 25, and the total of their angles of view θ is set to exceed 180°, whereby the obstacles 4A and 4B can likewise be detected in their entirety even within the distance L from the host vehicle 2.
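A minimal sketch of the corresponding coverage check follows; the sensor values are illustrative assumptions.

```python
# Hypothetical sketch: check whether several sensors together reach the
# 180-degree total angle of view of modifications 13 and 14.
def covers_near_field(view_angles_deg: list) -> bool:
    return sum(view_angles_deg) >= 180.0

print(covers_near_field([120.0]))         # False: single limited sensor
print(covers_near_field([90.0, 90.0]))    # True: modification 13 (= 180°)
print(covers_near_field([120.0, 120.0]))  # True: modification 14 (> 180°)
```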
The flowchart described in the present disclosure, or the processing of the flowchart, is composed of a plurality of units (or steps), each of which is expressed as, for example, S101. Each unit can be divided into a plurality of sub-units, and a plurality of units can be combined into one unit. Each unit thus configured can be referred to as a circuit, a device, a module, or a method.
Further, each of the above units, or a combination thereof, can be realized not only as (i) a software unit combined with a hardware unit (e.g., a computer), but also as (ii) a hardware unit (e.g., an integrated circuit or a wired logic circuit), with or without the functions of related devices. The hardware unit may also be configured inside a microcomputer.
The present disclosure has been described in accordance with embodiments, but it should be understood that the present disclosure is not limited to those embodiments and configurations. The present disclosure also encompasses various modifications and equivalent variations. In addition, various combinations and modes, as well as other combinations and modes including only a single element, more elements, or fewer elements, are also within the scope and spirit of the present disclosure.

Claims (9)

1. A driving support device for supporting driving by a passenger of a host vehicle, comprising:
a map acquisition unit that acquires a surrounding space map that represents the states of objects in a surrounding space of the host vehicle and thereby defines the positional relationship of the objects with one another;
a travel determination unit that determines, based on the surrounding space map acquired by the map acquisition unit, whether or not the host vehicle is able to travel in a support target scene that is at least one of a passing scene and a shifting scene;
a guide mark generation unit that generates, based on the surrounding space map acquired by the map acquisition unit, a guide mark for guiding the host vehicle in the support target scene, and adjusts, based on reference position information, the relative positions on the surrounding space map of the guide mark and an obstacle to the host vehicle in the support target scene;
a steering assist unit that assists the passenger's steering of the host vehicle based on the guide mark generated by the guide mark generation unit when the travel determination unit determines that the host vehicle is able to travel; and
a reference updating unit that updates the reference position information based on a degree of deviation between the guide mark generated by the guide mark generation unit and an actual trajectory through which the host vehicle passes during the steering assistance by the steering assist unit.
2. The driving support device according to claim 1, wherein
the guide mark generation unit adjusts the relative positions of the obstacle and the guide mark based on the reference position information corresponding to an attribute of the obstacle in the support target scene, and
the reference updating unit updates the reference position information used by the guide mark generation unit in association with the attribute of the obstacle in the support target scene.
3. The driving support device according to claim 2, wherein
the steering assist unit adjusts, based on the reference position information corresponding to the attribute of the obstacle in the support target scene, a reaction force applied against steering that deviates from the guide mark.
4. The driving support device according to claim 1, wherein
the steering assist unit adjusts, in accordance with an attribute of the obstacle in the support target scene, a reaction force applied against steering that deviates from the guide mark.
5. The driving support device according to claim 1, wherein
the guide mark generation unit adjusts the relative positions of the obstacle and the guide mark based on the reference position information corresponding to an external environment state of the host vehicle in the support target scene, and
the reference updating unit updates the reference position information used by the guide mark generation unit in association with the external environment state of the host vehicle in the support target scene.
6. The driving support device according to claim 1, wherein
the travel determination unit is given a determination criterion for whether travel is possible, and
the reference updating unit, when passing or shifting is rejected during the steering assistance by the steering assist unit, updates the determination criterion toward a side on which the travel determination unit less readily determines that travel is possible.
7. The driving support device according to claim 6, wherein
the travel determination unit is given the determination criterion corresponding to an attribute of the obstacle in the support target scene, and
the reference updating unit updates the determination criterion of the travel determination unit in association with the attribute of the obstacle in the support target scene.
8. The driving support device according to claim 6, wherein
the travel determination unit is given the determination criterion corresponding to an external environment state of the host vehicle in the support target scene, and
the reference updating unit updates the determination criterion of the travel determination unit in association with the external environment state of the host vehicle in the support target scene.
9. The driving support device according to any one of claims 1 to 8, further comprising
a stop unit that executes at least one of a vehicle stop instruction for instructing the host vehicle to stop and a guide mark generation stop for stopping generation of the guide mark, when the travel determination unit determines that the host vehicle is not able to travel.

Applications Claiming Priority (11)

Application Number Priority Date Filing Date Title
JP2017054771 2017-03-21
JP2017-054771 2017-03-21
JP2017252523A JP6699647B2 (en) 2017-03-21 2017-12-27 Driving support device
JP2017252525A JP2018158711A (en) 2017-03-21 2017-12-27 Driving support device
JP2017-252525 2017-12-27
JP2017-252526 2017-12-27
JP2017252524A JP6699648B2 (en) 2017-03-21 2017-12-27 Driving support device
JP2017-252524 2017-12-27
JP2017-252523 2017-12-27
JP2017252526A JP6760255B2 (en) 2017-03-21 2017-12-27 Driving support device
PCT/JP2018/005627 WO2018173581A1 (en) 2017-03-21 2018-02-19 Driving assistance device

Publications (2)

Publication Number Publication Date
CN110462704A CN110462704A (en) 2019-11-15
CN110462704B true CN110462704B (en) 2021-08-31

Family

ID=63795381

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201880019444.6A Active CN110462704B (en) 2017-03-21 2018-02-19 Driving support device

Country Status (2)

Country Link
JP (4) JP6760255B2 (en)
CN (1) CN110462704B (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10366714B1 (en) 2016-04-28 2019-07-30 Western Digital Technologies, Inc. Magnetic write head for providing spin-torque-assisted write field enhancement
US10424323B1 (en) 2016-12-30 2019-09-24 Western Digital Technologies, Inc. High-bandwidth STO bias architecture with integrated slider voltage potential control
US10388305B1 (en) 2016-12-30 2019-08-20 Western Digital Technologies, Inc. Apparatus and method for writing to magnetic media using an AC bias current to enhance the write field
JP6850231B2 (en) 2017-09-19 2021-03-31 株式会社東芝 Magnetic head and magnetic recording / playback device
JP7178297B2 (en) * 2019-03-12 2022-11-25 スズキ株式会社 Driving support device
CN113631454B (en) * 2019-03-27 2022-12-13 日产自动车株式会社 Driving assistance method and driving assistance device
US11011190B2 (en) 2019-04-24 2021-05-18 Western Digital Technologies, Inc. Magnetic write head with write-field enhancement structure including a magnetic notch
US11087784B2 (en) 2019-05-03 2021-08-10 Western Digital Technologies, Inc. Data storage devices with integrated slider voltage potential control
US10957346B2 (en) 2019-05-03 2021-03-23 Western Digital Technologies, Inc. Magnetic recording devices and methods using a write-field-enhancement structure and bias current with offset pulses
JP7243463B2 (en) * 2019-06-03 2023-03-22 株式会社デンソー Driving support control device, driving support control method, driving support control program
US20200406894A1 (en) * 2019-06-28 2020-12-31 Zoox, Inc. System and method for determining a target vehicle speed
CN114730526A (en) 2019-11-28 2022-07-08 三菱电机株式会社 Object recognition device, object recognition method, and object recognition program

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002352397A (en) * 2001-05-29 2002-12-06 Fuji Heavy Ind Ltd Driving support device for vehicle
JP2006099409A (en) * 2004-09-29 2006-04-13 Denso Corp Navigation system for avoiding deviation due to contact
JP2009208505A (en) * 2008-02-29 2009-09-17 Toyota Motor Corp Lane keep assist system, lane keep assist method
JP2011028609A (en) * 2009-07-28 2011-02-10 Nissan Motor Co Ltd Traveling support device and traveling support method
JP2012173843A (en) * 2011-02-18 2012-09-10 Nissan Motor Co Ltd Travel route generation device and travel route generation method
CN104228945A (en) * 2005-05-19 2014-12-24 罗伯特·博世有限公司 Driver assistance method
CN105163994A (en) * 2013-05-01 2015-12-16 丰田自动车株式会社 Driving support apparatus and driving support method
CN105246756A (en) * 2013-05-31 2016-01-13 日立汽车系统株式会社 Vehicle control system
DE102014111122A1 (en) * 2014-08-05 2016-02-11 Valeo Schalter Und Sensoren Gmbh Method for the at least semi-autonomous maneuvering of a motor vehicle, driver assistance system and motor vehicle
CN106470885A (en) * 2014-08-07 2017-03-01 日立汽车系统株式会社 Vehicle control system and the behavior planning system possessing this vehicle control system
JP2017052486A (en) * 2015-09-11 2017-03-16 株式会社ジェイテクト Steering device

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5200926B2 (en) * 2008-12-26 2013-06-05 トヨタ自動車株式会社 Driving assistance device
JP5447663B2 (en) * 2010-06-11 2014-03-19 日産自動車株式会社 Parking support apparatus and method
DE102010061829A1 (en) * 2010-11-24 2012-05-24 Continental Teves Ag & Co. Ohg Method and distance control device for avoiding collisions of a motor vehicle in a driving situation with a small side clearance
JP2013241088A (en) * 2012-05-21 2013-12-05 Toyota Motor Corp Parking support device
JP2014069721A (en) * 2012-09-28 2014-04-21 Aisin Seiki Co Ltd Periphery monitoring device, control method, and program
JP5938334B2 (en) * 2012-11-12 2016-06-22 株式会社日本自動車部品総合研究所 Parking assistance device
JP6573795B2 (en) * 2015-07-31 2019-09-11 アイシン精機株式会社 Parking assistance device, method and program

Also Published As

Publication number Publication date
JP2018158709A (en) 2018-10-11
JP6760255B2 (en) 2020-09-23
JP2018158712A (en) 2018-10-11
JP2018158710A (en) 2018-10-11
JP6699647B2 (en) 2020-05-27
JP6699648B2 (en) 2020-05-27
CN110462704A (en) 2019-11-15
JP2018158711A (en) 2018-10-11

Similar Documents

Publication Publication Date Title
CN110462704B (en) Driving support device
JP7416176B2 (en) display device
CN110281930B (en) Vehicle control device, vehicle control method, and storage medium
US10479274B2 (en) Vehicle and control method for the same
CN108128243B (en) Display device for vehicle
EP3708962B1 (en) Display apparatus for vehicle and vehicle
CN110001643B (en) Vehicle control device, vehicle control method, storage medium, and information acquisition device
US9978280B2 (en) Driver assistance apparatus and vehicle including the same
US11230320B2 (en) Driving assistance device
US9898006B2 (en) Drive assist device
EP3088268B1 (en) Vehicle driving aid device and vehicle having same
US8730260B2 (en) Obstacle information notification apparatus for vehicle
US20190071075A1 (en) Vehicle control system, vehicle control method, and vehicle control program
US20130054089A1 (en) Method and control device for highlighting an expected movement path of a vehicle
JP2020163900A (en) Vehicle control device, vehicle control method, and program
CN115298073B (en) Vehicle travel support method and travel support device
US10522041B2 (en) Display device control method and display device
KR102611337B1 (en) Vehicle AR display device and method of operation thereof
CN113401056A (en) Display control device, display control method, and computer-readable storage medium
KR20190016375A (en) Side mirror for vehicle and vehicle
JP6471707B2 (en) Driving teaching device
JP7447842B2 (en) Vehicle display control device, vehicle display control method
WO2024069689A1 (en) Driving assistance method and driving assistance device
KR20230067799A (en) Method and Apparatus for controlling virtual lane based on environmental conditions

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant