CN113442921A - Information processing device, driving assistance device, moving object, information processing method, and storage medium - Google Patents


Info

Publication number
CN113442921A
CN113442921A (application CN202010213742.8A)
Authority
CN
China
Prior art keywords
information
vehicle
moving body
output
unit
Prior art date
Legal status
Pending
Application number
CN202010213742.8A
Other languages
Chinese (zh)
Inventor
奈良匡树
Current Assignee
Honda Motor Co Ltd
Original Assignee
Honda Motor Co Ltd
Priority date
Filing date
Publication date
Application filed by Honda Motor Co Ltd filed Critical Honda Motor Co Ltd
Priority application: CN202010213742.8A
Publication of CN113442921A
Legal status: Pending

Classifications

    • B — PERFORMING OPERATIONS; TRANSPORTING
    • B60 — VEHICLES IN GENERAL
    • B60W — CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00 — Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/18 — Propelling the vehicle
    • B60W30/18009 — Propelling the vehicle related to particular drive situations
    • B60W30/18163 — Lane change; Overtaking manoeuvres
    • B60W2554/00 — Input parameters relating to objects
    • B60W2554/40 — Dynamic objects, e.g. animals, windblown objects
    • B60W2554/402 — Type

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Traffic Control Systems (AREA)

Abstract

An information processing device, a driving assistance device, a moving body, an information processing method, and a storage medium that can realize safe or smooth driving. The information processing device includes a determination unit that determines the content of driving assistance information to be output in order to assist a driver of a moving body with the driving operation of the moving body. The determination unit (a) determines to output the driving assistance information including first information when the moving body can move from behind or beside a second moving body, which is moving in front of or beside the moving body, to a position in front of the second moving body and the operation of the moving body is difficult, and (b) determines to output the driving assistance information including second information different from the first information when the moving body can move from behind or beside the second moving body to a position in front of the second moving body and the operation of the moving body is easy.

Description

Information processing device, driving assistance device, moving object, information processing method, and storage medium
Technical Field
The present invention relates to an information processing device, a driving assistance device, a moving body, an information processing method, and a storage medium.
Background
Patent document 1 discloses a driving assistance device that performs assistance in accordance with the intention of the driver. Patent document 2 discloses a lane guidance display system including a display control unit that generates a guide line portion on the recommended-lane side within a forward image and displays, on a display unit, the forward image with a guide line including that guide line portion superimposed on it.
Patent document 1: japanese patent laid-open publication No. 2015-143970
Patent document 2: japanese patent laid-open publication No. 2013-130463
Disclosure of Invention
In the first aspect of the present invention, an information processing device is provided. The information processing device includes, for example, a determination unit that determines the content of driving assistance information to be output in order to assist a driver of a moving body with the driving operation of the moving body. In the information processing device, the determination unit (a) determines to output the driving assistance information including first information, for example, when the moving body can move from behind or beside a second moving body, which is moving in front of or beside the moving body, to a position in front of the second moving body and the operation of the moving body is difficult. The determination unit (b) determines to output the driving assistance information including second information different from the first information, for example, when the moving body can move from behind or beside the second moving body to a position in front of the second moving body and the operation of the moving body is easy.
The information processing device may include a first determination unit that determines whether the moving body can move from behind or beside the second moving body to a position in front of the second moving body, and a second determination unit that determines whether the operation for moving the moving body from behind or beside the second moving body to a position in front of the second moving body is difficult. The determination unit may (i) determine to output the driving assistance information including the first information when the first determination unit determines that the moving body can move in front of the second moving body and the second determination unit determines that the operation is difficult, and (ii) determine to output the driving assistance information including the second information when the first determination unit determines that the moving body can move in front of the second moving body and the second determination unit determines that the operation is not difficult.
The information processing device may further include a road information acquisition unit that acquires road information on the road on which the moving body travels, and a vehicle information acquisition unit that acquires peripheral vehicle information on one or more peripheral vehicles traveling around the moving body. The information processing device may include a region specifying unit that specifies, based on the road information and the peripheral vehicle information, a region that the moving body can use to move from behind or beside the second moving body to a position in front of the second moving body. The moving body may be a vehicle, and the second moving body may be included in the one or more peripheral vehicles. The second determination unit may determine that it is difficult for the moving body to move from behind or beside the second moving body to a position in front of the second moving body when the third determination unit determines that the driving assistance information needs to be output and the region specified by the region specifying unit has a predetermined characteristic.
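As an illustrative sketch of how such a region might be quantified (the function name, geometry, and simplifications here are assumptions for illustration, not the patent's method), the minimum usable width beside the second moving body could be derived from the road width and the space occupied by the surrounding vehicles:

```python
def min_region_width(road_width_m: float,
                     preceding_vehicle_width_m: float,
                     preceding_lateral_offset_m: float,
                     oncoming_occupied_m: float = 0.0) -> float:
    """Width left for the moving body beside the second moving body:
    the usable road width minus the span occupied by the second moving
    body (including its lateral offset) and by any oncoming traffic.
    Clamped at zero when no region remains."""
    occupied = (preceding_vehicle_width_m
                + preceding_lateral_offset_m
                + oncoming_occupied_m)
    return max(road_width_m - occupied, 0.0)
```

For example, under these assumptions a 7.0 m usable road width with a 1.8 m wide preceding vehicle offset 0.2 m laterally leaves a 5.0 m region.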
The predetermined characteristic may be that the minimum value of the region width, which is the length of the region in a direction substantially parallel to the vehicle width direction of the second moving body, is smaller than a first threshold value. The information processing device may further include a speed information acquisition unit that acquires speed information indicating the speed of the moving body, and a threshold determination unit that determines the first threshold value based on the speed indicated by the speed information. The threshold determination unit may determine the first threshold value such that the first threshold value used when the speed of the moving body is greater than a second threshold value is larger than the first threshold value used when the speed of the moving body is smaller than the second threshold value.
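The speed-dependent threshold rule above can be sketched as follows; the numeric values and function names are illustrative assumptions only, since the patent does not specify concrete thresholds:

```python
def first_threshold(speed_mps: float,
                    second_threshold_mps: float = 16.7,
                    narrow_m: float = 2.0,
                    wide_m: float = 2.6) -> float:
    """First threshold on the minimum region width: larger when the
    moving body's speed exceeds the second threshold, since faster
    travel demands a wider passing region."""
    return wide_m if speed_mps > second_threshold_mps else narrow_m


def operation_is_difficult(min_region_width_m: float, speed_mps: float) -> bool:
    # The region has the predetermined characteristic (the operation is
    # judged difficult) when its minimum width is below the first threshold.
    return min_region_width_m < first_threshold(speed_mps)
```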
The road information may include at least one of image data or ranging data of the road on which the moving body travels, configuration information indicating the configuration of that road, and regulation information indicating the traffic regulations applicable to that road. The first determination unit may determine whether the moving body can move from behind or beside the second moving body to a position in front of the second moving body based on the vehicle width of the moving body, and/or the second determination unit may determine whether the operation of the moving body is difficult based on the vehicle width of the moving body.
The information processing device may include a third determination unit that determines whether the driving assistance information needs to be output, and the determination unit may determine the content of the driving assistance information when the third determination unit determines that it needs to be output. The information processing device may include an intention detection unit that detects that a rider intends to move the moving body from behind or beside the second moving body to a position in front of the second moving body, and the third determination unit may determine that the driving assistance information needs to be output when the intention detection unit detects that intention.
The determination unit may (c) determine to output the driving assistance information including third information, different from the first information and the second information, when it is determined that the moving body cannot move in front of the second moving body. The information processing device may further include a route determination unit that determines the route along which the moving body moves from behind or beside the second moving body to a position in front of the second moving body, and the first information may include information indicating the route determined by the route determination unit. The information processing device may further include an image generation unit that generates an image for presenting the route determined by the route determination unit to the rider, superimposed on an image of objects in real space, and the first information may include data of the image generated by the image generation unit.
The information processing device may include a first operation amount determination unit that determines the operation amount of a first operation for changing the moving direction of the moving body. The first operation amount determination unit may determine the operation amount of the first operation required to move the moving body along the route determined by the route determination unit, and the second information may include information indicating the operation amount of the first operation so determined.
The information processing device may include a second operation amount determination unit that determines the operation amount of a second operation for changing the moving speed of the moving body. The second operation amount determination unit may determine the operation amount of the second operation required to move the moving body along the route determined by the route determination unit. When the relative speed between the moving body and the second moving body is smaller than a third threshold value, the determination unit may determine to output the driving assistance information including information indicating the operation amount of the second operation so determined.
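Gating the speed-change guidance on relative speed, as described above, might look like the following sketch; the threshold value, dictionary keys, and function name are assumptions for illustration:

```python
def build_operation_guidance(steering_amount: float,
                             accelerator_amount: float,
                             relative_speed_mps: float,
                             third_threshold_mps: float = 2.0) -> dict:
    """Always include the first-operation (direction) amount; include the
    second-operation (speed) amount only when the relative speed to the
    second moving body is below the third threshold, i.e. when the moving
    body must also change speed to pass."""
    guidance = {"steering_operation_amount": steering_amount}
    if relative_speed_mps < third_threshold_mps:
        guidance["accelerator_operation_amount"] = accelerator_amount
    return guidance
```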
In the second aspect of the present invention, a driving assistance device is provided. The driving assistance device includes, for example, the information processing device according to the first aspect, and an information presentation device that presents the driving assistance information determined by the determination unit to a rider of the moving body.
In the third aspect of the present invention, a moving body is provided. The moving body includes, for example, the driving assistance device according to the second aspect.
In the fourth aspect of the present invention, an information processing method is provided. The information processing method includes a determination step of determining the content of the driving assistance information to be output in order to assist a driver of a moving body with the driving operation of the moving body. The determination step includes, for example, (a) determining to output the driving assistance information including first information when the moving body can move from behind or beside a second moving body, which is moving in front of or beside the moving body, to a position in front of the second moving body and the operation of the moving body is difficult, and (b) determining to output the driving assistance information including second information different from the first information when the moving body can move from behind or beside the second moving body to a position in front of the second moving body and the operation of the moving body is easy.
In the fifth aspect of the present invention, a storage medium is provided. The storage medium may be a computer-readable storage medium storing a program, and may be a non-volatile computer-readable medium storing the program. The program, when executed, causes the information processing method to be executed.
In the storage medium, the information processing method may include a determination step of determining the content of the driving assistance information to be output in order to assist a driver of the moving body with the driving operation of the moving body. The determination step includes, for example, (a) determining to output the driving assistance information including first information when the moving body can move from behind or beside a second moving body, which is moving in front of or beside the moving body, to a position in front of the second moving body and the operation of the moving body is difficult, and (b) determining to output the driving assistance information including second information different from the first information when the moving body can move from behind or beside the second moving body to a position in front of the second moving body and the operation of the moving body is easy.
Note that the above summary does not enumerate all of the technical features of the present invention. Sub-combinations of these feature groups may also constitute inventions.
Drawings
Fig. 1 schematically shows an example of a driving assistance method by a vehicle 20.
Fig. 2 schematically shows an example of the assist information 230 presented by the vehicle 20.
Fig. 3 schematically shows an example of an image 330 visually confirmed by the driver 22 of the vehicle 20.
Fig. 4 schematically shows an example of the system configuration of the vehicle 20.
Fig. 5 schematically shows an example of the internal configuration of the sensor unit 430.
Fig. 6 schematically shows an example of the internal configuration of the recognition unit 456.
Fig. 7 schematically shows an example of the internal configuration of the head-up display 112.
Fig. 8 schematically shows an example of the internal configuration of the assist control unit 114.
Fig. 9 schematically shows an example of information processing in the passing determination unit 840.
Fig. 10 schematically shows an example of information processing performed by the passing area specifying unit 842.
Fig. 11 schematically shows an example of information processing performed by the passing area specifying unit 842.
Fig. 12 schematically shows an example of information processing in the assist control unit 114.
Fig. 13 schematically shows an example of the system configuration of the computer 3000.
Detailed Description
The present invention will be described below through embodiments, but the following embodiments do not limit the invention according to the claims. In addition, not all combinations of the features described in the embodiments are necessarily essential to the solution provided by the invention. In the drawings, the same or similar portions are denoted by the same reference numerals, and redundant description may be omitted.
[ overview of Driving assistance method ]
The configuration of the vehicle 20 and an outline of the information processing in the vehicle 20 will be described with reference to figs. 1, 2, and 3, taking as an example a case where the vehicle 20 traveling on a road 10 assists the driver 22 in driving the vehicle 20. According to the present embodiment, when the driver 22 performs, or is about to perform, an operation for causing the vehicle 20 to execute a specific action, the vehicle 20 presents information for assisting that operation (which may be referred to as assist information) to the driver 22. The vehicle 20 can thereby assist the driver 22.
In one embodiment, the content of the assist information may differ between a case where it is appropriate or possible to cause the vehicle 20 to perform the action intended by the driver 22 and a case where it is inappropriate or impossible to do so. In another embodiment, the content of the assist information may differ between a case where causing the vehicle 20 to perform the intended action requires a difficult operation and a case where it does not. The intention of the driver 22 can be estimated from the driving operation of the driver 22, the state of the vehicle 20, and the like.
Fig. 1 schematically shows an example of a driving assistance method by a vehicle 20. Fig. 1 schematically shows an example of the assist information 30 presented by the vehicle 20. Fig. 2 schematically shows an example of the assist information 230 presented by the vehicle 20. Fig. 3 schematically shows an example of an image 330 visually confirmed by the driver 22 when the assist information 230 is presented to the driver 22.
In the present embodiment, the method by which the vehicle 20 assists the driver 22 will be described in detail, taking as an example a case where the driver 22 operates the vehicle 20 to change its traveling direction. More specifically, the vehicle 20 will be described by taking as an example a case where the vehicle 20 presents the assist information 30 for assisting the operation when the driver 22 performs an operation for overtaking the preceding vehicle 40.
As shown in fig. 1, in the present embodiment, the vehicle 20 includes a driving assistance unit 110 and a front screen 120. In the present embodiment, the driving assistance unit 110 includes a head-up display 112 and an assistance control unit 114.
As shown in fig. 1, in one embodiment, the driving assistance unit 110 presents the assist information 30 to the driver 22. The assist information 30 shown in fig. 1 is an example of assist information presented to the driver 22 when it is appropriate or possible to cause the vehicle 20 to perform the action intended by the driver 22, but doing so involves a difficult operation.
In the present embodiment, the assist information 30 includes information indicating at least one of the message 32, the steering operation amount 34, and the accelerator operation amount 36. The message 32 may contain information indicating that it is appropriate or possible for the vehicle 20 to perform the action intended by the driver 22. The information indicating the steering operation amount 34 and the information indicating the accelerator operation amount 36 may include information indicating an appropriate operation amount for causing the vehicle 20 to perform the action intended by the driver 22.
In the embodiment described with reference to fig. 1, the information indicating the steering operation amount 34 includes information indicating the current steering operation amount and information indicating an appropriate steering operation amount for causing the vehicle 20 to perform the action intended by the driver 22. In the above-described embodiment, an instrument-type icon is used to indicate the steering operation amount. In addition, a triangular icon representing the above-described appropriate steering operation amount is arranged on the outer peripheral portion of the meter. In the above-described embodiment, the icons indicate that the current steering operation amount is 0.4 to the right, and the appropriate steering operation amount is 0.3 to the right.
In the embodiment described with reference to fig. 1, the information indicating the accelerator operation amount 36 includes information indicating the current accelerator operation amount and information indicating an appropriate accelerator operation amount for causing the vehicle 20 to execute the action intended by the driver 22. In the above-described embodiment, an instrument-type icon is used to indicate the accelerator operation amount. In addition, a triangular icon representing the above-described appropriate amount of throttle operation is arranged on the outer peripheral portion of the meter. In the above-described embodiment, it is indicated by these icons that the current accelerator operation amount is 25% and the above-described appropriate accelerator operation amount is 60%.
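The meter-style icons described above reduce to a simple mapping from an operation amount to a needle position; the value ranges and sweep angle below are assumptions for illustration, not taken from the patent:

```python
def needle_angle_deg(amount: float,
                     lo: float = -1.0,
                     hi: float = 1.0,
                     sweep_deg: float = 270.0) -> float:
    """Map an operation amount in [lo, hi] (e.g. steering: -1.0 full left
    to +1.0 full right; accelerator: 0.0 to 1.0) onto a needle angle in
    [0, sweep_deg]. Out-of-range amounts are clamped."""
    amount = min(max(amount, lo), hi)
    return (amount - lo) / (hi - lo) * sweep_deg
```

Under these assumed ranges, a current steering amount of 0.4 to the right places the needle at 189 degrees, and the triangular marker for the appropriate amount of 0.3 sits at 175.5 degrees.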
The assist information 30 preferably includes information indicating at least one of the steering operation amount 34 and the accelerator operation amount 36. The assist information 30 preferably includes information indicating the steering operation amount 34. The assistance information 30 may or may not include the message 32.
On the other hand, in the case where it is not appropriate or not possible to cause the vehicle 20 to perform the action intended by the driver 22, the assistance information 30 contains, for example, a message 32 indicating that it is not appropriate or not possible to cause the vehicle 20 to perform the action intended by the driver 22. In such a case, the assist information 30 may include information indicating at least one of the steering operation amount 34 and the accelerator operation amount 36. At this time, the information indicating the steering operation amount 34 and the information indicating the accelerator operation amount 36 may include information indicating an appropriate operation amount for stopping or interrupting the operation intended by the driver 22.
In another embodiment, as shown in fig. 2, the driving assistance unit 110 presents the assist information 230 to the driver 22. The assist information 230 shown in fig. 2 is an example of assist information presented to the driver 22 when it is appropriate or possible to cause the vehicle 20 to perform the action intended by the driver 22 and doing so does not involve a difficult operation.
In the present embodiment, the assist information 230 includes information indicating the message 32 and information indicating the recommended route 38. The message 32 may contain information indicating that it is appropriate or possible to cause the vehicle 20 to perform the action intended by the driver 22. The assist information 230 may or may not include the message 32.
The recommended route 38 may be a route recommended as the travel route of the vehicle 20 when the vehicle 20 is caused to perform the action intended by the driver 22. For example, when the driver 22 operates the vehicle 20 to overtake the preceding vehicle 40, the recommended route 38 is set such that the vehicle 20 can safely overtake the preceding vehicle 40. By being presented with the recommended route 38, the driver 22 can intuitively change the traveling direction of the vehicle 20.
In addition, the assist information 230 may contain information indicating the steering operation amount 34, and may contain information indicating the accelerator operation amount 36. The information indicating the steering operation amount 34 and the information indicating the accelerator operation amount 36 may include information indicating an appropriate operation amount for causing the vehicle 20 to perform the action intended by the driver 22.
As shown in fig. 3, the driving assistance unit 110 may present the assist information 230 to the driver 22 superimposed on the image of objects in real space. The driver 22 can thus operate the vehicle 20 while viewing the image 330, in which the assist information 230 displayed on the front screen 120 is superimposed on the real-space objects visually confirmed through the front screen 120.
According to the embodiments shown in figs. 1 to 3, the content and/or output mode of the assist information presented to the driver 22 is switched between the case where it is appropriate or possible to cause the vehicle 20 to perform the action intended by the driver 22 and the case where it is not. The driving assistance unit 110 can thereby present assist information appropriate to the driving skill level required of the driver 22. As a result, safe and smooth driving can be achieved even when, for example, the driver 22 drives the vehicle 20 manually.
For example, when steering the vehicle 20 does not require a high level of driving skill, most drivers 22 can steer the vehicle 20 safely when presented with only the recommended route 38 as assist information. On the other hand, when steering the vehicle 20 does require a high level of driving skill and only the recommended route 38 is presented, the driver 22 may have difficulty moving the vehicle 20 along the recommended route 38. According to the present embodiment, however, the steering operation amount 34 is presented when steering the vehicle 20 requires a high level of driving skill, so the driver 22 can move the vehicle 20 along the recommended route 38 relatively easily. When both the steering operation amount 34 and the accelerator operation amount 36 are presented, the driver 22 can move the vehicle 20 along the recommended route 38 even more easily.
[ overview of each unit of the vehicle 20 ]
In the present embodiment, the vehicle 20 operates in accordance with the operations of the driver 22, and the driving assistance unit 110 assists the driver 22 in driving the vehicle 20. For example, the driving assistance unit 110 presents various assist information to the driver 22 to assist the driver 22's driving operation. The driving assistance unit 110 may determine the content of the assist information to be presented to the driver 22 based on the situation in which the vehicle 20 is placed.
Examples of the vehicle 20 include an automobile, a motorcycle, and a train. Examples of the motorcycle include (i) a two-wheeled motorcycle, (ii) a three-wheeled motorcycle, and (iii) a standing scooter or tricycle equipped with a power unit, such as a powered kickboard or a powered skateboard.
In the present embodiment, the head-up display 112 presents the assist information to the driver 22. The content of the assist information output from the head-up display 112 is determined by, for example, the assist control unit 114.
When the assist control unit 114 determines the output mode of the assist information, the head-up display 112 may present the assist information to the driver 22 in that output mode. Examples of the output mode of the assist information include image output, sound output, vibration output, and combinations thereof.
In the present embodiment, the assist control unit 114 determines the content of the assist information to be presented to the driver 22. The assist control unit 114 may determine the output mode of the assist information.
The assist control unit 114 may determine the content of the assist information in accordance with the driver 22's driving of the vehicle 20. For example, when the driver 22 performs, or attempts to perform, an operation (sometimes referred to as a driving operation) for causing the vehicle 20 to execute a specific action, the assist control unit 114 detects the intention of the driver 22. The assist control unit 114 may then determine to present the driver 22 with assist information for assisting the intended operation.
The assist control unit 114 may determine the content of the assist information in accordance with the situation in which the vehicle 20 is located. In one embodiment, the assist control unit 114 determines whether it is appropriate to cause the vehicle 20 to perform the action intended by the driver 22, taking into account the situation in which the vehicle 20 is located. The assist control unit 114 may determine whether or not the above-described action is executable. The assist control unit 114 determines the content of the assist information based on the determination result. In another embodiment, the assist control unit 114 determines the difficulty of the operation for causing the vehicle 20 to perform the above-described action, taking into account the situation in which the vehicle 20 is located. The assist control unit 114 then determines the content of the assist information based on the determination result.
More specifically, when the driver 22 operates the vehicle 20 to change the traveling direction of the vehicle 20, the assist control unit 114 determines, in accordance with the situation in which the vehicle 20 is located, whether or not to present at least one of the assist information 30 and the assist information 230 to the driver 22. When at least one of the assist information 30 and the assist information 230 is to be presented, the assist control unit 114 determines which of the assist information 30 and the assist information 230 to present to the driver 22.
For example, when the driver 22 operates the vehicle 20 to overtake the preceding vehicle 40, the driver 22 attempts to move the vehicle 20 from the rear or side of the preceding vehicle 40, which is moving in front of or to the side of the vehicle 20, toward the front of the preceding vehicle 40. When the driver 22's intention to perform the passing operation is detected, the assist control unit 114 determines the content of the assist information.
More specifically, (a) when the vehicle 20 is movable as described above and an operation for moving the vehicle 20 as described above is difficult, the assist control unit 114 determines to output the assist information 30, for example. (b) When the vehicle 20 can move as described above and the operation for moving the vehicle 20 as described above is easy, the assist control unit 114 determines to output the assist information 230, for example. (c) When the vehicle 20 cannot move as described above, the assist control unit 114 determines to output assist information including a message 32 indicating that it is inappropriate or impossible to cause the vehicle 20 to perform the action intended by the driver 22, for example.
In a case where it is appropriate or possible to cause the vehicle 20 to perform the action intended by the driver 22, it may be determined that the vehicle 20 is movable as described above. In a case where it is neither appropriate nor possible to cause the vehicle 20 to perform the action intended by the driver 22, it may be determined that the vehicle 20 cannot move as described above. When causing the vehicle 20 to perform the action intended by the driver 22 involves a difficult operation, it may be determined that the operation for moving the vehicle 20 as described above is difficult. When causing the vehicle 20 to perform the action intended by the driver 22 does not involve a difficult operation, it may be determined that the operation for moving the vehicle 20 as described above is easy.
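The case analysis in (a) to (c) above can be sketched as a small selection routine. This is an illustrative sketch only, not the patented implementation; the names `AssistOutput` and `select_assist_output` are assumptions introduced here.

```python
from enum import Enum, auto

class AssistOutput(Enum):
    DETAILED_GUIDANCE = auto()     # assist information 30, for case (a)
    RECOMMENDED_ROUTE = auto()     # assist information 230, for case (b)
    NOT_POSSIBLE_MESSAGE = auto()  # assist information including message 32, for case (c)

def select_assist_output(movable: bool, operation_difficult: bool) -> AssistOutput:
    """Select which assist information to output from the two determination results."""
    if not movable:
        # Case (c): causing the intended movement is inappropriate or impossible.
        return AssistOutput.NOT_POSSIBLE_MESSAGE
    if operation_difficult:
        # Case (a): movable, but the operation for moving the vehicle is difficult.
        return AssistOutput.DETAILED_GUIDANCE
    # Case (b): movable, and the operation is easy.
    return AssistOutput.RECOMMENDED_ROUTE
```

The two boolean inputs correspond to the possibility determination and the difficulty determination described above.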
The operation of the vehicle 20 overtaking the preceding vehicle 40 is not limited to a series of operations in which (i) the vehicle 20 approaches from behind the preceding vehicle 40 traveling in the same lane as the lane in which the vehicle 20 travels, (ii) the vehicle 20 changes lanes, (iii) the vehicle 20 moves forward of the preceding vehicle 40 while traveling in a lane different from the lane in which the preceding vehicle 40 travels, and (iv) the vehicle 20 changes lanes from the lane in which it is currently traveling back to the lane in which the preceding vehicle 40 travels. The overtaking operation may also include, within the range allowed by the traffic regulations applied to the vehicle 20, an operation in which the vehicle 20 moves forward of the preceding vehicle 40 without changing lanes after overtaking the preceding vehicle 40.
According to one example of an operation that may be included in the passing operation, (i) the vehicle 20 approaches from behind the preceding vehicle 40 traveling in the same lane as the lane in which the vehicle 20 travels, and (ii) after the vehicle 20 overtakes the preceding vehicle 40, the vehicle 20 moves forward of the preceding vehicle 40 without changing lanes. When the traffic regulations applied to the vehicle 20 permit, the series of operations described above may be included in the passing operation even when a part of the vehicle 20 deviates from the lane as the vehicle 20 moves forward of the preceding vehicle 40.
According to another example of an operation that may be included in the overtaking operation, (i) the vehicle 20 approaches from behind the preceding vehicle 40 traveling in a lane different from the lane in which the vehicle 20 travels, and (ii) after the vehicle 20 overtakes the preceding vehicle 40, the vehicle 20 moves forward of the preceding vehicle 40 without changing lanes. In addition, when the traffic regulations applied to the vehicle 20 permit, the series of operations described above may be included in the passing operation even when, after moving forward of the preceding vehicle 40, the vehicle 20 changes lanes from the lane in which it is currently traveling to the lane in which the preceding vehicle 40 travels.
In the present embodiment, the front screen 120 is provided on the front windshield of the vehicle 20, and presents the image projected by the head-up display 112 to the driver 22. The front screen 120 may be a transmissive screen. Thus, the front screen 120 can present the image projected by the head-up display 112 to the driver 22 while overlapping the object image in the real space.
[Specific configuration of each unit of the vehicle 20]
The units of the vehicle 20 may be implemented by hardware, by software, or by both. At least a part of each unit of the vehicle 20 may be realized by a control unit such as an ECU (Electronic Control Unit). At least a part of each unit of the vehicle 20 may be implemented by a personal computer or a portable terminal. For example, a personal computer or a portable terminal is used as the user interface of the driving assistance unit 110. Examples of the portable terminal include a mobile phone, a smartphone, a PDA, a tablet computer, a notebook computer, a portable computer, and a wearable computer.
When at least a part of the components constituting the vehicle 20 is realized by software, those components can be realized by starting, in an information processing device having a general configuration, a program that defines operations related to those components. The information processing device includes, for example, (i) a data processing device including various processors (e.g., a CPU and a GPU), a ROM, a RAM, and a communication interface, and (ii) a storage device (including an external storage device) such as a memory and an HDD. The information processing device may include (iii) an input device such as a keyboard, a touch panel, a camera, a microphone, various sensors, and a GPS receiver, and may include (iv) an output device such as a display device, a speaker, and a vibration device.
In the information processing device, the data processing device or the storage device may store the program. By being read into the computer, the program causes the software related thereto to cooperate with the various hardware resources of the vehicle 20, thereby functioning as specific means. These specific means implement calculation or processing of information corresponding to the intended use of the computer in the present embodiment, thereby constructing the vehicle 20 corresponding to that intended use.
The program described above may also be stored in a computer-readable medium. The program described above may also be stored in a nonvolatile computer-readable recording medium. The program may be stored in a computer-readable medium such as a CD-ROM, a DVD-ROM, a memory, or a hard disk, or may be stored in a storage device connected to a network. The above-described program may be installed from a computer-readable medium or a storage device connected to a network into a computer constituting at least a part of the vehicle 20.
By executing the program, the computer mounted on the vehicle 20 can function as at least a part of each unit of the vehicle 20. By executing the program, a computer mounted on the vehicle 20 can execute an information processing method in at least a part of each unit of the vehicle 20.
The program for causing the computer mounted on the vehicle 20 to function as at least a part of each unit of the vehicle 20 includes, for example, a module that defines an operation of at least a part of each unit of the vehicle 20. When the program or the module is executed, the program or the module drives the data processing device, the input device, the output device, the storage device, and the like, causes the computer to function as each unit of the vehicle 20, or causes the computer to execute the information processing method in each unit of the vehicle 20.
The information processing method described above includes, for example, a determination step of determining at least one of the content and the output mode of driving assistance information used to assist the driving of a moving body. The determination step may include (a) determining to output driving assistance information including 1st information when the moving body is movable from the rear or the side of a 2nd moving body, which moves in front of or to the side of the moving body, toward the front of the 2nd moving body, and the operation for moving the moving body in this manner is difficult. The determination step may include (b) determining to output driving assistance information including 2nd information different from the 1st information when the moving body is movable from the rear or the side of the 2nd moving body toward the front of the 2nd moving body and the operation for moving the moving body in this manner is easy.
The vehicle 20 may be an example of a moving body. The driver 22 may be an example of a rider. The preceding vehicle 40 may be an example of the 2nd moving body.
The assist information 30 may be an example of driving assistance information. The assist information 230 may be an example of driving assistance information. The steering operation amount 34 may be an example of the 1st information. The accelerator operation amount 36 may be an example of the 1st information. The recommended route 38 may be an example of the 2nd information. The message 32 indicating that it is inappropriate or impossible to cause the vehicle 20 to perform the action intended by the driver 22 may be an example of the 3rd information.
The driving assistance unit 110 may be an example of an information processing device or a driving assistance device. Head-up display 112 may be an example of an information presentation device. The assist control unit 114 may be an example of an information processing apparatus or a determination unit. The front screen 120 may be an example of an information presentation device.
The traveling direction of the vehicle 20 may be an example of the moving direction of the moving body. The operation of the driver 22 may be an example of a driving operation of the mobile body performed by a passenger of the mobile body. The presentation of the assistance information 30 or the assistance information 230 may be an example of the output of the driving assistance information.
In the present embodiment, the details of the driving assistance unit 110 are described by taking as an example a case where the driving assistance unit 110 assists the driving operation of the vehicle 20. However, the driving assistance unit 110 is not limited to this embodiment. The driving support unit 110 can support driving operation of any type of mobile object.
The moving body may be a pedestrian, a vehicle, a vessel, a flying object, or the like. Examples of the vessel include a ship, a hovercraft, a personal watercraft, a submersible vessel, a submarine, and an underwater scooter. The flying object may be an airplane, an airship, a hot-air balloon, a helicopter, an unmanned aerial vehicle, or the like.
In the present embodiment, the driving assistance unit 110 is described in detail by taking, as an example, a case where the driving assistance unit 110 projects assistance information onto the front screen 120, which is an example of a transmissive display. However, the driving assistance unit 110 is not limited to this embodiment.
In another embodiment, the driving support unit 110 may display the support information on a display device such as a liquid crystal display or an organic EL display. The driving support unit 110 may display the support information on the display device in a manner of overlapping the support information with an object image of the real space captured by an imaging device such as a camera. The display device may be a display device mounted on the vehicle 20 or a display device mounted on a portable terminal of the driver 22.
Displaying an electronic image on a transmissive display, which transmits light so as to provide the viewer with an object image in the real space while also showing the viewer an electronic image, includes not only (i) displaying the electronic image representing the assist information at a position where, as visually confirmed by the viewer, it overlaps the object image in the real space observed through the transmissive display, but also (ii) displaying an image obtained by combining an electronic image captured of the object image in the real space with the electronic image representing the assist information. The electronic image may be a moving image or a still image.
In another embodiment, the driving support unit 110 may project the support information onto a screen installed at an arbitrary position of the vehicle 20. The screen may be a transmissive screen or a non-transmissive screen. In the case of the non-transmissive screen, the driving support unit 110 may project an image obtained by combining an electronic image obtained by capturing an object image of a real space and an electronic image representing support information on the screen. The driving assistance unit 110 may use the retina of the driver 22 as the screen.
Fig. 4 schematically shows an example of the system configuration of the vehicle 20. In the present embodiment, the vehicle 20 includes an operation input unit 420, a sensing unit 430, a communication unit 440, a vehicle control unit 450, and a storage unit 460. In the present embodiment, the vehicle control unit 450 includes the driving support unit 110, the self-position estimation unit 452, the road traffic information acquisition unit 454, and the recognition unit 456.
In the present embodiment, the operation input unit 420 receives an operation of the driver 22. The operation input unit 420 may be an accelerator pedal, a brake pedal, a steering wheel, a direction indicator, a voice recognition device, a line-of-sight recognition device, or a gesture recognition device.
In the present embodiment, for example, the sensor unit 430 measures various physical quantities related to at least one of the vehicle 20 and the surrounding environment of the vehicle 20, and outputs data indicating the measurement result (may be referred to as measurement data). The sensor unit 430 may capture an image of at least one of the inside and the outside of the vehicle 20 and output data of the captured image (which may be referred to as image data).
The physical quantities related to the vehicle 20 include the current position, speed, acceleration, angular velocity, engine rotation speed, voltage of the drive battery, SOC, and the like. Examples of the physical quantities relating to the surrounding environment of the vehicle 20 include (i) at least one of the relative position and the relative speed of each of one or more preceding vehicles 40 with respect to the vehicle 20, (ii) the distance to an object around the vehicle 20 (which may be referred to as a distance image, distance measurement data, range data, and the like), and (iii) the weather, air temperature, humidity, illuminance, and the like around the vehicle 20. As the relative position, the relative distance and direction may be measured or output, or only the relative distance may be measured or output.
The sensing unit 430 may output at least one of the image data and the measurement data to the vehicle control unit 450. For example, the sensing unit 430 may output at least one of the image data and the measurement data to the driving assistance unit 110, the self-position estimating unit 452, and the recognition unit 456. Details of the sensing unit 430 will be described later.
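The fan-out of image data and measurement data from the sensing unit 430 to several consuming units can be pictured as a simple publish/subscribe arrangement. The following is a minimal sketch under that assumption; the names `SensorOutput` and `SensingUnit` are illustrative and not defined by the embodiment.

```python
from dataclasses import dataclass, field
from typing import Callable, List, Optional

@dataclass
class SensorOutput:
    """One batch of sensing-unit output: captured image data and measured quantities."""
    image_data: Optional[bytes] = None
    measurement_data: dict = field(default_factory=dict)

class SensingUnit:
    """Delivers each output to every registered consumer (driving assistance,
    self-position estimation, recognition, and so on)."""

    def __init__(self) -> None:
        self._consumers: List[Callable[[SensorOutput], None]] = []

    def register(self, consumer: Callable[[SensorOutput], None]) -> None:
        self._consumers.append(consumer)

    def publish(self, output: SensorOutput) -> None:
        # Every registered unit receives the same output batch.
        for consumer in self._consumers:
            consumer(output)
```

A consuming unit simply registers a callback and receives each published batch.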
In the present embodiment, the communication unit 440 transmits and receives information to and from an information processing device outside the vehicle 20 via a communication network. The communication unit 440 transmits and receives information between the vehicle 20 and an information providing server (not shown) that provides road traffic information, for example.
The communication network may include a wireless packet communication network, the Internet, a P2P network, a private line, a VPN, a wired communication line, and the like. The communication network may include a mobile communication network such as a mobile phone network. The communication network may include a wireless MAN (e.g., WiMAX (registered trademark)), a wireless LAN (e.g., WiFi (registered trademark)), Bluetooth (registered trademark), Zigbee (registered trademark), NFC (Near Field Communication), or the like. The communication network may include a V2X communication line for vehicle-to-vehicle communication, road-to-vehicle communication, and the like.
In the present embodiment, the vehicle control unit 450 controls the operation of the vehicle 20. The vehicle control unit 450 may control the movement of the vehicle 20. The vehicle control unit 450 may control the presentation of the assist information by the vehicle 20.
In the present embodiment, the self-position estimating unit 452 estimates the current position of the vehicle 20. The self-position estimating unit 452 can estimate the current position of the vehicle 20 based on various measurement data output from the sensing unit 430. The self-position estimating unit 452 outputs information indicating an estimation result (sometimes referred to as an estimated position) of the current position of the vehicle 20, for example, to the road traffic information acquiring unit 454.
In the present embodiment, the road traffic information acquisition unit 454 accesses an external information processing device via the communication unit 440 to acquire various information about the road 10. The road traffic information acquisition unit 454 may acquire information indicating the estimated position of the vehicle 20 from the self-position estimating unit 452, and acquire information on the road 10 located in the vicinity of the estimated position of the vehicle 20. The road traffic information acquisition unit 454 outputs the above-described information to the driving assistance unit 110, for example.
Examples of the information on the road 10 include information on the structure of the road 10 (sometimes referred to as structure information), information on the traffic regulations applicable to the road 10 (sometimes referred to as regulation information), and information on the traffic state of the road 10 (sometimes referred to as traffic information). The structure information may be information indicating the shape of the road, the number of lanes, the width and gradient of each lane, the position of a traffic signal or an obstacle, signal control, and the like. The regulation information may be information indicating rules or restrictions regarding passing, rules or restrictions regarding speed, and the like. The traffic information may be information indicating the traffic volume, whether or not traffic is congested, the degree of congestion, the presence or absence of an accident or disaster, and the presence or absence or contents of traffic regulation.
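The three categories of road information described above can be represented as plain records. The field names below are illustrative assumptions chosen for this sketch, not fields defined in the embodiment.

```python
from dataclasses import dataclass

@dataclass
class StructureInfo:
    """Structure of the road: geometry and fixed objects."""
    road_shape: str       # e.g. "straight", "curved"
    num_lanes: int
    lane_width_m: float
    gradient_pct: float

@dataclass
class RegulationInfo:
    """Traffic regulations applicable to the road."""
    passing_allowed: bool
    speed_limit_kmh: float

@dataclass
class TrafficInfo:
    """Current traffic state of the road."""
    congested: bool
    accident_present: bool
```

A consumer such as the assist control unit could then read, for example, `RegulationInfo.passing_allowed` when judging whether an overtake is permitted.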
In the present embodiment, the recognition unit 456 recognizes the current state of the vehicle 20. The recognition unit 456 may analyze various data output from the sensing unit 430 to recognize the current state of the vehicle 20. The recognition unit 456 may recognize the current state of the vehicle 20 using an analysis program stored in the storage unit 460. The recognition unit 456 may also access, via the communication unit 440, an external information processing device that executes an analysis program, and recognize the current state of the vehicle 20 using the analysis result from that external information processing device.
The current state of the vehicle 20 includes a state of an external environment of the vehicle 20, a state of an internal environment of the vehicle 20, and the like. The state of the external environment includes, for example, the state of the road 10, the state of the surrounding vehicle including the preceding vehicle 40, and the state of the surrounding environment of the vehicle 20. The state of the internal environment may be, for example, the state of the driver 22. Details of recognition unit 456 will be described later.
In the present embodiment, the storage unit 460 stores various information. For example, the storage unit 460 stores various information used for information processing in the vehicle control unit 450. The storage unit 460 may store information indicating various set values, initial values, threshold values, and the like used in the information processing. The storage unit 460 may store information (sometimes referred to as own-vehicle information) related to the specification or setting of the vehicle 20. The own vehicle information includes the vehicle width of the vehicle 20, the running characteristics of the vehicle 20, the driving experience or driving skill of the driver 22, and the like. The running characteristic includes acceleration performance, deceleration performance, an upper limit value set in relation to speed or acceleration, steering performance, and the like. The steering performance includes a steering wheel operable region, a steering wheel operation upper limit value that does not cause an unstable traveling state to the vehicle, and the like. The storage unit 460 may store various information generated by information processing in the vehicle control unit 450.
The sensor unit 430 may be an example of a road information acquisition unit or a vehicle information acquisition unit. The communication unit 440 may be an example of a road information acquisition unit. The vehicle control unit 450 may be an example of an information processing device, a road information acquisition unit, or a vehicle information acquisition unit. The road traffic information acquisition unit 454 may be an example of a road information acquisition unit. The recognition unit 456 may be an example of a road information acquisition unit or a vehicle information acquisition unit.
Fig. 5 schematically shows an example of the internal configuration of the sensor unit 430. In the present embodiment, the sensor unit 430 includes an internal sensor 520 and an external sensor 540. In the present embodiment, the built-in sensor 520 includes a speed sensor 522 and an inertial sensor 524. In the present embodiment, the external sensor 540 includes a GPS sensor 542, a camera 544, a distance measurement sensor 546, and an environment sensor 548.
The built-in sensor 520 acquires information related to the state of the vehicle 20. The speed sensor 522 measures the speed of the vehicle 20. The principle of measuring the velocity is not particularly limited. The inertial sensor 524 measures acceleration, rotational angular velocity, translational motion, and the like of the vehicle 20. The inertial sensor 524 may determine the orientation of the vehicle 20. The inertial sensor 524 may include at least one of an acceleration sensor, a gyro sensor or an angular velocity sensor, and a geomagnetic field sensor.
The external sensor 540 acquires information about the state of the outside of the vehicle 20. The GPS sensor 542 receives GPS signals. The GPS sensor 542 may output information indicating the current position of the vehicle 20 shown by the GPS signal. The camera 544 captures an image of at least one of the surroundings of the vehicle 20 and the driver 22, and outputs data of the captured image. The camera 544 may have an off-board camera (not shown) that photographs the exterior of the vehicle 20. The camera 544 may have an in-vehicle camera (not shown) that photographs the interior of the vehicle 20.
The distance measuring sensor 546 measures the distance between the vehicle 20 and objects around the vehicle 20. The distance measuring sensor 546 may output data of a distance image, or may output measurement data indicating the position, distance, direction, and the like of an object around the vehicle 20. The distance measurement principle, the scanning method, and the scanning range of the distance measuring sensor 546 are not particularly limited. The distance measuring sensor 546 may be a millimeter-wave radar, a LIDAR, an ultrasonic radar, or the like. Examples of the object around the vehicle 20 include the road 10, an object (for example, the traffic signal 352) located on the road 10, the preceding vehicle 40, an oncoming vehicle, a following vehicle, and a pedestrian.
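One common use of such distance measurement data is to pick out the closest detected object. A minimal sketch, under the assumption that each measurement is a (distance, bearing) pair; the function name is hypothetical.

```python
from typing import List, Optional, Tuple

def nearest_object(range_data: List[Tuple[float, float]]) -> Optional[Tuple[float, float]]:
    """Return the (distance_m, bearing_deg) measurement with the smallest
    distance, or None when no object was detected."""
    if not range_data:
        return None
    return min(range_data, key=lambda measurement: measurement[0])
```

The result could, for instance, feed the possibility determination for an overtake (is the gap ahead large enough?).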
The environmental sensor 548 measures, for example, the temperature of objects around the vehicle 20. The environment sensor 548 measures, for example, the temperature of the air (sometimes referred to as outside air) around the vehicle 20. The environmental sensor 548 may measure the surface temperature of the roadway 10. Other examples of the object around the vehicle 20 include a vehicle, a vehicle occupant, and a pedestrian. The vehicle may be a preceding vehicle 40, an oncoming vehicle, or a following vehicle.
The environment sensor 548 may measure the humidity of the outside air of the vehicle 20, or may measure the illuminance around the vehicle 20. The environmental sensor 548 may be a sensor of a non-contact type or a sensor of a contact type.
The external sensor 540 may be an example of a road information acquisition unit or a vehicle information acquisition unit. The camera 544 may be an example of a road information acquisition section or a vehicle information acquisition section. The distance measuring sensor 546 may be an example of a road information acquisition unit or a vehicle information acquisition unit. The environment sensor 548 may be an example of a road information acquisition unit or a vehicle information acquisition unit. The illuminance may be an example of the brightness.
Fig. 6 schematically shows an example of the internal configuration of the recognition unit 456. In the present embodiment, the recognition unit 456 includes a road state recognition unit 620, a nearby vehicle recognition unit 630, an environmental information recognition unit 640, and a user state recognition unit 650.
In the present embodiment, the road state recognition unit 620 recognizes the state of the road 10. The road state recognition unit 620 recognizes the state of the road 10 by analyzing, for example, at least one of the image data of the road 10 output from the sensor unit 430 and the distance measurement data of the road 10 output from the sensor unit 430. The road state recognition unit 620 may also recognize the state of the road 10 using various information acquired by the road traffic information acquisition unit 454 from an external information processing device.
The road state recognition part 620 may recognize the state of the road 10 using the state of the environment around the vehicle 20 recognized by the environment information recognition part 640. For example, when the road state recognition unit 620 analyzes the image data of the road 10, the road state recognition unit 620 determines whether or not the road surface is frozen, taking into account the outside air temperature recognized by the environment information recognition unit 640.
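The freezing judgment described above could be a simple heuristic combining the image analysis result with the recognized outside air temperature. A hypothetical sketch only; the 0 °C threshold, the wet-surface input, and the function name are assumptions, not the embodiment's method.

```python
from typing import Optional

def road_surface_may_be_frozen(outside_temp_c: float,
                               surface_looks_wet: bool,
                               surface_temp_c: Optional[float] = None) -> bool:
    """Flag possible freezing when the road surface (or, if no surface
    temperature is available, the outside air) is at or below 0 degrees C
    and the image analysis judged the surface to be wet."""
    temp = surface_temp_c if surface_temp_c is not None else outside_temp_c
    return surface_looks_wet and temp <= 0.0
```

A surface-temperature reading from the environment sensor 548, when available, takes precedence over the outside air temperature.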
The state of the road 10 includes at least one of (i) the shape and width of the road 10, or the relative position between the end of the road 10 and the vehicle 20, (ii) the state of the road surface of the road 10, and (iii) the appearance shape, size, and position of a structure or obstacle located on or near the road 10. The position of the structure or obstacle described above may also be the relative position of the structure or obstacle to the vehicle 20.
In the present embodiment, the nearby vehicle recognition unit 630 recognizes the state of the nearby vehicle. The nearby vehicle recognition unit 630 analyzes at least one of the image data of the road 10 output by the sensor unit 430 and the distance measurement data of the road 10 output by the sensor unit 430, for example, and recognizes the state of the nearby vehicle.
The state of the nearby vehicles includes (i) the shape and size of each of the one or more nearby vehicles, (ii) the relative position of each of the one or more nearby vehicles with respect to the vehicle 20, and (iii) the relative speed of each of the one or more nearby vehicles with respect to the vehicle 20. The nearby vehicle may be the preceding vehicle 40, a following vehicle, an oncoming vehicle, or the like.
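Given positions and velocities of the vehicle 20 and a nearby vehicle in a common coordinate frame, items (ii) and (iii) reduce to vector differences. A minimal sketch under that assumption:

```python
from typing import Tuple

Vec2 = Tuple[float, float]

def relative_state(ego_pos: Vec2, ego_vel: Vec2,
                   other_pos: Vec2, other_vel: Vec2) -> Tuple[Vec2, Vec2]:
    """Relative position and relative velocity of a nearby vehicle with
    respect to the ego vehicle, as (x, y) differences in a shared frame."""
    rel_pos = (other_pos[0] - ego_pos[0], other_pos[1] - ego_pos[1])
    rel_vel = (other_vel[0] - ego_vel[0], other_vel[1] - ego_vel[1])
    return rel_pos, rel_vel
```

A negative relative longitudinal velocity, for example, indicates the ego vehicle is closing on the vehicle ahead.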
In the present embodiment, the environment information recognition unit 640 recognizes the state of the environment around the vehicle 20. The environment information recognition unit 640 recognizes the state of the environment around the vehicle 20 based on, for example, at least one of the image data of the road 10 output by the sensor unit 430, the measurement data of at least one of the temperature and the humidity of the outside air output by the sensor unit 430, the measurement data of the illuminance output by the sensor unit 430, and the information on the weather around the vehicle 20 acquired by the road traffic information acquisition unit 454. The state of the surrounding environment may be, for example, the weather, air temperature, humidity, and illuminance around the vehicle 20.
In the present embodiment, the user state recognition unit 650 recognizes the state of the user of the vehicle 20. For example, the user state recognition portion 650 recognizes the state of the driver 22. The user state recognition unit 650 may analyze the image data of the driver 22 captured by the camera 544 to recognize the state of the driver 22.
The algorithm for recognizing the state of the driver 22 is not particularly limited. In one embodiment, the user state recognition unit 650 recognizes the state of the driver 22 using a trained machine learning model. In another embodiment, the user state recognition unit 650 recognizes the state of the driver 22 based on the detection of a specific pattern corresponding to a specific state.
The state of the driver 22 may be a state in which the possibility of executing a specific operation is high, a state in which that possibility is low, or the like. For example, the user state recognition unit 650 recognizes that the driver 22 is likely to perform the passing operation, taking into account at least one of factors including (i) at least one of the operation, gesture, and line of sight of the driver 22, (ii) at least one of the relative position and the relative speed between the vehicle 20 and the preceding vehicle 40, (iii) the speed of the vehicle 20, and (iv) at least one of the speed of the preceding vehicle 40 and the driving pattern of the preceding vehicle 40.
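The pattern-detection variant described above can be sketched as a rule combining a vehicle-motion cue with a driver cue (the embodiment also allows a trained machine learning model instead). All thresholds and parameter names below are illustrative assumptions.

```python
def passing_likely(gap_m: float, closing_speed_mps: float,
                   indicator_on: bool, gaze_on_adjacent_lane: bool,
                   gap_threshold_m: float = 30.0,
                   closing_threshold_mps: float = 1.0) -> bool:
    """Judge a pass likely when the vehicle is closing on a nearby preceding
    vehicle AND the driver's operation or line of sight points toward the
    adjacent lane. Thresholds are illustrative, not from the embodiment."""
    closing_in = gap_m < gap_threshold_m and closing_speed_mps > closing_threshold_mps
    driver_cues = indicator_on or gaze_on_adjacent_lane
    return closing_in and driver_cues
```

Requiring both a motion cue and a driver cue reduces false detections from, e.g., ordinary closing in stop-and-go traffic.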
The road state identifying unit 620 may be an example of a road information acquiring unit. The nearby vehicle recognition unit 630 may be an example of a vehicle information acquisition unit. The environmental information recognition unit 640 may be an example of a road information acquisition unit.
Fig. 7 schematically shows an example of the internal configuration of the head-up display 112. In the present embodiment, the head-up display 112 includes an assist information acquisition unit 720, an image processing unit 730, and an image projection unit 740.
In the present embodiment, the assist information acquisition unit 720 acquires various kinds of assist information output by the assist control unit 114. The assist information may be in the form of image data or of data used to generate image data.
In the present embodiment, the image processing unit 730 generates an image to be presented to the driver 22 based on the assist information acquired by the assist information acquisition unit 720. The image processing unit 730 may superimpose the above-described assist information on the object image of the real space to generate the image to be presented to the driver 22. For example, the image processing unit 730 superimposes the recommended route 38 included in the assist information 230 on the object image of the real space to generate the image to be presented to the driver 22.
In the present embodiment, the image projecting unit 740 projects the image generated by the image processing unit 730 toward the front screen 120. The image projecting unit 740 may adjust the projection position of the image so that the image generated by the image processing unit 730 is superimposed on the object image in the real space and presented to the driver 22.
The image processing unit 730 may be an example of an image generating unit. The image projecting unit 740 may be an example of an information presentation device.
Fig. 8 schematically shows an example of the internal configuration of the assist control unit 114. In the present embodiment, the assist control unit 114 includes a passing event detection unit 820, an output necessity determination unit 830, a passing determination unit 840, an output content determination unit 850, and an output data generation unit 860. In the present embodiment, the passing determination unit 840 includes a passing area determination unit 842, a threshold setting unit 844, a possibility determination unit 846, and a difficulty determination unit 848.
In the present embodiment, the passing event detection unit 820 detects an event related to a passing operation (sometimes referred to as a passing event). Thus, the passing event detection unit 820 can detect that the driver 22 intends to pass the preceding vehicle 40 by operating the vehicle 20. As described above, the overtaking operation may be an example of an operation for changing the traveling direction of the vehicle 20.
Examples of the passing event, given that the relative distance between the vehicle 20 and the preceding vehicle 40 is smaller than a preset value, include (i) the operation input unit 420 having received a specific operation, (ii) a change in at least one of the relative position and the relative speed between the vehicle 20 and the preceding vehicle 40 satisfying a specific condition, (iii) the speed of the preceding vehicle 40 being smaller than a preset value, and (iv) the user state recognition unit 650 having determined that the driver 22 is in a state in which the passing operation is highly likely to be performed. The passing event may be any event related to the intention of the driver 22 to pass, and its content is not limited to the above examples.
Specific examples of the operation include (i) an operation of a direction indicator for changing the course of the vehicle 20 toward the direction specified as the passing direction by the traffic regulations applied to the vehicle 20, (ii) an operation of the steering wheel for changing the course of the vehicle 20 toward that direction, and (iii) an operation of an accelerator pedal, a shift switch, or a shift lever for increasing the amount of acceleration. The above-described operations may be input to the operation input unit 420 by voice, line of sight, or gesture of the driver 22.
Specific examples of the specific conditions include (i) a condition that the relative distance between the vehicle 20 and the preceding vehicle 40 is smaller than a predetermined value when the speed of the vehicle 20 is larger than the predetermined value, (ii) a condition that at least one of the fluctuation range and the frequency of the relative position between the vehicle 20 and the preceding vehicle 40 is larger than the predetermined value, and (iii) a condition that at least one of the fluctuation range and the frequency of the relative speed between the vehicle 20 and the preceding vehicle 40 is larger than the predetermined value. The specific condition is not limited to the above example as long as it is a condition indicating the movement pattern of the vehicle 20 or the preceding vehicle 40 before the overtaking operation.
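The event check described above can be sketched as a single predicate over the example triggers (i) to (iv), gated on the preset relative-distance condition. The function name, argument layout, and default threshold values are assumptions for this example, not identifiers from the present embodiment.

```python
# Illustrative sketch of passing-event detection. The distance and speed
# thresholds are assumed example values.

def passing_event_detected(relative_distance_m, specific_operation_received,
                           relative_motion_condition_met, preceding_speed_kmh,
                           driver_likely_to_pass,
                           distance_threshold_m=30.0,
                           slow_speed_threshold_kmh=40.0):
    """Evaluate triggers (i)-(iv) while the relative distance to the
    preceding vehicle is below the preset value."""
    if relative_distance_m >= distance_threshold_m:
        return False
    return (specific_operation_received                        # (i)
            or relative_motion_condition_met                   # (ii)
            or preceding_speed_kmh < slow_speed_threshold_kmh  # (iii)
            or driver_likely_to_pass)                          # (iv)
```

On detection, the result would be forwarded to the output necessity determination unit 830, as described below.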
When the passing event is detected, the passing event detecting unit 820 may output information indicating that the passing event is detected to each unit of the assist control unit 114. For example, the passing event detection unit 820 outputs information indicating that the passing event is detected to the output necessity determination unit 830.
In the present embodiment, the output necessity determining unit 830 determines whether or not the output of the auxiliary information is necessary. When it is determined that the assist information needs to be output, the output necessity determining unit 830 may output information indicating that the assist information is output to the passing determination unit 840. When it is determined that the output of the assist information is not necessary, the output necessity determining unit 830 may output information indicating that the assist information is not output to the passing determination unit 840.
In one embodiment, when the passing event detection unit 820 detects a passing event, the output necessity determination unit 830 determines that the assist information needs to be output. In another embodiment, the output necessity determination unit 830 determines that the assist information needs to be output when the passing operation is to be aborted after the passing event detection unit 820 has detected the passing event or after the vehicle 20 has started the passing operation.
In the present embodiment, the overtaking determination unit 840 performs various determinations regarding overtaking of the preceding vehicle 40. In one embodiment, the passing determination unit 840 determines whether or not passing is possible. In another embodiment, the passing determination unit 840 determines the difficulty of the passing operation.
In the present embodiment, the passing area specifying unit 842 specifies an area (sometimes referred to as a usable area) that the vehicle 20 can use to pass the preceding vehicle 40. The passing area determination unit 842 may determine the available area in an area inside a predetermined range (sometimes referred to as a scanning range) in front of the vehicle 20. The shape and size of the scanning range are not particularly limited. The above-mentioned size includes the length of the scanning range along the extending direction of the road 10, the length of the scanning range along the width direction of the road 10, and the like.
In addition, the extending direction of the road 10 may be a direction substantially parallel to the advancing direction of the vehicle 20 or the preceding vehicle 40. The width direction of the road 10 may be a direction substantially parallel to the vehicle width direction of the vehicle 20 or the preceding vehicle 40.
The passing area determination unit 842 may determine the available area based on at least one of (a) information indicating the relative position of the vehicle 20 and the preceding vehicle 40 that is the subject of the passing operation, recognized by the nearby vehicle recognition unit 630 of the recognition unit 456, and (b) (i) information on the road 10 acquired by the road traffic information acquisition unit 454 and (ii) information indicating the state of the road 10 recognized by the road state recognition unit 620 of the recognition unit 456. As the available area, the passing area determination unit 842 may calculate the distance (sometimes referred to as the available road width) from the preceding vehicle 40 to an end of the lane of the road 10 at the current position of the preceding vehicle 40. The end of the lane described above may be the end on the side of the passing direction specified by the traffic regulations applicable to the vehicle 20.
For example, the passing area specifying unit 842 acquires the information on the road 10 and the preceding vehicle 40 from the road traffic information acquiring unit 454 and the identifying unit 456. The passing area specifying unit 842 specifies the width of the road 10 on which the preceding vehicle 40 travels and the position of the preceding vehicle 40 on the road 10, based on the above-described (a) and (b). The passing area specifying unit 842 calculates the road width that the vehicle 20 can use to pass the preceding vehicle 40, based on the width of the road 10 and the position of the preceding vehicle 40 on the road 10.
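The available-road-width calculation just described can be sketched as follows. The convention that the preceding vehicle's lateral offset is measured from the non-passing-side road edge, the function name, and the argument names are assumptions made for this example.

```python
# Minimal sketch of the available-road-width calculation: road width minus
# the lateral extent occupied by the preceding vehicle, floored at zero.

def available_road_width_m(road_width_m, lateral_offset_m, preceding_width_m):
    """Distance from the passing-side edge of the preceding vehicle to the
    passing-side end of the lane at the preceding vehicle's position.

    lateral_offset_m: assumed distance from the non-passing-side road edge
    to the near side of the preceding vehicle.
    """
    occupied_m = lateral_offset_m + preceding_width_m
    return max(road_width_m - occupied_m, 0.0)
```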
The passing area determination unit 842 may acquire at least one of (c) information indicating the relative speed between the vehicle 20 and the preceding vehicle 40 that is the subject of the passing operation, recognized by the nearby vehicle recognition unit 630 of the recognition unit 456, (d) information indicating at least one of the relative position and the relative speed between the vehicle 20 and a nearby vehicle other than the preceding vehicle 40 that is the subject of the passing operation, recognized by the nearby vehicle recognition unit 630 of the recognition unit 456, (e) the current speed of the vehicle 20 measured by the speed sensor 522, (f) the own-vehicle information of the vehicle 20 stored in the storage unit 460, and (g) information on the traffic regulations applicable to the vehicle 20. The passing area determination unit 842 may determine the available area based on the acquired information. The passing area determination unit 842 may determine the available area based on at least one of (a) and (b) described above and at least one of (c) to (g) described above.
For example, the passing area determination unit 842 estimates the distance traveled by the vehicle 20 while the vehicle 20 is moving in front of the preceding vehicle 40, based on (i) the relative speed between the vehicle 20 and the preceding vehicle 40, (ii) the current speed of the vehicle 20, (iii) the speed limit (sometimes referred to as the maximum speed) of the road 10 on which the vehicle 20 is traveling, or the upper limit value of the set speed of the vehicle 20. The passing area determination unit 842 may determine the length of the usable area in the extending direction of the road 10 based on the current position of the vehicle 20 and the estimated distance. The passing area determination unit 842 may calculate the minimum value of the available road width in consideration of the positions of the structures or obstacles located in the available area.
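The distance estimate above can be sketched kinematically: the time needed to close the longitudinal gap (plus a clearance margin) at the relative speed, multiplied by the own vehicle's speed. The function name, the argument names, and the default clearance value are assumptions for this example.

```python
# Kinematic sketch of the distance traveled while moving in front of the
# preceding vehicle. The 10 m clearance default is an assumed example value.

def estimated_passing_distance_m(own_speed_kmh, preceding_speed_kmh,
                                 longitudinal_gap_m, clearance_m=10.0):
    relative_mps = (own_speed_kmh - preceding_speed_kmh) / 3.6
    if relative_mps <= 0.0:
        return float("inf")  # the vehicle never moves in front
    time_s = (longitudinal_gap_m + clearance_m) / relative_mps
    return (own_speed_kmh / 3.6) * time_s
```

The length of the usable area along the road could then be taken as this distance measured from the current position of the vehicle 20, with the own speed capped beforehand at the speed limit of the road 10 or the set-speed upper limit, as stated above.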
In the present embodiment, the threshold setting unit 844 sets various thresholds to be used in the determination processing of at least one of the availability determination unit 846 and the difficulty determination unit 848. For example, the threshold setting unit 844 accesses the storage unit 460 and acquires initial values set in advance as the values of the respective thresholds. The threshold setting unit 844 may determine the value of each threshold based on at least one of the speed of the vehicle 20, the vehicle width of the vehicle 20, the traffic regulations applied to the vehicle 20, the structure of the road 10, the road surface state of the road 10, the length of the usable area in the extending direction of the road 10, the driving experience of the driver 22, the weather around the vehicle 20, the brightness around the vehicle 20, and the time of day.
In one embodiment, the availability determination unit 846 determines whether or not the overtaking operation is possible based on the length of the usable road width, as described later. For example, when the length of the usable road width or the minimum value thereof is larger than a preset value (sometimes referred to as an availability threshold), the availability determination unit 846 determines that the overtaking operation is possible. When the length of the usable road width or the minimum value thereof is smaller than the availability threshold, the availability determination unit 846 may determine that the overtaking operation is not possible.
In this case, the threshold setting unit 844 may determine the value of the availability threshold. For example, the threshold setting unit 844 determines the value of the availability threshold so that it is larger than the vehicle width of the vehicle 20. The threshold setting unit 844 may set, as the availability threshold, a value sufficiently larger than the vehicle width of the vehicle 20.
In another embodiment, the difficulty level determining unit 848 determines whether the overtaking operation is difficult based on the length of the usable road width as described below. For example, when the length of the usable road width or the minimum value thereof is smaller than a preset value (which may be referred to as a difficulty threshold), the difficulty determination unit 848 determines that the overtaking operation is difficult. When the length of the usable road width or the minimum value thereof is larger than the difficulty threshold value, the difficulty level determination unit 848 may determine that the overtaking operation is easy.
In this case, the threshold setting unit 844 may determine the difficulty threshold value. For example, the threshold setting unit 844 determines the value of the difficulty threshold so that the difficulty threshold is larger than the vehicle width of the vehicle 20. The threshold setting unit 844 determines the value of the difficulty threshold so that the difficulty threshold is smaller than the availability threshold.
The threshold setting unit 844 acquires, for example, information indicating the speed of the vehicle 20 measured by the speed sensor 522. The threshold setting unit 844 may determine the value of the difficulty threshold based on the speed of the vehicle 20. For example, the threshold setting unit 844 determines the difficulty threshold so that the difficulty threshold used when the speed of the vehicle 20 is greater than a predetermined threshold (sometimes referred to as a speed threshold) is greater than the difficulty threshold used when the speed of the vehicle 20 is smaller than the speed threshold.
More specifically, when the speed of the vehicle 20 is 40 km/h, the threshold setting unit 844 determines the difficulty threshold by the formula: difficulty threshold = vehicle width of the vehicle 20 + a, where a is a positive number. On the other hand, when the speed of the vehicle 20 is 10 km/h, the threshold setting unit 844 determines the difficulty threshold by the formula: difficulty threshold = vehicle width of the vehicle 20 + b, where b is a positive number smaller than a. Thus, for example, the driver 22 can overtake the preceding vehicle 40 without giving an uncomfortable feeling to the occupant of the preceding vehicle 40, regardless of the speed of the vehicle 20.
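The speed-dependent difficulty threshold can be sketched directly from the two formulas above. The speed threshold and the concrete margins a and b below are assumed example values; the text above only requires that both margins be positive and that b be smaller than a.

```python
# Sketch of the speed-dependent difficulty threshold. speed_threshold_kmh,
# a_m, and b_m are assumed example values (the embodiment fixes none of them).

def speed_dependent_difficulty_threshold_m(vehicle_width_m, speed_kmh,
                                           speed_threshold_kmh=30.0,
                                           a_m=1.5, b_m=0.5):
    """difficulty threshold = vehicle width + a (fast) or + b (slow), b < a."""
    margin_m = a_m if speed_kmh > speed_threshold_kmh else b_m
    return vehicle_width_m + margin_m
```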
In the present embodiment, the availability determination unit 846 determines whether or not the vehicle 20 can perform an operation of moving from the rear or the side of the preceding vehicle 40 toward the front of the preceding vehicle 40 (this operation is sometimes referred to as a passing operation). For example, when the output necessity determining unit 830 determines that the auxiliary information needs to be output, the possibility determining unit 846 executes the above-described determination process.
As described above, the possibility determination unit 846 may determine that the passing operation is possible when it is appropriate or possible to move the vehicle 20 from the rear or the side of the front vehicle 40 toward the front of the front vehicle 40. On the other hand, when it is not appropriate or possible to move the vehicle 20 from the rear or side of the preceding vehicle 40 toward the front of the preceding vehicle 40, the possibility determination unit 846 may determine that the overtaking operation is not possible.
The availability determination unit 846 may determine that the overtaking operation is possible when the available area has a predetermined characteristic. For example, when the length of the usable road width or the minimum value thereof is larger than the availability threshold value, the availability determination unit 846 determines that the overtaking operation is possible. The availability determination unit 846 may determine that the overtaking operation is not possible when the length of the usable road width or the minimum value thereof is smaller than the availability threshold value.
As described above, the availability threshold may be determined based on at least one of the following considerations: the speed of the vehicle 20, the vehicle width of the vehicle 20, the traffic regulations applicable to the vehicle 20, the structure of the road 10, the road surface state of the road 10, the length of the usable area in the extending direction of the road 10, the driving experience of the driver 22, the weather around the vehicle 20, the brightness around the vehicle 20, and the time of day. Therefore, the availability determination unit 846 can determine whether or not the overtaking operation is possible based on these considerations. For example, the availability determination unit 846 may determine whether or not the overtaking operation is possible based on the vehicle width of the vehicle 20.
In the present embodiment, the difficulty level determining unit 848 determines whether or not an operation for moving the vehicle 20 from the rear or side of the front vehicle 40 to the front of the front vehicle 40 (this operation is sometimes referred to as a passing operation) is difficult. For example, when the output necessity determining unit 830 determines that the auxiliary information needs to be output, the difficulty level determining unit 848 performs the above-described determination process.
As described above, when the overtaking operation includes an operation with a high difficulty level, the difficulty level determination unit 848 may determine that the overtaking operation is difficult. On the other hand, when the overtaking operation does not include an operation with a high difficulty level, the difficulty level determination unit 848 may determine that the overtaking operation is easy. The operation with high difficulty may be a predetermined kind of operation.
The difficulty determination unit 848 may determine that the passing operation is difficult when the available area has a predetermined characteristic. For example, the difficulty level determination unit 848 determines that the passing operation is difficult when the length of the available road width or the minimum value thereof is smaller than the difficulty level threshold. The difficulty level determination unit 848 may determine that the passing operation is easy when the length of the available road width or the minimum value thereof is greater than the difficulty level threshold.
As described above, the difficulty threshold may be determined based on at least one of the following considerations: the speed of the vehicle 20, the vehicle width of the vehicle 20, the traffic regulations applicable to the vehicle 20, the structure of the road 10, the road surface state of the road 10, the length of the usable area in the extending direction of the road 10, the driving experience of the driver 22, the weather around the vehicle 20, the brightness around the vehicle 20, and the time of day. Therefore, the difficulty determination unit 848 can determine whether or not the overtaking operation is difficult based on these considerations. For example, the difficulty determination unit 848 may determine whether the overtaking operation is difficult based on the vehicle width of the vehicle 20.
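Both determinations described above reduce to comparisons of the (minimum) usable road width against a threshold supplied by the threshold setting unit 844. The function names below are assumptions made for this sketch.

```python
# Sketch of the two threshold comparisons performed by the availability
# determination unit 846 and the difficulty determination unit 848.

def passing_possible(min_usable_width_m, availability_threshold_m):
    """Possible when the usable road width exceeds the availability threshold."""
    return min_usable_width_m > availability_threshold_m

def passing_difficult(min_usable_width_m, difficulty_threshold_m):
    """Difficult when the usable road width is below the difficulty threshold."""
    return min_usable_width_m < difficulty_threshold_m
```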
In the present embodiment, the output content determination unit 850 determines the content of information to be output as auxiliary information. For example, when the output necessity determining unit 830 determines that the auxiliary information needs to be output, the output content determining unit 850 determines the content of the information output as the auxiliary information.
As the content of the information output as the assist information, at least one of the message 32, the steering operation amount 34, the accelerator operation amount 36, and the recommended route 38 is exemplified. When the message 32 is output, the output-content determining unit 850 may determine the content of the message 32. As the content of the message 32, there are exemplified (i) that it is appropriate or possible to cause the vehicle 20 to perform the action intended by the driver 22, (ii) that it is inappropriate or impossible to cause the vehicle 20 to perform the action intended by the driver 22, and the like.
As described above, the steering operation amount 34 includes a meter or an icon indicating the steering operation amount, the current steering operation amount, and the steering operation amount for moving the vehicle 20 along the recommended route 38. As described above, the accelerator operation amount 36 includes a meter or an icon indicating the accelerator operation amount, the current accelerator operation amount, and the accelerator operation amount for moving the vehicle 20 along the recommended route 38.
[ embodiment in which output content is determined according to the difficulty of driving operation ]
In one embodiment, the output content determining unit 850 determines the content of the information output as the auxiliary information so that (a) when the overtaking operation of the vehicle 20 is possible and the overtaking operation is difficult, and (b) when the overtaking operation of the vehicle 20 is possible and the overtaking operation is easy, the information output as the auxiliary information is different. The output content determination unit 850 may determine the content of the information to be output as the auxiliary information so that (a) when the overtaking operation of the vehicle 20 is possible and the overtaking operation is difficult, (b) when the overtaking operation of the vehicle 20 is possible and the overtaking operation is easy, and (c) when the overtaking operation of the vehicle 20 is not possible, the information to be output as the auxiliary information is different.
The output content determining unit 850 may determine that (a) the overtaking operation of the vehicle 20 is possible and the overtaking operation is difficult, when (i) the possibility determining unit 846 determines that the overtaking operation of the vehicle 20 is possible and (ii) the difficulty determining unit 848 determines that the overtaking operation is difficult. The output content determining unit 850 may determine that (b) the overtaking operation of the vehicle 20 is possible and the overtaking operation is easy, when (i) the possibility determining unit 846 determines that the overtaking operation of the vehicle 20 is possible and (ii) the difficulty determining unit 848 determines that the overtaking operation is not difficult. The output content determination unit 850 may determine that (c) the overtaking operation of the vehicle 20 is not possible, when the availability determination unit 846 determines that the overtaking operation of the vehicle 20 is not possible.
For example, the output content determination unit 850 determines to output the assist information including the steering operation amount 34 when (a) the overtaking operation of the vehicle 20 is possible and the overtaking operation is difficult. In this case, the output content determination unit 850 may determine to output the assist information including at least one of (i) the steering operation amount 34 and (ii) the message 32 and the accelerator operation amount 36. As the assist information, the output content determination unit 850 may determine to output at least one of (i) the steering operation amount 34 and (ii) the message 32 and the accelerator operation amount 36. When the message 32 is output, the output content determination unit 850 may decide to output the message 32 indicating that it is appropriate or possible to cause the vehicle 20 to perform the overtaking operation.
For example, the output content determination unit 850 determines to output the assist information including the recommended route 38 when (b) the overtaking operation of the vehicle 20 is possible and the overtaking operation is easy. In this case, the output content determination unit 850 may determine to output the assist information including at least one of (i) the recommended route 38 and (ii) the message 32, the steering operation amount 34, and the accelerator operation amount 36. As the assist information, the output content determination unit 850 may determine to output at least one of (i) the recommended route 38 and (ii) the message 32, the steering operation amount 34, and the accelerator operation amount 36. When the message 32 is output, the output content determination unit 850 may decide to output the message 32 indicating that it is appropriate or possible to cause the vehicle 20 to perform the overtaking operation.
For example, when (c) the overtaking operation of the vehicle 20 is not possible, the output content determination unit 850 determines to output the assist information including the message 32 indicating that it is not appropriate or possible to cause the vehicle 20 to perform the overtaking operation. As the assist information, the output content determination unit 850 may determine to output the message 32.
It may also be determined, based on the operation of the driver 22, that the overtaking operation of the vehicle 20 is not possible after the overtaking operation has started. In this case, the output content determination unit 850 determines to output the message 32 recommending that the overtaking operation be interrupted or aborted. The output content determination unit 850 may determine to output the assist information including at least one of (i) the message 32 and (ii) the steering operation amount 34 and the accelerator operation amount 36.
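The case analysis in this embodiment can be summarized as a mapping from the two determination results to the set of assist-information items to output. The item names and the concrete sets below are illustrative assumptions drawn from the examples given above, not an exhaustive statement of the embodiment.

```python
# Illustrative mapping of cases (a)-(c) to assist-information items.
# Item names and set contents are assumptions based on the examples above.

def assist_items(possible, difficult):
    if not possible:                                   # case (c)
        return {"message"}                             # 'overtaking not appropriate'
    if difficult:                                      # case (a)
        return {"steering_amount", "message", "accelerator_amount"}
    return {"recommended_route", "message",            # case (b)
            "steering_amount", "accelerator_amount"}
```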
[ embodiment in which the output content is determined according to the speed of the vehicle 20 ]
In another embodiment, the output content determination unit 850 determines the content of the information output as the auxiliary information based on the speed of the vehicle 20. The output content determination unit 850 may determine the content of the information output as the auxiliary information based on the speed of the vehicle 20, instead of or in addition to the embodiment in which the output content is determined based on the difficulty of the driving operation. The output-content determining unit 850 may determine the content of the information output as the auxiliary information based on the relative speed between the vehicle 20 and the preceding vehicle 40.
For example, the output content determination unit 850 determines to output the assist information including the accelerator operation amount 36 when (i) the speed of the vehicle 20 or (ii) the relative speed between the vehicle 20 and the preceding vehicle 40 is smaller than the 1st output threshold. The output content determination unit 850 may determine to output the assist information including the accelerator operation amount 36 when (i) the speed of the vehicle 20 or (ii) the relative speed between the vehicle 20 and the preceding vehicle 40 is smaller than the 1st output threshold and the width of the road 10 is smaller than the 1st width threshold.
In these cases, the output content determination unit 850 may determine to output the assist information including the message 32 and the accelerator operation amount 36. The 1st output threshold and the 1st width threshold may be preset values or values determined by an arbitrary algorithm.
The smaller the speed of the vehicle 20 or the smaller the relative speed between the vehicle 20 and the preceding vehicle 40, the longer the period from the start of the overtaking operation to the completion of the overtaking operation. As a result, for example, the difficulty of passing a vehicle increases, or the possibility of an emergency event occurring during the passing period increases. In particular, when the width of the road 10 is narrow, more careful driving is required. In this situation, the accelerator operation amount 36 is presented to the driver 22 according to the present embodiment. Therefore, the driving assistance unit 110 can prompt the driver 22 to increase the speed of the vehicle 20, and can assist the passing operation with higher safety and greater ease.
For example, the output content determination unit 850 may determine to output the assist information including the steering operation amount 34 and the recommended route 38 when (i) the speed of the vehicle 20 or (ii) the relative speed between the vehicle 20 and the preceding vehicle 40 is smaller than the 2nd output threshold. The output content determination unit 850 may determine to output the assist information including the steering operation amount 34 and the recommended route 38 when (i) the speed of the vehicle 20 or (ii) the relative speed between the vehicle 20 and the preceding vehicle 40 is smaller than the 2nd output threshold and the width of the road 10 is smaller than the 2nd width threshold.
In these cases, the output content determination unit 850 may determine to output the assist information including at least one of the message 32 and the accelerator operation amount 36, together with the steering operation amount 34 and the recommended route 38. The 2nd output threshold and the 2nd width threshold may be preset values or values determined by an arbitrary algorithm. The 2nd output threshold may be the same as or different from the 1st output threshold. The 2nd width threshold may be the same as or different from the 1st width threshold.
The smaller the speed of the vehicle 20 or the relative speed between the vehicle 20 and the preceding vehicle 40, the longer the period from the start to the completion of the overtaking operation. As a result, for example, the difficulty of the overtaking operation increases, or the possibility of an emergency event occurring during the overtaking period increases. In particular, when the width of the road 10 is narrow, more careful driving is required. Even in such a situation, according to the present embodiment, the steering operation amount 34 and the recommended route 38 are presented to the driver 22. Therefore, the driving assistance unit 110 can prompt the driver 22 to perform an appropriate steering operation of the vehicle 20, and can assist an easier and safer overtaking operation.
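The speed-based selection in this embodiment can be sketched as follows. All four threshold values are assumptions, and the "speed or relative speed is smaller than the threshold" test is written as a minimum for brevity; the item names reuse those of the earlier examples.

```python
# Sketch of the speed-based output-content selection. The default
# thresholds (30/50 km/h, 5/6 m) are assumed example values.

def speed_based_assist_items(own_speed_kmh, relative_speed_kmh, road_width_m,
                             output1_kmh=30.0, width1_m=5.0,
                             output2_kmh=50.0, width2_m=6.0):
    # 'speed or relative speed below threshold' is equivalent to testing
    # the smaller of the two against the threshold.
    lower_kmh = min(own_speed_kmh, relative_speed_kmh)
    items = set()
    if lower_kmh < output1_kmh and road_width_m < width1_m:
        items |= {"accelerator_amount", "message"}
    if lower_kmh < output2_kmh and road_width_m < width2_m:
        items |= {"steering_amount", "recommended_route"}
    return items
```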
[ Generation of auxiliary information ]
In the present embodiment, the output data generation unit 860 generates data (which may be referred to as output data) to be output from the assist control unit 114 to the head-up display 112. The output data generation unit 860 acquires information indicating the determination result of the output content determination unit 850, and generates output data based on the determination result.
The output data generation unit 860 may generate at least one of the message 32, the steering operation amount 34, the accelerator operation amount 36, and the recommended route 38 according to the determination result. In one embodiment, the output data generation unit 860 generates assist information including the message 32, the steering operation amount 34, the accelerator operation amount 36, and/or the recommended route 38, and outputs the generated assist information to the head-up display 112 as the output data. In another embodiment, the output data generation unit 860 generates information for causing the image processing unit 730 of the head-up display 112 to generate assist information including the message 32, the steering operation amount 34, the accelerator operation amount 36, and/or the recommended route 38, and outputs the generated information to the head-up display 112 as the output data.
The data format of the output data is not particularly limited. Examples of the data format of the output data include image data, audio data, text data, and binary data to be processed by a program of the image processing unit 730. The image data may be image data of an image that has already been subjected to the image processing for superimposition display, or image data of an image before the image processing for superimposition display is performed.
In some cases, text data of a fixed format or image data of a fixed-format image is set in advance as the message 32 and stored in the storage device of the head-up display 112. In this case, the binary data described above may contain identification information for identifying the fixed-format text data or the fixed-format image data.
In some cases, image data of icons representing the steering operation amount 34 and the accelerator operation amount 36 is set in advance and stored in the storage device of the head-up display 112. In this case, the binary data may include (i) identification information for identifying the icon of each operation amount, (ii) information indicating the current value of each operation amount, and (iii) information indicating the operation amount for moving the vehicle 20 along the recommended route 38.
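As a sketch only, one possible layout for such a binary record is shown below. The field names, the fixed-width encoding, and the format string are assumptions made for this illustration, not a format defined by the embodiment.

```python
# Hypothetical binary layout for one operation-amount record of the output
# data: an icon identifier, the current operation amount, and the operation
# amount required to follow the recommended route.  Assumed format only.
import struct
from dataclasses import dataclass

@dataclass
class OperationAmountRecord:
    icon_id: int           # identifies the pre-stored icon (steering/accelerator)
    current_amount: float   # current operation amount
    required_amount: float  # amount needed to move along the recommended route

    def to_bytes(self) -> bytes:
        # "<Hff": little-endian unsigned short plus two 32-bit floats (10 bytes)
        return struct.pack("<Hff", self.icon_id, self.current_amount,
                           self.required_amount)

    @classmethod
    def from_bytes(cls, data: bytes) -> "OperationAmountRecord":
        return cls(*struct.unpack("<Hff", data))
```

A record of this kind could be produced by the assist control unit 114 and decoded by the head-up display side; values exactly representable in 32-bit floats round-trip without loss.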
[ creation of respective data constituting auxiliary information ]
In the present embodiment, when the output content determination unit 850 determines to output the message 32, the output data generation unit 860 creates the message 32 having the content determined by the output content determination unit 850. Specifically, the output data generation unit 860 creates a message 32 indicating that it is appropriate or possible to cause the vehicle 20 to perform the action intended by the driver 22, or a message 32 indicating that it is inappropriate or impossible to cause the vehicle 20 to perform that action. The intention of the driver 22 is detected by, for example, the overtaking event detection unit 820.
In the present embodiment, when the output content determining unit 850 determines to output at least one of the steering operation amount 34, the accelerator operation amount 36, and the recommended route 38, the output data generating unit 860 first determines a route (i.e., the recommended route 38) for the vehicle 20 to overtake the preceding vehicle 40. The output data generation unit 860 determines the recommended route 38, for example, in consideration of (i) the relative position and relative speed of the vehicle 20 and the preceding vehicle 40, (ii) the relative position and relative speed of the vehicle 20 and the oncoming vehicle, (iii) the relative position and relative speed of the vehicle 20 and another preceding vehicle 40 traveling ahead of the preceding vehicle 40 that is the overtaking object, (iv) the relative position and relative speed of the vehicle 20 and the following vehicle, (v) the structure of the road 10, (vi) traffic regulations, and the like.
Next, for example, when the output content determining unit 850 determines to output the steering operation amount 34, the output data generating unit 860 creates the steering operation amount 34 that matches the current state of the vehicle 20. For example, first, the output data generation unit 860 acquires information indicating the current steering operation amount from the operation input unit 420. Next, the output data generation unit 860 detects a difference between the current position of the vehicle 20 and the recommended route 38. The output data generation unit 860 calculates the steering operation amount required to eliminate the difference. Thus, the output data generation unit 860 can determine the steering operation amount required to move the vehicle 20 along the recommended route 38.
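The calculation described above can be sketched as follows, purely for illustration: the lateral difference between the current position of the vehicle and the recommended route is converted into a required steering amount. The proportional-gain model and all names are assumptions; the embodiment does not specify the control law.

```python
# Minimal sketch of the steering-amount calculation: detect the difference
# between the current position and the recommended route, then compute the
# steering amount that eliminates it.  The proportional gain is an assumed
# stand-in for whatever control law the embodiment actually uses.

STEERING_GAIN = 0.1  # assumed gain [steering units per metre of lateral error]

def required_steering_amount(current_lateral_m, route_lateral_m,
                             current_steering):
    """Return (lateral difference, steering amount to present to the driver)."""
    difference = route_lateral_m - current_lateral_m
    correction = STEERING_GAIN * difference
    return difference, current_steering + correction
```

The same pattern applies to the accelerator operation amount 36 described below, with the accelerator amount substituted for the steering amount.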
When the output data generation unit 860 generates the image data of the steering operation amount 34, the output data generation unit 860 may generate the image data of the steering operation amount 34 by reflecting the current steering operation amount and the steering operation amount required to move the vehicle 20 along the recommended route 38 on the image of the icon indicating the steering operation amount. When the image processing unit 730 of the head-up display 112 generates the image of the steering operation amount 34, the output data generation unit 860 may not generate the image data of the steering operation amount 34.
Similarly, for example, when the output content determination unit 850 determines to output the accelerator operation amount 36, the output data generation unit 860 creates the accelerator operation amount 36 that matches the current state of the vehicle 20. For example, first, the output data generation unit 860 acquires information indicating the current accelerator operation amount from the operation input unit 420. Next, the output data generation unit 860 detects a difference between the current position of the vehicle 20 and the recommended route 38. The output data generation unit 860 calculates the accelerator operation amount required to eliminate the difference. Thus, the output data generation unit 860 can determine the accelerator operation amount required to move the vehicle 20 along the recommended route 38.
When the output data generation unit 860 generates the image data of the accelerator operation amount 36, the output data generation unit 860 may generate the image data of the accelerator operation amount 36 by reflecting the current accelerator operation amount and the accelerator operation amount required to move the vehicle 20 along the recommended route 38 on the image of the icon representing the accelerator operation amount. When the image processing unit 730 of the head-up display 112 generates the image of the accelerator operation amount 36, the output data generation unit 860 may not generate the image data of the accelerator operation amount 36.
Similarly, for example, when the output content determination unit 850 determines to output the recommended route 38, the output data generation unit 860 generates image data of the recommended route 38. The output data generation unit 860 may generate image data for superimposition display. When the image processing unit 730 of the head-up display 112 generates image data for superimposition display, the output data generation unit 860 may not generate image data for superimposition display.
The overtaking event detection unit 820 may be an example of an intention detection unit. The output necessity determination unit 830 may be an example of the 3 rd determination unit. The overtaking area specification unit 842 may be an example of the area determination unit. The threshold setting unit 844 may be an example of the speed information acquisition unit or the threshold determination unit. The availability determination unit 846 may be an example of the 1 st determination unit. The difficulty determination unit 848 may be an example of the 2 nd determination unit. The output content determination unit 850 may be an example of an information processing apparatus or a determination unit. The output data generation unit 860 may be an example of the 1 st operation amount determination unit, the 2 nd operation amount determination unit, the route determination unit, or the image generation unit.
The available road width may be an example of the area width. The difficulty threshold may be an example of the 1 st threshold. The speed threshold may be an example of the 2 nd threshold. The 1 st output threshold may be an example of the 3 rd threshold. The 2 nd output threshold may also be an example of the 3 rd threshold. The overtaking operation may be an example of an operation intended by the driver 22. The steering operation may be an example of the 1 st operation. The steering operation amount may be an example of the operation amount of the 1 st operation. The steering operation amount 34 may be an example of information indicating the operation amount of the 1 st operation. The accelerator operation may be an example of the 2 nd operation. The accelerator operation amount may be an example of the operation amount of the 2 nd operation. The accelerator operation amount 36 may be an example of information indicating the operation amount of the 2 nd operation. The recommended route 38 may be an example of information indicating the route determined by the route determination unit.
Fig. 9 schematically shows an example of information processing in the overtaking determination unit 840. The overtaking determination unit 840 receives the road traffic information from the road traffic information acquisition unit 454, the range data and the environment data of the periphery of the vehicle 20 from the recognition unit 456, and the own vehicle information of the vehicle 20 from the storage unit 460. The overtaking determination unit 840 outputs a determination result regarding the possibility of overtaking and a determination result regarding the difficulty of overtaking. The details of the information processing in the overtaking determination unit 840 are not limited to the information processing described with reference to fig. 8. The overtaking determination unit 840 may output the determination results by the information processing described with reference to fig. 8, or may output the determination results using a machine learning model.
An example of information processing in the overtaking area specification unit 842 will be described with reference to fig. 10 and 11. Specifically, an example of a method by which the overtaking area specification unit 842 determines the available road width will be described using the case where the vehicle 20 overtakes the preceding vehicle 40 traveling in the traveling direction 44. In fig. 10 and 11, width may be abbreviated as W, and length or distance may be abbreviated as L.
As shown in fig. 10, at the time t1 when the overtaking operation is started, the preceding vehicle 40 travels at a position 1042 on the road 10. During the overtaking operation, the preceding vehicle 40 is expected to travel at a position 1044 on the road 10 at a time t2 after the time t1. However, near the position 1044, the annunciator 352 is disposed on the road. Therefore, in the present embodiment, the road width 1064 available to the vehicle 20 for the overtaking operation at the position 1044 is smaller than the road width 1062 available to the vehicle 20 for the overtaking operation at the position 1042. The road width 1062 and the road width 1064 are smaller than the width 1060 of the road 10.
In order for the availability determination unit 846 to determine more accurately whether or not the overtaking operation is possible, the overtaking area specification unit 842 preferably estimates the minimum value of the available road width in the available area 1100 shown in fig. 11. The availability determination unit 846 can then determine whether the overtaking operation is possible based on the vehicle width 1120 of the vehicle 20.
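The availability check above reduces to comparing the minimum available road width against the vehicle width. A minimal sketch follows; the safety-margin value and names are assumptions, since the embodiment only states that the determination is based on the vehicle width 1120.

```python
# Illustrative availability check: the minimum available road width sampled
# along the scan range is compared with the vehicle width plus an assumed
# safety margin.  The margin is a hypothetical parameter, not from the patent.

SAFETY_MARGIN_M = 0.5  # assumed clearance on top of the vehicle width

def overtaking_available(available_widths_m, vehicle_width_m):
    """available_widths_m: available road widths sampled along the scan range
    (e.g. widths 1062 and 1064); returns True if overtaking is possible."""
    return min(available_widths_m) >= vehicle_width_m + SAFETY_MARGIN_M
```

Using the minimum over the whole area 1100, rather than the width at a single position, prevents the vehicle from entering a section that later narrows below its width.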
As described above, the overtaking area specification unit 842 may specify the available area within the scanning range 1180. The length 1182 of the scanning range 1180 in the width direction of the road 10 may be equal to the width 1060 or larger than the width 1060. The length 1184 of the scanning range 1180 in the extending direction of the road 10 may be set to a preset value, or may be determined based on the relative speed between the vehicle 20 and the preceding vehicle 40.
The length 1184 is preferably larger than the distance that the vehicle 20 moves during the period from the start to the completion of the overtaking operation. This can suppress suspension or interruption of the overtaking operation. However, the length 1184 may be smaller than that distance. For example, when the speed of the vehicle 20 is smaller than a preset value, the vehicle 20 can safely suspend or interrupt the overtaking operation. Therefore, when the speed of the vehicle 20 is smaller than the preset value, the length 1184 of the scanning range 1180 may be allowed to be smaller than the distance that the vehicle 20 moves during the period from the start to the completion of the overtaking operation. On the other hand, when the speed of the vehicle 20 is higher than the preset value, the length 1184 of the scanning range 1180 is preferably set to be larger than that distance.
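One way to pick the length 1184 from the relative speed, following the guideline above, is sketched below. The duration model and every constant here are assumptions made for illustration; the embodiment leaves the determination method open.

```python
# Hedged sketch of choosing the scan-range length 1184: cover the distance
# the vehicle travels during the overtaking operation, unless the vehicle is
# slow enough to abort safely, in which case a shorter range is permitted.
# All constants and the duration model are assumptions.

SLOW_SPEED_MPS = 8.0      # assumed "preset value" below which aborting is safe
MIN_SCAN_LENGTH_M = 30.0  # assumed floor for a slow vehicle

def scan_range_length(own_speed_mps, relative_speed_mps, gap_to_precede_m):
    # Rough overtaking duration: time to close the gap at the relative speed.
    overtake_time_s = gap_to_precede_m / max(relative_speed_mps, 0.1)
    travel_distance_m = own_speed_mps * overtake_time_s
    if own_speed_mps < SLOW_SPEED_MPS:
        # A slow vehicle can suspend the overtaking safely: shorter range allowed.
        return max(MIN_SCAN_LENGTH_M, 0.5 * travel_distance_m)
    return travel_distance_m  # cover the full overtaking distance
```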
The relative distance 1186 between the vehicle 20 and the scanning range 1180 may be set to a preset value, or may be determined based on the relative speed between the vehicle 20 and the preceding vehicle 40. The relative distance 1186 may also be determined based on the relative distance 1124 between the vehicle 20 and the preceding vehicle 40.
Fig. 12 schematically shows an example of information processing in the assist control unit 114. In fig. 12, the step may be abbreviated as S.
According to the present embodiment, the determination process starts when the relative distance between the vehicle 20 and the preceding vehicle 40 becomes smaller than a preset value. When the determination process starts, first, in S1220, the overtaking event detection unit 820 determines whether or not an overtaking event is detected. When the overtaking event detection unit 820 does not detect an overtaking event (NO in S1220), the process of S1220 is repeated. On the other hand, when the overtaking event detection unit 820 detects an overtaking event (YES in S1220), the output necessity determination unit 830 determines that the assist information needs to be output. In S1232, the availability determination unit 846 determines whether or not the overtaking operation is possible.
If the availability determination unit 846 determines in S1232 that overtaking is not possible (NO in S1232), the output content determination unit 850 determines in S1242 to output, as the assist information, the message 32 indicating that overtaking is not possible. On the other hand, if the availability determination unit 846 determines that overtaking is possible (YES in S1232), the difficulty determination unit 848 determines the difficulty of the overtaking operation in S1234.
When the difficulty determination unit 848 determines in S1234 that the overtaking operation is difficult (NO in S1234), the output content determination unit 850 determines in S1244 to output the assist information 30 as the assist information. On the other hand, when the difficulty determination unit 848 determines that the overtaking operation is easy (YES in S1234), the output content determination unit 850 determines in S1246 to output the assist information 230 as the assist information.
Thereafter, in S1250, the overtaking determination unit 840 determines, for example, whether the overtaking operation is completed. If it is determined that the overtaking operation is not completed (NO in S1250), the processing from S1232 onward is repeated. On the other hand, if it is determined that the overtaking operation is completed (YES in S1250), the determination process ends.
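The S1220 to S1250 flow above can be condensed into a small loop, for illustration only. The event-dictionary interface is an assumption made so that the sketch is self-contained; the embodiment distributes these checks across the units described earlier.

```python
# Compact sketch of the determination process: wait for an overtaking event
# (S1220), then on each step decide the assist output (S1232/S1234 ->
# S1242/S1244/S1246) until the overtaking operation completes (S1250).

def assist_decision_loop(events):
    """events: iterable of dicts with boolean keys 'overtaking_event',
    'available', 'difficult', 'completed'; yields one decision per step."""
    started = False
    for ev in events:
        if not started:
            if not ev["overtaking_event"]:    # S1220: keep waiting
                continue
            started = True                    # S1220 YES: output is needed
        if not ev["available"]:               # S1232 NO -> S1242
            yield "message_not_available"
        elif ev["difficult"]:                 # S1234 NO (difficult) -> S1244
            yield "assist_information_30"
        else:                                 # S1234 YES (easy) -> S1246
            yield "assist_information_230"
        if ev["completed"]:                   # S1250 YES: end the process
            return
```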
Fig. 13 shows an example of a computer 3000 in which a plurality of aspects of the present invention may be embodied in whole or in part. A part of the vehicle 20 may be implemented by the computer 3000. For example, the assist control unit 114 may be realized by the computer 3000.
A program installed in the computer 3000 can cause the computer 3000 to function as operations associated with an apparatus according to an embodiment of the present invention or as one or more "units" of the apparatus, can cause the computer 3000 to execute those operations or those "units", and/or can cause the computer 3000 to execute a process according to an embodiment of the present invention or steps of that process. Such a program may be executed by the CPU3012 in order to cause the computer 3000 to perform certain operations associated with some or all of the blocks of the flowcharts and block diagrams described herein.
The computer 3000 of the present embodiment includes the CPU3012, the RAM3014, the graphics controller 3016, and the display device 3018, which are connected to one another via the main controller 3010. The computer 3000 also includes input/output units such as the communication interface 3022, the hard disk drive 3024, the DVD-ROM drive 3026, and an IC card drive, which are connected to the main controller 3010 via the input/output controller 3020. The computer 3000 further includes legacy input/output units such as the ROM3030 and the keyboard 3042, which are connected to the input/output controller 3020 via the input/output chip 3040.
The CPU3012 operates according to the programs stored in the ROM3030 and the RAM3014, thereby controlling each unit. The graphics controller 3016 acquires image data generated by the CPU3012 in a frame buffer or the like provided in the RAM3014 or in the graphics controller 3016 itself, and causes the image data to be displayed on the display device 3018.
The communication interface 3022 communicates with other electronic apparatuses via a network. The hard disk drive 3024 stores programs and data used by the CPU3012 in the computer 3000. The DVD-ROM drive 3026 reads programs or data from the DVD-ROM 3001 and supplies the programs or data to the hard disk drive 3024 via the RAM 3014. The IC card driver reads and/or writes a program and data from/to the IC card.
The ROM3030 internally stores a startup program or the like executed by the computer 3000 when activated, and/or a program dependent on hardware of the computer 3000. The input/output chip 3040 may also connect various input/output units with the input/output controller 3020 via a parallel interface, a serial interface, a keyboard interface, a mouse interface, or the like.
The program is provided by a computer-readable storage medium such as the DVD-ROM 3001 or an IC card. The program is read from the computer-readable storage medium, installed in the hard disk drive 3024, the RAM3014, or the ROM3030, which are also examples of the computer-readable storage medium, and executed by the CPU3012. The information processing described in these programs is read by the computer 3000 and brings about cooperation between the programs and the various types of hardware resources described above. An apparatus or method may be constituted by realizing operations or processing of information in accordance with the use of the computer 3000.
For example, in the case of performing communication between the computer 3000 and an external device, the CPU3012 may execute a communication program loaded on the RAM3014, and instruct a communication process to the communication interface 3022 based on a process described in the communication program. The communication interface 3022 reads transmission data stored in a transmission buffer area provided in a recording medium such as the RAM3014, the hard disk drive 3024, the DVD-ROM 3001, or the IC card, and transmits the read transmission data to the network, or writes reception data received from the network into a reception buffer area provided in the recording medium, or the like, under the control of the CPU 3012.
In addition, the CPU3012 can cause all or a necessary part of a file or database held in an external recording medium such as a hard disk drive 3024, a DVD-ROM drive 3026 (DVD-ROM 3001), an IC card, or the like to be read into the RAM3014, and perform various types of processing on data on the RAM 3014. The CPU3012 may then write the processed data back to the external recording medium.
Various kinds of information such as various kinds of programs, data, tables, and databases may be saved to a recording medium and applied to information processing. The CPU3012 can execute various processes described in various places in the present disclosure, including various operations specified by an instruction sequence of a program, information processing, condition judgment, conditional branching, unconditional branching, retrieval/replacement of information, and the like, on data read from the RAM3014, and write the result back to the RAM 3014. In addition, the CPU3012 can retrieve information in a file, a database, or the like within the recording medium. For example, when a plurality of items each having an attribute value of the 1 st attribute associated with an attribute value of the 2 nd attribute are stored in the recording medium, the CPU3012 may retrieve an item matching the condition, which specifies an attribute value of the 1 st attribute, from the plurality of items, and read the attribute value of the 2 nd attribute stored in the item, thereby acquiring the attribute value of the 2 nd attribute associated with the 1 st attribute satisfying the preset condition.
The programs or software modules described above may be stored on a computer-readable storage medium on computer 3000 or in the vicinity of computer 3000. A recording medium such as a hard disk or a RAM provided in a server system connected to a dedicated communication network or the internet may be used as a computer-readable storage medium, and the program may be provided to the computer 3000 via a network.
The functional blocks in the flowcharts and block diagrams in the above embodiments may represent steps of processes in which operations are performed, or "units" of an apparatus having the role of performing the operations. Certain steps and "units" may be implemented by dedicated circuitry, by programmable circuitry supplied with computer-readable instructions stored on a computer-readable storage medium, and/or by a processor supplied with computer-readable instructions stored on a computer-readable storage medium. The dedicated circuitry may include digital and/or analog hardware circuits, and may include integrated circuits (ICs) and/or discrete circuits. The programmable circuitry may include a reconfigurable hardware circuit such as a field programmable gate array (FPGA) or a programmable logic array (PLA), including logical AND, logical OR, logical XOR, logical NAND, logical NOR, and other logical operations, flip-flops, registers, memory elements, and the like.
The computer-readable storage medium may comprise any tangible device capable of storing instructions for execution by a suitable device, and as a result, a computer-readable storage medium having instructions stored thereon may be an article of manufacture comprising instructions that are executable to implement means for performing the operations specified in the flowcharts or block diagrams. Examples of the computer-readable storage medium include an electronic storage medium, a magnetic storage medium, an optical storage medium, an electromagnetic storage medium, and a semiconductor storage medium. More specific examples of the computer-readable storage medium include a floppy disk (registered trademark), a hard disk, a random access memory (RAM), a read only memory (ROM), an erasable programmable read only memory (EPROM or flash memory), an electrically erasable programmable read only memory (EEPROM), a static random access memory (SRAM), a compact disc read only memory (CD-ROM), a digital versatile disc (DVD), a Blu-ray (registered trademark) disc, a memory stick, and an integrated circuit card.
The computer-readable instructions may include assembler instructions, instruction set architecture (ISA) instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, or either source code or object code written in any combination of one or more programming languages, including an object-oriented programming language such as Smalltalk, JAVA (registered trademark), or C++, and a conventional procedural programming language such as the "C" programming language or a similar programming language.
The computer-readable instructions may be provided to a processor or programmable circuitry of a general purpose computer, special purpose computer, or other programmable data processing apparatus, locally or via a local area network (LAN) or a wide area network (WAN) such as the Internet, so that the processor or the programmable circuitry executes the computer-readable instructions to generate means for performing the operations specified in the flowcharts or block diagrams. Examples of the processor include a computer processor, a processing unit, a microprocessor, a digital signal processor, a controller, and a microcontroller.
The present invention has been described above using the embodiments, but the technical scope of the present invention is not limited to the scope described in the above embodiments. It is apparent to those skilled in the art that various changes or improvements can be added to the above embodiments. A matter described with respect to a specific embodiment can be applied to other embodiments as long as no technical contradiction arises. Each component may have the same features as other components that have the same name but a different reference numeral. It is apparent from the description of the claims that embodiments with such changes or improvements can also be included in the technical scope of the present invention.
Note that the execution order of the processes such as operations, procedures, steps, and stages in the devices, systems, programs, and methods shown in the claims, the specification, and the drawings may be realized in any order, as long as "before", "prior to", or the like is not explicitly indicated and as long as the output of a preceding process is not used in a subsequent process. Even if an operation flow in the claims, the specification, or the drawings is described using "first", "next", and the like for convenience, it does not mean that the operations must be performed in this order.
[ description of reference ]
10 road
20 vehicle
22 driver
30 assist information
32 message
34 steering operation amount
36 accelerator operation amount
38 recommended route
40 preceding vehicle
44 traveling direction
110 driving assistance unit
112 head-up display
114 assist control unit
120 front screen
230 assist information
330 images
352 annunciator
420 operation input unit
430 sensing part
440 communication unit
450 vehicle control unit
452 own position estimating part
454 road traffic information acquiring unit
456 recognition unit
460 storage part
520 built-in sensor
522 speed sensor
524 inertial sensor
540 external sensor
542 GPS sensor
544 camera
546 ranging sensor
548 environment sensor
620 road state identifying part
630 peripheral vehicle recognition unit
640 environmental information recognition unit
650 user state recognition unit
720 assist information acquisition unit
730 image processing unit
740 image projection unit
820 overtaking event detection unit
830 output necessity determination unit
840 overtaking determination unit
842 overtaking area specification unit
844 threshold setting unit
846 availability determination unit
848 difficulty determination unit
850 output content determination unit
860 output data generation unit
1042 position
1044 position
1060 width
1062 road width
1064 road width
1100 available area
1120 vehicle width
1124 relative distance
1180 scan range
1186 relative distance
3000 computer
3001 DVD-ROM
3010 Main controller
3012 CPU
3014 RAM
3016 graphics controller
3018 display device
3020 input/output controller
3022 communication interface
3024 hard disk drive
3026 DVD-ROM drive
3030 ROM
3040 input/output chip
3042 keyboard

Claims (18)

1. An information processing apparatus, wherein,
the information processing apparatus comprises a determination unit that determines the content of driving assistance information that is output to assist a driver of a moving body in driving the moving body,
(a) the determination unit determines to output the driving assistance information including the 1 st information when the moving body is movable from the rear or the side of the 2 nd moving body moving in front of or to the side of the moving body toward the front of the 2 nd moving body and the operation of the moving body is difficult,
(b) the determination unit determines to output the driving assistance information including 2 nd information different from the 1 st information when the moving body is movable from the rear or the side of the 2 nd moving body toward the front of the 2 nd moving body and the operation of the moving body is easy.
2. The information processing apparatus according to claim 1, further comprising:
a 1 st determination unit that determines whether or not the mobile body can move from the rear or the side of the 2 nd mobile body toward the front of the 2 nd mobile body; and
a 2 nd determination unit that determines whether or not an operation for moving the moving body from the rear or the side of the 2 nd moving body toward the front of the 2 nd moving body is difficult,
(i) the determination unit determines to output the driving assistance information including the 1 st information when the 1 st determination unit determines that the mobile object is movable forward of the 2 nd mobile object and the 2 nd determination unit determines that the operation is difficult,
(ii) when the 1 st determination unit determines that the mobile object is movable forward of the 2 nd mobile object and the 2 nd determination unit determines that the operation is not difficult, the determination unit determines to output the driving assistance information including the 2 nd information.
3. The information processing apparatus according to claim 2, further comprising:
a road information acquisition unit that acquires road information relating to a road on which the moving body travels;
a vehicle information acquisition unit that acquires peripheral vehicle information relating to one or more peripheral vehicles traveling around the moving body; and
a region specifying unit that specifies, based on the road information and the peripheral vehicle information, a region that the moving body can use to move from the rear or a side of the 2nd moving body to the front of the 2nd moving body,
wherein the moving body is a vehicle,
the 2nd moving body is included in the one or more peripheral vehicles, and
when it is determined that the driving assistance information needs to be output and the region specified by the region specifying unit has a predetermined characteristic, the 2nd determination unit determines that it is difficult for the moving body to move from the rear or a side of the 2nd moving body toward the front of the 2nd moving body.
4. The information processing apparatus according to claim 3, wherein
the predetermined characteristic is that a minimum value of a region width, which is the length of the region in a direction substantially parallel to the vehicle width direction of the 2nd moving body, is smaller than a 1st threshold.
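The "predetermined characteristic" of claims 3 and 4 reduces to a simple width test, sketched below. The function name, the idea of sampling widths along the region, and the units are assumptions made for illustration.

```python
def operation_is_difficult(region_widths_m: list[float], threshold_1_m: float) -> bool:
    """Sketch of the characteristic in claims 3-4 (names hypothetical).

    region_widths_m: widths of the usable region, sampled along its
    length, measured roughly parallel to the 2nd vehicle's width
    direction (meters).
    The operation is judged difficult when the narrowest point falls
    below the 1st threshold.
    """
    return min(region_widths_m) < threshold_1_m
```

In practice the widths would come from the region specifying unit's output rather than a hand-built list.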
5. The information processing apparatus according to claim 4, further comprising:
a speed information acquisition unit that acquires speed information indicating a speed of the moving body; and
a threshold determination unit that determines the 1st threshold based on the speed of the moving body indicated by the speed information,
wherein the threshold determination unit determines the 1st threshold such that the 1st threshold when the speed of the moving body is greater than a 2nd threshold is greater than the 1st threshold when the speed of the moving body is less than the 2nd threshold.
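Claim 5's speed-dependent threshold can be sketched as below: at higher speed a wider gap is required before passing is judged easy. All numeric values and names are illustrative assumptions, not taken from the patent.

```python
def determine_threshold_1(speed_mps: float,
                          threshold_2_mps: float = 11.0,
                          narrow_width_m: float = 2.5,
                          wide_width_m: float = 3.5) -> float:
    """Sketch of the threshold determination unit of claim 5.

    Returns a larger 1st threshold (minimum acceptable region width)
    when the moving body's speed exceeds the 2nd threshold, so that
    fast passes demand more clearance. Values are hypothetical.
    """
    return wide_width_m if speed_mps > threshold_2_mps else narrow_width_m
```

A continuous interpolation between the two widths would also satisfy the claim's monotonicity condition; the two-step form is just the simplest reading.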
6. The information processing apparatus according to any one of claims 3 to 5, wherein
the road information includes at least one of: image data or ranging data of the road on which the moving body travels; structural information indicating a structure of the road on which the moving body travels; and regulation information indicating a traffic regulation applicable to the road on which the moving body travels.
7. The information processing apparatus according to any one of claims 2 to 6, wherein
the 1st determination unit determines whether or not the moving body can move from the rear or a side of the 2nd moving body toward the front of the 2nd moving body based on the vehicle width of the moving body, and/or
the 2nd determination unit determines whether or not the operation of the moving body is difficult based on the vehicle width of the moving body.
8. The information processing apparatus according to any one of claims 1 to 7, further comprising
a 3rd determination unit that determines whether or not the driving assistance information needs to be output,
wherein, when the 3rd determination unit determines that the driving assistance information needs to be output, the determination unit determines the content of the driving assistance information.
9. The information processing apparatus according to claim 8, further comprising
an intention detection unit that detects that an occupant intends to move the moving body from the rear or a side of the 2nd moving body toward the front of the 2nd moving body,
wherein, when the intention detection unit detects the intention, the 3rd determination unit determines that the driving assistance information needs to be output.
10. The information processing apparatus according to any one of claims 1 to 9, wherein
(c) when it is determined that the moving body cannot move forward of the 2nd moving body, the determination unit determines to output the driving assistance information including 3rd information different from the 1st information and the 2nd information.
11. The information processing apparatus according to any one of claims 1 to 10, further comprising
a route determination unit that determines a route along which the moving body moves from the rear or a side of the 2nd moving body to the front of the 2nd moving body,
wherein the 1st information includes information indicating the route determined by the route determination unit.
12. The information processing apparatus according to claim 11, further comprising
an image generation unit that generates an image to be shown to the occupant by superimposing the route determined by the route determination unit on an image of objects in real space,
wherein the 1st information includes data of the image generated by the image generation unit.
13. The information processing apparatus according to claim 11 or claim 12, further comprising
a 1st operation amount determination unit that determines an operation amount of a 1st operation for changing the moving direction of the moving body,
wherein the 1st operation amount determination unit determines the operation amount of the 1st operation required to move the moving body along the route determined by the route determination unit, and
the 2nd information includes information indicating the operation amount of the 1st operation determined by the 1st operation amount determination unit.
14. The information processing apparatus according to any one of claims 11 to 13, further comprising
a 2nd operation amount determination unit that determines an operation amount of a 2nd operation for changing the moving speed of the moving body,
wherein the 2nd operation amount determination unit determines the operation amount of the 2nd operation required to move the moving body along the route determined by the route determination unit, and
when the relative speed between the moving body and the 2nd moving body is smaller than a 3rd threshold, the determination unit determines to output the driving assistance information including information indicating the operation amount of the 2nd operation determined by the 2nd operation amount determination unit.
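Claims 13 and 14 together can be read as assembling the 2nd information: the steering (1st operation) amount is always included, while the speed (2nd operation) amount is added only when the relative speed to the 2nd moving body is below the 3rd threshold. The sketch below is one such reading; names, units, and the default threshold are assumptions.

```python
def assemble_2nd_information(steering_amount: float,
                             speed_operation_amount: float,
                             relative_speed_mps: float,
                             threshold_3_mps: float = 3.0) -> dict:
    """Sketch of claims 13-14 (names and values hypothetical).

    Always carries the 1st (direction-changing) operation amount;
    adds the 2nd (speed-changing) operation amount only when the
    relative speed to the 2nd moving body is below the 3rd threshold.
    """
    info = {"1st_operation_amount": steering_amount}
    if relative_speed_mps < threshold_3_mps:
        info["2nd_operation_amount"] = speed_operation_amount
    return info
```

Intuitively, when the two vehicles are closing slowly, speed guidance is also needed; with a large speed difference, steering guidance alone may suffice.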
15. A driving assistance device comprising:
the information processing apparatus according to any one of claims 1 to 14; and
an information presentation device that presents the driving assistance information determined by the determination unit to an occupant of the moving body.
16. A moving body comprising the driving assistance device according to claim 15.
17. An information processing method comprising
a determination step of determining the content of driving assistance information to be output for assisting a driver of a moving body in driving the moving body,
wherein the determination step includes:
(a) determining to output the driving assistance information including 1st information when the moving body is movable from the rear or a side of a 2nd moving body, which is moving in front of or to a side of the moving body, toward the front of the 2nd moving body and the operation of the moving body is difficult; and
(b) determining to output the driving assistance information including 2nd information different from the 1st information when the moving body is movable from the rear or a side of the 2nd moving body toward the front of the 2nd moving body and the operation of the moving body is easy.
18. A computer-readable storage medium storing a program that, when executed, causes a computer to execute an information processing method,
the information processing method comprising a determination step of determining the content of driving assistance information to be output for assisting a driver of a moving body in driving the moving body,
wherein the determination step includes:
(a) determining to output the driving assistance information including 1st information when the moving body is movable from the rear or a side of a 2nd moving body, which is moving in front of or to a side of the moving body, toward the front of the 2nd moving body and the operation of the moving body is difficult; and
(b) determining to output the driving assistance information including 2nd information different from the 1st information when the moving body is movable from the rear or a side of the 2nd moving body toward the front of the 2nd moving body and the operation of the moving body is easy.
CN202010213742.8A 2020-03-24 2020-03-24 Information processing device, driving assistance device, moving object, information processing method, and storage medium Pending CN113442921A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010213742.8A CN113442921A (en) 2020-03-24 2020-03-24 Information processing device, driving assistance device, moving object, information processing method, and storage medium


Publications (1)

Publication Number Publication Date
CN113442921A true CN113442921A (en) 2021-09-28

Family

ID=77806468

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010213742.8A Pending CN113442921A (en) 2020-03-24 2020-03-24 Information processing device, driving assistance device, moving object, information processing method, and storage medium

Country Status (1)

Country Link
CN (1) CN113442921A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005149402A (en) * 2003-11-19 2005-06-09 Fujitsu Ten Ltd Driving support system
JP2008039501A (en) * 2006-08-03 2008-02-21 Denso Corp Vehicle navigation apparatus
CN105793910A (en) * 2014-01-29 2016-07-20 爱信艾达株式会社 Automatic driving assistance device, automatic driving assistance method, and program
CN107848540A (en) * 2015-07-31 2018-03-27 松下知识产权经营株式会社 Drive assistance device, drive assist system, driving assistance method, driving auxiliary program and automatic driving vehicle


Similar Documents

Publication Publication Date Title
EP3232289B1 (en) Information presentation control method and autonomous vehicle, and autonomous-vehicle driving support system
US10293748B2 (en) Information presentation system
CN110641472B (en) Safety monitoring system for autonomous vehicle based on neural network
CN109426256B (en) Lane assist system for autonomous vehicles based on driver intention
KR102070530B1 (en) Operation method and system of autonomous vehicle based on motion plan
US10145697B2 (en) Dynamic destination navigation system
US10589752B2 (en) Display system, display method, and storage medium
CN109383404B (en) Display system, display method, and medium storing program
JP6036371B2 (en) Vehicle driving support system and driving support method
US20180120844A1 (en) Autonomous driving assistance system, autonomous driving assistance method, and computer program
US11970158B2 (en) Driving assistance system, driving assistance device, and driving assistance method for avoidance of an obstacle in a traveling lane
CN111724627B (en) Automatic warning system for detecting backward sliding of front vehicle
US10583841B2 (en) Driving support method, data processor using the same, and driving support system using the same
JP2011196346A (en) On-vehicle apparatus
US20220315001A1 (en) Driving assistance device, driving assistance method, and storage medium
CN115257794A (en) System and method for controlling head-up display in vehicle
CN113442921A (en) Information processing device, driving assistance device, moving object, information processing method, and storage medium
CN114084159A (en) Driving assistance function reminding method, device, medium and vehicle
JP6096750B2 (en) Driving support device
JP2019148900A (en) Vehicle control device, vehicle, and route guide device
US20240140417A1 (en) Controller
KR20170069030A (en) Driving assistant device for self-driving car and control method thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination