WO2017072956A1 - Driving Support Device - Google Patents
- Publication number
- WO2017072956A1 (PCT/JP2015/080750)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- target object
- candidate
- vehicle
- unit
- parking space
- Prior art date
Links
- 230000002093 peripheral effect Effects 0.000 claims description 18
- 230000004424 eye movement Effects 0.000 claims description 7
- 238000005516 engineering process Methods 0.000 abstract description 2
- 238000001514 detection method Methods 0.000 description 46
- 238000010586 diagram Methods 0.000 description 28
- 230000006870 function Effects 0.000 description 19
- 238000000034 method Methods 0.000 description 19
- 238000012545 processing Methods 0.000 description 14
- 230000008569 process Effects 0.000 description 11
- 230000004048 modification Effects 0.000 description 9
- 238000012986 modification Methods 0.000 description 9
- 210000003128 head Anatomy 0.000 description 6
- 230000004397 blinking Effects 0.000 description 2
- 230000002250 progressing effect Effects 0.000 description 2
- 230000001133 acceleration Effects 0.000 description 1
- 230000003466 anti-cipated effect Effects 0.000 description 1
- 230000008859 change Effects 0.000 description 1
- 238000004891 communication Methods 0.000 description 1
- 230000003287 optical effect Effects 0.000 description 1
- 238000012552 review Methods 0.000 description 1
- 239000004065 semiconductor Substances 0.000 description 1
- 238000004904 shortening Methods 0.000 description 1
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/10—Interpretation of driver requests or demands
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Arrangement of adaptations of instruments
-
- B60K35/10—
-
- B60K35/29—
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units, or advanced driver assistance systems for ensuring comfort, stability and safety or drive control systems for propelling or retarding the vehicle
- B60W30/06—Automatic manoeuvring for parking
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units, or advanced driver assistance systems for ensuring comfort, stability and safety or drive control systems for propelling or retarding the vehicle
- B60W30/14—Adaptive cruise control
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0179—Display position adjusting means not related to the information to be displayed
-
- B60K2360/149—
-
- B60K2360/199—
-
- B60K2360/21—
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/20—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of display used
- B60R2300/205—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of display used using a head-up display
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/80—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
- B60R2300/806—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for aiding parking
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
- B60W2050/146—Display means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2540/00—Input parameters relating to occupants
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2554/00—Input parameters relating to objects
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0179—Display position adjusting means not related to the information to be displayed
- G02B2027/0187—Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/16—Sound input; Sound output
- G06F3/165—Management of the audio stream, e.g. setting of volume, audio stream path
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
- G06V20/586—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads of parking space
Definitions
- The present invention relates to a driving support device that supports driving of a vehicle capable of automatic driving.
- In recent years, driving support devices have been developed that realize comfortable parking by providing functions that assist driving operations, such as automatic braking in response to obstacle detection.
- In anticipation of fully automatic driving, such a driving support device substitutes for all or part of the driving operations performed by the driver.
- Patent Document 1 discloses a technique for selecting a parking mode based on a steering angle of a steering wheel when a driver selects one parking mode from a plurality of parking modes in automatic driving.
- Patent Document 2 discloses a driving support technology that detects a driver's line of sight and assists steering angle control in a direction intended by the driver.
- However, the technique of Patent Document 1 has a problem in that, if the steering wheel is turned too far, the driver must turn the steering wheel back, which is troublesome.
- With the technique of Patent Document 2, the driver can indicate a desired traveling direction with the line of sight, but the technique cannot be used to select and determine a desired parking space after comparing a plurality of parking spaces.
- With respect to an automatic tracking function as well, Patent Document 1 and Patent Document 2 present the same problems as with the automatic parking function.
- Therefore, an object of the present invention is to provide a technique that allows a driver to intuitively select and determine a target object to which a vehicle should go.
- A driving support device according to the present invention is a driving support device that supports driving of a vehicle capable of automatic driving, and includes: a candidate acquisition unit that acquires target object candidates, which are candidates for the target object to which the vehicle should go, detected based on peripheral information about the vehicle periphery; an output control unit that displays, on a head-up display mounted on the vehicle, a setting range set for each of the plurality of target object candidates acquired by the candidate acquisition unit; a candidate selection unit that selects one target object candidate when it determines, based on line-of-sight information relating to the line of sight of the driver of the vehicle, that the driver's line-of-sight position is included in the setting range of that one target object candidate among the setting ranges of the plurality of target object candidates; and a target object determination unit that, when a decision button operable by the driver is operated after the candidate selection unit selects the target object candidate, determines the selected target object candidate as the target object.
- The target object determination unit outputs the determined target object to an automatic driving control unit.
- According to the present invention, the driving support device includes the candidate selection unit, which selects one target object candidate when it determines, based on line-of-sight information relating to the line of sight of the driver of the vehicle, that the driver's line-of-sight position is included in the setting range of that one target object candidate among the setting ranges of the plurality of target object candidates, and the target object determination unit, which determines the target object candidate selected by the candidate selection unit as the target object when the decision button operable by the driver is operated after the selection. As a result, the driver can intuitively select and determine the target object to which the vehicle should go from among the target object candidates.
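The two-stage behavior summarized above (gaze-based selection, then button-based determination) can be sketched in Python roughly as follows. This is a minimal illustration, not the patent's implementation: the class and function names, the rectangular setting ranges, and the display coordinates are all assumptions introduced for this example.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple


@dataclass
class TargetCandidate:
    """A target object candidate with its setting range (x0, y0, x1, y1),
    here assumed to be an axis-aligned rectangle in display coordinates."""
    candidate_id: int
    setting_range: Tuple[float, float, float, float]


def select_candidate(candidates: List[TargetCandidate],
                     gaze_position: Tuple[float, float]) -> Optional[TargetCandidate]:
    """Candidate selection: return the candidate whose setting range
    contains the driver's line-of-sight position, if any."""
    gx, gy = gaze_position
    for c in candidates:
        x0, y0, x1, y1 = c.setting_range
        if x0 <= gx <= x1 and y0 <= gy <= y1:
            return c
    return None


def determine_target(selected: Optional[TargetCandidate],
                     decision_button_pressed: bool) -> Optional[TargetCandidate]:
    """Target object determination: the selected candidate becomes the
    target object only when the decision button is operated."""
    if selected is not None and decision_button_pressed:
        return selected
    return None
```

Looking at a candidate selects it, but nothing is committed until the decision button is pressed, which mirrors the select-then-determine split between the candidate selection unit and the target object determination unit.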
- FIG. 1 is a block diagram showing the configuration of the driving support device according to Embodiment 1.
- FIG. 2 is a block diagram showing the configuration of the driving support device according to Embodiment 2.
- FIG. 3 is a flowchart showing the operation of the periphery recognition unit according to Embodiment 2.
- FIG. 4 is a flowchart showing the operation of the parking space candidate detection unit according to Embodiment 2.
- FIG. 5 is a flowchart showing the operation of the storage unit according to Embodiment 2.
- FIG. 6 is a diagram showing a display example of the head-up display according to Embodiment 2.
- FIG. 7 is a flowchart showing the operation of the output control unit according to Embodiment 2.
- FIG. 8 is a flowchart showing the operation of the candidate selection unit according to Embodiment 2.
- FIG. 9 is a flowchart showing the operation of the target object determination unit according to Embodiment 2.
- FIG. 10 is a flowchart showing the operation of the automatic driving determination unit according to Embodiment 2.
- FIG. 11 is a flowchart showing the operation of the automatic driving control unit according to Embodiment 2.
- FIG. 12 is a block diagram showing the configuration of a driving support device according to a modification.
- FIG. 13 is a block diagram showing the configuration of the driving support device according to Embodiment 3.
- FIG. 14 is a diagram showing a display example of the head-up display according to Embodiment 3.
- FIG. 15 is a diagram showing a display example of the head-up display according to Embodiment 3.
- FIG. 16 is a diagram showing a display example of the head-up display according to Embodiment 3.
- FIG. 17 is a flowchart showing the operation of the output control unit according to Embodiment 3.
- FIG. 18 is a block diagram showing the configuration of the driving support device according to Embodiment 4.
- FIG. 19 is a flowchart showing the operation of the candidate selection unit according to Embodiment 4.
- FIG. 20 is a flowchart showing the operation of the target object determination unit according to Embodiment 4.
- FIG. 21 is a block diagram showing the configuration of the driving support device according to Embodiment 5.
- FIG. 22 is a flowchart showing the operation of the passenger determination acquisition unit according to Embodiment 5.
- FIG. 23 is a diagram showing a display example of the head-up display according to Embodiment 5.
- FIG. 24 is a flowchart showing the operation of the target object determination unit according to Embodiment 5.
- FIG. 25 is a block diagram showing the configuration of the driving support device according to Embodiment 6.
- FIG. 26 is a block diagram showing the configuration of the driving support device according to Embodiment 7.
- FIG. 27 is a block diagram showing the configuration of the driving support device according to Embodiment 8.
- FIG. 28 is a diagram showing a display example of the head-up display according to Embodiment 8.
- FIG. 29 is a block diagram showing an example of the hardware configuration of the driving support device.
- FIG. 30 is a block diagram showing an example of the hardware configuration of the driving support device.
- The driving support device according to Embodiment 1 of the present invention is a device that supports driving of a vehicle capable of automatic driving.
- FIG. 1 is a block diagram showing the configuration of the driving support device according to Embodiment 1 of the present invention.
- Hereinafter, the vehicle that is capable of automatic driving and whose driving is supported by the driving support device 1 will be referred to as the "host vehicle".
- Based on sensing data from an in-vehicle sensor (peripheral information related to the periphery of the host vehicle), line-of-sight information from a line-of-sight detection device (information related to the line of sight of the driver of the host vehicle), and operation information of a decision button, the driving support device 1 in FIG. 1 controls the head-up display, determines the target object to which the host vehicle should go, and outputs the determination result to a vehicle control ECU (Electronic Control Unit).
- The driving support device 1 displays target object candidates, which are candidates for the target object to which the host vehicle should go, on the head-up display in an identifiable manner, and can control the automatic driving of the host vehicle so that the host vehicle goes to the determined target object.
- In the following embodiments, the target object will be described as a parking space in which the host vehicle is to be parked or another vehicle that the host vehicle should follow, but the target object is not limited thereto.
- Next, the in-vehicle sensor, the head-up display, the line-of-sight detection device, the decision button, the automatic driving control unit, and the vehicle control ECU will be briefly described. These are mounted on the host vehicle.
- As the in-vehicle sensors, for example, the various sensors described in Table 2 of Yuji Fukaya, "Technological Trends of Automotive Sensors", Denso Technical Review, 2006, Vol. 1, pp. 92-99, are applied.
- The sensing data output from the in-vehicle sensor to the driving support device 1 as needed includes, for example, the position and moving direction of the host vehicle, an image of the scenery around the host vehicle, and the positions of obstacles around the host vehicle.
- The head-up display displays various information, such as images, on the windshield of the host vehicle, superimposing the information on the actual scenery for the occupants (driver and passengers). As a result, the driver can view the information while viewing the actual scenery and can thus drive the host vehicle comfortably.
- For the line-of-sight detection device, a gaze detection sensor (camera) such as that of Patent Document 2 is applied, for example.
- The line-of-sight information output as needed from the line-of-sight detection device to the driving support device 1 includes, for example, the line-of-sight position of the driver of the host vehicle and whether or not the line-of-sight position has been detected.
- The decision button is assumed to be, for example, a push button (hard key), and is disposed at a location where the driver can easily operate it (for example, around the steering wheel).
- By outputting a control signal to the vehicle control ECU, the automatic driving control unit can control the automatic driving of the host vehicle so that the host vehicle goes to the target object determined by the driving support device 1 (target object determination unit 14).
- Here, the automatic driving includes at least one of fully automatic driving, in which the vehicle performs all driving operations; semi-automatic driving, in which the driver performs some driving operations such as operating the accelerator and brake; and semi-automatic driving, in which the vehicle performs all driving operations under limited conditions (for example, while the host vehicle is traveling on a highway).
- The vehicle control ECU can control, for example, at least one of the accelerator, brake, and steering wheel of the host vehicle based on the control signal from the automatic driving control unit.
- The driving support device 1 in FIG. 1 includes a candidate acquisition unit 11, an output control unit 12, a candidate selection unit 13, and a target object determination unit 14. Next, each component of the driving support device 1 in FIG. 1 will be described in detail.
- The candidate acquisition unit 11 acquires target object candidates, which are candidates for the target object detected based on the sensing data from the in-vehicle sensor. Thereby, for example, the position and range of each target object candidate are acquired.
- The candidate acquisition unit 11 may acquire, from an external device, target object candidates that the external device has detected based on the sensing data, or the candidate acquisition unit 11 itself may detect (acquire) target object candidates based on the sensing data.
- A setting range is set for each target object candidate acquired by the candidate acquisition unit 11.
- As the setting range, the range of the target object candidate acquired by the candidate acquisition unit 11 may be set as it is, or a range larger or smaller than that range may be set.
- In the following, the candidate acquisition unit 11 is described as setting a setting range for each of the plurality of target object candidates, but this is not restrictive.
- The output control unit 12 causes the head-up display to display the setting range set for each of the plurality of target object candidates acquired by the candidate acquisition unit 11.
- The candidate selection unit 13 selects one target object candidate when it determines, based on the line-of-sight information, that the driver's line-of-sight position is included in the setting range of that one target object candidate among the setting ranges of the plurality of target object candidates.
- When the decision button operable by the driver is operated after the candidate selection unit 13 selects a target object candidate, the target object determination unit 14 determines the target object candidate selected by the candidate selection unit 13 as the target object. The target object determination unit 14 then outputs the determined target object to the automatic driving control unit.
- As described above, in the driving support device 1, when it is determined that the driver's line-of-sight position is included in the setting range of one target object candidate among the setting ranges of the plurality of target object candidates, that one target object candidate is selected; when the decision button is subsequently operated, the selected target object candidate is determined as the target object to which the host vehicle should go.
- As a result, the driver can intuitively select and determine the target object from among the target object candidates.
- FIG. 2 is a block diagram showing a configuration of the driving support apparatus according to Embodiment 2 of the present invention.
- the same or similar components as those in the first embodiment are denoted by the same reference numerals, and different components will be mainly described.
- In Embodiment 2, the driving support device 1 can control the automatic driving of the host vehicle so that the host vehicle goes to the parking space in which it should park (the target object to which the host vehicle should go). That is, a parking space in which the host vehicle is to be parked is applied as the target object.
- In Embodiment 2, the candidate acquisition unit 11 described in Embodiment 1 includes a periphery recognition unit 11a and a parking space candidate detection unit 11b.
- Next, each component of the driving support device 1 in FIG. 2 will be described in detail.
- The periphery recognition unit 11a in FIG. 2 generates a surrounding map based on sensing data (peripheral information related to the periphery of the vehicle) acquired from the in-vehicle sensor.
- The surrounding map is a map obtained by modeling various shapes around the host vehicle in a three-dimensional space.
- For example, the periphery recognition unit 11a combines white line information and information on obstacles such as other vehicles, detected from a captured image of a front camera (a type of in-vehicle sensor), with the distance between the host vehicle and each obstacle measured by a sonar sensor (a type of in-vehicle sensor), thereby generating a surrounding map that models the white lines and the shapes of the obstacles in the traveling direction of the host vehicle.
- The various shapes in the surrounding map may be determined and represented as point clouds, or may be determined and represented in units of objects such as walls, white lines, or various moving objects (for example, forward vehicles and pedestrians).
- In addition, the periphery recognition unit 11a may generate the surrounding map by combining high-accuracy map data, a dynamic map, and the like with the sensing data from the in-vehicle sensor.
- The periphery recognition unit 11a may also be configured to acquire map data including parking space information for a parking lot, or to acquire the usage status of each parking space in real time via a network.
- In this case, the parking space information for the parking lot and the usage status of each parking space may be included in the surrounding map.
- FIG. 3 is a flowchart showing the operation of the peripheral recognition unit 11a according to the second embodiment.
- In step S1, the periphery recognition unit 11a acquires sensing data from the in-vehicle sensor.
- In step S2, the periphery recognition unit 11a generates a surrounding map based on the acquired sensing data.
- In step S3, the periphery recognition unit 11a outputs the generated surrounding map to the automatic driving determination unit 17 and the parking space candidate detection unit 11b. Thereafter, the operation of FIG. 3 ends. Note that the operation illustrated in FIG. 3, and the operations described with reference to the subsequent flowcharts, are repeatedly performed in real time, for example.
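As a rough illustration of steps S1 to S3, the following Python sketch collects hypothetical camera and sonar readings into a minimal surrounding map. The sensing-data fields and the flat dictionary representation are assumptions made for this example; the patent's map models shapes in a three-dimensional space.

```python
def generate_surrounding_map(sensing_data: dict) -> dict:
    """Minimal surrounding-map model (sketch of steps S1-S3): collect
    white-line shapes and obstacle information around the host vehicle."""
    return {
        # position/heading of the host vehicle (assumed field name)
        "vehicle_pose": sensing_data.get("vehicle_pose"),
        # white lines detected from the front camera image
        "white_lines": list(sensing_data.get("white_lines", [])),
        # each obstacle keeps its modeled position and sonar-measured distance
        "obstacles": [
            {"position": o["position"], "distance": o["distance"]}
            for o in sensing_data.get("obstacles", [])
        ],
    }
```

In the device itself this map would be rebuilt in real time each cycle and passed on to both the automatic driving determination unit and the parking space candidate detection unit, mirroring the output of step S3.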
- The parking space candidate detection unit 11b in FIG. 2 detects, as parking space candidates, parking spaces in which the host vehicle can park, based on the surrounding map generated by the periphery recognition unit 11a.
- A parking space candidate is a candidate for the parking space to which the host vehicle should go, and detection of a parking space candidate includes detection of its position and range.
- That is, in Embodiment 2, the acquisition of a target object candidate by the candidate acquisition unit 11 corresponds to the detection of a parking space candidate by the parking space candidate detection unit 11b.
- For example, the parking space candidate detection unit 11b generates white line information and information on the spacing between other vehicles based on the surrounding map, and detects parking space candidates based on that information.
- When the parking space information for the parking lot and the usage status of each parking space are included in the surrounding map, the parking space candidate detection unit 11b may detect the parking space candidates in consideration of them as well.
- Further, the parking space candidate detection unit 11b sets a setting range for each detected parking space candidate.
- FIG. 4 is a flowchart showing the operation of the parking space candidate detection unit 11b according to the second embodiment.
- In step S11, the parking space candidate detection unit 11b acquires the surrounding map from the periphery recognition unit 11a.
- In step S12, the parking space candidate detection unit 11b detects parking space candidates based on the acquired surrounding map.
- In step S13, the parking space candidate detection unit 11b sets a setting range for each detected parking space candidate and outputs the setting ranges to the storage unit 16 and the output control unit 12. Thereafter, the operation of FIG. 4 ends.
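One simple way to realize the detection of steps S11 to S13 is to treat gaps between adjacent white lines that are wide enough for the host vehicle as candidates. The one-dimensional layout, the width threshold, and the margin below are illustrative assumptions, not the patent's method.

```python
def detect_parking_space_candidates(white_line_positions, vehicle_width, margin=0.5):
    """Sketch of steps S11-S13: a gap between adjacent white lines that is
    at least vehicle_width + margin wide becomes a parking space candidate,
    and its setting range is set to the gap itself."""
    candidates = []
    xs = sorted(white_line_positions)  # white-line positions along the parking row (meters)
    for left, right in zip(xs, xs[1:]):
        if right - left >= vehicle_width + margin:
            candidates.append({"setting_range": (left, right)})
    return candidates
```

A fuller implementation would also consult the spacing of parked vehicles and, where available, the parking lot's usage status, as the description notes.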
- The storage unit 16 in FIG. 2 stores the setting ranges of the parking space candidates detected by the parking space candidate detection unit 11b and the parking space determined by the target object determination unit 14.
- FIG. 5 is a flowchart showing the operation of the storage unit 16 according to the second embodiment.
- In step S21, the storage unit 16 proceeds to step S22 when setting ranges of parking space candidates are received from the parking space candidate detection unit 11b, and proceeds to step S23 otherwise.
- In step S22, the storage unit 16 stores the received setting ranges of the parking space candidates.
- The setting ranges of the parking space candidates stored in the storage unit 16 may be updated as appropriate by the parking space candidate detection unit 11b.
- In step S23, the storage unit 16 proceeds to step S24 when the parking space determined by the target object determination unit 14 is received, and otherwise ends the operation of FIG. 5.
- In step S24, the storage unit 16 stores the received parking space (the parking space determined by the target object determination unit 14). Thereafter, the operation of FIG. 5 ends.
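The branching of steps S21 to S24 amounts to updating one of two stored items depending on which message arrives. A minimal sketch follows; the class and field names are assumptions for illustration.

```python
class StorageUnit:
    """Sketch of the storage unit 16 (steps S21-S24)."""

    def __init__(self):
        # latest setting ranges of parking space candidates (S22)
        self.candidate_setting_ranges = []
        # parking space fixed by the target object determination unit (S24)
        self.determined_parking_space = None

    def receive(self, candidate_setting_ranges=None, determined_parking_space=None):
        # S21 -> S22: a new set of candidate ranges replaces the stored one
        if candidate_setting_ranges is not None:
            self.candidate_setting_ranges = list(candidate_setting_ranges)
        # S23 -> S24: the determined parking space is stored when it arrives
        if determined_parking_space is not None:
            self.determined_parking_space = determined_parking_space
```

Keeping the candidate ranges replaceable matches the note that the stored ranges may be updated at any time by the parking space candidate detection unit.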
- FIG. 6 is a diagram illustrating a display example of the head-up display according to the second embodiment.
- For example, the output control unit 12 causes the head-up display to display the setting ranges of the plurality of parking space candidates detected by the parking space candidate detection unit 11b.
- In the example of FIG. 6, the setting ranges 31 of two parking space candidates are displayed in an identifiable manner using broken lines.
- In addition, the output control unit 12 displays the driver's line-of-sight position on the head-up display based on the line-of-sight information acquired from the line-of-sight detection device.
- In the example of FIG. 6, the driver's line-of-sight position 32 is displayed in an identifiable manner using a mark consisting of a circle enclosing a plus sign.
- FIG. 7 is a flowchart showing the operation of the output control unit 12 according to the second embodiment.
- In step S31, the output control unit 12 acquires the latest setting ranges of parking space candidates from the parking space candidate detection unit 11b.
- In step S32, the output control unit 12 acquires line-of-sight information from the line-of-sight detection device.
- In step S33, the output control unit 12 acquires previously stored setting ranges of parking space candidates from the storage unit 16.
- In step S34, the output control unit 12 performs control to display the setting ranges of the parking space candidates acquired in steps S31 and S33 on the head-up display.
- At this time, each setting range of a parking space candidate is displayed in an identifiable state in association with the corresponding parking space in the actual scenery.
- Note that this display control is not performed when no setting range of a parking space candidate is acquired in steps S31 and S33.
- In step S35, the output control unit 12 performs control to display the driver's line-of-sight position on the head-up display based on the line-of-sight information acquired in step S32.
- At this time, the driver's line-of-sight position is displayed in an identifiable state superimposed on the actual scenery. Note that this display control is not performed when line-of-sight information is not acquired in step S32 or when the driver's line of sight is directed into the vehicle. Thereafter, the operation of FIG. 7 ends.
- The candidate selection unit 13 in FIG. 2 selects one parking space candidate when it determines, based on the line-of-sight information acquired from the line-of-sight detection device, that the driver's line-of-sight position is included in the setting range of that one parking space candidate among the setting ranges of the plurality of parking space candidates (target object candidates).
- FIG. 8 is a flowchart showing the operation of the candidate selection unit 13 according to the second embodiment.
- In step S41, the candidate selection unit 13 acquires line-of-sight information from the line-of-sight detection device.
- In step S42, the candidate selection unit 13 acquires the setting ranges of the parking space candidates from the storage unit 16.
- In step S43, the candidate selection unit 13 determines whether the driver's line-of-sight position based on the acquired line-of-sight information is included in any of the acquired setting ranges of the plurality of parking space candidates. If it is determined that the position falls within one of the setting ranges, the process proceeds to step S44; otherwise, the operation of FIG. 8 ends.
- In step S44, the candidate selection unit 13 selects the parking space candidate determined in step S43 and outputs the selected parking space candidate to the target object determination unit 14. Thereafter, the operation of FIG. 8 ends.
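The selection logic of steps S41 to S44 can be sketched as follows. This is a minimal illustration only: the rectangular representation of a setting range and all names (`SettingRange`, `select_candidate`) are assumptions, not the patent's actual data structures.

```python
from dataclasses import dataclass

@dataclass
class SettingRange:
    """Axis-aligned setting range of one parking space candidate (illustrative units)."""
    x: float
    y: float
    width: float
    height: float

    def contains(self, px, py):
        # True when the point lies inside this candidate's setting range
        return self.x <= px <= self.x + self.width and self.y <= py <= self.y + self.height

def select_candidate(gaze_position, candidate_ranges):
    """Steps S43-S44: return the id of the candidate whose setting range
    contains the driver's line-of-sight position, or None if no range does."""
    px, py = gaze_position
    for candidate_id, rng in candidate_ranges.items():
        if rng.contains(px, py):
            return candidate_id
    return None  # no candidate selected; the operation of FIG. 8 ends
```

The selected candidate id would then be handed to the target object determination unit, mirroring step S44.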
- <Target object determination unit 14> The target object determination unit 14 in FIG. 2 determines (fixes) the parking space candidate selected by the candidate selection unit 13 as the parking space to which the host vehicle should head when the determination button is operated after the selection. The target object determination unit 14 then outputs the determined target object to the automatic driving determination unit 17 (and, in effect, to the automatic driving control unit 15) and the like.
- FIG. 9 is a flowchart showing the operation of the target object determining unit 14 according to the second embodiment.
- In step S51, the target object determination unit 14 determines whether the parking space candidate selected by the candidate selection unit 13 has been received. If it is determined that it has been received, the process proceeds to step S52; otherwise, the operation of FIG. 9 ends.
- Note that the target object determination unit 14 may output the parking space candidate selected by the candidate selection unit 13 to the output control unit 12 so that a display for the driver to confirm whether to determine that parking space candidate is shown on the head-up display.
- In step S52, the target object determination unit 14 determines whether or not the determination button has been operated. If it is determined that it has been operated, the process proceeds to step S53; otherwise, the operation of FIG. 9 ends.
- In step S53, the target object determination unit 14 determines the parking space candidate received in step S51 as the parking space (target object) to which the host vehicle should head, and outputs the determined parking space to the storage unit 16 and the automatic driving determination unit 17. Thereafter, the operation of FIG. 9 ends.
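The button-gated determination of steps S51 to S53 reduces to a small guard function. A sketch under assumed names; the real unit would also write to the storage unit 16 and notify the automatic driving determination unit 17:

```python
def determine_target(received_candidate, decision_button_operated):
    """Steps S51-S53 of FIG. 9: the received candidate becomes the target
    object only if the driver's determination button has been operated;
    otherwise nothing is determined and the operation ends."""
    if received_candidate is None:
        return None  # S51: no candidate received
    if not decision_button_operated:
        return None  # S52: button not operated
    return received_candidate  # S53: determined parking space
```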
- The automatic driving determination unit 17 in FIG. 2 generates movement route information for causing the host vehicle to head to the parking space or the like, based on the surrounding map generated by the surrounding recognition unit 11a.
- The movement route information includes trajectory information indicating the route along which the host vehicle should move from its current position when heading to a parking space or the like, and information in which a target speed, a target acceleration, and a target steering angle during the movement are arranged in time series.
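The movement route information described above could be modelled as a time-series of route points. The field names and units below are assumptions chosen for illustration; the patent does not specify a concrete data layout:

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class RoutePoint:
    """One time-series entry of the movement route information (fields assumed)."""
    time: float                    # seconds from the current moment
    position: Tuple[float, float]  # point on the trajectory in the surrounding map
    target_speed: float            # m/s
    target_accel: float            # m/s^2
    target_steering: float         # target steering angle, rad

@dataclass
class MovementRoute:
    points: List[RoutePoint]

    def trajectory(self):
        """Trajectory information: the ordered positions to move through."""
        return [p.position for p in self.points]
```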
- FIG. 10 is a flowchart showing the operation of the automatic driving determination unit 17 according to the second embodiment.
- step S61 the automatic driving determination unit 17 acquires a surrounding map from the surrounding recognition unit 11a.
- In step S62, the automatic driving determination unit 17 determines whether the parking space determined by the target object determination unit 14 has been received. If it is determined that it has been received, the process proceeds to step S63; otherwise, the process proceeds to step S64.
- step S63 the automatic driving determination unit 17 generates movement route information for heading to the parking space based on the acquired surrounding map and the received parking space. Thereafter, the process proceeds to step S65.
- step S64 the automatic driving determination unit 17 generates movement route information for continuing the course based on the acquired peripheral map. Thereafter, the process proceeds to step S65.
- In step S65, the automatic driving determination unit 17 outputs the generated movement route information to the automatic driving control unit 15. Thereafter, the operation of FIG. 10 ends.
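The branch of steps S62 to S64 is a simple dispatch between two planners. In this sketch the two route builders are injected callables with assumed signatures; only the branching logic mirrors FIG. 10:

```python
def decide_route(surrounding_map, determined_parking_space, plan_to_parking, plan_current_course):
    """Steps S62-S64 of FIG. 10: if a determined parking space has been
    received, generate a route toward it; otherwise keep the current course."""
    if determined_parking_space is not None:
        return plan_to_parking(surrounding_map, determined_parking_space)  # S63
    return plan_current_course(surrounding_map)                            # S64
```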
- The automatic driving control unit 15 in FIG. 2 controls the automatic driving of the host vehicle by outputting a control signal to the vehicle control ECU so that the host vehicle automatically moves toward the parking space (target object) determined by the target object determination unit 14.
- FIG. 11 is a flowchart showing the operation of the automatic operation control unit 15 according to the second embodiment.
- In step S71, the automatic driving control unit 15 acquires the movement route information from the automatic driving determination unit 17.
- In step S72, the automatic driving control unit 15 generates a control signal to be output to the vehicle control ECU based on the acquired movement route information.
- In step S73, the automatic driving control unit 15 outputs the generated control signal to the vehicle control ECU. Thereafter, the operation of FIG. 11 ends.
- As described above, according to the second embodiment, the driver can intuitively select and determine the parking space to which the host vehicle should head from among the parking space candidates.
- Further, if the driving of the host vehicle is fully automatic driving, the above selection and determination can be performed without returning to manual driving by overriding.
- In addition, the driver's line-of-sight position 32 is displayed on the head-up display so as to be identifiable (FIG. 6). Therefore, the driver can easily align his or her line-of-sight position 32 with the setting range 31 of the parking space candidate to be selected.
- FIG. 12 is a block diagram illustrating a configuration of the driving support apparatus according to the present modification.
- the same or similar components as those in the second embodiment are denoted by the same reference numerals, and different components are mainly described.
- In the present modification, the driver's line-of-sight position based on the line-of-sight information (for example, the line-of-sight position on the surrounding map) is stored in the storage unit 16.
- Note that the output control unit 12 may cause the head-up display to show a display indicating that no parking space corresponding to the line-of-sight position has been detected.
- When the determination button is operated, the target object determination unit 14 determines the parking space candidate corresponding to the stored line-of-sight position as the parking space to which the host vehicle should head. At this time, the candidate selection unit 13 may determine whether or not the line-of-sight position stored in the storage unit 16 is included in the setting range of a parking space candidate.
- FIG. 13 is a block diagram showing a configuration of the driving support apparatus according to Embodiment 3 of the present invention.
- the same or similar components as those in the second embodiment are denoted by the same reference numerals, and different components will be mainly described.
- the driving support apparatus 1 (output control unit 12) according to the third embodiment can control the output device.
- the output device includes a plurality of audio output devices (for example, a plurality of speakers) mounted on the host vehicle.
- The output control unit 12 changes the display mode of the setting range of a parking space candidate displayed on the head-up display based on the timing at which the parking space candidate detection unit 11b detected the parking space candidate.
- FIG. 14 shows a display example of the head-up display before the parking space candidate detection unit 11b newly detects the parking space candidate setting range 31a.
- FIG. 15 shows a display example of the head-up display after the parking space candidate setting range 31a is newly detected.
- For example, the output control unit 12 may highlight the setting range 31a of the parking space candidate most recently detected by the parking space candidate detection unit 11b with a display mode (for example, a different display color, the addition of a mark or an icon, an animation, or the like) different from that of the setting ranges 31b of the parking space candidates detected before the most recent one. As a result, the driver's attention can be directed to the most recently detected parking space candidate.
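The state-dependent display mode described above can be sketched as a lookup: the candidate the gaze rests on and the most recently detected candidate are styled differently from ordinary ones. The colour names and dictionary shape are placeholders, not the patent's display specification:

```python
def choose_display_mode(candidate_id, latest_id, selected_id):
    """Pick a head-up-display style for a candidate's setting range."""
    if candidate_id == selected_id:
        return {"color": "yellow", "animated": True}   # gaze is in this range
    if candidate_id == latest_id:
        return {"color": "orange", "animated": True}   # newly detected (range 31a)
    return {"color": "green", "animated": False}       # previously detected (range 31b)
```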
- Further, when the driver's line-of-sight position 32 is included in the setting range of one parking space candidate among the setting ranges of the plurality of parking space candidates detected by the parking space candidate detection unit 11b, the output control unit 12 displays the setting range 31c of that one parking space candidate and the setting ranges 31d of the other parking space candidates on the head-up display so as to be distinguishable. That is, the setting range of the one parking space candidate selected by the candidate selection unit 13 and the setting ranges of the other parking space candidates are displayed so as to be distinguishable.
- Here, it is assumed that the setting range 31c of the one parking space candidate is highlighted with a display mode (for example, a different display color, the addition of a mark or an icon, an animation, or the like) different from that of the setting ranges 31d of the other parking space candidates.
- Note that the determination as to whether or not the driver's line-of-sight position 32 is included in the setting range of the one parking space candidate may be performed by the output control unit 12, or it may be performed by the candidate selection unit 13.
- Note that the output control unit 12 may display the setting range of a parking space candidate before it is determined by the target object determination unit 14 and the setting range of the parking space determined by the target object determination unit 14 so as to be distinguishable. Similarly, the output control unit 12 may display the setting range of a parking space candidate recommended by the driving support device 1 and the setting ranges of the other parking space candidates so as to be distinguishable.
- Further, the output control unit 12 causes the audio output device to output a notification sound when changing the display mode of a parking space candidate displayed on the head-up display.
- Further, the output control unit 12 controls the audio output of the plurality of audio output devices mounted on the host vehicle based on the position of the parking space candidate most recently detected by the parking space candidate detection unit 11b. For example, when the parking space candidate detection unit 11b most recently detects a parking space candidate, the output control unit 12 causes the audio output device located in the direction of that parking space candidate, among the plurality of audio output devices, to output a notification sound at a volume corresponding to the distance to the candidate.
- The sound at that time may use a tone different from that of other notification sounds such as those for obstacle detection. As a result, the driver's attention can be directed to the most recently detected parking space candidate.
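One way to realize this directional notification is sketched below: pick the speaker whose mounting direction is closest to the bearing of the new candidate, and derive a volume that falls off with distance. The vehicle coordinate convention (x forward, y left), the linear volume law, and all names are assumptions for illustration:

```python
import math

def notify_candidate(candidate_position, speakers, max_distance=50.0):
    """Choose the nearest-direction speaker and a distance-dependent volume
    for a newly detected parking space candidate."""
    cx, cy = candidate_position
    bearing = math.atan2(cy, cx)
    distance = math.hypot(cx, cy)

    def angular_gap(a, b):
        # smallest absolute difference between two angles, in [0, pi]
        return abs((a - b + math.pi) % (2 * math.pi) - math.pi)

    nearest = min(speakers, key=lambda s: angular_gap(s["direction"], bearing))
    volume = max(0.0, 1.0 - distance / max_distance)  # louder when the candidate is closer
    return nearest["name"], round(volume, 2)
```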
- FIG. 17 is a flowchart showing the operation of the output control unit 12 according to the third embodiment. Of the steps in FIG. 17, steps similar to those in FIG. 7 are denoted by the same reference numerals, and description thereof is omitted as appropriate.
- In step S81, the output control unit 12 determines the display mode of the setting range of each parking space candidate to be displayed according to the state of the candidate (for example, whether its setting range includes the line-of-sight position, or whether the candidate was detected most recently).
- In step S82, the output control unit 12 determines whether the display mode of a parking space candidate displayed on the head-up display has been changed. If it is determined that it has been changed, the process proceeds to step S83; otherwise, the operation of FIG. 17 ends.
- In step S83, the output control unit 12 causes the audio output device to output a notification sound. Thereafter, the operation of FIG. 17 ends.
- As described above, according to the third embodiment, the display mode of the setting range of a parking space candidate displayed on the head-up display is changed based on the timing at which the parking space candidate was detected. Thereby, for example, the driver's attention can be directed to the most recently detected parking space candidate.
- Further, when the display mode of a parking space candidate displayed on the head-up display is changed, the audio output device is caused to output a notification sound. Thereby, the driver can know that the display mode of the parking space candidate has been changed.
- the sound output of a plurality of sound output devices mounted on the host vehicle is controlled based on the position of the most recently detected parking space candidate.
- the driver can know the position of the most recently detected parking space candidate using sound.
- Note that the output control unit 12 may display, in a pop-up on the head-up display, the setting range of a parking space candidate detected by the parking space candidate detection unit 11b not based on the peripheral information generated by the host vehicle but based on peripheral information obtained from outside the host vehicle (for example, via a network). According to such a configuration, it is possible to make visible the setting range of a parking space candidate that is difficult to see from the driver's viewpoint and cannot be detected based on the peripheral information of the host vehicle.
- FIG. 18 is a block diagram showing a configuration of the driving support apparatus according to Embodiment 4 of the present invention.
- the same or similar components as those in the second embodiment are denoted by the same reference numerals, and different components will be mainly described.
- <Candidate selection unit 13> Operation information of the hold button is input to the driving support device 1 according to the fourth embodiment. When the hold button is operated after a parking space candidate is selected by the candidate selection unit 13, the candidate selection unit 13 stores the selected parking space candidate in the storage unit 16. That is, after a parking space candidate is selected by the candidate selection unit 13, the storage unit 16 stores the parking space candidate selected by the candidate selection unit 13 when the hold button is operated (when a predetermined operation other than the operation of the determination button is performed).
- FIG. 19 is a flowchart showing the operation of the candidate selection unit 13 according to the fourth embodiment. Of the steps in FIG. 19, steps similar to those in FIG. 8 are denoted by the same reference numerals, and description thereof is omitted as appropriate.
- In step S91, the candidate selection unit 13 determines whether or not the hold button has been operated. If it is determined that it has been operated, the process proceeds to step S92; otherwise, the operation of FIG. 19 ends.
- In step S92, the candidate selection unit 13 causes the storage unit 16 to store the parking space candidate selected in step S44.
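The hold-then-decide behaviour of the fourth embodiment (steps S91-S92 of FIG. 19 and S102-S103 of FIG. 20) amounts to a small state machine. All method and attribute names below are illustrative placeholders for the roles of the candidate selection unit, the storage unit 16, and the target object determination unit:

```python
class HoldableSelection:
    """Sketch of the Embodiment-4 flow: the hold button stores the currently
    selected candidate, and a later press of the determination button fixes
    the held candidate as the parking space."""
    def __init__(self):
        self.held_candidate = None
        self.determined_space = None

    def hold(self, selected_candidate):
        # S91-S92: hold button pressed -> store in the storage unit 16
        self.held_candidate = selected_candidate

    def decide(self):
        # S102-S103: determination button pressed on the held candidate
        if self.held_candidate is not None:
            self.determined_space = self.held_candidate
        return self.determined_space
```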
- Note that the driving support device 1 may extract related information, such as a camera image and GPS (Global Positioning System) position information, regarding the parking space candidate from the sensing data, and store the related information in the storage unit 16 together with the parking space candidate that is put on hold.
- The target object determination unit 14 determines the parking space candidate stored in the storage unit 16 by the candidate selection unit 13 as the parking space to which the host vehicle should head when the determination button is operated after the hold button is operated.
- FIG. 20 is a flowchart showing the operation of the target object determining unit 14 according to the fourth embodiment. Of the steps in FIG. 20, steps similar to those in FIG. 9 are denoted by the same reference numerals, and description thereof is omitted as appropriate.
- In step S51, the target object determination unit 14 performs the same operation as in step S51 of FIG. 9.
- In step S101, the target object determination unit 14 determines whether the hold button has been operated. If it is determined that the hold button has been operated, the process proceeds to step S102; otherwise, the process proceeds to step S52 and the same operation as in the second embodiment is performed.
- Note that the output control unit 12 may display the stored (held) parking space candidate on the head-up display or another display. Moreover, when the above-described related information is stored, the output control unit 12 may display the related information as well.
- In step S102, the target object determination unit 14 determines whether or not the determination button has been operated for the stored (held) parking space candidate. At this time, the stored parking space candidate need not be stored again. If it is determined that the button has been operated, the process proceeds to step S103; otherwise, step S102 is repeated until it is determined that the button has been operated.
- In step S103, the target object determination unit 14 determines the parking space candidate stored in the storage unit 16 (the held parking space candidate) as the parking space to which the host vehicle should head, and outputs the determined parking space to the storage unit 16 and the automatic driving determination unit 17. Thereafter, the operation of FIG. 20 ends.
- FIG. 21 is a block diagram showing a configuration of the driving support apparatus according to Embodiment 5 of the present invention.
- the same or similar components as those in the second embodiment are denoted by the same reference numerals, and different components will be mainly described.
- The driving support device 1 in FIG. 21 includes a passenger determination acquisition unit 18 in addition to the components described in the second embodiment.
- The passenger determination acquisition unit 18 acquires the parking space determined by a passenger of the host vehicle or a remote operator.
- In the following, the determination of the parking space by the passenger is described as "the parking space is determined by operating a determination button operable by the passenger, from among the parking space candidates selected based on the passenger's line-of-sight position."
- However, the determination of the parking space by the passenger is not limited to this. For example, it may be "the parking space is determined by a gesture based on the passenger's eye movement, from among the parking space candidates selected based on the passenger's line-of-sight position."
- The determination button operable by the passenger is a determination button different from the determination button operable by the driver described in the first embodiment and the like, and is arranged at a location that the passenger can easily operate. In the following description, the determination button operable by the passenger is referred to as the "determination button for the passenger," and the determination button operable by the driver is referred to as the "determination button for the driver."
- FIG. 22 is a flowchart showing the operation of the passenger determination acquisition unit 18 according to the fifth embodiment.
- In step S111, the passenger determination acquisition unit 18, like the candidate selection unit 13, selects one parking space candidate when it determines that the passenger's line-of-sight position is included in the setting range of that candidate among the setting ranges of the plurality of parking space candidates. At this time, as shown in FIG. 23, the output control unit 12 may display the passenger's line-of-sight position 33 on the head-up display so as to be distinguishable from the driver's line-of-sight position 32. Also, when the passenger's line-of-sight position 33 is included in the setting range 31e of one parking space candidate, the output control unit 12 may display that setting range 31e on the head-up display so as to be identifiable.
- In step S112, when the determination button for the passenger is operated, the passenger determination acquisition unit 18 acquires the parking space candidate selected in step S111 as the parking space of the host vehicle determined by the passenger, similarly to the target object determination unit 14.
- In step S113, the passenger determination acquisition unit 18 outputs the acquired parking space to the target object determination unit 14. Thereafter, the operation of FIG. 22 ends.
- <Target object determination unit 14> When the determination button for the driver is not operated, the parking space acquired by the passenger determination acquisition unit 18 is used as the parking space to be determined by the target object determination unit 14.
- FIG. 24 is a flowchart showing the operation of the target object determining unit 14 according to the fifth embodiment.
- In step S121, the target object determination unit 14 determines whether or not the parking space acquired by the passenger determination acquisition unit 18 has been received. If it is determined that it has been received, the process proceeds to step S122; otherwise, the operation of FIG. 24 ends.
- In step S122, the target object determination unit 14 determines whether or not a parking space is stored in the storage unit 16, that is, whether or not a parking space has already been determined by the operation of the determination button for the driver. If it is determined that a parking space has been determined, the operation of FIG. 24 ends; otherwise, the process proceeds to step S123.
- In step S123, the target object determination unit 14 determines the parking space received in step S121 as the parking space to which the host vehicle should head, and outputs the determined parking space to the storage unit 16 and the automatic driving determination unit 17. Thereafter, the operation of FIG. 24 ends.
- As described above, according to the fifth embodiment, when the determination button for the driver is not operated, the parking space acquired by the passenger determination acquisition unit 18 is used as the parking space to be determined by the target object determination unit 14. Thereby, the burden on the driver can be reduced, so the driver can concentrate on driving, or an appropriate parking space that the driver has not noticed can be determined.
- In the above description, when the parking space has already been determined by the driver, the determination of the parking space by the passenger is invalid; that is, the driver's determination is given priority over the passenger's determination.
- However, the present invention is not limited to this, and the passenger's determination may be given priority over the driver's determination.
- Alternatively, priorities may be set for the occupants (the driver and the passengers), and based on the priorities, the determination of the parking space by either the driver or a passenger may be given priority over the other.
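The priority scheme above can be sketched as a simple resolution function: scan the occupants from highest to lowest priority and take the first decision found. The dictionary-and-list representation is an assumption for illustration:

```python
def resolve_parking_decision(decisions, priority_order):
    """Return (occupant, parking space) for the highest-priority occupant who
    made a decision; `decisions` maps occupant name to a parking space or
    None, and `priority_order` lists occupants from highest priority down."""
    for occupant in priority_order:
        space = decisions.get(occupant)
        if space is not None:
            return occupant, space
    return None, None  # nobody has decided yet
```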
- In this case, the user ID determined based on the position of the passenger is used in the passenger determination acquisition unit 18, the storage unit 16, the target object determination unit 14, and the like, together with the parking space determined by the passenger.
- FIG. 25 is a block diagram showing a configuration of the driving support apparatus according to Embodiment 6 of the present invention.
- the same or similar components as those in the second embodiment are denoted by the same reference numerals, and different components are mainly described.
- In the second embodiment, the parking space is determined by operating the determination button.
- In contrast, in the sixth embodiment, the parking space is determined by a gesture of the driver.
- Specifically, after the candidate selection unit 13 selects a parking space candidate, the target object determination unit 14 determines, based on the line-of-sight information from the line-of-sight detection device, whether or not a predetermined gesture based on the driver's eye movement has been performed. When the target object determination unit 14 determines that the gesture has been performed, it determines the parking space candidate selected by the candidate selection unit 13 as the parking space to which the host vehicle should head.
- The predetermined gesture described above may be, for example, a gesture based on an eye movement that is not performed during normal driving unless intended, such as gazing for a certain period of time or blinking a certain number of times while maintaining the line-of-sight position.
- For example, the target object determination unit 14 may determine that gazing has been performed when the line-of-sight position based on the line-of-sight information remains within a predetermined range for a predetermined time or more, and otherwise determine that gazing has not been performed.
- Similarly, the target object determination unit 14 may determine that blinking has been performed when the line-of-sight position is detected intermittently within a predetermined range for a predetermined time or more, and otherwise determine that blinking has not been performed.
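The gazing criterion above (line-of-sight position within a predetermined range for a predetermined time) can be sketched as follows. The sample format `(t, x, y)` and the threshold values are illustrative assumptions:

```python
def is_gazing(samples, radius=0.05, min_duration=1.0):
    """Return True when the line-of-sight position stayed within `radius` of
    its first sample for at least `min_duration` seconds; `samples` is a
    time-ordered list of (t, x, y) tuples."""
    if not samples:
        return False
    t0, x0, y0 = samples[0]
    for t, x, y in samples:
        if (x - x0) ** 2 + (y - y0) ** 2 > radius ** 2:
            return False  # the gaze left the range: no fixation
    return samples[-1][0] - t0 >= min_duration
```

Blink detection would follow the same pattern, but on the intermittent presence of the gaze samples rather than their spread.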
- the configuration described before the sixth embodiment may also be applied as appropriate in the sixth embodiment.
- Note that when an eye gesture for holding (for example, two consecutive blinks) different from the eye gesture for determining a parking space (for example, three consecutive blinks) is performed, the parking space candidate selected by the candidate selection unit 13 may be stored in the storage unit 16.
- FIG. 26 is a block diagram showing a configuration of the driving support apparatus according to Embodiment 7 of the present invention.
- the same or similar components as those in the sixth embodiment are denoted by the same reference numerals, and different components will be mainly described.
- In the sixth embodiment, a parking space candidate is selected according to the driver's line-of-sight position.
- In contrast, in the seventh embodiment, a parking space candidate is selected according to the position pointed to by the driver. That is, in the seventh embodiment, a parking space candidate is selected based on the driver's pointing position, and the parking space is determined by a gesture based on the driver's eye movement.
- the driving support apparatus 1 acquires pointing information regarding the pointing of the driver of the host vehicle from the pointing detection device.
- For example, a device such as that used for Leap Motion (registered trademark) or Kinect (registered trademark) is applicable as the pointing detection device.
- The candidate selection unit 13 selects one parking space candidate when it determines, based on the pointing information from the pointing detection device, that the driver's pointing position is included in the setting range of that candidate among the setting ranges of the plurality of parking space candidates.
- the configuration described before the sixth embodiment may be applied as appropriate.
- Note that when an eye gesture for holding (for example, three consecutive blinks) different from the eye gesture for determining a parking space (for example, two consecutive blinks) is performed, the parking space candidate selected by the candidate selection unit 13 may be stored in the storage unit 16.
- Note that any one of the functions described above may be selectively executed by a selection operation of the user. According to such a configuration, the user can determine the parking space by a preferred operation or by an operation that imposes a small burden on the user.
- FIG. 27 is a block diagram showing a configuration of the driving support apparatus according to Embodiment 8 of the present invention.
- the same or similar components as those in the second embodiment are denoted by the same reference numerals, and different components will be mainly described.
- The driving support device 1 according to the eighth embodiment can control the automatic driving of the host vehicle so that the host vehicle heads toward another vehicle that the host vehicle should follow (the target object to which the host vehicle should head).
- In the following description, the other vehicle that the host vehicle should follow is referred to as the "following vehicle."
- Here, a vehicle ahead of the host vehicle is used as the following vehicle.
- the following vehicle candidate detection unit 11c is the same as the parking space candidate detection unit 11b described so far, in which the detection target is changed from a parking space to a following vehicle.
- Other components such as the candidate selection unit 13 are the same as the components described so far except that the target is changed from the parking space to the following vehicle.
- FIG. 28 is a diagram showing a display example of the head-up display according to the eighth embodiment.
- the setting range 34 of the following vehicle candidate is displayed so as to be identifiable by a broken line.
- When the driver's line-of-sight position 32 is included in one setting range 34 among the setting ranges 34 of the plurality of following vehicle candidates (target object candidates), the candidate selection unit 13 selects that following vehicle candidate. Then, when the determination button is operated after the candidate selection unit 13 selects the following vehicle candidate, the target object determination unit 14 determines the following vehicle candidate selected by the candidate selection unit 13 as the following vehicle to which the host vehicle should head.
- As described above, according to the eighth embodiment, the driver can intuitively select and determine the following vehicle to which the host vehicle should head from among the following vehicle candidates. Further, if the automatic driving of the host vehicle is fully automatic driving, the above selection and determination can be performed without returning to manual driving by overriding, without changing lanes, and without diverting the line of sight from the road ahead.
- The candidate acquisition unit 11, the output control unit 12, the candidate selection unit 13, and the target object determination unit 14 (hereinafter referred to as "the candidate acquisition unit 11 and the like") in the driving support device 1 described above are realized by the processing circuit 91. That is, the processing circuit 91 includes: the candidate acquisition unit 11 that acquires target object candidates, which are candidates for the target object to which the vehicle should head and which are detected based on the peripheral information about the surroundings of the vehicle; the candidate selection unit 13 that selects one of the target object candidates acquired by the candidate acquisition unit 11; and the target object determination unit 14 that, when the determination button operable by the driver is operated after the candidate selection unit 13 selects the target object candidate, determines the target object candidate selected by the candidate selection unit 13 as the target object and outputs the determined target object to the automatic driving control unit.
- Dedicated hardware may be applied as the processing circuit 91, or a processor (also referred to as a CPU, a central processing unit, a processing unit, an arithmetic unit, a microprocessor, a microcomputer, or a digital signal processor) that executes a program stored in a memory may be applied.
- When the processing circuit 91 is dedicated hardware, the processing circuit 91 corresponds to, for example, a single circuit, a composite circuit, a programmed processor, a parallel programmed processor, an ASIC, an FPGA, or a combination thereof.
- Each function of each unit such as the candidate acquisition unit 11 may be realized by a plurality of processing circuits 91, or the functions of each unit may be realized by a single processing circuit 91.
- When the processing circuit 91 is a processor, the functions of the candidate acquisition unit 11 and the like are realized in combination with software or the like (software, firmware, or software and firmware). The software or the like is described as a program and stored in a memory. As shown in FIG. 30, the processor 92 applied as the processing circuit 91 realizes the functions of the respective units by reading out and executing the program stored in the memory 93.
- The memory 93 corresponds to, for example, a nonvolatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), or an EEPROM (Electrically Erasable Programmable Read Only Memory), or to an HDD (Hard Disk Drive), a magnetic disk, a flexible disk, an optical disc, a compact disc, a mini disc, a DVD (Digital Versatile Disc), or a drive device thereof.
- In the above, each function of the candidate acquisition unit 11 etc. is realized by either hardware or software or the like. However, the configuration is not limited to this; part of the candidate acquisition unit 11 etc. may be realized by dedicated hardware and another part by software or the like. In this way, the processing circuit 91 can realize each of the functions described above by hardware, by software or the like, or by a combination thereof.
- Although the storage unit 16 is configured from the memory 93, the storage unit 16 and the memory storing the programs may be configured from a single memory 93, or each may be configured from a separate memory 93.
- The driving support device described above can be applied not only to a navigation device that can be mounted on a vehicle, but also to a Portable Navigation Device, a communication terminal (for example, a mobile terminal such as a mobile phone, a smartphone, or a tablet), the functions of applications installed in these devices, a server, and so on. It can also be applied to a driving support system constructed as a system by appropriately combining these.
- In this case, each function or each component of the driving support device described above, or each function or each component of the peripheral devices, may be distributed among the devices that construct the system, or may be arranged in a concentrated manner in one of the devices.
- Within the scope of the invention, the embodiments and modifications can be freely combined, and each embodiment and each modification can be appropriately modified or omitted.
Abstract
Description
The driving support device according to Embodiment 1 of the present invention is a device that supports the driving of a vehicle capable of automatic driving. FIG. 1 is a block diagram showing the configuration of the driving support device according to Embodiment 1 of the present invention. Hereinafter, a vehicle that is capable of automatic driving and whose driving is supported by the driving support device 1 will be referred to as the "host vehicle".
According to the driving support device 1 of Embodiment 1 described above, when it is determined that the driver's gaze position is included in the setting range of one target object candidate among the setting ranges of the plurality of target object candidates, that candidate is selected; when the decision button is subsequently operated, the selected target object candidate is determined as the target object toward which the host vehicle should head. This allows the driver to intuitively select and determine the target object from among the target object candidates.
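The gaze-then-confirm interaction described above can be sketched in a few lines of code. This is an illustrative simplification, not the patent's implementation: setting ranges are modeled as axis-aligned rectangles in display coordinates, and all class, field, and method names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    """A target object candidate with its setting range, modeled here as an
    axis-aligned rectangle in display coordinates (illustrative only)."""
    candidate_id: int
    x_min: float
    y_min: float
    x_max: float
    y_max: float

    def contains(self, gaze_x: float, gaze_y: float) -> bool:
        # True when the gaze position falls inside this setting range.
        return (self.x_min <= gaze_x <= self.x_max
                and self.y_min <= gaze_y <= self.y_max)

class GazeSelector:
    """Selects the candidate whose setting range contains the gaze position;
    a later decision-button press confirms the current selection."""

    def __init__(self, candidates):
        self.candidates = candidates
        self.selected = None

    def update_gaze(self, gaze_x, gaze_y):
        for c in self.candidates:
            if c.contains(gaze_x, gaze_y):
                self.selected = c
                return c
        # Gaze is outside all setting ranges: keep the previous selection.
        return self.selected

    def press_decision_button(self):
        # Returns the determined target object, or None if nothing was selected.
        return self.selected
```

In this sketch the selection persists while the gaze wanders away from all candidates, so the driver can glance back at the road before pressing the decision button.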
FIG. 2 is a block diagram showing the configuration of the driving support device according to Embodiment 2 of the present invention. In the following description of the driving support device 1 according to Embodiment 2, components that are the same as or similar to those of Embodiment 1 are given the same reference signs, and the description mainly covers the differing components.
The periphery recognition unit 11a in FIG. 2 generates a peripheral map on the basis of sensing data (peripheral information about the surroundings of the host vehicle) acquired from in-vehicle sensors. Here, the peripheral map is a map in which the various shapes around the host vehicle are modeled in three-dimensional space. For example, the periphery recognition unit 11a combines white-line information and obstacle information (such as other vehicles) detected from the captured video of a front camera (one kind of in-vehicle sensor) with the distances between the host vehicle and obstacles measured accurately by a sonar sensor (another kind of in-vehicle sensor), and generates a peripheral map modeling the shapes of the white lines and obstacles in the traveling direction of the host vehicle.
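As a hedged illustration of this kind of camera-sonar fusion (a deliberately simplified sensor model, not the patent's actual algorithm), one can pair a bearing estimated from the camera image with the range measured by sonar to place each obstacle in the vehicle frame. All function names and the pairing scheme are assumptions made for this sketch.

```python
import math

def fuse_obstacle(bearing_deg, sonar_distance_m):
    """Combine an obstacle bearing (degrees, estimated from the front camera)
    with a sonar range measurement into a 2-D position in the vehicle frame
    (x forward, y left). Simplified illustration only."""
    rad = math.radians(bearing_deg)
    return (sonar_distance_m * math.cos(rad), sonar_distance_m * math.sin(rad))

def build_peripheral_map(camera_obstacles, sonar_distances):
    """Pair each camera detection (label, bearing) with its sonar range
    reading; the resulting list of (label, (x, y)) entries stands in for
    the modeled peripheral map."""
    return [(label, fuse_obstacle(bearing, dist))
            for (label, bearing), dist in zip(camera_obstacles, sonar_distances)]
```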
The parking space candidate detection unit 11b in FIG. 2 detects parking spaces in which the host vehicle can park as parking space candidates, on the basis of the peripheral map generated by the periphery recognition unit 11a. Here, a parking space candidate is a candidate for the parking space toward which the host vehicle should head, and detecting a parking space candidate includes detecting its position and range. Note that the candidate acquisition unit 11 acquiring target object candidates such as parking space candidates and the parking space candidate detection unit 11b detecting target object candidates such as parking space candidates are substantially the same.
The storage unit 16 in FIG. 2 stores the setting ranges of the parking space candidates detected by the parking space candidate detection unit 11b, and the parking space determined by the target object determination unit 14.
FIG. 6 is a diagram showing a display example of the head-up display according to Embodiment 2.
The candidate selection unit 13 in FIG. 2 selects one parking space candidate when it determines, on the basis of the line-of-sight information acquired from the line-of-sight detection device, that the driver's gaze position is included in the setting range of that one candidate among the setting ranges of the plurality of parking space candidates (target object candidates).
When the decision button is operated after a parking space candidate has been selected by the candidate selection unit 13, the target object determination unit 14 in FIG. 2 determines (confirms) the parking space candidate selected by the candidate selection unit 13 as the parking space toward which the host vehicle should head. The target object determination unit 14 then outputs the determined target object to the automatic driving judgment unit 17 (and substantially to the automatic driving control unit 15) and the like.
When a parking space has been determined by the target object determination unit 14, the automatic driving judgment unit 17 in FIG. 2 generates movement route information for the host vehicle to head toward that parking space or the like, on the basis of the peripheral map generated by the periphery recognition unit 11a. The movement route information includes trajectory information indicating the route along which the host vehicle should move from its current position when heading toward the parking space or the like, and information in which the target speed, target acceleration, and target steering angle during that movement are arranged in time series.
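A minimal sketch of such movement route information as a data structure, assuming illustrative field names not taken from the patent: each time-series entry carries a trajectory point together with the target speed, acceleration, and steering angle at that instant.

```python
from dataclasses import dataclass

@dataclass
class RoutePoint:
    """One time-series entry of the movement route information.
    Field names are illustrative, not from the patent."""
    t: float              # seconds from the start of the maneuver
    x: float              # trajectory position in the vehicle frame [m]
    y: float
    target_speed: float   # [m/s]
    target_accel: float   # [m/s^2]
    target_steer: float   # steering angle [rad]

def make_straight_route(distance_m, speed_mps, steps):
    """Build a trivial straight-line route toward a parking space,
    sampled at uniform time steps (constant speed, zero steering)."""
    dt = (distance_m / speed_mps) / steps
    return [RoutePoint(t=i * dt, x=i * dt * speed_mps, y=0.0,
                       target_speed=speed_mps, target_accel=0.0,
                       target_steer=0.0)
            for i in range(steps + 1)]
```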
The automatic driving control unit 15 in FIG. 2 controls the automatic driving of the host vehicle so that the host vehicle heads toward the parking space (target object) determined by the target object determination unit 14, by outputting control signals to the vehicle control ECUs.
According to Embodiment 2 described above, as in Embodiment 1, the driver can intuitively select and determine, from among the parking space candidates, the parking space toward which the host vehicle should head. When the automatic driving of the host vehicle is fully automatic driving, a shortening of the automatic parking can also be expected.
FIG. 12 is a block diagram showing the configuration of the driving support device according to this modification. In the following description of the driving support device 1 according to this modification, components that are the same as or similar to those of Embodiment 2 are given the same reference signs, and the description mainly covers the differing components.

FIG. 13 is a block diagram showing the configuration of the driving support device according to Embodiment 3 of the present invention. In the following description of the driving support device 1 according to Embodiment 3, components that are the same as or similar to those of Embodiment 2 are given the same reference signs, and the description mainly covers the differing components.
In Embodiment 3, the display mode of the setting ranges of the parking space candidates displayed by the head-up display is changed on the basis of the timing at which the parking space candidates were detected. This makes it possible, for example, to direct the driver's attention to the most recently detected parking space candidate.
The output control unit 12 may cause the head-up display to display, as a pop-up, the setting range of a parking space candidate that was not detected by the parking space candidate detection unit 11b on the basis of peripheral information generated by the host vehicle, but was detected by the parking space candidate detection unit 11b on the basis of peripheral information generated outside the host vehicle (for example, over a network). With such a configuration, the setting range of a parking space candidate that is hard to see from the driver's viewpoint and cannot be detected from the host vehicle's own peripheral information can be made easy to see.
FIG. 18 is a block diagram showing the configuration of the driving support device according to Embodiment 4 of the present invention. In the following description of the driving support device 1 according to Embodiment 4, components that are the same as or similar to those of Embodiment 2 are given the same reference signs, and the description mainly covers the differing components.
Operation information of a hold button is input to the driving support device 1 according to Embodiment 4. When the hold button is operated after a parking space candidate has been selected by the candidate selection unit 13, the candidate selection unit 13 causes the storage unit 16 to store the selected parking space candidate. In other words, when the hold button is operated after a parking space candidate has been selected by the candidate selection unit 13 (when a predetermined operation other than the operation of the decision button is performed), the storage unit 16 stores the parking space candidate selected by the candidate selection unit 13.
When the decision button is operated after the hold button operation, the target object determination unit 14 determines the parking space candidate stored in the storage unit 16 by the candidate selection unit 13 as the parking space toward which the host vehicle should head.
According to Embodiment 4 described above, when a predetermined operation other than the operation of the decision button is performed, the parking space candidate selected by the candidate selection unit 13 is stored in the storage unit 16; when the decision button is operated after the predetermined operation, the parking space candidate stored in the storage unit 16 is determined as the parking space toward which the host vehicle should head. This makes it possible to defer the determination of the parking space toward which the host vehicle should head, realizing a driving support device 1 that is easy for the driver to use.
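The select/hold/decide flow above can be sketched as a small state holder. This is a simplified illustration with hypothetical names; in particular, the assumption that a held candidate takes precedence over a later gaze selection follows the description of Embodiment 4 but the exact precedence rule is my reading.

```python
class TargetDecider:
    """Sketch of the Embodiment 4 flow: gaze selects a candidate, the hold
    button stores it, and a later decision-button press confirms the stored
    candidate. Method and attribute names are illustrative."""

    def __init__(self):
        self.selected = None   # candidate currently selected by gaze
        self.stored = None     # candidate held in the storage unit

    def select(self, candidate):
        self.selected = candidate

    def press_hold_button(self):
        # Defer the decision: store the selection without confirming it.
        if self.selected is not None:
            self.stored = self.selected

    def press_decision_button(self):
        # After a hold, the stored candidate becomes the target;
        # otherwise the current selection is confirmed directly.
        return self.stored if self.stored is not None else self.selected
```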
FIG. 21 is a block diagram showing the configuration of the driving support device according to Embodiment 5 of the present invention. In the following description of the driving support device 1 according to Embodiment 5, components that are the same as or similar to those of Embodiment 2 are given the same reference signs, and the description mainly covers the differing components.
The passenger decision acquisition unit 18 acquires a parking space determined by a passenger of the host vehicle or by a remote operator.
In Embodiment 5, when the driver's decision button is not operated, the parking space acquired by the passenger decision acquisition unit 18 is used as the parking space to be determined by the target object determination unit 14.
According to Embodiment 5 described above, when the decision button is not operated by the driver, the parking space acquired by the passenger decision acquisition unit 18 is used as the parking space to be determined by the target object determination unit 14. This reduces the driver's burden, so the driver can concentrate on driving, and an appropriate parking space that the driver did not notice can be determined.
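The fallback rule of Embodiment 5 reduces to a one-line precedence check; here it is as a plain function with illustrative names, assuming `None` represents "the driver did not operate the decision button":

```python
def resolve_target(driver_decision, passenger_decision):
    """Embodiment 5 fallback, sketched: the driver's decision (made via the
    decision button) takes precedence; if the driver did not decide, the
    parking space chosen by a passenger or remote operator is used."""
    if driver_decision is not None:
        return driver_decision
    return passenger_decision
```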
FIG. 25 is a block diagram showing the configuration of the driving support device according to Embodiment 6 of the present invention. In the following description of the driving support device 1 according to Embodiment 6, components that are the same as or similar to those of Embodiment 2 are given the same reference signs, and the description mainly covers the differing components.

According to Embodiment 6 described above, as in Embodiment 2, the driver can intuitively select and determine, from among the parking space candidates, the parking space toward which the host vehicle should head.

FIG. 26 is a block diagram showing the configuration of the driving support device according to Embodiment 7 of the present invention. In the following description of the driving support device 1 according to Embodiment 7, components that are the same as or similar to those of Embodiment 6 are given the same reference signs, and the description mainly covers the differing components.

According to Embodiment 7 described above, as in Embodiment 2, the driver can intuitively select and determine, from among the parking space candidates, the parking space toward which the host vehicle should head.

FIG. 27 is a block diagram showing the configuration of the driving support device according to Embodiment 8 of the present invention. In the following description of the driving support device 1 according to Embodiment 8, components that are the same as or similar to those of Embodiment 2 are given the same reference signs, and the description mainly covers the differing components.

According to Embodiment 8 described above, the driver can intuitively select and determine, from among the following-vehicle candidates, the vehicle that the host vehicle should follow. When the automatic driving of the host vehicle is fully automatic driving, this selection and determination can be performed without temporarily returning to manual driving by an override, without changing lanes, and without diverting the line of sight from ahead.
The candidate acquisition unit 11, the output control unit 12, the candidate selection unit 13, and the target object determination unit 14 in the driving support device 1 described above (hereinafter referred to as "the candidate acquisition unit 11 etc.") are realized by the processing circuit 91 shown in FIG. 29. That is, the processing circuit 91 includes: the candidate acquisition unit 11, which acquires target object candidates, i.e. candidates for the target object toward which the vehicle should head, detected on the basis of peripheral information about the surroundings of the vehicle; the output control unit 12, which causes a head-up display mounted on the vehicle to display the setting range set for each of the plurality of target object candidates acquired by the candidate acquisition unit 11; the candidate selection unit 13, which selects one target object candidate when determining, on the basis of line-of-sight information about the line of sight of the driver of the vehicle, that the driver's gaze position is included in the setting range of that one candidate among the setting ranges of the plurality of target object candidates; and the target object determination unit 14, which, when a decision button operable by the driver is operated after a target object candidate has been selected by the candidate selection unit 13, determines the selected target object candidate as the target object and outputs the determined target object to the automatic driving control unit. Dedicated hardware may be applied to the processing circuit 91, or a processor that executes programs stored in a memory (a CPU: Central Processing Unit, also referred to as a processing device, arithmetic device, microprocessor, microcomputer, or Digital Signal Processor) may be applied.
Claims (13)
- A driving support device that supports driving of a vehicle capable of automatic driving, comprising: a candidate acquisition unit that acquires target object candidates, which are candidates for a target object toward which the vehicle should head, detected on the basis of peripheral information about the surroundings of the vehicle; an output control unit that causes a head-up display mounted on the vehicle to display a setting range set for each of a plurality of the target object candidates acquired by the candidate acquisition unit; a candidate selection unit that selects one of the target object candidates when determining, on the basis of line-of-sight information about the line of sight of a driver of the vehicle, that a gaze position of the driver is included in the setting range of the one target object candidate among the setting ranges of the plurality of target object candidates; and a target object determination unit that determines the target object candidate selected by the candidate selection unit as the target object when a decision button operable by the driver is operated after the target object candidate has been selected by the candidate selection unit, wherein the target object determination unit outputs the determined target object to an automatic driving control unit capable of controlling the automatic driving of the vehicle so that the vehicle heads toward the determined target object.
- The driving support device according to claim 1, further comprising a storage unit that stores the gaze position of the driver based on the line-of-sight information when the decision button is operated before selection of a target object candidate by the candidate selection unit, wherein, when the gaze position stored in the storage unit is included in the setting range of a target object candidate acquired by the candidate acquisition unit after the operation of the decision button, that target object candidate is used as the target object to be determined by the target object determination unit.
- The driving support device according to claim 1, wherein the output control unit causes the head-up display to display the gaze position of the driver in an identifiable manner.
- The driving support device according to claim 1, wherein, when the gaze position of the driver is included in the setting range of one of the plurality of target object candidates acquired by the candidate acquisition unit, the output control unit causes the head-up display to display the setting range of the one target object candidate and the setting ranges of the other target object candidates so as to be distinguishable from each other.
- The driving support device according to claim 1, wherein the output control unit changes a display mode of the setting range of a target object candidate displayed by the head-up display on the basis of the timing at which the candidate acquisition unit acquired the target object candidate.
- The driving support device according to claim 1, wherein the output control unit causes an audio output device mounted on the vehicle to output audio when changing the display mode of a target object candidate displayed by the head-up display.
- The driving support device according to claim 1, wherein the output control unit controls audio output of a plurality of audio output devices mounted on the vehicle on the basis of the position of the target object candidate most recently acquired by the candidate acquisition unit.
- The driving support device according to claim 1, wherein the output control unit causes the head-up display to display, as a pop-up, the setting range of a target object candidate that was not detected on the basis of the peripheral information generated by the vehicle but was detected on the basis of the peripheral information generated outside the vehicle.
- The driving support device according to claim 1, further comprising a storage unit that stores the target object candidate selected by the candidate selection unit when a predetermined operation other than an operation of the decision button is performed after the target object candidate has been selected by the candidate selection unit, wherein the target object determination unit determines the target object candidate stored in the storage unit as the target object when the decision button is operated after the predetermined operation.
- The driving support device according to claim 1, further comprising a passenger decision acquisition unit that acquires a target object determined by a passenger of the vehicle or a remote operator, wherein, when the decision button is not operated, the target object acquired by the passenger decision acquisition unit is used as the target object to be determined by the target object determination unit.
- The driving support device according to claim 1, wherein the target object includes a parking space in which the vehicle should park, or another vehicle that the vehicle should follow.
- A driving support device that supports driving of a vehicle capable of automatic driving, comprising: a candidate acquisition unit that acquires target object candidates, which are candidates for a target object toward which the vehicle should head, detected on the basis of peripheral information about the surroundings of the vehicle; an output control unit that causes a head-up display mounted on the vehicle to display a setting range set for each of a plurality of the target object candidates acquired by the candidate acquisition unit; a candidate selection unit that selects one of the target object candidates when determining, on the basis of line-of-sight information about the line of sight of a driver of the vehicle, that a gaze position of the driver is included in the setting range of the one target object candidate among the setting ranges of the plurality of target object candidates; and a target object determination unit that determines the target object candidate selected by the candidate selection unit as the target object when determining, on the basis of the line-of-sight information, that a predetermined gesture by eye movement of the driver has been performed after the target object candidate has been selected by the candidate selection unit, wherein the target object determination unit outputs the determined target object to an automatic driving control unit capable of controlling the automatic driving of the vehicle so that the vehicle heads toward the determined target object.
- A driving support device that supports driving of a vehicle capable of automatic driving, comprising: a candidate acquisition unit that acquires target object candidates, which are candidates for a target object toward which the vehicle should head, detected on the basis of peripheral information about the surroundings of the vehicle; an output control unit that causes a head-up display mounted on the vehicle to display a setting range set for each of a plurality of the target object candidates acquired by the candidate acquisition unit; a candidate selection unit that selects one of the target object candidates when determining, on the basis of pointing information about pointing by the driver of the vehicle, that a pointing position of the driver is included in the setting range of the one target object candidate among the setting ranges of the plurality of target object candidates; and a target object determination unit that determines the target object candidate selected by the candidate selection unit as the target object when determining, on the basis of line-of-sight information about the line of sight of the driver of the vehicle, that a predetermined gesture by eye movement of the driver has been performed after the target object candidate has been selected by the candidate selection unit, wherein the target object determination unit outputs the determined target object to an automatic driving control unit capable of controlling the automatic driving of the vehicle so that the vehicle heads toward the determined target object.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE112015007066.4T DE112015007066T5 (de) | 2015-10-30 | 2015-10-30 | Fahrunterstützungsvorrichtung |
US15/754,852 US10618528B2 (en) | 2015-10-30 | 2015-10-30 | Driving assistance apparatus |
JP2017547318A JP6456516B2 (ja) | 2015-10-30 | 2015-10-30 | 運転支援装置 |
PCT/JP2015/080750 WO2017072956A1 (ja) | 2015-10-30 | 2015-10-30 | 運転支援装置 |
CN201580084135.3A CN108349503B (zh) | 2015-10-30 | 2015-10-30 | 驾驶辅助装置 |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2015/080750 WO2017072956A1 (ja) | 2015-10-30 | 2015-10-30 | 運転支援装置 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017072956A1 (ja) | 2017-05-04 |
Family
ID=58629987
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/080750 WO2017072956A1 (ja) | 2015-10-30 | 2015-10-30 | 運転支援装置 |
Country Status (5)
Country | Link |
---|---|
US (1) | US10618528B2 (ja) |
JP (1) | JP6456516B2 (ja) |
CN (1) | CN108349503B (ja) |
DE (1) | DE112015007066T5 (ja) |
WO (1) | WO2017072956A1 (ja) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017140890A (ja) * | 2016-02-09 | 2017-08-17 | ソニー株式会社 | 情報処理装置、情報処理方法、及び、プログラム |
DE112016007237T5 (de) * | 2016-10-19 | 2019-06-27 | Ford Motor Company | Ein system und verfahren zum identifizieren von unbelegten parkpositionen |
JP2020165692A (ja) * | 2019-03-28 | 2020-10-08 | 本田技研工業株式会社 | 制御装置、制御方法およびプログラム |
CN112319496A (zh) * | 2019-08-01 | 2021-02-05 | 北京小马智行科技有限公司 | 自动驾驶车辆的故障处理方法、装置及存储介质 |
CN113173179B (zh) * | 2021-06-09 | 2023-02-21 | 中国第一汽车股份有限公司 | 一种驾驶模式切换提示方法、装置及车辆 |
JP2023051132A (ja) * | 2021-09-30 | 2023-04-11 | トヨタ自動車株式会社 | 運転支援システム、運転支援方法、運転支援プログラム |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH1086698A (ja) * | 1996-09-12 | 1998-04-07 | Hitachi Ltd | 自動車の走行制御装置 |
JP2007099199A (ja) * | 2005-10-07 | 2007-04-19 | Denso Corp | 画面移動型表示装置 |
WO2007058325A1 (ja) * | 2005-11-17 | 2007-05-24 | Aisin Seiki Kabushiki Kaisha | 駐車支援装置及び駐車支援方法 |
WO2014181543A1 (ja) * | 2013-05-09 | 2014-11-13 | 株式会社デンソー | 視線入力装置 |
Family Cites Families (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006172215A (ja) * | 2004-12-16 | 2006-06-29 | Fuji Photo Film Co Ltd | 運転支援システム |
EP2143611B1 (en) | 2007-05-02 | 2013-04-10 | Toyota Jidosha Kabushiki Kaisha | Vehicle behavior controller |
CN102933429B (zh) | 2010-06-09 | 2015-04-29 | 日产自动车株式会社 | 停车模式选择装置以及停车模式选择方法 |
US20120224060A1 (en) * | 2011-02-10 | 2012-09-06 | Integrated Night Vision Systems Inc. | Reducing Driver Distraction Using a Heads-Up Display |
US10339711B2 (en) * | 2013-03-15 | 2019-07-02 | Honda Motor Co., Ltd. | System and method for providing augmented reality based directions based on verbal and gestural cues |
US9285587B2 (en) * | 2013-03-15 | 2016-03-15 | Inrix, Inc. | Window-oriented displays for travel user interfaces |
US20160054795A1 (en) * | 2013-05-29 | 2016-02-25 | Mitsubishi Electric Corporation | Information display device |
EP3031388B1 (en) * | 2013-08-08 | 2020-01-15 | Panasonic Intellectual Property Management Co., Ltd. | Visual field-calculating unit and visual field-calculating method |
US9354073B2 (en) * | 2013-12-09 | 2016-05-31 | Harman International Industries, Inc. | Eye gaze enabled navigation system |
WO2015094371A1 (en) * | 2013-12-20 | 2015-06-25 | Intel Corporation | Systems and methods for augmented reality in a head-up display |
US9639968B2 (en) * | 2014-02-18 | 2017-05-02 | Harman International Industries, Inc. | Generating an augmented view of a location of interest |
JP6481846B2 (ja) * | 2014-03-27 | 2019-03-13 | 日本精機株式会社 | 車両用警報装置 |
TWI522257B (zh) * | 2014-07-09 | 2016-02-21 | 原相科技股份有限公司 | 車用安全系統及其運作方法 |
JP6149824B2 (ja) * | 2014-08-22 | 2017-06-21 | トヨタ自動車株式会社 | 車載装置、車載装置の制御方法及び車載装置の制御プログラム |
DE112015004431B4 (de) * | 2014-09-29 | 2021-07-29 | Yazaki Corporation | Fahrzeuganzeigevorrichtung |
US9690104B2 (en) * | 2014-12-08 | 2017-06-27 | Hyundai Motor Company | Augmented reality HUD display method and device for vehicle |
JP6372402B2 (ja) * | 2015-03-16 | 2018-08-15 | 株式会社デンソー | 画像生成装置 |
EP3072710B1 (en) * | 2015-03-24 | 2018-03-28 | LG Electronics Inc. | Vehicle, mobile terminal and method for controlling the same |
TWI578021B (zh) * | 2015-08-19 | 2017-04-11 | 國立臺北科技大學 | 擴增實境互動系統及其動態資訊互動顯示方法 |
- 2015-10-30 DE: DE112015007066.4T patent/DE112015007066T5/de not_active Withdrawn
- 2015-10-30 CN: CN201580084135.3A patent/CN108349503B/zh active Active
- 2015-10-30 US: US15/754,852 patent/US10618528B2/en active Active
- 2015-10-30 WO: PCT/JP2015/080750 patent/WO2017072956A1/ja active Application Filing
- 2015-10-30 JP: JP2017547318A patent/JP6456516B2/ja active Active
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2019031107A1 (ja) * | 2017-08-07 | 2019-02-14 | 日立オートモティブシステムズ株式会社 | 駐車制御装置及び方法 |
JPWO2019031107A1 (ja) * | 2017-08-07 | 2020-04-09 | 日立オートモティブシステムズ株式会社 | 駐車制御装置及び方法 |
US11372415B2 (en) | 2018-06-12 | 2022-06-28 | Yazaki Corporation | Vehicle control system |
JP2019215214A (ja) * | 2018-06-12 | 2019-12-19 | 矢崎総業株式会社 | 車両制御システム |
WO2019239629A1 (ja) * | 2018-06-12 | 2019-12-19 | 矢崎総業株式会社 | 車両制御システム |
JP7146466B2 (ja) | 2018-06-12 | 2022-10-04 | 矢崎総業株式会社 | 車両制御システム |
WO2020003558A1 (ja) * | 2018-06-26 | 2020-01-02 | クラリオン株式会社 | 駐車支援装置 |
JP2020001479A (ja) * | 2018-06-26 | 2020-01-09 | クラリオン株式会社 | 駐車支援装置 |
US11472401B2 (en) | 2018-06-26 | 2022-10-18 | Clarion Co., Ltd. | Parking assistance device |
JP7188916B2 (ja) | 2018-06-26 | 2022-12-13 | フォルシアクラリオン・エレクトロニクス株式会社 | 駐車支援装置 |
US11790783B2 (en) | 2018-09-14 | 2023-10-17 | Panasonic Holdings Corporation | Pedestrian device, vehicle-mounted device, mobile body guidance system, and mobile body guidance method |
JPWO2020110186A1 (ja) * | 2018-11-27 | 2021-05-20 | 三菱電機株式会社 | 運転計画変更指示装置および運転計画変更指示方法 |
JP7051263B2 (ja) | 2018-11-27 | 2022-04-11 | 三菱電機株式会社 | 運転計画変更指示装置および運転計画変更指示方法 |
KR102099407B1 (ko) * | 2018-11-27 | 2020-05-15 | 주식회사 파크에이아이 | 자율 주행 방법 및 자율 주행 장치 |
Also Published As
Publication number | Publication date |
---|---|
US20180244286A1 (en) | 2018-08-30 |
JPWO2017072956A1 (ja) | 2018-02-15 |
CN108349503B (zh) | 2022-08-02 |
US10618528B2 (en) | 2020-04-14 |
JP6456516B2 (ja) | 2019-01-23 |
DE112015007066T5 (de) | 2018-07-12 |
CN108349503A (zh) | 2018-07-31 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 15907320; Country of ref document: EP; Kind code of ref document: A1 |
| ENP | Entry into the national phase | Ref document number: 2017547318; Country of ref document: JP; Kind code of ref document: A |
| WWE | Wipo information: entry into national phase | Ref document number: 15754852; Country of ref document: US |
| WWE | Wipo information: entry into national phase | Ref document number: 112015007066; Country of ref document: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 15907320; Country of ref document: EP; Kind code of ref document: A1 |