WO2013114871A1 - Driving support device and driving support method - Google Patents

Driving support device and driving support method

Info

Publication number
WO2013114871A1
Authority
WO
WIPO (PCT)
Prior art keywords
predetermined
driver
head
vehicle
frequency
Prior art date
Application number
PCT/JP2013/000498
Other languages
English (en)
Japanese (ja)
Inventor
竜弘 橋本
Original Assignee
株式会社デンソー
Priority date
Filing date
Publication date
Application filed by 株式会社デンソー filed Critical 株式会社デンソー
Publication of WO2013114871A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/59 Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
    • G06V20/597 Recognising the driver's state or behaviour, e.g. attention or drowsiness
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00 Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/02 Rear-view mirror arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/59 Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
    • G06V20/593 Recognising seat occupancy
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/80 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
    • B60R2300/802 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for monitoring and displaying vehicle exterior blind spot views
    • B60R2300/8026 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for monitoring and displaying vehicle exterior blind spot views in addition to a rear-view mirror system

Definitions

  • the present disclosure relates to a driving support device and a driving support method that facilitate visual recognition of surroundings by a vehicle driver.
  • In Patent Document 2, a device has been proposed that optimizes the display position of an image by estimating the position of the driver's head (eyeball).
  • When driving a vehicle, there are cases where the driver wants to view, while remaining in the normal driving posture, a region that is difficult to see for long from that posture, for example when checking the state of a child sitting in the back seat or an object placed on the back seat, or when checking the white line on the side of the vehicle while parking. Even in such a scene, it is preferable to detect the position of the driver's head and the movement of the line of sight so that the region in question can be visually recognized.
  • The present disclosure has been made in view of the above points, and an object of the present disclosure is to provide a driving support device and a driving support method that allow a driver to visually recognize, from a normal driving posture, a region that is difficult to visually recognize in that posture.
  • the driving support device includes a head position calculation unit, a state determination unit, and a control unit.
  • the head position calculation unit analyzes an image obtained by capturing a driver of the vehicle and calculates a head position of the driver's head in the vehicle.
  • The state determination unit determines whether or not at least one of the following is satisfied: a period condition, under which the period during which the head position is at a predetermined position different from the position in a normal driving posture is equal to or greater than a predetermined threshold; and a frequency condition, under which the frequency at which the head position reaches the predetermined position is equal to or greater than a predetermined threshold.
  • When the state determination unit determines that at least one of the period condition and the frequency condition is satisfied, the control unit controls a predetermined device so that the driver can visually recognize, in the normal driving posture, a predetermined region that is visible when the head position is at the predetermined position.
  • With this device, the driver can, while remaining in the normal driving posture, visually recognize an area that is otherwise difficult to see from that posture.
  • The driving support method analyzes an image of the driver of the vehicle to calculate the head position of the driver's head in the vehicle, determines whether or not at least one of a period condition, under which the period during which the head position is at a predetermined position different from the position in the normal driving posture is equal to or greater than a predetermined threshold, and a frequency condition, under which the frequency at which the head position reaches the predetermined position is equal to or greater than a predetermined threshold, is satisfied, and, when it is determined that at least one of the period condition and the frequency condition is satisfied, controls predetermined equipment so that the driver can visually recognize, in the normal driving posture, a predetermined area that is visible when the head position is at the predetermined position.
  • With this method as well, the driver can, while remaining in the normal driving posture, visually recognize an area that is otherwise difficult to see from that posture.
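  • As a purely illustrative aid, the following minimal Python sketch shows how the three roles summarized above could be wired together; every function name below is a hypothetical placeholder for processing described later in this document, not an interface defined by the disclosure.

```python
from typing import List, Tuple

Vec3 = Tuple[float, float, float]  # head position in the vehicle, e.g. (x, y, z) in metres


def calculate_head_position(driver_image) -> Vec3:
    """Head position calculation unit: analyze an image of the driver (placeholder)."""
    raise NotImplementedError  # e.g. facial feature detection and triangulation


def period_or_frequency_satisfied(history: List[Tuple[float, Vec3]]) -> bool:
    """State determination unit: period condition OR frequency condition (placeholder)."""
    raise NotImplementedError  # see the condition sketches later in this document


def control_predetermined_device() -> None:
    """Control unit: e.g. change a mirror angle or show a camera image (placeholder)."""
    raise NotImplementedError


def driving_support_step(timestamp: float, driver_image,
                         history: List[Tuple[float, Vec3]]) -> None:
    """One cycle of the driving support method: analyze, determine, control."""
    head = calculate_head_position(driver_image)
    history.append((timestamp, head))
    if period_or_frequency_satisfied(history):
        control_predetermined_device()
```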
  • FIG. 1 is a block diagram showing a schematic configuration of a driving support system
  • FIG. 2 is a functional block diagram showing the control device in functional blocks.
  • FIG. 3 is a diagram for explaining the operation of checking the white line on the side of the vehicle.
  • FIG. 4 is a diagram for explaining the operation of confirming the rear seat of the vehicle.
  • FIG. 5 is a flowchart for explaining the processing procedure of the device control processing.
  • FIG. 6 is a flowchart for explaining the procedure of the state determination process.
  • The driving support system 1 is mounted on and used in a vehicle. As shown in FIG. 1, it includes a first camera 11 for photographing the driver, a second camera 13 for photographing the rear seat, a third camera 15 for photographing the outside of the vehicle, a mirror angle control device 17, a display device 19, a control device 21, and the like.
  • the control device 21 is an example of a driving support device in the present disclosure.
  • The first camera 11 is a well-known imaging device that captures an image including the driver's face, and includes an image sensor such as a CCD or CMOS sensor together with peripheral components such as an optical lens, an optical filter, and a power supply.
  • the first camera 11 is arranged so that the driver's seat is located in the shooting range.
  • the first camera 11 captures an image at a predetermined time interval, for example, every 1/30 seconds, and outputs it to the control device 21.
  • the second camera 13 is a well-known imaging device that is arranged so that the rear seat of the vehicle is in the shooting range.
  • the third camera 15 is a well-known imaging device, and is arranged in the vicinity of the number plate in front of the vehicle so that the left and right front sides of the vehicle are in the shooting range.
  • the mirror angle control device 17 has an actuator that controls the mirror angle of the right door mirror that is one of the side mirrors, and changes the mirror angle in accordance with a control signal from the control device 21.
  • the display device 19 is a device having a display capable of displaying an image, and displays an image according to a control signal from the control device 21.
  • The display device 19 is disposed where the driver can see it in a normal driving posture, for example, near the dashboard or the instrument panel.
  • the control device 21 is a well-known microcomputer including a CPU 31, a ROM 33, a RAM 35, and the like, and executes each process described later based on a program stored in the ROM 33.
  • the control device 21 may be configured by combining a plurality of circuits including a CPU, a ROM, a RAM, and the like.
  • the control device 21 has a known real-time clock (not shown) and can know the current time.
  • The control device 21 includes a calculation unit 41 that calculates the head position and the line-of-sight direction, a learning unit 43 that learns the driver's normal motion, a detection unit 45 that detects a predetermined motion, an estimation unit 47 that estimates the purpose of the motion, and a control unit 49 that controls the operation of a predetermined device.
  • the calculation unit 41 includes a head position calculation unit that calculates a head position and a gaze direction calculation unit that calculates a gaze direction.
  • the detection unit 45 and the estimation unit 47 function as an example of a state determination unit.
  • the calculation unit 41 analyzes the image acquired from the first camera 11 and calculates the position of the driver's head in the vehicle (hereinafter also simply referred to as the head position) and the line-of-sight direction.
  • the calculation unit 41 detects this information at a predetermined time interval and outputs it to the learning unit 43 and the detection unit 45.
  • a well-known technique can be used as a method of calculating the head position and the line-of-sight direction by analysis.
  • The three-dimensional coordinates of the head position and of the feature points may be obtained by analyzing two captured images taken by two cameras arranged at different positions; that is, a configuration provided with two first cameras 11 may be used.
  • Instead of calculating the head position itself, the positions of feature points strongly related to the head position, such as the eyes, nose, and mouth, may be calculated and used as the head position, or the position of a body part related to the movement of the head, such as the shoulders or the neck, may be calculated and used in place of the head position.
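  • As one hedged illustration of recovering three-dimensional coordinates from two cameras, the sketch below triangulates a single facial feature point (for example, the nose tip) from a rectified stereo pair. The focal length, baseline, and principal-point values are arbitrary assumptions for the example, not values from this disclosure.

```python
from typing import Tuple


def triangulate_feature(px_left: Tuple[float, float],
                        px_right: Tuple[float, float],
                        focal_px: float = 800.0,    # assumed focal length [pixels]
                        baseline_m: float = 0.10,   # assumed distance between the cameras [m]
                        cx: float = 320.0,          # assumed principal point [pixels]
                        cy: float = 240.0) -> Tuple[float, float, float]:
    """Recover the 3-D position of one feature point from a rectified stereo pair.

    px_left / px_right are the pixel coordinates of the same feature
    (e.g. the nose tip) in the left and right camera images.
    """
    disparity = px_left[0] - px_right[0]
    if disparity <= 0:
        raise ValueError("feature must have positive disparity")
    z = focal_px * baseline_m / disparity      # depth along the optical axis
    x = (px_left[0] - cx) * z / focal_px       # lateral offset
    y = (px_left[1] - cy) * z / focal_px       # vertical offset
    return (x, y, z)


# The resulting point, or the centroid of several such points around the eyes,
# nose and mouth, can then stand in for the head position as described above.
head_position = triangulate_feature((352.0, 230.0), (310.0, 230.0))
```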
  • For example, the methods described in Japanese Patent Laid-Open No. 2008-13023, Japanese Patent Laid-Open No. 2005-66023, Japanese Patent Laid-Open No. 2002-352228, Japanese Patent Laid-Open No. 2007-249280, and Japanese Patent Laid-Open No. 2005-182452 can be used.
  • the learning unit 43 continuously acquires the head position and gaze direction information calculated by the calculation unit 41. Based on the information over a certain period, the head position when the driver is in the normal driving posture is calculated.
  • The method for calculating the head position in the normal driving posture is not particularly limited, and various known methods can be employed. For example, it can be obtained as an average of the head positions acquired within a predetermined period while the vehicle is travelling, and the average may be weighted so that head positions calculated at timings closer to the current time count more heavily.
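  • A minimal sketch of that weighted average is shown below; the exponential recency weighting and its time constant are one possible choice assumed only for illustration, not something specified by this disclosure.

```python
import math
from typing import List, Tuple

Vec3 = Tuple[float, float, float]


def learn_normal_head_position(samples: List[Tuple[float, Vec3]],
                               now: float,
                               time_constant_s: float = 60.0) -> Vec3:
    """Estimate the head position in the normal driving posture.

    samples: (timestamp, head_position) pairs collected while the vehicle is
    travelling.  More recent samples receive a larger weight, so the estimate
    follows slow changes in the driver's posture.
    """
    if not samples:
        raise ValueError("no head position samples collected yet")
    weights = [math.exp(-(now - t) / time_constant_s) for t, _ in samples]
    total = sum(weights)
    return tuple(
        sum(w * pos[axis] for w, (_, pos) in zip(weights, samples)) / total
        for axis in range(3)
    )
```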
  • the calculation of the head position in the normal driving posture by the learning unit 43 is continuously executed separately from the device control process and the state determination process described later.
  • The detection unit 45 continuously acquires the head position and line-of-sight direction calculated by the calculation unit 41, and detects whether they correspond to any of the following states (i) to (iv).
  • (i) The head position is above and closer to the door mirror than in the normal driving posture, and the line-of-sight direction is toward the door mirror.
  • (ii) The head position is above the normal driving posture, and the line-of-sight direction is toward the rear-view mirror.
  • (iii) The head position is closer to the center of the vehicle than in the normal driving posture, and the line-of-sight direction is rearward.
  • (iv) The head position is forward of the position in the normal driving posture.
  • the above states (i) to (iv) will be referred to as a visual operation state.
  • Whether or not the head position is different from the normal driving posture is determined by whether or not the head position is away from the position in the normal driving posture determined by the learning unit 43 by a predetermined threshold or more.
  • The head positions in (i) to (iv) described above are examples of predetermined positions in the present disclosure, and the line-of-sight directions in (i) to (iii) are examples of predetermined directions in the present disclosure.
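  • The states (i) to (iv) could be distinguished, for example, as in the sketch below, which compares the current head position and a coarse gaze label with the learned normal values. The coordinate convention, the offset threshold, and the gaze labels are assumptions made only for this illustration.

```python
from typing import Optional, Tuple

Vec3 = Tuple[float, float, float]

POS_THRESHOLD_M = 0.10  # assumed offset that counts as "different from the normal posture"


def classify_visual_operation_state(head: Vec3, normal: Vec3,
                                    gaze: str) -> Optional[str]:
    """Return "i".."iv" for the visual operation states, or None.

    Coordinates are assumed as x forward, y toward the vehicle centre, z up,
    in metres.  `gaze` is a coarse gaze-direction label such as "door_mirror",
    "room_mirror" or "rearward" produced by the gaze direction calculation.
    """
    dx = head[0] - normal[0]   # forward of the normal posture
    dy = head[1] - normal[1]   # toward the vehicle centre
    dz = head[2] - normal[2]   # above the normal posture

    up = dz >= POS_THRESHOLD_M
    toward_door = dy <= -POS_THRESHOLD_M   # away from the centre, toward the door mirror
    toward_centre = dy >= POS_THRESHOLD_M
    forward = dx >= POS_THRESHOLD_M

    if up and toward_door and gaze == "door_mirror":
        return "i"    # e.g. checking the white line beside the vehicle via the door mirror
    if up and gaze == "room_mirror":
        return "ii"   # e.g. checking the rear seat via the room mirror
    if toward_centre and gaze == "rearward":
        return "iii"  # looking back toward the rear seat
    if forward:
        return "iv"   # leaning forward, e.g. to check left and right at an intersection
    return None
```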
  • The estimation unit 47 determines whether or not at least one of a period condition, under which the period during which the detection unit 45 detects the visual operation state is equal to or greater than a predetermined threshold (for example, 2 seconds), and a frequency condition, under which the frequency at which the visual operation state is detected is equal to or greater than a predetermined threshold, is satisfied, and estimates the driver's purpose of operation.
  • The specific calculation method for determining whether the frequency is equal to or higher than a predetermined threshold is not particularly limited. For example, when the visual operation state is detected a predetermined number of times or more (for example, three times or more) within a predetermined period (for example, 10 seconds) after it is first detected, the frequency can be judged to be equal to or greater than the predetermined threshold. Alternatively, when the period required for the visual operation state to be detected a predetermined number of times is equal to or less than a predetermined threshold, it can also be determined that the frequency is equal to or higher than the predetermined threshold. Of course, the frequency may be obtained by other methods.
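  • The two example calculations just described could look roughly like the functions below. The count and window length in (a) are the example values from this paragraph; the threshold in (b) is an assumed value, since the disclosure does not give one.

```python
from typing import List


def frequency_by_count_in_window(detection_times: List[float],
                                 window_s: float = 10.0,
                                 min_count: int = 3) -> bool:
    """(a) The visual operation state was detected at least `min_count` times
    within `window_s` seconds of its first detection."""
    if not detection_times:
        return False
    first = detection_times[0]
    return sum(1 for t in detection_times if t - first <= window_s) >= min_count


def frequency_by_time_to_n(detection_times: List[float],
                           n: int = 3,
                           max_period_s: float = 10.0) -> bool:
    """(b) The period needed for the state to be detected `n` times is equal to
    or less than `max_period_s` seconds."""
    if len(detection_times) < n:
        return False
    return detection_times[n - 1] - detection_times[0] <= max_period_s
```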
  • A specific example of estimating the driver's purpose of operation will now be described. If, in state (i), at least one of the period condition (the period during which state (i) is detected is equal to or greater than a predetermined threshold) and the frequency condition (the frequency at which state (i) is detected is equal to or greater than a predetermined threshold) is satisfied, it is estimated that the driver wants to check the white line or a car stop on the side of the vehicle.
  • In FIG. 3, when the driver's head is at the head position 51A of the normal driving posture, the right rear side of the vehicle 55 can be visually recognized as indicated by arrow A, but the white line 57 below is difficult to see.
  • If the head position is moved upward and closer to the door mirror (head position 51B), the line-of-sight direction becomes as indicated by arrow B, and the white line 57 can easily be visually recognized.
  • When visually checking the white line 57 while parking or the like, the driver often either looks at it continuously for a certain period or looks at it frequently while alternating the line of sight with other directions. For this reason, the driver's purpose can be estimated as described above.
  • In FIG. 4, the rear of the vehicle 65 can be viewed as indicated by arrow C, but the rear seat of the vehicle 65 is difficult to see.
  • If the head position is moved to a position above the normal driving posture (head position 61B), the line-of-sight direction becomes as indicated by arrow D, and the rear seat can easily be visually recognized.
  • Alternatively, the rear seat can be visually recognized by moving the head position closer to the center of the vehicle and looking back. For this reason, the driver's purpose can be estimated as described above.
  • Based on the estimation result of the estimation unit 47, after the driver returns to the normal driving posture, the control unit 49 controls the predetermined device so that the driver can visually recognize, in the normal driving posture, the predetermined region that is visible when the driver's head position is at one of the positions (i) to (iv) described above.
  • This predetermined area is not limited to an area that can be directly visually recognized, but also includes an area that can be visually recognized through a door mirror or a room mirror as described above.
  • For example, the mirror angle control device 17 is controlled to change the mirror angle of the door mirror 53 shown in FIG. 3 so that the direction of arrow B can be visually recognized from the head position 51A of the normal driving posture.
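  • One way to compute such a mirror angle is the law-of-reflection sketch below: the mirror normal is chosen to bisect the directions from the mirror to the driver's eye (head position 51A) and from the mirror to the target area (the direction of arrow B). The flat-mirror, top-down 2-D geometry and the sample coordinates are illustrative assumptions, not values from this disclosure.

```python
import math
from typing import Tuple

Vec2 = Tuple[float, float]


def required_mirror_yaw(mirror: Vec2, eye: Vec2, target: Vec2) -> float:
    """Yaw angle [deg] of the mirror normal so that, by the law of reflection,
    a driver whose eye is at `eye` sees `target` in a flat mirror at `mirror`.

    All points are (x, y) in a top-down vehicle frame in metres; the mirror
    normal must bisect the mirror-to-eye and mirror-to-target directions."""
    def unit(v: Vec2) -> Vec2:
        n = math.hypot(v[0], v[1])
        return (v[0] / n, v[1] / n)

    to_eye = unit((eye[0] - mirror[0], eye[1] - mirror[1]))
    to_target = unit((target[0] - mirror[0], target[1] - mirror[1]))
    normal = (to_eye[0] + to_target[0], to_eye[1] + to_target[1])
    return math.degrees(math.atan2(normal[1], normal[0]))


# Illustrative numbers only: a door mirror slightly ahead of and to the right
# of the eye point of the normal driving posture, and a target point on the
# white line behind and to the right of the vehicle.
yaw_for_normal_posture = required_mirror_yaw(mirror=(0.5, -0.9),
                                             eye=(0.0, 0.0),
                                             target=(-3.0, -1.2))
```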
  • the image taken by the second camera 13 is displayed on the display device 19.
  • a device for controlling the mirror angle of the room mirror 63 may be provided in the vehicle, and the rear seat may be visually recognized in a normal driving posture by controlling the device to change the mirror angle.
  • This device control process is started when power is supplied to the entire driving support system 1 including the control device 21.
  • the control device 21 is started when the ignition switch is turned on.
  • In S1, the control device 21 acquires the image captured by the first camera 11 and calculates the driver's head position and line-of-sight direction based on the acquired image.
  • In S2, the control device 21 determines whether or not the calculated head position and line-of-sight direction correspond to any of the visual operation states described above. If the driver is not in a visual operation state (S2: NO), the control device 21 returns to S1.
  • If a visual operation state is detected (S2: YES), the control device 21 stores which of the visual operation states (i) to (iv) was detected in the memory (RAM 35) and performs the state determination process in S3. In S3, the control device 21 determines whether the period of the visual operation state is equal to or greater than a predetermined threshold, or whether the frequency of the visual operation state is equal to or greater than a predetermined threshold, and when either is the case, it sets the device control flag to ON. Details of this state determination process will be described later.
  • the control device 21 determines whether or not the device control flag is turned on in S4. If the device control flag is not ON (S4: NO), the control device 21 returns to S1.
  • If the device control flag is ON (S4: YES), the control device 21 shows a predetermined indication to that effect on the display in the instrument panel of the vehicle, and in the subsequent S5 acquires a newly captured image from the first camera 11 and calculates the head position and line-of-sight direction.
  • In S6, the control device 21 determines whether or not the driver is in the normal driving posture based on the information calculated in S5. If the driver is not in the normal driving posture (S6: NO), the control device 21 returns to S5; that is, it waits until the driver returns from the visual operation state to the normal driving posture.
  • the control device 21 controls the operation of a predetermined device in S7.
  • Specifically, in S7 the control device 21 controls the mirror angle control device 17, the display device 19, and the like according to which of the visual operation states (i) to (iv) was detected in S2. After a predetermined time has elapsed, the control device 21 ends the operation of the predetermined device and returns to S1.
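  • The device control process of FIG. 5 (S1 to S7) could be sketched roughly as the loop below. The camera, controller, and display objects and the run_state_determination callable are hypothetical placeholders, and the state classification reuses the earlier classify_visual_operation_state sketch.

```python
def device_control_process(camera, run_state_determination, controller, display):
    """Rough sketch of S1-S7 of FIG. 5 (all helper objects are hypothetical)."""
    while True:
        # S1: acquire a driver image, compute head position and gaze direction.
        head, gaze = camera.capture_and_analyze()

        # S2: is the driver in one of the visual operation states (i)-(iv)?
        state = classify_visual_operation_state(head, controller.normal_head, gaze)
        if state is None:
            continue                       # S2: NO -> back to S1

        # S3: the state determination process turns the device control flag ON
        # when the period condition or the frequency condition is satisfied.
        flag_on = run_state_determination(camera, state)

        # S4: if the flag is not ON, go back to S1.
        if not flag_on:
            continue

        # Indicate on the instrument-panel display that assistance will be given,
        # then S5/S6: wait until the driver returns to the normal driving posture.
        display.show_notice()
        while True:
            head, gaze = camera.capture_and_analyze()
            if controller.is_normal_posture(head):
                break

        # S7: control the predetermined device (mirror angle, display, ...)
        # according to the detected state, then end it after a predetermined time.
        controller.assist(state)
```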
  • the mirror angle control device 17 and the display device 19 function as predetermined devices.
  • the control device 21 executes this state determination process when it is determined in S2 of the device control process that the visual operation state is set.
  • The control device 21 starts counting elapsed time in S11. Then, in S12, it acquires an image newly captured by the first camera 11 and calculates the head position and line-of-sight direction in the same manner as in S1, and in S13 it determines whether the driver is in the same visual operation state as the one detected in the immediately preceding S2.
  • If the driver is in the same visual operation state (S13: YES), the control device 21 determines in S14 whether or not a predetermined time (for example, 2 seconds) has elapsed. If the predetermined time has not elapsed (S14: NO), the process returns to S12; if it has elapsed (S14: YES), the device control flag is turned ON in S15. Thereafter, the present process is terminated, and the process proceeds to S4 of the device control process.
  • In other words, in the processes of S12 to S15, the control device 21 repeatedly determines whether the driver is in the visual operation state, and turns the device control flag ON when the duration of that state after it is first detected becomes equal to or greater than a predetermined threshold (the predetermined time).
  • If the control device 21 determines in S13 that the driver is not in the visual operation state (S13: NO), it stores that determination result in S16.
  • In S17, it is determined whether or not the number of times the driver has been detected in the same visual operation state as the one detected in the immediately preceding S2 is equal to or greater than a preset value (for example, 3).
  • If the counted number is equal to or greater than the preset value (S17: YES), the control device 21 turns the device control flag ON in S15 and ends this process. On the other hand, if it is less than the preset value (S17: NO), the control device 21 determines in S18 whether or not a predetermined time (for example, 10 seconds) has passed, and if the predetermined time has passed (S18: YES), this process is terminated without turning the device control flag ON.
  • If the predetermined time has not passed (S18: NO), the control device 21 again acquires a newly captured image from the first camera 11 and calculates the head position and line-of-sight direction in S19, determines in S20 whether or not the driver is in the same visual operation state as the one detected in the immediately preceding S2, and returns to S16 to store the determination result.
  • In other words, the control device 21 sets the device control flag to ON when the number of times the predetermined visual operation state is detected before the predetermined time has passed is equal to or greater than the preset value, that is, when the frequency of entering the predetermined visual operation state is equal to or greater than a predetermined threshold.
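  • For illustration, the state determination process of FIG. 6 (S11 to S20) can be approximated by the routine below; sample() stands for S12/S19 and S13/S20 (capture a new image and check whether the driver is still in the same visual operation state), and the timing values are the example values given in this description.

```python
import time

DURATION_THRESHOLD_S = 2.0   # S14: period condition (example value)
COUNT_THRESHOLD = 3          # S17: preset value (example value)
TIMEOUT_S = 10.0             # S18: predetermined time (example value)


def state_determination_process(sample) -> bool:
    """Return True (device control flag ON) when the period condition or the
    frequency condition is satisfied.  `sample()` must capture a new image and
    return True while the driver is in the same visual operation state as the
    one detected in S2 of the device control process."""
    start = time.monotonic()                                  # S11: start timing

    # S12-S15: period condition - the state is held continuously for >= 2 s.
    while sample():                                           # S12, S13
        if time.monotonic() - start >= DURATION_THRESHOLD_S:  # S14
            return True                                       # S15: flag ON

    # S16-S20: frequency condition - the state is seen again often enough
    # before the timeout expires.
    count = 1                                                 # the detection made in S2
    while time.monotonic() - start < TIMEOUT_S:               # S18
        if sample():                                          # S19, S20
            count += 1                                        # S16: store the result
            if count >= COUNT_THRESHOLD:                      # S17
                return True                                   # S15: flag ON
    return False        # neither condition met within the predetermined time
```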
  • In this way, the driver can, in a normal driving posture, visually recognize a predetermined area that is visible in the visual operation state.
  • The predetermined area that is visible when the driver is in the visual operation state, such as a white line on the side of the vehicle, a car stop, the rear seat of the vehicle, or the left and right directions at an intersection, is an important area that is difficult to visually recognize in a normal driving posture. Therefore, the driver does not need to take an unnatural posture in order to visually recognize this predetermined area and can maintain a normal driving posture, which suppresses driving operation errors and accidents.
  • In addition, since the mirror angle control device 17, the display device 19, and the like are controlled only when the period during which the visual operation state continues to be detected is equal to or greater than a predetermined threshold, or when the frequency of entering the visual operation state is equal to or greater than a predetermined threshold, operation of the predetermined device can be suppressed while the driver does not intend to view the predetermined region.
  • If such conditions were not set and the predetermined device were operated based only on the driver's head being at a predetermined position, the device could operate when the driver's head accidentally moves to the predetermined position, or when the head is erroneously detected as being at the predetermined position for only a single frame, and the driver would be annoyed by the unexpected operation. The control device 21 according to the present disclosure can prevent the driver from being annoyed in this way.
  • When neither the period condition nor the frequency condition is satisfied, the predetermined device does not operate. Therefore, the driver can be kept from being bothered by unintended operation of the predetermined device, and unnecessary energy consumption can also be suppressed.
  • The control device 21 controls the predetermined device at the timing when it has been determined that at least one of the period condition and the frequency condition is satisfied and the driver's head is no longer at the predetermined position. Since the predetermined device operates only once the driver's head has left the predetermined position, for example when the driver returns to the normal driving posture, the device is not operated while the head is still at the predetermined position and no assistance is needed, which would consume unnecessary energy and subject the driver to an unexpected operation that the driver would find annoying.
  • As the predetermined device, for example, a mirror angle control device that controls the angle of a mirror mounted on the vehicle can be used.
  • The control device 21 controls the mirror angle control device so that, in a normal driving posture, the driver can visually recognize through the mirror a predetermined region that is visible when the head is at the predetermined position.
  • The control device 21 configured as described above allows the driver to visually recognize the predetermined area described above in a normal driving posture by changing the angle of the mirror. Further, if the timing of moving the mirror is when the driver's head is no longer at the predetermined position, as described above, the mirror does not move while the driver's head is at the predetermined position, so the driver can continue to visually check the predetermined area.
  • As the predetermined device, a display device arranged at a position that the driver can see in a normal driving posture can also be used.
  • This display device shows an image captured by an imaging device that photographs an area visible when the driver's head is at the predetermined position, and the control device 21 causes the display device to display this image.
  • the control device 21 configured as described above allows the driver to visually recognize the predetermined region described above by causing the display device to display the image of the region described above.
  • the control device 21 calculates the driver's line-of-sight direction in addition to the position of the driver's head when image analysis is performed on an image of the driver of the vehicle.
  • The control device 21 determines whether or not at least one of a period condition, under which the period during which the head is at the predetermined position and the line-of-sight direction is a predetermined direction corresponding to that position is equal to or greater than a predetermined threshold, and a frequency condition, under which the frequency at which the head is at the predetermined position and the line-of-sight direction is the predetermined direction is equal to or greater than a predetermined threshold, is satisfied, and controls the predetermined device when at least one of these conditions is satisfied.
  • Since the control device 21 configured in this way controls the predetermined device based not only on the position of the driver's head but also on the driver's line of sight, it can more reliably discriminate situations in which assistance by the predetermined device is needed.
  • The device control process described above is realized by a program including an ordered sequence of instructions suitable for processing by a computer, and this program can be supplied to the device that uses it via various recording media and communication lines.
  • The program, consisting of instructions executable by a computer for controlling the control device 21 as described above, may be stored on a tangible, persistent computer-readable storage medium, or may be provided to the device or to a user of the program via a communication line.
  • In the above, a configuration is illustrated in which it is determined in S6 of the device control process whether or not the driver has returned to the normal driving posture; however, this determination does not necessarily have to be made.
  • For example, the mirror angle control device 17, the display device 19, and the like may be controlled using, as a trigger, the device control flag being set to ON in S15 of the state determination process.
  • Alternatively, instead of determining whether the driver has returned to the normal driving posture, it may be determined whether or not the driver is no longer in the visual operation state.
  • In the above, the mirror angle control device 17 and the display device 19 are illustrated as predetermined devices that enable the driver to visually recognize, in the normal driving posture, a predetermined region that is visible when the driver is in the visual operation state; however, other devices may be used.
  • The learning unit 43 may be configured to learn the head position and line-of-sight direction used when the driver looks at the door mirror or the room mirror during driving, and the detection unit 45 may be configured to determine that the driver is in the visual operation state when the head position and line-of-sight direction calculated from a newly captured image differ from the learned ones.
  • In the above, a configuration in which the learning unit 43 obtains the head position in the normal driving posture is exemplified; however, a predetermined position may be set in advance as the head position in the normal driving posture, without performing the learning.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Rear-View Mirror Devices That Are Mounted On The Exterior Of The Vehicle (AREA)
  • Traffic Control Systems (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The present invention relates to a driving support device equipped with a calculation unit (41), a state determination unit (45, 47), and a control unit (49). The calculation unit (41) calculates the head position, within a vehicle, of the head of the driver of the vehicle. The state determination unit (45, 47) determines whether a period condition, according to which the period during which the head position has been at a predetermined position different from the position in a normal driving posture is equal to or greater than a predetermined threshold, and/or a frequency condition, according to which the frequency at which the head position is at the predetermined position is equal to or greater than a predetermined threshold, is satisfied. When it is determined that the period condition and/or the frequency condition is satisfied, the control unit (49) controls a predetermined apparatus so that, in the normal driving posture, the driver can see a predetermined region that is visible when the head position is at the predetermined position.
PCT/JP2013/000498 2012-01-31 2013-01-30 Driving support device and driving support method WO2013114871A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2012018685A JP2013154836A (ja) 2012-01-31 2012-01-31 Driving support device and program
JP2012-018685 2012-08-03

Publications (1)

Publication Number Publication Date
WO2013114871A1 (fr)

Family

ID=48904916

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2013/000498 WO2013114871A1 (fr) Driving support device and driving support method

Country Status (2)

Country Link
JP (1) JP2013154836A (fr)
WO (1) WO2013114871A1 (fr)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001260776A (ja) * 2000-03-22 2001-09-26 Mazda Motor Corp Obstacle warning device for vehicle
JP2002274265A (ja) * 2001-03-22 2002-09-25 Honda Motor Co Ltd Mirror adjustment device
JP2005062911A (ja) * 2003-06-16 2005-03-10 Fujitsu Ten Ltd Vehicle control device
JP2009023565A (ja) * 2007-07-20 2009-02-05 Denso It Laboratory Inc Driving support device and driving support method

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016041576A (ja) * 2014-08-13 2016-03-31 センソリー・インコーポレイテッド Technology for automated blind spot visualization
CN110382321A (zh) * 2017-03-17 2019-10-25 日立汽车系统株式会社 Driving assistance device and method
CN110382321B (zh) * 2017-03-17 2022-12-27 日立安斯泰莫株式会社 Driving assistance device
CN109034137A (zh) * 2018-09-07 2018-12-18 百度在线网络技术(北京)有限公司 Head posture marker updating method and device, storage medium, and terminal device
CN109034137B (zh) * 2018-09-07 2019-11-19 百度在线网络技术(北京)有限公司 Head posture marker updating method and device, storage medium, and terminal device

Also Published As

Publication number Publication date
JP2013154836A (ja) 2013-08-15

Similar Documents

Publication Publication Date Title
CN108621923B (zh) 车辆的显示系统及车辆的显示系统的控制方法
JP5099451B2 (ja) 車両周辺確認装置
US10166922B2 (en) On-vehicle image display device, on-vehicle image display method for vehicle, and on-vehicle image setting device
US7379089B2 (en) Apparatus and method for monitoring the immediate surroundings of a vehicle
JP5093611B2 (ja) 車両周辺確認装置
JP6197814B2 (ja) 車両用表示装置
JP5092776B2 (ja) 視線方向検出装置及び視線方向検出方法
WO2014034065A1 (fr) Dispositif d'avertissement de corps en mouvement et procédé d'avertissement de corps en mouvement
US10450003B2 (en) Parking assist device and parking assist system
WO2017145549A1 (fr) Système d'avertissement de regard de côté et de surveillance et programme informatique
JP2013132970A (ja) ミラー制御装置およびプログラム
JP2012176656A (ja) 駐車支援装置
JP5402047B2 (ja) 死角表示装置
JP2018022958A (ja) 車両用表示制御装置及び車両用モニタシステム
WO2013114871A1 (fr) Driving support device and driving support method
US10596966B2 (en) Display device for vehicle and display method for vehicle
JPWO2020208804A1 (ja) 表示制御装置、表示制御方法、及び表示制御プログラム
JP2009251761A (ja) 運転支援システム
JP2007290570A (ja) 車両用表示装置
JP5845909B2 (ja) 障害物警報装置
JP2008162550A (ja) 外部環境表示装置
WO2019167109A1 (fr) Dispositif de commande d'affichage et procédé de commande d'affichage pour véhicule
JP5020606B2 (ja) 周辺監視装置
JP2009044570A (ja) 車両死角補助システム
JP5943207B2 (ja) 車両の駐車操作支援用映像表示装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13744298

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13744298

Country of ref document: EP

Kind code of ref document: A1