WO2006035510A1 - Vehicle external environment recognition device - Google Patents
Vehicle external environment recognition device
- Publication number
- WO2006035510A1 (PCT/JP2004/014666)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- unit
- vehicle
- image
- holding member
- information
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/30—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
- B60R2300/301—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing combining image information with other obstacle sensor information, e.g. using RADAR/LIDAR/SONAR sensors for estimating risk of collision
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/40—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the details of the power supply or the coupling to vehicle components
- B60R2300/402—Image calibration
Definitions
- the present invention relates to a vehicle external recognition device.
- ACC (Adaptive Cruise Control) and lane departure warning, which detects the danger of the vehicle departing from its lane and alerts the driver, are examples of driving assistance devices.
- so-called pre-crash safety has also been put into practical use: driving assistance control that detects the danger of a collision and assists the depression of the brake when collision avoidance is judged to be difficult. Safer and more convenient control is expected to be realized for such driving support devices.
- for driving assistance apparatus as described above, it is important to accurately detect the external environment in which the automobile is placed, i.e., the surrounding circumstances and the running state of the host vehicle and other vehicles, in order to realize more advanced control.
- known devices combine an imaging unit that captures the situation around the host vehicle, image processing means that recognizes at least the traveling lane and other vehicles from the captured image, and a radar that detects the positions of other vehicles around the host vehicle by radiating electromagnetic waves toward the search area and detecting their reflected waves.
- one known mounting structure for an external environment recognition device is a camera built into a door mirror or side mirror.
- information processing that determines whether the target detected by the camera and the target detected by the radar are the same vehicle or different vehicles (matching) is important. In this matching process, the imaging direction of the camera and the detection direction of the radar must coincide with a predetermined accuracy.
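The matching process described here can be sketched as a nearest-neighbour association with a distance gate. This is a minimal illustration, not the patent's algorithm; the function name, the (x, y) coordinate convention, and the gate value are assumptions.

```python
import math

def match_targets(radar_targets, camera_targets, gate=2.0):
    """Associate radar and camera detections by nearest neighbour.

    Each target is an (x, y) position in a common vehicle frame;
    a pair is matched only if it falls inside the distance gate.
    Names and the gate value are illustrative assumptions.
    """
    matches = []
    used = set()
    for i, r in enumerate(radar_targets):
        best, best_d = None, gate
        for j, c in enumerate(camera_targets):
            if j in used:
                continue
            d = math.hypot(r[0] - c[0], r[1] - c[1])
            if d < best_d:
                best, best_d = j, d
        if best is not None:
            used.add(best)
            matches.append((i, best))
    return matches
```

Note that such gating only works if the two sensors' axes agree to within the gate, which is exactly the accuracy requirement the passage describes.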
- the purpose of the present invention is, in an external environment recognition device combining a plurality of sensors, to reduce the mounting error between the sensors and its change over time, and to simplify the work of mounting the device on a vehicle.
- to this end, the present invention comprises image capturing means for acquiring image information and object detection means for detecting a target object by transmitting and receiving signals. In an external environment recognition apparatus for a vehicle that recognizes the vehicle's surroundings using the information from both means, the image capturing means and the object detection means are arranged on a common holding member.
- Fig. 1 shows an example of the system configuration of the composite sensor unit.
- Fig. 2 shows an example of the composite sensor unit.
- Fig. 3 shows an example of the composite sensor unit.
- Fig. 4 shows an example of the composite sensor unit.
- Fig. 5 shows an example of mounting the composite sensor unit on a vehicle.
- Fig. 6 shows an example of mounting the composite sensor unit on a vehicle.
- Fig. 7 shows an example of mounting the composite sensor unit on a vehicle.
- Fig. 8 shows an example of mounting the composite sensor unit on a vehicle.
- Fig. 9 shows an example of the composite sensor unit.
- Fig. 10 is an example of the sensor mounting board.
- Fig. 11 shows an example of how the optical axis deviation is measured and corrected.
- Fig. 12 shows an example of the prior art.

BEST MODE FOR CARRYING OUT THE INVENTION
- prior to the description of the preferred embodiments of the present invention, the contents and problems of the prior art will be described with reference to Figs. 12(a) and 12(b).
- in Fig. 12(a), two sensors, the radar unit 13 and the camera unit 14, are placed in different locations. Considering the detection performance of the radar unit alone, placing the radar unit 13 at the tip of the car lets it receive its own transmitted radio waves more efficiently. Similarly, focusing only on image recognition performance, the camera unit 14 recognizes more easily from as high a bird's-eye viewpoint as possible. However, when recognizing the external environment by combining the information from the radar unit 13 and the camera unit 14, it is desirable to set the relative positional relationship between the two units strictly.
- since the relative positional relationship between the radar unit 13 and the camera unit 14 is predetermined, the two units need to be attached to the vehicle body so as to maintain that relationship, and the problem arises that the axis adjustment work must be performed twice.
- as shown in Fig. 12(b), the radar unit 13 and the camera unit 14 each have, relative to the theoretical mounting angles about the X, Y, and Z axes, roll angle errors ΔX1 and ΔX2, pitch angle errors ΔY1 and ΔY2, and yaw angle errors ΔZ1 and ΔZ2.
- since the radar unit 13 and the camera unit 14 are installed separately, the axis adjustment work must be performed twice in the conventional technology.
- this axis adjustment is a complicated task, performed while imaging a pseudo target or while detecting the actual mounting position and angle between the vehicle body and the radar unit 13 or the camera unit 14. If the axis adjustment must be performed twice, the time required for the car assembly process increases, resulting in higher production cost.
- Fig. 1 is a system block diagram of an external recognition device that integrates a camera and radar.
- An image capturing unit 2 and a transmitting / receiving unit 3 are mounted on the sensor mounting substrate 1.
- the image capturing unit 2 generally uses a CCD image sensor or a CMOS image sensor.
- the image processing unit 5 obtains quantitative sensor values, such as the relative distance to a preceding or oncoming vehicle or the vehicle position derived from lane marks on the road, from the image data obtained by the image capturing unit 2.
- the transmission/reception unit 3 includes a transmission circuit for, e.g., millimeter waves and a reception circuit that receives the reflected signal.
- the radar signal processing unit 7 processes the transmission/reception signals obtained by the transmission/reception unit 3 and detects the distance, relative speed, and azimuth angle of reflectors within the radar detection range.
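A radar detection expressed as distance and azimuth is commonly converted into Cartesian coordinates before being fused with image data; a small sketch of that conversion follows. The coordinate convention (x forward, y lateral) and the function name are assumptions, not taken from the patent.

```python
import math

def radar_to_xy(distance, azimuth_deg):
    """Convert a radar detection (range, azimuth) into forward/lateral
    coordinates. Sign convention and name are illustrative assumptions."""
    a = math.radians(azimuth_deg)
    return (distance * math.cos(a), distance * math.sin(a))
```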
- the signal processing unit 4 calculates an external recognition amount based on information from the image processing unit 5 and the radar signal processing unit 7, and can be implemented by, for example, a microcomputer.
- the optical axis error storage unit 6 records the assembly error (amount of optical axis deviation) between the composite sensor unit 9 and the vehicle body. When the signal processing unit 4 calculates the external recognition amount based on the information from the image processing unit 5 and the radar signal processing unit 7, it uses the optical axis deviation information stored in the optical axis error storage unit 6.
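A minimal sketch of applying a stored axis-deviation value, assuming a single yaw-angle error and ignoring roll, pitch, and translation; the function name and rotation convention are illustrative, not the patent's implementation.

```python
import math

def apply_axis_correction(x, y, yaw_error_deg):
    """Rotate a detected (x, y) position back by the stored yaw-axis
    error. A full correction would also handle roll, pitch, and
    translation offsets; this sketch covers yaw only."""
    a = math.radians(-yaw_error_deg)  # undo the mounting rotation
    return (x * math.cos(a) - y * math.sin(a),
            x * math.sin(a) + y * math.cos(a))
```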
- the control unit 10 is a controller that controls the brakes and accelerator of the vehicle based on the external recognition amount calculated by the composite sensor unit 9; for example, it controls the vehicle speed according to the distance from an obstacle or a preceding vehicle.
- FIG. 2 shows a detailed embodiment of the sensor mounting board 1.
- an image capturing unit 2, which is an image sensor, and the antenna portion of the transmission/reception unit 3 are arranged on the sensor mounting substrate 1.
- the misalignment error between the image sensor and the antenna can be made extremely small by recent substrate mounting technology. Therefore, using the symbols of Fig. 12(b):
- ΔX1 ≈ ΔX2
- ΔY1 ≈ ΔY2
- ΔZ1 ≈ ΔZ2
- that is, the relative positional relationship between the image capturing unit 2 and the transmission/reception unit 3 depends only on the tolerance of the board mounting and can be kept at a substantially constant accuracy. It therefore becomes unnecessary to adjust and store the relative positional relationship between the image capturing unit 2 and the transmission/reception unit 3 for each unit. Furthermore, the signal processing unit 4 also does not need to correct that relative positional relationship.
- by attaching the composite sensor unit 9 to the vehicle, the image capturing unit 2 and the transmission/reception unit 3 are attached to the vehicle in a single operation.
- an example of the structure of the composite sensor unit 9 will be described in detail with reference to FIG.
- an image capturing unit 2, which is an image sensor, and a transmission/reception unit 3 are mounted on the sensor mounting substrate 1.
- the image capturing unit 2 and the transmission/reception unit 3 are mounted so that their positions in the X, Y, and Z directions and their yaw, roll, and pitch angles are as predetermined.
- in this way, the angles and pitch angles of the image capturing unit 2 and the transmission/reception unit 3 can be matched.
- an image capturing circuit unit 16 that drives the image capturing unit 2 and a high-frequency circuit unit 15 that supplies a transmission signal to the transmitting / receiving unit 3 are mounted on the sensor mounting substrate 1.
- a signal processing unit 4 is arranged on the back surface of the sensor mounting board 1; the image processing unit 5 and the radar signal processing unit 7 are mounted on it, and the external recognition amount is calculated based on their information.
- a lens holder 19 and a lens 18 are arranged in front of the image capturing unit 2. These are protected by a housing cover 20. Note that the signal processing unit 4 and the sensor mounting board 1 are connected by a board-to-board connection.
- the vehicle mounting bracket 17 is attached to the vehicle frame and has a structure that allows the optical axis adjustment unit 8 to make fine adjustments after mounting on the vehicle.
- the composite sensor unit 9 is provided with a housing side connector 23.
- the connector gathers a power supply line for supplying power to the composite sensor unit 9, a ground line (GND), and signal lines for outputting the detected information to the outside of the composite sensor unit 9; it has a structure that mates with the harness-side connector 24 provided on the external harness 25.
- in this embodiment, the signal processing unit 4 is provided inside the composite sensor unit 9 and the recognized external environment information is output to the outside. Alternatively, the information acquired by the image capturing unit 2 and the transmission/reception unit 3 may be transmitted to an external device, and the external device may combine the respective information to recognize the outside world.
- the housing cover 21 is shaped to cover the lens 18, which makes waterproofing of the housing easier. This configuration is particularly suitable when the composite sensor unit 9 is attached to a lower part of the vehicle bonnet or the like, like a conventional radar unit. In this case, a viewing window 22 is provided as shown in Fig. 4(b) so that the outside world can be imaged. The viewing window 22 can also serve as a polarization filter or an optical band-pass filter.
- the structure of the composite sensor unit 9 is the same as in Fig. 3 or Fig. 4.
- the composite sensor unit 9 is attached to the vehicle 12 so that the pixel center of the image capturing unit 2 and the center of the transmission/reception unit 3 are positioned on a line substantially perpendicular to the vehicle body.
- this substantially eliminates the lateral offset of the mounting positions of the image capturing unit 2 and the transmission/reception unit 3, so that the between-sensor offset correction when recognizing the environment by combining image information and radar information can be omitted.
- the composite sensor unit 9 is attached to the vehicle 12 so that the pixel center of the image capturing unit 2 and the center of the transmission/reception unit 3 are coaxial with the vehicle center.
- by arranging the mounting position of the composite sensor unit at the center of the vehicle, the correction of the unit's offset position when recognizing the external environment can be omitted.
- Figure 6 shows an example of this specific attachment.
- FIG. 7 shows an embodiment in which the composite sensor unit is placed horizontally.
- in this arrangement, the pixel center of the image capturing unit 2 and the center of the transmission/reception unit 3 in the composite sensor unit do not coincide with the left-right center line of the vehicle.
- however, the offset from the center line is known in advance by design, and the offset amount can be corrected by post-processing during control.
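The post-processing correction of a known design-time lateral offset amounts to a simple translation into the vehicle centre-line frame; the sketch below is illustrative, and the name `sensor_offset_y` is an assumed stand-in for the design value.

```python
def to_vehicle_frame(detections, sensor_offset_y):
    """Translate (x, y) detections from a laterally offset sensor frame
    to the vehicle centre-line frame. sensor_offset_y is the known
    design-time lateral offset (an assumed name)."""
    return [(x, y + sensor_offset_y) for (x, y) in detections]
```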
- Fig. 8 shows another embodiment of mounting the combined sensor unit to the vehicle.
- the composite sensor unit 9 may not be installable in front of the vehicle bonnet for installation-space or aesthetic reasons. In such a case it is conceivable to install the composite sensor unit 9 in the vehicle interior; to suppress the amount by which the composite sensor unit 9 protrudes into the vehicle interior, it is effective to place the sensor unit horizontally, as shown in Fig. 8.
- Fig. 9(a) shows a more detailed example of the horizontally placed composite sensor unit shown in Fig. 8.
- the sensor mounting board 1 is accommodated in the housing 20, and the image capturing section 2 and the transmitting / receiving section 3 are arranged on the sensor mounting board 1.
- a lens holder 19 and a lens 18 are attached in front of the image capturing unit 2.
- the lens 18 is configured to protrude outward from a hole provided in the housing 20.
- the signal processing unit 4 is provided with the image processing unit 5, which obtains sensor values such as the relative distance to a preceding or oncoming vehicle or the vehicle position derived from lane marks on the road from the image data obtained by the image capturing unit 2, and with the radar signal processing unit 7, which processes the transmission/reception signals obtained by the transmission/reception unit 3 to detect the distance, relative speed, azimuth angle, etc. of reflectors within the radar detection range.
- the signal processing unit 4 can be mounted on a circuit board. This board requires a certain area because of the arithmetic processing capability required of the signal processing unit 4 and because of heat dissipation, and its area is larger than that of the sensor mounting board 1.
- therefore, the board of the signal processing unit 4 is arranged in the longitudinal direction and connected to the sensor mounting board 1 by the inter-board connection harness 26.
- wires may be used for the board-to-board connection harness 26, but as shown in Figs. 9(a) and 9(b), using a flexible harness is more effective for miniaturization.
- the amount of protrusion of the composite sensor unit 9 into the passenger compartment is determined by the vertical size of the sensor mounting board 1; compared to an arrangement in which the board of the signal processing unit 4 is substantially parallel to the sensor mounting board 1, this arrangement reduces the protrusion into the passenger compartment.
- although a rectangular parallelepiped housing is used here, making the vertical dimension of the part that houses the signal processing unit 4 smaller than that of the part that houses the sensor mounting board 1 makes it harder to give the driver a feeling of pressure.
- Figs. 10(a) and 10(b) show a more detailed example of the sensor mounting substrate 1 shown in Figs. 2, 3, 4, 5, 9(a), and 9(b).
- in Fig. 10(a), an image capturing unit 2, which is an image sensor, and a transmission/reception unit 3 formed of a conductive thin film are arranged on the front side of the substrate.
- Fig. 10 (b) is a view of the sensor mounting substrate 1 as viewed from the side opposite to Fig. 10 (a).
- an image capturing circuit unit 16 that drives the image capturing unit 2 and its wiring unit, and a high-frequency circuit unit 15 that supplies high-frequency signals to the transmitting and receiving unit 3 and its wiring unit are provided.
- mutual interference here refers, for example, to electromagnetic waves emitted from the image capturing unit 2 affecting the high-frequency circuit unit 15, or to fluctuations in the GND voltage (the ground potential) caused by the high-frequency circuit unit 15 increasing the noise level or distorting the image data in the image capturing circuit.
- the optical axis deviation adjustment operation branches at step S10 depending on whether the axis adjustment execution flag is ON. This flag is confirmed by capturing the status of an external switch or by data communication with the microcomputer. When the flag is OFF, the process proceeds to step S20, and the axis error values of the radar and the camera are read from the optical axis error storage unit 6.
- the optical axis error storage unit 6 is composed of a nonvolatile memory or the like.
- if the flag is ON, the process proceeds to step S11, where the determined target is measured with the radar to obtain an axis adjustment value.
- when adjustment is required, the axis adjustment execution flag is set to ON, and the new axis errors measured by executing the adjustment process in steps S11 and subsequent steps are stored in the optical axis error storage unit 6.
- in step S12, the error measurements ΔX1, ΔY1, and ΔZ1 are obtained from the determined true value of the target, and the process proceeds to step S13.
- in step S13, as with the radar, the determined target is measured with the camera to obtain an axis adjustment value.
- in step S14, the error measurements ΔX2, ΔY2, and ΔZ2 are obtained from the determined true value of the target.
- in step S15, each axis error value obtained as described above is stored in the optical axis error storage unit 6 so that it can be used as a correction coefficient during actual driving.
- step S31 onward is the measurement operation during actual driving.
- in step S31, the position X3, Y3, Z3 of an obstacle is first measured by the radar. Then, in step S32, the true values X3', Y3', Z3' are obtained using the correction data read in step S20.
- in step S33, the position X4, Y4, Z4 of the obstacle is measured by the camera.
- in step S34, the true values X4', Y4', Z4' are obtained using the correction data read in step S20.
- in step S35, X3', Y3', Z3' obtained by the radar and X4', Y4', Z4' obtained by the camera are combined and used for control.
- in step S36, after the combined sensor result is output, the process returns to step S31 and the measurement is repeated.
- the order of the radar measurement in steps S31 and S32 and the camera measurement in steps S33 and S34 may be reversed.
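The calibration steps (measuring a known target with each sensor and storing per-axis errors) and the runtime correction steps (subtracting the stored errors from raw measurements) can be sketched as follows. This is an illustrative reconstruction of the flow with assumed names, not the patent's implementation.

```python
def axis_calibration(radar_meas, camera_meas, true_target):
    """Sketch of the calibration steps: measure a known target with
    each sensor and record the per-axis errors (measured minus true).
    The patent stores these values in the optical axis error storage
    unit 6; the dict used here is an illustrative stand-in."""
    errs = {}
    for name, meas in (("radar", radar_meas), ("camera", camera_meas)):
        errs[name] = tuple(m - t for m, t in zip(meas, true_target))
    return errs

def corrected(meas, err):
    """Sketch of the runtime correction steps: subtract the stored
    axis error from a raw measurement to recover the true position."""
    return tuple(m - e for m, e in zip(meas, err))
```

The same stored errors are reused on every driving cycle, which is why they are kept in nonvolatile memory.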
- as described above, the mounting error between the means for acquiring image information and the object detection means is reduced, and the work of adjusting their relative positional relationship is simplified. In addition, the secular change of that relative positional relationship due to vibration after mounting on the vehicle body can be substantially eliminated.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2006517877A JPWO2006035510A1 (ja) | 2004-09-29 | 2004-09-29 | Vehicle external environment recognition device |
PCT/JP2004/014666 WO2006035510A1 (ja) | 2004-09-29 | 2004-09-29 | Vehicle external environment recognition device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2004/014666 WO2006035510A1 (ja) | 2004-09-29 | 2004-09-29 | Vehicle external environment recognition device |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2006035510A1 true WO2006035510A1 (ja) | 2006-04-06 |
Family
ID=36118661
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2004/014666 WO2006035510A1 (ja) | 2004-09-29 | 2004-09-29 | 車両の外界認識装置 |
Country Status (2)
Country | Link |
---|---|
JP (1) | JPWO2006035510A1 (ja) |
WO (1) | WO2006035510A1 (ja) |
Cited By (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2122599A2 (en) | 2007-01-25 | 2009-11-25 | Magna Electronics Inc. | Radar sensing system for vehicle |
JP2011047933A (ja) * | 2009-08-13 | 2011-03-10 | Tk Holdings Inc | Object detection system |
JP2012524890A (ja) * | 2009-04-24 | 2012-10-18 | Robert Bosch GmbH | Sensor arrangement for driver assistance systems in vehicles |
DE102015218843A1 (de) | 2014-09-30 | 2016-03-31 | Nidec Elesys Corporation | Überwachungsvorrichtung |
DE102015217012A1 (de) | 2014-09-30 | 2016-03-31 | Nidec Elesys Corporation | Fahrzeugintegrierte Radarvorrichtung und Fahrzeug |
US9525206B2 (en) | 2014-02-13 | 2016-12-20 | Honda Elesys Co., Ltd. | Antenna unit, radar device, and composite sensor device |
CN107042799A (zh) * | 2016-01-19 | 2017-08-15 | Nidec Elesys Corporation | Vehicle |
JP2017175515A (ja) * | 2016-03-25 | 2017-09-28 | Faltec Co., Ltd. | Radar cover |
JP2019510967A (ja) * | 2016-02-26 | 2019-04-18 | Waymo LLC | Radar mounting determination using unstructured data |
EP3490062A1 (en) * | 2017-11-27 | 2019-05-29 | Panasonic Intellectual Property Management Co., Ltd. | Radar device |
EP3490065A1 (en) * | 2017-11-27 | 2019-05-29 | Panasonic Intellectual Property Management Co., Ltd. | Antenna device |
JPWO2018051906A1 (ja) * | 2016-09-15 | 2019-06-24 | Koito Manufacturing Co., Ltd. | Sensor system, sensor module, and lamp device |
WO2019172118A1 (ja) * | 2018-03-05 | 2019-09-12 | Koito Manufacturing Co., Ltd. | Sensor system, sensor module, and lamp device |
JP2019168455A (ja) * | 2018-03-22 | 2019-10-03 | Conti Temic microelectronic GmbH | Apparatus with a sensor assembly and a stray-light shade |
JP2020501963A (ja) * | 2016-12-20 | 2020-01-23 | Veoneer US Inc. | Integrated camera and communication antenna |
JP2020085571A (ja) * | 2018-11-20 | 2020-06-04 | Toyota Motor Corporation | Sensor mounting structure |
JP2021105602A (ja) * | 2019-12-27 | 2021-07-26 | Pioneer Corporation | Lidar device |
EP3739685B1 (en) * | 2018-01-10 | 2023-06-21 | Zanini Auto Grup, S.A. | Radome for vehicles |
EP3564576B1 (en) * | 2016-12-28 | 2023-11-22 | Koito Manufacturing Co., Ltd. | Lamp device |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH05301541A (ja) * | 1992-04-28 | 1993-11-16 | Fujitsu Ltd | Method for confirming an automobile's traveling direction using a door mirror or side mirror incorporating a camera or sensor |
JPH08276787A (ja) * | 1995-04-03 | 1996-10-22 | Suzuki Motor Corp | In-vehicle image processing device and image display system |
JPH10147178A (ja) * | 1996-11-18 | 1998-06-02 | Dx Antenna Co Ltd | Vehicle rear monitoring device |
JPH1178737A (ja) * | 1997-09-13 | 1999-03-23 | Honda Motor Co Ltd | Vehicle-mounted camera |
JP2001158284A (ja) * | 1999-11-30 | 2001-06-12 | Honda Access Corp | Mounting structure for built-in devices, such as lighting, camera, and sensor devices, incorporated into the vehicle exterior |
JP2001233139A (ja) * | 2000-02-25 | 2001-08-28 | Fuji Heavy Ind Ltd | Mounting structure for an in-vehicle preview sensor and its misalignment adjustment device |
JP2003044995A (ja) * | 2001-07-26 | 2003-02-14 | Nissan Motor Co Ltd | Object type determination device and object type determination method |
JP2003169233A (ja) * | 2001-12-03 | 2003-06-13 | Toyoda Gosei Co Ltd | Camera for automobile mounting |
WO2003053743A1 (de) * | 2001-12-20 | 2003-07-03 | Robert Bosch Gmbh | Stereo camera arrangement in a motor vehicle |
JP2004082829A (ja) * | 2002-08-26 | 2004-03-18 | Denso Corp | In-vehicle camera |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004260787A (ja) * | 2002-09-09 | 2004-09-16 | Rohm Co Ltd | Image sensor module |
2004
- 2004-09-29 WO PCT/JP2004/014666 patent/WO2006035510A1/ja active Application Filing
- 2004-09-29 JP JP2006517877A patent/JPWO2006035510A1/ja active Pending
Cited By (41)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10107905B2 (en) | 2007-01-25 | 2018-10-23 | Magna Electronics Inc. | Forward facing sensing system for vehicle |
EP3624086A1 (en) * | 2007-01-25 | 2020-03-18 | Magna Electronics Inc. | Radar sensing system for vehicle |
EP2122599B1 (en) * | 2007-01-25 | 2019-11-13 | Magna Electronics Inc. | Radar sensing system for vehicle |
US10670713B2 (en) | 2007-01-25 | 2020-06-02 | Magna Electronics Inc. | Forward sensing system for vehicle |
US10877147B2 (en) | 2007-01-25 | 2020-12-29 | Magna Electronics Inc. | Forward sensing system for vehicle |
US11815594B2 (en) | 2007-01-25 | 2023-11-14 | Magna Electronics Inc. | Vehicular forward-sensing system |
EP2122599A2 (en) | 2007-01-25 | 2009-11-25 | Magna Electronics Inc. | Radar sensing system for vehicle |
US11506782B2 (en) | 2007-01-25 | 2022-11-22 | Magna Electronics Inc. | Vehicular forward-sensing system |
JP2012524890A (ja) * | 2009-04-24 | 2012-10-18 | Robert Bosch GmbH | Sensor arrangement for driver assistance systems in vehicles |
US9229104B2 (en) | 2009-04-24 | 2016-01-05 | Robert Bosch Gmbh | Sensor assembly for driver assistance systems in motor vehicles |
JP2011047933A (ja) * | 2009-08-13 | 2011-03-10 | Tk Holdings Inc | Object detection system |
US9525206B2 (en) | 2014-02-13 | 2016-12-20 | Honda Elesys Co., Ltd. | Antenna unit, radar device, and composite sensor device |
DE102015217012A1 (de) | 2014-09-30 | 2016-03-31 | Nidec Elesys Corporation | Fahrzeugintegrierte Radarvorrichtung und Fahrzeug |
US10082571B2 (en) | 2014-09-30 | 2018-09-25 | Nidec Elesys Corporation | Monitoring apparatus |
US9799949B2 (en) | 2014-09-30 | 2017-10-24 | Nidec Corporation | On-vehicle radar device and vehicle |
CN105467385A (zh) * | 2014-09-30 | 2016-04-06 | 日本电产艾莱希斯株式会社 | 监控装置 |
DE102015218843A1 (de) | 2014-09-30 | 2016-03-31 | Nidec Elesys Corporation | Überwachungsvorrichtung |
CN107042799A (zh) * | 2016-01-19 | 2017-08-15 | 日本电产艾莱希斯株式会社 | 车辆 |
US10322566B2 (en) | 2016-01-19 | 2019-06-18 | Nidec Corporation | Vehicle |
CN107042799B (zh) * | 2016-01-19 | 2019-08-09 | 日本电产株式会社 | 车辆 |
US11016174B2 (en) | 2016-02-26 | 2021-05-25 | Waymo Llc | Radar mounting estimation with unstructured data |
JP2019510967A (ja) * | 2016-02-26 | 2019-04-18 | Waymo LLC | Radar mounting determination using unstructured data |
JP2017175515A (ja) * | 2016-03-25 | 2017-09-28 | Faltec Co., Ltd. | Radar cover |
JP7061071B2 (ja) | 2016-09-15 | 2022-04-27 | Koito Manufacturing Co., Ltd. | Sensor system, sensor module, and lamp device |
JPWO2018051906A1 (ja) * | 2016-09-15 | 2019-06-24 | Koito Manufacturing Co., Ltd. | Sensor system, sensor module, and lamp device |
US11467284B2 (en) | 2016-09-15 | 2022-10-11 | Koito Manufacturing Co., Ltd. | Sensor system, sensor module, and lamp device |
JP2020501963A (ja) * | 2016-12-20 | 2020-01-23 | Veoneer US Inc. | Integrated camera and communication antenna |
EP3564576B1 (en) * | 2016-12-28 | 2023-11-22 | Koito Manufacturing Co., Ltd. | Lamp device |
EP3490062A1 (en) * | 2017-11-27 | 2019-05-29 | Panasonic Intellectual Property Management Co., Ltd. | Radar device |
EP3490065A1 (en) * | 2017-11-27 | 2019-05-29 | Panasonic Intellectual Property Management Co., Ltd. | Antenna device |
US10804615B2 (en) | 2017-11-27 | 2020-10-13 | Panasonic Intellectual Property Management Co., Ltd. | Radar device |
EP3739685B1 (en) * | 2018-01-10 | 2023-06-21 | Zanini Auto Grup, S.A. | Radome for vehicles |
US11248767B2 (en) | 2018-03-05 | 2022-02-15 | Koito Manufacturing Co., Ltd. | Sensor system, sensor module, and lamp device |
WO2019172118A1 (ja) * | 2018-03-05 | 2019-09-12 | Koito Manufacturing Co., Ltd. | Sensor system, sensor module, and lamp device
US10620308B2 (en) | 2018-03-22 | 2020-04-14 | Conti Temic Microelectronic Gmbh | Apparatus with a sensor assembly and a stray light baffle |
US10823839B2 (en) | 2018-03-22 | 2020-11-03 | Conti Temic Microelectronic Gmbh | Apparatus with a sensor assembly and a stray light baffle |
JP2019168455A (ja) * | 2018-03-22 | 2019-10-03 | Conti Temic Microelectronic GmbH | Apparatus with a sensor assembly and a stray light shade
JP2020085571A (ja) * | 2018-11-20 | 2020-06-04 | Toyota Jidosha Kabushiki Kaisha | Sensor mounting structure
US11693095B2 (en) | 2018-11-20 | 2023-07-04 | Toyota Jidosha Kabushiki Kaisha | Sensor mounting structure |
JP7131326B2 (ja) | 2018-11-20 | 2022-09-06 | Toyota Jidosha Kabushiki Kaisha | Sensor mounting structure
JP2021105602A (ja) * | 2019-12-27 | 2021-07-26 | Pioneer Corporation | Lidar device
Also Published As
Publication number | Publication date |
---|---|
JPWO2006035510A1 (ja) | 2008-05-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2006035510A1 (ja) | External environment recognition device for vehicle | |
US11156711B2 (en) | Vehicular sensing system using RF sensors | |
US9229104B2 (en) | Sensor assembly for driver assistance systems in motor vehicles | |
EP2340185B1 (en) | Integrated radar-camera sensor | |
US20200238904A1 (en) | Apparatus, system and method for preventing collision | |
KR102179784B1 (ko) | Method for calibrating a motor vehicle sensor for angle measurement, computing device, driver assistance system, and motor vehicle | |
US20140247349A1 (en) | Integrated lighting, camera and sensor unit | |
US20150123838A1 (en) | Radar antenna assembly | |
JP6317657B2 (ja) | Radar sensor module | |
JP2017147487A (ja) | Vehicle equipped with a radar device | |
JP3905410B2 (ja) | Vehicle driving support device | |
US11719561B2 (en) | Sensor module having a sensor carrier rotatable about an axis, and method for assembling a sensor module of this type | |
TWI540063B (zh) | Blind spot detection system | |
US20200158863A1 (en) | Sensor system for a vehicle and method for determining assessment threat | |
JP7305535B2 (ja) | Distance measuring device and distance measuring method | |
JP2009208581A (ja) | Automatic mirror adjustment system and image capturing device | |
JP2004085258A (ja) | Radar device | |
US20210405186A1 (en) | Obstacle detection system and method using distance sensor | |
KR20140109716A (ko) | Apparatus and method for adjusting the alignment of a vehicle radar | |
CN113740854B (zh) | Radar device | |
US20210349206A1 (en) | Radar for vehicle | |
JP7119444B2 (ja) | Distance measuring system, distance measuring method, in-vehicle device, and vehicle | |
JP7526207B2 (ja) | Radar antenna arrangement for a vehicle comprising at least one vehicle component, and vehicle | |
WO2017171083A1 (ja) | Object detection device and object detection method | |
US20230048226A1 (en) | Imaging device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2006517877 Country of ref document: JP |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 04788445 Country of ref document: EP Kind code of ref document: A1 |