WO2019176418A1 - Lampe de véhicule, procédé de détection de véhicule et dispositif de détection de véhicule - Google Patents

Lampe de véhicule, procédé de détection de véhicule et dispositif de détection de véhicule Download PDF

Info

Publication number
WO2019176418A1
WO2019176418A1 (application PCT/JP2019/004994)
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
light spot
unit
image
lamp
Prior art date
Application number
PCT/JP2019/004994
Other languages
English (en)
Japanese (ja)
Inventor
光治 眞野
亮太 小倉
高範 難波
Original Assignee
株式会社小糸製作所 (KOITO MANUFACTURING CO., LTD.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2018048049A external-priority patent/JP2019156276A/ja
Priority claimed from JP2018048050A external-priority patent/JP2019156277A/ja
Application filed by 株式会社小糸製作所 (KOITO MANUFACTURING CO., LTD.)
Publication of WO2019176418A1 publication Critical patent/WO2019176418A1/fr

Links

Images

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60Q ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q1/00 Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
    • B60Q1/0017 Devices integrating an element dedicated to another function
    • B60Q1/0023 Devices integrating an element dedicated to another function, the element being a sensor, e.g. distance sensor, camera
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60Q ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q1/00 Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
    • B60Q1/02 Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor, the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments
    • B60Q1/04 Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor, the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments, the devices being headlights
    • B60Q1/14 Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor, the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments, the devices being headlights having dimming means
    • B60Q1/1415 Dimming circuits
    • B60Q1/1423 Automatic dimming circuits, i.e. switching between high beam and low beam due to change of ambient light or light level in road traffic
    • B60Q1/143 Automatic dimming circuits, i.e. switching between high beam and low beam due to change of ambient light or light level in road traffic, combined with another condition, e.g. using vehicle recognition from camera images or activation of wipers
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/10 Image acquisition
    • G06V10/12 Details of acquisition arrangements; Constructional details thereof
    • G06V10/14 Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V10/141 Control of illumination
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58 Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • G06V20/584 Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads, of vehicle lights or traffic lights
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60Q ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q2300/00 Indexing codes for automatically adjustable headlamps or automatically dimmable headlamps
    • B60Q2300/05 Special features for controlling or switching of the light beam
    • B60Q2300/056 Special anti-blinding beams, e.g. a standard beam is chopped or moved in order not to blind

Definitions

  • the present disclosure relates to a vehicle lamp suitable for application to a headlamp of an automobile, and more particularly to a vehicle lamp provided with imaging means such as a camera.
  • the present disclosure also relates to a vehicle detection method and a vehicle detection device that are mounted on a vehicle such as an automobile and detect other vehicles such as an oncoming vehicle and a preceding vehicle.
  • ADB (Adaptive Driving Beam) control
  • AHB (Automatic High Beam)
  • A headlamp capable of light distribution control (hereinafter also referred to as ADB control), in which a camera is integrated as an imaging means for detecting an oncoming vehicle or a preceding vehicle, has been proposed.
  • a camera unit is housed together with a lamp unit in a lamp housing, and imaging is performed by the camera unit.
  • With this arrangement, the imaging optical axis of the imaging means and the light irradiation optical axis of the lamp unit can be brought closer together than when the imaging means is disposed behind the front window of the automobile, so control such as ADB control can be performed with higher accuracy. In other words, bringing the central axis of the image capturing oncoming and preceding vehicles close to the central axis of the light distribution reduces the light distribution error caused by the parallax between the two axes.
  • Patent Document 2 proposes a technique in which a light spot in an image obtained by an imaging means is detected, and other vehicles are detected by discriminating the headlamps and taillights of other vehicles from attributes of the detected light spot, for example its color and behavior (moving state). Patent Document 2 also proposes a technique for shortening the detection time by setting a predetermined detection area for detecting other vehicles based on the detected light spot attributes and performing the determination only within that area.
  • aiming adjustment for adjusting the optical axis direction of the lamp unit is performed after the headlamp is assembled to the car body of the automobile.
  • It is conceivable to perform this aiming adjustment while confirming the light distribution pattern picked up by the imaging means.
  • Vehicle detection is performed by analyzing an image captured by the imaging unit. To improve the accuracy of this image-analysis-based detection, the use of AI (Artificial Intelligence) with a learning function is considered.
  • An object of the present disclosure is to provide a vehicle lamp capable of realizing suitable light distribution control.
  • Limiting the detection area, as in Patent Document 2, is effective in reducing the detection time. However, other vehicles must be detected over a wide area, and it is therefore difficult for the technique of Patent Document 2 to meet this requirement. In urban areas in particular, illuminants such as road signs and illuminated advertisements increase, so the number of light spots in the captured image tends to grow. Processing to discriminate other vehicles from the behavior of all of these light spots is then required, which increases both the number of detection steps and the detection time.
  • An object of the present disclosure is to provide a vehicle detection method and a vehicle detection device that can narrow down light spots in a captured image and detect a vehicle accurately and quickly.
  • The present disclosure relates to a vehicular lamp having a lamp unit that performs light irradiation, an imaging unit that captures at least the light irradiation region of the lamp unit, and a control unit, connected to a signal bus disposed in the vehicle, for controlling the lamp unit and the imaging unit. The lamp is configured so that aiming adjustment of the lamp unit is performed based on a signal obtained by signal-processing the imaging signal captured by the imaging unit, and the control unit outputs the signal-processed signal to the signal bus when a predetermined command signal is received.
  • the signal processed signal is preferably an image signal based on the imaging signal or an aiming adjustment signal.
  • the camera and the lamp unit are built in the lamp housing, and the camera is detachably supported with respect to the lamp housing.
  • Preferably, the camera is disposed on only one selected vehicle lamp of the pair of vehicle lamps.
  • Preferably, the control unit includes a detection unit that detects at least other vehicles based on the imaging signal captured by the camera, and the detection unit is machine-learned based on automatically generated traveling simulation data, in which both the traveling image data and the teacher (ground-truth) data are generated automatically.
  • The vehicle detection method is mounted on a vehicle, images the surroundings of the vehicle, and detects other vehicles from the behavior of light spots captured in the obtained image. The method includes identifying, among the plurality of light spots in the image, the target light spots that are candidates for vehicle detection, detecting the behavior of each target light spot, and detecting a light spot exhibiting a predetermined behavior as a vehicle.
  • A vehicle detection device includes an imaging unit that is mounted on a vehicle and images its surroundings, a light spot identification unit that identifies, among the plurality of light spots present in the captured image, the target light spots that are vehicle detection candidates and the other non-target light spots, and a detection unit that detects a vehicle from the behavior of the identified target light spots in the image. The light spot identification unit recognizes the shapes of the plurality of light spots and identifies the target and non-target light spots based on that recognition.
  • The aiming adjuster can acquire, through the vehicle's signal bus, a signal obtained by signal-processing the imaging signal picked up by the imaging unit, for example an image signal or a calculated aiming adjustment signal. High-precision aiming adjustment can therefore be performed easily, and suitable light distribution control can be realized.
  • The shape of each light spot in the image captured by the imaging unit is identified, a light spot likely to belong to a vehicle is identified from its shape as a target light spot, and vehicle detection processing is executed only for the identified target light spots. As a result, even when many light spots are present in the image, detection processing need not be performed for light spots unlikely to belong to vehicles, so the number of processing steps for detecting a vehicle is reduced and rapid vehicle detection becomes possible.
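  • As an illustrative sketch of this narrowing step (not code from the disclosure), the following assumes each detected blob is summarized by its bounding box and lit-pixel area, and keeps as targets only spots whose shape could plausibly be a vehicle lamp; the blob representation and all thresholds are invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class LightSpot:
    width: float   # bounding-box width in pixels
    height: float  # bounding-box height in pixels
    area: float    # lit-pixel count inside the box

def is_target_spot(spot: LightSpot,
                   min_area: float = 9.0,
                   max_aspect: float = 2.5,
                   min_fill: float = 0.5) -> bool:
    """Classify a light spot as a vehicle-lamp candidate by shape alone."""
    if spot.area < min_area:                      # too small to judge reliably
        return False
    aspect = max(spot.width, spot.height) / max(min(spot.width, spot.height), 1e-6)
    if aspect > max_aspect:                       # elongated: sign edge, reflection
        return False
    fill = spot.area / (spot.width * spot.height) # how "solid" the blob is
    return fill >= min_fill                       # hollow or sparse shapes rejected

def select_targets(spots):
    """Return only the spots worth tracking for behavior analysis."""
    return [s for s in spots if is_target_spot(s)]
```

With these assumed thresholds, a compact round spot passes while a strongly elongated one is rejected before any behavior tracking is spent on it.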
  • Brief description of the drawings: a conceptual diagram of an automobile to which a headlamp according to the present disclosure is applied; a schematic horizontal sectional view of the right headlamp; an exploded perspective view of the camera; a block diagram of the lamp ECU; a flow diagram of DL (deep learning); a conceptual diagram showing the second aiming adjustment mode; a schematic diagram showing the CAN connection state of the left and right headlamps; a block diagram of the lamp ECUs of those headlamps; and schematic diagrams showing the images captured by the cameras of the left and right headlamps together with their light distributions.
  • FIG. 3 is a schematic horizontal sectional view of a headlamp to which the present disclosure is applied.
  • FIG. 1 is a conceptual diagram of an embodiment in which the present disclosure is applied to a headlamp HL of an automobile.
  • the left and right headlamps L-HL and R-HL of the automobile CAR are each provided with a lamp unit 1 capable of ADB control in a lamp housing 3.
  • the lamp unit 1 includes, for example, a light source 11 composed of a plurality of LEDs (light emitting diodes). All or selected LEDs 11 are caused to emit light, and the emitted light is projected onto a front area of the automobile by a projection lens 12. Thus, it is possible to perform light irradiation with a desired light distribution pattern P.
  • Each light emitting region of the plurality of LEDs 11 is set to illuminate a predetermined segmented region Ap in the front region of the automobile, and only the segmented regions Ap corresponding to the selected, emitting LEDs 11 are illuminated. Therefore, when all the LEDs 11 emit light, the entire area of the light distribution pattern P is illuminated, and by selecting which LEDs emit light, the illuminated segments can be changed.
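  • As a minimal sketch of this segmented control (the segment count, field-of-view span, and angle convention below are assumptions for illustration, not values from the disclosure), the on/off state of each LED can be derived from the angular positions of detected vehicles:

```python
NUM_SEGMENTS = 12   # number of LEDs / segmented regions Ap (assumed)
FOV_DEG = 40.0      # horizontal span covered by the pattern P (assumed)

def segment_of(angle_deg: float) -> int:
    """Map a horizontal angle (0 = left edge of the pattern) to a segment index."""
    idx = int(angle_deg / (FOV_DEG / NUM_SEGMENTS))
    return min(max(idx, 0), NUM_SEGMENTS - 1)

def led_states(vehicle_angles_deg):
    """Return per-LED on/off flags: off for any segment holding a vehicle."""
    blocked = {segment_of(a) for a in vehicle_angles_deg}
    return [i not in blocked for i in range(NUM_SEGMENTS)]
```

For example, vehicles detected at 5° and 21° from the pattern's left edge would darken two segments while the remaining ten stay lit.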
  • a camera 2 that captures a moving image is provided as an imaging means in the lamp housing 3 of each of the headlamps L-HL and R-HL.
  • the camera 2 images at least a front area of the automobile irradiated with light by the lamp unit 1 and outputs an imaging signal.
  • the left and right headlamps L-HL and R-HL are respectively assembled to the left and right of the front part of the car body of the car CAR in the car assembly process.
  • Optical axis adjustment, that is, aiming adjustment, of the lamp unit 1 and the camera 2 is then executed.
  • the irradiation optical axis of the lamp unit 1 and the imaging optical axis of the camera 2 are set in a predetermined direction, so that highly accurate ADB control can be realized.
  • FIG. 2 is a schematic horizontal sectional view of the right headlamp R-HL among the headlamps HL.
  • the lamp housing 3 includes a lamp body 31 and a translucent front cover 32.
  • a base plate 5 is supported on the lamp body 31 by an aiming mechanism 4.
  • the lamp unit 1 and the camera 2 are attached to the base plate 5.
  • the irradiation optical axis of the lamp unit 1 and the imaging optical axis of the camera 2 are attached to the base plate 5 in a state where they are directed in the same direction.
  • By turning the aiming screws 41, the base plate 5 is tilted vertically and horizontally, thereby performing aiming adjustment of the irradiation optical axis of the lamp unit 1 and the imaging optical axis of the camera 2. Since an aiming mechanism 4 having such aiming screws 41 is already known, a detailed description is omitted; here, the aiming adjustment is performed manually.
  • the camera 2 includes a camera body 21 including an imaging lens 211 and an imaging element 212, and a camera holder 22 that holds the camera body 21.
  • the camera holder 22 is fixed to the base plate 5 by appropriate means.
  • the camera body 21 is detachably attached to the camera holder 22.
  • The camera body 21 is configured to be attached and detached by sliding it up and down with respect to the camera holder 22 from outside the lamp housing 3, through an opening (not shown) provided in the upper surface of the lamp body 31.
  • These camera body 21 and camera holder 22 are provided with electrodes 23 and 24, respectively. When the camera body 21 is mounted on the camera holder 22, both are electrically connected to each other by these electrodes 23 and 24.
  • FIG. 2 is a block diagram of the lamp ECU 6.
  • the lamp ECU 6 includes a main control unit 61.
  • An input / output unit 62, a signal processing unit 63, an image analysis unit 64, and a lighting drive unit 65 are connected to the main control unit 61.
  • The main control unit 61 controls the operations of the input/output unit 62, the signal processing unit 63, the image analysis unit 64, and the lighting drive unit 65.
  • the lamp unit 1 and the camera 2 described above are connected to the input / output unit 62.
  • the camera body 21 is connected to the input / output unit 62 via the camera holder 22.
  • The input/output unit 62 is connected to a signal bus, for example a CAN (Controller Area Network) 100, routed through the car CAR.
  • the input / output unit 62 is connected to various other ECUs of the automobile not shown in the drawing via the CAN 100.
  • the signal processing unit 63 processes the image pickup signal of the camera 2 input from the input / output unit 62 to generate a moving image or a still image signal.
  • the image analysis unit 64 analyzes an image obtained from the generated image signal, and detects an object in the captured image, particularly other vehicles such as an oncoming vehicle and a preceding vehicle. Therefore, the image analysis unit 64 is configured as other vehicle detection means.
  • The lighting drive unit 65 recognizes the objects detected by the image analysis unit 64, sets an appropriate light distribution pattern that does not dazzle oncoming or preceding vehicles, and generates an ADB control signal for producing that pattern. Based on this ADB control signal, the light emission of the light source of the lamp unit 1, that is, of the plurality of LEDs 11, is controlled.
  • Thus, the lamp ECU 6 detects other vehicles such as oncoming and preceding vehicles from the image captured by the camera 2, and controls the lighting of the lamp unit 1 so as to obtain a light distribution pattern that does not dazzle the detected vehicles while brightly illuminating the other regions; appropriate ADB control is thereby executed.
  • The image analysis unit 64, that is, the other-vehicle detection means, detects other vehicles using a model machine-learned by DL (Deep Learning) with AI, based on automatically generated traveling simulation data.
  • To raise detection performance, it is desirable to perform a large amount of learning continuously.
  • When executing the DL, as shown in the flow of FIG. 5, the image analysis unit 64 repeats: automatic generation of a driving environment (S1), automatic generation and arrangement of road objects (S2), automatic generation of road driving (S3), acquisition of correct (ground-truth) data by traveling of the own vehicle (S4), and learning by a learning device (S5). This increases the learning amount and improves detection performance.
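  • The S1–S5 loop above can be sketched schematically as follows; the environment generator, renderer, and learning step are placeholders standing in for the traveling simulator and learning device, since the disclosure specifies only the loop structure, not a concrete learner:

```python
import random

def generate_environment(seed):            # S1: driving environment
    random.seed(seed)
    return {"lanes": random.choice([1, 2, 3])}

def place_objects(env):                    # S2: road objects (vehicles, signs, ...)
    return [{"kind": "vehicle", "x": random.uniform(-1, 1)}
            for _ in range(env["lanes"])]

def drive_and_render(env, objects):        # S3 + S4: drive, render, record truth
    return [{"image": f"frame-{i}", "truth": objects} for i in range(3)]

def train_step(model, frames):             # S5: one learning pass (placeholder)
    model["seen"] += len(frames)
    return model

model = {"seen": 0}
for epoch in range(5):                     # repeat S1..S5 to grow the data set
    env = generate_environment(seed=epoch)
    objs = place_objects(env)
    frames = drive_and_render(env, objs)
    model = train_step(model, frames)
```

Each pass through the loop yields new labeled frames, so the effective training-set size grows without manual annotation.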
  • an image of traveling on a road in a virtual space is created by a traveling simulator.
  • The presence or absence of a vehicle and its position on the image as seen from the own vehicle's camera are calculated by a 3D-to-2D conversion from the position, orientation, and distance of the vehicle placed at arbitrary coordinates in the virtual space and from the direction and position of the own vehicle's camera, and are recorded as correct answer data.
  • Camera parameters such as the F value, angle of view, and exposure time are also taken into account.
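  • The 3D-to-2D conversion that produces the correct-answer labels can be sketched with a simple pinhole projection; the focal length and principal point below are illustrative assumptions, not parameters from the disclosure:

```python
def project(point_xyz, focal_px=800.0, cx=640.0, cy=360.0):
    """Project a camera-frame 3D point (x right, y down, z forward) to pixels.

    Returns None for points behind the camera, which are simply not
    labeled in the corresponding frame.
    """
    x, y, z = point_xyz
    if z <= 0:
        return None                     # behind the camera: not visible
    u = cx + focal_px * x / z           # horizontal pixel coordinate
    v = cy + focal_px * y / z           # vertical pixel coordinate
    return (u, v)
```

A vehicle straight ahead lands on the principal point, while lateral offset and distance together determine its pixel position, exactly the quantities recorded as ground truth.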
  • On top of that, the own vehicle travels on all roads, in both directions.
  • The own vehicle and the other vehicles travel around the course at the set road speed limit ±20 km/h and are set to obey traffic rules such as traffic lights and stop signs; the vehicles are also set so as not to collide with one another. The learning device then takes as teacher data the images obtained by the traveling of the host vehicle together with the corresponding correct answers.
  • a road is generated in a virtual space from an actual road map image (2D map, satellite photograph image).
  • roads are generated by identifying roads and other roads by image processing from maps and satellite photographs.
  • the number of lanes may be set appropriately from the width of the road, or the lanes may be forcibly determined according to the test case.
  • Objects other than roads may be appropriately arranged in areas other than roads, such as buildings, parking lots, signboards, and mountainous areas. Pedestrian crossings and traffic signals are placed where roads intersect. Other vehicles may be arranged according to the length of the generated road, and may be set to keep moving from one building to another.
  • Detection performance by DL is improved by repeating learning with the traveling simulator. In this embodiment, therefore, the image analysis unit of the lamp ECU, machine-learned on simulation data obtained through such learning, achieves improved detection accuracy for other vehicles such as oncoming and preceding vehicles, making highly accurate ADB control possible.
  • The configuration of the left headlamp L-HL is substantially the same, except that the arrangement of the lamp unit 1 and the camera 2 in the lamp housing 3 is the mirror image of the configuration of FIG. 2. Although not illustrated, the left headlamp L-HL also has a lamp ECU built into its lamp housing and connected to the CAN 100.
  • After the left and right headlamps L-HL and R-HL are mounted on the body of the automobile CAR, aiming adjustment is performed so that the irradiation optical axis of the lamp unit 1 and the imaging optical axis of the camera 2 are directed in a predetermined direction. In this aiming adjustment, as shown in FIG. 1, the light of the lamp unit 1 is irradiated onto a screen Sc disposed at a predetermined position in front of the car CAR, and the light distribution pattern P generated by this irradiation is captured by the camera 2.
  • The irradiation optical axis of the lamp unit 1 is detected from the captured image of the light distribution pattern, and aiming adjustment is performed by operating the aiming screws 41 so that the irradiation optical axis is aimed at a predetermined position, for example the center O at the intersection of the horizontal line H and the vertical line V.
  • Since the camera 2 is mounted on the same base plate 5, the aiming adjustment of the lamp unit 1 simultaneously accomplishes the aiming adjustment of the imaging optical axis of the camera 2.
  • FIG. 6 is a schematic diagram for explaining the first adjustment mode.
  • an external connector 7 is provided on a part of the lamp housing 3 and is directly connected to the camera 2.
  • the external aiming adjuster 8A is connected to the external connector 7.
  • The first aiming adjuster 8A includes a signal processing unit 81 similar to the signal processing unit 63 provided in the lamp ECU 6, and a monitor 82 for displaying the image signal generated by the signal processing unit 81.
  • a monitor driving unit 83 for displaying an image corresponding to the image signal on the monitor 82 is provided.
  • a reference is clearly specified in advance on the screen Sc shown in FIG. 1, and the posture of the automobile CAR is adjusted in accordance with this reference.
  • The lamp unit 1 is turned on to irradiate the screen Sc with light, and the irradiated light distribution pattern P is then imaged by the camera 2.
  • The first aiming adjuster 8A captures the picked-up image signal through the external connector 7, and its signal processing unit 81 generates an image signal.
  • the monitor driving unit 83 displays an image, that is, a captured light distribution pattern on the monitor 82 based on the generated image signal.
  • The operator, while viewing the monitor 82, confirms how much the displayed light distribution pattern deviates from a predetermined position, for example the center, in each of the three directions of yaw, roll, and pitch, and manually adjusts the aiming screws 41 to eliminate this deviation. Aiming adjustment is thereby executed.
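  • A minimal sketch of this deviation check (assuming the captured pattern is available as a 2D grid of brightness values; that representation is an assumption for illustration) locates the brightest point and measures its pixel offset from the expected center O:

```python
def pattern_deviation(image, center):
    """Return (dx, dy) from the expected center to the brightest pixel.

    image  -- 2D list of brightness values (rows of pixels)
    center -- (x, y) pixel position of the expected pattern center O
    """
    best, best_xy = -1, (0, 0)
    for y, row in enumerate(image):
        for x, val in enumerate(row):
            if val > best:              # track the brightest pixel seen so far
                best, best_xy = val, (x, y)
    return (best_xy[0] - center[0], best_xy[1] - center[1])
```

The signs of (dx, dy) tell the operator (or an automatic adjuster) in which direction to turn the aiming screws.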
  • In this first adjustment mode, the lamp housing 3 of the headlamp HL must be provided with an external connector 7 for connecting the first aiming adjuster 8A in order to capture the image signal picked up by the camera 2. This is an obstacle to miniaturization of the headlamp HL and increases its cost. In addition, the external connector 7 must be connected and disconnected, which takes time and degrades work efficiency. Furthermore, the first aiming adjuster 8A must incorporate the signal processing unit 81, making it expensive.
  • the lamp housing 3 of the headlamp HL is not provided with an external connector.
  • the second aiming adjuster 8B is not provided with a signal processing unit, and includes a monitor 82 and a monitor driving unit 83.
  • the second aiming adjuster 8B can be connected to the CAN 100 connected to the lamp ECU 6.
  • the connection to the CAN 100 can be easily realized by connecting to an existing CAN connector provided in the automobile CAR.
  • the lamp ECU 6 incorporates a predetermined program in the main control unit 61 in advance.
  • When a predetermined command signal output from the second aiming adjuster 8B through the CAN 100 is input, the lamp ECU 6 is configured to output (transmit) the image signal generated by the signal processing unit 63 from the input/output unit 62 to the CAN 100.
  • this predetermined command signal for example, a signal output when the power of the second aiming adjuster 8B is turned on or a signal output in advance by an operation in the second aiming adjuster 8B can be adopted.
  • The second aiming adjuster 8B receives the image signal output from the lamp ECU 6 on the CAN 100 and causes the monitor driving unit 83 to display it on the monitor 82.
  • the operator who performs aiming adjustment can perform aiming adjustment while visually recognizing the image picked up by the camera 2 on the monitor 82 as in the first adjustment mode.
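  • The message flow of the second adjustment mode can be modeled schematically as below; the frame IDs and the in-memory bus are invented for illustration and do not reflect real CAN framing or timing:

```python
CMD_SEND_IMAGE = 0x100      # assumed command-frame ID from the adjuster
RSP_IMAGE = 0x101           # assumed response-frame ID from the lamp ECU

class Bus:
    """Toy stand-in for the shared CAN 100: every node sees every frame."""
    def __init__(self):
        self.frames = []
    def send(self, frame_id, payload=None):
        self.frames.append((frame_id, payload))

def lamp_ecu_poll(bus, image_signal):
    """On seeing the command frame, the ECU outputs its image signal."""
    if any(fid == CMD_SEND_IMAGE for fid, _ in bus.frames):
        bus.send(RSP_IMAGE, image_signal)

def adjuster_receive(bus):
    """The adjuster reads back the image signal for display on the monitor."""
    for fid, payload in bus.frames:
        if fid == RSP_IMAGE:
            return payload
    return None
```

Usage: the adjuster sends the command, the ECU polls and replies, and the adjuster then has the image to display, with no external connector involved.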
  • In the second adjustment mode, it is not necessary to provide an external connector on the lamp housing 3, and no connection or disconnection of such a connector is required. According to the second adjustment mode, therefore, the headlamp HL can be reduced in size and cost, and quick and easy adjustment can be realized. Furthermore, since the second aiming adjuster 8B need not incorporate a signal processing unit, it can be configured inexpensively.
  • Since the lamp ECU 6 is configured to realize the second adjustment mode, quick and easy aiming adjustment can be achieved with an aiming adjuster of simple configuration such as the second aiming adjuster 8B. The irradiation optical axis of the lamp unit 1 and the imaging optical axis of the camera 2 in the headlamp HL can thereby be adjusted accurately, and suitable ADB control can be realized.
  • The aiming adjuster may include a calculation unit that calculates the amount of deviation of the light distribution pattern from a predetermined position and a motor drive unit that feedback-controls a motor so that the deviation becomes zero; aiming adjustment can then be performed automatically. In this case, since the aiming adjustment can be performed using only the image signal, the monitor and the monitor driving unit may be omitted.
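  • Assuming a roughly linear screw response, this automatic mode reduces to a simple proportional feedback loop; the gain, tolerance, and plant model below are illustrative assumptions, not values from the disclosure:

```python
def auto_aim(deviation, gain=0.5, tol=0.01, max_iter=100):
    """Drive a 1-D aiming deviation toward zero.

    Each iteration commands a motor rotation proportional to the error
    and re-measures; returns (final_deviation, iterations_used).
    """
    steps = 0
    while abs(deviation) > tol and steps < max_iter:
        motor_cmd = -gain * deviation    # proportional motor command
        deviation += motor_cmd           # assumed linear screw response
        steps += 1
    return deviation, steps
```

With a gain below 1 the error shrinks geometrically each cycle, so the loop converges within the tolerance in a handful of iterations.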
  • Only the image signal of the area necessary for aiming adjustment, for example a predetermined area including the center of the imaging region captured by the camera 2, may be output to the CAN 100. This reduces the amount of data output and speeds up the aiming adjustment. Alternatively, when the aiming deviation is expected to be small, the luminance of the captured light distribution pattern may be measured, and only the high-luminance region, which is highly likely to include the center, may be output to the CAN 100.
  • The main control unit 61 of the lamp ECU 6 may include a calculation unit that calculates the deviation amount in place of the second aiming adjuster, or a calculation unit that calculates the rotation amount of the aiming screw motor based on that deviation amount. With the former aiming adjustment signal, only the calculated deviation amount need be output from the lamp ECU 6 to the CAN 100, and the second aiming adjuster need only include a motor drive unit. With the latter aiming adjustment signal, only the calculated motor rotation amount need be output from the lamp ECU 6, and the configuration of the motor drive unit of the second aiming adjuster can be simplified. Aiming adjustment can thereby be further simplified and sped up.
  • Since the lamp ECUs 6 of the left and right headlamps L-HL and R-HL are both connected to the CAN 100, they can send and receive signals to each other through the CAN 100. That is, an image signal output from one headlamp HL to the CAN 100 can be captured by the lamp ECU 6 of the other headlamp HL. For example, when the lamp ECU 6 of the other headlamp HL receives a command signal output to the CAN 100 by the lamp ECU 6 of one headlamp HL, the receiving lamp ECU 6 outputs the image signal it has generated to the CAN 100; the lamp ECU 6 of the one headlamp HL can then receive and capture this image signal.
  • the left and right headlamps L-HL and R-HL lamp ECUs 6 (6L, 6R) are each provided with an abnormality detection unit 66 for detecting an abnormality of the camera, as shown in FIG.
  • the main control unit 61 is configured to output a predetermined abnormality command signal when the abnormality detection unit 66 detects an abnormality of the camera 2.
  • When the camera 2R of the right headlamp R-HL becomes abnormal, the right headlamp R-HL cannot perform ADB control of its lamp unit 1 based on an image signal captured by its own camera 2R. In that case, an abnormality command signal is output from the main control unit 61 of its lamp ECU 6 to the CAN 100.
  • A camera abnormality here means a state in which a normal image cannot be acquired, for example when imaging becomes impossible or the field of view deteriorates during imaging.
  • the lamp ECU 6L of the left headlamp L-HL normally performs ADB control based on the image signal captured by the camera 2L on its own side. When the abnormal command signal output from the lamp ECU 6R of the right headlamp R-HL is input through the CAN 100, however, the lamp ECU 6L outputs the image signal captured by the camera 2L on its own side to the CAN 100. Since this image signal is input through the CAN 100 to the lamp ECU 6R of the right headlamp R-HL, which is in the abnormal state, the right headlamp R-HL executes ADB control based on the input image signal captured by the camera 2L of the left headlamp L-HL.
  • even if an abnormality occurs in one of the cameras 2R and 2L, ADB control can thus be maintained in both headlamps L-HL and R-HL based on the image captured by the other, normal camera.
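The failover between the left and right lamp ECUs, in which each ECU prefers the camera on its own side and falls back to the image that the opposite ECU broadcasts on the CAN after an abnormal command signal, can be sketched as two small decision functions. This is a hypothetical sketch; the function names and return values are illustrative and not part of the disclosure.

```python
def select_image_source(own_camera_ok: bool, peer_camera_ok: bool):
    """Which camera image a lamp ECU should use for ADB control:
    prefer the camera on its own side; on abnormality, fall back to the
    image the opposite headlamp's ECU outputs to the CAN."""
    if own_camera_ok:
        return "own"
    if peer_camera_ok:
        return "peer"
    return None  # no usable camera image; camera-based ADB control stops


def should_publish_image(own_camera_ok: bool, peer_abnormal_signal: bool) -> bool:
    """A lamp ECU outputs its own image signal to the CAN only after the
    opposite ECU has broadcast an abnormal command signal."""
    return own_camera_ok and peer_abnormal_signal
```

The point of the handshake is bus economy: image signals are only published on the CAN while the opposite side actually needs them.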
  • in this failover case, the parallax angle between the irradiation optical axis of the lamp unit and the imaging optical axis of the normal camera becomes large, so it is preferable to enlarge the margin of the light-shielded area so as to reliably prevent dazzling oncoming vehicles and preceding vehicles.
  • the left and right headlamps L-HL and R-HL are each provided with a camera.
  • alternatively, a configuration in which the camera is provided on only one headlamp is also conceivable.
  • the parallax angle between the imaging optical axis of the camera and the irradiation optical axis of the lamp unit of the other headlamp not equipped with the camera becomes a problem.
  • if ADB control is performed in the other headlamp based on the image obtained from the camera of the one headlamp, a situation may occur in which another vehicle is dazzled.
  • FIGS. 10A to 10C show an example of left-hand traffic. In the driving situation of FIG. 10A, where the preceding vehicle CAR1 and the oncoming vehicle CAR2 exist, FIG. 10B shows the image captured by the camera 2R of the right headlamp R-HL on the oncoming-lane side, and FIG. 10C shows the image captured by the camera 2L of the left headlamp L-HL on the shoulder side.
  • the camera 2L of the left headlamp L-HL on the shoulder side may not be able to reliably image an oncoming vehicle CAR2 present in the vicinity of the host vehicle, making such a vehicle difficult to detect.
  • therefore, the camera is preferably installed in the headlamp on the oncoming-lane side, which can reliably image and detect an oncoming vehicle in the vicinity of the host vehicle; in a left-hand traffic region such as Japan, for example, the camera is installed in the right headlamp. ADB control is then executed in each of the left and right headlamps based on the image captured by that camera.
  • the camera 2 includes a camera body 21 and a camera holder 22.
  • the camera body 21 is supported by the base plate 5 via the camera holder 22 and is electrically connected to the lamp ECU 6. When the left and right headlamps L-HL and R-HL are configured in this manner, a single camera body 21 can be attached to or removed from either of the left and right headlamps L-HL and R-HL by means of the holder 22.
  • the ADB control can be executed by attaching the camera body 21 to the right headlamp R-HL.
  • ADB control can be performed by removing the camera body 21 attached to the right headlamp R-HL and attaching it to the left headlamp L-HL.
  • thus, even when the automobile travels across regions with different traffic systems, high-precision ADB control based on the image captured by the camera of the headlamp on the oncoming-lane side can be realized without changing the headlamps to a different specification.
  • in the above embodiment, the left and right headlamps are each provided with a lamp ECU, but a single lamp ECU may instead be connected to the lamp units and the camera of the left and right headlamps by, for example, LIN (Local Interconnect Network).
  • an external aiming adjuster is connected to LIN.
  • input / output of image signals between the left and right headlamps is performed via the one lamp ECU and LIN.
  • in the above embodiment, one lamp unit is provided in the lamp housing, but one or more additional lamp units, for example a clearance lamp or a turn signal lamp, may be incorporated.
  • ADB control has been shown as the light distribution control of the present disclosure, but the present invention can be similarly applied to a headlamp that performs the above-described AHB control or other light distribution control.
  • FIG. 11 is a schematic horizontal sectional view of an embodiment in which the vehicle detection device according to the present disclosure is incorporated in a headlamp HL1 provided on the front left and right of a vehicle body 108 of an automobile.
  • FIG. 11 shows the internal structure of the right headlamp HL1, but the left headlamp has the same configuration.
  • the lamp housing 103 of the headlamp HL1 includes a lamp body 1031 and a translucent front cover 1032.
  • a lamp unit 101 and a camera 102 are housed inside the lamp housing 103.
  • the lamp unit 101 and the camera 102 are supported by a base plate 105.
  • the lamp unit 101 and the camera 102 can be adjusted in their optical axis directions by an aiming mechanism 104 including an aiming screw 1041 and the like.
  • the lamp unit 101 includes a light source 1011 in which a plurality of LEDs are arranged, and a projection lens 1012 that projects light emitted from the light source 1011 toward the front of the automobile.
  • the lamp unit 101 can illuminate the front area of the automobile with a desired light distribution by controlling the light emission of the light source 1011.
  • the camera 102 is the imaging means according to the present disclosure and is, for example, a digital camera including an imaging element (not shown in the drawing).
  • the camera 102 images a front area of the host vehicle, at least an area including an area irradiated with light from the lamp unit 101, and outputs an imaging signal.
  • objects captured by the camera 102 include signs as well as vehicles, such as preceding vehicles and oncoming vehicles, that are present in the area where the host vehicle is traveling.
  • a lamp ECU 106 is built in the lamp housing 103 and is connected to the lamp unit 101 and the camera 102, respectively.
  • the lamp ECU 106 can set a suitable light distribution based on an image obtained by imaging with the camera 102.
  • the lamp ECU 106 can perform light distribution of the lamp unit 101 based on this setting, for example, ADB control.
  • the lamp ECU 106 is connected via a CAN (Controller Area Network) 100 to a vehicle ECU and other units not shown in the figure, and sends and receives required signals to and from them.
  • the lamp ECU 106 includes a signal processing unit 1061 that performs signal processing on the imaging signal output from the imaging element of the camera 102 to generate an image signal as image data, an image analysis unit 1062 that detects a vehicle present in the image based on this image signal, and a lighting control unit 1063 that recognizes the detected vehicle and controls the light distribution of the lamp unit 101.
  • the signal processing unit 1061 and the image analysis unit 1062 perform the operation of detecting the vehicle and together constitute vehicle detection means in a broad sense; the image analysis unit 1062, which directly detects the vehicle, constitutes vehicle detection means in a narrow sense. The lamp ECU 106 therefore detects the vehicle with the image analysis unit 1062, that is, the vehicle detection means, based on the image signal obtained by the camera 102, and executes ADB control by having the lighting control unit 1063 control the light distribution of the lamp unit 101 so as to shade the area where the detected vehicle exists.
  • the image analysis unit 1062 serving as the vehicle detection means includes a light spot extraction unit 1064 that extracts light spots present in the image, a light spot identification unit 1065 that identifies the shape of each extracted light spot, and a vehicle detection unit 1066 that detects the vehicle from the identified light spots.
  • a vehicle detection operation in the image analysis unit 1062 as vehicle detection means will be described.
  • assume that the image of the front region captured by the camera 102 is in the state of FIG. 13 when the automobile performing ADB control of the headlamp HL1 travels at night.
  • the preceding vehicle CAR11 and the oncoming vehicles CAR12 and CAR13, whose lamps are lit, are traveling on a road on which a white line (including a yellow line) LINE is drawn.
  • lighting facilities such as a road lamp B and a delineator D are provided, and a rectangular road sign S1 and a rhombus road sign S2 are provided.
  • the term sign here also covers various other types of signage.
  • an image signal (image data) is obtained by imaging the front area with the camera 102 and processing the imaging signal with the signal processing unit 1061; the image shown in FIG. 14 is then obtained from this image signal.
  • this image is composed of bright and dark pixels (light dots) arranged in the X direction (row direction) and the Y direction (column direction) in correspondence with the image sensor of the camera 102. In this image, the imaged vehicles, lighting facilities, signs, and the like appear as light spots.
  • the dark area of the background is represented by a white background, and the light spot is drawn with dots.
  • among these, the light spots LP1 to LP3 from the headlamps or tail lamps of the vehicles CAR11 to CAR13, and the light spots LP4 and LP5 from the lighting facilities B and D, are images of luminous bodies captured directly by the camera 102, so each light spot has a shape close to a circle.
  • the light spots LP6 and LP7 from the signs S1 and S2 are images, captured by the camera 102, of high-luminance surfaces illuminated by light from a luminous body, so each light spot has the shape of the sign itself, that is, a rectangle (square or oblong) or a rhombus. The white line LINE on the road surface appears as a long and narrow light spot LP8.
  • the light spot extraction unit 1064 extracts light spots that are candidates for detection from this image. For example, by setting a luminance value of a predetermined level as a threshold, the light spot extraction unit 1064 detects regions having luminance higher than the threshold. As a result, low-luminance light spots caused by light merely reflected from objects, for example road surface areas illuminated by the headlamps of the host vehicle, are excluded, and light spots from luminous bodies, such as illuminated signs and lamp-equipped vehicles, are detected. In the case of FIG. 14, all of the light spots mentioned above are detected.
  • the light spot extraction unit 1064 then sets a light spot area in the xy coordinates of the image for each detected light spot. That is, for the rectangular light spot LP6, a rectangular light spot area defined by the diagonal coordinates (xa, ya) to (xb, yb) is set, as shown by the broken line in FIG. 15A. The same applies to the rhombic light spot LP7 shown in FIG. 15B, the circular light spots LP1 to LP5 shown in FIG. 15C, and the white-line light spot LP8 shown in FIG. 15D: each light spot area is set by the extreme coordinates in the x and y directions, (xm, ym) to (xn, yn). In these figures, m and n are values set for each light spot.
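The threshold-based light spot extraction and the setting of a rectangular light spot area can be sketched as a connected-component pass over the luminance image. This is an illustrative sketch of the idea, not the patent's implementation; the flood-fill approach, 4-neighbour connectivity, and function name are assumptions.

```python
from collections import deque


def extract_light_spots(img, threshold):
    """img: 2-D list of luminance values. Returns the bounding boxes
    ((x_min, y_min), (x_max, y_max)) of connected regions brighter than
    the threshold: light spot extraction plus light spot area setting."""
    h, w = len(img), len(img[0])
    seen = [[False] * w for _ in range(h)]
    boxes = []
    for y in range(h):
        for x in range(w):
            if img[y][x] > threshold and not seen[y][x]:
                # flood-fill one connected high-luminance region (4-neighbour)
                queue = deque([(y, x)])
                seen[y][x] = True
                xs, ys = [x], [y]
                while queue:
                    cy, cx = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = cy + dy, cx + dx
                        if 0 <= ny < h and 0 <= nx < w \
                                and img[ny][nx] > threshold and not seen[ny][nx]:
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                            xs.append(nx)
                            ys.append(ny)
                # the light spot area: diagonal coordinates (x_min, y_min)-(x_max, y_max)
                boxes.append(((min(xs), min(ys)), (max(xs), max(ys))))
    return boxes
```

The luminance threshold plays the role described in the text: dim reflections fall below it and never form a region, so only luminous-body spots get a light spot area.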
  • the light spot identification unit 1065 identifies the shape of each light spot within its set light spot area. For example, as shown by the arrow in FIG. 15A, the luminance of the pixels in one line is measured while scanning in the x direction (row direction) from the position at the left end of the base of the set light spot area, that is, from the position (xa, ya), toward the right. The luminance threshold described above can be used in this measurement. The same measurement is then performed on the next row, and this is repeated for several lines in the y direction (column direction), for example 2 to 5 rows; in practice, about 3 lines are sufficient.
  • if the number of high-luminance pixels in each measured line is substantially constant, the shape of the light spot is identified as a rectangle.
  • if the number of high-luminance pixels changes from line to line, the shape of the light spot is identified as a rhombus or a circle, as shown in FIG. 15B or 15C.
  • in this case, the number of measured lines is increased by several lines, and the rate at which the number of high-luminance pixels decreases is measured. If the rate of decrease is large, the light spot is identified as the rhombus of FIG. 15B; if the rate of decrease is smaller, it is identified as the circle of FIG. 15C.
  • for a narrow light spot such as that of the white line, whose high-luminance pixels shift in position from line to line, the shape is identified as linear when the amount of positional change is constant, and as curved when the amount of positional change itself changes.
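The row-scan shape identification can be sketched from the per-line counts of high-luminance pixels. The disclosure distinguishes rhombus from circle by the rate of decrease of that count; the sketch below uses a cruder stand-in criterion (constant step between lines means a linear taper, hence a rhombus) and omits the linear/curved white-line case, so treat it as an assumption-laden illustration rather than the actual method.

```python
def classify_light_spot(row_widths):
    """Classify a light spot from the number of high-luminance pixels in
    each scanned line (base row first).

    - constant width in every line      -> rectangle
    - width changing by a constant step -> rhombus (linear taper)
    - width changing unevenly           -> circle (flat middle, steep edges)
    """
    if len(set(row_widths)) == 1:
        return "rectangle"
    steps = {abs(b - a) for a, b in zip(row_widths, row_widths[1:])}
    return "rhombus" if len(steps) == 1 else "circle"
```

For example, a rhombus scanned bottom-to-top yields widths like 1, 3, 5, 3, 1 (constant step 2), while a circle yields something like 3, 5, 7, 7, 5, 3, where the step collapses to 0 near the middle.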
  • FIG. 16 shows the light spots given the tag "target light spot". The tag "non-target light spot" is attached to the light spots LP6 and LP7, which are then excluded from further extraction.
  • next, the vehicle detection unit 1066 detects the vehicle by determining the behavior, in this case the moving state in the image, of the light spots tagged "target light spot" in FIG. 16. That is, the change in the relative position of each light spot with respect to the host vehicle, namely its movement direction and movement speed, is tracked over time, and light spots of vehicles traveling on the road are detected based on the result.
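Tracking a light spot's movement direction and speed over successive frames can be sketched as below. This is a minimal illustration under stated assumptions: the centroid list and frame interval are hypothetical inputs, and a real tracker would also have to associate spots between frames.

```python
import math


def spot_motion(centroids, dt):
    """Given a light spot's centroid (x, y) in successive frames and the
    frame interval dt (seconds), return (direction_deg, speed_px_per_s).

    In image coordinates, 0 degrees is rightward along the horizontal and
    positive angles point below it (y grows downward in an image).
    """
    (x0, y0), (x1, y1) = centroids[0], centroids[-1]
    frames = len(centroids) - 1
    dx, dy = x1 - x0, y1 - y0
    direction = math.degrees(math.atan2(dy, dx))
    speed = math.hypot(dx, dy) / (frames * dt)
    return direction, speed
```

A spot of an oncoming vehicle seen from a lamp-height camera would then show a direction near 0 degrees (horizontal) or slightly positive (lower right), which is the cue the embodiment relies on.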
  • FIG. 17 shows the light spot of the vehicle detected in this way.
  • the light spots LP1 and LP2 are detected as the preceding vehicle CAR11 and the oncoming vehicle CAR12.
  • the light spot LP3 from the oncoming vehicle CAR13 is small in size, so it can be determined to be located far away; it is excluded because at this point there is little possibility of the headlamp of the host vehicle dazzling the oncoming vehicle CAR13.
  • the lighting control unit 1063 sets a light distribution that does not dazzle the preceding car CAR11 and the oncoming car CAR12, and executes the light distribution control in the lamp unit 101. Thereby, suitable ADB control is performed.
  • as described above, the light spot identification unit 1065 identifies the light spots that are likely to be vehicles based on the image captured by the camera 102, and the vehicle detection unit 1066 executes vehicle detection processing only for these identified light spots. Therefore, even when a large number of light spots exist in the captured image, no detection processing is performed on light spots that are clearly not, or are unlikely to be, from a vehicle. This reduces the number of processing steps for vehicle detection and enables rapid detection.
  • incidentally, the camera 102 is provided integrally with the headlamp HL1 and is therefore at substantially the same height as, and in some cases lower than, the headlamps of an oncoming vehicle. The moving direction of an oncoming vehicle in the captured image consequently differs from the case where the camera is arranged at a position higher than the lamps of the oncoming vehicle, for example on the front window or side mirror of the automobile.
  • FIG. 18A is an image when the camera is positioned higher than the headlamp HL1 of the oncoming vehicle CAR12 as in the case where the camera is disposed at the upper part of the front window of the host vehicle.
  • the moving direction of the light spot LP2 by the headlamp HL1 of the oncoming vehicle CAR12 is directed to the lower right with respect to the horizontal line H in the image as indicated by an arrow.
  • FIG. 18B is an image in the case where the camera is integrated with the headlamp as in this embodiment and is at a height or a position substantially equal to the headlamp HL1 of the oncoming vehicle CAR12.
  • the moving direction of the light spot LP2 by the headlamp HL1 of the oncoming vehicle CAR12 is directed to the right along the horizontal line H in the image, or slightly lower right than the horizontal line H.
  • the vehicle detection unit 1066 of the embodiment can detect the light spot moving in the horizontal direction in the image or in the lower right direction than the horizontal direction as the light spot of the oncoming vehicle.
  • when the camera 102 provided on the host vehicle is tilted in the roll direction (the rotation direction about the longitudinal axis of the automobile), for example mounted tilted in the left-down direction, the image captured by the camera 102 is also tilted to the lower left, as shown in FIG. 18C.
  • in this case, the light spot LP2 of the headlamp HL1 of the oncoming vehicle CAR12 moves toward the upper right of the horizontal line H in the image. If the light spot were judged only by its moving direction in the image, the light spot LP2 of the oncoming vehicle might therefore be erroneously detected as a light spot other than a vehicle, for example a sign or a road illumination light. Consequently, when detecting the vehicle, not only the moving direction of the light spot in the image but also its moving speed must be detected.
  • the vehicle detection unit 1066 refers to the light spot that has been identified as the “non-target light spot” by the light spot identification unit 1065.
  • for example, the vehicle detection unit 1066 refers to the light spot LP6 of the sign S1, which has been identified as rectangular, and sets the direction of its detected horizontal sides, that is, its bottom and top sides, as the corrected horizontal direction DH. The vehicle detection unit 1066 then evaluates the moving direction of the light spot LP2 of the oncoming vehicle CAR12 in the image, indicated by the arrow, relative to this corrected horizontal direction DH.
  • in other words, even when the image is tilted, the vehicle detection unit 1066 can correct the horizontal reference of the image to the corrected horizontal direction DH. If the moving direction of the light spot LP2 of the oncoming vehicle CAR12 is detected with reference to the corrected horizontal direction DH, the vehicle detection unit 1066 detects that the light spot is moving in the horizontal direction or toward the lower right, so the oncoming vehicle CAR12 can be detected accurately. That is, the vehicle can be detected accurately from the moving direction alone, without detecting the moving speed, which simplifies the processing and speeds up the detection.
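Measuring a spot's motion against a corrected horizontal direction taken from the bottom edge of a rectangular sign's light spot can be sketched as a simple angle subtraction. This is an illustrative sketch only; the function name and vector representation are assumptions, and image coordinates are taken with y growing downward.

```python
import math


def direction_vs_corrected_horizontal(move_vec, sign_base_vec):
    """Angle (degrees) of a light spot's motion measured against the
    corrected horizontal direction DH, taken here as the direction of the
    bottom edge of a rectangular sign's light spot.

    Returns a value normalized to [-180, 180); about 0 means the spot
    moves along DH even if the whole image is rolled.
    """
    roll = math.atan2(sign_base_vec[1], sign_base_vec[0])    # image roll estimate
    motion = math.atan2(move_vec[1], move_vec[0])
    rel = math.degrees(motion - roll)
    return (rel + 180.0) % 360.0 - 180.0
```

If the camera rolls, both the sign edge and the spot's motion rotate together, so the relative angle stays usable as the detection cue while the raw image angle does not.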
  • when a plurality of rectangular light spots are identified, the average direction of their bottom and top sides may be taken as the reference horizontal direction.
  • alternatively, the bottom side or the top side of the rectangular light spot with the largest dimensions, whose shape can be identified easily and with high accuracy, may be employed.
  • the reference horizontal direction may also be set using a light spot of another shape; in the case of a rhombic light spot, for example, the horizontal diagonal may be used as the reference horizontal direction. If the vehicle is equipped with a vehicle height sensor or the like and the roll angle of the host vehicle can be detected, the moving direction of the light spot in the image may instead be corrected based on the detected roll angle.
  • the light spot LP8 of the white line LINE identified by the light spot identifying unit 1065 may be used to further improve the accuracy of vehicle detection, particularly the accuracy of oncoming vehicle detection.
  • because the camera 102 is provided integrally with the headlamp HL1, it is located at a lower position, closer to the road surface, than a camera provided on the front window or side mirror.
  • the white line LINE can therefore be imaged more clearly.
  • since the light spot of the white line LINE results from imaging light from the headlamps of the oncoming vehicle and the host vehicle reflected by the road surface, it is normally difficult to image with high luminance; by providing the camera at a low position, however, it can be imaged as a high-luminance light spot.
  • the vehicle detection unit 1066 can detect, as vehicle light spots, only those light spots among the plurality of "target light spots" identified by the light spot identification unit 1065 that are in the vicinity of the recognized white-line light spot LP8, lie at positions along the extension direction of the light spot LP8, and move in that extension direction. By then performing vehicle detection based on the moving direction and moving speed of only these light spots, the accuracy of vehicle detection is improved, the processing steps for vehicle detection are simplified, and the detection speed is increased.
  • in addition, for a light spot close to the white-line light spot LP8, the vehicle can be detected accurately simply by detecting that its moving direction in the image follows the extension direction of LP8. That is, even when the host vehicle rolls and the camera tilts together with the vehicle body, accurate vehicle detection is possible without considering the corrected horizontal direction described with reference to FIG. 18C, so complicated processing can be avoided.
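The white-line filter, keeping only spots that lie near the white-line light spot and move along its extension direction, can be sketched with basic 2-D geometry. This is a hypothetical sketch: the line is simplified to two points, and the distance and angle tolerances are made-up parameters, not values from the disclosure.

```python
import math


def moves_along_white_line(spot_xy, spot_move, p0, p1,
                           max_dist=20.0, max_angle_deg=20.0):
    """True if a light spot lies near the white-line light spot (the line
    through p0 and p1) and moves along its extension direction."""
    lx, ly = p1[0] - p0[0], p1[1] - p0[1]
    line_len = math.hypot(lx, ly)
    # perpendicular distance from the spot to the white line (2-D cross product)
    dist = abs(lx * (spot_xy[1] - p0[1]) - ly * (spot_xy[0] - p0[0])) / line_len
    # angle between the spot's motion and the line direction (sign ignored,
    # so motion toward either end of the line counts as "along" it)
    cos_a = (spot_move[0] * lx + spot_move[1] * ly) / (math.hypot(*spot_move) * line_len)
    angle = math.degrees(math.acos(min(1.0, abs(cos_a))))
    return dist <= max_dist and angle <= max_angle_deg
```

Because both the white line and the spot's motion roll together with the camera, this test needs no corrected horizontal direction, which is exactly the simplification the text claims.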
  • when the light spot identification unit 1065 identifies the white-line light spot LP8 as curved, the vehicle detection unit 1066 calculates the curvature from the shape of the light spot LP8 and may correct the movement amount of a light spot, for example by subtraction or addition, based on the direction of the curved road and the calculated curvature. This makes it possible to detect the vehicle accurately from the light spot even on a curved road.
  • the vehicle detection device of the present disclosure is used for light distribution control by ADB control of a headlamp, but it may be used to perform AHB control or other light distribution control. Further, since the vehicle can be detected quickly and accurately, it can also be used for automatic driving control of an automobile.
  • the vehicle detection device of the present disclosure can be configured to take an image of a side region and a rear region of an automobile and detect other vehicles existing in the side and rear regions of the host vehicle.
  • the camera may be integrated into a side mirror or tail lamp.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Evolutionary Computation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Mechanical Engineering (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Databases & Information Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Lighting Device Outwards From Vehicle And Optical Signal (AREA)

Abstract

Disclosed are: a lamp unit (1); an imaging means (camera) (2) that captures an image of a light irradiation region of the lamp unit (1); and a control means (lamp ECU) (6) for controlling the lamp unit (1) and the imaging means (2), the control means (6) being connected to a signal bus (CAN) (100) installed in a vehicle. The aiming of the lamp unit (1) is adjusted on the basis of signals obtained by performing signal processing on image signals captured by the imaging means (2). When a prescribed command signal has been input, the control means (6) outputs the signal-processed signal to the signal bus (100), and the aiming is adjusted using an aiming adjustment device connected to the signal bus (100).
PCT/JP2019/004994 2018-03-15 2019-02-13 Vehicle lamp, vehicle detection method, and vehicle detection device WO2019176418A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2018048049A JP2019156276A (ja) 2018-03-15 2018-03-15 Vehicle detection method and vehicle detection device
JP2018-048049 2018-03-15
JP2018048050A JP2019156277A (ja) 2018-03-15 2018-03-15 Vehicle lamp
JP2018-048050 2018-03-15

Publications (1)

Publication Number Publication Date
WO2019176418A1 true WO2019176418A1 (fr) 2019-09-19

Family

ID=67908305

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/004994 WO2019176418A1 (fr) 2018-03-15 2019-02-13 Lampe de véhicule, procédé de détection de véhicule et dispositif de détection de véhicule

Country Status (1)

Country Link
WO (1) WO2019176418A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114235343A (zh) * 2021-12-03 2022-03-25 常州青葵智能科技有限公司 LED vehicle lamp dynamic image detection system based on LIN bus

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009147799A1 (fr) * 2008-06-04 2009-12-10 株式会社小糸製作所 Système de pointage de phare
JP2012020662A (ja) * 2010-07-15 2012-02-02 Koito Mfg Co Ltd 車両検出装置及びこれを備えた前照灯制御装置
JP2013028239A (ja) * 2011-07-27 2013-02-07 Denso Corp ライト検出装置、ライト検出プログラム、およびライト制御装置
JP2013067229A (ja) * 2011-09-21 2013-04-18 Denso Corp ライト検出装置、ライト検出プログラム、およびライト制御装置
JP2013147138A (ja) * 2012-01-19 2013-08-01 Koito Mfg Co Ltd 車両用ランプの配光制御装置
JP2013244890A (ja) * 2012-05-28 2013-12-09 Denso Corp 車両光源検出装置および車両光源検出プログラム
KR20140104857A (ko) * 2013-02-21 2014-08-29 주식회사 만도 차량 카메라의 영상 보정 방법 및 이를 이용하는 영상 처리 장치
JP2015195018A (ja) * 2014-03-18 2015-11-05 株式会社リコー 画像処理装置、画像処理方法、運転支援システム、プログラム
JP2015202756A (ja) * 2014-04-14 2015-11-16 株式会社小糸製作所 車両用灯具の制御装置
US20160097493A1 (en) * 2014-10-02 2016-04-07 Taylor W. Anderson Method and apparatus for a lighting assembly with an integrated auxiliary electronic component port
JP2017088124A (ja) * 2015-11-17 2017-05-25 株式会社小糸製作所 車両用灯具システム

Similar Documents

Publication Publication Date Title
JP4654163B2 (ja) 車両の周囲環境認識装置及びシステム
US6960005B2 (en) Vehicle headlamp apparatus
US7899213B2 (en) Image processing system and vehicle control system
JP4544233B2 (ja) 車両検出装置及びヘッドランプ制御装置
JP4415996B2 (ja) 車載用画像認識装置及び配光制御装置、並びに配光制御方法
US9669755B2 (en) Active vision system with subliminally steered and modulated lighting
US10364956B2 (en) Headlight device, headlight controlling method, and headlight controlling program
CN103782307A (zh) 用于识别车辆的环境中的对象的方法和设备
US9616805B2 (en) Method and device for controlling a headlamp of a vehicle
JPH0769125A (ja) 車両用前照灯装置
JP2013147112A (ja) 車両の走行環境認識装置
JP2014515893A (ja) 車両のカメラによって撮影した画像を評価するための方法および画像評価装置
JP5065172B2 (ja) 車両灯火判定装置及びプログラム
EP2525302A1 (fr) Système de traitement d'images
JP4980970B2 (ja) 撮像手段の調整装置および物体検出装置
JP2013045176A (ja) 信号機認識装置、候補点パターン送信装置、候補点パターン受信装置、信号機認識方法、及び候補点パターン受信方法
US10730427B2 (en) Lighting device
JP4007578B2 (ja) 前照灯照射範囲制御方法及び前照灯装置
WO2019176418A1 (fr) Lampe de véhicule, procédé de détection de véhicule et dispositif de détection de véhicule
JP7312913B2 (ja) 自動車両の照明システムを制御するための方法
US20200369200A1 (en) Vehicle detecting device and vehicle lamp system
JP2007124676A (ja) 車載用画像処理装置
JP5547580B2 (ja) 撮像カメラ及びこれを用いた車両検出装置とランプ制御装置
JPH04193641A (ja) 車両用障害物検出装置
CN111712854B (zh) 图像处理装置及车辆用灯具

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19768208

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19768208

Country of ref document: EP

Kind code of ref document: A1