CN211786115U - Sensor system

Sensor system

Info

Publication number: CN211786115U
Authority: CN (China)
Prior art keywords: light, vehicle, sensor system, cover, camera
Legal status: Active
Application number: CN201921887104.3U
Other languages: Chinese (zh)
Inventor: 井上宙
Current Assignee: Koito Manufacturing Co Ltd
Original Assignee: Koito Manufacturing Co Ltd
Application filed by Koito Manufacturing Co Ltd
Application granted
Publication of CN211786115U

Abstract

An object of the present utility model is to suppress a decrease in the information detection capability of a sensor unit covered by a cover that forms a part of the outer surface of a vehicle. To achieve this object, a sensor system is provided. A LiDAR sensor unit (14) detects information outside the vehicle using detection light (14a). A cover (12) forms a part of the outer surface of the vehicle in such a manner as to cover the LiDAR sensor unit (14), and allows passage of the detection light (14a). A camera (15) outputs an image signal (S1) corresponding to an image of a light passing region (12a) in the cover (12) through which the detection light (14a) passes. A processor (162) detects foreign matter adhering to the light passing region (12a) based on the image signal (S1). At least a part of a focal plane (15b) of the camera (15) overlaps the light passing region (12a).

Description

Sensor system
Technical Field
The present utility model relates to a sensor system mounted on a vehicle.
Background
In order to assist driving of a vehicle, a sensor unit for detecting information outside the vehicle is mounted on the vehicle body. Patent document 1 discloses a radar as an example of such a sensor unit. The radar is disposed in a lamp chamber of a lamp device that illuminates the outside of the vehicle. That is, the radar is covered by a cover that defines the lamp chamber and allows illumination light to pass through. The cover forms a part of the outer surface of the vehicle and also allows passage of the detection light with which the radar detects outside information.
The term "driving assistance" used in the present specification means a control process that at least partially performs at least one of a driving operation (steering, acceleration, deceleration, and the like), monitoring of the running environment, and backup of the driving operation. That is, the term is intended to cover everything from partial driving assistance, such as a collision damage reduction braking function and a lane keeping assistance function, to a fully automated driving operation.
Documents of the prior art
Patent document
Patent document 1: Japanese Laid-Open Patent Publication No. 2007-106199
SUMMARY OF THE UTILITY MODEL
Problem to be solved by the utility model
An object of the present invention is to suppress a decrease in information detection capability of a sensor unit covered by a cover forming a part of an outer surface of a vehicle.
Means for solving the problems
One aspect for achieving the above object is a sensor system mounted on a vehicle, the sensor system including: a sensor unit that detects information outside the vehicle using light; a cover that forms a part of an outer surface of the vehicle in such a manner as to cover the sensor unit and allows passage of the light; a camera that outputs a signal corresponding to an image of a light passing area in the cover through which the light passes; and a processor that detects a foreign substance attached to the light passing area based on the signal, at least a part of a focal plane of the camera overlapping with the light passing area.
One aspect for achieving the above object is a sensor system mounted on a vehicle, the sensor system including: a sensor unit that detects information outside the vehicle using light; a cover that forms a part of an outer surface of the vehicle in such a manner as to cover the sensor unit and allows passage of the light; a camera that outputs a signal corresponding to an image of a light passing area in the cover through which the light passes; and a processor that detects a foreign substance adhering to the light passing area based on the signal, an optical axis of the camera extending in a direction different from a detection reference direction of the sensor unit.
One aspect for achieving the above object is a sensor system mounted on a vehicle, the sensor system including: a LiDAR (Light Detection and Ranging) sensor unit that detects information outside the vehicle using detection light; a cover that forms a part of an outer surface of the vehicle in such a manner as to cover the LiDAR sensor unit and allows passage of the detection light; a camera that outputs a signal corresponding to an image of a light passing region in the cover through which the detection light passes; and a processor that detects foreign matter attached to the light passing region based on the signal.
The camera according to each of the above-described aspects is not a device for acquiring an image of the outside of the vehicle (strictly speaking, an image of the region beyond the outer surface of the cover), but a device for acquiring an image of the light passing region of the cover located on the traveling path of the light used for information detection by the sensor unit. Accordingly, at least a part of the focal plane of the camera overlaps the light passing region. Further, since priority is given to an arrangement in which at least a part of the focal plane of the camera overlaps the light passing region, the optical axis of the camera may extend in a direction different from the detection reference direction of the sensor unit.
If a foreign object adheres to the light passage area, the detection of information outside the vehicle by the sensor unit is hindered. However, since the attachment of such foreign matter is detected by the camera configured as described above, appropriate processing can be performed in accordance with the detection result. Therefore, a decrease in the information detection capability of the sensor unit covered by the cover forming a part of the outer surface of the vehicle can be suppressed.
Since it is difficult for a LiDAR sensor unit itself to detect foreign matter adhering to the light passing region, detecting foreign matter by acquiring an image of the region with a camera is particularly advantageous in combination with a LiDAR sensor unit.
The sensor system according to each of the above-described aspects can be configured as follows.
The sensor system includes a nozzle capable of ejecting liquid, and the processor causes the nozzle to eject the liquid toward the light passage area when the foreign matter is detected.
According to such a configuration, it is possible to automate the process for removing foreign matter adhering to the light passing area. Thus, the effect of suppressing a decrease in the information detection capability of the sensor unit covered by the cover forming a part of the outer surface of the vehicle is improved.
The sensor system according to each of the above-described aspects can be configured as follows.
The sensor system includes a nozzle capable of ejecting liquid, and the processor determines a position of the detected foreign object and causes the nozzle to eject the liquid toward the position.
According to such a configuration, since the liquid is more accurately ejected to the foreign matter adhering to the light passing region, the possibility of removing the foreign matter can be increased. Thus, the effect of suppressing a decrease in the information detection capability of the sensor unit covered by the cover forming a part of the outer surface of the vehicle is further improved.
The sensor system according to each of the above-described aspects can be configured as follows.
The camera is provided with: an image pickup element; a resin lens that forms an image on the image pickup element; and a circuit board that supports the image pickup element and the resin lens.
With this configuration, the camera space can be significantly reduced, and the degree of freedom in the arrangement of the cameras for acquiring the image of the light passing region is increased. Therefore, it is easy to suppress a decrease in information detection capability of the sensor unit covered by the cover forming a part of the outer surface of the vehicle.
The sensor system according to each of the above-described aspects can be configured as follows.
The sensor system includes a lamp unit that emits illumination light to the outside of the vehicle, and the cover allows passage of the illumination light.
Because its function is to supply illumination light to the outside of the vehicle, the lamp unit is generally disposed at a position on the vehicle where there is little obstruction. By disposing the sensor unit at such a position as well, information on the outside of the vehicle can be acquired efficiently.
The term "light" used in the present specification refers to not only visible light but also electromagnetic waves having any wavelength such as ultraviolet light, infrared light, microwaves, millimeter waves, and the like.
The term "sensor unit" used in the present specification refers to a unit of components that have a desired information detection function and that can be distributed as a single body.
The term "lamp unit" used in the present specification refers to a unit of components that have a desired lighting function and that can be distributed as a single body.
Drawings
Fig. 1 shows a configuration of an example of a sensor system according to an embodiment.
Fig. 2 shows an external appearance of a vehicle mounted with the sensor system of fig. 1.
Fig. 3 is a diagram illustrating an operation of a processor in the sensor system of fig. 1.
Fig. 4 is a diagram illustrating an operation of a processor in the sensor system of fig. 1.
Fig. 5 shows the configuration of a sensor system according to another example.
Description of the reference numerals
1: a sensor system; 12: a cover; 12 a: a light passing area; 14: a LiDAR sensor unit; 14 a: detecting light; 14 b: detecting a reference direction; 15: a camera; 15 b: a focal plane; 15 c: an optical axis; 151: an image pickup element; 152: a resin lens; 153: a circuit substrate; 162: a processor; 17: a nozzle; 18: a lamp unit; 100: a vehicle; s1: an image signal.
Detailed Description
Hereinafter, examples of the embodiments will be described in detail with reference to the drawings. In the drawings used in the following description, the scale is appropriately changed so that each member can be recognized.
In the drawings, an arrow F indicates the front direction of the illustrated configuration. Arrow B indicates the rearward direction of the illustrated configuration. Arrow U indicates the upward direction of the illustrated configuration. Arrow D indicates the downward direction of the illustrated configuration. Arrow L indicates the left direction of the illustrated configuration. Arrow R indicates the right direction of the illustrated configuration. The terms "left" and "right" used in the following description denote the left and right directions as viewed from the driver's seat.
Fig. 1 schematically shows a configuration of a sensor system 1 according to an embodiment. The sensor system 1 is mounted on a vehicle 100 shown in fig. 2. The shape of the body of the vehicle 100 is merely an example.
The sensor system 1 includes a housing 11 and a cover 12. The housing 11 and the cover 12 together define a housing chamber 13.
The sensor system 1 is provided with a LiDAR sensor unit 14. The LiDAR sensor unit 14 is disposed within the housing chamber 13. The cover 12 forms a portion of the exterior surface of the vehicle 100 in a manner that covers the LiDAR sensor unit 14.
The LiDAR sensor unit 14 includes a configuration for emitting the detection light 14a toward the detection area outside the vehicle 100, and a configuration for detecting return light (not shown) as a result of the detection light 14a being reflected by an object present in the detection area. As the detection light 14a, for example, infrared light having a wavelength of 905nm can be used. The emission direction of the detection light 14a is determined based on the detection reference direction 14 b.
The LiDAR sensor unit 14 can acquire the distance to the object associated with the return light based on, for example, the time from when the detection light 14a is emitted in a certain direction until the return light is detected. By associating such distance data with detection positions and accumulating the data, information on the shape of the object associated with the return light can be acquired. Additionally or alternatively, information on attributes such as the material of the object can be acquired based on the difference between the waveforms of the emitted light and the return light. That is, the LiDAR sensor unit 14 is a device that detects information outside the vehicle 100 using light.
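By way of a non-limiting illustration (not part of the original specification), the time-of-flight relationship just described can be sketched in Python as follows; the constant and function names are ours:

```python
# Speed of light, m/s (vacuum value, a sufficient approximation in air).
C = 299_792_458.0

def distance_from_round_trip(round_trip_seconds):
    """Distance to the object from the emission-to-return delay.

    The detection light travels out to the object and back, so the
    one-way distance is half the round-trip path length.
    """
    return C * round_trip_seconds / 2.0
```

For example, a return detected 100 ns after emission corresponds to an object roughly 15 m away.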
The detection light 14a and the return light pass through the light passing area 12a in the cover 12. In other words, the cover 12 is formed of a material that allows at least the detection light 14a and the return light to pass therethrough.
The sensor system 1 is provided with a camera 15. The camera 15 is disposed in the housing chamber 13. Thus, the camera 15 is also covered by the cover 12.
The camera 15 is a device that acquires an image of the light passing region 12a in the cover 12. That is, the camera 15 is disposed so that the light passing region 12a is positioned within its field of view, indicated as the region between the pair of one-dot chain lines 15a. The camera 15 is configured to output an image signal S1 corresponding to the acquired image. The image acquisition is repeated, for example, at intervals of one second.
Fig. 3 (A) shows an example of an image I1 that can be reproduced based on the image signal S1. The image I1 includes a plurality of pixels P1 to Pn (n is an integer of 2 or more), and includes an image of the light passing region 12a in the cover 12. In this example, foreign substances O1 and O2 adhere to the light passing region 12a. Examples of such foreign matter include raindrops, snowflakes, mud, and the remains of insects.
As shown in fig. 1, the sensor system 1 includes a control device 16. The control device 16 includes an input interface 161 and a processor 162. The control device 16 may be disposed in the housing chamber 13, or may be supported by the housing 11 outside the housing chamber 13. Alternatively, the control device 16 may be disposed at an appropriate position in the vehicle 100 that is separate from the housing 11.
The input interface 161 receives the image signal S1 output from the camera 15. The processor 162 is configured to detect a foreign substance adhering to the light passing area 12a of the cover 12 based on the image signal S1. The input interface 161 may include a signal processing circuit that converts the image signal S1 into a form suitable for processing performed by the processor 162 as needed.
Fig. 4 shows one example of the flow of processing performed by the processor 162. The processor 162 generates a data set D1 shown in (B) in fig. 3 based on the image signal S1 (step 1). Specifically, the processor 162 generates a data set D1 including a plurality of pixel data PD1 to PDn by applying binarization processing to each of the plurality of pixels P1 to Pn. Therefore, the plurality of pixel data PD1 to PDn correspond one-to-one to the plurality of pixels P1 to Pn.
The plurality of pixels P1 to Pn each include position information and luminance information (received light intensity information) in the image I1. The processor 162 generates pixel data PDm having "1" as a luminance value when the luminance of a certain pixel Pm exceeds a predetermined threshold value. m is an integer arbitrarily selected from 1 to n. The processor 162 generates pixel data PDm having "0" as a luminance value when the luminance of a certain pixel Pm is equal to or less than a predetermined threshold value. Thus, the plurality of pixel data PD1 to PDn each have a luminance value of "1" or "0" in addition to the positional information in the image I1.
In the example shown in fig. 3 (B), pixel data having a luminance value of "1" is represented by a white rectangle, and pixel data having a luminance value of "0" is represented by a rectangle with oblique lines. It is known that the pixel data at the positions corresponding to the foreign substances O1, O2 in the image I1 have a luminance value of "0".
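As a non-limiting illustration (not part of the original specification), the binarization of step 1 — mapping pixels P1 to Pn to pixel data PD1 to PDn — can be sketched as follows; the function name is ours:

```python
def binarize(luminances, threshold):
    """Generate the binarized data set from pixel luminances.

    Mirrors the described processing: a luminance exceeding the
    predetermined threshold yields "1", otherwise "0". The output
    list corresponds one-to-one to the input pixels.
    """
    return [1 if lum > threshold else 0 for lum in luminances]
```

A pixel covered by foreign matter receives less light, so its entry in the data set tends to be "0".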
As shown in fig. 1, the control device 16 includes a memory 163. As shown in fig. 4, the processor 162 determines whether the data set D1 generated in the past based on the above-described method is stored in the memory 163 (step 2).
In the case where the data set D1 generated in the past is not stored in the memory 163 (no in step 2), the processor 162 stores the data set D1 generated in step 1 in the memory 163 (step 3). The process returns to step 1.
In the case where the data set D1 generated in the past is stored in the memory 163 (yes in step 2), the processor 162 compares the data set D1 generated in step 1 with the data set D1 stored in the memory 163 (step 4).
Specifically, it is determined whether or not the luminance value changes from "1" to "0" for each of the plurality of pixel data PD1 to PDn. When such a change occurs in a certain pixel data PDm, there is a high possibility that a foreign substance adheres to a position corresponding to the pixel data PDm. The processor 162 determines whether or not foreign matter is attached to the light passage area 12a of the cover 12 based on the comparison (step 5).
When none of the plurality of pixel data PD1 to PDn has a luminance value that changes from "1" to "0", the processor 162 determines that no foreign matter has adhered to the light passing area 12a of the cover 12 (no in step 5). In this case, the data set D1 generated in step 1 is stored in the memory 163, overwriting the previous one (step 3). After that, the process returns to step 1. The data set D1 stored in the memory 163 is then compared with the data set D1 generated next.
When a change in luminance value from "1" to "0" occurs in at least one of the plurality of pixel data PD1 to PDn, the processor 162 determines that a foreign substance is attached to the light passage area 12a of the cover 12 (yes in step 5). In this case, the processor 162 generates a detection signal S2 indicating the attachment of foreign matter (step 6).
Further, if the processor 162 is configured to determine that foreign matter is attached only when the number of pixels whose luminance values change from "1" to "0" exceeds a predetermined threshold value, foreign matter too fine to hinder information detection can be ignored, which improves the detection accuracy for foreign matter that actually needs to be removed.
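Steps 4 and 5 above — comparing the stored data set with the newly generated one and applying a pixel-count threshold — can be sketched as follows (a non-limiting illustration outside the original specification; the function and parameter names are ours):

```python
def foreign_matter_attached(previous, current, pixel_count_threshold=0):
    """Decide whether foreign matter has adhered to the light passing area.

    Counts the pixel data whose binarized value changed from 1 to 0
    between the stored data set and the new one; such a change suggests
    something now blocks the light at that position. Requiring more
    than `pixel_count_threshold` changed pixels lets fine foreign
    matter be ignored.
    """
    changed = sum(1 for prev, curr in zip(previous, current)
                  if prev == 1 and curr == 0)
    return changed > pixel_count_threshold
```

With the default threshold of 0, a single darkened pixel triggers detection; raising the threshold trades sensitivity for robustness against noise.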
As shown in fig. 1, the control device 16 includes an output interface 164. The processor 162 causes the output interface 164 to output the detection signal S2. The detection signal S2 can be transmitted to other control devices in the vehicle 100. For example, another control device can notify an occupant of the vehicle 100, based on the detection signal S2, that foreign matter is attached to the light passing region 12a of the cover 12. The notification may be at least one of a visual notification, an audible notification, and a tactile notification.
The occupant who receives the notification can take appropriate measures. For example, the sensor system 1 may be provided with a nozzle 17 that ejects liquid toward the cover 12. Examples of the liquid include water, hot water, and a cleaning liquid. The occupant can perform an operation that causes the nozzle 17 to eject the liquid, thereby removing foreign matter adhering to the light passing region 12a.
The camera 15 according to the present embodiment is not a device for acquiring an image of the outside of the vehicle 100 (strictly speaking, an image of the region beyond the outer surface of the cover 12), but a device for acquiring an image of the light passing region 12a located on the traveling path of the detection light 14a and the return light of the LiDAR sensor unit 14. Accordingly, at least a part of the focal plane 15b of the camera 15 overlaps the light passing region 12a. In addition, since priority is given to an arrangement in which at least a part of the focal plane 15b overlaps the light passing region 12a, the optical axis 15c of the camera 15 may, as shown in fig. 1, extend in a direction different from the detection reference direction 14b of the LiDAR sensor unit 14.
If foreign matter adheres to the light passing area 12a located on the path of travel of the detection light 14a and the return light of the LiDAR sensor unit 14, detection of information outside the vehicle 100 by the LiDAR sensor unit 14 is hindered. However, since the attachment of such foreign matter is detected by the camera 15 configured as described above, appropriate processing can be performed in accordance with the detection result. Therefore, a decrease in the information detection capability of the LiDAR sensor unit 14 covered by the cover 12 that forms a portion of the exterior surface of the vehicle 100 can be suppressed.
The LiDAR sensor unit 14 may be replaced by any suitable sensor unit that uses light to detect information outside the vehicle 100. Examples of such a sensor unit include a camera unit using visible light, a TOF (Time of Flight) camera unit using infrared light, and a radar unit using millimeter waves. However, since it is difficult for the LiDAR sensor unit 14 itself to detect foreign matter adhering to the light passing region 12a, detecting foreign matter by acquiring an image of the light passing region 12a with the camera 15 is particularly advantageous in combination with the LiDAR sensor unit 14.
As shown in fig. 1, the detection signal S2 generated by the processor 162 may be used to operate the nozzle 17. That is, the processor 162 may cause the nozzle 17 to eject the liquid toward the light passing region 12a when detecting the foreign substance attached to the light passing region 12a of the cover 12.
With such a configuration, it is possible to automate the process for removing foreign matter adhering to the light passage area 12 a. Thus, the effect of suppressing a decrease in the information detection capability of the LiDAR sensor unit 14 covered by the cover 12 that forms a portion of the exterior surface of the vehicle 100 is improved.
As described above, the plurality of pixel data PD1 to PDn included in the data set D1 each have information corresponding to a position in the light passing region 12 a. Thus, the processor 162 can also determine the position of the foreign substance within the light passing region 12a based on the position information that the pixel data whose luminance value changes from "1" to "0" has. On the other hand, as shown in fig. 1, the nozzle 17 may be provided with a mechanism capable of adjusting the ejection direction of the liquid. In this case, the processor 162 may configure the detection signal S2 such that the nozzle 17 ejects the liquid toward the position of the detected foreign object.
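The position determination and nozzle aiming just described can be sketched as follows (a non-limiting illustration outside the original specification, under the assumption that each pixel datum carries an (x, y) position in the light passing region; all names are ours):

```python
def changed_pixel_positions(previous, current, positions):
    """Positions whose binarized value changed from 1 to 0.

    `positions` holds the (x, y) coordinate associated with each
    pixel datum, in the same order as the data sets.
    """
    return [pos for prev, curr, pos in zip(previous, current, positions)
            if prev == 1 and curr == 0]

def spray_target(changed_positions):
    """A simple aiming rule: the centroid of the changed pixels."""
    xs = [x for x, _ in changed_positions]
    ys = [y for _, y in changed_positions]
    return (sum(xs) / len(xs), sum(ys) / len(ys))
```

A real system would map this image-plane coordinate to a nozzle deflection; that calibration is omitted here.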
With this configuration, since the liquid is more accurately ejected to the foreign matter adhering to the light passing region 12a, the possibility of removing the foreign matter can be increased. Thus, the effect of suppressing a decrease in the information detection capability of the LiDAR sensor unit 14 covered by the cover 12 that forms a portion of the exterior surface of the vehicle 100 is further enhanced.
As shown in fig. 1, the camera 15 may be implemented as a miniature camera module including an image pickup element 151, a resin lens 152, and a circuit board 153. The image pickup element 151 may be a CCD image sensor or a CMOS image sensor. The resin lens 152 forms an image on the image pickup element 151; from the viewpoint of widening the field of view, a wide-angle lens is preferably used as the resin lens 152. The circuit board 153 supports the image pickup element 151 and the resin lens 152. A signal line for outputting the image signal S1 is electrically connected to the image pickup element 151 via the circuit board 153.
With such a configuration, the space occupied by the camera 15 in the housing chamber 13 can be significantly reduced, and therefore the degree of freedom in the arrangement of the camera 15 for acquiring the image of the light passing region 12a is increased. Thus, it becomes easy to suppress a decrease in the information detection capability of the LiDAR sensor unit 14 that is covered by the cover 12 that forms a part of the outer surface of the vehicle 100.
As shown in fig. 1, the sensor system 1 may be provided with a lamp unit 18. The lamp unit 18 is a device that emits illumination light to the outside of the vehicle 100. Examples of the lamp unit 18 include a headlamp unit, a clearance lamp unit, a turn signal lamp unit, a fog lamp unit, and a rear combination lamp unit.
The lamp unit 18 is disposed in the housing chamber 13. Thus, the lamp unit 18 is covered by the cover 12. The cover 12 also allows passage of illumination light emitted from the lamp unit 18. In this case, the cover 12 is formed of a material that is also transparent to visible light.
Because its function is to supply illumination light to the outside of the vehicle 100, the lamp unit 18 is generally disposed at a position on the vehicle 100 where there is little obstruction. By also disposing the LiDAR sensor unit 14 at such a position, information outside the vehicle 100 can be acquired efficiently.
The processor 162 capable of executing the above-described processing may be provided by a general-purpose microprocessor operating in cooperation with a general-purpose memory, or may be provided as a part of an application-specific integrated circuit element. Examples of the general-purpose microprocessor include a CPU, an MPU, and a GPU. Examples of the general-purpose memory include a RAM and a ROM. Examples of the application-specific integrated circuit element include a microcontroller, an ASIC, and an FPGA. The processor 162 and the memory 163 may be provided as separate components or may be packaged within a single component.
The above embodiments are merely examples for making the present invention easy to understand. The configuration of the above embodiment can be modified and improved as appropriate without departing from the gist of the present invention.
In the above-described embodiment, the image of the light passing region 12a in the cover 12 is acquired by the single camera 15. However, as shown in fig. 5, a configuration may be adopted in which an arbitrary portion in the light passing region 12a is included in any one of the fields of view of the plurality of cameras 15.

Claims (7)

1. A sensor system mounted on a vehicle, characterized in that,
the sensor system is provided with:
a sensor unit that detects information outside the vehicle using light;
a cover that forms a part of an outer surface of the vehicle in such a manner as to cover the sensor unit and allows passage of the light;
a camera that outputs a signal corresponding to an image of a light passing area in the cover through which the light passes; and
a processor that detects foreign matter attached to the light passing area based on the signal,
at least a portion of a focal plane of the camera overlaps the light passing area.
2. A sensor system mounted on a vehicle, characterized in that,
the sensor system is provided with:
a sensor unit that detects information outside the vehicle using light;
a cover that forms a part of an outer surface of the vehicle in such a manner as to cover the sensor unit and allows passage of the light;
a camera that outputs a signal corresponding to an image of a light passing area in the cover through which the light passes; and
a processor that detects foreign matter attached to the light passing area based on the signal,
an optical axis of the camera extends in a direction different from a detection reference direction of the sensor unit.
3. A sensor system mounted on a vehicle, characterized in that,
the sensor system is provided with:
a LiDAR sensor unit that detects information outside the vehicle using detection light;
a cover that forms a portion of an exterior surface of the vehicle in a manner that covers the LiDAR sensor unit and allows passage of the detection light;
a camera that outputs a signal corresponding to an image of a light passing region in the cover through which the detection light passes; and
a processor that detects foreign matter attached to the light passing area based on the signal.
4. The sensor system according to any one of claims 1 to 3,
the sensor system is provided with a nozzle capable of ejecting a liquid,
the processor causes the nozzle to eject the liquid toward the light passing area when the foreign substance is detected.
5. The sensor system according to any one of claims 1 to 3,
the sensor system is provided with a nozzle capable of ejecting a liquid,
the processor determines the location of the detected foreign object and causes the nozzle to eject the liquid toward the location.
6. The sensor system according to any one of claims 1 to 3,
the camera is provided with:
an image pickup element;
a resin lens for forming an image in the image pickup element; and
and a circuit board for supporting the imaging element and the resin lens.
7. The sensor system according to any one of claims 1 to 3,
the sensor system includes a lamp unit that emits illumination light to the outside of the vehicle,
the cover allows passage of the illumination light.
CN201921887104.3U 2018-11-13 2019-11-05 Sensor system Active CN211786115U (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-213126 2018-11-13
JP2018213126 2018-11-13

Publications (1)

Publication Number Publication Date
CN211786115U (en) 2020-10-27

Family ID: 72963569

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201921887104.3U Active CN211786115U (en) 2018-11-13 2019-11-05 Sensor system

Country Status (1)

Country Link
CN (1) CN211786115U (en)


Legal Events

Date Code Title Description
GR01 Patent grant