US20200274998A1 - Determination of illuminator obstruction by known optical properties - Google Patents

Determination of illuminator obstruction by known optical properties Download PDF

Info

Publication number
US20200274998A1
US20200274998A1
Authority
US
United States
Prior art keywords
illuminator
luminance
vehicle
computer
set forth
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US16/287,672
Other versions
US10771665B1 (en)
Inventor
David Michael Herman
Ashwin Arunmozhi
Venkatesh Krishnan
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ford Global Technologies LLC
Original Assignee
Ford Global Technologies LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ford Global Technologies LLC filed Critical Ford Global Technologies LLC
Priority to US16/287,672
Assigned to FORD GLOBAL TECHNOLOGIES, LLC reassignment FORD GLOBAL TECHNOLOGIES, LLC ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: KRISHNAN, VENKATESH, Arunmozhi, Ashwin, HERMAN, DAVID MICHAEL
Priority to CN202010118464.8A
Priority to DE102020105059.3A
Publication of US20200274998A1
Application granted granted Critical
Publication of US10771665B1
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • H04N5/2256
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60SSERVICING, CLEANING, REPAIRING, SUPPORTING, LIFTING, OR MANOEUVRING OF VEHICLES, NOT OTHERWISE PROVIDED FOR
    • B60S1/00Cleaning of vehicles
    • B60S1/02Cleaning windscreens, windows or optical devices
    • B60S1/56Cleaning windscreens, windows or optical devices specially adapted for cleaning other parts or devices than front windows or windscreens
    • B60S1/566Cleaning windscreens, windows or optical devices specially adapted for cleaning other parts or devices than front windows or windscreens including wiping devices
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/89Lidar systems specially adapted for specific applications for mapping or imaging
    • G06K9/00791
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/764Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • H04N23/54Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • H04N23/55Optical parts specially adapted for electronic image sensors; Mounting thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/56Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/57Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/74Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • H04N23/81Camera processing pipelines; Components thereof for suppressing or minimising disturbance in the image signal generation
    • H04N23/811Camera processing pipelines; Components thereof for suppressing or minimising disturbance in the image signal generation by dust removal, e.g. from surfaces of the image sensor or processing of the image signal output by the electronic image sensor
    • H04N5/2171
    • H04N5/2254
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/10Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
    • B60R2300/103Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using camera systems provided with artificial illumination device, e.g. IR light source

Definitions

  • Autonomous vehicles include one or more devices for detecting a scene surrounding the vehicle.
  • the vehicle autonomously controls its steering, braking, acceleration, etc., based on the detected scene.
  • the vehicle may include one or more image sensors, e.g., near-field cameras.
  • the vehicle may include an illuminator for illuminating the field of view of the image sensor.
  • the illuminator may emit light that is not visible to the human eye, e.g., infrared light.
  • the illuminator includes a light source that generates the light, e.g., a light emitting diode (LED).
  • the illuminator may also include a lens that protects the light source and other components of the illuminator from obstructions, e.g., dirt, dust, mud, rain, snow, etc. Light is emitted from the light source through the lens to the field of view of the image sensor.
  • FIG. 1 is a perspective view of a vehicle including an image sensor and an illuminator with the illuminator unpowered and with a street lamp emitting light.
  • FIG. 2 is a perspective view of the vehicle with the illuminator at full power.
  • FIG. 3 is a perspective view of the vehicle with one illuminator illuminating a lane marker and another illuminator illuminating a road sign.
  • FIG. 4 is a block diagram of a system of the vehicle.
  • FIG. 5 is a flow chart of a method performed by the system.
  • a vehicle includes an image sensor having a field of view, an illuminator aimed at the field of view, and a computer including a processor and a memory storing instructions executable by the processor.
  • the instructions are executable by the processor to illuminate an object external to the vehicle; determine that the object has a known optical property; determine the optical property of the object from a database; calculate luminance of the illuminator based at least on the optical property of the object; and adjust at least one of the illuminator, the image sensor, and the computer based at least on the luminance of the illuminator.
  • the memory may store further instructions executable to adjust the illuminator by cleaning a lens of the illuminator based at least on the luminance of the illuminator.
  • the memory may store further instructions executable to spray fluid at the lens to clean the lens.
  • the memory may store further instructions executable to compare the luminance of the illuminator with a threshold and to adjust at least one of the illuminator, the image sensor, and the computer when the luminance is below the threshold.
  • the memory may store further instructions executable to determine the geometry of the object and to determine a type of the object based on the geometry.
  • the memory may store further instructions executable to determine the shape of the object and to calculate the luminance of the illuminator based at least on the shape.
  • the memory may store further instructions executable to determine the distance between the object and the illuminator and/or the orientation of the object relative to the illuminator and to calculate the luminance of the illuminator based at least on the distance and/or orientation.
  • the memory may store further instructions executable to capture an image of the object during the illumination.
  • a system may include a computer including a processor and a memory, the memory storing instructions executable by the processor to illuminate an object external to a vehicle with an illuminator; determine that the object has a known optical property; determine the optical property of the object from a database; calculate luminance of the illuminator based at least on the optical property of the object; and clean a lens of the illuminator based at least on the luminance of the illuminator.
  • the memory may store further instructions executable to spray fluid at the lens to clean the lens.
  • the memory may store further instructions executable to compare the luminance of the illuminator with a threshold and to clean the lens of the illuminator when the luminance is below the threshold.
  • the memory may store further instructions executable to determine the geometry of the object and to determine a type of the object based on the geometry of the object.
  • the memory may store further instructions executable to determine the shape of the object relative to the illuminator and to calculate the luminance of the illuminator based at least on the shape.
  • the memory may store further instructions executable to determine the distance between the object and the illuminator and/or the orientation of the object relative to the illuminator and to calculate the luminance of the illuminator based at least on the distance and/or orientation.
  • a method includes illuminating an object; determining that the object has a known optical property; determining the optical property of the object from a database; calculating luminance of the illuminator based at least on the optical property of the object; and adjusting at least one of the illuminator, an image sensor, and a computer based at least on the luminance of the illuminator.
  • Adjusting the illuminator may include cleaning a lens of the illuminator.
  • Determining a type of the object may include determining the geometry of the object.
  • the method may include comparing the luminance of the illuminator with a threshold and cleaning the illuminator when the luminance is below the threshold.
  • the method may include determining the shape of the object and calculating the luminance of the illuminator based at least on the shape.
  • the method may include determining the distance between the object and the illuminator and/or the orientation of the object relative to the illuminator and calculating the luminance of the illuminator based at least on the distance and/or orientation.
  • a vehicle 10 includes a system including an image sensor 12 having a field of view and an illuminator 14 aimed at the field of view.
  • the system of the vehicle 10 includes a computer 16 having a processor and a memory storing instructions executable by the processor.
  • the computer 16 is programmed to illuminate an object 18 external to the vehicle 10 , determine that the object 18 has a known optical property, determine the optical property of the object 18 from a database, calculate luminance of the illuminator 14 based at least on the optical property of the object 18 , and adjust at least one of the illuminator 14 , the image sensor 12 , and the computer 16 based at least on the luminance of the illuminator 14 .
  • the optical property of various objects 18 and/or various types of object 18 may be predetermined and stored in the database, as described below. After determining that the object 18 has a known optical property, e.g., based on the image of the object 18 and/or an HD map, the database is accessed to determine the optical property of the object 18, e.g., by object detection from sensor data and/or by localization with HD map data, as described below. That optical property is then used to calculate the luminance of the illuminator 14. In other words, the luminance of the illuminator 14 is calculated based on the known optical property (e.g., diffuse reflectivity, retro-reflectivity, and specular reflectivity components) of the type of the object 18.
  • the position and/or orientation of the object 18 relative to the image sensor 12 and/or illuminator 14 may also be used to calculate the luminance of the illuminator 14. This calculation of the luminance of the illuminator 14 may then be used to determine whether the system should be adjusted due to a blockage of the illuminator 14, e.g., an obstruction on a lens 20 of the illuminator 14. As one example, the lens 20 of the illuminator 14 may be cleaned.
  • the vehicle 10 may be any type of passenger or commercial automobile such as a car, a truck, a sport utility vehicle, a crossover vehicle, a van, a minivan, a taxi, a bus, etc.
  • the vehicle 10 may be an autonomous vehicle.
  • a computer can be programmed to operate the vehicle 10 independently of the intervention of a human driver, completely or to a lesser degree.
  • the computer may be programmed to operate the propulsion, brake system, steering, and/or other vehicle systems.
  • autonomous operation means the computer controls the propulsion, brake system, and steering
  • semi-autonomous operation means the computer controls one or two of the propulsion, brake system, and steering and a human driver controls the remainder
  • nonautonomous operation means the human driver controls the propulsion, brake system, and steering.
  • the vehicle 10 includes the image sensor 12 having a field of view and an illuminator 14 aimed at the field of view.
  • the image sensor 12 and the illuminator 14 may be adjacent to each other, as shown in FIGS. 1-3 , or may be spaced from each other.
  • the illuminator 14 has a lens 20 and the image sensor 12 has a lens 22 .
  • the lens 20 of the illuminator 14 and the lens 22 of the image sensor 12 may be separate from each other.
  • the image sensor 12 and the illuminator 14 may share a common lens (identified with 20 , 22 in FIGS. 1-3 ).
  • the image sensor 12 and/or illuminator 14 may be at any suitable location on the vehicle 10 , e.g., a side body panel, roof, etc.
  • the image sensor 12 may be any type of image sensor.
  • the image sensor 12 may be a digital camera, for example, a near-field camera.
  • the image sensor 12 may be a lidar sensor (e.g., flash lidar), a time-of-flight camera, etc.
  • the image sensor 12 is configured to capture an image of the scene exterior to the vehicle 10 .
  • the illuminator 14 is configured to illuminate the scene exterior to the vehicle 10 to illuminate the image captured by the image sensor 12 .
  • the illuminator 14 may, for example, emit infrared light.
  • the illuminator 14 has a light source that may be, for example, an LED light source.
  • the illuminator 14 may emit light constantly or may emit flashes of light, e.g., for a flash lidar.
  • the illuminator 14 may emit a known pattern of light and, in such an example, may be spaced from the image sensor 12 , i.e., at a different viewpoint. In other words, the illuminator 14 may emit structured light.
  • the illuminator 14 may be configured to illuminate objects 18 in the scene exterior to the vehicle 10, e.g., road signs, lane markers, street signs, trees, grass, bushes, etc., and the image sensor 12 is configured to capture an image of the scene illuminated by the illuminator 14.
  • the vehicle 10 may include a cleaning device 24 ( FIG. 4 ) for cleaning the lens 20 of the illuminator 14 .
  • the cleaning device 24 may include a nozzle 26 ( FIGS. 1-3 ) aimed at the illuminator 14 .
  • the nozzle 26 is shown in some examples in FIGS. 1-3 , and a nozzle 26 may be aimed at one or all of the illuminators 14 .
  • a nozzle 26 may be dedicated to one illuminator 14 or may be shared by multiple illuminators 14 .
  • the nozzles 26 shown in FIGS. 1-3 are on the vehicle body.
  • the nozzle 26 may be incorporated into a sensor housing, e.g., a housing that houses the image sensor 12 and/or the illuminator 14 .
  • the nozzle 26 may spray fluid, e.g., cleaning fluid and/or air, at the lens 20 of the illuminator 14 to clean the lens 20 .
  • the cleaning device 24 may include any suitable pump, reservoir, controller, etc., for selectively cleaning the lens 20 when instructed by the computer 16 , as described below.
  • the vehicle 10 includes a communication network 28 including hardware, such as a communication bus, for facilitating communication among vehicle components.
  • the communication network 28 may facilitate wired or wireless communication among the vehicle components in accordance with a number of communication protocols such as controller area network (CAN), Ethernet, WiFi, Local Interconnect Network (LIN), and/or other wired or wireless mechanisms.
  • the computer 16, implemented via circuits, chips, or other electronic components, is included in the vehicle 10 for carrying out various operations, including as described herein.
  • the computer 16 is a computing device that generally includes a processor and a memory, the memory including one or more forms of computer-readable media, and storing instructions executable by the processor for performing various operations, including as disclosed herein.
  • the memory of the computer 16 further generally stores remote data received via various communications mechanisms; e.g., the computer 16 is generally configured for communications on a controller area network (CAN) bus or the like, and/or for using other wired or wireless protocols, e.g., Bluetooth, etc.
  • the computer 16 may also have a connection to an onboard diagnostics connector (OBD-II).
  • the computer 16 may transmit data and messages to various devices in the vehicle 10 and/or receive data and messages from the various devices, including as described below.
  • the computer 16 is programmed to initiate the steps to calculate the luminance of the illuminator 14 .
  • the computer 16 is programmed to trigger the system and method.
  • the computer 16 may determine, based on inputs, that the steps to calculate the luminance should be initiated or may receive instructions to initiate.
  • the initiation may be based on a distance-traveled interval, a time interval, or some image feature or a change thereof.
  • the image quality of the image sensor 12 may be determined by known methods, i.e., known algorithms, and the results of such an image algorithm may be tracked over time and/or compared to a baseline.
  • the image quality may be tracked over time using a known statistical process control/tracking method.
  • the processor may be programmed to initiate based on changes in image quality, e.g., degradation in image quality.
  • the initiation may be based on detection of an object 18 by the computer 16 (i.e., based on input from the image sensor 12 ).
  • the computer 16 may initiate the steps to calculate luminance of the illuminator 14 .
  • the initiation may be based on cross reference with a high definition (HD) map to identify known objects 18 and to initiate based on proximity to approaching objects 18 on the HD map.
  • an HD map is a digital map for autonomous navigation and includes layers of information (e.g., semantic objects such as road signs, lane markers, street signs, trees, grass, bushes, other vehicles, etc.) on a geometric map.
  • the layers of information may be a combination of information sourced from several autonomous vehicles to create a real-time map.
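  • As an illustration of the initiation logic described above, the following minimal sketch combines a distance-traveled interval, a time interval, image-quality degradation, and HD-map proximity into a single trigger; the class name, thresholds, and inputs are assumptions for illustration, not elements defined by this disclosure.

```python
import time

# Hypothetical trigger for initiating the illuminator-luminance check.
# Thresholds, field names, and the image-quality metric are illustrative
# assumptions, not values specified by the patent.

class LuminanceCheckTrigger:
    def __init__(self, distance_interval_m=5000.0, time_interval_s=3600.0,
                 quality_drop_threshold=0.15):
        self.distance_interval_m = distance_interval_m
        self.time_interval_s = time_interval_s
        self.quality_drop_threshold = quality_drop_threshold
        self.last_distance_m = 0.0
        self.last_time_s = time.time()
        self.baseline_quality = None

    def should_initiate(self, odometer_m, image_quality, hd_map_object_nearby):
        """Return True when any initiation condition from the description holds:
        distance-traveled interval, time interval, image-quality degradation,
        or proximity to a known object on the HD map."""
        now = time.time()
        if self.baseline_quality is None:
            self.baseline_quality = image_quality
        distance_due = odometer_m - self.last_distance_m >= self.distance_interval_m
        time_due = now - self.last_time_s >= self.time_interval_s
        quality_degraded = (self.baseline_quality - image_quality) > self.quality_drop_threshold
        if distance_due or time_due or quality_degraded or hd_map_object_nearby:
            self.last_distance_m = odometer_m
            self.last_time_s = now
            return True
        return False
```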
  • the computer 16 is programmed to image the scene around the vehicle 10 , i.e., external to the vehicle 10 .
  • the computer 16 is programmed to image the scene around the vehicle 10 with varying illuminator light levels. Varying the illuminator light levels of the images allows for ambient light to be subtracted to determine the luminance of the illuminator 14 , as described further below.
  • the scene may be imaged with no illumination from the illuminator 14 (i.e., the illuminator 14 at 0%) and may be imaged with full illumination from the illuminator 14 (i.e., the illuminator 14 at 100%).
  • At least one image is taken by the image sensor 12 with no illumination from the illuminator 14 and at least one image is taken by the image sensor 12 at full illumination from the illuminator 14 .
  • the scene may be imaged at levels between 0% and 100%. The imaging may occur at low vehicle speed or when the vehicle 10 is stopped or, as another example, multiple images may be fused together to avoid errors due to the shift in the image during movement of the vehicle 10 .
  • the computer 16 may strobe the illuminator 14 and use a rolling shutter to create a single “image” where each illumination level is a separate row of the image.
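  • The following sketch illustrates the varying-illumination capture described above: one image with the illuminator off and one at full power, with the ambient-only image subtracted afterward. The set_illuminator_power and capture_image callables are assumed interfaces to the illuminator 14 and image sensor 12, not APIs defined by this disclosure.

```python
import numpy as np

# Illustrative capture routine: image the scene with the illuminator off (0%)
# and at full power (100%) so ambient light can later be subtracted.

def capture_illumination_pair(set_illuminator_power, capture_image, levels=(0.0, 1.0)):
    """Capture one image per illuminator power level and return them keyed by level."""
    images = {}
    for level in levels:
        set_illuminator_power(level)       # e.g., 0.0 = unpowered, 1.0 = full power
        images[level] = capture_image()    # HxW (or HxWxC) array of pixel values
    set_illuminator_power(0.0)
    return images

def illuminator_only_signal(images):
    """Subtract the ambient-only (0%) image from the fully illuminated (100%) image."""
    return np.clip(images[1.0].astype(np.float64) - images[0.0].astype(np.float64), 0, None)
```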
  • Imaging the scene includes imaging objects 18 in the scene.
  • the objects 18 may be, for example, road signs, lane markers, street signs, trees, grass, bushes, other vehicles, etc.
  • the illumination of the scene by the illuminator 14 includes illuminating an object 18 external to the vehicle 10 .
  • the computer 16 is programmed to determine that the object 18 has a known optical property, i.e., an optical property that may be accessed from a database. As one example, the computer 16 is programmed to determine the type of one or more objects 18 in the image for which an optical property, e.g., reflectivity, is known. The optical property is then used to determine the luminance of the illuminator 14 , as described further below.
  • the computer 16 is programmed to determine the geometry of the object 18 and to identify the object 18 (e.g., on an HD map) and/or to determine the type of the object 18 based on the geometry (e.g., by object detection in the image).
  • the geometry of the object 18 includes the shape of the object 18 in the image, the distance between the object 18 and the illuminator 14 and/or image sensor 12, and the orientation of the object 18 relative to the illuminator 14 and/or image sensor 12.
  • the image of the scene taken by the image sensor 12 may be interpreted using one or more other sensors, prior knowledge, and/or algorithms to construct an approximate model of the scene or at least of the one or more objects 18 imaged.
  • the model of the scene may include geometry of the scene, i.e., shapes of objects 18, distances between objects 18 and the illuminator 14 and/or image sensor 12, and orientations of the objects 18 relative to the illuminator 14 and/or image sensor 12.
  • This geometry may be determined by the use of structure-from-motion techniques; depth maps from a monocular camera through the use of neural networks; recognition of 3D objects and their orientation in space through the use of neural networks; depth maps based on monocular structure from motion or visual SLAM; sensor fusion from another sensor such as lidar, radar, or ultrasonic sensors; incorporation of image recognition fused with HD maps or simpler logic (e.g., a road surface is flat, a lane marker lies on the road, and the vehicle 10 is approximately perpendicular to the ground plane); stereo imaging; and/or a time-of-flight camera, etc.
  • the computer 16 is programmed to identify the object 18 and/or to determine the type of the object 18 based on the image of the object 18 .
  • the model of the scene and the ways of constructing the model described above may determine the type of the object 18 , e.g., based at least on the shape of the object 18 in the image.
  • the object 18 may be identified by the use of an HD map along with location identification of the vehicle 10 , i.e., location of the vehicle 10 on the HD map.
  • the HD map may identify an object 18 and the proximity of the vehicle 10 to the object 18 may be known so that the system may image the scene when the object 18 is in the field of view of the image sensor 12 .
  • the computer 16 is programmed to determine the shape of the object 18 ; the distance between the object 18 and the illuminator 14 and/or image sensor 12 ; and/or the orientation of the object 18 relative to the illuminator 14 and/or the image sensor 12 .
  • the computer 16 is programmed to calculate the luminance of the illuminator 14 based at least on the shape, the distance, and/or the orientation.
  • the processor may use the shape, distance, and/or orientation to identify the object 18 and/or determine the type of the object 18 , as described above.
  • the processor may use the shape, distance, and/or orientation in the calculation of the luminance described below.
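  • A rough sketch of recovering the geometry used in the luminance calculation follows: distance to an object of known physical size via a pinhole-camera model, and surface orientation from a plane fit to 3D samples. The function names, the assumption of a known object width, and the availability of depth samples are illustrative assumptions rather than requirements of this disclosure.

```python
import numpy as np

# Sketch of geometry recovery: range from apparent size of a known-size object
# (pinhole model) and surface orientation from a least-squares plane fit.

def distance_from_known_size(pixel_width, real_width_m, focal_length_px):
    """Pinhole-camera estimate of range to an object of known width (e.g., a road sign)."""
    return focal_length_px * real_width_m / pixel_width

def surface_normal_from_points(points_xyz):
    """Least-squares plane normal for 3D samples on the object (e.g., from lidar or a depth map)."""
    pts = np.asarray(points_xyz, dtype=np.float64)
    centered = pts - pts.mean(axis=0)
    _, _, vt = np.linalg.svd(centered)
    normal = vt[-1]
    return normal / np.linalg.norm(normal)

def incidence_angle(normal, ray_to_object):
    """Angle between the illumination ray and the object surface normal (radians)."""
    ray = np.asarray(ray_to_object, dtype=np.float64)
    cos_theta = abs(np.dot(normal, ray)) / np.linalg.norm(ray)
    return float(np.arccos(np.clip(cos_theta, -1.0, 1.0)))
```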
  • the computer 16 is programmed to determine the optical property of the object 18 and/or the type of the object 18 .
  • the computer 16 is programmed to determine the optical property of the object 18 and/or the type of the object 18 from a database.
  • the database may be a lookup table, e.g., on the memory of the computer 16 , that includes optical properties for various types of objects 18 .
  • the database may be a database on an HD map.
  • the computer 16 may be programmed to image the scene when in the vicinity of an object 18 based on the HD map as described above, identify the type of the object 18 in the image as the type identified in the HD map, and access the optical property of that object 18 from the HD map.
  • the optical property of that specific object 18 may be continuously updated in the HD map based on input from other autonomous vehicles that have imaged the object 18 .
  • the computer 16 may be programmed to identify the object 18 in the image as an object identified in the HD map, i.e., based on geometry and location of the vehicle, and access the optical property of that object 18 from the HD map.
  • objects 18 that may be identified by type as described above may have known optical properties, e.g., reflection (specular, diffuse, retro-reflection), absorption percentages, and geometric attributes (distance, relative direction), etc. This may be cross-referenced to the specific wavelength of the illuminator 14, the time of year (winter vs. summer), HD maps (new vs. old lane markers), and other factors. This information is used in the calculation of the luminance of the illuminator 14 as described below.
  • the database may be on the other vehicle or updated by the other vehicle.
  • vehicles and/or infrastructure in their V2X (vehicle-to-everything) communication may include and/or transmit this information.
  • a black vehicle might indicate it has a 10% diffuse reflectance, 2% retro reflection, and 5% specular reflection.
  • the vehicle may be identified by the imaging and type recognition described above, the optical property may be transmitted via V2X, and these two pieces of information may be tied together to determine the optical property of the object 18 being imaged, i.e., the black vehicle.
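  • A minimal sketch of the optical-property database follows, here as an in-memory lookup table keyed by object type; only the black-vehicle values (10% diffuse, 2% retro-reflective, 5% specular) come from the example above, and the other entries are placeholder assumptions. A production system might instead query an HD-map layer or accept values received over V2X.

```python
# Placeholder optical-property table keyed by object type.
# (diffuse, retro_reflective, specular) reflectance fractions.
OPTICAL_PROPERTY_DB = {
    "lane_marker_new": (0.60, 0.30, 0.05),   # placeholder values
    "road_sign":       (0.20, 0.70, 0.05),   # placeholder values
    "black_vehicle":   (0.10, 0.02, 0.05),   # example values from the description
}

def lookup_optical_property(object_type, v2x_reported=None):
    """Prefer a V2X-reported property for the specific object; fall back to the type table."""
    if v2x_reported is not None:
        return v2x_reported
    return OPTICAL_PROPERTY_DB.get(object_type)
```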
  • the computer 16 is programmed to calculate the luminance of the illuminator 14 based at least on the optical property of the object 18 .
  • the computer 16 is programmed to determine the distance between the object 18 and the illuminator 14 and/or the orientation of the object 18 relative to the illuminator 14 and to calculate the luminance of the illuminator 14 based at least on the distance and/or orientation.
  • the computer 16 is programmed to calculate the luminance of the illuminator 14 based on the known physical attributes of the image sensor 12 (e.g., exposure time, analog-to-digital gain, F-stop, vignetting, quantum efficiency, focal length, camera calibration sensitivity, field of view, orientation, position (relative and absolute), etc.) and of the illuminator 14 (e.g., wavelength, luminance vs. power (V, I), position, orientation, intensity of the light source as a function of distance and angle from the light, etc.).
  • the computer 16 may be programmed to account for weather based on absorption of light, e.g., fog.
  • the computer 16 is programmed to calculate the luminance of the illuminator 14 based on a sub-region of the image in which the object 18 with known geometry and optical properties is segmented and analyzed through use of the equation below. The intensity of that region may be analyzed. If a large variation is found, the object 18 may be further sub-divided.
  • the computer 16 may be programmed to account for dark current noise in the image when an object is at a distance where the dark current noise in the image is comparable to the signal.
  • the luminance of the illuminator 14 may be calculated in the following equation:
  • $$\text{Luminance} = \left( \frac{r_{\text{diffuse}}(\theta)}{4^{2}\pi^{2}r^{4}} + \frac{\text{specular}(\theta)}{4\pi r^{2}} + \frac{\text{retro\_reflective}}{4\pi r^{2}} \right) f_{\text{LED}}(\theta)\, f_{\text{obj}}(\theta)\, f_{\text{lens}}(\theta) \left( \frac{N_{d,100\%}}{K_{c}}\,\frac{f_{S}^{2}}{t\,S} - \frac{N_{d,0\%}}{K_{c}}\,\frac{f_{S}^{2}}{t\,S} \right)$$
  • in the equation above, r is the distance from the illuminator 14 to the object 18, which, for an illuminator 14 and image sensor 12 adjacent to each other, is approximately equal to the distance from the object 18 back to the image sensor 12. It can also be assumed that the behavior of the intensity of the light source propagating in space to the object 18 and back to the image sensor 12 follows a point spread function with a modification of the function, f(θ), which can account for the illuminator lens 20, object 18, and image sensor lens 22 orientation functionality.
  • the illuminator 14 may have strong orientation dependence and the image sensor 12 may experience vignetting effects depending on the relative orientations and the image sensor 12 image signal processing corrections. The reflection is accounted for as diffuse and may be determined based on the object 18 and its reflectance in the spectral regime of the light source.
  • the latter portion of the equation above determines the luminance of the object 18 based on the calibration of the image sensor 12 minus the effect of ambient light luminance.
  • the solution of the above equation calculates the luminance of the illuminator 14 .
  • specular(θ) in the equation above corrects for specular reflection if the object 18 is placed within the scene relative to the illuminator 14 and the image sensor 12 such that a specular reflection reaches the image sensor 12. It can be assumed that this term is normally zero and can be dropped from the equation for most objects 18 sampled.
  • the term “retro_reflective” in the equation above is the magnitude of the retro reflective effect multiplied by the illuminator's 14 diffuse light emission at impact to the object 18 .
  • the calculation above yields a numerical value for the percentage decrease in the output of the illuminator 14.
  • the degree of degradation is quantified and appropriate action may be taken based on this information, as described above.
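  • The sketch below evaluates the equation as reconstructed above. The interpretation of the symbols (digital number N_d, calibration constant K_c, f-number f_S, exposure time t, sensitivity S) and the orientation-correction callables are assumptions consistent with the surrounding description, not definitions taken from this disclosure.

```python
import math

# Evaluates the reconstructed luminance equation. Symbol meanings are assumed
# (N_d: digital number, K_c: calibration constant, f_S: f-number, t: exposure
# time, S: sensitivity); they are not explicitly defined in the patent text.

def object_luminance_from_image(n_d, k_c, f_s, t, s):
    """Photometric luminance of the image region from the sensor calibration."""
    return (n_d / k_c) * (f_s ** 2) / (t * s)

def illuminator_luminance(r, theta, r_diffuse, specular, retro_reflective,
                          f_led, f_obj, f_lens,
                          n_d_full, n_d_off, k_c, f_s, t, s):
    """Evaluate the equation above for the luminance attributed to the illuminator.

    r: distance to the object; theta: relative orientation angle;
    r_diffuse/specular: reflectance functions of theta; retro_reflective: scalar magnitude;
    f_led/f_obj/f_lens: orientation-dependent correction functions.
    """
    geometric = (r_diffuse(theta) / (4 ** 2 * math.pi ** 2 * r ** 4)
                 + specular(theta) / (4 * math.pi * r ** 2)
                 + retro_reflective / (4 * math.pi * r ** 2))
    corrections = f_led(theta) * f_obj(theta) * f_lens(theta)
    measured = (object_luminance_from_image(n_d_full, k_c, f_s, t, s)
                - object_luminance_from_image(n_d_off, k_c, f_s, t, s))
    return geometric * corrections * measured

def percentage_decrease(calculated, expected):
    """Degradation relative to the expected (unobstructed) illuminator output."""
    return 100.0 * (1.0 - calculated / expected)
```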
  • the computer 16 is programmed to determine if the luminance of the illuminator 14 is lower than expected and/or needed.
  • the relative low luminance may be caused by a blockage, e.g., on the lens 20 of the illuminator 14 , and/or failure of the illuminator 14 , e.g., LED failure.
  • the computer 16 is programmed to compare the luminance of the illuminator 14 with a threshold.
  • the processor may be programmed to use a statistical process control and/or tracking method to compare and identify changes in the luminance.
  • the imaging at no illumination and full illumination and the calculation of the luminance of the illuminator 14 based on the optical property may be repeated for varying scenes over time to determine a shift.
  • the processor may also cross-reference the specific object 18 with a database, e.g., from an HD map, to account for changes, e.g., new lane markers, or degradation over time.
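  • As a sketch of the comparison and tracking step, the snippet below folds repeated luminance estimates into an exponentially weighted moving average and flags when the smoothed value drops below a threshold; the smoothing factor and threshold are illustrative stand-ins for the statistical process control/tracking method referenced above.

```python
# Illustrative tracking of the calculated illuminator luminance over repeated
# measurements; the EWMA and its parameters are assumptions, not requirements.

class LuminanceTracker:
    def __init__(self, threshold, alpha=0.2):
        self.threshold = threshold   # minimum acceptable illuminator luminance
        self.alpha = alpha           # EWMA smoothing factor
        self.ewma = None

    def update(self, luminance):
        """Fold a new luminance estimate into the running average; return True if below threshold."""
        if self.ewma is None:
            self.ewma = luminance
        else:
            self.ewma = self.alpha * luminance + (1.0 - self.alpha) * self.ewma
        return self.ewma < self.threshold
```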
  • the computer 16 is programmed to adjust the system based on the luminance of the illuminator 14 being lower than expected and/or needed.
  • the computer 16 is programmed to adjust at least one of the illuminator 14 , the image sensor 12 , and the computer 16 when the luminance is below the threshold.
  • the adjustment may be an adjustment of the illuminator 14 by cleaning a lens 20 of the illuminator 14 .
  • fluid such as cleaning liquid and/or air may be sprayed at the lens 20 of the illuminator 14 to clean the lens 20 .
  • the processor may be programmed to instruct a cleaning device 24 to clean the lens 20 in such a case.
  • the processor may be programmed to verify that the lens 20 is clean by repeating the calculation of the luminance described above.
  • Other examples of adjusting the system may include logging the results for future use, scheduling maintenance (including instructing the vehicle 10 to drive to a service provider for maintenance), disabling the system (e.g., disabling the image sensor 12 and/or illuminator 14 ), and/or modifying sensor fusion and perception algorithms/logic to account for a lower luminance.
  • the entire lens 20 , 22 may be cleaned or only a portion of the lens 20 , 22 through which the illuminator 14 is aimed may be cleaned.
  • the image sensor 12 may take longer exposures to obtain an improved-quality image with sufficient image exposure, assuming that the degradation is limited and the dark current noise of the image sensor 12 does not dominate in long exposures.
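  • The following sketch dispatches the adjustments described above when the calculated luminance falls below the threshold: clean the lens 20, verify by recalculating, and otherwise fall back to logging, longer exposures, and scheduling maintenance. All callables are assumed hooks into vehicle systems rather than interfaces defined by this disclosure.

```python
# Illustrative adjustment dispatch; clean_lens, recalculate_luminance, etc. are
# assumed hooks into the cleaning device 24, computer 16, and image sensor 12.

def adjust_for_low_luminance(luminance, threshold, clean_lens, recalculate_luminance,
                             log_result, schedule_maintenance, set_longer_exposure):
    if luminance >= threshold:
        return "ok"
    clean_lens()                              # e.g., spray fluid/air at lens 20
    if recalculate_luminance() >= threshold:  # verify cleaning restored output
        return "cleaned"
    # Cleaning did not help: possibly LED degradation/failure rather than a blockage.
    log_result(luminance)
    set_longer_exposure()                     # compensate at the image sensor 12
    schedule_maintenance()
    return "degraded"
```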
  • a method 500 of operating the examples shown in FIGS. 1-4 is shown in FIG. 5.
  • the computer 16 may be programmed to perform the method shown in FIG. 5 .
  • the method 500 includes initiating the steps to calculate the luminance of the illuminator 14 , i.e., triggering the system and method 500 .
  • Block 505 may include determining, based on inputs, that the steps to calculate the luminance should be initiated and/or receiving instructions to initiate.
  • block 505 may include calculating or receiving a distance-traveled interval, a time interval, or some image feature or change thereof and initiating the system and method 500 based on that information.
  • the method 500 in block 505 may include determining the image quality of the image sensor 12 by known methods, i.e., known algorithms, and the results of such an image algorithm may be tracked over time and/or compared to a baseline.
  • the method may include tracking the image quality over time using a known statistical process control and/or tracking method.
  • the method may include cross-referencing a high definition (HD) map to identify known objects 18 and to initiate based on proximity to approaching objects 18 on the HD map.
  • the method includes imaging the scene around the vehicle 10 .
  • the method includes varying illuminator light levels.
  • the method includes imaging the scene with no illumination from the illuminator 14 (block 510 ) and with full illumination from the illuminator 14 (block 515 ).
  • block 510 includes imaging the scene with the image sensor 12
  • block 515 includes both illuminating the scene with the illuminator 14 and imaging the scene with the image sensor 12 .
  • the method may include imaging the scene at levels between 0% and 100%.
  • the method may include imaging at low vehicle speed or when the vehicle 10 is stopped.
  • the method may include fusing multiple images together to avoid errors due to the shift in the image during movement of the vehicle 10 .
  • Illuminating the scene includes illuminating one or more object 18 in the scene and imaging the scene includes imaging the object 18 .
  • the method includes determining the geometry of the object 18 (block 520) and determining that the object has a known optical property (block 525). This may be based on the geometry of the object 18 determined from the image of the object 18, i.e., the image taken at block 510 and/or the image taken at block 515.
  • the method at block 520 may include calculating and/or receiving a measurement of distance between the object 18 and the illuminator 14 and/or image sensor 12 , geometry of the object 18 , orientation of the object 18 relative to the illuminator 14 and/or image sensor 12 , relative position from illuminator 14 and/or image sensor 12 , and/or other information.
  • the method at block 520 and/or block 525 includes interpreting the image of the scene taken by the image sensor 12 using one or more other sensors, prior knowledge, and/or algorithms and constructing an approximate model of the scene or at least of the one or more objects 18 imaged, as described above.
  • the computer 16 is programmed to determine the geometry of the object 18 and to identify the object 18 and/or determine the type of the object 18 based on the geometry.
  • the method at block 525 includes identifying the object 18 and/or determining the type of the object 18 based on the image of the object 18 .
  • the method may include determining the type of the object 18 based at least on the shape of the object 18 .
  • the model of the scene and the ways of constructing the model described above may identify the object 18 and/or determine the type of the object 18 .
  • the object 18 may be identified by the use of an HD map along with location identification of the vehicle 10 , i.e., location of the vehicle 10 on the HD map.
  • the HD map may identify an object 18 and the proximity of the vehicle 10 to the object 18 so that the system may image the scene when the object 18 is in the field of view of the image sensor 12.
  • the method includes determining the optical property of the type of the object 18 after identification of the object 18 and/or determination of the type as described above.
  • the method includes determining the optical property of the object 18 or the type of the object 18 from a database, as described above.
  • the method may include accessing a lookup table, e.g., on the memory of the computer 16 , that includes optical properties for various types of objects 18 .
  • the method may include imaging the scene when in the vicinity of an object 18 based on the HD map as described above, identifying the type of the object 18 in the image as the type identified in the HD map, and accessing the optical property of that object 18 from the HD map.
  • the method may include accessing the optical property by V2X communication as described above.
  • the method includes calculating the luminance of the illuminator 14 based on the optical property (i.e., based on the object 18 and/or the type of the object 18), the image at no illumination, and the image at full illumination.
  • the calculation based on the object 18 and/or the type of the object 18 may include calculating based on the optical property of the object 18 and/or the type of the object 18 .
  • the method may include determining the distance between the object 18 and the illuminator 14 and/or the orientation of the object 18 relative to the illuminator 14 and calculating the luminance of the illuminator 14 based at least on the distance and/or orientation.
  • the method of calculating the luminance may include implementation of the calculation set forth above.
  • the method may include calculating the luminance based on a sub-region of the image in which the object 18 with known geometry and optical properties is segmented and analyzed through use of the equation above. The intensity of that region may be analyzed. If a large variation is found, the object 18 may be further sub-divided.
  • the method includes determining if the luminance of the illuminator 14 is lower than expected and/or needed.
  • the method includes comparing the luminance of the illuminator 14 (as calculated above) with a threshold.
  • the method may compare and identify changes in the luminance by using statistical process control and/or tracking.
  • the method may include repeating the imaging at no illumination and full illumination and calculating of the luminance of the illuminator 14 based on the optical property for varying scenes over time to determine a shift.
  • the method may include cross-referencing the specific object 18 with a database, e.g., from an HD map, to account for changes, e.g., new lane markers, or degradation over time.
  • the method includes adjusting the system based on the luminance of the illuminator 14 being lower than expected and/or needed.
  • the method includes adjusting at least one of the illuminator 14 , the image sensor 12 , and the computer 16 when the luminance is below the threshold.
  • the method includes cleaning a lens 20 of the illuminator 14 , e.g., spraying fluid such as cleaning liquid and/or air at the lens 20 of the illuminator 14 to clean the lens 20 .
  • the method may include verifying that the lens 20 is clean by repeating the calculation of the luminance described above.
  • adjusting the system may include logging the results for future use, scheduling maintenance, and/or modifying sensor fusion and perception algorithms/logic to account for a lower luminance.
  • the image sensor 12 may take longer exposures to obtain an improved-quality image with sufficient image exposure, assuming that the degradation is limited and the dark current noise of the image sensor 12 does not dominate in long exposures.
  • the image sensor 12, e.g., in examples in which the image sensor 12 is a camera, may take multiple varying exposures to obtain a high dynamic range image with sufficient image intensity range.
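  • Tying the blocks of method 500 together, the end-to-end sketch below follows the flow of FIG. 5 as described above (initiate, image at 0% and 100%, determine geometry, look up the optical property, calculate luminance, compare, adjust); the helper callables are the assumed hooks from the earlier sketches, not interfaces defined by this disclosure.

```python
# End-to-end sketch following the flow of FIG. 5 as described above. Block
# numbers past 525 are not named in the description, so the remaining steps
# are commented generically.

def method_500(trigger, capture_pair, determine_geometry, lookup_property,
               calculate_luminance, tracker, adjust):
    if not trigger():                        # block 505: initiate the check
        return
    images = capture_pair()                  # blocks 510/515: image at 0% and 100% illumination
    geometry = determine_geometry(images)    # block 520: shape, distance, orientation
    prop = lookup_property(geometry)         # block 525 onward: known optical property from database
    if prop is None:
        return                               # no known optical property for this object
    luminance = calculate_luminance(images, geometry, prop)   # apply the equation above
    if tracker.update(luminance):            # compare with threshold / track shift over time
        adjust(luminance)                    # e.g., clean lens 20, log, schedule maintenance
```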
  • Computing devices such as the computer 16 generally include computer-executable instructions, where the instructions may be executable by one or more computing devices such as those listed above.
  • Computer-executable instructions may be compiled or interpreted from computer programs created using a variety of programming languages and/or technologies, including, without limitation, and either alone or in combination, Java™, C, C++, Visual Basic, JavaScript, Python, Perl, etc. Some of these applications may be compiled and executed on a virtual machine, such as the Java Virtual Machine, the Dalvik virtual machine, or the like.
  • a processor receives instructions, e.g., from a memory, a computer-readable medium, etc., and executes these instructions, thereby performing one or more processes, including one or more of the processes described herein.
  • instructions and other data may be stored and transmitted using a variety of computer-readable media.
  • a computer-readable medium includes any non-transitory (e.g., tangible) medium that participates in providing data (e.g., instructions) that may be read by a computer (e.g., by a processor of a computer).
  • a medium may take many forms, including, but not limited to, non-volatile media and volatile media.
  • Non-volatile media may include, for example, optical or magnetic disks and other persistent memory.
  • Volatile media may include, for example, dynamic random access memory (DRAM), which typically constitutes a main memory.
  • Such instructions may be transmitted by one or more transmission media, including coaxial cables, copper wire and fiber optics, including the wires that comprise a system bus coupled to a processor of a computer.
  • Computer-readable media include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, any other magnetic medium, a CD-ROM, DVD, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EEPROM, any other memory chip or cartridge, or any other medium from which a computer can read.
  • system elements may be implemented as computer-readable instructions (e.g., software) on one or more computing devices (e.g., servers, personal computers, computing modules, etc.), stored on computer readable media associated therewith (e.g., disks, memories, etc.).
  • a computer program product may comprise such instructions stored on computer readable media for carrying out the functions described herein.

Abstract

A vehicle includes an image sensor having a field of view, an illuminator aimed at the field of view; and a computer including a processor and a memory storing instructions executable by the processor. The computer is programmed to illuminate an object external to the vehicle; determine that the object has a known optical property; determine the optical property of the object from a database; calculate luminance of the illuminator based at least on the optical property of the object; and adjust at least one of the illuminator, the image sensor, and the computer based at least on the luminance of the illuminator.

Description

    BACKGROUND
  • Autonomous vehicles include one or more devices for detecting a scene surrounding the vehicle. The vehicle autonomously controls its steering, braking, acceleration, etc., based on the detected scene. As one example, the vehicle may include one or more image sensors, e.g., near-field cameras.
  • The vehicle may include an illuminator for illuminating the field of view of the image sensor. The illuminator may emit light that is not visible to the human eye, e.g., infrared light. The illuminator includes a light source that generates the light, e.g., a light emitting diode (LED). The illuminator may also include a lens that protects the light source and other components of the illuminator from obstructions, e.g., dirt, dust, mud, rain, snow, etc. Light is emitted from the light source through the lens to the field of view of the image sensor.
  • Current methods are known for determining obstructions on a lens of the image sensor and cleaning the identified obstructions. However, obstructions on the lens of the illuminator decrease the amount of generated light that reaches the field of view and degrade image quality. There remains an opportunity to account for obstructions on the lens of the illuminator.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a perspective view of a vehicle including an image sensor and an illuminator with the illuminator unpowered and with a street lamp emitting light.
  • FIG. 2 is a perspective view of the vehicle with the illuminator at full power.
  • FIG. 3 is a perspective view of the vehicle with one illuminator illuminating a lane marker and another illuminator illuminating a road sign.
  • FIG. 4 is a block diagram of a system of the vehicle.
  • FIG. 5 is a flow chart of a method performed by the system.
  • DETAILED DESCRIPTION
  • A vehicle includes an image sensor having a field of view, an illuminator aimed at the field of view, and a computer including a processor and a memory storing instructions executable by the processor. The instructions are executable by the processor to illuminate an object external to the vehicle; determine that the object has a known optical property; determine the optical property of the object from a database; calculate luminance of the illuminator based at least on the optical property of the object; and adjust at least one of the illuminator, the image sensor, and the computer based at least on the luminance of the illuminator.
  • The memory may store further instructions executable to adjust the illuminator by cleaning a lens of the illuminator based at least on the luminance of the illuminator. The memory may store further instructions executable to spray fluid at the lens to clean the lens.
  • The memory may store further instructions executable to compare the luminance of the illuminator with a threshold and to adjust at least one of the illuminator, the image sensor, and the computer when the luminance is below the threshold.
  • The memory may store further instructions executable to determine the geometry of the object and to determine a type of the object based on the geometry.
  • The memory may store further instructions executable to determine the shape of the object and to calculate the luminance of the illuminator based at least on the shape.
  • The memory may store further instructions executable to determine the distance between the object and the illuminator and/or the orientation of the object relative to the illuminator and to calculate the luminance of the illuminator based at least on the distance and/or orientation.
  • The memory may store further instructions executable to capture an image of the object during the illumination.
  • A system may include a computer including a processor and a memory, the memory storing instructions executable by the processor to illuminate an object external to a vehicle with an illuminator; determine that the object has a known optical property; determine the optical property of the object from a database; calculate luminance of the illuminator based at least on the optical property of the object; and clean a lens of the illuminator based at least on the luminance of the illuminator.
  • The memory may store further instructions executable to spray fluid at the lens to clean the lens.
  • The memory may store further instructions executable to compare the luminance of the illuminator with a threshold and to clean the lens of the illuminator when the luminance is below the threshold.
  • The memory may store further instructions executable to determine the geometry of the object and to determine a type of the object based on the geometry of the object.
  • The memory may store further instructions executable to determine the shape of the object relative to the illuminator and to calculate the luminance of the illuminator based at least on the shape.
  • The memory may store further instructions executable to determine the distance between the object and the illuminator and/or the orientation of the object relative to the illuminator and to calculate the luminance of the illuminator based at least on the distance and/or orientation.
  • A method includes illuminating an object; determining that the object has a known optical property; determining the optical property of the object from a database; calculating luminance of the illuminator based at least on the optical property of the object; and adjusting at least one of the illuminator, an image sensor, and a computer based at least on the luminance of the illuminator.
  • Adjusting the illuminator may include cleaning a lens of the illuminator.
  • Determining a type of the object may include determining the geometry of the object.
  • The method may include comparing the luminance of the illuminator with a threshold and cleaning the illuminator when the luminance is below the threshold.
  • The method may include determining the shape of the object and calculating the luminance of the illuminator based at least on the shape.
  • The method may include determining the distance between the object and the illuminator and/or the orientation of the object relative to the illuminator and calculating the luminance of the illuminator based at least on the distance and/or orientation.
  • With reference to the Figures, wherein like numerals indicate like parts throughout the several views, a vehicle 10 includes a system including an image sensor 12 having a field of view and an illuminator 14 aimed at the field of view. The system of the vehicle 10 includes a computer 16 having a processor and a memory storing instructions executable by the processor. The computer 16 is programmed to illuminate an object 18 external to the vehicle 10, determine that the object 18 has a known optical property, determine the optical property of the object 18 from a database, calculate luminance of the illuminator 14 based at least on the optical property of the object 18, and adjust at least one of the illuminator 14, the image sensor 12, and the computer 16 based at least on the luminance of the illuminator 14.
  • The optical property of various objects 18 and/or various types of object 18 may be predetermined and stored in the database, as described below. After determining that the object 18 has a known optical property, e.g., based on the image of the object 18 and/or an HD map, the database is accessed to determine the optical property of the object 18, e.g., by object detection from sensor data and/or by localization with HD map data, as described below. That optical property is then used to calculate the luminance of the illuminator 14. In other words, the luminance of the illuminator 14 is calculated based on the known optical property (e.g., diffuse reflectivity, retro-reflectivity, and specular reflectivity components) of the type of the object 18. As discussed below, the position and/or orientation of the object 18 relative to the image sensor 12 and/or illuminator 14 may also be used to calculate the luminance of the illuminator 14. This calculation of the luminance of the illuminator 14 may then be used to determine if the system should be adjusted due to a blockage of the illuminator 14, e.g., an obstruction on a lens 20 of the illuminator 14. As one example, the lens 20 of the illuminator 14 may be cleaned.
  • The vehicle 10 may be any type of passenger or commercial automobile such as a car, a truck, a sport utility vehicle, a crossover vehicle, a van, a minivan, a taxi, a bus, etc. The vehicle 10 may be an autonomous vehicle. A computer can be programmed to operate the vehicle 10 independently of the intervention of a human driver, completely or to a lesser degree. The computer may be programmed to operate the propulsion, brake system, steering, and/or other vehicle systems. For the purposes of this disclosure, autonomous operation means the computer controls the propulsion, brake system, and steering; semi-autonomous operation means the computer controls one or two of the propulsion, brake system, and steering and a human driver controls the remainder; and nonautonomous operation means the human driver controls the propulsion, brake system, and steering.
  • The vehicle 10 includes the image sensor 12 having a field of view and an illuminator 14 aimed at the field of view. The image sensor 12 and the illuminator 14 may be adjacent to each other, as shown in FIGS. 1-3, or may be spaced from each other. The illuminator 14 has a lens 20 and the image sensor 12 has a lens 22. The lens 20 of the illuminator 14 and the lens 22 of the image sensor 12 may be separate from each other. As another example, the image sensor 12 and the illuminator 14 may share a common lens (identified with 20, 22 in FIGS. 1-3). The image sensor 12 and/or illuminator 14 may be at any suitable location on the vehicle 10, e.g., a side body panel, roof, etc.
  • The image sensor 12 may be any type of image sensor. As one example, the image sensor 12 may be a digital camera, for example, a near-field camera. As other examples, the image sensor 12 may be a lidar sensor (e.g., flash lidar), a time-of-flight camera, etc. The image sensor 12 is configured to capture an image of the scene exterior to the vehicle 10.
  • The illuminator 14 is configured to illuminate the scene exterior to the vehicle 10 that is imaged by the image sensor 12. The illuminator 14 may, for example, emit infrared light. The illuminator 14 has a light source that may be, for example, an LED light source. The illuminator 14 may emit light constantly or may emit flashes of light, e.g., for a flash lidar. The illuminator 14 may emit a known pattern of light and, in such an example, may be spaced from the image sensor 12, i.e., at a different viewpoint. In other words, the illuminator 14 may emit structured light. The illuminator 14 may be configured to illuminate objects 18 in the scene exterior to the vehicle 10, e.g., road signs, lane markers, street signs, trees, grass, bushes, etc., and the image sensor 12 is configured to capture an image of the scene illuminated by the illuminator 14.
  • The vehicle 10 may include a cleaning device 24 (FIG. 4) for cleaning the lens 20 of the illuminator 14. The cleaning device 24 may include a nozzle 26 (FIGS. 1-3) aimed at the illuminator 14. The nozzle 26 is shown in some examples in FIGS. 1-3, and a nozzle 26 may be aimed at one or all of the illuminators 14. A nozzle 26 may be dedicated to one illuminator 14 or may be shared by multiple illuminators 14. The nozzles 26 shown in FIGS. 1-3 are on the vehicle body. As other examples, the nozzle 26 may be incorporated into a sensor housing, e.g., a housing that houses the image sensor 12 and/or the illuminator 14. The nozzle 26 may spray fluid, e.g., cleaning fluid and/or air, at the lens 20 of the illuminator 14 to clean the lens 20. The cleaning device 24 may include any suitable pump, reservoir, controller, etc., for selectively cleaning the lens 20 when instructed by the computer 16, as described below.
  • The vehicle 10 includes a communication network 28 including hardware, such as a communication bus, for facilitating communication among vehicle components. The communication network 28 may facilitate wired or wireless communication among the vehicle components in accordance with a number of communication protocols such as controller area network (CAN), Ethernet, WiFi, Local Interconnect Network (LIN), and/or other wired or wireless mechanisms.
  • The computer 16, implemented via circuits, chips, or other electronic components, is included in the vehicle 10 for carrying out various operations, including as described herein. The computer 16 is a computing device that generally includes a processor and a memory, the memory including one or more forms of computer-readable media, and storing instructions executable by the processor for performing various operations, including as disclosed herein. The memory of the computer 16 further generally stores remote data received via various communications mechanisms; e.g., the computer 16 is generally configured for communications on a controller area network (CAN) bus or the like, and/or for using other wired or wireless protocols, e.g., Bluetooth, etc. The computer 16 may also have a connection to an onboard diagnostics connector (OBD-II). Via the communication network using Ethernet, WiFi, the CAN bus, Local Interconnect Network (LIN), and/or other wired or wireless mechanisms, the computer 16 may transmit data and messages to various devices in the vehicle 10 and/or receive data and messages from the various devices, including as described below.
  • The computer 16 is programmed to initiate the steps to calculate the luminance of the illuminator 14. In other words, the computer 16 is programmed to trigger the system and method. The computer 16 may determine, based on inputs, that the steps to calculate the luminance should be initiated or may receive instructions to initiate.
  • The initiation may be based on a distance-traveled interval, a time interval, or an image feature or a change thereof. For example, the image quality of the image sensor 12 may be determined by known methods, i.e., known algorithms, and the results of such an image algorithm may be tracked over time and/or compared to a baseline. For example, the image quality may be tracked over time using a known statistical process control/tracking method. The processor may be programmed to initiate based on changes in image quality, e.g., degradation in image quality.
  • As another example, the initiation may be based on detection of an object 18 by the computer 16 (i.e., based on input from the image sensor 12). In other words, when the computer 16 identifies an object 18 as an object for which an optical property is known, the computer 16 may initiate the steps to calculate luminance of the illuminator 14.
  • As another example, the initiation may be based on cross-referencing a high definition (HD) map to identify known objects 18 and initiating based on proximity to approaching objects 18 on the HD map. As is known, an HD map is a digital map for autonomous navigation and includes layers of information (e.g., semantic objects such as road signs, lane markers, street signs, trees, grass, bushes, other vehicles, etc.) on a geometric map. The layers of information may be a combination of information sourced from several autonomous vehicles to create a real-time map.
  • The computer 16 is programmed to image the scene around the vehicle 10, i.e., external to the vehicle 10. Specifically, the computer 16 is programmed to image the scene around the vehicle 10 with varying illuminator light levels. Varying the illuminator light levels of the images allows for ambient light to be subtracted to determine the luminance of the illuminator 14, as described further below. As an example, the scene may be imaged with no illumination from the illuminator 14 (i.e., the illuminator 14 at 0%) and may be imaged with full illumination from the illuminator 14 (i.e., the illuminator 14 at 100%). In other words, at least one image is taken by the image sensor 12 with no illumination from the illuminator 14 and at least one image is taken by the image sensor 12 at full illumination from the illuminator 14. In addition, or in the alternative, the scene may be imaged at levels between 0% and 100%. The imaging may occur at low vehicle speed or when the vehicle 10 is stopped or, as another example, multiple images may be fused together to avoid errors due to the shift in the image during movement of the vehicle 10. As another example, the computer 16 may strobe the illuminator 14 and use a rolling shutter to create a single “image” where each illumination level is a separate row of the image.
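  • As an illustration only, and not the patent's implementation, the subtraction of ambient light described above may be sketched as follows; set_illuminator() and capture_image() are hypothetical interfaces standing in for the illuminator and camera drivers:

```python
# A minimal sketch (not the patent's implementation) of imaging at varying
# illuminator levels and subtracting the ambient-only frame to isolate the
# illuminator's contribution. set_illuminator() and capture_image() are
# hypothetical interfaces assumed for illustration.
import numpy as np

def illuminator_contribution(set_illuminator, capture_image) -> np.ndarray:
    set_illuminator(0.0)                          # illuminator off (0%)
    ambient = capture_image().astype(np.float32)

    set_illuminator(1.0)                          # illuminator at full power (100%)
    illuminated = capture_image().astype(np.float32)

    # Pixelwise difference removes ambient light; clipping guards against
    # sensor noise driving the difference slightly negative.
    return np.clip(illuminated - ambient, 0.0, None)
```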
  • Imaging the scene includes imaging objects 18 in the scene. As set forth above, the objects 18 may be, for example, road signs, lane markers, street signs, trees, grass, bushes, other vehicles, etc. The illumination of the scene by the illuminator 14 includes illuminating an object 18 external to the vehicle 10.
  • The computer 16 is programmed to determine that the object 18 has a known optical property, i.e., an optical property that may be accessed from a database. As one example, the computer 16 is programmed to determine the type of one or more objects 18 in the image for which an optical property, e.g., reflectivity, is known. The optical property is then used to determine the luminance of the illuminator 14, as described further below.
  • For example, the computer 16 is programmed to determine the geometry of the object 18 and to identify the object 18 (e.g., on an HD map) and/or to determine the type of the object 18 based on the geometry (e.g., by object detection in the image). The geometry of the object 18 includes the shape of the object 18 in the image, the distance between the object 18 and the illuminator 14 and/or image sensor 12, and the orientation of the object 18 relative to the illuminator 14 and/or image sensor 12.
  • The image of the scene taken by the image sensor 12, i.e., by the sensing elements (CMOS, CCD, etc.) of the image sensor 12, may be interpreted, alone or together with one or more other sensors, prior knowledge, and/or algorithms, to construct an approximate model of the scene or at least of one or more objects 18 imaged. The model of the scene may include geometry of the scene, i.e., shapes of objects 18, distances between objects 18 and the illuminator 14 and/or image sensor 12, and orientations of the objects 18 relative to the illuminator 14 and/or image sensor 12. This geometry may be determined by the use of structure-from-motion techniques; depth maps from a monocular camera through the use of neural networks; recognition of 3D objects and their orientation in space through use of neural networks; depth maps from monocular-camera structure from motion or visual SLAM; sensor fusion with another sensor such as lidar, radar, or ultrasonic sensors; incorporation of image recognition fused with HD maps or simpler logic (e.g., a road surface is flat, a lane marker lies on the road, and the vehicle 10 is approximately perpendicular to the ground plane); stereo imaging; and/or a time-of-flight camera, etc.
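  • As a simple geometric sketch, and not one of the specific techniques enumerated above, the distance to a recognized object of known physical size can be approximated with the pinhole model; the focal length and sizes below are assumed example inputs:

```python
# Sketch: approximate object-to-camera distance from a known object size
# using the pinhole camera model. All numeric values are assumptions for
# illustration, not values from the patent.
def distance_from_known_size(object_height_m: float,
                             object_height_px: float,
                             focal_length_px: float) -> float:
    """Approximate object-to-camera distance r in meters."""
    return focal_length_px * object_height_m / object_height_px

# Example: a 0.75 m tall sign imaged 50 px tall with a 1400 px focal length
# is roughly 21 m from the camera.
r = distance_from_known_size(0.75, 50.0, 1400.0)
```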
  • Based on this geometry, the computer 16 is programmed to identify the object 18 and/or to determine the type of the object 18 based on the image of the object 18. As one example, the model of the scene and the ways of constructing the model described above may determine the type of the object 18, e.g., based at least on the shape of the object 18 in the image. As another example, the object 18 may be identified by the use of an HD map along with location identification of the vehicle 10, i.e., location of the vehicle 10 on the HD map. For example, the HD map may identify an object 18 and the proximity of the vehicle 10 to the object 18 may be known so that the system may image the scene when the object 18 is in the field of view of the image sensor 12.
  • The computer 16 is programmed to determine the shape of the object 18; the distance between the object 18 and the illuminator 14 and/or image sensor 12; and/or the orientation of the object 18 relative to the illuminator 14 and/or the image sensor 12. The computer 16 is programmed to calculate the luminance of the illuminator 14 based at least on the shape, the distance, and/or the orientation. For example, the processor may use the shape, distance, and/or orientation to identify the object 18 and/or determine the type of the object 18, as described above. In addition, or in the alternative, the processor may use the shape, distance, and/or orientation in the calculation of the luminance described below.
  • The computer 16 is programmed to determine the optical property of the object 18 and/or the type of the object 18. As an example, the computer 16 is programmed to determine the optical property of the object 18 and/or the type of the object 18 from a database. The database may be a lookup table, e.g., on the memory of the computer 16, that includes optical properties for various types of objects 18. As another example, the database may be a database on an HD map. For example, the computer 16 may be programmed to image the scene when in the vicinity of an object 18 based on the HD map as described above, identify the type of the object 18 in the image as the type identified in the HD map, and access the optical property of that object 18 from the HD map. In such an example, the optical property of that specific object 18 may be continuously updated in the HD map based on input from other autonomous vehicles that have imaged the object 18. As another example, the computer 16 may be programmed to identify the object 18 in the image as an object identified in the HD map, i.e., based on geometry and location of the vehicle, and access the optical property of that object 18 from the HD map.
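  • A minimal sketch of the lookup-table form of the database is shown below; the object types and numeric reflectivity values are placeholders for illustration only, not figures taken from the patent:

```python
# Sketch of a lookup table of reflectivity components keyed by object type.
# The values are illustrative assumptions only.
from typing import Dict, Optional

OPTICAL_PROPERTIES: Dict[str, Dict[str, float]] = {
    # object type:      diffuse, specular, retro-reflective components
    "lane_marker_new": {"diffuse": 0.60, "specular": 0.05, "retro": 0.30},
    "lane_marker_old": {"diffuse": 0.35, "specular": 0.02, "retro": 0.10},
    "road_sign":       {"diffuse": 0.20, "specular": 0.05, "retro": 0.70},
    "grass":           {"diffuse": 0.15, "specular": 0.00, "retro": 0.00},
}

def lookup_optical_property(object_type: str) -> Optional[Dict[str, float]]:
    """Return the known reflectivity components, or None if the type is unknown."""
    return OPTICAL_PROPERTIES.get(object_type)
```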
  • In particular, objects 18 that may be identified by type as described above, e.g., road signs, lane markers, street signs, trees, grass, bushes, other vehicles, etc., may have known optical properties, e.g., reflection components (specular, diffuse, retro-reflection), absorption percentages, geometric attributes (distance, relative direction), etc. These may be cross-referenced against the specific wavelength of the illuminator 14, the time of year (winter vs. summer), HD maps (new vs. old lane markers), and other factors. This information is used in the calculation of the luminance of the illuminator 14 as described below.
  • As another example, in the event the object 18 is another vehicle, the database may be on the other vehicle or updated by the other vehicle. For example, vehicles and/or infrastructure may include and/or transmit this information in their V2X (vehicle-to-everything) communication. For example, a black vehicle might indicate that it has 10% diffuse reflectance, 2% retro-reflection, and 5% specular reflection. The vehicle may be identified by the imaging and type recognition described above, and the optical property transmitted via V2X; these two pieces of information may be tied together to determine the optical property of the object 18 being imaged, i.e., the black vehicle.
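  • A hedged sketch of how such transmitted reflectance values might be represented and tied to the imaged object follows; the message fields are assumptions for illustration and are not a defined V2X payload:

```python
# Sketch: tie reflectance values received over V2X to a detected object by a
# shared identifier. Fields and keys are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class V2XReflectance:
    vehicle_id: str
    diffuse: float       # e.g., 0.10 for the black-vehicle example above
    retro: float         # e.g., 0.02
    specular: float      # e.g., 0.05

def attach_v2x_property(detected_object: dict, msg: V2XReflectance) -> dict:
    """Attach the transmitted reflectance to the detected object by identity."""
    if detected_object.get("v2x_id") == msg.vehicle_id:
        detected_object["optical_property"] = {
            "diffuse": msg.diffuse,
            "retro": msg.retro,
            "specular": msg.specular,
        }
    return detected_object
```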
  • The computer 16 is programmed to calculate the luminance of the illuminator 14 based at least on the optical property of the object 18. In addition, the computer 16 is programmed to determine the distance between the object 18 and the illuminator 14 and/or the orientation of the object 18 relative to the illuminator 14 and to calculate the luminance of the illuminator 14 based at least on the distance and/or orientation.
  • Specifically, the computer 16 is programmed to calculate the luminance of the illuminator 14 based on the known physical attributes of the image sensor 12 (e.g., exposure time, analog-to-digital gain, f-stop, vignetting, quantum efficiency, focal length, camera calibration sensitivity, field of view, orientation, position (relative and absolute), etc.) and of the illuminator 14 (e.g., wavelength, luminance vs. power (V, I), position, orientation, intensity of the light source as a function of distance and angle from the light, etc.). The computer 16 may be programmed to account for weather based on absorption of light, e.g., fog.
  • The computer 16 is programmed to calculate the luminance of the illuminator 14 based on a sub-region of the image in which the object 18 with known geometry and optical properties is segmented and analyzed through use of the equation below. The intensity of that region may be analyzed. If a large variation is found, then the object 18 may be further sub-divided. The computer 16 may be programmed to account for dark-current noise in the image when an object is at a distance where the dark-current noise in the image is comparable to the signal.
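  • A sketch of this sub-region analysis is shown below: measure the intensity spread within the segmented region and subdivide when the variation is large; the coefficient-of-variation threshold and recursion depth are assumed tuning parameters, not values from the patent:

```python
# Sketch: yield sub-regions of a segmented object whose intensity is
# sufficiently uniform; subdivide into quadrants when the spread is large.
import numpy as np

def uniform_subregions(region: np.ndarray, max_cv: float = 0.25, depth: int = 0):
    """Yield sub-regions whose intensity is uniform enough to analyze."""
    mean = float(region.mean())
    cv = float(region.std()) / mean if mean > 0 else 0.0
    if cv <= max_cv or depth >= 3 or min(region.shape) < 4:
        yield region                              # uniform enough; use directly
        return
    h, w = region.shape
    quadrants = (region[:h // 2, :w // 2], region[:h // 2, w // 2:],
                 region[h // 2:, :w // 2], region[h // 2:, w // 2:])
    for quadrant in quadrants:
        yield from uniform_subregions(quadrant, max_cv, depth + 1)
```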
  • Given the calibration information, previously obtained geometry, image sequence at varying illuminator power levels, and determined optical properties, the luminance of the illuminator 14 may be calculated in the following equation:
  • Luminance = $\left( \dfrac{4}{2\pi^{2} r^{4}}\, r_{\mathrm{diffuse}}(\theta) + \dfrac{4}{\pi r^{2}}\, \mathrm{specular}(\theta) + \dfrac{4}{\pi r^{2}}\, \mathrm{retro\_reflective} \right) \cdot f_{\mathrm{LED}}(\theta)\, f_{\mathrm{obj}}(\theta)\, f_{\mathrm{lens}}(\theta) \cdot \left( \dfrac{N_{d,100\%}}{K_{c}} \cdot \dfrac{f_{S}^{2}}{t\,S} - \dfrac{N_{d,0\%}}{K_{c}} \cdot \dfrac{f_{S}^{2}}{t\,S} \right)$
  • where:
    • r = distance between the object 18 and the image sensor 12 and/or illuminator 14;
    • r_diffuse(θ) = known diffuse reflection value of the object 18;
    • specular(θ) = known specular reflection value of the object 18;
    • retro_reflective = known retro-reflective value of the object 18;
    • f_LED(θ) = function of the illuminator lens 20;
    • f_obj(θ) = function of the object 18;
    • f_lens(θ) = function of the image sensor lens 22;
    • N_d = digital number (value) of the pixel in the image (N_d,100% with the illuminator 14 at full power; N_d,0% with the illuminator 14 off);
    • K_c = calibration constant for the image sensor 12;
    • t = exposure time (seconds);
    • f_S = aperture number (f-stop);
    • S = ISO sensitivity;
    • L_s = luminance of the scene (candela/m²).
  • It may be assumed in some instances that the distance r from the object 18 to the illuminator 14 and to the image sensor 12 is approximately equal. It can also be assumed that the intensity of the light propagating from the light source to the object 18 and back to the image sensor 12 follows a point-spread function with a modification of the function, f(θ), which can account for the orientation dependence of the illuminator lens 20, the object 18, and the image sensor lens 22. For example, the illuminator 14 may have strong orientation dependence, and the image sensor 12 may experience vignetting effects depending on the relative orientations and the image signal processing corrections of the image sensor 12. The reflection is accounted for as diffuse and may be determined based on the object 18 and its reflectance in the spectral regime of the light source. The latter portion of the equation above determines the luminance of the object 18 based on the calibration of the image sensor 12 minus the effect of ambient light luminance. The solution of the above equation calculates the luminance of the illuminator 14. The term "specular(θ)" in the equation above corrects for specular reflection if the object 18 happens to be positioned within the scene, relative to the illuminator 14 and the image sensor 12, such that specular reflection reaches the image sensor 12. It can be assumed that this term is normally zero and can be dropped from the equation for most objects 18 sampled. The term "retro_reflective" in the equation above is the magnitude of the retro-reflective effect multiplied by the illuminator's 14 diffuse light emission incident on the object 18. Further corrections can be added to account for spectral properties of the illuminator 14, the object 18, and the image sensor 12. Further, sections of the object's pixels that may be affected by specular reflection from the illuminator 14 or other light sources may be removed to simplify the calculation for an object 18 with varying intensity across the sub-region.
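  • A sketch that evaluates the reconstructed equation above as written, with the simplifications just noted (specular term dropped, a single distance r), is shown below; the reconstruction of the coefficients follows the equation as printed, and all inputs are assumed to come from the camera calibration, geometry estimation, and the two exposures rather than from any values fixed by the patent:

```python
# Sketch: evaluate the luminance equation as reconstructed above, with the
# specular term dropped. Inputs are illustrative assumptions.
import math

def pixel_luminance(n_d: float, k_c: float, f_s: float, t: float, s: float) -> float:
    """Scene luminance implied by a pixel value and the camera calibration."""
    return (n_d / k_c) * (f_s ** 2) / (t * s)

def illuminator_luminance(r: float, r_diffuse: float, retro: float,
                          f_led: float, f_obj: float, f_lens: float,
                          n_d_100: float, n_d_0: float,
                          k_c: float, f_s: float, t: float, s: float) -> float:
    # Geometry/reflectance factor: the diffuse term falls off with r**4
    # (out and back); the retro-reflective term with r**2.
    geometry = ((4.0 / (2.0 * math.pi ** 2 * r ** 4)) * r_diffuse
                + (4.0 / (math.pi * r ** 2)) * retro)
    orientation = f_led * f_obj * f_lens
    # Full-illumination pixel luminance minus ambient-only pixel luminance.
    delta = (pixel_luminance(n_d_100, k_c, f_s, t, s)
             - pixel_luminance(n_d_0, k_c, f_s, t, s))
    return geometry * orientation * delta
```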
  • The calculation above yields a numerical value for the percentage decrease in output of the illuminator 14. Thus, the degree of degradation is quantified, and appropriate action may be taken based on this information, as described above.
  • The computer 16 is programmed to determine whether the luminance of the illuminator 14 is lower than expected and/or needed. Relatively low luminance may be caused by a blockage, e.g., on the lens 20 of the illuminator 14, and/or by failure of the illuminator 14, e.g., LED failure. As an example, the computer 16 is programmed to compare the luminance of the illuminator 14 with a threshold. Specifically, the processor may be programmed to use a statistical process control and/or tracking method to compare and identify changes in the luminance. The imaging at no illumination and full illumination and the calculation of the luminance of the illuminator 14 based on the optical property may be repeated for varying scenes over time to determine a shift. The processor may also cross-reference the specific object 18 with a database, e.g., from an HD map, to account for changes, e.g., new lane markers, or degradation over time.
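  • One simple statistical-tracking approach consistent with the comparison described above is sketched below; the smoothing factor and threshold fraction are assumed tuning parameters, not values specified by the patent:

```python
# Sketch: track the calculated illuminator luminance with an exponentially
# weighted moving average and flag a drop below a fraction of the clean-lens
# baseline. Parameters are illustrative assumptions.
class LuminanceMonitor:
    def __init__(self, baseline: float, alpha: float = 0.2, fraction: float = 0.7):
        self.baseline = baseline      # expected luminance with a clean lens
        self.alpha = alpha            # EWMA smoothing factor
        self.fraction = fraction      # below this fraction of baseline => flag
        self.ewma = baseline

    def update(self, measured: float) -> bool:
        """Fold in a new luminance estimate; return True if it is too low."""
        self.ewma = self.alpha * measured + (1.0 - self.alpha) * self.ewma
        return self.ewma < self.fraction * self.baseline
```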
  • The computer 16 is programmed to adjust the system based on the luminance of the illuminator 14 being lower than expected and/or needed. For example, the computer 16 is programmed to adjust at least one of the illuminator 14, the image sensor 12, and the computer 16 when the luminance is below the threshold. As an example, the adjustment may be an adjustment of the illuminator 14 by cleaning a lens 20 of the illuminator 14. For example, fluid such as cleaning liquid and/or air may be sprayed at the lens 20 of the illuminator 14 to clean the lens 20. The processor may be programmed to instruct a cleaning device 24 to clean the lens 20 in such a case. The processor may be programmed to verify that the lens 20 is clean by repeating the calculation of the luminance described above. Other examples of adjusting the system may include logging the results for future use, scheduling maintenance (including instructing the vehicle 10 to drive to a service provider for maintenance), disabling the system (e.g., disabling the image sensor 12 and/or illuminator 14), and/or modifying sensor fusion and perception algorithms/logic to account for a lower luminance. In examples where the lens 20, 22 is shared by the image sensor 12 and the illuminator 14, the entire lens 20, 22 may be cleaned, or only a portion of the lens 20, 22 through which the illuminator 14 is aimed may be cleaned. As another example, the image sensor 12, e.g., in examples in which the image sensor 12 is a camera, may take longer exposures to obtain an improved-quality image with sufficient image exposure, assuming that the degradation is limited and the dark-current noise of the image sensor 12 does not dominate in long exposures.
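  • The clean-then-verify sequence described above may be sketched as follows; clean_lens(), recompute_luminance(), and schedule_maintenance() are hypothetical hooks standing in for the cleaning device 24, the calculation above, and the vehicle's service logic:

```python
# Sketch: after a below-threshold determination, request a cleaning cycle,
# recompute the illuminator luminance, and escalate if it is still low.
from typing import Callable

def adjust_for_low_luminance(threshold: float,
                             clean_lens: Callable[[], None],
                             recompute_luminance: Callable[[], float],
                             schedule_maintenance: Callable[[], None]) -> bool:
    """Return True if cleaning restored the luminance above the threshold."""
    clean_lens()                         # e.g., spray fluid/air at lens 20
    if recompute_luminance() >= threshold:
        return True                      # verified clean
    schedule_maintenance()               # e.g., log result and route to service
    return False
```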
  • A method 500 of operating the examples shown in FIGS. 1-4 is shown in FIG. 5. The computer 16 may be programmed to perform the method shown in FIG. 5.
  • With reference to block 505, the method 500 includes initiating the steps to calculate the luminance of the illuminator 14, i.e., triggering the system and method 500. Block 505 may include determining, based on inputs, that the steps to calculate the luminance should be initiated and/or receiving instructions to initiate. For example, block 505 may include calculating or receiving a distance-traveled interval, a time interval, or some image feature or change thereof and initiating the system and method 500 based on that information. For example, the method 500 in block 505 may include determining the image quality of the image sensor 12 by known methods, i.e., known algorithms, and the results of such an image algorithm may be tracked over time and/or compared to a baseline. For example, the method may include tracking the image quality over time using a known statistical process control and/or tracking method. As another example, the method may include cross-referencing a high definition (HD) map to identify known objects 18 and initiating based on proximity to approaching objects 18 on the HD map.
  • With reference to blocks 510 and 515, the method includes imaging the scene around the vehicle 10. Specifically, the method includes varying illuminator light levels. In the examples in blocks 510 and 515, the method includes imaging the scene with no illumination from the illuminator 14 (block 510) and with full illumination from the illuminator 14 (block 515). In other words, block 510 includes imaging the scene with the image sensor 12, and block 515 includes both illuminating the scene with the illuminator 14 and imaging the scene with the image sensor 12. In addition, or in the alternative, the method may include imaging the scene at levels between 0% and 100%. The method may include imaging at low vehicle speed or when the vehicle 10 is stopped. As another example, the method may include fusing multiple images together to avoid errors due to the shift in the image during movement of the vehicle 10. Illuminating the scene includes illuminating one or more objects 18 in the scene, and imaging the scene includes imaging the objects 18.
  • The method includes determining the geometry of the object 18 (block 520) and determining that the object 18 has a known optical property (block 525). This may be based on the geometry determined from the image of the object 18, i.e., the image taken at block 510 and/or the image taken at block 515. Specifically, the method at block 520 may include calculating and/or receiving a measurement of the distance between the object 18 and the illuminator 14 and/or image sensor 12, the geometry of the object 18, the orientation of the object 18 relative to the illuminator 14 and/or image sensor 12, the relative position from the illuminator 14 and/or image sensor 12, and/or other information. The method at block 520 and/or block 525 includes interpreting the image of the scene taken by the image sensor 12 with one or more other sensors, prior knowledge, and/or algorithms to construct an approximate model of the scene or at least of one or more objects 18 imaged, as described above. For example, the computer 16 is programmed to determine the geometry of the object 18 and to identify the object 18 and/or determine the type of the object 18 based on the geometry.
  • The method at block 525 includes identifying the object 18 and/or determining the type of the object 18 based on the image of the object 18. The method may include determining the type of the object 18 based at least on the shape of the object 18. As one example, the model of the scene and the ways of constructing the model described above may identify the object 18 and/or determine the type of the object 18. As another example, the object 18 may be identified by the use of an HD map along with location identification of the vehicle 10, i.e., the location of the vehicle 10 on the HD map. For example, the HD map may identify an object 18, and the proximity of the vehicle 10 to the object 18 may be known, so that the system may image the scene when the object 18 is in the field of view of the image sensor 12.
  • With reference to block 530, the method includes determining the optical property of the type of the object 18 after identification of the object 18 and/or determination of the type as described above. As an example, the method includes determining the optical property of the object 18 or the type of the object 18 from a database, as described above. For example, the method may include accessing a lookup table, e.g., on the memory of the computer 16, that includes optical properties for various types of objects 18. As another example, the method may include imaging the scene when in the vicinity of an object 18 based on the HD map as described above, identifying the type of the object 18 in the image as the type identified in the HD map, and accessing the optical property of that object 18 from the HD map. As another example, the method may include accessing the optical property by V2X communication as described above.
  • With reference to block 535, the method includes calculating the luminance of the illuminator 14 based on the optical property (i.e., based on the optical property of the object 18 and/or of the type of the object 18), the image at no illumination, and the image at full illumination. In addition, the method may include determining the distance between the object 18 and the illuminator 14 and/or the orientation of the object 18 relative to the illuminator 14 and calculating the luminance of the illuminator 14 based at least on the distance and/or orientation. The method of calculating the luminance may include implementation of the calculation set forth above.
  • The method may include calculating the luminance based on a sub-region of the image in which the object 18 with known geometry and optical properties is segmented and analyzed through use of the equation set forth above. The intensity of that region may be analyzed. If a large variation is found, then the object 18 may be further sub-divided.
  • With reference to decision block 540, the method includes determining whether the luminance of the illuminator 14 is lower than expected and/or needed. As an example, the method includes comparing the luminance of the illuminator 14 (as calculated above) with a threshold. Specifically, the method may compare and identify changes in the luminance by using statistical process control and/or tracking. The method may include repeating the imaging at no illumination and full illumination and the calculating of the luminance of the illuminator 14 based on the optical property for varying scenes over time to determine a shift. The method may include cross-referencing the specific object 18 with a database, e.g., from an HD map, to account for changes, e.g., new lane markers, or degradation over time.
  • With reference to block 545, the method includes adjusting the system based on the luminance of the illuminator 14 being lower than expected and/or needed. For example, the method includes adjusting at least one of the illuminator 14, the image sensor 12, and the computer 16 when the luminance is below the threshold. As an example, the method includes cleaning a lens 20 of the illuminator 14, e.g., spraying fluid such as cleaning liquid and/or air at the lens 20 of the illuminator 14 to clean the lens 20. In such a case, the method may include verifying that the lens 20 is clean by repeating the calculation of the luminance described above. Other examples of adjusting the system may include logging the results for future use, scheduling maintenance, and/or modifying sensor fusion and perception algorithms/logic to account for a lower luminance. As another example, the image sensor 12 may take longer exposures to obtain an improved-quality image with sufficient image exposure, assuming that the degradation is limited and the dark-current noise of the image sensor 12 does not dominate in long exposures. As another example, the image sensor 12, e.g., in examples in which the image sensor 12 is a camera, may take multiple varying exposures to obtain a high dynamic range image with sufficient image intensity range.
  • With regard to the process 500 described herein, it should be understood that, although the steps of such process 500 have been described as occurring according to a certain ordered sequence, such process 500 could be practiced with the described steps performed in an order other than the order described herein. It further should be understood that certain steps could be performed simultaneously, that other steps could be added, or that certain steps described herein could be omitted. In other words, the description of the process 500 herein is provided for the purpose of illustrating certain embodiments and should in no way be construed so as to limit the disclosed subject matter.
  • Computing devices, such as the computer 16, generally include computer-executable instructions, where the instructions may be executable by one or more computing devices such as those listed above. Computer-executable instructions may be compiled or interpreted from computer programs created using a variety of programming languages and/or technologies, including, without limitation, and either alone or in combination, Java™, C, C++, Visual Basic, JavaScript, Python, Perl, etc. Some of these applications may be compiled and executed on a virtual machine, such as the Java Virtual Machine, the Dalvik virtual machine, or the like. In general, a processor (e.g., a microprocessor) receives instructions, e.g., from a memory, a computer-readable medium, etc., and executes these instructions, thereby performing one or more processes, including one or more of the processes described herein. Such instructions and other data may be stored and transmitted using a variety of computer-readable media.
  • A computer-readable medium (also referred to as a processor-readable medium) includes any non-transitory (e.g., tangible) medium that participates in providing data (e.g., instructions) that may be read by a computer (e.g., by a processor of a computer). Such a medium may take many forms, including, but not limited to, non-volatile media and volatile media. Non-volatile media may include, for example, optical or magnetic disks and other persistent memory. Volatile media may include, for example, dynamic random access memory (DRAM), which typically constitutes a main memory. Such instructions may be transmitted by one or more transmission media, including coaxial cables, copper wire and fiber optics, including the wires that comprise a system bus coupled to a processor of a computer. Common forms of computer-readable media include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, any other magnetic medium, a CD-ROM, DVD, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EEPROM, any other memory chip or cartridge, or any other medium from which a computer can read.
  • In some examples, system elements may be implemented as computer-readable instructions (e.g., software) on one or more computing devices (e.g., servers, personal computers, computing modules, etc.), stored on computer readable media associated therewith (e.g., disks, memories, etc.). A computer program product may comprise such instructions stored on computer readable media for carrying out the functions described herein.
  • The disclosure has been described in an illustrative manner, and it is to be understood that the terminology which has been used is intended to be in the nature of words of description rather than of limitation. Many modifications and variations of the present disclosure are possible in light of the above teachings, and the disclosure may be practiced otherwise than as specifically described.

Claims (20)

1. A vehicle comprising:
an image sensor having a field of view;
an illuminator aimed at the field of view; and
a computer including a processor and a memory storing instructions executable by the processor to:
illuminate an object external to the vehicle;
determine that the object has a known optical property;
determine the optical property of the object from a database;
calculate luminance of the illuminator based at least on the optical property of the object; and
adjust at least one of the illuminator, the image sensor, and the computer based at least on the luminance of the illuminator.
2. The vehicle as set forth in claim 1, wherein the memory stores further instructions executable to adjust the illuminator by cleaning a lens of the illuminator based at least on the luminance of the illuminator.
3. The vehicle as set forth in claim 2, wherein the memory stores further instructions executable to spray fluid at the lens to clean the lens.
4. The vehicle as set forth in claim 1, wherein the memory stores further instructions executable to compare the luminance of the illuminator with a threshold and to adjust at least one of the illuminator, the image sensor, and the computer when the luminance is below the threshold.
5. The vehicle as set forth in claim 1, wherein the memory stores further instructions executable to determine the geometry of the object and to determine a type of the object based on the geometry.
6. The vehicle as set forth in claim 1, wherein the memory stores further instructions executable to determine the shape of the object and to calculate the luminance of the illuminator based at least on the shape.
7. The vehicle as set forth in claim 1, wherein the memory stores further instructions executable to determine the distance between the object and the illuminator and/or the orientation of the object relative to the illuminator and to calculate the luminance of the illuminator based at least on the distance and/or orientation.
8. The vehicle as set forth in claim 1, wherein the memory stores further instructions executable to capture an image of the object during the illumination.
9. A system, comprising a computer including a processor and a memory, the memory storing instructions executable by the processor to:
illuminate an object external to a vehicle with an illuminator;
determine that the object has a known optical property;
determine the optical property of the object from a database;
calculate luminance of the illuminator based at least on the optical property of the object; and
clean a lens of the illuminator based at least on the luminance of the illuminator.
10. The system as set forth in claim 9, wherein the memory stores further instructions executable to spray fluid at the lens to clean the lens.
11. The system as set forth in claim 9, wherein the memory stores further instructions executable to compare the luminance of the illuminator with a threshold and to clean the lens of the illuminator when the luminance is below the threshold.
12. The system as set forth in claim 9, wherein the memory stores further instructions executable to determine the geometry of the object and to determine a type of the object based on the geometry of the object.
13. The system as set forth in claim 9, wherein the memory stores further instructions executable to determine the shape of the object relative to the illuminator and to calculate the luminance of the illuminator based at least on the shape.
14. The system as set forth in claim 9, wherein the memory stores further instructions executable to determine the distance between the object and the illuminator and/or the orientation of the object relative to the illuminator and to calculate the luminance of the illuminator based at least on the distance and/or orientation.
15. A method comprising:
illuminating an object;
determining that the object has a known optical property;
determining the optical property of the object from a database;
calculating luminance of the illuminator based at least on the optical property of the object; and
adjusting at least one of the illuminator, an image sensor, and a computer based at least on the luminance of the illuminator.
16. The method as set forth in claim 15, wherein adjusting the illuminator includes cleaning a lens of the illuminator.
17. The method as set forth in claim 15, wherein determining a type of the object includes determining the geometry of the object.
18. The method as set forth in claim 15, further comprising comparing the luminance of the illuminator with a threshold and cleaning the illuminator when the luminance is below the threshold.
19. The method as set forth in claim 15, further comprising determining the shape of the object and calculating the luminance of the illuminator based at least on the shape.
20. The method as set forth in claim 15, further comprising determining the distance between the object and the illuminator and/or the orientation of the object relative to the illuminator and calculating the luminance of the illuminator based at least on the distance and/or orientation.
US16/287,672 2019-02-27 2019-02-27 Determination of illuminator obstruction by known optical properties Active US10771665B1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US16/287,672 US10771665B1 (en) 2019-02-27 2019-02-27 Determination of illuminator obstruction by known optical properties
CN202010118464.8A CN111629128A (en) 2019-02-27 2020-02-26 Determination of luminaire obstacles by known optical properties
DE102020105059.3A DE102020105059A1 (en) 2019-02-27 2020-02-26 DETERMINATION OF THE BLOCKING OF A LIGHTING DEVICE BY KNOWN OPTICAL CHARACTERISTICS

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US16/287,672 US10771665B1 (en) 2019-02-27 2019-02-27 Determination of illuminator obstruction by known optical properties

Publications (2)

Publication Number Publication Date
US20200274998A1 true US20200274998A1 (en) 2020-08-27
US10771665B1 US10771665B1 (en) 2020-09-08

Family

ID=72139116

Family Applications (1)

Application Number Title Priority Date Filing Date
US16/287,672 Active US10771665B1 (en) 2019-02-27 2019-02-27 Determination of illuminator obstruction by known optical properties

Country Status (3)

Country Link
US (1) US10771665B1 (en)
CN (1) CN111629128A (en)
DE (1) DE102020105059A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20230080085A1 (en) * 2021-09-10 2023-03-16 Aptiv Technologies Limited Driver vision assistance systems and methods
US20230176205A1 (en) * 2021-12-06 2023-06-08 Primax Electronics Ltd. Surveillance monitoring method

Citations (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020106109A1 (en) * 2000-08-12 2002-08-08 Retterath James E. System for road sign sheeting classification
US20030107669A1 (en) * 2001-12-07 2003-06-12 Akira Ito Image pick-up device and portable electronic device having the same
WO2004007255A2 (en) * 2002-07-16 2004-01-22 Trw Limited Rain detection apparatus and method
EP1498721A1 (en) * 2003-07-15 2005-01-19 ELMOS Semiconductor AG Device for recognition of fog, especially for a vehicle
JP2006078452A (en) * 2004-09-13 2006-03-23 Asahi Breweries Ltd Thermoplastic adhesive inspecting device and inspection method
US20150069224A1 (en) * 2013-09-06 2015-03-12 Ricoh Company, Ltd. Light guide member, object detection apparatus, and vehicle
US20150145956A1 (en) * 2012-07-27 2015-05-28 Nissan Motor Co., Ltd. Three-dimensional object detection device, and three-dimensional object detection method
US20150161457A1 (en) * 2012-07-27 2015-06-11 Nissan Motor Co., Ltd. Three-dimensional object detection device, and three-dimensional object detection method
US20150169967A1 (en) * 2012-07-03 2015-06-18 Clarion Co., Ltd. State recognition system and state recognition method
US20150177512A1 (en) * 2012-07-27 2015-06-25 Nissan Motor Co., Ltd. Camera device, three-dimensional object detection device, and lens cleaning method
US9126534B2 (en) * 2013-03-14 2015-09-08 Ford Global Technologies, Llc Automated camera wash for vehicles
US9319637B2 (en) * 2012-03-27 2016-04-19 Magna Electronics Inc. Vehicle vision system with lens pollution detection
US9445057B2 (en) * 2013-02-20 2016-09-13 Magna Electronics Inc. Vehicle vision system with dirt detection
US20160341848A1 (en) * 2015-05-22 2016-11-24 Satoshi Nakamura Object detection apparatus, object removement control system, object detection method, and storage medium storing object detection program
US20170034459A1 (en) * 2015-07-30 2017-02-02 Motorola Mobility Llc Electronic Device with Image Correction System and Methods Therefor
US9607242B2 (en) * 2015-01-16 2017-03-28 Ford Global Technologies, Llc Target monitoring system with lens cleaning device
US20170180615A1 (en) * 2015-12-18 2017-06-22 The Lightco Inc. Methods and apparatus for providing a camera lens or viewing point indicator
US20170270375A1 (en) * 2014-12-07 2017-09-21 Brightway Vision, Ltd. Object Detection Enhancement of Reflection-Based Imaging Unit
WO2018127789A1 (en) * 2017-01-03 2018-07-12 Innoviz Technologies Ltd. Lidar systems and methods for detection and classification of objects
US20180253609A1 (en) * 2015-07-28 2018-09-06 Apple Inc. System and method for light and image projection
US20190174029A1 (en) * 2016-08-09 2019-06-06 Clarion Co., Ltd. In-vehicle device
US20190202355A1 (en) * 2016-08-08 2019-07-04 Koito Manufacturing Co., Ltd. Vehicle monitoring system using a plurality of cameras
US20190208111A1 (en) * 2017-12-28 2019-07-04 Waymo Llc Multiple Operating Modes to Expand Dynamic Range

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3915742B2 (en) 2003-06-20 2007-05-16 株式会社デンソー Vehicle object recognition device
JP4437750B2 (en) 2005-02-04 2010-03-24 三機工業株式会社 Lamp cleaning system
DE112011102968A5 (en) 2010-11-30 2013-07-04 Conti Temic Microelectronic Gmbh Detecting raindrops on a glass by means of a camera and lighting
JP2012171536A (en) 2011-02-23 2012-09-10 Jvc Kenwood Corp Vehicular lamp lighting state determination apparatus
JP6120395B2 (en) 2012-07-03 2017-04-26 クラリオン株式会社 In-vehicle device
KR101527810B1 (en) 2014-11-27 2015-06-11 한국건설기술연구원 Infrared Image Pickup Device for Visibility of Road and Image Pickup Method Using the Same
US10035498B2 (en) 2015-04-22 2018-07-31 Ford Global Technologies, Llc Vehicle camera cleaning system

Also Published As

Publication number Publication date
US10771665B1 (en) 2020-09-08
DE102020105059A1 (en) 2020-08-27
CN111629128A (en) 2020-09-04

Similar Documents

Publication Publication Date Title
JP7157054B2 (en) Vehicle navigation based on aligned images and LIDAR information
US10649463B2 (en) Navigating a vehicle based on a detected barrier
US20210122364A1 (en) Vehicle collision avoidance apparatus and method
CN108604292B (en) Automatic prediction and lithe response of vehicles to cut lanes
KR102040353B1 (en) Methods and systems for detecting weather conditions using vehicle onboard sensors
US10086844B2 (en) Vehicle sensor diagnosis system and method and a vehicle comprising such a system
KR101030763B1 (en) Image acquisition unit, acquisition method and associated control unit
KR102327997B1 (en) Surround sensing system
US9815462B2 (en) Path determination for automated vehicles
US20180113216A1 (en) Methods Circuits Devices Assemblies Systems and Functionally Associated Machine Executable Code for Active Optical Scanning of a Scene
US11334754B2 (en) Apparatus and method for monitoring object in vehicle
KR101891460B1 (en) Method and apparatus for detecting and assessing road reflections
US9566900B2 (en) Driver assistance system and operating procedure for the latter
US10836356B2 (en) Sensor dirtiness detection
US11514343B2 (en) Simulating degraded sensor data
CN111413688A (en) Weak light sensor cleaning
US11562572B2 (en) Estimating auto exposure values of camera by prioritizing object of interest based on contextual inputs from 3D maps
US10771665B1 (en) Determination of illuminator obstruction by known optical properties
US20230048044A1 (en) Autonomous vehicle, system, and method of operating one or more autonomous vehicles for the pacing, protection, and warning of on-road persons
US11610412B2 (en) Vehicle neural network training
CN112766030A (en) System and method for LED flicker and strip detection
EP4170305A1 (en) System and method for simultaneous online lidar intensity calibration and road marking change detection
CN116434180A (en) Lighting state recognition device, lighting state recognition method, and computer program for recognizing lighting state
WO2023057261A1 (en) Removing non-relevant points of a point cloud

Legal Events

Date Code Title Description
AS Assignment

Owner name: FORD GLOBAL TECHNOLOGIES, LLC, MICHIGAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HERMAN, DAVID MICHAEL;ARUNMOZHI, ASHWIN;KRISHNAN, VENKATESH;SIGNING DATES FROM 20190221 TO 20190225;REEL/FRAME:048459/0146

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCF Information on status: patent grant

Free format text: PATENTED CASE