WO2025040835A1 - Weather model-based automated parametrisation of gated camera apparatus - Google Patents
- Publication number
- WO2025040835A1 (PCT/FI2024/050373)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- landmark
- weather
- determined
- landmarks
- data
- Prior art date
Classifications
- G01S17/18—Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves wherein range gates are used
- G01S17/87—Combinations of systems using electromagnetic waves other than radio waves
- G01S17/894—3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
- G01S17/931—Lidar systems specially adapted for anti-collision purposes of land vehicles
- G01S17/95—Lidar systems specially adapted for meteorological use
- G01S7/4802—Using analysis of echo signal for target characterisation; Target signature; Target cross-section
- G01S7/4808—Evaluating distance, position or velocity data
- G01S7/4863—Detector arrays, e.g. charge-transfer gates
- G01S7/497—Means for monitoring or calibrating
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
- G06V20/64—Three-dimensional objects
Definitions
- the present disclosure relates to gated imaging, and particularly to a method and a system that automatically adjusts operation parameter values of a gated camera apparatus to optimise its performance in variable weather conditions.
- VRU vulnerable road users
- ITS intelligent transport systems
- the term vulnerable road users is defined in the ITS directive determined by the European Union as referring to non-motorized road users, such as pedestrians and cyclists as well as motor-cyclists and persons with disabilities or reduced mobility and orientation.
- When vehicles are designed for autonomous driving, they must be able to reliably detect any vulnerable road users that may cross their planned course, in order to avoid accidents.
- One possible method for detecting vulnerable road users is to use gated imaging.
- Gated imaging refers to using illumination pulses from an illumination source, such as one or more lasers, in combination with a synchronized imaging sensor.
- a gated camera refers to a camera apparatus that performs gated imaging.
- An illumination pulse sent from the one or more illumination sources serves as a source of light to illuminate the scene.
- By receiving reflected light with the imaging sensor during short, predefined time windows at predefined moments after the illumination source emits light, it is possible, by use of neural network processes, to divide the obtained image into so-called slices, which correspond to different distance ranges. Each slice shows objects reflecting the emitted light at a specific distance range from the light source and the imaging sensor.
- gated imaging improves visibility of objects further away from the gated camera and in adverse weather conditions, and even provides some information on distance to those objects.
- Figures 1A to 1C illustrate schematically operation of a gated camera.
- Figure 1A represents a field of view 10 as seen by a normal camera. Visibility of any one of the objects 12, 13, 14 may be limited due to the focal length of the camera, varying lighting conditions, etc. For example, headlights of the car 13 in the background may cause the pedestrian 12, who is closer to the camera, to go unnoticed. Furthermore, this figure gives no clear indication of the distance between the camera and the objects 12, 13, 14.
- Figure 1B illustrates the scene.
- The gated camera with the field of view equal to that shown in Figure 1B is at the origin, and objects are within its field of view, but at different distances r.
- The scene is illuminated with three different illumination pulses 18 with predetermined timing, duration and intensity I.
- Three different images are obtained, each representing a different, determined range of distances (r1, r2, r3) referred to as slices.
- Obtained images representing each one of the slices are shown in Figure 1C.
- Slice 11a shows objects within range r1 nearest to the camera,
- slice 11b shows objects within range r2 further away from the camera, and
- slice 11c shows objects within range r3 furthest away from the camera.
- CMOS complementary metal-oxide-semiconductor
- Distance can be determined based on time of flight of the illumination pulse.
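As a minimal illustration (not part of the claimed method, and with a hypothetical helper name), the time-of-flight relation between round-trip time and distance can be sketched as follows:

```python
# Speed of light in vacuum, m/s.
C = 299_792_458.0

def tof_distance(round_trip_seconds: float) -> float:
    """Distance to the reflecting object: the light travels to the
    object and back, hence the division by two."""
    return C * round_trip_seconds / 2.0

# A pulse returning after 200 ns corresponds to an object ~30 m away.
print(round(tof_distance(200e-9), 2))  # → 29.98
```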
- LiDAR is an acronym for "light detection and ranging" or "laser imaging, detection, and ranging", which refers to a method for determining ranges by targeting objects or surfaces with a laser and measuring the time for the reflected light to return to the receiver.
- a LiDAR enables generating a digital three-dimensional (3D) representation of its field of view.
- a typical LiDAR produces a point cloud comprising information on objects in the field of view.
- A problem in the practical use of gated imaging devices is that different weather conditions require different parameter settings.
- Adverse weather conditions such as rain, snow, hail, drizzle, haze, fog and smog cause increased scattering of emitted light. Scattering characteristics are different depending for instance on the type and size of particles in the air.
- With properly selected parameters, emitted illumination penetrates even adverse weather, and the gated camera can reveal objects at different distance ranges as intended. For example, if power is too high, dense fog, rain or snow causes overexposure of the image by light scattering from the fog, rain, or snow. If the power is too low, only some of the slices get enough laser power for clear visibility, while other slices get too little.
- Gated camera manufacturers have carried out extensive research to find proper parameter values that mitigate the effects of adverse weather conditions.
- a known solution used in commercially available gated cameras is that the user manually selects a setting to be applied among a plurality of different preprogrammed settings, each comprising predefined parameter settings optimized for a particular weather condition.
- Parameter settings of the gated camera are different for each slice.
- Typical key parameters set for each slice comprise intensity of emitted light, number of pulses to be sent, duration of illumination period per pulse, and a time window for receiving reflected illumination after sending the pulse.
- the time window can be determined for example as a predetermined shutter opening time instance after the illumination pulse is sent, and a duration of shutter opening period.
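The relation between a slice's distance range and the shutter time window described above can be sketched in Python. This is an illustrative sketch under the stated timing model, not the claimed implementation; the function name is hypothetical:

```python
# Speed of light in vacuum, m/s.
C = 299_792_458.0

def gate_window(r_min_m: float, r_max_m: float, pulse_s: float):
    """Shutter opening and closing times (seconds after pulse emission)
    so that only reflections from [r_min, r_max] are captured.
    Light reflected at distance r returns after 2*r/C; the pulse
    duration is added so the trailing edge of the pulse from the far
    edge of the slice is also received."""
    t_open = 2.0 * r_min_m / C
    t_close = 2.0 * r_max_m / C + pulse_s
    return t_open, t_close

# Slice covering 50-100 m, illuminated with a 10 ns pulse:
t_open, t_close = gate_window(50.0, 100.0, 10e-9)
```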
- Document US 11194043 B2 discloses using a radar for weather detection. Vehicle systems, like lighting, windscreen wipers or cruise control are dynamically controlled and actuated based on determined weather condition.
- Document US 2020/0166649 Al discloses a method and a system using a LIDAR sensor to determine presence of solid objects and adverse weather conditions.
- A problem with existing gated camera apparatuses is that parameter settings need to be manually selected based on visually determined weather conditions. Furthermore, compliance with eye-safety regulations must be ensured.
- An object of the present disclosure is to provide a method and an apparatus for implementing the method so as to enable automated selection of gated camera parameter settings.
- the disclosure is based on the idea of using an existing sensor in the vehicle, namely a LiDAR, to detect predetermined landmarks with accurately known geographic positions and to automatically determine weather conditions based on characteristics of LiDAR data. Determined weather conditions are then used to determine parameter settings of the gated camera.
- a method for determining operation parameter values of a gated camera apparatus configured to obtain image data at a plurality of different distance ranges.
- the method comprises obtaining LiDAR data and processing said LiDAR data for generating a point cloud, locating one or more landmarks in the point cloud, and identifying said one or more landmarks on basis of at least determined geographic location thereof.
- the method further comprises comparing a respective portion of the obtained LiDAR data associated with each one of the one or more identified landmarks currently appearing in the point cloud with a plurality of weather models associated with the respective identified landmark.
- the weather model comprises a plurality of weather conditions characterized by means of values of one or more parameters indicating effect of weather-based interference levels in the respective portion of the obtained LiDAR data associated with the respective landmark.
- the plurality of weather models comprises one or more parameters indicating effect of weather-based interference levels.
- the parameters comprise at least one of determined reflectivity values associated with the landmark, and number of points in the point cloud associated with or representing the landmark.
- said parameters indicating weather-based interference levels are arranged as data sets, wherein each data set comprises respective parameter values as a function of distance to the respective landmark.
- said comparing comprises comparing a plurality of said portions of the obtained LiDAR data determined at different distances to the landmark to a respective plurality of parameter values determined as a function of distance to the respective landmark.
- said comparing comprises determining a relative change of a parameter value between the portion of LiDAR data obtained at two different distances to the landmark, determining a relative change of the respective parameter value between same two different distances in the plurality of data sets of the weather model, and comparing determined amounts of relative change of parameter values to find the best matching weather model data set to be determined as the current weather condition.
- said identifying the landmark is further based on type of the landmark.
- weather model data is stored in a weather model database.
- the weather model data is based on a plurality of field measurements performed in each of the plurality of different weather conditions. According to some aspects, said operation parameter values are determined individually for each applied distance range.
- said operation parameters of the gated camera apparatus comprise light source power parameters, impulse parameters, and gating parameters.
- light source power parameters comprise light source power for each individual light pulse, wherein impulse parameters comprise one or more of: number of light pulses to be emitted, duration of each light pulse, and time interval or intervals between consecutive light pulses, and wherein gating parameters comprise: timing of shutter opening periods, wherein the timing is determined with respect to emitted light pulses, and timing determines shutter opening time and at least one of shutter opening period or shutter closing time.
- said identifying said one or more landmarks based on their geographic location is based on i) comparing a calculated location of the landmark with a map database comprising known geographic locations of a plurality of landmarks, ii) optionally determining a type of the landmark based on visual appearance thereof, and iii) selecting the identity of the landmark based on correlation between the calculated location and the location in the map database, optionally based further on determining that the type of the landmark determined from its visual appearance matches the type of the landmark stored in the map database.
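Steps i)–iii) above can be sketched as a simple lookup. This is a hedged illustration only: the map database contents, the planar coordinates, the 0.5 m tolerance, and the function name are all hypothetical (a real system would use accurate RTK-GNSS geographic coordinates):

```python
import math

# Hypothetical map database: landmark identifier -> position and type.
MAP_DB = {
    200: {"pos": (120.0, 45.0), "type": "traffic_sign"},
    201: {"pos": (135.0, 47.5), "type": "traffic_sign"},
}

def identify_landmark(calc_pos, detected_type=None, tolerance_m=0.5):
    """Match a calculated landmark position against the map database
    (step i), optionally confirming the detected type (steps ii-iii).
    Returns the identifier of the closest landmark within tolerance,
    or None when no entry matches."""
    best_id, best_dist = None, tolerance_m
    for lm_id, entry in MAP_DB.items():
        d = math.dist(calc_pos, entry["pos"])
        if d <= best_dist and (detected_type is None
                               or detected_type == entry["type"]):
            best_id, best_dist = lm_id, d
    return best_id

print(identify_landmark((120.1, 45.05), "traffic_sign"))  # → 200
```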
- a system for determining operation parameter values of a gated camera apparatus configured to obtain image data at a plurality of different distance ranges.
- the system comprises a LiDAR apparatus configured to obtain LiDAR data and to process said LiDAR data for generating a point cloud, a landmark locating module configured to locate one or more landmarks in the point cloud, a landmark identification module configured to identify said one or more landmarks on basis of at least determined geographic location thereof, and a weather analysis module.
- the weather analysis module is configured to compare a respective portion of the obtained LiDAR data associated with each one of the one or more identified landmarks currently appearing in the point cloud with a plurality of weather models associated with the respective identified landmark, and based on said comparing, to determine a current weather condition.
- the system also comprises a gated camera controller module configured to determine operation parameter values of the gated camera apparatus for each of the plurality of different distance ranges based on the determined current weather condition.
- Figures 1A to 1C illustrate operation principle of a gated camera
- Figure 2 illustrates an aerial view of an area
- Figure 3 illustrates a view with detected landmarks
- Figure 4 illustrates average detected intensity of light reflected from a landmark
- Figure 5 illustrates point profiles that show number of points representing a landmark
- Figures 6A to 6D illustrate graphically sets of weather model data values in the weather model database
- Figure 7 illustrates functional elements of a system.
- Figure 2 illustrates an aerial view with landmarks 100 marked with white circles.
- Geographic location of each landmark 100 is known and available in a map database.
- map databases are used by vehicles, such as autonomous vehicles, for navigation.
- Geographic positions of landmarks 100 are preferably known with high accuracy.
- RTK real time kinematics
- GNSS global navigation satellite system
- achievable location accuracy is at the level of centimetres.
- Figure 3 illustrates view from a vehicle showing a plurality of identified landmarks, which are in this case traffic signs.
- Each traffic sign is identified by a unique identifier (196, 198, 199, 200, 201, 202) and marked with a white rectangle known as a bounding box.
- This illustrative image has been taken with a camera during a rain shower. Although this image has been captured with a camera from behind the windscreen of a vehicle, it can be used to illustrate the situation as experienced by a LiDAR carried by and operating in the vehicle. Raindrops reduce visibility of the landmarks.
- Identification of landmarks may be based on determining their geographic location on basis of measured location of the vehicle and relative location between the vehicle and the landmark, and comparing this geographic location to a map database comprising known geographic locations and identifiers of landmarks. When the calculated location matches with the predetermined location in the map database, the landmark identifier can be obtained from the map database.
- When a LiDAR is used for detecting landmarks, it provides information on the reflectivity of the landmark based on the intensity of the received light reflected by the landmark.
- Figure 4 illustrates average detected intensity of reflected light measured by a LiDAR of a vehicle at selected distances from the landmark in a few different weather conditions.
- the identified landmark is the traffic sign 200 shown in the figure 3.
- Intensity can be expressed as an absolute value or as a relative value.
- maximum intensity value is determined to be 255.
- In dry conditions (white bars), detected intensity is particularly strong at certain distances, since the traffic sign is covered with reflective material that has high reflectivity in certain directions, while reflection in other directions is slightly weaker.
- points are calculated based on number of points within a bounding box comprising the landmark.
- the bounding box is typically a rectangular box with outer bound determined by detected outer edges of the landmark.
- points are calculated based on number of points representing the landmark itself within the bounding box.
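The point-count parameter described above can be sketched as counting the point-cloud points inside the landmark's bounding box. A minimal illustration with hypothetical names and data, assuming an axis-aligned 3-D box:

```python
def points_in_bounding_box(point_cloud, bbox):
    """Count LiDAR points inside an axis-aligned 3-D bounding box
    enclosing the landmark. point_cloud is an iterable of (x, y, z)
    tuples; bbox is ((xmin, ymin, zmin), (xmax, ymax, zmax))."""
    (xmin, ymin, zmin), (xmax, ymax, zmax) = bbox
    return sum(
        1
        for x, y, z in point_cloud
        if xmin <= x <= xmax and ymin <= y <= ymax and zmin <= z <= zmax
    )

# Two of the three points fall inside the box around the landmark.
cloud = [(1.0, 1.0, 2.0), (1.2, 0.9, 2.1), (5.0, 5.0, 5.0)]
bbox = ((0.5, 0.5, 1.5), (1.5, 1.5, 2.5))
print(points_in_bounding_box(cloud, bbox))  # → 2
```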
- Figures 6A, 6B, 6C and 6D illustrate average measured intensity and number of detected points of a landmark at different distances from the landmark.
- In these figures, dry reference weather is compared to adverse weather conditions caused by snowfall. Since the LiDAR is capable of measuring 360°, information obtained from the back side of the landmark can also be used for determining weather conditions.
- Figures 6A and 6B show exemplary values measured at different distances from the front side of a landmark (a traffic sign) in dry reference weather (white bars) and during snowfall (black bars), when the landmark, in this case a traffic sign, is at least partially covered with snow. Because the reflective surface of the traffic sign is covered with snow, relative intensity of reflected light is different from what would be expected, if the traffic sign was clean from snow cover.
- the figure 6A illustrates average intensity of reflected light at a few exemplary distances and the figure 6B illustrates number of points representing the landmark at exemplary distances.
- Figures 6C and 6D show exemplary values measured at different distances from the back side of a landmark (a traffic sign) in dry reference weather (white bars) and during snowfall (black bars). It can be noticed that snowfall has a significant effect on the capability of detecting the landmark from the back side; at greater distances, the landmark was not detected at all, and no measurement results were received for either intensity or number of points, while at shorter distances, the number of points during snowfall is at the same level as in the dry reference weather conditions, but the intensity of the reflected light is significantly affected by the snowfall.
- Figures 4, 5 and 6A to 6D demonstrate that by combining results of intensity and point-count measurements, weather profiles can be determined more accurately than by relying on just one of these parameters.
- While figures 6A to 6D represent, for illustration purposes, just a few selected detection distances, the weather model is not limited to these detection distances; any number of detection distances at any selected intervals between them can be used in the weather model.
- a weather model can be established for each landmark.
- the weather model comprises weather model data.
- the weather model is provided in form of a weather model database.
- weather model data comprises, for each landmark and each weather condition, a plurality of weather model data values for reflectivity and number of points as a function of distance.
- Weather model data values can be considered as predefined reference values to which measured LiDAR data is compared.
- Weather model data values are preferably determined based on field measurements performed by a LiDAR in known weather conditions.
- Weather model data values are preferably provided as a function of distance to the landmark in a plurality of different weather conditions. Measured weather model data values may be arranged as a table in the weather model database. The number of weather model data sets per landmark is not limited, but weather models are preferably determined for all kinds of weather conditions, such as different amounts and types of rainfall, snowfall, hail, drizzle, fog and smog, if applicable at the geographic location of the landmark. Automatic comparison of obtained LiDAR data and weather model data sets associated with identified landmarks enables using more detailed weather models than what would be possible by visual determination of weather conditions and manual selection of gated camera parameter settings. For example, the same amount of rainfall per hour may cause different scattering characteristics depending on the size of raindrops or the type of snow.
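One possible in-memory layout of such a table is sketched below. All keys, names and numeric values are hypothetical placeholders standing in for field-measured reference data; the source only specifies that reference intensity and point-count values are tabulated per landmark, per weather condition, as a function of distance:

```python
# Hypothetical weather model data for one landmark: per weather
# condition, reference parameter values keyed by distance in metres.
WEATHER_MODEL = {
    ("landmark_200", "dry"): {
        10: {"intensity": 240, "points": 55},
        20: {"intensity": 210, "points": 30},
        30: {"intensity": 180, "points": 14},
    },
    ("landmark_200", "snowfall"): {
        10: {"intensity": 150, "points": 50},
        20: {"intensity": 95, "points": 22},
        30: {"intensity": 40, "points": 5},
    },
}

def reference_values(landmark: str, weather: str, distance_m: int) -> dict:
    """Look up reference parameter values for a landmark/weather pair
    at a given distance slot."""
    return WEATHER_MODEL[(landmark, weather)][distance_m]
```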
- All LiDAR measurement-based weather model data is preferably stored in a weather model database together with information on known weather conditions at the time of performing the respective measurements.
- FIG. 7 illustrates functional elements of a system for implementing some embodiments of the disclosure.
- a LiDAR 50 carried by a vehicle obtains LiDAR data.
- a landmark locating module 51 is configured to locate landmarks in the LiDAR data and a landmark identification module 52 is configured to identify located landmarks.
- the landmark locating module 51 and the landmark identification module 52 are preferably implemented as software modules. They may be part of the LiDAR system or provided by an existing on-board unit (OBU) of the vehicle, because landmark location and identification information may also be used for navigation and autonomous driving.
- OBU on-board unit
- the landmark identification module 52 may be configured to identify landmarks based on matching their location with a map database. According to some embodiments, the portion of LiDAR data representing a landmark is further processed by a type determining module that is configured to classify detected landmarks based on their type, and this landmark classification is used as additional data for improving landmark identification by the landmark identification module 52.
- a plurality of consecutive LiDAR measurements is made so that when the vehicle carrying the LiDAR moves, a plurality of observations on each detected and identified landmark are provided at different distances thereto.
- LiDAR data is provided to a weather analysis module 53 together with landmark identification of respective landmarks.
- the weather analysis module 53 is preferably a software module. Based on the landmark identification, the weather analysis module 53 obtains respective weather model data stored in the weather model database 60 and compares obtained LiDAR data of the respective landmark thereto, to determine the prevailing weather condition.
- Weather model data preferably comprises a plurality of data sets for each landmark comprised in the weather model data.
- each data set in the weather model represents a plurality of parameter values determined from a portion of LiDAR data obtained during a data collection phase in a known weather condition.
- Each data set comprises parameters indicating effect of weather-based interference levels in a portion of LiDAR data associated with the respective landmark.
- parameter values are provided in the data set as a function of distance to the respective landmark.
- the determined weather condition is used as input for a gated camera controller 54, which automatically selects parameter settings of the gated camera 55 based on the weather type information.
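The controller step above amounts to selecting a per-slice parameter set keyed by the determined weather condition. A hedged sketch follows; the profile names and every numeric value are invented placeholders, not values from the disclosure:

```python
# Hypothetical per-slice gated camera settings keyed by weather
# condition; all numbers are illustrative only.
GATED_CAMERA_SETTINGS = {
    "dry": {
        "slice_1": {"pulse_power_w": 40, "n_pulses": 100, "pulse_ns": 10},
        "slice_2": {"pulse_power_w": 60, "n_pulses": 150, "pulse_ns": 10},
    },
    "snowfall": {
        "slice_1": {"pulse_power_w": 25, "n_pulses": 180, "pulse_ns": 8},
        "slice_2": {"pulse_power_w": 35, "n_pulses": 250, "pulse_ns": 8},
    },
}

def select_settings(weather_condition: str) -> dict:
    """Return per-slice parameter settings for the determined weather,
    falling back to the dry profile when the condition is unknown."""
    return GATED_CAMERA_SETTINGS.get(weather_condition,
                                     GATED_CAMERA_SETTINGS["dry"])

settings = select_settings("snowfall")
```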
- the weather model database comprises a plurality of data sets, each covering a plurality of one-meter slots for at least one weather-based interference-indicating parameter.
- selected parameters may be the average reflectivity and number of points discussed above.
- other or further types of weather-interference indicating parameters may be applicable.
- Range of stored weather model data values may extend from a few meters, for example 1, 2, 3, 4, or 5 meters to some tens of meters, for example 20, 25 or 30 meters from the respective landmark.
- the weather analysis module 53 compares measured LiDAR data at different distances from the identified landmark to respective parameter values in weather model data stored in the weather model database. Comparing more than one type of data at more than one distance improves the certainty that the best match is selected as the determined current weather condition by the weather analysis module.
- the weather analysis module may select a few candidate weather conditions from the weather model data based on one set of LiDAR data, and based on these candidate weather conditions, expected values of LiDAR data at the next available distance are compared to measured values, and one or more of the best matching candidate weather conditions is selected. This selection can be repeated, iteratively rejecting a portion of weather conditions, until just one remains, which is then used as the determined weather condition.
- relative parameter values between two LiDAR measurements are considered rather than absolute parameter values. For example, changes between pairs of two different measured parameter values determined based on the portion of the LiDAR data associated with the same landmark and obtained at a plurality of distances x, x-1, x-2, ... x-n meters are determined. Change can be determined in relative terms, e.g. that the parameter value at x meters is 0.933088 times the parameter value at x-1 meters. These changes determined based on LiDAR data are compared to respective changes between parameter values associated with the same pairs of distances in the plurality of weather model data sets. This kind of relative comparison avoids possible problems caused by the use of different LiDAR apparatuses by different vehicles.
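The relative-change comparison can be sketched as follows. This is an illustrative least-squares matcher, not the claimed implementation; the function names and all numeric values are hypothetical:

```python
def relative_changes(values_by_distance: dict) -> list:
    """Ratios between parameter values at consecutive distances,
    i.e. value(x) / value(x-1) for each adjacent distance pair."""
    ds = sorted(values_by_distance)
    return [values_by_distance[far] / values_by_distance[near]
            for near, far in zip(ds, ds[1:])]

def best_matching_weather(measured: dict, model_sets: dict) -> str:
    """Select the weather condition whose model data set shows
    relative changes closest (in a least-squares sense) to the
    relative changes in the measured LiDAR data."""
    meas = relative_changes(measured)

    def score(name):
        ref = relative_changes(model_sets[name])
        return sum((m - r) ** 2 for m, r in zip(meas, ref))

    return min(model_sets, key=score)

# Hypothetical intensity values keyed by distance (m) to the landmark:
measured = {10: 148, 20: 96, 30: 41}
models = {
    "dry": {10: 240, 20: 210, 30: 180},
    "snowfall": {10: 150, 20: 95, 30: 40},
}
print(best_matching_weather(measured, models))  # → snowfall
```

Because only ratios are compared, the absolute intensity scale of a particular LiDAR unit cancels out, which is the motivation given in the text for preferring relative over absolute values.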
- distance is referred herein as a single distance value, although in practice distance may represent measured values over a distance range.
- individual parameter values of both LiDAR data and weather model data sets may be determined as values over a distance range of one meter.
Abstract
The disclosure relates to a method for determining operation parameter values of a gated camera apparatus configured to obtain image data at a plurality of different distance ranges, and to a system performing the method. A portion of obtained LiDAR data associated with identified landmarks is compared with a plurality of weather models associated with the respective identified landmark, and based on said comparing, a current weather condition is determined. Operation parameter values of the gated camera apparatus are determined for each of the plurality of different distance ranges based on the determined current weather condition.
Description
WEATHER MODEL-BASED AUTOMATED PARAMETRISATION OF GATED CAMERA APPARATUS
FIELD
The present disclosure relates to gated imaging, and particularly to a method and a system that automatically adjusts operation parameter values of a gated camera apparatus to optimise its performance in variable weather conditions.
BACKGROUND
In the field of intelligent transport systems (ITS), the term vulnerable road users (VRU) is defined in the ITS Directive of the European Union as referring to non-motorized road users, such as pedestrians and cyclists, as well as motorcyclists and persons with disabilities or reduced mobility and orientation. When vehicles are designed for autonomous driving, they must be able to reliably detect any vulnerable road users that may cross their planned course, to avoid accidents.
One possible method for detecting vulnerable road users is to use gated imaging.
Gated imaging refers to using illumination pulses from an illumination source, such as one or more lasers, in combination with a synchronized imaging sensor. A gated camera refers to a camera apparatus that performs gated imaging. An illumination pulse sent from the one or more illumination sources serves as a source of light to illuminate the scene. By receiving reflected light with the imaging sensor during predefined, short time periods at predefined moments after light is emitted by the illumination source, it is possible, by use of neural network processes, to divide the obtained image into so-called slices, which refer to different distance ranges. Each slice shows objects reflecting the emitted light at a specific distance range from the light source and the imaging sensor. Thus, gated imaging improves visibility of objects further away from the gated camera and in adverse weather conditions, and even provides some information on the distance to those objects.
Figures 1A to 1C illustrate schematically operation of a gated camera. Figure 1A represents a field of view 10 as seen by a normal camera. Visibility of any one of the objects 12, 13, 14 may be limited due to focal length of the camera,
varying lighting conditions etc. For example, headlights of the car 13 in the background may cause the pedestrian 12, who is closer to the camera, to be unnoticeable. Furthermore, this figure gives no clear indication of the distance between the camera and the objects 12, 13, 14.
Figure 1B illustrates the scene. The gated camera with the field of view equal to that shown in the figure 1B is at the origin, and objects are within its field of view, but at different distances r. The scene is illuminated with three different illumination pulses 18 with predetermined timing, duration and intensity I. By determining distinct time periods between each illumination pulse and subsequent opening of the shutter of the imaging sensor, three different images are obtained, each representing a different, determined range of distances (r1, r2, r3) referred to as slices. Obtained images representing each one of the slices are shown in the figure 1C. Slice 11a shows objects within range r1 nearest to the camera, slice 11b shows objects within range r2 further away from the camera and slice 11c shows objects within range r3 furthest away from the camera.
As known in the art, the time of flight for an illumination pulse sent from the illumination source, reflected by an object at distance r0, and arriving back at the imaging sensor is t = 2r0/c0, where c0 is the speed of light. Thus, by determining a time window after emitting the respective illumination pulse, it is possible to receive reflected light only from a predetermined distance range and to get a clearer image of that particular distance range.
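The time-of-flight relation above directly gives the shutter timing for each slice. The following sketch (illustrative only, not from the patent) computes the shutter opening and closing times for a slice covering a given distance range:

```python
# Illustrative sketch: shutter time window for a gated-camera slice,
# derived from the time-of-flight relation t = 2*r/c0 described above.
C0 = 299_792_458.0  # speed of light in vacuum, m/s

def gating_window(r_near_m: float, r_far_m: float) -> tuple[float, float]:
    """Return (open, close) times in nanoseconds after the illumination
    pulse for a slice covering distances r_near_m..r_far_m."""
    t_open = 2.0 * r_near_m / C0 * 1e9
    t_close = 2.0 * r_far_m / C0 * 1e9
    return t_open, t_close

# A slice covering 30-60 m: the shutter opens ~200 ns and closes ~400 ns
# after the pulse is emitted.
open_ns, close_ns = gating_window(30.0, 60.0)
```

Light received outside this window originates from outside the slice's distance range and is therefore excluded from that slice's image.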
Known gated imaging devices use near-infrared or far-infrared light, which is invisible to human eyes, but also safe to humans and animals even if the light pulse accidentally hits eyes. Currently, typical commercially available gated cameras produce three slices, but the same principle is applicable also to more slices (e.g. 8 - 30 slices). Distance can be determined based on time of flight of the illumination pulse.
LiDAR is an acronym for "light detection and ranging" or "laser imaging, detection, and ranging", which refer to a method for determining ranges by
targeting objects or surfaces with a laser and measuring the time for the reflected light to return to the receiver. A LiDAR enables generating a digital three-dimensional (3D) representation of its field of view. A typical LiDAR produces a point cloud comprising information on objects in the field of view.
A problem in practical use of gated imaging devices is that weather conditions require using different parameter settings. Adverse weather conditions, such as rain, snow, hail, drizzle, haze, fog and smog, cause increased scattering of the emitted light. Scattering characteristics are different depending, for instance, on the type and size of particles in the air. With properly selected operation parameters, the emitted illumination penetrates even adverse weather, and the gated camera can reveal objects at different distance ranges as intended. For example, if power is too high, dense fog, rain or snow causes overexposure of the image by light scattering from the fog, rain, or snow. If the power is too low, only some of the slices get enough laser power for clear visibility, while other slices get too little.
Gated camera manufacturers have conducted extensive research to find proper parameter values to avoid the effects of adverse weather conditions. A known solution used in commercially available gated cameras is that the user manually selects a setting to be applied among a plurality of different preprogrammed settings, each comprising predefined parameter settings optimized for a particular weather condition.
Parameter settings of the gated camera are different for each slice. Typical key parameters set for each slice comprise intensity of emitted light, number of pulses to be sent, duration of illumination period per pulse, and a time window for receiving reflected illumination after sending the pulse. The time window can be determined for example as a predetermined shutter opening time instance after the illumination pulse is sent, and a duration of shutter opening period.
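The per-slice parameter set described above can be pictured as a small record per slice. The sketch below is hypothetical: the field names and all numeric values are illustrative and do not come from any real gated camera interface.

```python
# Hypothetical per-slice operation parameters, mirroring the key
# parameters listed in the description; all values are made up.
from dataclasses import dataclass

@dataclass
class SliceParameters:
    light_power_w: float       # intensity of emitted light
    pulse_count: int           # number of pulses to be sent
    pulse_duration_ns: float   # duration of illumination period per pulse
    shutter_open_ns: float     # shutter opening instant after each pulse
    shutter_period_ns: float   # duration of the shutter opening period

# One parameter set per slice, e.g. a preprogrammed "moderate rain"
# setting for a three-slice camera.
moderate_rain = [
    SliceParameters(40.0, 8, 120.0, 200.0, 200.0),   # slice 1 (near)
    SliceParameters(60.0, 12, 150.0, 400.0, 300.0),  # slice 2
    SliceParameters(90.0, 16, 180.0, 700.0, 400.0),  # slice 3 (far)
]
```

Such a structure makes explicit that every slice carries its own power, pulse, and gating values, which is what the automated parametrisation later selects per weather condition.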
Document US 11194043 B2 discloses using a radar for weather detection. Vehicle systems, like lighting, windscreen wipers or cruise control are dynamically controlled and actuated based on determined weather condition.
Document US 2020/0166649 Al discloses a method and a system using a LIDAR sensor to determine presence of solid objects and adverse weather conditions.
A problem with existing gated camera apparatuses is that parameter settings need to be manually selected based on visually determined weather conditions. Furthermore, compliance with eye-safety regulations must be ensured.
BRIEF DESCRIPTION
An object of the present disclosure is to provide a method and an apparatus for implementing the method so as to enable automated selection of gated camera parameter settings.
The object of the disclosure is achieved by a method and a system characterized by what is stated in the independent claims. The preferred embodiments of the disclosure are disclosed in the dependent claims.
The disclosure is based on the idea of using an existing sensor in the vehicle, namely a LiDAR, to detect predetermined landmarks with accurately known geographic positions and to automatically determine weather conditions based on characteristics of LiDAR data. Determined weather conditions are then used to determine parameter settings of the gated camera.
According to a first aspect, a method for determining operation parameter values of a gated camera apparatus configured to obtain image data at a plurality of different distance ranges is provided. The method comprises obtaining LiDAR data and processing said LiDAR data for generating a point cloud, locating one or more landmarks in the point cloud, and identifying said one or more landmarks on basis of at least determined geographic location thereof. The method further comprises comparing a respective portion of the obtained LiDAR data associated with each one of the one or more identified landmarks currently appearing in the point cloud with a plurality of weather models associated with the respective identified landmark.
Based on said comparing, a current weather condition is determined, and operation parameter values of the gated camera apparatus are determined for each of the plurality of different distance ranges based on the determined current weather condition.
According to some aspects, the weather model comprises a plurality of weather conditions characterized by means of values of one or more parameters indicating effect of weather-based interference levels in the respective portion of the obtained LiDAR data associated with the respective landmark.
According to some aspects, the plurality of weather models comprises one or more parameters indicating effect of weather-based interference levels. The parameters comprise at least one of determined reflectivity values associated with the landmark, and number of points in the point cloud associated with or representing the landmark.
According to some aspects, said parameters indicating weather-based interference levels are arranged as data sets, wherein each data set comprises respective parameter values as a function of distance to the respective landmark.
According to some aspects, said comparing comprises comparing a plurality of said portions of the obtained LiDAR data determined at different distances to the landmark to a respective plurality of parameter values determined as a function of distance to the respective landmark.
According to some aspects, said comparing comprises determining a relative change of a parameter value between the portion of LiDAR data obtained at two different distances to the landmark, determining a relative change of the respective parameter value between same two different distances in the plurality of data sets of the weather model, and comparing determined amounts of relative change of parameter values to find the best matching weather model data set to be determined as the current weather condition.
According to some aspects, said identifying the landmark is further based on type of the landmark.
According to some aspects, weather model data is stored in a weather model database.
According to some aspects, the weather model data is based on a plurality of field measurements performed in each of the plurality of different weather conditions.
According to some aspects, said operation parameter values are determined individually for each applied distance range.
According to some aspects, said operation parameters of the gated camera apparatus comprise light source power parameters, impulse parameters, and gating parameters.
According to some aspects, light source power parameters comprise light source power for each individual light pulse, wherein impulse parameters comprise one or more of: number of light pulses to be emitted, duration of each light pulse, and time interval or intervals between consecutive light pulses, and wherein gating parameters comprise: timing of shutter opening periods, wherein the timing is determined with respect to emitted light pulses, and timing determines shutter opening time and at least one of shutter opening period or shutter closing time.
According to some aspects, said identifying said one or more landmarks based on their geographic location is based on i) comparing a calculated location of the landmark with a map database comprising known geographic locations of a plurality of landmarks, ii) optionally determining a type of the landmark based on visual appearance thereof, and iii) selecting identity of the landmark based on correlation between the calculated location and location in the map database and optionally based further on determining that the type of the landmark determined based on its visual appearance equals with type of the landmark stored in the map database.
According to another aspect, a system for determining operation parameter values of a gated camera apparatus configured to obtain image data at a plurality of different distance ranges is provided. The system comprises a LiDAR apparatus configured to obtain LiDAR data and to process said LiDAR data for generating a point cloud, a landmark locating module configured to locate one or more landmarks in the point cloud, a landmark identification module configured to identify said one or more landmarks on basis of at least determined geographic location thereof, and a weather analysis module. The weather analysis module is configured to compare a respective portion of the obtained LiDAR data associated with each one of the one or more identified landmarks currently appearing in the point cloud with a plurality of weather
models associated with the respective identified landmark, and based on said comparing, to determine a current weather condition. The system also comprises a gated camera controller module configured to determine operation parameter values of the gated camera apparatus for each of the plurality of different distance ranges based on the determined current weather condition.

An advantage of the method and arrangement of the disclosure is that the best adverse-weather performance is always achieved by the gated camera, thus facilitating detection of objects of interest, such as vulnerable road users, even at long distances and improving traffic safety. In this context, a long distance is for example 150 m.
BRIEF DESCRIPTION OF THE DRAWINGS
In the following the disclosure will be described in greater detail by means of preferred embodiments with reference to the accompanying drawings, in which
Figures 1A to 1C illustrate operation principle of a gated camera;
Figure 2 illustrates an aerial view of an area;
Figure 3 illustrates a view with detected landmarks;
Figure 4 illustrates average detected intensity of light reflected from a landmark;
Figure 5 illustrates point profiles that show number of points representing a landmark;
Figures 6A to 6D illustrate graphically sets of weather model data values in the weather model database;
Figure 7 illustrates functional elements of a system.
Figures are for illustrative purposes only. Unless explicitly indicated, for example by a scale, figures are not to scale. Structurally or functionally similar features are referred to with the same reference numbers. For the sake of clarity, not all similar features are provided with reference numbers.
DETAILED DESCRIPTION
Figure 2 illustrates an aerial view with landmarks 100 marked with white circles. Geographic location of each landmark 100 is known and available in a map database. In the field of intelligent transport systems (ITS), map databases are used by vehicles, such as autonomous vehicles, for navigation.
Geographic positions of landmarks 100 are preferably known with high accuracy. As known in the art, for example real-time kinematics (RTK) may be used to improve the accuracy of a geographic location that was initially acquired by means of a satellite-based navigation system (GNSS) such as GPS, Galileo or Glonass. By using such accuracy-improving solutions, achievable location accuracy is at the level of centimetres.
Figure 3 illustrates a view from a vehicle showing a plurality of identified landmarks, which are in this case traffic signs. Each traffic sign is identified by a unique identifier (196, 198, 199, 200, 201, 202), marked with white rectangles known as bounding boxes. This illustrative image has been taken with a camera during a rain shower. Although this image has been captured with a camera from behind the windscreen of a vehicle, it can be used to illustrate the situation as experienced by a LiDAR carried by and operating in the vehicle. Raindrops reduce visibility of the landmarks.
Identification of landmarks may be based on determining their geographic location on basis of measured location of the vehicle and relative location between the vehicle and the landmark, and comparing this geographic location to a map database comprising known geographic locations and identifiers of landmarks. When the calculated location matches with the predetermined location in the map database, the landmark identifier can be obtained from the map database.
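The location-matching identification described above can be sketched as a nearest-neighbour lookup against the map database. This is a minimal sketch under assumptions not stated in the source: a flat 2D coordinate frame, a simple Euclidean distance, and an invented 0.5 m tolerance consistent with the centimetre-level accuracy mentioned earlier.

```python
# Illustrative sketch (assumed names and data): matching a computed
# landmark position against known positions in a map database.
import math

MAP_DATABASE = {  # landmark identifier -> known (east, north) position, metres
    200: (1250.42, 884.07),
    201: (1262.10, 890.55),
}

def identify_landmark(position, tolerance_m=0.5):
    """Return the identifier of the map-database landmark closest to the
    computed position, or None if no entry lies within the tolerance."""
    best_id, best_dist = None, tolerance_m
    for landmark_id, (east, north) in MAP_DATABASE.items():
        dist = math.hypot(position[0] - east, position[1] - north)
        if dist <= best_dist:
            best_id, best_dist = landmark_id, dist
    return best_id

# Vehicle position plus the LiDAR-relative offset yields ~(1250.5, 884.0),
# which matches landmark 200 within the tolerance.
matched = identify_landmark((1250.5, 884.0))
```

In practice the match could additionally be gated on landmark type, as the description notes further below.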
When a LiDAR is used for detecting landmarks, it provides information on reflectivity of the landmark based on intensity of the received light reflected by the landmark.
Figure 4 illustrates average detected intensity of reflected light measured by a LiDAR of a vehicle at selected distances from the landmark in a few different weather conditions. In this example, the identified landmark is the traffic sign 200 shown in the figure 3. Intensity can be expressed as an absolute value or as a relative value. In the figure 4, the maximum intensity value is determined to be 255. In good weather, referred to as "dry conditions" (white bars), detected intensity is particularly strong at certain distances, since the traffic sign is covered with reflective material that has high reflectivity in certain directions, while reflection in other directions is slightly weaker. The other three measurements represent intensity of reflected light from the landmark in light rain, for example 0.5 mm/h (20% dotted bars), moderate rain, for example 1.5 mm/h (50% dotted bars), and in heavy rain, for example 2.5 mm/h (black bars). By repeating intensity measurements for landmarks in different weather conditions, a plurality of different intensity profiles for different weathers can be determined.
Figure 5 illustrates four different point profiles that show the number of points representing or associated with a landmark in the point cloud generated by the LiDAR of the vehicle as a function of distance from the landmark in different weather conditions. Distance to the landmark is a major factor in the detected number of points, but adverse weather conditions further mask the view so that the number of points of the point cloud determined as being associated with or representing the landmark varies. This exemplary measurement represents just a few selected determined points associated with or representing the same traffic sign 200 shown in the figure 3, for which measured intensity is shown in the figure 4. Four different measurement weather conditions are illustrated: one in dry conditions (white bars), one in light rain, for example 0.5 mm/h (20% dotted bars), one in moderate rain, for example 1.5 mm/h (50% dotted bars), and one in heavy rain, for example 2.5 mm/h (black bars). By repeating measurements in different weather conditions, a plurality of point profiles can be determined for different weather conditions.
According to some embodiments, points are calculated based on number of points within a bounding box comprising the landmark. The bounding box is typically a rectangular box with outer bound determined by detected outer edges of the landmark. According to some embodiments, points are calculated based on number of points representing the landmark itself within the bounding box.
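The bounding-box point count described above can be sketched as a simple filter over the point cloud. This is an illustrative sketch with assumed names: it treats the bounding box as axis-aligned and each point as an (x, y, z) tuple, which the source does not specify.

```python
# Minimal sketch (assumptions: axis-aligned box, (x, y, z) points):
# count point-cloud points that fall inside a landmark's bounding box.
def points_in_bounding_box(points, box_min, box_max):
    """Count points lying inside the box spanned by box_min..box_max."""
    return sum(
        all(lo <= p[i] <= hi for i, (lo, hi) in enumerate(zip(box_min, box_max)))
        for p in points
    )

cloud = [(1.0, 2.0, 3.0), (1.5, 2.5, 3.5), (9.0, 9.0, 9.0)]
count = points_in_bounding_box(cloud, (0.0, 0.0, 0.0), (2.0, 3.0, 4.0))
```

Counting only points on the landmark surface itself, as in the second embodiment above, would additionally require segmenting the landmark within the box.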
Figures 6A, 6B, 6C and 6D illustrate average measured intensity and number of points detected for a landmark at different distances from the landmark. In these examples, dry reference weather is compared to adverse weather conditions caused by snowfall. Since the LiDAR is capable of measuring 360°, information obtained from the backside of the landmark can also be used for determining weather conditions. Figures 6A and 6B show exemplary values measured at different distances from the front side of a landmark (a traffic sign) in dry reference weather (white bars) and during snowfall (black bars), when the landmark, in this case a traffic sign, is at least partially covered with snow. Because the reflective surface of the traffic sign is covered with snow, the relative intensity of reflected light is different from what would be expected if the traffic sign were clean of snow cover. The figure 6A illustrates average intensity of reflected light at a few exemplary distances and the figure 6B illustrates the number of points representing the landmark at exemplary distances. Figures 6C and 6D show exemplary values measured at different distances from the back side of a landmark (a traffic sign) in dry reference weather (white bars) and during snowfall (black bars). It can be noticed that snowfall has a significant effect on the capability of detecting the landmark from the backside; at greater distances, the landmark was not detected at all, and no measurement results were received for either intensity or number of points, while at shorter distances, the number of points in the snowfall is at the same level as in the good reference weather conditions, but the intensity of the reflected light is significantly affected by the snowfall.
Figures 4, 5 and 6A to 6D demonstrate that by combining the results of intensity and number-of-points measurements, weather profiles can be determined more accurately than by relying on just one of these parameters. Although figures 6A to 6D represent, for illustration purposes, just a few selected detection distances, the weather model is not limited to these detection distances; any number of detection distances at any selected intervals can be used in the weather model.
Based on measurements made with the LiDAR in different known weather conditions, a weather model can be established for each landmark. The weather model comprises weather model data. According to some embodiments, the weather model is provided in form of a weather model database. According to some embodiments, weather model data comprises, for each landmark and each weather condition, a plurality of weather model data values for reflectivity and number of points as function of distance. Weather model data values can be considered as predefined reference values to which measured LiDAR data is compared. Weather model data values are preferably
determined based on field measurements performed by a LiDAR in known weather conditions.
Weather model data values are preferably provided as a function of distance to the landmark in a plurality of different weather conditions. Measured weather model data values may be arranged as a table in the weather model database. The number of weather model data sets per landmark is not limited, but weather models are preferably determined for all kinds of weather conditions, such as different amounts and types of rainfall, snowfall, hail, drizzle, fog and smog, if applicable at the geographic location of the landmark. Automatic comparison of obtained LiDAR data and weather model data sets associated with identified landmarks enables using more detailed weather models than what would be possible by visual determination of weather conditions and manual selection of gated camera parameter settings. For example, the same amount of rainfall per hour may cause different scattering characteristics depending on the size of raindrops or different types of snow.
All LiDAR measurement-based weather model data is preferably stored in a weather model database together with information on known weather conditions at the time of performing the respective measurements.
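The per-landmark table described above can be pictured as a nested mapping from landmark, weather condition, and parameter to values per distance. All identifiers and numbers below are made up for illustration; the source does not prescribe a storage layout.

```python
# Hypothetical layout of weather model data for one landmark: for each
# weather condition, average intensity and point count as a function of
# distance (metres). All values are invented for illustration.
WEATHER_MODEL = {
    200: {  # landmark identifier
        "dry":           {"intensity": {10: 230, 20: 210, 30: 180},
                          "points":    {10: 95, 20: 60, 30: 35}},
        "light_rain":    {"intensity": {10: 200, 20: 170, 30: 130},
                          "points":    {10: 85, 20: 48, 30: 22}},
        "moderate_rain": {"intensity": {10: 160, 20: 120, 30: 80},
                          "points":    {10: 70, 20: 35, 30: 12}},
    },
}

def model_value(landmark_id, condition, parameter, distance_m):
    """Look up a stored reference value for one landmark/condition pair."""
    return WEATHER_MODEL[landmark_id][condition][parameter][distance_m]
```

Measured LiDAR values for an identified landmark are then compared against these reference values to pick the best-matching condition.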
Figure 7 illustrates functional elements of a system for implementing some embodiments of the disclosure. A LiDAR 50 carried by a vehicle obtains LiDAR data. A landmark locating module 51 is configured to locate landmarks in the LiDAR data and a landmark identification module 52 is configured to identify located landmarks. The landmark locating module 51 and the landmark identification module 52 are preferably implemented as software modules. They may be part of the LiDAR system or provided by an existing on-board unit (OBU) of the vehicle, because landmark location and identification information may also be used for navigation and autonomous driving.
According to some embodiments, the landmark identification module 52 may be configured to identify landmarks based on matching their location with a map database. According to some embodiments, the portion of LiDAR data representing a landmark is further processed by a type determining module
that is configured to classify detected landmarks based on their type, and this landmark classification is used as additional data for improving landmark identification by the landmark identification module 52.
Preferably, a plurality of consecutive LiDAR measurements is made so that when the vehicle carrying the LiDAR moves, a plurality of observations on each detected and identified landmark is provided at different distances thereto. LiDAR data is provided to a weather analysis module 53 together with the landmark identification of respective landmarks. The weather analysis module 53 is preferably a software module. Based on the landmark identification, the weather analysis module 53 obtains respective weather model data stored in the weather model database 60 and compares obtained LiDAR data of the respective landmark thereto, to determine the prevailing weather condition.
Weather model data preferably comprises a plurality of data sets for each landmark comprised in the weather model data. According to some embodiments, each data set in the weather model represents a plurality of parameter values determined from a portion of LiDAR data obtained during a data collection phase in a known weather condition. Each data set comprises parameters indicating effect of weather-based interference levels in a portion of LiDAR data associated with the respective landmark. Preferably, parameter values are provided in the data set as a function of distance to the respective landmark.
The determined weather condition is used as input for a gated camera controller 54, which automatically selects parameter settings of the gated camera 55 based on the weather type information.
According to some embodiments, the weather model database comprises a plurality of data sets, each covering a plurality of one-meter slots for at least one weather-based interference-indicating parameter. For example, selected parameters may be determined as the average reflectivity and number of points discussed above. Depending on the capability of image processing of the LiDAR, other or further types of weather-interference indicating parameters may be applicable.
Range of stored weather model data values may extend from a few meters, for example 1, 2, 3, 4, or 5 meters to some tens of meters, for example 20, 25 or
30 meters from the respective landmark. While approaching the landmark, the weather analysis module 53 compares measured LiDAR data at different distances from the identified landmark to respective parameter values in weather model data stored in the weather model database. Comparing more than one type of data at more than one distance improves the certainty that the best match is selected as the determined current weather condition by the weather analysis module.
According to some embodiments, the weather analysis module may select a few candidate weather conditions from the weather model data based on one set of LiDAR data, and based on these candidate weather conditions, expected values of LiDAR data at the next available distance are compared to measured values, and one or more of the best matching candidate weather conditions is selected. This selection can be repeated, iteratively rejecting a portion of weather conditions, until just one remains, which is then used as the determined weather condition.
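The iterative candidate rejection above can be sketched as repeatedly ranking candidates by how well their expected value matches the measurement at each successive distance and keeping only the best half. The halving rule, data, and names below are assumptions for illustration; the source only requires that a portion of candidates is rejected per round.

```python
# Illustrative sketch of iterative candidate rejection: at each distance,
# drop the candidate weather conditions whose expected values are
# furthest from the measurement, until one condition remains.
def determine_condition(measured, candidates):
    """measured: {distance: value}; candidates: {name: {distance: value}}.
    Returns the name of the last surviving weather condition."""
    remaining = dict(candidates)
    for distance, value in sorted(measured.items()):
        if len(remaining) == 1:
            break
        ranked = sorted(remaining,
                        key=lambda c: abs(remaining[c][distance] - value))
        keep = max(1, len(ranked) // 2)  # assumed rejection rule: keep best half
        remaining = {c: remaining[c] for c in ranked[:keep]}
    return next(iter(remaining))

candidates = {
    "dry":           {10: 230, 20: 210},
    "light_rain":    {10: 200, 20: 170},
    "moderate_rain": {10: 160, 20: 120},
}
condition = determine_condition({10: 205, 20: 168}, candidates)
```

Because the candidate set shrinks each round, later comparisons only need to evaluate the remaining plausible conditions.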
By repeating the weather analysis at relatively short time intervals, it is possible to adapt operation parameter values of the gated camera quickly and reliably also in changing weather conditions. For example, when a vehicle drives from a dry weather area into a rain shower, density of rain changes gradually. This is detected automatically by the system and consequently, parameter settings of the gated camera are also changed gradually along with changing weather conditions. This type of gradual adjustment of operation parameter settings is not practical to be performed manually, since it would require constantly making appropriate selections, often on basis of guessing rather than actual information on the weather condition and how it affects operation of the gated camera.
According to some embodiments, relative parameter values between two LiDAR measurements are considered rather than absolute parameter values. For example, changes between pairs of two different measured parameter values determined based on a portion of the LiDAR data associated with the same landmark and obtained at a plurality of distances x, x-1, x-2, ... x-n meters are determined. The change can be determined in relative terms, i.e. that the parameter value at x meters is 0.933088 times the parameter value at x-1 meters. These
changes determined based on LiDAR data are compared to respective changes between parameter values associated with the same pairs of distances in the plurality of weather model data sets. This kind of relative comparison enables avoiding possible problems caused by use of different LiDAR apparatuses by different vehicles.
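The relative-change comparison above can be sketched by forming ratios of a parameter value between consecutive distances and scoring each weather model data set by how closely its ratios match the measured ones. Data, names, and the absolute-difference scoring are assumptions for illustration.

```python
# Illustrative sketch: compare relative changes (ratios between
# consecutive distances) of measured LiDAR values against the same
# ratios in each weather model data set; values are invented.
def ratios(values_by_distance):
    """Ratio of the value at each distance to the value one step further."""
    ds = sorted(values_by_distance)
    return [values_by_distance[a] / values_by_distance[b]
            for a, b in zip(ds, ds[1:])]

def best_matching_condition(measured, model_sets):
    def mismatch(condition):
        return sum(abs(m - r) for m, r in zip(ratios(measured),
                                              ratios(model_sets[condition])))
    return min(model_sets, key=mismatch)

model_sets = {
    "dry":        {10: 230, 11: 228, 12: 225},
    "heavy_rain": {10: 120, 11: 100, 12: 80},
}
measured = {10: 118, 11: 99, 12: 80}
match = best_matching_condition(measured, model_sets)
```

Because only ratios are compared, a LiDAR reporting systematically higher or lower absolute values still produces the same match, which is the vehicle-independence benefit noted above.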
For simplicity, distance is referred to herein as a single distance value, although in practice distance may represent measured values over a distance range. For example, individual parameter values of both LiDAR data and weather model data sets may be determined as values over a distance range of one meter.

It is apparent to a person skilled in the art that as technology advances, the basic idea of the invention can be implemented in various ways. The invention and its embodiments are therefore not restricted to the above examples, but they may vary within the scope of the claims.
Claims
1. A method for determining operation parameter values of a gated camera apparatus configured to obtain image data at a plurality of different distance ranges, the method comprising:
- obtaining LiDAR data and processing said LiDAR data for generating a point cloud,
- locating one or more landmarks in the point cloud,
- identifying said one or more landmarks on basis of at least determined geographic location thereof,
- comparing a respective portion of the obtained LiDAR data associated with each one of the one or more identified landmarks currently appearing in the point cloud with a plurality of weather models associated with the respective identified landmark,
- based on said comparing, determining a current weather condition, and
- determining operation parameter values of the gated camera apparatus for each of the plurality of different distance ranges based on the determined current weather condition.
2. The method according to claim 1, wherein the weather model comprises a plurality of weather conditions characterized by means of values of one or more parameters indicating effect of weather-based interference levels in the respective portion of the obtained LiDAR data associated with the respective landmark.
3. The method according to claim 1 or 2, wherein the plurality of weather models comprises one or more parameters indicating effect of weather-based interference levels, the parameters comprising at least one of:
- determined reflectivity values associated with the landmark, and
- number of points in the point cloud associated with or representing the landmark.
4. The method according to claim 2 or 3, wherein said parameters indicating weather-based interference levels are arranged as data sets, wherein each data set comprises respective parameter values as a function of distance to the respective landmark.
5. The method according to claim 4, wherein said comparing comprises comparing a plurality of said portions of the obtained LiDAR data determined at different distances to the landmark to a respective plurality of parameter values determined as a function of distance to the respective landmark.
6. The method according to claim 5, wherein said comparing comprises determining a relative change of a parameter value between the portion of LiDAR data obtained at two different distances to the landmark, determining a relative change of the respective parameter value between same two different distances in the plurality of data sets of the weather model, and comparing determined amounts of relative change of parameter values to find the best matching weather model data set to be determined as the current weather condition.
7. The method according to any one of the preceding claims, wherein said identifying the landmark is further based on type of the landmark.
8. The method according to any one of the preceding claims, wherein weather model data is stored in a weather model database.
9. The method according to claim 8, wherein the weather model data is based on a plurality of field measurements performed in each of the plurality of different weather conditions.
10. The method according to any one of the preceding claims, wherein said operation parameter values are determined individually for each applied distance range.
11. The method according to any one of the preceding claims, wherein said operation parameters of the gated camera apparatus comprise light source power parameters, impulse parameters, and gating parameters.
12. The method according to claim 11, wherein light source power parameters comprise light source power for each individual light pulse, wherein impulse parameters comprise one or more of: number of light pulses to be emitted, duration of each light pulse, and time interval or intervals between consecutive light pulses, and wherein gating parameters comprise: timing of shutter opening periods, wherein the timing is determined with respect to emitted light pulses, and timing determines shutter opening time and at least one of shutter opening period or shutter closing time.
13. The method according to any one of claims 1 to 12, wherein said identifying said one or more landmarks based on their geographic location is based on:
- comparing a calculated location of the landmark with a map database comprising known geographic locations of a plurality of landmarks,
- optionally determining a type of the landmark based on visual appearance thereof, and
- selecting identity of the landmark based on correlation between the calculated location and the location in the map database, and optionally based further on determining that the type of the landmark determined based on its visual appearance matches the type of the landmark stored in the map database.
14. A system for determining operation parameter values of a gated camera apparatus configured to obtain image data at a plurality of different distance ranges, the system comprising:
- a LiDAR apparatus configured to obtain LiDAR data and to process said LiDAR data for generating a point cloud,
- a landmark locating module configured to locate one or more landmarks in the point cloud,
- a landmark identification module configured to identify said one or more landmarks on the basis of at least a determined geographic location thereof,
- a weather analysis module configured:
- to compare a respective portion of the obtained LiDAR data associated with each one of the one or more identified landmarks currently appearing in the point cloud with a plurality of weather models associated with the respective identified landmark, and
- based on said comparing, to determine a current weather condition, and
- a gated camera controller module configured:
- to determine operation parameter values of the gated camera apparatus for each of the plurality of different distance ranges based on the determined current weather condition.
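Outside the claim language itself, the weather analysis and controller steps of claims 1 and 14 can be summarised as one small pipeline. A minimal illustrative sketch, in which every name, the least-squares matching rule, and the lookup-table form of the parameter selection are assumptions, not claimed features:

```python
def determine_camera_parameters(lidar_portions, weather_models,
                                parameter_table, distance_ranges):
    """Illustrative weather-matching and parameter-selection steps.

    lidar_portions:  {landmark_id: [parameter values vs. distance]}
                     for each identified landmark in the point cloud.
    weather_models:  {landmark_id: {condition: [values vs. distance]}}.
    parameter_table: {condition: {distance_range: operation parameters}}.
    Returns the determined current weather condition and the operation
    parameter values for each applied distance range.
    """
    # Score each candidate weather condition across all identified landmarks.
    scores = {}
    for landmark, measured in lidar_portions.items():
        for condition, modelled in weather_models[landmark].items():
            err = sum((m - w) ** 2 for m, w in zip(measured, modelled))
            scores[condition] = scores.get(condition, 0.0) + err
    current = min(scores, key=scores.get)  # best matching condition overall
    # Per-distance-range operation parameter values for the gated camera.
    return current, {r: parameter_table[current][r] for r in distance_ranges}
```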
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
FI20235925 | 2023-08-18 | ||
FI20235925A FI131065B1 (en) | 2023-08-18 | 2023-08-18 | WEATHER MODEL-BASED AUTOMATIC PARAMETERIZATION OF GATED CAMERA DEVICES |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2025040835A1 (en) | 2025-02-27 |
Family
ID=91898808
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/FI2024/050373 WO2025040835A1 (en) | 2023-08-18 | 2024-07-04 | Weather model-based automated parametrisation of gated camera apparatus |
Country Status (2)
Country | Link |
---|---|
FI (1) | FI131065B1 (en) |
WO (1) | WO2025040835A1 (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20200166649A1 (en) | 2018-11-26 | 2020-05-28 | Continental Automotive Systems, Inc. | Adverse weather condition detection system with lidar sensor |
US11194043B2 (en) | 2018-01-18 | 2021-12-07 | Analog Devices International Unlimited Company | Radar for weather detection and dynamic control and actuation of vehicle systems |
EP3227742B1 (en) * | 2014-12-07 | 2023-03-08 | Brightway Vision Ltd. | Object detection enhancement of reflection-based imaging unit |
EP4203454A1 (en) * | 2020-08-21 | 2023-06-28 | Koito Manufacturing Co., Ltd. | Vehicle-mounted sensing system and gating camera |
2023
- 2023-08-18: FI application FI20235925 filed, patent FI131065B1 (en), active

2024
- 2024-07-04: PCT application PCT/FI2024/050373 filed, publication WO2025040835A1 (en)
Also Published As
Publication number | Publication date |
---|---|
FI20235925A1 (en) | 2024-08-26 |
FI131065B1 (en) | 2024-08-26 |
Legal Events
Date | Code | Title | Description
---|---|---|---
| 121 | | Ep: the epo has been informed by wipo that ep was designated in this application (ref document number: 24740947; country of ref document: EP; kind code of ref document: A1) |