GB2598110A - A method for verifying a mounting position of a lidar sensor device on a motor vehicle by an electronic computing device - Google Patents

A method for verifying a mounting position of a lidar sensor device on a motor vehicle by an electronic computing device

Info

Publication number
GB2598110A
Authority
GB
United Kingdom
Prior art keywords
computing device
electronic computing
lidar sensor
motor vehicle
mounting position
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB2012863.3A
Other versions
GB202012863D0 (en)
Inventor
Duchesneau Genevieve
Edskes Bouke
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mercedes Benz Group AG
Original Assignee
Daimler AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Daimler AG filed Critical Daimler AG
Priority to GB2012863.3A priority Critical patent/GB2598110A/en
Publication of GB202012863D0 publication Critical patent/GB202012863D0/en
Publication of GB2598110A publication Critical patent/GB2598110A/en
Withdrawn legal-status Critical Current

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/89 Lidar systems specially adapted for specific applications for mapping or imaging
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/497 Means for monitoring or calibrating
    • G01S7/4972 Alignment of sensor

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention relates to a method for verifying a mounting position of a LiDAR sensor device on a motor vehicle by an electronic computing device 10, comprising the steps of: collecting a predetermined amount of data of a plurality of LiDAR sensor devices of a plurality of motor vehicles and transmitting the predetermined amount of data to the electronic computing device 10 (S1, S2); generating a heat map 12 of an empirical field of view (eFOV) depending on the transmitted data by the electronic computing device 10 (S5); comparing the heat map 12 with a theoretical field of view (tFOV) of the LiDAR sensor device of the motor vehicle (S6); and verifying the mounting position of the LiDAR sensor device depending on a result of the comparison and a predetermined threshold 14 (S7). Preferably the arrangement provides a selection of filters or boundaries to a user, the comparison being performed by using the heat map under the conditions of the selected filters and/or boundaries (S4). If the comparison leads to a predetermined threshold 14 not being met, the system suggests a new mounting position (S9) for the LiDAR sensor device.

Description

A METHOD FOR VERIFYING A MOUNTING POSITION OF A LIDAR SENSOR DEVICE ON A MOTOR VEHICLE BY AN ELECTRONIC COMPUTING DEVICE, AS WELL AS A CORRESPONDING ELECTRONIC COMPUTING DEVICE
FIELD OF THE INVENTION
[0001] The invention relates to the field of automobiles. More specifically, the invention relates to a method for verifying a mounting position of a LiDAR sensor device on a motor vehicle by an electronic computing device, as well as to a corresponding electronic computing device.
BACKGROUND INFORMATION
[0002] LiDAR (light detection and ranging) is one of the primary sensors for the perception systems of autonomous vehicles. LiDAR is currently utilized in various areas, including but not limited to localization, mapping, and perception. LiDAR works by sending out a beam of light, which is reflected back by objects within its range and received by the sensor. Each of these returns creates a point in space with associated values for distance, position, and intensity. This happens many times per second, with most LiDAR sensors containing several emitter units sending out pulses of light, each forming a "ring". When all the points from a specific period are combined, a point cloud for a single frame is created.
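As a rough illustration of the data model described above, the following sketch assembles per-ring returns into a single frame; the array layout and field names are assumptions chosen for illustration, not a sensor-vendor format.

```python
import numpy as np

# Assumed layout for one LiDAR return: position, intensity, and the
# emitter "ring" that produced it (illustrative, not a vendor format).
POINT_DTYPE = np.dtype([
    ("x", np.float32), ("y", np.float32), ("z", np.float32),
    ("intensity", np.float32), ("ring", np.uint8),
])

def assemble_frame(ring_returns):
    """Combine the returns of all rings from one scan period into one frame."""
    return np.concatenate([np.asarray(r, dtype=POINT_DTYPE) for r in ring_returns])

def point_ranges(frame):
    """Distance of each point from the sensor origin."""
    return np.sqrt(frame["x"] ** 2 + frame["y"] ** 2 + frame["z"] ** 2)
```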
[0003] Current systems in the prior art allow users to analyze a single point cloud. This is useful when trying to visualize what the sensor is seeing at a given point in time, but may not allow for overall spatial analysis of the entirety of a dataset.
[0004] US 2019/0179979 A1 discloses a system for autonomous vehicle operation. For example, a computing system can obtain a scene that includes simulated objects associated with simulated physical properties. The computing system can generate sensor data, which can include simulated sensor interactions for the scene. The simulated sensor interactions can include simulated sensors detecting the simulated objects. Further, the simulated sensors can include simulated sensor properties. The simulated sensor interactions that satisfy one or more perception criteria of an autonomous vehicle perception system can be determined, based at least in part on the sensor data. Furthermore, changes for the autonomous vehicle perception system can be generated, based at least in part on the simulated sensor interactions that satisfy the one or more perception criteria.
[0005] KR 10-1947478 B1 discloses a method and an apparatus for determining the placement of a light detection and ranging sensor. It relates to a method and an apparatus for determining the placement of a LiDAR sensor, which place a sampling board, perform optimization based on distance information between the sampling board and the LiDAR sensor, and determine the placement of a plurality of LiDAR sensors.
[0006] US 10,474,160 B2 discloses a method or system which generates a high resolution 3-D point cloud to operate an autonomous driving vehicle (ADV) from a low resolution 3-D point cloud and camera-captured images. The system receives a first image captured by a camera for a driving environment. The system receives a second image representing a first depth map of a first point cloud corresponding to the driving environment. The system downsamples the second image by a predetermined scale factor until a resolution of the second image reaches a predetermined threshold. The system generates a second depth map by applying a convolutional neural network (CNN) model to the first image and the downsampled second image, the second depth map having a higher resolution than the first depth map such that the second depth map represents a second point cloud perceiving the driving environment surrounding the ADV.
[0007] US 10,346,695 B2 provides a method and apparatus for classifying light detection and ranging sensor data. The method includes transforming sensor data of the LiDAR into point cloud data, selecting a cell including a subset of the point cloud data, dividing the selected cell into a plurality of voxels, calculating a difference of gradients for the plurality of voxels, performing a first pass on the plurality of voxels to identify voxels that contain an object based on the difference of gradients, performing a second pass on the plurality of voxels to identify voxels that contain the object by adjusting a voxel with at least one from among a jitter parameter and a rotation parameter, and outputting a centroid average of voxels identified as containing the object.
[0008] Previous methods for analyzing LiDAR point cloud data involve viewing point clouds in a frame-by-frame manner. There is a need in the state of the art for enabling analysis to be done on a cumulative dataset.
SUMMARY OF THE INVENTION
[0009] It is an object of the invention to provide a method as well as a corresponding electronic computing device, by which a more efficient data analysis may be realized.
[0010] This object is solved by a method as well as an electronic computing device according to the independent claims. Advantageous embodiments are presented in the dependent claims.
[0011] One aspect of the invention relates to a method for verifying a mounting position of a LiDAR sensor device on a motor vehicle by an electronic computing device. In a first step, a predetermined amount of data of a plurality of LiDAR sensor devices of a plurality of motor vehicles is collected and transmitted to the electronic computing device. A heat map of an empirical field of view is generated by the electronic computing device depending on the transmitted data. The generated heat map is compared with a theoretical field of view of the LiDAR sensor device of the motor vehicle. The mounting position of the LiDAR sensor device is verified depending on a result of the comparison and a predetermined threshold.
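Read as a pipeline, these steps might be sketched as follows. The helper names, the grid parameters, and the 0.8 coverage threshold are hypothetical stand-ins for the stages described here, not the patented implementation.

```python
import numpy as np

# Hypothetical helpers for the claimed steps; the bodies are minimal
# stand-ins so the sketch runs, not the patented implementation.

def collect_and_transmit(fleet_frames):
    # S1, S2: concatenate point cloud frames gathered from the vehicle fleet.
    return np.vstack(fleet_frames)

def generate_heat_map(points, extent=100.0, resolution=0.5):
    # S5: 2-D histogram of vehicle-relative x/y positions (assumed grid size).
    edges = np.arange(-extent, extent + resolution, resolution)
    heat, _, _ = np.histogram2d(points[:, 0], points[:, 1], bins=(edges, edges))
    return heat

def compare_fov(heat, tfov_mask):
    # S6: fraction of the theoretical FOV actually covered by returns.
    return np.count_nonzero((heat > 0) & tfov_mask) / max(np.count_nonzero(tfov_mask), 1)

def verify_mounting_position(fleet_frames, tfov_mask, threshold=0.8):
    # S7: the mounting position is verified if coverage meets the threshold.
    points = collect_and_transmit(fleet_frames)
    heat = generate_heat_map(points)
    return compare_fov(heat, tfov_mask) >= threshold
```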
[0012] The present invention provides advantages for analyzing sensor efficiency in application to localization and mapping by identifying the empirical field of view of a sensor, in particular of a LiDAR sensor device, as well as areas where cluster density occurs, over large sets of real-world data from vehicles. The analysis system according to the invention, which may also be regarded as the electronic computing device, allows the empirical field of view (eFOV) of the LiDAR sensor device to be analyzed and verified through a method of analyzing cumulative LiDAR sensor point cloud datasets in relation to a motor vehicle. The invention comprises a system for analyzing the effectiveness of LiDAR sensor device placement and a method for placing LiDAR sensors on the motor vehicle. Using the electronic computing device disclosed, LiDAR point cloud data may be analyzed to verify the empirical field of view of a LiDAR sensor device in a particular location on the motor vehicle.
[0013] According to an embodiment, a selection of at least two filters and/or boundaries is provided to a user by the electronic computing device for generating the heat map.
[0014] In another embodiment the comparison is performed by using the heat map under the condition of the at least one filter and/or at least one boundary.
[0015] In another embodiment, if the predetermined threshold is not met during the comparison, a suggestion is generated by the electronic computing device for placing the LiDAR sensor device in a new mounting position on the motor vehicle.
[0016] In another embodiment, a predetermined amount of new data of a plurality of LiDAR sensor devices in the new mounting position is collected and transmitted to the electronic computing device.
[0017] Another aspect of the invention relates to an electronic computing device for verifying a mounting position of a LiDAR sensor device on a motor vehicle, wherein the electronic computing device is configured to perform a method according to the preceding aspect. In particular, the method is performed by the electronic computing device.
[0018] Advantageous forms of configuration of the method are to be regarded as advantageous forms of the electronic computing device. The electronic computing device therefore comprises means for performing the method.
[0019] Further advantages, features, and details of the invention derive from the following description of a preferred embodiment as well as from the drawing. The features and feature combinations previously mentioned in the description, as well as the features and feature combinations mentioned in the following description of the figure and/or shown in the figure alone, can be employed not only in the respectively indicated combination but also in any other combination or taken alone without leaving the scope of the invention.
BRIEF DESCRIPTION OF THE DRAWING
[0020] The novel features and characteristics of the disclosure are set forth in the independent claims. The accompanying drawing, which is incorporated in and constitutes part of this disclosure, illustrates an exemplary embodiment and, together with the description, serves to explain the disclosed principles. The same reference signs are used throughout the figure to refer to identical features and components. Some embodiments of the system and/or methods in accordance with embodiments of the present subject matter are described below, by way of example only, and with reference to the accompanying figure.
[0021] Fig. 1 shows a schematic flow chart according to an embodiment of the method.
[0022] In the figure same elements or elements having the same function are indicated by the same reference signs.
DETAILED DESCRIPTION
[0023] In the present document, the word "exemplary" is used herein to mean "serving as an example, instance, or illustration." Any embodiment or implementation of the present subject matter described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments.
[0024] While the disclosure is susceptible to various modifications and alternative forms, specific embodiments thereof have been shown by way of example in the drawing and will be described in detail below. It should be understood, however, that it is not intended to limit the disclosure to the particular forms disclosed, but on the contrary, the disclosure is to cover all modifications, equivalents, and alternatives falling within the scope of the disclosure.
[0025] The terms "comprises", "comprising", or any other variations thereof, are intended to cover a non-exclusive inclusion, so that a setup, device, or method that comprises a list of components or steps does not include only those components or steps but may include other components or steps not expressly listed or inherent to such setup, device, or method. In other words, one or more elements in a system or apparatus preceded by "comprises" or "comprise" does not, without more constraints, preclude the existence of other or additional elements in the system or method.
[0026] In the following detailed description of the embodiment of the disclosure, reference is made to the accompanying drawing that forms part hereof, and in which is shown by way of illustration a specific embodiment in which the disclosure may be practiced. This embodiment is described in sufficient detail to enable those skilled in the art to practice the disclosure, and it is to be understood that other embodiments may be utilized and that changes may be made without departing from the scope of the present disclosure. The following description is, therefore, not to be taken in a limiting sense.
[0027] Fig. 1 shows a schematic flow chart according to an embodiment of the method. The method shown in Fig. 1 is performed by an electronic computing device 10. The electronic computing device 10 is used for analyzing a cumulative LiDAR point cloud dataset relative to an entire motor vehicle. The method of using the electronic computing device 10 to optimize LiDAR sensor device placement is also shown in the figure.
[0028] The electronic computing device 10 is a system that allows a user to input large LiDAR point cloud datasets and convert them into a heat map 12 that provides a cumulative representation of all point clouds relative to the motor vehicle. The primary purpose of this tool is to correlate the heat map 12 to the frame of the motor vehicle, accurately convey the field of view of the LiDAR sensor device, and allow a user to analyze point cloud clustering in relation to a set of assigned filters. By doing so, an empirical field of view eFOV of each LiDAR sensor device on the motor vehicle may be measured. Next, the empirical field of view eFOV may be compared with a theoretical field of view tFOV to determine whether the LiDAR sensor device is currently in an optimal placement. To complete the comparison, a threshold 14 may be created to determine whether the empirical field of view eFOV is verified. If the empirical field of view eFOV does not meet the threshold 14 for verification, a determination can be made to move the LiDAR sensor device to a new placement on the motor vehicle in order to meet the threshold 14.
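A minimal sketch of such cumulative accumulation follows, assuming vehicle-frame (N, 3) point arrays; the grid extent and cell size are assumed values. Streaming frame by frame means an arbitrarily large dataset never has to be held in memory at once.

```python
import numpy as np

def accumulate_heat_map(frames, extent=100.0, resolution=0.5):
    """Accumulate many vehicle-relative point cloud frames into one
    cumulative 2-D heat map; `extent` (m) and `resolution` (m per cell)
    are assumed values within the ranges discussed in this document."""
    n = int(2 * extent / resolution)
    heat = np.zeros((n, n), dtype=np.int64)
    for pts in frames:  # pts: (N, 3) array of x, y, z in the vehicle frame
        ix = ((pts[:, 0] + extent) / resolution).astype(int)
        iy = ((pts[:, 1] + extent) / resolution).astype(int)
        ok = (ix >= 0) & (ix < n) & (iy >= 0) & (iy < n)
        np.add.at(heat, (ix[ok], iy[ok]), 1)  # cumulative counts per cell
    return heat
```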
[0029] In one embodiment, the method starts in a first step S1 by collecting LiDAR sensor data from a fleet of motor vehicles. For example, a vehicle fleet network may comprise a backend with a memory that stores data collected by the LiDAR sensor devices equipped on each motor vehicle. A user may obtain the data from the backend of the motor vehicle fleet network. For example, the backend for the motor vehicle fleet network may be an original equipment manufacturer (OEM) backend. Bounds may be set to create the dataset that will be analyzed by the electronic computing device 10, for example time frames or routes travelled. Next, the bounded LiDAR sensor dataset is input into the electronic computing device 10, which is shown as a second step S2.
[0030] In a third step S3, once the LiDAR point cloud dataset is input into the electronic computing device 10, the user may specify the topics to analyze by creating filters and boundaries 16. In an exemplary method of filtering the data, the user may first select which topics from the LiDAR sensor data file they would like to analyze. The user would then select a variable to be analyzed. The variable may be one of count, intensity, or height. Next, the user may select sub-methods, which may be one of average, log scaling, field of view, or weighted by velocity, to be included in the analysis. Further filters may then be included or excluded, such as intensity, ring, or regions determined by an X, Y, Z coordinate system relative to the motor vehicle. Users may then select boundaries for the heat map 12, which may be based on overall counts per point, allowing users to exclude ranges of their choosing, for example excluding counts over 300 from the displayed results. Lastly, users may select the resolution. For example, the resolution may be selected from a range of 10 cm to 5 m. According to an embodiment, filters may be saved in a memory for future use of the electronic computing device 10. In another embodiment, when the electronic computing device 10 routinely obtains LiDAR sensor data from the vehicle fleet network, the data may automatically be filtered by the filter options stored in the memory. In a fourth step S4, the data is parsed based on the selected filters and boundaries 16. Once the data is parsed, a fifth step S5 of the method outputs the heat map 12, which displays a cumulative collection of the LiDAR point cloud based on the selected filters and boundaries 16.
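The selections above might be captured in a small configuration object; the field names, defaults, and the count boundary below are illustrative assumptions, not the patented implementation.

```python
from dataclasses import dataclass
from typing import Optional

import numpy as np

@dataclass
class HeatMapFilters:
    """Illustrative container for the user selections described above."""
    variable: str = "count"         # one of: count, intensity, height
    sub_method: str = "average"     # average, log scaling, field of view, weighted by velocity
    rings: Optional[set] = None     # restrict analysis to specific rings, if set
    max_count: Optional[int] = 300  # boundary: exclude cells with counts over this value
    resolution_m: float = 0.5       # cell size, within the 10 cm to 5 m range given

def apply_count_boundary(heat: np.ndarray, filters: HeatMapFilters) -> np.ndarray:
    """Zero out heat map cells whose cumulative count exceeds the boundary."""
    if filters.max_count is None:
        return heat
    return np.where(heat > filters.max_count, 0, heat)
```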
[0031] In a sixth step S6, analysis of the heat map 12 occurs to identify regions of interest. First, an initial analysis may be completed which shows the field of view for each LiDAR sensor device on the motor vehicle. In this step, users can compare the theoretical field of view tFOV to the empirical field of view eFOV seen in the cumulative datasets. After the initial analysis, the datasets may be profiled to perform a bounded analysis. Profiling may occur based on the percentile of counts per object, or on a measure of total intensity for each point within the field of view of the LiDAR sensor devices measured. For example, the data may be split into a 95% to 100% band of counts and a 0% to 95% band of counts. Boundaries may then be set in each profiled dataset in relation to an X, Y, Z coordinate system relative to the motor vehicle. For example, boundaries may relate to the usable space of the motor vehicle in driving conditions, focusing on what is applicable in a localization scenario. Applicable space may be an area of 100 m in front of and behind the LiDAR sensor devices, as well as spacing to both sides of a road. Regions containing the motor vehicle may be excluded in the boundaries.
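A percentile-based profiling step of this kind might look as follows; the percentile band, the grid parameters, and the lateral margin `side` are assumptions for illustration.

```python
import numpy as np

def profile_by_percentile(heat, lo=95.0, hi=100.0):
    """Keep only cells whose counts fall within the [lo, hi] percentile band,
    e.g. the 95% to 100% split described above."""
    nonzero = heat[heat > 0]
    if nonzero.size == 0:
        return np.zeros_like(heat)
    low, high = np.percentile(nonzero, [lo, hi])
    return np.where((heat >= low) & (heat <= high), heat, 0)

def bound_to_usable_space(heat, extent=100.0, resolution=0.5,
                          ahead=100.0, behind=100.0, side=10.0):
    """Zero everything outside an assumed usable-space box: 100 m in front of
    and behind the sensor plus a lateral road margin (`side` is an assumption)."""
    n = heat.shape[0]
    x0 = max(int((extent - behind) / resolution), 0)
    x1 = min(int((extent + ahead) / resolution), n)
    y0 = max(int((extent - side) / resolution), 0)
    y1 = min(int((extent + side) / resolution), n)
    bounded = np.zeros_like(heat)
    bounded[x0:x1, y0:y1] = heat[x0:x1, y0:y1]
    return bounded
```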
[0032] In one embodiment, the profiling and boundary settings may be stored in the memory for future use by the electronic computing device 10. After initial analysis, profiling and bounded analysis may be automatically run based on the stored settings. In one embodiment, a region of interest may be the point cloud dataset of 95% to 100% counts.
[0033] In one embodiment, artifacts, or objects, may be identified through the profiled and bounded analysis. For example, man-made structures may become apparent in the heat map 12. In another example, edges of the road may be identified. In another example, if the motor vehicle is a tractor-trailer, the trailer may be detected due to trailer articulation during a turn. Identification of artifacts or objects in the analysis of the region of interest may assist a user in assessing the effectiveness of the LiDAR sensor device. In one embodiment, object detection methods may be used to automate the identification of artifacts based on the dataset.
[0034] In a seventh step S7, a verification of the empirical field of view eFOV occurs. First, the threshold 14 is set to determine whether the empirical field of view eFOV is verified to meet the theoretical field of view tFOV. For example, the threshold 14 may be set according to the size of the region of interest, for example the top 5 percentile profiled dataset. To elaborate: if the region of interest is small because point clustering occurs outside of the boundaries relative to the motor vehicle, or because portions of the motor vehicle are responsible for a large portion of the point clustering, and both of these areas are filtered out during the profiling step, the LiDAR sensor device may not be positioned to effectively detect objects in the area surrounding the motor vehicle.
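One way to express such a threshold 14 is as a minimum size of the region of interest relative to the theoretical field of view; the 0.5 fraction below is an assumed value, since this document only requires that some threshold be set.

```python
import numpy as np

def verify_by_region_size(region_of_interest, tfov_mask, min_fraction=0.5):
    """Verify the eFOV by checking that the region of interest (e.g. the top
    5 percentile cells after profiling and bounding) is not too small
    relative to the theoretical FOV; `min_fraction` is an assumed value."""
    roi_cells = np.count_nonzero(region_of_interest)
    tfov_cells = max(np.count_nonzero(tfov_mask), 1)
    fraction = roi_cells / tfov_cells
    # A small region of interest suggests clustering outside the boundaries
    # or on the vehicle itself, i.e. an ineffective mounting position.
    return fraction >= min_fraction, fraction
```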
[0035] In another embodiment, an additional verification step may occur by concurrently analyzing video data from cameras onboard the motor vehicle and the heat maps 12 produced by the electronic computing device 10. For example, video data from the network of motor vehicles may be obtained from the backend of the motor vehicle fleet network. In one embodiment, video data may be automatically obtained with time boundaries that match the time boundaries of the collected LiDAR point cloud data that is provided to the electronic computing device 10. Comparisons between the heat maps 12 and the video data may be made by a user to determine whether objects appearing in the video data are reflected in the LiDAR point cloud data. The threshold 14 may be generated based on this comparison of data to verify that the empirical field of view eFOV meets the theoretical field of view tFOV.
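Matching the video to the LiDAR collection window could be as simple as an overlap test on time bounds; the clip-index structure below is a hypothetical example, not a format given by this document.

```python
def select_matching_video(video_index, t_start, t_end):
    """Select video clips whose time bounds overlap the LiDAR collection
    window, so heat maps and footage cover the same period (illustrative).
    `video_index` is assumed to be a list of dicts with start/end timestamps."""
    return [clip for clip in video_index
            if clip["end"] >= t_start and clip["start"] <= t_end]
```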
[0036] In another embodiment, verification of the empirical field of view eFOV may be completed in the initial analysis step. The threshold 14 may be created using the theoretical field of view tFOV. If, when viewing the heat map 12 of the cumulative LiDAR point cloud dataset in relation to the entire motor vehicle, the empirical field of view eFOV does not match, or come within a determined range of, the theoretical field of view tFOV, the empirical field of view eFOV may not meet the threshold 14 for being verified to meet the theoretical field of view tFOV.
[0037] In an eighth step S8, if the threshold 14 for verifying the empirical field of view eFOV is met, the LiDAR sensor device may remain in the current mounting position on the motor vehicle or on the fleet of motor vehicles. If the threshold 14 for verifying the empirical field of view eFOV is not met, which is shown by a ninth step S9, the LiDAR sensor device may be mounted in a new location to optimize the empirical field of view eFOV. The placement of the LiDAR sensor device may be determined based on feedback gathered through the analysis and verification process. For example, the LiDAR sensor device may be placed in an area that would maximize the region of interest once data is collected and run through the electronic computing device 10 again. In another embodiment, if a comparison between the video data and the heat map 12 identifies a certain area relative to the motor vehicle where objects are present in the video data but not included in the region of interest of the LiDAR point cloud data, the LiDAR sensor device may be mounted in a location on the motor vehicle that optimizes the coverage of this area relative to the motor vehicle.
[0038] If the LiDAR sensor device placement is modified on the at least one motor vehicle in the motor vehicle network included in the LiDAR point cloud dataset, further collection of data based on the new LiDAR sensor device placement may be carried out, and the process can repeat, completing a new verification of the empirical field of view eFOV of the LiDAR sensor device. In other words, coming from the eighth step S8 or from the ninth step S9, the first step S1 may be performed again.
[0039] In particular, the invention shows in an embodiment a system for LiDAR sensor field of view analysis and a heat map 12 generation.
Reference Signs
10 electronic computing device
12 heat map
14 threshold
16 filters and boundaries
S1 first step
S2 second step
S3 third step
S4 fourth step
S5 fifth step
S6 sixth step
S7 seventh step
S8 eighth step
S9 ninth step
eFOV empirical field of view
tFOV theoretical field of view

Claims (6)

  1. A method for verifying a mounting position of a LiDAR sensor device on a motor vehicle by an electronic computing device (10), comprising the steps:
     - collecting a predetermined amount of data of a plurality of LiDAR sensor devices of a plurality of motor vehicles and transmitting the predetermined amount of data to the electronic computing device (10) (S1, S2);
     - generating a heat map (12) of an empirical field of view (eFOV) depending on the transmitted data by the electronic computing device (10) (S5);
     - comparing the heat map (12) with a theoretical field of view (tFOV) of the LiDAR sensor device of the motor vehicle (S6); and
     - verifying the mounting position of the LiDAR sensor device depending on a result of the comparison and a predetermined threshold (14) (S7).
  2. The method according to claim 1, characterized in that a selection of at least two filters and/or boundaries (16) is provided to a user by the electronic computing device (10) for generating the heat map (12) (S3).
  3. The method according to claim 2, characterized in that the comparison is performed by using the heat map (12) under the condition of the at least one filter and/or at least one boundary (S4).
  4. The method according to any one of claims 1 to 3, characterized in that, if the predetermined threshold (14) is not met during the comparison, a suggestion is generated by the electronic computing device (10) for placing the LiDAR sensor device in a new mounting position on the motor vehicle (S9).
  5. The method according to claim 4, characterized in that collecting a predetermined amount of new data of a plurality of LiDAR sensor devices in the new mounting position is performed and the new data is transmitted to the electronic computing device (10).
  6. An electronic computing device (10) for verifying a mounting position of a LiDAR sensor device on a motor vehicle, wherein the electronic computing device (10) is configured to perform a method according to any one of claims 1 to 5.
GB2012863.3A 2020-08-18 2020-08-18 A method for verifying a mounting position of a lidar sensor device on a motor vehicle by an electronic computing device Withdrawn GB2598110A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB2012863.3A GB2598110A (en) 2020-08-18 2020-08-18 A method for verifying a mounting position of a lidar sensor device on a motor vehicle by an electronic computing device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB2012863.3A GB2598110A (en) 2020-08-18 2020-08-18 A method for verifying a mounting position of a lidar sensor device on a motor vehicle by an electronic computing device

Publications (2)

Publication Number Publication Date
GB202012863D0 GB202012863D0 (en) 2020-09-30
GB2598110A 2022-02-23

Family

ID=72615405

Family Applications (1)

Application Number Title Priority Date Filing Date
GB2012863.3A Withdrawn GB2598110A (en) 2020-08-18 2020-08-18 A method for verifying a mounting position of a lidar sensor device on a motor vehicle by an electronic computing device

Country Status (1)

Country Link
GB (1) GB2598110A (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10788316B1 (en) * 2016-09-21 2020-09-29 Apple Inc. Multi-sensor real-time alignment and calibration
US20180180719A1 (en) * 2018-02-26 2018-06-28 GM Global Technology Operations LLC Extendable sensor mount
US20190293772A1 (en) * 2018-03-21 2019-09-26 Zoox, Inc. Automated detection of sensor miscalibration
CN108595771A (en) * 2018-03-28 2018-09-28 北京空间技术研制试验中心 Spacecraft equipment visual field analog analysing method
EP3629056A1 (en) * 2018-09-25 2020-04-01 Aptiv Technologies Limited Object detection sensor alignment
WO2020094544A1 (en) * 2018-11-08 2020-05-14 Valeo Schalter Und Sensoren Gmbh Method and measuring system for determining the value of the oscillation amplitude of a micro-oscillating mirror of an object detecting device

Also Published As

Publication number Publication date
GB202012863D0 (en) 2020-09-30


Legal Events

Date Code Title Description
732E Amendments to the register in respect of changes of name or changes affecting rights (sect. 32/1977)

Free format text: REGISTERED BETWEEN 20221201 AND 20221207

WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)