EP3635630A1 - Computerized device for driving assistance - Google Patents
Computerized device for driving assistance
- Publication number
- EP3635630A1 (application EP18748960.4A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- data
- node
- given
- point cloud
- height
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/0088—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots characterized by the autonomous decision making process, e.g. artificial intelligence, predefined behaviours
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/29—Graphical models, e.g. Bayesian networks
- G06F18/295—Markov models or related models, e.g. semi-Markov models; Markov random fields; Networks embedding Markov models
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/84—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using probabilistic graphical models from image or video features, e.g. Markov models or Bayesian networks
- G06V10/85—Markov-related models; Markov random fields
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/13—Satellite images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
Definitions
- the invention relates to the field of driving assistance.
- the field of driving assistance includes assistance to fully autonomous vehicles as well as assistance to assisted vehicles, with or without a driver. This field may also include other driving situations, such as environment management, whether on a road or in an industrial environment.
- two families of vehicles can be distinguished:
- perceptive autonomous vehicles, that is to say vehicles capable of finding their bearings in any environment; and
- assisted autonomous vehicles, that is to say vehicles that have no real local perception of their environment.
- these two families are technically very different and use completely different technologies. Beyond the technical differences, these families of vehicles can be used in very different contexts. For example, a perceptive autonomous vehicle is essential when there is little or no mapping of the environment of use, and/or when geolocalization in this environment is complex. This is the case in many applications, for example in the fields of agriculture or industrial robotics.
- the concept of autonomous vehicle will therefore be taken in a broad sense, covering cars as much as tractors, other agricultural machinery, or any robot capable of moving.
- the invention is more particularly concerned with the field of perceptive autonomous vehicles.
- this type of vehicle uses means to analyze its environment in order to identify obstacles. Historically, these obstacles were detected by planar laser telemetry.
- techniques have therefore been developed to determine, in a point cloud obtained by LIDAR or another technique that includes the ground in the data points, which portions correspond to the ground and which portions correspond to obstacles.
- the known techniques are either precise but not real-time (for example, current implementations using conditional Markov random fields, CRF), or compatible with real-time use but insufficiently precise (for example, fitting a plane to the ground).
- the invention improves the situation.
- the invention proposes a computing device for driving assistance, comprising a memory arranged to receive data point cloud data, in which a point cloud associates, for a given instant, points each presenting coordinates in a plane associated with the point cloud and a value designating a height.
- the device further comprises a computer arranged to access the memory and, for a given point cloud, to calculate, on the one hand, reference-surface membership probability data associated with each point of the given point cloud and, on the other hand, node data associating a value designating a height and two values designating a slope in a plane associated with the plane of the given point cloud, by determining a Gaussian conditional random field using the data point cloud data corresponding to the given point cloud, which Gaussian conditional random field is represented by a mesh of nodes in said associated plane, which nodes are defined by the node data, and to return the reference-surface membership probability data and/or at least some of the node data and values designating a height.
- this device is particularly interesting because it makes it possible to distinguish the ground from obstacles, as well as to know the respective height of the data points, at a computational cost compatible with real-time use in an autonomous vehicle.
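- The data handled by the device can be sketched in a minimal Python layout as follows (the class names `PointCloud` and `Node` are illustrative, not taken from the patent):

```python
import numpy as np

class PointCloud:
    """A timestamped cloud: each point has coordinates (x, y) in the plane
    associated with the cloud and a value z designating a height."""
    def __init__(self, t, xyz):
        self.t = t                                # time marker of the acquisition
        self.xyz = np.asarray(xyz, dtype=float)   # shape (n, 3): x, y, z

class Node:
    """A mesh node of the Gaussian conditional random field: a height h and
    two slopes (sx, sy) in the plane."""
    def __init__(self, x, y, h=0.0, sx=0.0, sy=0.0):
        self.x, self.y = x, y                     # fixed node coordinates in the plane
        self.state = np.array([h, sx, sy], dtype=float)

cloud = PointCloud(t=0, xyz=[[0.1, 0.2, 0.05], [1.0, 1.1, 0.9]])
node = Node(0.0, 0.0)
```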
- the device may have one or more of the following additional features:
- the computer is arranged to apply a Gaussian conditional random field comprising a first spatial component calculated, for a given node, from a value derived from the difference between the height of the given node and a height calculated from the coordinates of the given node, the slope data of the given node and the coordinates of the points closest to the given node, and a second spatial component calculated, for a given node, from a value derived from the difference between the node data of the given node and node data calculated from the coordinates of the given node, the node data of the nodes closest to the given node and the coordinates of the nodes closest to the given node,
- the computer is furthermore arranged to apply a Gaussian conditional random field comprising a temporal component calculated, for a given node and a given instant, from the node data of the given node and the node data of the given node at an instant preceding the given instant,
- the computer is arranged to apply an expectation-maximization algorithm to calculate the reference-surface membership probability data and the node data,
- the computer is arranged, after an initialization step, to apply the expectation-maximization algorithm by alternating an expectation calculation step and a maximization calculation step,
- the computer is arranged to execute the expectation calculation step by updating the reference-surface membership probability data of a given point of the given point cloud from a reference distance value derived from the difference between the value designating a height of the given point and a height calculated from the coordinates of the given point, the coordinates of the node closest to the given point and the node data of the node closest to the given point; the computer is arranged to update the reference-surface membership probability data of a given point of the given point cloud as a function of the sign of the reference distance value,
- the computer is arranged to execute the maximization calculation step by calculating an information node data vector and information matrix data from the first spatial component and the second spatial component, and
- the computer is further arranged to perform the maximization calculation step from the time component.
- the invention also relates to a computer-implemented method for driving assistance which comprises the following operations:
- the method may have one or more of the following additional features:
- the Gaussian conditional random field comprises a first spatial component calculated, for a given node, from a value derived from the difference between the height of the given node and a height calculated from the coordinates of the given node, the slope data of the given node and the coordinates of the points closest to the given node, and a second spatial component calculated, for a given node, from a value derived from the difference between the node data of the given node and node data calculated from the coordinates of the given node, the node data of the nodes closest to the given node and the coordinates of the nodes closest to the given node,
- the Gaussian conditional random field further comprises a temporal component calculated, for a given node and a given instant, from the node data of the given node and the node data of the given node at an instant preceding the given instant; the operation b) comprises applying an expectation-maximization algorithm to calculate the reference-surface membership probability data and the node data,
- operation b) comprises:
- steps b2) and b3) being repeated, on the basis of the results of the preceding steps, until a convergence condition is fulfilled,
- the operation b2) comprises updating the reference-surface membership probability data of a given point of the given point cloud from a reference distance value derived from the difference between the value designating a height of the given point and a height calculated from the coordinates of the given point, the coordinates of the node closest to the given point and the node data of the node closest to the given point,
- the operation b2) comprises updating the reference-surface membership probability data of a given point of the given point cloud as a function of the sign of the reference distance value,
- the operation b3) comprises computing an information node data vector and information matrix data from the first spatial component and the second spatial component, and
- FIG. 1 represents a schematic diagram of a device according to the invention and the data it processes,
- FIG. 2 represents an exemplary diagram of the data model used by the device of FIG. 1,
- FIG. 3 represents an example of an operating loop of the device of FIG. 1, and
- FIG. 4 represents an exemplary implementation of a function of FIG. 3.
- CRF: conditional random fields
- Lu et al. use supervised learning to classify obstacles using feature extraction, which is expensive and unsuitable for a navigation application with varied obstacles, such as autonomous driving.
- Wang et al. is specific to the segmentation of image sequences, which also makes it inapplicable to the field of autonomous vehicles.
- the Applicant has discovered a means of adapting CRFs to the context of autonomous vehicles in order to discriminate, in a cloud of points, those relating to the ground from those belonging to obstacles, and this in real time.
- real-time here means a device that has economically reasonable computing power, can be embedded in a vehicle, and is able to process the data points faster than the data point acquisition frequency.
- Figure 1 shows a schematic diagram of a device 2 according to the invention.
- the device 2 comprises a memory 4 and a computer 6.
- the memory 4 can be any type of data storage suitable for receiving digital data: hard disk, solid-state drive (SSD), flash memory in any form, RAM, magnetic disk, storage distributed locally or in the cloud, etc.
- the data calculated by the device can be stored on any type of memory similar to the memory 4, or on the memory 4 itself. This data can be erased after the device has completed its tasks, or stored.
- the memory 4 receives data point cloud data 8 at a frequency corresponding to the acquisition frequency of the sensor of the autonomous vehicle in which the device 2 is installed. Alternatively, the device 2 can be remote and communicate with the autonomous vehicle by any appropriate means.
- the computer 6 processes the point cloud data and returns discriminated data point cloud data 10, in which each data point is identified as belonging to the ground or to an obstacle.
- a height relative to the ground may be appended to each data point of the discriminated data point cloud.
- the data point cloud data 8, like the discriminated data point cloud data 10, is always coupled to a time marker, which makes it possible to determine when the data acquisition was performed.
- the computer 6 can also return a ground elevation map in the form of a grid of points, with the slope associated with each point of the grid.
- the computer 6 is an element directly or indirectly accessing the memory 4. It can be implemented in the form of appropriate computer code executed on one or more processors. By processor is meant any processor adapted to the calculations described below. Such a processor can be made in any known manner: as a microprocessor for a personal computer, a dedicated FPGA or SoC ("system on chip") chip, a computing resource on a grid, a microcontroller, or in any other form able to provide the computing power necessary for the embodiment described below. One or more of these elements can also be realized in the form of specialized electronic circuits such as an ASIC. A combination of processors and electronic circuits can also be considered.
- FIG. 2 represents an exemplary diagram of the data model used by the device of FIG. 1.
- STCRF: spatio-temporal conditional random field
- the ground is modeled as a conditional random field represented by a mesh of nodes in a two-dimensional plane centered on the vehicle.
- each node has its own coordinates in this plane and is associated with a hidden continuous random variable in the form of a triplet comprising a height, a slope along a first axis of the two-dimensional plane, and a slope along a second axis of the two-dimensional plane.
- the measurements contained in the data point cloud data 8 represent coordinate triplets (xj, yj, zj) that are expressed in the reference frame of the two-dimensional plane.
- the STCRF is based on three interdependent neighborhood groups:
- the real neighborhood, which includes the observations (for example a prior classification of certain points and/or nodes) and hidden variables (such as the height of points and/or nodes and a probability of belonging to the ground), the spatial neighborhood, formed by the nodes closest to a given node, and the temporal neighborhood, formed by the corresponding node in the immediately preceding cloud, both described below.
- the modeling via this STCRF makes it possible to determine, for each given point, the value of the probability of its belonging to the ground, as well as the height for the nodes of the mesh.
- the STCRF data is distributed over time planes 20 and 22.
- the time plane 20 represents the mesh for the particular data point cloud.
- the time plane 22 represents the mesh for the cloud of data points immediately prior to the particular data point cloud.
- the temporal neighborhood is thus formed, for a node 24 of the temporal plane 20, by the node 26 which corresponds to it in the temporal plane 22. The manner in which the displacement of the vehicle is taken into account to determine the node 26 will be explained below.
- Node 24 is surrounded by a set of data points 28.
- each data point 28 has coordinates (xj, yj, zj) which represent a measurement (here obtained by LIDAR).
- a binary random variable cj is associated with each measurement.
- the device 2 implements a calculation loop which makes it possible to evaluate the value of the variable cj. For the purposes of these calculations, each data point 28 is associated with the node Gi to which it is closest according to a metric.
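- As an illustration, assuming a regular square grid of pitch `cell` with nodes at integer multiples of the pitch (an assumption; the patent only requires a metric nearest-node rule), the association can be sketched as:

```python
import numpy as np

def nearest_node_index(points_xy, cell=1.0):
    """Return the integer (i, j) grid index of the node closest to each point,
    for nodes located at integer multiples of the grid pitch `cell`."""
    return np.rint(np.asarray(points_xy, dtype=float) / cell).astype(int)

# point (0.2, 0.3) falls in the cell of node (0, 0);
# point (1.6, -0.9) falls in the cell of node (2, -1)
idx = nearest_node_index([[0.2, 0.3], [1.6, -0.9]], cell=1.0)
```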
- Equation [10] relates to the real neighborhood
- equation [20] concerns the spatial neighborhood
- equation [30] concerns the temporal neighborhood.
- in equation [10], W1 is a normalization factor, α is a weighting factor, Mi is the set of data points associated with a given node Gi, and Hij is a matrix defined by formula [15] of Appendix A.
- the multiplication of the matrix Hij by Gi amounts to estimating the height of the node Gi at the data point of index j, taking into account the slopes sxi and syi of Gi and the distances between the node Gi and the data point of index j.
- equation [10] thus determines a value derived from the difference between the measured height of each data point and the height that the nearest node would have at that point, the variable cj modulating the term according to whether the data point is associated with the ground or not.
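- In code, this height prediction can be sketched with Hij reduced to the row vector (1, Δx, Δy), so that Hij·Gi = h + sx·Δx + sy·Δy (a reconstruction consistent with the text; formula [15] itself is not reproduced in this excerpt):

```python
import numpy as np

def h_row(node_xy, point_xy):
    """Row vector playing the role of Hij for one node/point pair."""
    dx = point_xy[0] - node_xy[0]
    dy = point_xy[1] - node_xy[1]
    return np.array([1.0, dx, dy])

G_i = np.array([0.5, 0.1, -0.2])        # node state: height 0.5, slopes sx=0.1, sy=-0.2
H_ij = h_row((0.0, 0.0), (1.0, 2.0))    # data point at (1, 2) in the plane
predicted = H_ij @ G_i                  # ground height predicted at the data point
residual = 0.35 - predicted             # measured z_j minus predicted ground height
```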
- the selection of the data points of the real neighborhood is performed by shifting the mesh grid and centering it on the nodes Gi.
- each grid square contains the data points closest to the node Gi in the center of the grid square considered.
- in equation [20], W2 is a normalization factor, β is a weighting factor, Vi denotes the set of neighboring nodes of a given node Gi, and Fij is a matrix defined by formula [25] of Appendix A.
- the application of the matrix Fij to Gj amounts to estimating the height of the node Gj at the location of the node of index i, taking into account the slopes sxj and syj of Gj, and the distances between the node Gj and the node Gi.
- equation [20] thus determines a value derived from the difference between the height of each node and the height that its neighboring nodes would predict at its location.
- the spatial neighbors are the 8 nodes closest to each given node.
- equation [20] could comprise a weighting factor for each neighboring node Gj, for example of the Gaussian kernel type.
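- A sketch of this spatial term, with the optional Gaussian kernel weighting (formula [25] is not reproduced here; the quadratic form of the residuals is an assumption consistent with the Gaussian model):

```python
import numpy as np

def spatial_energy(hi, node_xy, neighbours, sigma=None):
    """Sum of (optionally weighted) squared differences between the height hi
    of node Gi and the height each neighbour Gj predicts at Gi's location.
    neighbours: list of ((xj, yj), (h, sx, sy)) for the nodes around Gi."""
    xi, yi = node_xy
    total = 0.0
    for (xj, yj), (h, sx, sy) in neighbours:
        pred = h + sx * (xi - xj) + sy * (yi - yj)   # role of Fij · Gj
        w = 1.0
        if sigma is not None:                        # optional Gaussian kernel weight
            w = np.exp(-((xi - xj) ** 2 + (yi - yj) ** 2) / (2 * sigma ** 2))
        total += w * (hi - pred) ** 2
    return total

# a neighbour whose slope exactly explains the height difference contributes nothing
e = spatial_energy(0.0, (0.0, 0.0), [((1.0, 0.0), (0.1, 0.1, 0.0))])
```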
- in equation [30], W3 is a normalization factor, γ is a weighting factor, and Qi is a transition matrix between the state of the node Gi for the cloud of data points immediately preceding the current one and the state of the node Gi for the current data point cloud.
- here, the matrix Qi is chosen equal to the identity matrix, but it could be different, to introduce noise corresponding to a confidence interval on the measurements that made it possible to interpolate the node Gi^(t-1), or to account for a change in the height of the ground over time (due, for example, to waves in a maritime application).
- the determination of the node Gi^(t-1) is done by transforming the mesh of the data point cloud immediately preceding the current one according to the vehicle displacement and orientation-change data.
- this data can be derived from an inertial measurement unit, and/or GPS data, and/or data from an odometer, or from other methods using other sensors (visual odometry, laser SLAM, etc.).
- the value of Gi^(t-1) is then calculated by interpolation of the transformed mesh at the location of the node Gi.
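- This transform-then-interpolate step can be sketched as follows, assuming a planar (SE(2)) vehicle motion (dx, dy, dθ) and bilinear interpolation of the previous height field (both are assumptions; the patent only states that the previous mesh is transformed according to the vehicle motion and then interpolated):

```python
import numpy as np

def previous_height_at(node_xy, prev_grid, origin, cell, dx, dy, dtheta):
    """Bilinear interpolation of the previous height grid at the location of a
    current node, after undoing the vehicle's planar motion (dx, dy, dtheta).
    Returns None for areas not visited before (Gi^(t-1) undefined)."""
    c, s = np.cos(-dtheta), np.sin(-dtheta)
    x = c * (node_xy[0] - dx) - s * (node_xy[1] - dy)   # node position expressed
    y = s * (node_xy[0] - dx) + c * (node_xy[1] - dy)   # in the previous frame
    u = (x - origin[0]) / cell                          # fractional grid coordinates
    v = (y - origin[1]) / cell
    i0, j0 = int(np.floor(u)), int(np.floor(v))
    if not (0 <= i0 < prev_grid.shape[0] - 1 and 0 <= j0 < prev_grid.shape[1] - 1):
        return None                                     # area not covered previously
    fu, fv = u - i0, v - j0
    g = prev_grid
    return ((1 - fu) * (1 - fv) * g[i0, j0] + fu * (1 - fv) * g[i0 + 1, j0]
            + (1 - fu) * fv * g[i0, j0 + 1] + fu * fv * g[i0 + 1, j0 + 1])

prev = np.array([[0.0, 1.0], [2.0, 3.0]])
h = previous_height_at((0.5, 0.5), prev, origin=(0.0, 0.0), cell=1.0,
                       dx=0.0, dy=0.0, dtheta=0.0)
```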
- for nodes for which no information was available, that is to say nodes corresponding to an area not yet visited by the vehicle, Gi^(t-1) is marked as undefined, so that the pair (Gi, Gi^(t-1)) is not taken into account, while areas that disappear as a result of the transformation may be discarded or stored in a long-term map.
- the temporal neighborhood might also include a node from an even older cloud.
- however, the use of a single temporal neighbor is enough to force coherence for the nodes in an area that has become blind, whether because of the appearance of an obstacle such as another vehicle, or because of the displacement of the autonomous vehicle itself.
- alternatively, the temporal neighborhood and equation [30] could be omitted, at the cost of losing temporal cohesion and of suffering dynamic shadow zones as a function of the obstacles encountered.
- the invention is based on the transformation of an a priori mesh into a mesh that maximizes the posterior probability according to formula [40] of Appendix A.
- for this, each node is assigned a Gaussian model for the state of the ground, each node Gi being represented by a mean vector Gmi and a covariance matrix. From the mean vector Gmi and the covariance matrix, the information matrix Pi and the information vector Xi are determined.
- an iterative expectation-maximization algorithm is executed to determine the matrix Pi and the vector Xi, in which the expectation step estimates the probability distribution of the point classification data ci, and the maximization step uses this distribution to estimate the state distribution of the ground.
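- The alternation can be sketched as follows. The update rules below are deliberately simple placeholders (formulas [60] to [90] are not reproduced in this excerpt); only the E-step/M-step structure and the convergence test reflect the text:

```python
import numpy as np

def em_ground_height(z, h0, n_iter=10, tol=1e-6):
    """z: (n, 3) array of data points; h0: prior ground height for one node.
    Returns the estimated ground height and the per-point ground probabilities."""
    h = h0
    c = np.ones(len(z))
    for _ in range(n_iter):
        # E-step (placeholder): points far above the surface get a low probability
        resid = z[:, 2] - h
        c = np.exp(-resid ** 2)
        # M-step (placeholder): c-weighted mean of the measured heights
        h_new = float(np.sum(c * z[:, 2]) / np.sum(c))
        if abs(h_new - h) < tol:        # convergence condition
            h = h_new
            break
        h = h_new
    return h, c

pts = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.02], [0.0, 1.0, 5.0]])  # last point: obstacle
h, c = em_ground_height(pts, h0=0.0)
```

The obstacle point ends up with a near-zero probability and barely influences the estimated ground height.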
- Figure 3 shows an example of a function performed by the computer 6 to implement these calculations.
- first, the computer 6 executes an Init() function.
- the starting grid is flat.
- this function therefore initializes the vectors Gmi of G(0) with a zero height and zero slopes. It also initializes X(0) to 0.
- as a variant, if an a priori mesh of the terrain were available, the function Init() would initialize the vectors Gmi according to this other mesh. This initialization is also performed each time a new node (that is, a node at a location that has not been processed in any of the previous loops) is defined for a cloud of data points, as the vehicle moves.
- the covariance matrix is initialized with large diagonal coefficients, so that these initialization values do not influence the final result of the algorithm but make it possible to accelerate its convergence.
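- In information form, this weak prior can be sketched as follows (the factor 1e6 is an arbitrary illustrative value, not taken from the patent):

```python
import numpy as np

def init_node():
    """Flat prior for one node: zero mean state and a covariance with large
    diagonal coefficients, i.e. a near-zero information matrix."""
    g_mean = np.zeros(3)                 # (h, sx, sy) = (0, 0, 0): flat ground
    cov = np.eye(3) * 1e6                # large diagonal: very low confidence
    info = np.linalg.inv(cov)            # information matrix (inverse covariance)
    x = info @ g_mean                    # information vector
    return g_mean, cov, info, x

g_mean, cov, info, x = init_node()
```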
- a loop is then executed until the operation of the autonomous vehicle is stopped, with, in an operation 310, the incrementation of an index t; in an operation 320, the execution of a function Rec() which receives the current data point cloud and stores the corresponding measurements in a vector Z; and, in an operation 330, the execution of a function EM() which determines the vector G(t), receiving as arguments the vector Z and the vector G(t-1) of the previous loop.
- Figure 4 shows an example of implementation of the function EM ().
- the function EM() is initialized by setting an index i to 0 and an index k to 1, by determining P(t-1) and X(t-1) from G(t-1), and by defining the current loop index t.
- the first loop traverses, for a given index k, all the nodes of index i and updates the probability vector C[], the vector X[i] and the matrix P[i].
- the vector C [] therefore contains, for a given index k, the probabilities of all the nodes of index i of the current loop of index t.
- the second loop consists in repeating the first loop with the updated nodes, thus varying the index k, until a convergence condition is fulfilled.
- the operations of the first loop could be performed in parallel.
- in particular, the first loop associated with each index i could be parallelized, for example executed wholly or partly in parallel on a graphics processor, whose qualities for such applications are well known.
- the probability C is updated in an operation 410 by a function Est() which receives as arguments the vector Z and the vector X[i].
- the function Est() selects the set of data points of the vector Z which are associated with the node of index i, and evaluates for each of them a probability cj which it stores in the vector C[k].
- the probability cj is calculated according to formula [60] of Appendix A.
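- Since formula [60] is not reproduced in this excerpt, the following is only a plausible stand-in illustrating the stated dependence on the sign of the reference distance d (points at or below the interpolated surface are treated as ground, points above it less and less so); the noise scale `sigma` is an assumption:

```python
import math

def ground_probability(d, sigma=0.1):
    """Illustrative cj update: d is the reference distance (measured height
    minus interpolated ground height); sigma is an assumed noise scale."""
    if d <= 0.0:
        return 1.0                          # at or below the surface: ground
    return math.exp(-(d / sigma) ** 2)      # decays quickly above the surface

p_near = ground_probability(0.02)   # 2 cm above the surface: still likely ground
p_far = ground_probability(1.0)     # 1 m above the surface: almost surely an obstacle
```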
- a function XMax() receives as arguments the vector C[], the vector X[i](t-1) and the vector X[i](k-1, t) of the previous iteration, as well as the vector Z, and updates the vector X[i](k, t) of the current iteration by applying formula [70] of Appendix A.
- formula [70] of Appendix A amounts to maximizing formulas [10], [20] and [30] for the vector X[i], the exponential formulation of the latter allowing a linear formula in the presence of the Gaussian model.
- a function PMax() receives as arguments the vector C[], the matrix P[i](t-1) and the matrix P[i](k-1, t) of the previous iteration, as well as the vector Z, and updates the matrix P[i](k, t) of the current iteration by applying formula [80] of Appendix A.
- formula [80] of Appendix A amounts to maximizing formulas [10], [20] and [30] for the matrix P[i], the exponential formulation of the latter allowing a linear formula in the presence of the Gaussian model.
- in the example described here, the weighting factor α was set to 1, the weighting factor β to 0.5, and the weighting factor γ to 0.2.
- the index i is then tested in an operation 440 to determine whether all the nodes have been updated for the current index k. If this is not the case, the index i is incremented in an operation 450 and operations 410 to 440 are repeated.
- otherwise, the estimate for the current iteration of each vector Gmi, stored in a table G(k, t), is computed in an operation 460 which receives as arguments the current vector X(k, t) and the current matrix P(k, t). This is done by applying formula [90] of Appendix A, which corresponds to transposing formula [50] of Appendix A to the current iteration.
- a function Conv () is executed in an operation 470.
- the function Conv() determines whether the function EM() has converged or whether another iteration is necessary. This determination can be based on the number of iterations, the Applicant having found that, for the parameters described above, 10 iterations always suffice for convergence, or on a specific convergence condition, for example the comparison between G(k, t) and G(k-1, t).
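- A sketch of such a test, combining the fixed iteration budget with a comparison of successive meshes (the tolerance value is illustrative):

```python
import numpy as np

def converged(g_k, g_k_minus_1, k, max_iter=10, tol=1e-4):
    """True when the iteration budget is exhausted or when the mesh G(k, t)
    no longer differs significantly from G(k-1, t)."""
    if k >= max_iter:
        return True
    return bool(np.max(np.abs(np.asarray(g_k) - np.asarray(g_k_minus_1))) < tol)
```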
- if convergence has not been reached, the index k is incremented in an operation 480, the index i is reset in an operation 490, and the loop resumes with the operation 410. Otherwise, the function EM() ends by returning the array G(k, t) in an operation 499.
- the example described here was validated with data point clouds obtained by LIDAR, using a Velodyne HDL64 LIDAR mounted on a Renault Zoe, as well as with 3 Ibeo Lux LIDARs.
- other point cloud sources could be used, such as stereo cameras, Kinect-type depth cameras, or the like.
- the real neighborhood could also be reduced by taking into account the distance to the vehicle. Indeed, by nature, the LIDAR returns many points near the vehicle, then fewer and fewer as the firing angle increases. As a variant, the number of neighbors could be kept, but used to accelerate the convergence by forcing the coherence of the mesh nodes with them. To accelerate the convergence, the points very close to the vehicle could also have their height forced to that at the vehicle.
- the grid described here has square cells, but it could be adapted to the data point cloud capture technology, for example being a polar grid with circular cells.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
FR1755150A FR3067474B1 (en) | 2017-06-09 | 2017-06-09 | COMPUTER DEVICE FOR DRIVING ASSISTANCE |
PCT/FR2018/051299 WO2018224768A1 (en) | 2017-06-09 | 2018-06-06 | Computerized device for driving assistance |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3635630A1 true EP3635630A1 (en) | 2020-04-15 |
Family
ID=59325548
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP18748960.4A Pending EP3635630A1 (en) | 2017-06-09 | 2018-06-06 | Computerized device for driving assistance |
Country Status (4)
Country | Link |
---|---|
US (1) | US11574480B2 (en) |
EP (1) | EP3635630A1 (en) |
FR (1) | FR3067474B1 (en) |
WO (1) | WO2018224768A1 (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11933630B2 (en) * | 2017-06-26 | 2024-03-19 | Volvo Truck Corporation | Control arrangement for a vehicle |
US20190005667A1 (en) * | 2017-07-24 | 2019-01-03 | Muhammad Zain Khawaja | Ground Surface Estimation |
CN110443786B (en) * | 2019-07-25 | 2021-12-07 | 深圳一清创新科技有限公司 | Laser radar point cloud filtering method and device, computer equipment and storage medium |
CN112907739B (en) * | 2021-01-22 | 2022-10-04 | 中北大学 | Method, device and system for acquiring height difference information of well lid |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2008034465A1 (en) * | 2006-09-19 | 2008-03-27 | Telecom Italia S.P.A. | Method of deriving digital terrain models from digital surface models |
US8215252B1 (en) * | 2009-07-14 | 2012-07-10 | Lockheed Martin Corporation | System and method for dynamic stabilization and navigation in high sea states |
US8605998B2 (en) * | 2011-05-06 | 2013-12-10 | Toyota Motor Engineering & Manufacturing North America, Inc. | Real-time 3D point cloud obstacle discriminator apparatus and associated methodology for training a classifier via bootstrapping |
US9188783B2 (en) * | 2011-09-09 | 2015-11-17 | Disney Enterprises, Inc. | Reflective and refractive surfaces configured to project desired caustic pattern |
CA2834877A1 (en) * | 2012-11-28 | 2014-05-28 | Henry Leung | System and method for event monitoring and detection |
CN104899854B (en) * | 2014-03-05 | 2018-01-16 | 航天信息股份有限公司 | The detection method and device of heap grain altitude line |
US10133944B2 (en) * | 2016-12-21 | 2018-11-20 | Volkswagen Ag | Digital neuromorphic (NM) sensor array, detector, engine and methodologies |
US10229341B2 (en) * | 2016-12-21 | 2019-03-12 | Volkswagen Ag | Vector engine and methodologies using digital neuromorphic (NM) data |
JP6565967B2 (en) * | 2017-05-12 | 2019-08-28 | トヨタ自動車株式会社 | Road obstacle detection device, method, and program |
- 2017
  - 2017-06-09: FR application FR1755150A (publication FR3067474B1), status: active
- 2018
  - 2018-06-06: US application US16/620,123 (publication US11574480B2), status: active
  - 2018-06-06: WO application PCT/FR2018/051299 (publication WO2018224768A1), application filing
  - 2018-06-06: EP application EP18748960.4 (publication EP3635630A1), status: pending
Also Published As
Publication number | Publication date |
---|---|
US11574480B2 (en) | 2023-02-07 |
WO2018224768A1 (en) | 2018-12-13 |
US20200104606A1 (en) | 2020-04-02 |
WO2018224768A4 (en) | 2019-01-31 |
FR3067474A1 (en) | 2018-12-14 |
FR3067474B1 (en) | 2021-11-19 |
Similar Documents
Publication | Title |
---|---|
EP3635630A1 (en) | Computerized device for driving assistance |
US20190005068A1 (en) | Information processing device, map update method, program, and information processing system |
EP3138079B1 (en) | Method of tracking shape in a scene observed by an asynchronous light sensor |
EP2572319B1 (en) | Method and system for fusing data arising from image sensors and from motion or position sensors |
EP2724203B1 (en) | Generation of map data |
EP3614306B1 (en) | Method for facial localisation and identification and pose determination, from a three-dimensional view |
Shinzato et al. | Road estimation with sparse 3D points from stereo data |
FR3005187A1 (en) | SAR image registration by mutual information |
US11656365B2 (en) | Geolocation with aerial and satellite photography |
EP2517152B1 (en) | Method of object classification in an image observation system |
EP4002274A1 (en) | Iterative method for estimating the movement of a material body by generating a filtered movement grid |
EP3384462B1 (en) | Method for characterising a scene by calculating the 3D orientation |
CA2709180C (en) | Methods for updating and training a self-organizing map |
CN116403191A (en) | Three-dimensional vehicle tracking method and device based on monocular vision, and electronic equipment |
Zhang et al. | Feature regions segmentation based RGB-D visual odometry in dynamic environment |
Wang et al. | Simultaneous clustering classification and tracking on point clouds using Bayesian filter |
Seo et al. | Segment-based free space estimation using plane normal vector in disparity space |
Deledalle et al. | Glacier monitoring: correlation versus texture tracking |
Bradski et al. | Robot-vision signal processing primitives [applications corner] |
FR3135789A1 (en) | Device for detecting three-dimensional objects in a physical environment |
EP3757942A1 (en) | Method and device for passive telemetry by image processing |
Akay et al. | Camera auto-calibration for planar aerial imagery, supported by camera metadata |
FR3138944A1 (en) | Device and method for estimating the traversability of terrain by a mobile system |
EP3757943A1 (en) | Method and device for passive telemetry by image processing and use of three-dimensional models |
EP4168828A1 (en) | Method and device for controlling the movement of a vehicle |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| STAA | Information on the status of an EP patent application or granted EP patent | STATUS: UNKNOWN |
| STAA | Information on the status of an EP patent application or granted EP patent | STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
| PUAI | Public reference made under article 153(3) EPC to a published international application that has entered the European phase | ORIGINAL CODE: 0009012 |
| STAA | Information on the status of an EP patent application or granted EP patent | STATUS: REQUEST FOR EXAMINATION WAS MADE |
| 17P | Request for examination filed | Effective date: 20191209 |
| AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
| AX | Request for extension of the European patent | Extension state: BA ME |
| DAV | Request for validation of the European patent (deleted) | |
| DAX | Request for extension of the European patent (deleted) | |
| STAA | Information on the status of an EP patent application or granted EP patent | STATUS: EXAMINATION IS IN PROGRESS |
| 17Q | First examination report despatched | Effective date: 20210820 |
| P01 | Opt-out of the competence of the Unified Patent Court (UPC) registered | Effective date: 20230527 |
| STAA | Information on the status of an EP patent application or granted EP patent | STATUS: THE APPLICATION HAS BEEN REFUSED |