WO2019068699A1 - Method for classifying an object point as static or dynamic, driver assistance system, and motor vehicle
- Publication number: WO2019068699A1 (PCT/EP2018/076769)
- Authority: WIPO (PCT)
- Prior art keywords: point, vector, motor vehicle, time, dynamic
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
Abstract
The invention relates to a driver assistance system (2), a motor vehicle (1), and a method for classifying at least one object point (O) of an object in an environment (6) of a motor vehicle (1) as static or dynamic, wherein an image sequence of images of at least one part of the environment (6) of the motor vehicle (1) with the object point (O) is captured by means of at least one imaging sensor device (3) of the motor vehicle (1). Moreover, at least for the case that an ego motion of the motor vehicle (1) relative to the environment (6) fulfils a predetermined criterion, an epipolar plane (7) with regard to a first image of the image sequence captured at a first point in time and a second image of the image sequence captured at a second point in time is determined, wherein a first vector (v1) pointing from a position (8a) of the sensor device (3) at the second point in time in the direction of the object point (O) is determined on the basis of the second image, and the at least one object point (O) is classified as static or dynamic in dependence on a relative position of the first vector (v1) relative to the epipolar plane (7).
Description
Method for classifying an object point as static or dynamic, driver assistance system, and motor vehicle
The invention relates to a method for classifying at least one object point of an object in an environment of a motor vehicle as static or dynamic, wherein an image sequence of images of at least one part of the environment of the motor vehicle with the object point is captured by means of at least one imaging sensor device of the motor vehicle. The expression "dynamic" in this context is preferably to be understood as being in motion relative to a predefined coordinate system, such as a road. The invention moreover relates to a driver assistance system for a motor vehicle as well as a motor vehicle with a driver assistance system.
From the prior art, methods for classifying objects or object points as static or dynamic are known which use the optical flow or flow vectors for this purpose. If, for instance, a camera moves towards a certain target, static objects change their position in the sequentially captured images. This change of position can be described by a corresponding flow vector. If only static objects are present in these sequentially captured images, the extended movement curves intersect in a joint focus of expansion representing the target towards which the camera is moving. Movement curves of objects or object points which do not extend through this focus of expansion can correspondingly be classified as belonging to dynamic objects or object points.
Such methods, however, have so far been very sensitive to noise and, above all, are at times computationally expensive and thus require a lot of time. In particular, the calculation of such a focus of expansion requires a lot of effort.
However, precisely in motor vehicle applications the available computation capacities are limited. Moreover, methods that take too much time for a correct classification of objects are not acceptable in applications in the field of driver-supporting driver assistance systems or in the field of autonomous driving functions, due to the safety risk resulting from such temporal delays.
Further known is the so-called epipolar geometry. It describes the geometric relations between various camera images of the same object. The epipolar geometry is mainly employed for obtaining 3D information from images as well as for the correspondence analysis of corresponding image points of two different captured images.
For instance, US 6,473,536 B1 describes an image synthesis method in which both the optical flow and the epipolar geometry are employed. An option for differentiating between static and dynamic objects or object points, however, is not described in this connection.
Moreover, US 8,259,998 B2 describes an image processing method in which the movement of a camera is to be determined on the basis of captured images. For this purpose, moving object portions are removed from the images in order not to corrupt the camera movement to be calculated. How exactly static and dynamic image regions can be differentiated, however, is not described.
Also WO 2015/134832 A1 uses the optical flow as well as epipolar geometry, however, again merely for the localization of a moving object and not for differentiating between static and moving objects.
It is therefore the task of the present invention to provide a method for classifying an object point of an object in an environment of a motor vehicle as static or dynamic, as well as a driver assistance system and a motor vehicle, which facilitate a classification that involves as little computational effort as possible whilst still being as reliable as possible.
This task is solved by a method for classifying at least one object point of an object, by a driver assistance system, and by a motor vehicle having the features according to the respective independent claims. Advantageous embodiments of the invention are subject matter of the dependent patent claims, the description, and the figures.
According to the invention, in a method for classifying at least one object point of an object in an environment of a motor vehicle as static or dynamic, an image sequence of images of at least one part of the environment of the motor vehicle with the object point is captured by means of at least one imaging sensor device of the motor vehicle. Moreover, at least for the case that an ego motion of the motor vehicle with regard to the environment fulfils at least one predetermined criterion, an epipolar plane with respect to a first image of the image sequence taken at a first point in time and a second image of the image sequence taken at a second point in time is determined. Further, a first vector pointing from a position of the sensor device at the second point in time in the direction of the object point is determined, and the at least one object point is classified as static or dynamic in dependence on a relative position of the first vector with reference to the epipolar plane.
The epipolar plane in this connection typically comprises the position of the sensor device at the first and the second point in time as well as the position of the object point at the first point in time. The first point in time preferably represents an earlier point in time than the second point in time. For instance, the first point in time and the second point in time can be understood as two consecutive time steps in which respective images of the image sequence are captured. In the case that the object point is a static object point, the position of the object point at the second point in time accordingly also lies in the epipolar plane, and thus, at least theoretically, so does the first vector pointing at the second point in time from the position of the sensor device to the object point. If this first vector, or more precisely its end point, deviates from the epipolar plane, however, the object point is a dynamic object point, the position of which has changed at the second point in time in comparison with its position at the first point in time. This insight can now be used advantageously in order to provide a particularly simple and computationally inexpensive classification of an object point as dynamic or static. According to the invention, a particularly reliable classification in dependence on the relative position of the first vector with regard to the epipolar plane can thus advantageously be performed, wherein both the relative position of the first vector and the epipolar plane itself can be provided by very few computing operations. Moreover, the method according to the invention allows for the classification of object points by means of a single imaging sensor device, such as a single camera, so that the technical effort and thus the costs can also be kept to a minimum.
Although the configuration of the imaging sensor device as a camera is preferred, the described method can still be realized analogously with any other imaging sensor device.
In an advantageous embodiment of the invention it is checked whether the first vector lies within a predetermined first tolerance range in the epipolar plane; if the check reveals that the first vector does not lie within the predetermined first tolerance range in the epipolar plane, the at least one object point is classified as dynamic, and in particular otherwise as static.
Since both the determination of the epipolar plane and the determination of the first vector are based on measurements, it may happen, even in the case that the first object point is a static object point, that the first vector still deviates somewhat from the epipolar plane. Thus it is particularly advantageous to provide a first tolerance range by which such measurement inaccuracies can be taken into consideration.
In particular, for checking whether the first vector lies within the predetermined first tolerance range in the epipolar plane, an absolute value of a portion of the first vector extending in parallel to a normal vector of the epipolar plane is calculated and compared to a predetermined threshold value defining the predetermined first tolerance range.
Hereby it can be checked in a simple way whether the first vector is coplanar with the epipolar plane. For instance, the scalar product of the first vector with the normal vector of the epipolar plane can simply be formed for this purpose. If both the first vector and the normal vector are normalized to the length 1, the absolute value of the scalar product lies in the interval between 0 and 1 inclusive. If the first vector lies in the epipolar plane and thus extends perpendicularly to the normal vector, the scalar product amounts to 0. If the first vector extended perpendicularly to the epipolar plane and thus were parallel to the normal vector, the absolute value of the scalar product would be 1. If the first vector encloses an angle with the normal vector, the absolute value of the scalar product equals, by definition, the cosine of that angle. Forming the scalar product of the first vector with the normal vector of the epipolar plane thus presents a particularly advantageous option for determining the portion of the first vector extending in parallel to the normal vector. If the absolute value of the scalar product yields a value smaller than the predetermined threshold value, the first vector can be regarded as extending in the epipolar plane, and the object point can be classified as static. If the absolute value of the scalar product yields a value larger than the predetermined threshold value, the object point is classified as dynamic.
In a further advantageous embodiment of the invention, the ego motion of the motor vehicle fulfils the at least one predetermined criterion if a speed of the motor vehicle is larger than a predetermined threshold value. This threshold value is dimensioned to be very small and characterizes a standstill of the motor vehicle, a quasi-standstill, or a very slow motion of the motor vehicle. It can for instance lie in a range of 0.1 km/h to 0.9 km/h, such as for instance 0.6 km/h. If the motor vehicle is at a standstill, the position of the sensor device at the first point in time coincides with its position at the second point in time, so that no epipolar plane can be determined; rather, the epipolar plane in this case is reduced to a line. By this described criterion with regard to the ego motion of the motor vehicle it can thus advantageously be ensured that the epipolar plane is determined and that the relative position of the first vector with regard to this plane can thus also be determined.
In a further advantageous embodiment of the invention, the ego motion of the motor vehicle fulfils the at least one predetermined criterion if the motor vehicle moves relative to the at least one object point in such a way that the position of the sensor device at the second point in time, with regard to a predetermined second tolerance range, is not positioned on a first line which is determined by the position of the sensor device at the first point in time and a position of the object point at the first point in time. As already described, the epipolar plane can typically be determined by the position of the sensor device at the first point in time, its position at the second point in time, and the position of the object point at the first point in time. If these three points, however, lie on one line, here, too, no epipolar plane can be determined. By this predetermined criterion, too, it can advantageously be ensured that an epipolar plane can be determined. Preferably, both of these named criteria can be applied with reference to the ego motion of the motor vehicle, so that both the first-named criterion and the second-named criterion are applied to the ego motion.
But even in the case that the ego motion of the motor vehicle does not fulfil the named criteria, a classification of the object point as static or dynamic is nevertheless possible in a simple way, as will be described in more detail in the following. For this purpose it is advantageous to determine further vectors by which this is facilitated.
Therefore, according to a further advantageous embodiment of the invention it is envisaged that a second vector is determined, which extends collinearly to a second line, which extends through the position of the sensor device at the first point in time and the position of the sensor device at the second point in time.
According to a further advantageous embodiment of the invention a third vector is determined, which is collinear to a third line, which extends through the position of the sensor device at the second point in time as well as through a point, which is on a first line, which extends through the position of the sensor device at the first point in time and the position of the object point at the first point in time.
The second and the third vector thus are defined in such a way that they lie in the epipolar plane and, in particular, span it. Thus, these two vectors, i.e. the second vector and the third vector, can also advantageously be used for geometrically defining the epipolar plane. Moreover, both the second vector and the third vector can be established in a simple way from the images captured at the first and the second point in time as well as from the known positions of the sensor device at the first and the second point in time. A further great advantage of the determination of this second and this third vector moreover consists in that the above-described normal vector of the epipolar plane can be determined in a simple way as the cross product of the second and the third vector.
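The check described above reduces to a few lines of vector algebra. The following is a minimal sketch in Python with NumPy; the function name, the example threshold, and the assumption that all three vectors are given in a common world coordinate system are illustrative choices, not specifications from the patent.

```python
import numpy as np

def is_coplanar(v1, v2, v3, threshold=0.05):
    """Check whether v1 lies in the plane spanned by v2 and v3,
    within the first tolerance range expressed by `threshold`."""
    n = np.cross(v2, v3)              # normal vector of the epipolar plane
    n = n / np.linalg.norm(n)         # normalize the normal vector to length 1
    v1n = v1 / np.linalg.norm(v1)     # normalize the first vector to length 1
    # |scalar product| is the portion of v1 parallel to n, lying in [0, 1]
    return abs(np.dot(n, v1n)) < threshold
```

An object point whose first vector passes this test would be classified as static, otherwise as dynamic.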
The second and/or the third vector can now advantageously also be used to classify the object point as static or dynamic even if the ego motion of the motor vehicle does not fulfil the at least one predetermined criterion. In the above-described situations, in which the motor vehicle is for instance at a standstill or the positions of the sensor device at the first and the second point in time as well as the position of the object point at the first point in time lie on one line, the second vector and the third vector do not span a plane but are collinear.
Therefore it represents a particularly advantageous embodiment of the invention that, in case the ego motion of the motor vehicle does not fulfil the at least one predetermined criterion, it is checked whether the first vector is collinear with the second and/or the third vector within a predetermined second tolerance range; in particular, if this is not the case, the object point is classified as dynamic and preferably, if this is the case, it is classified as static. In the case that the second and the third vector are collinear and the object point moreover is a static object point, the first vector is also collinear with the second as well as the third vector. Only if the object point moves along the line through the position of the image sensor at the first point in time and the object point at the first point in time would the first vector be collinear with the second and third vector as well, and in this case the object point would incorrectly be classified as static. However, as this is a very exceptional event, the above-described classification method still is very reliable. With regard to which of these two vectors the collinearity is checked is irrelevant in this connection. Preferably, however, the collinearity of the first vector with reference to the third vector is checked, since a second vector cannot always be determined, such as for instance at a standstill of the motor vehicle, where the positions of the sensor device at the first and the second point in time coincide, so that no vector can be defined by these two coinciding positions.
Since moreover the determination of the first, the second, and/or the third vector is based on measurements, it is again advantageous to envisage a predetermined second tolerance range in this check, by which unavoidable measurement inaccuracies can be taken into consideration. The check for collinearity can again be effected in a particularly simple way. For instance, the scalar product between the first vector and the third vector can be formed. Without loss of generality, it is again assumed here for simplification that the first vector and the third vector are normalized vectors, i.e. vectors of length 1. If the first and the third vector are collinear, the absolute value of the scalar product is 1; if they are for instance perpendicular to each other, the scalar product is 0. Here, too, a corresponding second threshold value can be envisaged which takes account of the predetermined tolerance range, wherein, in the case that the absolute value of the scalar product is larger than this predetermined threshold value, the first vector is considered to be collinear with the third vector, and in the case that the absolute value of the scalar product is smaller than this predetermined threshold value, the first vector is not considered to be collinear with the third vector. Instead of the scalar product, for checking the collinearity also the cross product between the first and the third vector can be formed and its absolute value taken. The absolute value of the cross product amounts to 1 in the case that the first and the third vector are perpendicular to each other, whilst for the other extreme case, that the first and the third vector are parallel to each other, the absolute value of the cross product amounts to 0.
Correspondingly, here, too, a corresponding threshold value can be envisaged in order to meet the described predetermined tolerance range for the collinearity check.
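Both variants of the collinearity check can be sketched as follows; names and thresholds are again illustrative (for normalized vectors, a dot-product threshold of 0.99 corresponds roughly to a cross-product threshold of 0.14, since cos θ > 0.99 implies sin θ < 0.15).

```python
import numpy as np

def is_collinear(v1, v3, dot_threshold=0.99):
    v1n = v1 / np.linalg.norm(v1)
    v3n = v3 / np.linalg.norm(v3)
    # For normalized vectors, |dot| is 1 if collinear and 0 if perpendicular.
    return abs(np.dot(v1n, v3n)) > dot_threshold

def is_collinear_cross(v1, v3, cross_threshold=0.14):
    v1n = v1 / np.linalg.norm(v1)
    v3n = v3 / np.linalg.norm(v3)
    # For normalized vectors, |cross| is 0 if collinear and 1 if perpendicular.
    return np.linalg.norm(np.cross(v1n, v3n)) < cross_threshold
```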
Thus, also for the case that the ego motion of the motor vehicle does not fulfil the at least one predetermined criterion, a particularly simple and computationally inexpensive option is provided which allows for a reliable classification of the object point as static or dynamic in most cases, especially if the object point does not move along the line through the position of the image sensor at the first point in time and the object point at the first point in time.
In order to check the fulfilment of the above criteria for the ego motion of the motor vehicle, for instance the speed of the motor vehicle can be captured and compared with the described threshold value. For checking the second criterion, a focus of expansion can be determined and the position of the object point compared with it. If the object point position lies in the focus of expansion or in its vicinity, the criterion is considered as not being fulfilled. A further particularly simple variant for checking both criteria can also be provided by checking whether the second vector is zero and/or whether the second vector, in case it is different from zero, and the third vector are collinear. If one of these two cases applies, the at least one criterion with reference to the ego motion of the motor vehicle is regarded as not being fulfilled.
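This degeneracy test can likewise be expressed compactly. A sketch, assuming v2 and v3 are available as NumPy arrays and using an illustrative numerical tolerance:

```python
import numpy as np

def epipolar_plane_defined(v2, v3, eps=1e-6):
    if np.linalg.norm(v2) < eps:
        return False  # standstill: positions 8a and 8v coincide, v2 is zero
    # If v2 and v3 are collinear, their cross product vanishes: no plane is spanned.
    return np.linalg.norm(np.cross(v2, v3)) > eps
```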
In order to make the classification even more robust, there are several further advantageous options, which are described in more detail in the following.
According to one of these advantageous options, a first flow vector is determined for the first object point on the basis of two consecutively captured images of the image sequence and compared with a second, previously determined flow vector, wherein, in the case that the comparison yields the result that the first flow vector deviates in its length and/or direction by more than a predetermined threshold value, the first flow vector is classified as invalid. Hereby an option is advantageously provided to detect so-called outliers. This procedure can be performed analogously for any arbitrary object point, as described with regard to the first object point. In other words, for example a first flow vector can be determined for a second object point on the basis of two consecutively captured images of the image sequence and compared with a second, previously determined flow vector for the second object point, wherein, in the case that the comparison yields the result that the first flow vector deviates in its length and/or direction by more than a predetermined threshold value, the first flow vector for the second object point is classified as invalid. Otherwise, it can be classified as valid and be subjected to the classification method for classifying the second object point as static or dynamic, as described above with respect to the first object point. Hereby it is facilitated that unstable flow vectors, which for instance "jump" in the image, are detected, so that their assigned object points are for instance not considered in subsequent calculations. In order to detect such "jumping", it is advantageously suitable to consider the length and direction of these flow vectors of a current calculation time step with reference to a previous time step, i.e. effectively the temporal change of length and direction, as sketched below. This procedure can be performed for all the object points present in a respective captured image. Thereby, in each time step the image points in the respective captured images corresponding to respective object points can advantageously be marked as valid or invalid. The flow vector, i.e. the first as well as the second or every further flow vector, is preferably defined as a vector which extends from the position of an object point in a first captured image, or a predetermined projection of same, to the position of the same object point in a second, subsequent captured image, or a predetermined projection of same. It consequently characterizes the movement of an object point in the image with reference to a respective preceding time step. The so-called predetermined projection can for instance represent a projection of the image captured by the image sensor onto a projected image plane, which considers the imaging characteristics of an optical system of the camera. The projection rule, which considers these imaging characteristics, can be provided in a simple way by a calibration of the camera. Images which are strongly distorted, for instance by an optical system such as a fish eye lens, can be rectified by this projection rule. This way of proceeding is consequently simple in terms of calculation and allows for a clearly higher robustness of the method by a reliable detection of outliers.
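A minimal sketch of such an outlier filter; the two tolerance parameters are assumed values for illustration, not taken from the patent.

```python
import numpy as np

def flow_vector_valid(flow_now, flow_prev, max_rel_len_change=0.5, max_angle_rad=0.5):
    """Mark a flow vector invalid if its length or direction 'jumps'
    compared with the flow vector of the preceding time step."""
    len_now = np.linalg.norm(flow_now)
    len_prev = np.linalg.norm(flow_prev)
    if len_now < 1e-9 or len_prev < 1e-9:
        return False  # degenerate vectors allow no reliable comparison
    if abs(len_now - len_prev) / len_prev > max_rel_len_change:
        return False  # excessive change in magnitude
    cos_angle = np.dot(flow_now, flow_prev) / (len_now * len_prev)
    angle = np.arccos(np.clip(cos_angle, -1.0, 1.0))
    return angle <= max_angle_rad  # excessive change in direction -> invalid
```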
According to a further advantageous design of the invention it is checked whether a driving incident impairing a correct classification is given. There are certain driving incidents that may lead to an incorrect classification, for instance of static objects as dynamic. This has the following background: for the definition of the epipolar plane, even if same is reduced to a line, the position of the sensor device at the first point in time as well as at the second point in time is employed. The sensor device can for instance be arranged at a fixed, known position on the motor vehicle. The movement of the sensor device thus is described by the movement of the motor vehicle, in particular relative to its environment. For this purpose, captured movement parameters of the motor vehicle can advantageously also be used to determine the movement of the sensor device as well as its current positions at respective points in time. Such movement parameters are for instance the speed of the motor vehicle and the steering angle of the motor vehicle.
Possibly also the inclination of the motor vehicle, for instance with regard to its longitudinal, vertical, and/or lateral axis, can be determined by inclination sensors and considered. Generally, these parameters can be determined by typical motor vehicle sensors. However, there are also movements of the motor vehicle that cannot be captured or are hard to capture, such as for instance a movement of the motor vehicle in the direction of its vertical axis. Depending on the equipment of the motor vehicle, for instance also a determination of rolling and pitching motions is not always possible. Such movements are for instance carried out by the motor vehicle when driving over a bumpy or uneven road. The detection of such situations as driving incidents is particularly advantageous in this connection to avoid that an ego motion not detected by the motor vehicle's sensor system leads to static objects or object points being incorrectly classified as dynamic. Therefore it is particularly advantageous to perform a check as to the presence of such driving incidents, since the result of such a check, for instance the knowledge about the presence of a driving incident such as the driving over a bump, can be used either to reject the results of the classification during such a driving incident or to mark them at least as less reliable. In particular it is advantageous in this connection that such a check is performed equally on the basis of the captured images, as will be described in more detail in the following.
In this connection it is an advantageous embodiment of the invention that the check is effected in dependence on a determined number of object points classified as dynamic. Here the insight is advantageously used that in situations in which it is not recognized that the motor vehicle performs a predetermined movement relative to its environment, an excessive number of object points in the environment are classified as dynamic even though they are static. Therefore it is particularly advantageous to consider the number of object points classified as dynamic for the detection of a driving incident impairing the classification. Advantageously, a threshold value can again be employed in this connection to detect the presence of such a driving incident.
What is above all advantageous in this connection is that the result of the check is that the driving incident impairing a correct classification is given if at least one of the following conditions is fulfilled:
- The number of object points classified as dynamic with regard to a time step, in relation to the overall number of object points classified as static and dynamic for this time step, exceeds a predetermined first threshold value, for instance 60 %; and
- The number of object points classified as dynamic with regard to a time step, in relation to the overall number of object points classified as static and dynamic for this time step, is raised by at least one predetermined second threshold value, for instance 25 %, in comparison with an average value which is rendered by averaging the respective numbers of object points classified as dynamic in relation to the overall number over several preceding time steps.
By these conditions, detecting a driving incident, such as for instance the driving over a bump, a hill, or an uneven road, or even a situation of starting a movement, is advantageously facilitated. Moreover, the fulfilment of both of these conditions can advantageously be checked, and if one or both of these two conditions are fulfilled, it can be concluded that the driving incident is given, as the sketch below illustrates. For this purpose, flow vectors of the respective object points can again be viewed and classified for instance as valid or invalid, wherein for instance an invalid flow vector means that it does not move along the epipolar line and therefore does not behave like a flow vector of a static 3D object.
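The two trigger conditions can be checked as follows; the threshold values correspond to the examples in the text (60 % and 25 %), while the function and parameter names are illustrative.

```python
def incident_triggered(dynamic_ratio, previous_ratios,
                       first_threshold=0.60, second_threshold=0.25):
    """Condition 1: the absolute share of dynamic points is too high.
    Condition 2: the share is raised too much over the mean of preceding steps."""
    if dynamic_ratio > first_threshold:
        return True
    if previous_ratios:
        mean_ratio = sum(previous_ratios) / len(previous_ratios)
        if dynamic_ratio > mean_ratio + second_threshold:
            return True
    return False
```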
If such a driving incident is detected, further image processing steps that are performed during this driving incident, such as for instance an object recognition or object detection, a 3D reconstruction of objects or of the entire scene, for instance by 3D triangulation, and the like, can be prevented in order to avoid misinterpretations in the capturing of the environment.
Moreover, the invention relates to a driver assistance system for a motor vehicle, which is configured to perform a method according to the invention or one of its embodiments.
Moreover, the invention also relates to a motor vehicle with a driver assistance system according to the invention. The motor vehicle preferably is configured as a passenger car.
Further features of the invention are apparent from the claims, the figures, and the description of figures. The features and feature combinations mentioned above in the description as well as the features and feature combinations mentioned below in the description of figures and/or shown in the figures alone are usable not only in the respectively specified combination, but also in other combinations without departing from the scope of the invention. Thus, implementations are also to be considered as encompassed and disclosed by the invention which are not explicitly shown in the figures and explained, but arise from and can be generated by separated feature combinations from the explained implementations. Implementations and feature combinations are also to be considered as disclosed which thus do not have all of the features of an originally formulated independent claim. Moreover, implementations and feature combinations are to be considered as disclosed, in particular by the implementations set out above, which extend beyond or deviate from the feature combinations set out in the back-references of the claims.
The figures show in:
Fig. 1 a schematic representation of a motor vehicle in a plan view with a driver assistance system according to one embodiment of the invention;
Fig. 2 a schematic illustration of an epipolar plane for illustrating a method according to an embodiment of the invention;
Fig. 3 a schematic representation of an epipolar plane reduced to a line according to one further embodiment of the invention;
Fig. 4 a flow diagram for illustrating a method according to an embodiment of the invention; and
Fig. 5 a flow diagram for illustrating a method according to a further embodiment of the invention.
Fig. 1 shows a schematic representation of a motor vehicle 1 with a driver assistance system 2 according to an embodiment of the invention. The driver assistance system 2 comprises in this example a sensor device configured as a camera 3 as well as an analysis unit 4. The camera 3 can for instance be positioned behind a windscreen 5 of the motor vehicle 1. Generally, the camera 3 can be positioned at any place of the motor vehicle 1, however in such a way that it is configured to capture at least part of the environment 6 of the motor vehicle 1. The images captured by the camera 3 are transmitted to the analysis unit 4 and analyzed by it. On the basis of the captured images, the analysis unit 4 can for instance perform object recognition and 3D reconstruction. Further, the analysis unit is configured to differentiate between static and dynamic object points of objects in the environment 6 of the motor vehicle 1, which is set out in more detail in the following.
Fig. 2 shows a schematic representation of an epipolar plane 7 as it is applied in a method according to the invention for the classification of an object point O as static or dynamic. This epipolar plane 7 is defined by the camera position 8a, which in particular represents a projection center, at a current point in time, by the camera position 8v at an earlier point in time, as well as by a point P. The earlier point in time in this connection relates to the point in time at which the image preceding the current image capture was taken. The current point in time, in the following also referred to as the second point in time, and the earlier point in time, in the following also referred to as the first point in time, are to be assigned to corresponding consecutive time steps with regard to the captured images of the image sequence.
The point P is defined such that the object point O of interest, which is to be classified as static or dynamic and represents an object point in the environment 6 of the motor vehicle 1, lies on a first line 9 which extends through the camera position 8v at the preceding point in time and the point P. The capture of an image with the object point O yields an image point O' on a projected image plane 10v with reference to the first point in time. This projected image plane 10v is represented here in Fig. 2 only for the purpose of illustration and is associated with the plane of the image sensor of the camera 3 via an imaging rule predetermined by an optical system of the camera 3. With reference to the example of a pinhole camera, this imaging rule would for instance consist in a simple point-symmetrical image. However, in order to be able to capture as large an area of the environment 6 of the motor vehicle 1 as possible by means of a single camera, it is preferred to configure the camera 3 in such a way that it has as large a capture range as possible and is for instance configured as a wide-angle camera. This can for instance be provided by a fish eye lens. Such lenses, however, in some circumstances lead to a distorted representation. As a consequence thereof, straight lines in the environment 6 are for instance imaged as curved lines on the image sensor. In such a case the imaging rule between the image sensor, or the image sensor plane, and the projected image plane 10v also becomes more complicated. This imaging rule, however, can be determined in a simple way by a corresponding calibration of the camera 3. A calibrated camera 3 thus provides, for every image point or every area on the image sensor, a corresponding area on the projected image plane 10v at a predetermined distance from the camera position 8v, or the projection center. From the captured image of the object point O at the first point in time, the position of the image point O' on the projected image plane 10v can thus be determined. For determining the first line 9, a line can then correspondingly be drawn through the position 8v of the camera 3 at the first point in time and the image point O' on the projected image plane 10v. The point P is now selected on this first line 9 at an arbitrary distance from the camera position 8v at the first point in time.
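For the simple pinhole case, the construction of the first line 9 and the point P could look as follows. This is a sketch only: the patent explicitly covers general calibrations such as fish eye lenses, whereas here an intrinsic matrix K and a known camera rotation are assumed inputs, and the distance s of P from the camera position is chosen arbitrarily, as described above.

```python
import numpy as np

def point_on_first_line(K, pixel_uv, cam_pos_8v, cam_rot_8v, s=10.0):
    """Back-project the image point O' and pick point P on the first line 9."""
    uv1 = np.array([pixel_uv[0], pixel_uv[1], 1.0])
    ray_cam = np.linalg.inv(K) @ uv1       # viewing ray in camera coordinates
    ray_world = cam_rot_8v @ ray_cam       # rotate the ray into world coordinates
    ray_world /= np.linalg.norm(ray_world)
    return cam_pos_8v + s * ray_world      # point P at the chosen distance s
```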
The camera positions 8a, 8v in this connection are established from odometric data of the motor vehicle 1. For this purpose, for instance the speed of the motor vehicle as well as the steering angle can be determined. Since the camera 3 is arranged at a fixed place with respect to the motor vehicle 1, the movement of the camera 3 follows directly from the motor vehicle movement. Hereby the camera positions 8a, 8v can be determined at any point in time of a captured image of the image sequence.
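One common way to obtain these positions from speed and steering angle is dead reckoning with a simple kinematic bicycle model. The following sketch assumes this model and an illustrative wheelbase; the patent itself does not prescribe a particular odometry model.

```python
import numpy as np

def advance_pose(x, y, heading, speed, steering_angle, dt, wheelbase=2.7):
    """Propagate the vehicle pose over one time step (kinematic bicycle model)."""
    x += speed * dt * np.cos(heading)
    y += speed * dt * np.sin(heading)
    heading += speed * dt * np.tan(steering_angle) / wheelbase
    return x, y, heading
```

The camera positions 8a, 8v then follow by applying the fixed, known mounting transform of the camera 3 relative to the vehicle to the poses of consecutive time steps.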
For all object points O of objects in the environment 6 of the motor vehicle 1 which are imaged in a respective image and are to be classified, such an epipolar plane 7 is determined. This means that in a respective time step the number of determined epipolar planes 7 corresponds to the number of object points O to be classified. The epipolar planes 7 determined with respect to a current classification time step all intersect along a second line 11, which is determined by the camera positions 8a, 8v at the first and the second point in time.
Besides the epipolar plane 7, further a first vector v1 is determined, which points from the camera position 8a at the second point in time in the direction of the object point O at the second point in time. This first vector v1 in turn can be determined by connecting the camera position 8a at the second point in time with the image point O" of the projected image plane 10a at the second point in time, the image point O" corresponding to the object point O.
In this example the object point O is a static object point O, i.e. it has the same position at the first point in time as at the second point in time. Since consequently the object point O has not changed its position, the determined first vector v1 also lies in the epipolar plane 7. Correspondingly, a classification of an object point O as static or dynamic can advantageously be performed on the basis of the relative position of the first vector v1 with respect to the epipolar plane 7. In this connection it consequently merely needs to be checked whether this vector v1 lies in the epipolar plane 7 or not.
For this purpose two further vectors are defined, namely a second vector v2 along the second line 11, which extends from the camera position 8a at the second point in time to the camera position 8v of the camera 3 at the first point in time, and a third vector v3, which extends on a third line 12 through the camera position 8a at the second point in time and through the point P. These two vectors v2, v3 thus span the epipolar plane 7. In order to check whether the object point O is a static or a dynamic object point, it now merely has to be checked whether these three vectors v1, v2, v3 are coplanar.
This can for instance be done by first determining the normal vector n of the epipolar plane 7 as the cross product of the second vector v2 and the third vector v3, i.e. n = v2 × v3.
The distance d of the vector v1 relative to the epipolar plane 7 can then be defined as the absolute value of the scalar product between the normal vector n and the first vector v1, i.e. d = |n · v1|. Alternatively, the distance d can also be represented as the absolute value of the determinant of a matrix which has the first vector v1, the second vector v2, and the third vector v3 as column vectors or row vectors. In the case of coplanarity of these three vectors v1, v2, v3, this distance d is equal to 0. If the first vector v1 does not lie in the epipolar plane 7, this distance d is not equal to 0, and it is maximal in case the first vector v1 is perpendicular to the epipolar plane 7. If the vectors v1, v2, v3 were for instance normalized to a length of 1, this maximum distance also amounts to 1. In order to now classify the object or the object point O, a threshold value can be set for this determined distance. This means that if the distance d is between 0 and this threshold value, the object point O is classified as static, otherwise as dynamic. This threshold value can for instance be fixed or be selected in dependence on one or several determined variables, such as for instance the motor vehicle speed. In this connection it is advantageous if the threshold value is selected to be larger the higher the motor vehicle speed is.
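The determinant variant together with a speed-dependent threshold might be sketched as follows; the linear scaling of the threshold with the vehicle speed is an assumption for illustration, as the text only states that the threshold should grow with speed.

```python
import numpy as np

def classify_by_distance(v1, v2, v3, vehicle_speed, base_threshold=0.02, gain=0.005):
    # |det([v1 v2 v3])| equals |v1 . (v2 x v3)|, i.e. the distance d above
    # (vectors assumed normalized so that d lies in [0, 1]).
    d = abs(np.linalg.det(np.column_stack((v1, v2, v3))))
    threshold = base_threshold + gain * vehicle_speed  # larger at higher speed
    return "static" if d <= threshold else "dynamic"
```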
For the case that the vectors v2, v3 are collinear, however, no epipolar plane 7 can be determined. Nevertheless, in this case a classification of the object point as static or dynamic can be performed in a simple way, which is now illustrated by Fig. 3.
Fig. 3 shows a schematic representation of an epipolar plane 7 now reduced to a line. In particular, in this connection the camera positions 8a, 8v at the first and the second point in time as well as the point P lie on a common line, for which reason the vectors v2 and v3 are now also collinear. If the position of the object point O has not changed between these two points in time of image capturing, the first vector v1 is in turn also parallel to the second vector v2 as well as the third vector v3. Here in Fig. 3 it is now represented in an exemplary way that the position of the object point O at the second point in time has changed in comparison with the first point in time, which is meant to be illustrated by the designation of the object point at the second point in time as O2. Correspondingly, the first vector v1 is also not collinear with the other two vectors v2, v3. By checking whether the first vector v1 is collinear with the vectors v2 or v3, it can thus advantageously also be determined whether the object point O is a static or a dynamic object point. The collinearity can again be checked in a simple way by forming the scalar product or the cross product between the first vector v1 and any one of the two other vectors v2, v3. For instance, a distance d can again be defined as follows: d = |v1 × v3|.
This distance d is also represented in Fig. 3. Also for this distance d in turn a threshold value can be defined. In this way again it can be determined in a simple way whether the object point O is static or dynamic.
Fig. 4 shows a flow diagram for illustrating a method according to an embodiment of the invention. In order to render the afore-described classification method even more robust, it is advantageous to first remove so-called outliers in the respective captured images of the image sequence. This may for instance be performed in an optional step S1. For this purpose, flow vectors are determined for the respective object points, for instance from two consecutively captured images of the image sequence in each case. Further, the length and direction of a flow vector for a certain object point can be compared with the flow vector assigned to this object point which was determined in a preceding time step. Image points of objects whose assigned flow vectors have an excessive change in their magnitude or direction can thus advantageously be filtered out prior to the further processing. For this purpose these corresponding points need not necessarily be deleted, but can also merely be marked as invalid.
This can now be followed by the described classification of object points, in which the filtered object points are preferably not considered. Now, to start with, in a step S2 it can be checked whether a situation is given in which the above-described vectors v2 and v3 are collinear or not. Such a check need not necessarily be based on a preceding calculation of these vectors v2 and v3. For instance, it can also be checked for this purpose whether the motor vehicle 1 is at a standstill or not. Also, on the basis of the optical flow it can be checked whether the object points O lie in a focus of expansion or not. If consequently the motor vehicle 1 is, for instance, neither at a standstill nor the object point O to be classified lies in the focus of expansion, also called the epipole, it is proceeded to step S3, in which the first vector v1, the second vector v2, and the third vector v3 are determined. Subsequently, in step S4 it is checked whether these three vectors v1, v2, v3 are coplanar. If this is the case, the object point O is classified as static in step S5, otherwise as dynamic in step S6. If the result of the check in S2 is that, for instance, the motor vehicle 1 is at a standstill or else the object point O to be classified lies in a focus of expansion, accordingly in step S7 the first vector v1 is determined, as well as at least one of the two other vectors v2 and/or v3. As described with respect to Fig. 3, subsequently in step S8 it is checked whether the first vector v1 is collinear with one of the two other vectors v2 and/or v3. If this is the case, the corresponding object point O is classified as static in step S5, otherwise as dynamic in step S6. This method is now repeated time step by time step, i.e. with each new captured image of the image sequence.
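Putting the branches of Fig. 4 together, the per-point decision of steps S2 to S8 could be sketched as follows (names and thresholds are illustrative; the outlier filter of step S1 is assumed to have run beforehand):

```python
import numpy as np

def classify_object_point(v1, v2, v3, plane_threshold=0.05, collinear_threshold=0.99):
    v1n = v1 / np.linalg.norm(v1)
    v3n = v3 / np.linalg.norm(v3)
    cross = np.cross(v2, v3)
    if np.linalg.norm(v2) > 1e-6 and np.linalg.norm(cross) > 1e-6:
        # S3/S4: the epipolar plane exists -> coplanarity check of v1, v2, v3
        n = cross / np.linalg.norm(cross)
        return "static" if abs(np.dot(n, v1n)) < plane_threshold else "dynamic"
    # S7/S8: degenerate case (standstill or epipole) -> collinearity check
    return "static" if abs(np.dot(v1n, v3n)) > collinear_threshold else "dynamic"
```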
Since the classification of object points O also considers the camera positions at respective time steps, it is advantageous for a correct classification if the camera positions at the respective time steps can be determined correctly. However, there are driving incidents, such as for instance driving over a road bump, from which vehicle movements and thus also camera movements result that cannot, or at least not correctly, be captured by the vehicle sensor system, such as for instance movements of the motor vehicle parallel to its vertical axis. This leads to the relative movement between the motor vehicle 1 and its environment 6 resulting from such driving incidents being incorrectly rated as a movement of the environment 6 and not of the motor vehicle 1. In order to avoid this, according to embodiments of the invention the possibility is further provided to detect, on the basis of the captured images of the image sequence, such predetermined driving incidents impairing a correct classification.
This is now explained in more detail on the basis of Fig. 5 by an illustrating flow diagram. For this purpose, in step S9 the motor vehicle speed is determined on the basis of odometric data, which can be provided by motor vehicle sensors. In step S10 it is checked whether the motor vehicle 1 is at a standstill. If this is not the case, in step S11 it is determined how many of the object points in the current time step, in relation to the totality of all classified object points in the current time step, have been classified as dynamic. Further, in step S12 it is checked whether this percentage of the object points classified as dynamic exceeds a predetermined first threshold value, for instance 60 %. If this is the case, in step S13 it is further checked whether a predetermined driving incident, such as driving over a road bump, has already been determined as given or is to be considered as given on the basis of preceding checks according to this method. If this is the case, this driving incident is determined as still given in step S14 and the method starts from the beginning. If in step S13, however, it is determined that such a driving incident has not yet been detected as given, in step S15 a new driving incident is determined as given and the method starts from the beginning again. If, however, in step S12 it is determined that the percentage of the object points classified as dynamic does not exceed the predetermined first threshold value, in step S16 a mean percentage of the object points classified as dynamic is determined over a predetermined number, for instance 10, of preceding time steps in which respective images of the image sequence were captured. The percentages of object points classified as dynamic determined for the respective preceding time steps can be retrieved from a storage for this purpose. For storing these data, for instance a ring buffer with a fixed number of entries, such as for instance 10, can be employed in order to store information on the last 10 results of the calculated percentages. Accordingly, in each time step, for instance following step S11, the determined percentage of the object points classified as dynamic can be deposited in this storage in step S17. In step S18 it is then correspondingly checked whether the current percentage of the object points classified as dynamic is raised by a predetermined threshold value, such as for instance 25 %, in comparison with the mean percentage of the preceding time steps. If this is the case, it is again proceeded with step S13.
If this is not the case, in step S19 it is checked whether a predetermined driving incident has previously been determined as given. If not, it is determined in step S20 that no predetermined driving incident is given and the method starts from the beginning. If the check in step S19, however, produces the result that a predetermined driving incident has already been determined as given, it is checked in step S21 whether the percentage of object points classified as dynamic determined in step S11 remains below a third predetermined threshold value. Preferably, this is a hysteresis threshold value defined with respect to the percentage of object points classified as dynamic at the point in time at which the present driving incident was determined as a new driving incident. For instance, this threshold can be selected as 20 percentage points less than the percentage at the time of detection of the currently given driving incident. If the current percentage is below this third threshold value, the present driving incident is determined in step S22 as no longer given and the method again starts from scratch. If, in step S21, the percentage does not remain below this threshold value, the present driving incident is determined in step S14 as still given and the method starts from the beginning again.
At a standstill of the motor vehicle, a reset may occur. This means that, for instance, the stored percentages in the ring buffer are deleted and driving incidents previously determined as given are considered as no longer given. This reset is performed in step S23 for the case that it has been determined in step S10 that the motor vehicle 1 is at a standstill.
For the case that, according to this method, a new driving incident is detected or an existing one is determined as still given, the image data for the duration of this driving incident can be rejected, i.e. not provided for further processing, or at least be marked as unreliable. For the case that such a driving incident is determined as not given or as completed, the image data can again be provided for further processing and optionally be marked as reliable.
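The flow of steps S9 to S23 can be read as a small state machine over the per-time-step percentage of dynamic object points. The following Python sketch illustrates this reading; the class and method names and the concrete parameter values (60 %, 25 percentage points, 20 percentage points, a buffer of 10 entries) are illustrative assumptions taken from the examples above, not fixed parts of the method.

```python
from collections import deque


class DrivingIncidentDetector:
    """Illustrative sketch of steps S9 to S23; all ratios are fractions
    (0.60 corresponds to 60 %) and all parameters are example values."""

    def __init__(self, first_threshold=0.60, rise_threshold=0.25,
                 hysteresis_drop=0.20, history_length=10):
        self.first_threshold = first_threshold       # step S12, e.g. 60 %
        self.rise_threshold = rise_threshold         # step S18, e.g. 25 points
        self.hysteresis_drop = hysteresis_drop       # step S21, e.g. 20 points
        self.history = deque(maxlen=history_length)  # ring buffer, step S17
        self.incident_given = False
        self.ratio_at_detection = None

    def update(self, dynamic_ratio, vehicle_standstill):
        """dynamic_ratio: dynamic points / all classified points (step S11)."""
        if vehicle_standstill:                       # steps S10 / S23: reset
            self.history.clear()
            self.incident_given = False
            self.ratio_at_detection = None
            return self.incident_given

        if dynamic_ratio > self.first_threshold:     # step S12: absolute check
            raised = True
        elif self.history:                           # steps S16 / S18: rise over
            mean_ratio = sum(self.history) / len(self.history)  # preceding mean
            raised = dynamic_ratio > mean_ratio + self.rise_threshold
        else:
            raised = False

        self.history.append(dynamic_ratio)           # step S17: store result

        if raised:                                   # steps S13 to S15
            if not self.incident_given:
                self.incident_given = True           # new incident (S15)
                self.ratio_at_detection = dynamic_ratio
            # otherwise the incident is simply still given (S14)
        elif self.incident_given:                    # steps S19 / S21 / S22
            if dynamic_ratio < self.ratio_at_detection - self.hysteresis_drop:
                self.incident_given = False          # incident ended (S22)
                self.ratio_at_detection = None

        return self.incident_given
```

Called once per processed time step, e.g. `detector.update(0.7, False)`, the detector returns True for as long as a driving incident is considered given; during that time the image data of the respective time steps would be rejected or marked as unreliable, as described above.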
On the whole, the invention and its embodiments provide a particularly simple, computationally inexpensive, fast and reliable option for classifying object points as static or dynamic.
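As a numerical illustration of the classification checks restated in claims 1 to 3 and 8 below, the following minimal sketch assumes that the epipolar plane (7) is spanned by the second vector (v2) along the baseline and the third vector (v3), and that a single tolerance value stands in for the first and second tolerance ranges; the function names and the tolerance value are illustrative assumptions, not part of the claimed method.

```python
import numpy as np


def unit(v):
    """Return the given vector scaled to unit length."""
    return v / np.linalg.norm(v)


def classify_point(v1, v2, v3, ego_motion_ok, tolerance=0.05):
    """Sketch of the static/dynamic decision.

    v1: vector from the sensor position (8a) at the second point in time
        towards the object point (O)
    v2: vector along the second line (11) between the two sensor positions
    v3: vector along the third line (12)
    ego_motion_ok: True if the ego motion meets the predetermined criterion
    """
    if ego_motion_ok:
        # Claims 1 to 3: absolute value (d) of the portion of v1 parallel
        # to the normal vector of the epipolar plane, compared with a
        # threshold defining the first tolerance range.
        n = unit(np.cross(v2, v3))
        d = abs(np.dot(unit(v1), n))
        return "dynamic" if d > tolerance else "static"
    # Claim 8: check whether v1 is collinear with v3 (and/or v2) within
    # the second tolerance range; if not, the point is dynamic.
    cos_angle = abs(np.dot(unit(v1), unit(v3)))
    return "static" if cos_angle > 1.0 - tolerance else "dynamic"
```

For a static object point, v1 lies in the epipolar plane up to measurement noise, so d stays near zero; a point that moves out of the plane between the two points in time produces a larger d and is classified as dynamic.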
Claims
1. Method for classifying at least one object point (O) of an object in an environment (6) of a motor vehicle (1) as static or dynamic, wherein an image sequence of images of at least one part of the environment (6) of the motor vehicle (1) with the object point (O) is captured by means of at least one imaging sensor device (3) of the motor vehicle (1),
characterized in that
at least for the case that an ego motion of the motor vehicle (1) with respect to the environment (6) meets at least one predetermined criterion, an epipolar plane (7) with respect to a first image of the image sequence captured at a first point in time and a second image of the image sequence captured at a second point in time is determined, wherein a first vector (v1) pointing from a position (8a) of the sensor device at the second point in time in the direction of the object point (O) is determined, and the at least one object point (O) is classified as static or dynamic in dependency on a relative position of the first vector (v1) with regard to the epipolar plane (7).
2. Method according to claim 1,
characterized in that
it is checked whether the first vector (v1) lies within a predetermined first tolerance range in the epipolar plane (7), and if the result of the check is that the first vector (v1) does not lie within the predetermined first tolerance range in the epipolar plane (7), the at least one object point (O) is classified as dynamic, and in particular otherwise as static.
3. Method according to claim 2,
characterized in that
for checking whether the first vector (v1) lies within the predetermined first tolerance range in the epipolar plane (7), an absolute value (d) of a portion of the first vector (v1) extending in parallel to a normal vector of the epipolar plane (7) is calculated and compared with a predetermined threshold value defining the predetermined first tolerance range.
4. Method according to any one of the preceding claims,
characterized in that
the ego motion of the motor vehicle (1) meets the at least one predetermined criterion if a speed of the motor vehicle (1) is greater than a predetermined threshold value.
5. Method according to any one of the preceding claims,
characterized in that
the ego motion of the motor vehicle (1) meets the at least one predetermined criterion if the motor vehicle (1) moves in such a way with respect to the at least one object point (O) that the position (8a) of the sensor device (3) at the second point in time, within a second tolerance range, is not on a first line (9), which is determined by the position (8v) of the sensor device (3) at the first point in time and a position of the object point (O) at the first point in time.
6. Method according to any one of the preceding claims,
characterized in that
a second vector (v2) is determined, which extends collinearly to a second line (11), which extends through the position (8v) of the sensor device (3) at the first point in time and the position (8a) of the sensor device (3) at the second point in time.
7. Method according to any one of the preceding claims,
characterized in that
a third vector (v3) is determined, which is collinear to a third line (12), which extends through the position (8a) of the sensor device (3) at the second point in time and through a point lying on a first line (9), which extends through the position (8v) of the sensor device (3) at the first point in time and through the position of the object point (O) at the first point in time.
8. Method according to claim 6 or 7,
characterized in that
in case the ego motion of the motor vehicle (1) does not meet the at least one predetermined criterion, it is checked whether the first vector (v1) is collinear, within a predetermined second tolerance range, with respect to the third vector (v3) and/or the second vector (v2), and in particular in case this is not the case, the object point (O) is classified as dynamic.
9. Method according to any one of the preceding claims,
characterized in that
a first flow vector is determined for the first object point (O) on the basis of two successive images of the image sequence and is compared with a second determined flow vector, wherein for the case that the result of the comparison is that the first flow vector deviates in its length and/or direction by more than a predetermined threshold value, the first flow vector is classified as invalid.
10. Method according to any one of the preceding claims,
characterized in that
it is checked whether a driving incident, which impairs a correct classification, is given.
11. Method according to claim 10,
characterized in that
the check is effected in dependency on a determined number of object points (O) classified as dynamic.
12. Method according to claim 10 or 11,
characterized in that
the result of the check is that a driving incident, which impairs a correct classification, is given if at least one of the following conditions is met:
The number of object points (O) classified as dynamic in relation to the overall number of object points (O) classified as static and dynamic for this time step exceeds a predetermined first threshold value; and
The number of object points (O) classified as dynamic in relation to an overall number of object points (O) classified as static and dynamic is raised by at least a predetermined second threshold value in comparison with an average value, which results from averaging the respective numbers of object points (O) classified as dynamic in relation to the overall number over several preceding time steps.
13. Driver assistance system for a motor vehicle (1), which is configured for performing a method according to any one of the preceding claims.
14. Motor vehicle (1) comprising a driver assistance system according to claim 13.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102017123228.1 | 2017-10-06 | ||
DE102017123228.1A DE102017123228A1 (en) | 2017-10-06 | 2017-10-06 | Method for classifying an object point as static or dynamic, driver assistance system and motor vehicle |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2019068699A1 (en) | 2019-04-11 |
Family
ID=63833984
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/EP2018/076769 WO2019068699A1 (en) | 2017-10-06 | 2018-10-02 | Method for classifying an object point as static or dynamic, driver assistance system, and motor vehicle |
Country Status (2)
Country | Link |
---|---|
DE (1) | DE102017123228A1 (en) |
WO (1) | WO2019068699A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102019214142A1 (en) * | 2019-09-17 | 2021-03-18 | Zf Friedrichshafen Ag | Procedure for warning of mass movements on sloping terrain |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102011013776A1 (en) * | 2011-03-12 | 2011-11-10 | Daimler Ag | Method for acquisition and/or tracking of objects e.g. static objects, in e.g. front side of vehicle, involves disregarding objects having optical flow distance smaller than distance value determined by sensor from detection range of sensor |
DE102014117708A1 (en) * | 2014-12-02 | 2016-06-02 | Connaught Electronics Ltd. | Method for classifying a flow vector, driver assistance system and motor vehicle |
US9946264B2 (en) * | 2016-03-22 | 2018-04-17 | Sharp Laboratories Of America, Inc. | Autonomous navigation using visual odometry |
- 2017-10-06: DE application DE102017123228.1A filed (published as DE102017123228A1, active, pending)
- 2018-10-02: PCT application PCT/EP2018/076769 filed (published as WO2019068699A1, active, application filing)
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6473536B1 (en) | 1998-09-18 | 2002-10-29 | Sanyo Electric Co., Ltd. | Image synthesis method, image synthesizer, and recording medium on which image synthesis program is recorded |
US8259998B2 (en) | 2008-09-30 | 2012-09-04 | Mazda Motor Corporation | Image processing device for vehicle |
EP2590141A1 (en) * | 2011-11-02 | 2013-05-08 | EADS Deutschland GmbH | Method and device for detecting moving objects in a video picture sequence |
WO2015134832A1 (en) | 2014-03-06 | 2015-09-11 | Nec Laboratories America, Inc. | High accuracy monocular moving object localization |
Non-Patent Citations (2)
Title |
---|
REUBEN STRYDOM ET AL: "TCM: A Vision-Based Algorithm for Distinguishing between Stationary and Moving Objects Irrespective of Depth Contrast from a UAS", INTERNATIONAL JOURNAL OF ADVANCED ROBOTIC SYSTEMS, vol. 13, no. 3, 1 January 2016 (2016-01-01), CR, pages 84, XP055529191, ISSN: 1729-8814, DOI: 10.5772/62846 * |
YIN-TIEN WANG ET AL: "Detection of moving objects in image plane for robot navigation using monocular vision", EURASIP JOURNAL ON ADVANCES IN SIGNAL PROCESSING, vol. 2012, no. 1, 1 December 2012 (2012-12-01), XP055529536, DOI: 10.1186/1687-6180-2012-29 * |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11882367B2 (en) | 2019-10-18 | 2024-01-23 | Connaught Electronics Ltd. | Image processing method for producing a high dynamic range image of a scene |
CN111950370A (en) * | 2020-07-10 | 2020-11-17 | 重庆邮电大学 | Dynamic environment offline visual milemeter expansion method |
CN111950370B (en) * | 2020-07-10 | 2022-08-26 | 重庆邮电大学 | Dynamic environment offline visual milemeter expansion method |
FR3140699A1 (en) * | 2022-10-05 | 2024-04-12 | Psa Automobiles Sa | Method and device for monitoring the environment of a vehicle |
Also Published As
Publication number | Publication date |
---|---|
DE102017123228A1 (en) | 2019-04-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109624850B (en) | Method for monitoring blind spot of vehicle and blind spot monitor using the same | |
JP7025912B2 (en) | In-vehicle environment recognition device | |
US10242576B2 (en) | Obstacle detection device | |
WO2019068699A1 (en) | Method for classifying an object point as static or dynamic, driver assistance system, and motor vehicle | |
JP6532229B2 (en) | Object detection apparatus, object detection system, object detection method and program | |
US11393126B2 (en) | Method and apparatus for calibrating the extrinsic parameter of an image sensor | |
US10776946B2 (en) | Image processing device, object recognizing device, device control system, moving object, image processing method, and computer-readable medium | |
JP4480083B2 (en) | Object recognition device | |
CN106447730B (en) | Parameter estimation method and device and electronic equipment | |
EP2665037A1 (en) | Onboard camera automatic calibration apparatus | |
JP7077910B2 (en) | Bound line detection device and lane marking method | |
JP7270499B2 (en) | Abnormality detection device, abnormality detection method, posture estimation device, and mobile body control system | |
JP6384802B2 (en) | MOBILE BODY DETECTING DEVICE, IMAGE PROCESSING DEVICE, MOBILE BODY DETECTING METHOD, AND INTEGRATED CIRCUIT | |
JP2019191806A (en) | Abnormality detection device and abnormality detection method | |
WO2020137110A1 (en) | Movement amount estimation device | |
US8213684B2 (en) | Motion estimating device | |
CN104697491A (en) | Distance determination using a monoscopic imager in a vehicle | |
JP6174884B2 (en) | Outside environment recognition device and outside environment recognition method | |
JP6932015B2 (en) | Stereo image processing device | |
EP3227827B1 (en) | Driver assistance system, motor vehicle and method for classifying a flow vector | |
JP6877636B2 (en) | In-vehicle camera device | |
JP2019020171A (en) | Vehicle width detector, vehicle width detection method, and program | |
US20190156512A1 (en) | Estimation method, estimation apparatus, and non-transitory computer-readable storage medium | |
EP2919191B1 (en) | Disparity value deriving device, equipment control system, movable apparatus, robot, and disparity value producing method | |
JP6432033B2 (en) | In-vehicle device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 18785546 Country of ref document: EP Kind code of ref document: A1 |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 18785546 Country of ref document: EP Kind code of ref document: A1 |