US20130335553A1 - Method and system for determining an ego-motion of a vehicle - Google Patents

Method and system for determining an ego-motion of a vehicle

Info

Publication number
US20130335553A1
US20130335553A1
Authority
US
United States
Prior art keywords
vehicle
motion
ego
determining
surroundings
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US13/994,893
Other versions
US9789816B2 (en)
Inventor
Thomas Heger
Stephan Simon
Michael Helmle
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Robert Bosch GmbH
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Assigned to ROBERT BOSCH GMBH. Assignment of assignors' interest (see document for details). Assignors: HELMLE, MICHAEL; HEGER, THOMAS; SIMON, STEPHAN
Publication of US20130335553A1
Application granted
Publication of US9789816B2
Legal status: Active (current)
Adjusted expiration

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60Q ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q9/00 Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling
    • B60Q9/002 Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling for parking purposes, e.g. for warning the driver that his vehicle has contacted or is about to contact an obstacle
    • B60Q9/004 Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling for parking purposes, e.g. for warning the driver that his vehicle has contacted or is about to contact an obstacle using wave sensors
    • B60Q9/005 Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling for parking purposes, e.g. for warning the driver that his vehicle has contacted or is about to contact an obstacle using wave sensors using a video camera
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B62 LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62D MOTOR VEHICLES; TRAILERS
    • B62D15/00 Steering not otherwise provided for
    • B62D15/02 Steering position indicators; Steering position determination; Steering aids
    • B62D15/027 Parking aids, e.g. instruction means
    • B62D15/0285 Parking performed automatically
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60Q ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q9/00 Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling
    • B60Q9/008 Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling for anti-collision purposes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G06T7/269 Analysis of motion using gradient-based methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30248 Vehicle exterior or interior
    • G06T2207/30252 Vehicle exterior; Vicinity of vehicle
    • G06T2207/30264 Parking


Abstract

A method for determining an ego-motion of a vehicle is described, which is carried out in driver assistance systems, particularly in parking assistance systems. The method involves: taking, by a vehicle camera, a sequence of images of the vehicle surroundings that are successive in time; determining, based on the image sequence, at least one motion flow with regard to an object in the vehicle surroundings; and determining the ego-motion of the vehicle based on the at least one motion flow.

Description

    FIELD OF THE INVENTION
  • The present invention relates to a method for determining an ego-motion (self-movement) of a vehicle, which is carried out in driver assistance systems, particularly in parking assistance systems.
  • BACKGROUND INFORMATION
  • Driver assistance systems are used to assist the driver in certain driving situations. A driver assistance system may include, for example, an ABS (antilock braking system), an ESP (electronic stability program), a distance-regulating cruise controller (“adaptive cruise control”, ACC), and/or a parking assistant for assisting parking or unparking (leaving a parking space). At least some of these driver assistance subsystems also require a determination of the vehicle ego-motion. This is the case, for instance, in a collision-warning system, which relies on a prediction of whether a collision will occur if the instantaneous vehicle motion continues. Parking assistants require data on the instantaneous actual vehicle position relative to objects bordering a parking space, for example for the calculation, monitoring and possible adjustment of a trajectory. Parking garage assistants, which as a rule include both a collision warning and a trajectory computation, therefore also need information on the ego-motion, from which an instantaneous position of the vehicle may also be computed.
  • It is generally known that a vehicle's ego-motion (self-movement) may be determined based on sensors which measure a vehicle state independently of the surroundings. For example, wheel system sensors, engine system sensors or braking system sensors supply data on the vehicle speed, while steering sensors supply data on the instantaneous steering angle. In addition, it is known that an instantaneous vehicle position may be obtained based on GPS (“global positioning system”). Data on cornering of the vehicle may possibly be obtained using a gyroscope.
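  • As an illustration of how such surroundings-independent data yield an ego-motion estimate, the following sketch integrates wheel speed and steering angle with the standard kinematic bicycle model; this is a generic textbook construction, not a formula from the patent, and the wheelbase value is an assumed example.

```python
import math

def dead_reckon(pose, speed, steering_angle, dt, wheelbase=2.7):
    """Advance a planar (x, y, heading) pose by one time step using the
    kinematic bicycle model; the 2.7 m wheelbase is an assumed example
    value, not a figure from the patent."""
    x, y, heading = pose
    # The yaw rate follows from speed and steering geometry.
    yaw_rate = speed * math.tan(steering_angle) / wheelbase
    x += speed * math.cos(heading) * dt
    y += speed * math.sin(heading) * dt
    heading += yaw_rate * dt
    return (x, y, heading)

# Example: reversing at 1 m/s with a 10-degree steering angle for 0.1 s.
pose = dead_reckon((0.0, 0.0, 0.0), speed=-1.0,
                   steering_angle=math.radians(10.0), dt=0.1)
```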
  • In German Published Patent Application No. 10 2008 036 009, a method is described for protecting a vehicle from a collision in a parking and maneuvering area. For this purpose, ultrasonic sensors record data on the surroundings of the vehicle. Using additional data on vehicle speed and the instantaneous steering angle, the ultrasonic sensor data measured in a coordinate system oriented relative to the vehicle are transformed into a coordinate system oriented relative to space, which yields a space-oriented surroundings map. From the surroundings map and the ego-motion of the vehicle, a collision probability of the vehicle with objects located in its surroundings is calculated.
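  • The coordinate transformation described in that method can be sketched as follows: given a dead-reckoned vehicle pose in a space-fixed frame, each ultrasonic detection measured in vehicle coordinates is rotated and translated into space coordinates before it enters the surroundings map. A minimal planar sketch; the function names are illustrative.

```python
import math

def vehicle_to_space(point_vehicle, vehicle_pose):
    """Map a 2-D point from the vehicle-oriented coordinate system into
    the space-oriented frame, given the vehicle pose (x, y, heading)."""
    px, py = point_vehicle
    x, y, heading = vehicle_pose
    c, s = math.cos(heading), math.sin(heading)
    return (x + c * px - s * py,
            y + s * px + c * py)

# An echo 1.5 m behind and 0.4 m to the left of the vehicle origin,
# entered into the space-oriented surroundings map.
obstacle = vehicle_to_space((-1.5, 0.4), (10.0, 5.0, math.radians(30)))
```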
  • From German Patent No. 60 2004 012 962, a method is known for real-time obstacle detection from a vehicle moving relative to a road. Based on the images of a video camera, motion flows are calculated for points whose projected motion is recorded by the camera. Points which belong to potential obstacles, i.e. which do not move in common with the plane of the road, are determined via an optical flow method. This camera-based obstacle detection method can be combined with additional systems, such as an ultrasonic obstacle detection system, in order to achieve greater accuracy and/or a more robust method.
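  • Such an optical flow computation could, for example, be realized with a dense method such as Farneback's algorithm. The sketch below uses OpenCV (an assumption; the patent names no library) and flags pixels whose flow deviates strongly from the median motion, which for a camera looking at a road approximates the common road-plane motion.

```python
import cv2
import numpy as np

def flag_obstacle_pixels(prev_gray, next_gray, threshold=3.0):
    """Compute dense optical flow between two grayscale frames and flag
    pixels whose flow deviates from the dominant (road-plane) motion by
    more than `threshold` pixels; the threshold is an assumed value."""
    flow = cv2.calcOpticalFlowFarneback(
        prev_gray, next_gray, None,
        pyr_scale=0.5, levels=3, winsize=15,
        iterations=3, poly_n=5, poly_sigma=1.2, flags=0)
    # The median flow vector approximates the motion shared by the road.
    median_flow = np.median(flow.reshape(-1, 2), axis=0)
    deviation = np.linalg.norm(flow - median_flow, axis=2)
    return deviation > threshold
```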
  • The methods for collision protection and obstacle detection described above are either not very accurate or they are complex. However, if the complexity of a parking assistant, for example, is to remain limited for reasons of cost, this can only be achieved by providing comparatively large safety distances, for example in the calculation of a parking trajectory or for a collision warning. This, however, limits the utility of the corresponding systems during assisted parking or during navigation in confusing surroundings such as a parking garage.
  • SUMMARY
  • According to the present invention, a method is provided for determining an ego-motion (self-movement) of a vehicle, which includes the following steps: taking, by a vehicle camera, a sequence of images of the vehicle surroundings that are successive in time; determining, based on the image sequence, at least one motion flow with regard to an object in the vehicle surroundings; and determining the ego-motion of the vehicle based on the at least one motion flow.
  • The vehicle camera may be a rearview camera, for example, which takes images of the rear vehicle surroundings, for instance during backward travel while parking or unparking.
  • The object whose motion flow is determined may be a static object in the vehicle surroundings, such as another vehicle, a parking space limitation, etc. Ultimately, all that is required to determine a motion flow is a structured surface; consequently, the object may also be the surface of a road or roadway, of a parking space, or another body surface that lies within the field of vision of the camera, to the extent that a motion flow is measurable on it.
  • For determining the ego-motion of the vehicle, furthermore, data may be drawn upon from vehicle sensors which measure an instantaneous speed and/or an instantaneous steering angle. In addition or as an alternative, for the determination of the ego-motion of the vehicle, data may be drawn upon from ultrasonic sensors which measure the distance from objects in the vehicle surroundings. Additional data on the instantaneous position or motion of the vehicle may be drawn upon by GPS systems, gyroscopes or other position-sensitive or motion-sensitive detectors.
  • In certain specific embodiments of the method according to the present invention, a “time to collision” (“time to contact”, “time to crash”, TTC) is calculated from the at least one determined motion flow (which may relate to a motion of the vehicle relative to an obstacle, for instance to a parking space-bordering object) and goes into the determination of the ego-motion. In connection with an optical flow calculation, a TTC may be more direct and simpler to calculate than a change in the vehicle's position, so that the determination of the ego-motion of the vehicle may also be simplified. In addition or alternatively, the TTC may also be used for determining a collision warning.
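  • The directness of the TTC computation can be made concrete with the standard expansion-rate relation: for a head-on approach, an object whose image size grows from s1 to s2 over a frame interval dt has TTC = dt / (s2/s1 - 1). A minimal sketch of this well-known relation, with illustrative numbers:

```python
def time_to_collision(size_prev, size_curr, dt):
    """TTC from the apparent growth of an object between two frames
    taken `dt` seconds apart, using TTC = dt / (s2/s1 - 1); valid for a
    head-on approach, infinite if the object is not expanding."""
    if size_curr <= size_prev:
        return float('inf')
    return dt / (size_curr / size_prev - 1.0)

# An image height growing from 40 px to 42 px in 0.1 s gives a TTC of 2 s.
ttc = time_to_collision(40.0, 42.0, dt=0.1)
```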
  • Moreover, specific embodiments of the method according to the present invention may provide that an ego-motion of an object in the vehicle surroundings is detected based on the at least one determined motion flow together with at least one additional determined motion flow and/or additional sensor data, such as ultrasound data, wheel state data and steering data. This makes possible the detection of non-static objects, such as pedestrians crossing a parking space, moving vehicles bordering parking spaces, etc. Because of this, the reliability of collision warning systems may be correspondingly improved.
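  • One plausible realization of this detection, sketched below under assumed interfaces, is to predict the image flow each tracked point would exhibit if the scene were static, given the estimated vehicle ego-motion, and to flag points whose observed flow deviates systematically from that prediction.

```python
import numpy as np

def detect_moving_points(observed_flow, predicted_static_flow,
                         residual_threshold=2.0):
    """Flag tracked points as independently moving when their observed
    flow differs from the flow predicted for a static scene under the
    vehicle ego-motion. Both arrays hold (N, 2) pixel displacements;
    the pixel threshold is an assumed value."""
    residual = np.linalg.norm(observed_flow - predicted_static_flow,
                              axis=1)
    return residual > residual_threshold
```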
  • Furthermore, according to the present invention, a method is provided in a parking assistance system of a vehicle, which is used for the continuous calculation, checking and/or adjustment of a parking trajectory. In this case, an ego-motion of the vehicle is determined in the manner sketched above.
  • Still further, a method is provided for collision warning in a driver assistance system of a vehicle, in which an ego-motion of a vehicle is determined as was sketched above.
  • Furthermore, in accordance with the present invention, a computer program is provided, according to which one of the methods described here is implemented when the computer program is run on a programmable computer device. The computer device may be a module, for instance, for implementing a driver assistance system, or a subsystem thereof, in a vehicle. The computer program is able to be stored on a machine-readable storage medium, such as a permanent or rewritable storage medium or in an assignment to a computer device or on a removable CD-ROM, DVD or a USB stick. In addition or as an alternative, the computer program may be provided on a computer device, such as a server, for downloading, e.g., via a data network such as the Internet, or via a communication link such as a telephone line or a wireless connection.
  • Furthermore, in accordance with the present invention, a driver assistance system is provided, which is designed for determining an ego-motion of a vehicle. This system includes the following components: a vehicle camera for taking a sequence of images, successive in time, of the vehicle surroundings; a component for determining, based on the image sequence, at least one motion flow with regard to an object in the vehicle surroundings; and a component for determining the ego-motion of the vehicle based on the at least one motion flow.
  • Among other things, the present invention is based on the idea of improving the calculation of the ego-motion of a vehicle in a simple manner, by drawing upon data of a vehicle camera for the determination of the ego-motion, for instance, the data of a rearview camera. In the systems known up to now, images of a vehicle camera are used only for obstacle detection, and not for determining the vehicle's ego-motion; the latter is rather determined based on sensors independent of the surroundings, such as wheel state sensors and steering angle sensors, or via GPS, gyroscope and the like.
  • The present invention leads to an increase in the accuracy of parking assistants, collision warning systems, etc., without increasing their complexity excessively. An improved accuracy in the determination of an ego-motion is helpful, for example, in the calculation, monitoring and adjustment of a parking/unparking trajectory, so as to achieve “gentle” parking in response to automatic guidance, for a collision warning, etc. These advantages are achieved, in part, in that, based on the improved accuracy in the determination of the ego-motion, lower safety distances from objects or potential obstacles along a travel route envelope or a trajectory during parking and unparking, navigating in a cramped surroundings such as a parking garage, etc., are required.
  • The more accurate determination of the vehicle's ego-motion makes possible an improved guidance of the vehicle along a precalculated trajectory during parking, for example. Thus, for example, yawing of the vehicle in the lateral direction may be detected earlier and better, and countersteering may be prompted.
  • The present invention also offers the possibility, additionally or alternatively, of improving methods for estimating the distance from objects in the vehicle's surroundings, which may be based on an ultrasound system, for example. The combination with the determination of motion flows (optical flows) in a camera image sequence enables more accurate and more robust distance estimates. This applies, for example, to the simultaneous use of a rearview camera and an ultrasonic sensor system for a parking assistant. The present invention may also be used for determining a relative speed, perhaps within the scope of a precrash application. In this case, by combining the separation distance data and/or relative speed data determined from ultrasound data and from video data, a threatening collision may be predicted more reliably and more accurately.
  • In the case of a vehicle camera that is already present, no additional hardware components are required to implement the invention. If motion flows and/or optical flow calculation methods in an existing implementation are used already for obstacle detection, corresponding calculation modules for an ego-motion calculation may be used again, which lowers the implementation costs.
  • BRIEF DESCRIPTION OF THE FIGURES
  • FIG. 1 shows a vehicle equipped with a driver assistance system according to the present invention, in an exemplary parking situation.
  • FIG. 2 shows a schematic representation of functional components of the driver assistance system of FIG. 1.
  • FIG. 3 shows a manner of operating the driver assistance system of FIG. 2 in the form of a flow chart.
  • DETAILED DESCRIPTION
  • FIG. 1 illustrates in schematic form a vehicle 100 equipped, according to the present invention, with a driver assistance system 102 in an exemplary parking situation, in which the vehicle 100 is to be parked backwards in a parking space 104 between parking space-limiting objects 106 and 108, which may be parking vehicles, for example. Driver assistance system 102 has a camera 110 implemented as a rearview camera, as well as a set of ultrasonic sensors 112. All the sensors 110, 112 are connected via a vehicle network (not shown) to a central processing component 114, which is implemented as an ECU (“electronic control unit”).
  • Vehicle 100 first moves along path 120, past parking space 104, the latter being measured by additional sensors (not shown) of driver assistance system 102. The driver decides to park, whereupon the vehicle stops at stopping point S1. From this point on, a parking assistant (a subsystem of driver assistance system 102) takes over the further parking in a partially or fully automatic manner. Vehicle 100 is guided backwards up to stopping point S2 along trajectory 122 calculated by the parking assistant, then is pulled forward along a calculated trajectory 124 up to an additional stopping point (not designated) and is then guided into parking space 104, backwards via trajectory 126.
  • In a usual parking assistant, inaccuracies in measuring obstacles and in determining the ego-motion (self-movement) and the instantaneous position of the vehicle require a comparatively large tolerance; that is, in the calculation of trajectories like the ones sketched in FIG. 1, large safety distances from obstacles such as objects 106, 108 are maintained in order to reliably avoid dangerously close approaches.
  • As an example, a safety distance 130 between stopping point S2 and object 106 is drawn in FIG. 1; in usual systems it is of the order of magnitude of 10 cm or more, for example 20 cm. A smaller safety distance of less than 10 cm, such as 5 cm or even only 1-2 cm, would be desirable in order to be able to use tight parking spaces and to park in a parking space in spite of cramped maneuvering space.
  • Inaccuracies in the determination of ego-motion and/or the instantaneous position of a vehicle also lead to the fact that, in the guidance of a vehicle along precalculated trajectories, deviations will occur. In the corrections which may then be required, the inaccuracies mentioned are in turn taken into account. As an example, in FIG. 1, for trajectory 126, the ideal course of the actual vehicle motion is indicated as a dashed line 138 up to reaching final stopping point S4. Reference numerals 132, 134 and 136 show examples of various real motion sequences, which deviate more or less greatly from ideal trajectory 126/138. In the case of trajectories 132 and 136, countersteering has to take place in time to avoid a collision with objects 106 and 108, respectively. For reasons of clarity, only trajectory 126 is discussed here, but these statements analogously apply in the same way for trajectories 122 and 124.
  • The inaccuracies in the determination of the ego-motion and the instantaneous position of the vehicle are advantageously minimized in order to bring deviating curves, like the ones indicated by arrows 132 and 136, under control early. In the exemplary specific embodiment of the present invention described here, an exact determination of the ego-motion is achieved in that data of the rearview camera 110 that is already present are combined with data of the likewise present ultrasonic sensors 112 and data of additional sensors (also already present for other reasons) for measuring the motion of vehicle 100. In this way, without using additional hardware components, increased accuracy is achieved in checking and adjusting the actual vehicle motion to trajectory 126/138, the maximum deviations remaining as low as indicated, for example, by trajectory 134.
  • FIG. 2 schematically illustrates functional components of driver assistance system 102, whose collaboration makes possible an improved determination of the ego-motion of vehicle 100, according to the present invention. Rearview camera 110, ultrasonic sensors 112 and ECU 114 are connected to a vehicle network 202 implemented as a bus system (these elements have already been shown in FIG. 1). Furthermore, at least one sensor system 204 is connected to bus 202; sensor system 204 may include the following, for example: a wheel sensor, or a plurality of such sensors, for the surroundings-independent determination of a vehicle speed; a steering sensor for determining a steering angle; a gyroscope for determining a rotational motion of the vehicle; a GPS sensor; etc.
  • ECU 114 includes a motion flow component 210 for determining motion flows in image sequences, an ego-motion component 212 for determining an ego-motion of vehicle 100 based on motion flows determined in component 210, a trajectory component 214 for calculating, checking and/or adjusting trajectories such as the trajectories 122-126 shown in FIG. 1, an ultrasonic component 216 for calculating distances of and directions towards objects which are detected by the ultrasonic sensor system 112, as well as a sensor component 218 for processing data of sensor system 204.
  • A manner of operating the components of driver assistance system 102 shown in FIG. 2 will be explained in greater detail below, with reference to flow chart 300, in FIG. 3. As was mentioned before, the components shown in FIG. 2 work together to determine (302) an ego-motion of vehicle 100 during parking in a parking space. In step 304, rearview camera 110 takes a sequence of images, successive in time, of the rear surroundings of vehicle 100.
  • In practice, rearview camera 110 will be active during the entire parking process along trajectories 122-126 and will supply to ECU 114 a continuous stream of images. Component 210 of ECU 114 actuates rearview camera 110 in a corresponding way, and stores the image sequence supplied by camera 110 in temporary storage 224 for further processing.
  • In parallel to this, ultrasonic sensors 112 also measure a rear region of the vehicle surroundings and correspondingly supply data via bus 202 to the assigned processing component 216. In addition, sensor system 204 continually supplies instantaneous data on, for example, the speed and steering angle of vehicle 100 to the assigned processing component 218.
  • In step 306, processing component 212 determines a “motion flow” of at least one object identified in the rear vehicle surroundings from an image sequence that includes a plurality of takes successive in time, which are stored in memory area 224. For this purpose, memory area 224 may be designed as a cyclical memory, for example, in which a certain number of takes of rearview camera 110 are temporarily stored for a preceding period of time and are correspondingly overwritten by new takes. The determination, from an image sequence, of the motion flow of points that identify an object, for instance based on optical flow methods, is known per se to one skilled in the art and will therefore not be discussed further.
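  • The cyclical memory mentioned above can be realized, for instance, as a fixed-length ring buffer in which the newest take silently overwrites the oldest; a minimal sketch, with the capacity of 10 frames as an assumed figure:

```python
from collections import deque

class FrameBuffer:
    """Ring buffer for the most recent camera takes; once full, each
    new frame overwrites the oldest stored one."""

    def __init__(self, capacity=10):
        self._frames = deque(maxlen=capacity)

    def push(self, frame):
        self._frames.append(frame)

    def latest_pair(self):
        """Return the two newest frames for a pairwise motion flow
        computation, or None while fewer than two are stored."""
        if len(self._frames) < 2:
            return None
        return self._frames[-2], self._frames[-1]
```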
  • In the exemplary scenario of FIG. 1, particularly objects 106 and 108 are located in the field of view of camera 110. Consequently, processing component 212 would identify objects 106 and 108 from the stored images, and would in each case determine their motion flow.
  • Because rearview camera 110 is fixedly fastened on vehicle 100 (or at least in a known instantaneous position and alignment), component 212 calculates the motion flows of objects 106 and 108 in a coordinate system oriented to the vehicle.
  • In addition to the processing of the optical or infrared data supplied by camera 110 in ECU component 212, ECU 114, using its component 216, processes representations of the rear vehicle surroundings supplied by ultrasonic sensor system 112. From these data, instantaneous distances from, and alignments towards, objects 106, 108 could be determined, for example. These data also refer to a coordinate system oriented to the vehicle; that is, based on the ultrasonic data, independent values for a motion of objects 106 and 108 could be determined in a vehicle-oriented coordinate system. Sensor system 204 is not sensitive to the surroundings of vehicle 100; component 218 processes the corresponding sensor data and provides surroundings-independent data on the instantaneous vehicle motion.
  • In step 308, calculating component 214 receives the processing data of components 212, 216 and 218, i.e. specifically camera-based data on the apparent motion of objects 106 and 108 with reference to vehicle 100 from component 212, corresponding ultrasound-based data from component 216 and data on the ego-motion of vehicle 100 with the aid of, for instance, wheel speed and steering angle from component 218. With the aid of the data, component 214 ascertains an estimate for the “ego-motion” of vehicle 100. Component 214 is able to carry out a data fusion, for example, during which an ego-motion of vehicle 100 ascertained by sensor system 204 independent of the surroundings is corrected, based on the motion flows of objects 106, 108 ascertained from the camera data.
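  • Such a data fusion could, in its simplest form, be a complementary weighting of the odometry-based and camera-based estimates of the same motion quantity. The sketch below blends two yaw-rate estimates; the fixed weight is an assumption, and a Kalman filter would take its place in a fuller implementation.

```python
def fuse_yaw_rate(odometry_yaw_rate, camera_yaw_rate, camera_weight=0.7):
    """Correct the surroundings-independent (odometry) yaw-rate estimate
    with the camera-based estimate obtained from the motion flows of
    static objects. The weight encodes the assumption that the lateral
    motion measured by optical flow is the more accurate of the two."""
    return (camera_weight * camera_yaw_rate
            + (1.0 - camera_weight) * odometry_yaw_rate)
```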
  • For example, the ascertained ego-motion may be corrected with regard to a lateral motion of objects 106 or 108, which is able to be measured very accurately based on the camera data after processing by an optical flow method. Consequently, for instance, slight yawing of the vehicle is able to be detected early as a deviation from a previously calculated trajectory (cf. the corresponding exemplary discussion below).
  • In addition to the sensor data of sensor system 204 and the camera data of camera 110, the ultrasonic data of ultrasonic sensor system 112 may also be drawn upon for the correction and more accurate determination of the ego-motion of vehicle 100. For example, distances from, and directions to, objects 106, 108 may be determined both from the ultrasonic data and from the camera data, and then brought together in a suitable manner. From the curves of the distance values and/or the direction values, the ego-motion ascertained from the surroundings-independent sensors and/or from the camera may then be corrected, on the assumption that the surrounding objects are at rest, for example.
  • In addition or as an alternative, the determination of an ego-motion of vehicle 100 may also take place based on the motion flow of the roadway surface and perhaps of the ground surface of parking bay 104. If a ground surface is taken as the basis for a motion flow calculation, one may assume, in particular, that it is static, i.e. that an ego-motion of vehicle 100 may be reliably determined. Based on the estimates of the ego-motion of vehicle 100, obtained perhaps from the data of camera 110 and also, independently thereof, from sensor system 204, a motion of an object 106 or 108 may also be determined as a further possibility. This would come about, for instance, from a systematic discrepancy between the expected position and the expected distance of the object, as yielded from the ego-motion of vehicle 100 based on sensor system 204, and the camera-based ascertained motion flow of this object.
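  • Treating the ground surface as static, the vehicle motion over one frame interval can be recovered from tracked ground points as the rigid 2-D transform that best maps their earlier ground-plane positions onto their later ones. A least-squares sketch using the standard Procrustes/Kabsch construction; the point sets are assumed to be already projected from the image onto the ground plane.

```python
import numpy as np

def estimate_rigid_motion(pts_prev, pts_curr):
    """Least-squares 2-D rotation and translation mapping ground points
    at time t-1 onto their positions at time t; inverting the transform
    yields the vehicle ego-motion, assuming a static ground surface."""
    p_mean = pts_prev.mean(axis=0)
    c_mean = pts_curr.mean(axis=0)
    # Cross-covariance of the centred point sets.
    h = (pts_prev - p_mean).T @ (pts_curr - c_mean)
    u, _, vt = np.linalg.svd(h)
    rot = vt.T @ u.T
    if np.linalg.det(rot) < 0:  # guard against a reflection solution
        vt[-1, :] *= -1
        rot = vt.T @ u.T
    trans = c_mean - rot @ p_mean
    return rot, trans
```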
  • Calculation component 214 may operate quasi-continuously and may, for instance, at regular intervals of 100 milliseconds, draw upon the processing data of components 210, 216 and 218 with regard to a past time period of 0.1, 0.3, 0.5 or 1.0 seconds (particularly the ego-motion ascertained based on the takes of camera 110 stored temporarily for a corresponding time period).
  • In step 310, calculating component 214 supplies the calculated value of the (instantaneous) ego-motion of vehicle 100 to one or more additional components of driver assistance system 102. Drawn as examples in FIG. 2 are trajectory calculation component 220 and collision warning component 222. Component 220 monitors, among other things, with the aid of the provided ego-motion, to what extent vehicle 100 is actually following the precalculated trajectory 122-126 during parking in parking space 104. Based on the camera-based ascertained ego-motion of vehicle 100, component 220 is able to establish, for example, whether countersteering is required.
  • If the calculated parking trajectory is the ideal 138 shown as a dashed line in FIG. 1, for example, parking vehicle 100 along this trajectory would imply a certain apparent motion of objects 106 and 108 in the field of vision of rearview camera 110. In particular, objects 106 and 108 would move laterally out of the field of vision of the camera in a certain way, and upon approaching stopping point S4 the distance of each of objects 106, 108 would remain constant. If the vehicle instead actually moves, owing to an inaccuracy in the steering (yawing), perhaps on trajectory 132, object 108 would not move out of the field of vision of rearview camera 110 in the expected manner and/or the distance from object 108, as calculated from an optical flow calculation in component 212, would decrease, while at the same time object 106 gradually moves out of the field of vision of rearview camera 110 and its distance increases. From this explanation it becomes clear that the comparison of the precalculated trajectory to the actual trajectory is improved by taking into account the camera-based ascertained motion flows of objects 106, 108.
  • In order to get from the ascertained or the actual trajectory 132 back into the vicinity of the ideal trajectory 138, component 220 initiates appropriate countersteering. Consequently, on an overall basis, only a slight deviation occurs from the ideal, precalculated trajectory 138.
  • For the ascertained or actual trajectory 132, component 222 also, based on the improved ego-motion estimation and the distances from potential obstacles 106, 108 (for instance, ascertained based on the camera or on ultrasound), calculates the probability of a collision and, if necessary, generates corresponding collision warnings, which are either output to the driver or may lead to automatic braking.
  • Sequence 300, shown in FIG. 3, is run through cyclically during parking in parking space 104, so that step 312 branches back to step 304.
  • Whereas, in the exemplary embodiment shown here, ultrasonic data, camera data and surroundings-independent sensor data are all evaluated, in simple systems the determination of a vehicle ego-motion may also take place exclusively camera-based, or, without taking ultrasonic data into account, camera-based in combination with surroundings-independent data. Whereas in the example described above a vehicle motion and a vehicle position were determined, independently of the surroundings, based on vehicle speed and steering angle, an additional or a different sensor system may also supply surroundings-independent data; as an example, a GPS sensor and/or a gyroscope may be used.
  • As was described above, a camera-based and/or an ultrasound-based surroundings map, based on a coordinate system oriented to the vehicle, is able to be transformed into a surroundings map based on a coordinate system oriented in space, in which the ego-motion of the vehicle may be represented. It is also conceivable, vice versa, to retain the vehicle-oriented surroundings map and to take into account the data of the surroundings-independent sensor system accordingly.
  • Collision times (TTCs) may be derived comparatively simply from the determination of motion flows, for example within the scope of an optical flow calculation. These TTCs may be output to a collision warning system directly from the processing of the camera data (or after correction of a vehicle-oriented/space-oriented surroundings map and of the ego-motion of the vehicle by the additional data of an ultrasonic sensor system or further sensor data).
  • The present invention is not limited to the exemplary embodiments described above and the aspects emphasized therein; rather, a multiplicity of modifications within the capability of one skilled in the art is possible within the scope of the present invention.
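The first two items in the list above describe, in effect, a predict-compare-correct loop over the camera-based motion flows. The following minimal Python sketch shows one way such a trajectory check (component 212) and the subsequent countersteering (component 220) could look; the data structure, the field names, and the gain are assumptions introduced purely for illustration and are not taken from the patent.

```python
# Minimal sketch (not the patent's implementation): compare the object motion
# expected along ideal trajectory 138 with the motion actually observed by
# rearview camera 110, and derive a countersteer command from the discrepancy.
from dataclasses import dataclass


@dataclass
class ObjectTrack:
    expected_bearing_rate: float  # rad/s, predicted from trajectory 138
    observed_bearing_rate: float  # rad/s, from optical-flow component 212
    expected_range_rate: float    # m/s, predicted from trajectory 138
    observed_range_rate: float    # m/s, from flow- or ultrasound-based ranging


def trajectory_deviation(tracks):
    """Mean discrepancy between predicted and observed object motion.

    A nonzero value suggests the vehicle is yawing off the planned path,
    e.g. onto actual trajectory 132 instead of ideal trajectory 138.
    """
    if not tracks:
        return 0.0
    err = 0.0
    for t in tracks:
        err += t.observed_bearing_rate - t.expected_bearing_rate
        err += 0.1 * (t.observed_range_rate - t.expected_range_rate)
    return err / len(tracks)


def countersteer_command(deviation, gain=0.8):
    """Proportional countersteering (component 220), clipped to [-1, 1]."""
    return max(-1.0, min(1.0, -gain * deviation))
```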
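For the collision item above, a sketch of how the decision in component 222 could be structured is given below; the thresholds and the two-stage warning scheme are illustrative assumptions, not values from the patent.

```python
# Illustrative sketch of the collision check in component 222: map obstacle
# distance and closing speed (from the improved ego-motion estimate plus
# camera- or ultrasound-based ranging) to a warning level.
def collision_warning(distance_m, closing_speed_ms):
    """Return (warning_level, auto_brake) for one obstacle, e.g. object 108."""
    if closing_speed_ms <= 0.0:
        return "none", False               # range opening: no collision risk
    ttc_s = distance_m / closing_speed_ms  # simple time to collision
    if ttc_s < 0.5:
        return "acute", True               # may trigger automatic braking
    if ttc_s < 1.5:
        return "warn", False               # warning output to the driver
    return "none", False


# Example: obstacle 0.6 m behind the bumper, closing at 0.5 m/s -> TTC 1.2 s.
print(collision_warning(0.6, 0.5))         # ('warn', False)
```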
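The transformation between the vehicle-oriented and the space-oriented surroundings map described above amounts to a planar rigid-body transform by the current vehicle pose. A minimal sketch, assuming a pose (x, y, yaw) supplied by the ego-motion estimate; the function name is illustrative:

```python
# Map points of a vehicle-oriented surroundings map into a space-oriented
# (world) frame using the current vehicle pose from the ego-motion estimate.
import math


def vehicle_to_world(points_vehicle, pose):
    """Transform (x, y) map points from the vehicle frame to the world frame."""
    x_w, y_w, yaw = pose
    c, s = math.cos(yaw), math.sin(yaw)
    return [(x_w + c * xv - s * yv, y_w + s * xv + c * yv)
            for xv, yv in points_vehicle]


# Example: an obstacle 2 m behind the rear axle, vehicle at (10, 5), heading 90°.
print(vehicle_to_world([(-2.0, 0.0)], (10.0, 5.0, math.pi / 2)))
# -> [(10.0, 3.0)] up to floating-point rounding
```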
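Finally, for the TTC item: one comparatively simple way a TTC can be read off a motion flow is from the scale change of a tracked image patch. The sketch below uses the generic monocular relationship TTC ≈ dt · s_new / (s_new − s_old); this is a textbook formula, not code taken from the patent.

```python
# For an approaching object the image scale s of a tracked patch grows between
# frames; the growth rate yields a time to collision without metric distance.
def ttc_from_scale(scale_old, scale_new, dt_s):
    """Time to collision in seconds from the scale change of a tracked patch."""
    growth = scale_new - scale_old
    if growth <= 0.0:
        return float("inf")    # patch not growing: object not approaching
    return dt_s * scale_new / growth


# Example: a patch grows from 40 px to 44 px between frames 40 ms apart.
print(ttc_from_scale(40.0, 44.0, 0.04))  # -> 0.44 s
```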

Claims (11)

1.-10. (canceled)
11. A method for determining an ego-motion of a vehicle, comprising:
taking, by a vehicle camera, a sequence of images, successive in time, of vehicle surroundings;
determining, based on the image sequence, at least one motion flow with respect to an object in the vehicle surroundings; and
determining the ego-motion of the vehicle based on the at least one motion flow.
12. The method as recited in claim 11, wherein the vehicle camera is a rearview camera that takes images of the vehicle surroundings from a rear perspective.
13. The method as recited in claim 11, wherein the determining of the ego-motion of the vehicle includes drawing data from at least one vehicle sensor that measures at least one of an instantaneous speed and an instantaneous steering angle.
14. The method as recited in claim 11, wherein the determining of the ego-motion of the vehicle includes drawing data from at least one ultrasonic sensor of the vehicle that measures a distance from the object in the vehicle surroundings.
15. The method as recited in claim 11, further comprising:
calculating a time-to-collision from the at least one motion flow, the time-to-collision being factored into at least one of the determining of the ego-motion and a determining of a collision warning.
16. The method as recited in claim 11, further comprising:
detecting the ego-motion of the object in the vehicle surroundings based on the at least one determined motion flow and at least one of at least one additional determined motion flow and additional sensor data.
17. A method in a parking assistance system of a vehicle for at least one of a continuous calculation, checking, and adjustment of a parking trajectory, comprising:
determining an ego-motion of the vehicle by:
taking, by a vehicle camera, a sequence of images, successive in time, of vehicle surroundings;
determining, based on the image sequence, at least one motion flow with respect to an object in the vehicle surroundings; and
determining the ego-motion of the vehicle based on the at least one motion flow.
18. A method for collision warning in a driver assistance system of a vehicle, comprising:
determining an ego-motion of the vehicle by:
taking, by a vehicle camera, a sequence of images, successive in time, of vehicle surroundings;
determining, based on the image sequence, at least one motion flow with respect to an object in the vehicle surroundings; and
determining the ego-motion of the vehicle based on the at least one motion flow.
19. A computer program containing instructions for execution on a programmable computer device, the computer instructions when executed resulting in performing a method for determining an ego-motion of a vehicle, comprising:
taking, by a vehicle camera, a sequence of images, successive in time, of vehicle surroundings;
determining, based on the image sequence, at least one motion flow with respect to an object in the vehicle surroundings; and
determining the ego-motion of the vehicle based on the at least one motion flow.
20. A driver assistance system for determining an ego-motion of a vehicle, comprising:
a vehicle camera for taking a sequence of images, successive in time, of vehicle surroundings;
a component for determining, based on the image sequence, at least one motion flow with respect to an object in the vehicle surroundings; and
a component for determining the ego-motion of the vehicle based on the at least one motion flow.
US13/994,893 2010-12-15 2011-12-06 Method and system for determining an ego-motion of a vehicle Active 2034-01-04 US9789816B2 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
DE102010063133 2010-12-15
DE102010063133A DE102010063133A1 (en) 2010-12-15 2010-12-15 Method and system for determining a self-motion of a vehicle
DE102010063133.7 2010-12-15
PCT/EP2011/071961 WO2012080044A1 (en) 2010-12-15 2011-12-06 Method and system for determining a self-movement of a vehicle

Publications (2)

Publication Number Publication Date
US20130335553A1 true US20130335553A1 (en) 2013-12-19
US9789816B2 US9789816B2 (en) 2017-10-17

Family

ID=45418637

Family Applications (1)

Application Number Title Priority Date Filing Date
US13/994,893 Active 2034-01-04 US9789816B2 (en) 2010-12-15 2011-12-06 Method and system for determining an ego-motion of a vehicle

Country Status (6)

Country Link
US (1) US9789816B2 (en)
EP (1) EP2652706B1 (en)
JP (1) JP6005055B2 (en)
CN (1) CN103534729A (en)
DE (1) DE102010063133A1 (en)
WO (1) WO2012080044A1 (en)

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102013100040A1 (en) * 2013-01-03 2014-07-03 E-Lead Electronic Co., Ltd. Method for guiding reverse parking aid for motor car, involves computing park path for car by detector unit, and parking motor car on park surface based on overlapping of park path until mark of reverse park path expires completely
DE102013201379B4 (en) * 2013-01-29 2020-12-10 Robert Bosch Gmbh Motorcycle with a camera system
AT514588B1 (en) * 2013-08-29 2015-02-15 Tech Universität Wien Method for controlling a vehicle
CN105279767B (en) * 2014-12-26 2019-01-18 天津光电高斯通信工程技术股份有限公司 Train arrives at a station the recognition methods of state
KR101618501B1 (en) 2015-02-04 2016-05-09 한국기술교육대학교 산학협력단 Method for ego-motion estimation of vehicle
DE102015202230A1 (en) 2015-02-09 2016-08-11 Conti Temic Microelectronic Gmbh Fused eigenmotion calculation for a vehicle
DE102016204654A1 (en) * 2016-03-21 2017-09-21 Conti Temic Microelectronic Gmbh Device and method for self-motion estimation for a motor vehicle
JP6812173B2 (en) * 2016-08-31 2021-01-13 アイシン精機株式会社 Parking support device
DE102017207483A1 (en) * 2016-12-15 2018-06-21 Continental Teves Ag & Co. Ohg CONTROL DEVICE FOR A VEHICLE, BRAKE CONTROL DEVICE AND METHOD FOR CONTROLLING A VEHICLE
CN108454512B (en) * 2017-02-20 2021-06-04 奥迪股份公司 Device and method for avoiding foreign matters around vehicle wheel and vehicle
DE102017107701A1 (en) 2017-04-10 2018-10-11 Valeo Schalter Und Sensoren Gmbh A method of remotely maneuvering a motor vehicle on a parking area, a parking area infrastructure device, and a parking area communication system
DE102017214666A1 (en) * 2017-08-22 2019-02-28 Robert Bosch Gmbh Method and device for estimating a self-motion of a vehicle
DE102017217016A1 (en) * 2017-09-26 2019-03-28 Robert Bosch Gmbh Distribution device and method for distributing data streams for a control device for a highly automated mobile vehicle
DE102017219119A1 (en) * 2017-10-25 2019-04-25 Volkswagen Aktiengesellschaft Method for detecting the shape of an object in an exterior of a motor vehicle and motor vehicle
DE102019102923B4 (en) * 2019-02-06 2022-12-01 Bayerische Motoren Werke Aktiengesellschaft Method and device for sensor data fusion for a vehicle
TWI786311B (en) * 2019-07-04 2022-12-11 先進光電科技股份有限公司 Mobile vehicle assist system and parking control method thereof

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7446798B2 (en) 2003-02-05 2008-11-04 Siemens Corporate Research, Inc. Real-time obstacle detection with a calibrated camera and known ego-motion
DE10307607A1 (en) * 2003-02-22 2004-09-09 Daimlerchrysler Ag Distributed image processing system for motor vehicles
JP4081548B2 (en) 2004-02-02 2008-04-30 独立行政法人産業技術総合研究所 Driving support system
DE102004048191A1 (en) * 2004-09-30 2006-04-06 Robert Bosch Gmbh Method and device for detecting an imminent collision
DE102005000651A1 (en) 2005-01-04 2006-07-13 Robert Bosch Gmbh Method for determining the intrinsic motion of a vehicle
JP2007176324A (en) * 2005-12-28 2007-07-12 Aisin Seiki Co Ltd Parking assist device
DE102006027123A1 (en) 2006-06-12 2007-12-13 Robert Bosch Gmbh Procedure for recording a traffic area
GB2447672B (en) * 2007-03-21 2011-12-14 Ford Global Tech Llc Vehicle manoeuvring aids
DE102007022524A1 (en) 2007-05-14 2008-11-20 Bayerische Motoren Werke Aktiengesellschaft motor vehicle
US20080306666A1 (en) * 2007-06-05 2008-12-11 Gm Global Technology Operations, Inc. Method and apparatus for rear cross traffic collision avoidance
KR101188588B1 (en) 2008-03-27 2012-10-08 주식회사 만도 Monocular Motion Stereo-Based Free Parking Space Detection Apparatus and Method
DE102008036009B4 (en) 2008-03-28 2018-03-22 Volkswagen Ag Method for collision protection of a motor vehicle and parking garage assistant
US8155853B2 (en) 2008-06-26 2012-04-10 GM Global Technology Operations LLC Mechanical time dilation algorithm for collision avoidance system
JP4453775B2 (en) 2008-06-27 2010-04-21 トヨタ自動車株式会社 Object detection device
DE102008058279A1 (en) 2008-11-20 2010-05-27 Hella Kgaa Hueck & Co. Method and device for compensating a roll angle
EP3588939B1 (en) * 2010-10-31 2023-10-18 Mobileye Vision Technologies Ltd. Bundling night vision and other driver assistance systems (das) using near infra red (nir) illumination and a rolling shutter

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6704621B1 (en) * 1999-11-26 2004-03-09 Gideon P. Stein System and method for estimating ego-motion of a moving vehicle using successive images recorded along the vehicle's path of motion
US7113867B1 (en) * 2000-11-26 2006-09-26 Mobileye Technologies Limited System and method for detecting obstacles to vehicle motion and determining time to contact therewith using sequences of images
US7437244B2 (en) * 2004-01-23 2008-10-14 Kabushiki Kaisha Toshiba Obstacle detection apparatus and a method therefor
US8861792B2 (en) * 2004-04-08 2014-10-14 Mobileye Technologies Ltd. Collison warning system
US7899211B2 (en) * 2005-12-07 2011-03-01 Nissan Motor Co., Ltd. Object detecting system and object detecting method
US20070154068A1 (en) * 2006-01-04 2007-07-05 Mobileye Technologies, Ltd. Estimating Distance To An Object Using A Sequence Of Images Recorded By A Monocular Camera
US8559674B2 (en) * 2007-12-25 2013-10-15 Toyota Jidosha Kabushiki Kaisha Moving state estimating device
US20090182690A1 (en) * 2008-01-15 2009-07-16 Gideon Stein Detection and Classification of Light Sources Using a Diffraction Grating
US20100305857A1 (en) * 2009-05-08 2010-12-02 Jeffrey Byrne Method and System for Visual Collision Detection and Estimation
US20110115615A1 (en) * 2009-11-19 2011-05-19 Robert Bosch Gmbh Rear-view multi-functional camera system
US9233659B2 (en) * 2011-04-27 2016-01-12 Mobileye Vision Technologies Ltd. Pedestrian collision warning system

Cited By (41)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9927813B1 (en) * 2012-09-28 2018-03-27 Waymo Llc Detecting sensor degradation by actively controlling an autonomous vehicle
US9274525B1 (en) * 2012-09-28 2016-03-01 Google Inc. Detecting sensor degradation by actively controlling an autonomous vehicle
US11327501B1 (en) * 2012-09-28 2022-05-10 Waymo Llc Detecting sensor degradation by actively controlling an autonomous vehicle
US10310509B1 (en) * 2012-09-28 2019-06-04 Waymo Llc Detecting sensor degradation by actively controlling an autonomous vehicle
US9594379B1 (en) * 2012-09-28 2017-03-14 Google Inc. Detecting sensor degradation by actively controlling an autonomous vehicle
US10591924B1 (en) * 2012-09-28 2020-03-17 Waymo Llc Detecting sensor degradation by actively controlling an autonomous vehicle
US20150134185A1 (en) * 2013-11-08 2015-05-14 Samsung Techwin Co., Ltd. Method of generating optimum parking path of unmanned driving vehicle, and unmanned driving vehicle adopting the method
US9254870B2 (en) * 2013-11-08 2016-02-09 Hanwha Techwin Co., Ltd. Method of generating optimum parking path of unmanned driving vehicle, and unmanned driving vehicle adopting the method
CN106794838A (en) * 2014-08-05 2017-05-31 法雷奥开关和传感器有限责任公司 For at least semi-autonomous method for manipulating motor vehicles, driver assistance system and motor vehicles
US10345813B2 (en) * 2014-08-05 2019-07-09 Valeo Schalter Und Sensoren Gmbh Method for the at least semi-autonomous manoeuvring of a motor vehicle, driver assistance system and motor vehicle
WO2016020355A3 (en) * 2014-08-05 2016-03-31 Valeo Schalter Und Sensoren Gmbh Method for the at least semi-autonomous manoeuvring of a motor vehicle, driver assistance system and motor vehicle
US11167755B2 (en) 2015-02-07 2021-11-09 Hella Kgaa Hueck & Co. Method for at least partially automatically controlling a motor vehicle
US20160229398A1 (en) * 2015-02-07 2016-08-11 Hella Kgaa Hueck & Co. Method for at least partially automatically controlling a motor vehicle
US10081356B2 (en) * 2015-02-07 2018-09-25 Hella Kgaa Hueck & Co. Method for at least partially automatically controlling a motor vehicle
US9631936B2 (en) * 2015-02-10 2017-04-25 Mobileye Vision Technologies Ltd. Forward navigation based on rearward facing camera
US10053090B2 (en) * 2015-05-20 2018-08-21 Volkswagen Ag Method for providing user-defined customization of a vehicle
US11104327B2 (en) * 2015-07-13 2021-08-31 Magna Electronics Inc. Method for automated parking of a vehicle
US20180304887A1 (en) * 2015-10-22 2018-10-25 Robert Bosch Gmbh Method and device for reducing a risk of a collision of a motor vehicle with an object
US10363960B2 (en) * 2015-11-06 2019-07-30 Ford Global Technologies, Llc Method and device for assisting a maneuvering process of a motor vehicle
US20180253103A1 (en) * 2015-11-09 2018-09-06 Bayerische Motoren Werke Aktiengesellschaft Method and Device for Controlling a Trajectory Planning Process of an Ego-Vehicle
US10739778B2 (en) * 2015-11-09 2020-08-11 Bayerische Motoren Werke Aktiengesellschaft Method and device for controlling a trajectory planning process of an ego-vehicle
US20180105208A1 (en) * 2015-11-10 2018-04-19 Hyundai Motor Company Automatic parking system and automatic parking method
US10919574B2 (en) * 2015-11-10 2021-02-16 Hyundai Motor Company Automatic parking system and automatic parking method
US9981691B2 (en) * 2015-11-17 2018-05-29 Mitsubishi Electric Corporation Vehicle steering control apparatus
US20170137061A1 (en) * 2015-11-17 2017-05-18 Mitsubishi Electric Corporation Vehicle steering control apparatus
US11001301B2 (en) * 2016-01-21 2021-05-11 Valeo Schalter Und Sensoren Gmbh Method for moving a motor vehicle out of a parking space with at least semi-autonomous manoeuvring of the motor vehicle up to an end position, driver assistance system and motor vehicle
CN108475058A (en) * 2016-02-10 2018-08-31 赫尔实验室有限公司 Time to contact estimation rapidly and reliably is realized so as to the system and method that carry out independent navigation for using vision and range-sensor data
US20180101739A1 (en) * 2016-10-07 2018-04-12 Ford Global Technologies, Llc Rear obstacle detection and distance estimation
US10318826B2 (en) * 2016-10-07 2019-06-11 Ford Global Technologies, Llc Rear obstacle detection and distance estimation
US10068140B2 (en) 2016-12-02 2018-09-04 Bayerische Motoren Werke Aktiengesellschaft System and method for estimating vehicular motion based on monocular video data
WO2018102697A1 (en) * 2016-12-02 2018-06-07 Bayerische Motoren Werke Aktiengesellschaft System and method for estimating vehicular motion based on monocular video data
US11427221B2 (en) * 2017-05-19 2022-08-30 Zf Cv Systems Europe Bv Method and control device for the autonomous emergency braking of an ego vehicle
US20200198630A1 (en) * 2017-07-26 2020-06-25 Jaguar Land Rover Limited Proximity sensing systems and their control
US10279740B1 (en) * 2018-06-21 2019-05-07 Hongfujin Precision Electronics(Tianjin)Co.,Ltd. Warning apparatus and warning method
US20200031312A1 (en) * 2018-07-24 2020-01-30 Nxp B.V. Methods and apparatuses involving vehicle tow-away detection
US11699207B2 (en) 2018-08-20 2023-07-11 Waymo Llc Camera assessment techniques for autonomous vehicles
US11227409B1 (en) 2018-08-20 2022-01-18 Waymo Llc Camera assessment techniques for autonomous vehicles
US20210403012A1 (en) * 2018-10-30 2021-12-30 Daimler Ag Method for checking at least one driving environment sensor of a vehicle
US11787424B2 (en) * 2018-10-30 2023-10-17 Daimler Ag Method for checking at least one driving environment sensor of a vehicle
WO2020139456A1 (en) * 2018-12-27 2020-07-02 Intel Corporation A method and apparatus to determine a trajectory of motion in a predetermined region
US20220196422A1 (en) * 2020-12-21 2022-06-23 Faurecia Clarion Electronics Co., Ltd. Parking assistance device and parking assistance method

Also Published As

Publication number Publication date
JP2014501401A (en) 2014-01-20
DE102010063133A1 (en) 2012-06-21
EP2652706B1 (en) 2018-07-11
WO2012080044A1 (en) 2012-06-21
EP2652706A1 (en) 2013-10-23
US9789816B2 (en) 2017-10-17
JP6005055B2 (en) 2016-10-12
CN103534729A (en) 2014-01-22

Similar Documents

Publication Publication Date Title
US9789816B2 (en) Method and system for determining an ego-motion of a vehicle
US10988139B2 (en) Vehicle position control method and device vehicle position control device for correcting position in drive-assisted vehicle
US9911330B2 (en) Driving assistance device and driving assistance method
JP6432679B2 (en) Stop position setting apparatus and method
US9796416B2 (en) Automated driving apparatus and automated driving system
US9714034B2 (en) Vehicle control device
US10345813B2 (en) Method for the at least semi-autonomous manoeuvring of a motor vehicle, driver assistance system and motor vehicle
US8170739B2 (en) Path generation algorithm for automated lane centering and lane changing control system
US11874660B2 (en) Redundant lateral velocity determination and use in secondary vehicle control systems
US20180162390A1 (en) Vehicle Control Device and Vehicle Control Method
WO2016194168A1 (en) Travel control device and method
CN112771591B (en) Method for evaluating the influence of an object in the environment of a vehicle on the driving maneuver of the vehicle
GB2486814A (en) Method for assisting a driver of a motor vehicle
US11390272B2 (en) Parking assistance device
US20200363218A1 (en) Driver assistance system and control method for the same
US20180022344A1 (en) Vehicle pivot technique
KR20200052997A (en) Apparatus of straight driving recognition for autonomous vehicle dead-reckoning performance improvement, and method thereof
JP2018167735A (en) Steering support device of vehicle
US10775804B1 (en) Optical array sensor for use with autonomous vehicle control systems
KR102259603B1 (en) Apparatus for calculating distance between vehicles and method thereof
KR102286747B1 (en) Apparatus for evaluating highway drive assist system and method thereof, highway drive assist system
JP2022139380A (en) Automatic operation vehicle
JP2023083942A (en) Positional accuracy determination device, positional accuracy determination computer program, and positional accuracy determination method
JP2022140084A (en) Automatic driving vehicle
JP2023049571A (en) Steering control method and steering control device

Legal Events

Date Code Title Description
AS Assignment

Owner name: ROBERT BOSCH GMBH, GERMANY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HEGER, THOMAS;SIMON, STEPHAN;HELMLE, MICHAEL;SIGNING DATES FROM 20130701 TO 20130709;REEL/FRAME:031146/0026

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4