WO2016020718A1 - Method and apparatus for determining the dynamic state of a vehicle - Google Patents

Method and apparatus for determining the dynamic state of a vehicle

Info

Publication number
WO2016020718A1
Authority
WO
WIPO (PCT)
Prior art keywords
characteristic
pixels
vehicle
points
frame
Prior art date
Application number
PCT/IB2014/001868
Other languages
English (en)
Inventor
Akshay V. GOKHALE
Sascha WIRGES
Harsha Badarinarayan
Original Assignee
Hitachi Automotive Systems, Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi Automotive Systems, Ltd. filed Critical Hitachi Automotive Systems, Ltd.
Priority to PCT/IB2014/001868 priority Critical patent/WO2016020718A1/fr
Publication of WO2016020718A1 publication Critical patent/WO2016020718A1/fr

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10016 Video; Image sequence
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30244 Camera pose
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30248 Vehicle exterior or interior
    • G06T2207/30252 Vehicle exterior; Vicinity of vehicle

Definitions

  • the present invention relates generally to a method and apparatus for determining the dynamic state of a vehicle, such as an automotive vehicle.
  • a major drawback of these previously known systems is that they rely on image processing techniques, such as object detection, to determine the dynamic state.
  • such image processing involves a large amount of data for object detection and classification, so that processing the images in real time is infeasible.
  • the present invention provides both a method and system for determining the dynamic state of a vehicle in real time which can be used to control various vehicle systems, such as the suspension system of the vehicle, for a more comfortable and/or safer ride.
  • the system utilizes a number of video cameras as the input sensor. These video cameras are arranged in clusters with two or more cameras having overlapping fields of view. Each camera, furthermore, acquires a video image at spaced time intervals t, t+1, t+2, ... t+N. From each of the video images, characteristic pixels are identified which have a unique feature compared to the other pixels in the image. Various qualities may be used to identify such characteristic pixels; for example, pixels having a unique intensity, a unique edge, a unique feature, or a unique texture in the neighborhood of the pixel may all serve as characteristic pixels.
  • a three-dimensional point map with the characteristic pixels or points is then reconstructed in three-dimensional space from the images or pictures taken by the plurality of cameras.
  • the motion of the characteristic three-dimensional points from a first frame to a second, subsequent frame is then determined so that the yaw, roll, pitch, or heave rate of the vehicle, together with the translation vectors in the X, Y, and Z space between the time frames, may be estimated to determine the dynamic state of the vehicle. That dynamic state is then used to control the vehicle systems, such as the suspension system, to provide a more comfortable or safer ride for the vehicle.
  • FIG. 1 is a diagrammatic view illustrating a vehicle
  • FIG. 2 is a system level process flowchart of the system
  • FIG. 3 is a flowchart illustrating the real time vehicle dynamic state estimation
  • FIG. 4 is a general flowchart illustrating the operation
  • FIGS. 5A and 5B are three-dimensional point maps in two subsequent time segments
  • FIG. 6 is a 3D point map of two subsequent time segments represented in a single 3D plot
  • FIG. 7 is a diagrammatic view illustrating the slip rate of the vehicle
  • FIG. 8 is a flowchart illustrating the road surface condition estimation
  • FIG. 9 is a flowchart illustrating the operation of the suspension system.
  • FIG. 10 is a diagrammatic view illustrating a model of a suspension system.
  • with reference to FIG. 1, a vehicle 20 is there shown having a body 22.
  • a plurality of wheels 24 are rotatably mounted to the body 22 in the conventional fashion.
  • Each wheel 24, furthermore, is supported to the body 22 by a suspension system 26.
  • An electronic control unit (ECU) 28 controls the operation of the overall systems of the vehicle.
  • the ECU 28 controls the operation of the suspension system 26 through output signals on output lines 30.
  • a wheel sensor 32 is also associated with each wheel 24 and provides an output signal as an input signal to the ECU 28 representative of the speed of its associated wheel.
  • the ECU 28 also communicates with a number of other systems on the vehicle 20, preferably through a bus 36, such as a CAN bus. These other systems include the electronic stability control (ESC) system 33, an antilock braking system 42, a brake system 38, engine control system 40, and electronic power steering (EPS) system 44.
  • a plurality of cameras 46 are mounted at known and predetermined locations on the automotive vehicle 20. These cameras 46, furthermore, are arranged in clusters so that at least two cameras have overlapping fields of view. Each camera 46 provides a digital output signal for each image or picture acquisition as an input signal to the ECU 28.
  • the position of each camera 46, furthermore, is known relative to a center of gravity 48 of the automotive vehicle 20.
  • the position of the cameras 46 is preferably selected to minimize the complexity of converting the field of view for each image taken by the cameras 46 to a field of view from the center of gravity 48.
  • the ECU 28 also preferably includes a GPS module which communicates with a GPS satellite 50.
  • a vehicle-to-infrastructure (V2I) system is also contained within the ECU 28 to enable communication with infrastructure communication systems 52, and a vehicle-to-vehicle (V2V) system is also contained within the ECU 28 to enable communication with systems in other vehicles.
  • step 53 proceeds to step 55 in which data, i.e. the images or pictures, is acquired from each of the N cameras.
  • step 55 then proceeds to step 54 where the image frames are stored in an image buffer.
  • step 54 then proceeds to step 56.
  • at step 56, the system reconstructs a three-dimensional (3D) point map of selected pixels in a fashion that will subsequently be described in greater detail.
  • step 56 then proceeds to step 58 where the real time dynamic state of the vehicle is obtained.
  • the method for obtaining the dynamic state of the vehicle will also be subsequently described in greater detail.
  • Step 58 also receives an input from the GPS and wheel speed sensors from step 74 to account for both longitudinal and lateral slip of the vehicle.
  • step 58 proceeds to step 60.
  • Step 60 then outputs control signals to one or more systems, such as the suspension system, of the automotive vehicle in order to improve the comfort or safety of the vehicle.
  • Step 54 also proceeds through parallel processing to step 62 which also reconstructs the 3D image of selected pixels from the camera images.
  • Step 62 proceeds to step 64 which also receives data from step 56, to estimate the road height.
  • step 66 determines the road surface condition;
  • step 68 determines the pedestrian, obstacle, and traffic; and
  • step 70 detects traffic signals, signs, etc.
  • V2V and V2I communications may also be received from nearby vehicles and infrastructure at step 76 so that the vehicle dynamic state may be estimated using alternative methods at step 78.
  • Step 78 then proceeds to step 60 to control the vehicle systems.
  • step 80 proceeds to step 82 where the image data from the cameras is acquired.
  • step 82 proceeds to step 84 where the image frames are stored in an image buffer and then to step 86 where a 3D point map is constructed from selected pixels or points in the camera images.
  • Step 86 then proceeds to step 88 where both the rotation vectors, i.e. yaw, roll, and pitch, as well as translational vectors in the X, Y, and Z axes of the velocity, are estimated in a fashion subsequently described. Step 88 then proceeds to step 90.
  • at step 90, the vehicle acceleration, jerk, and displacement at the center of gravity are estimated by transforming the vector data obtained from the camera images to the viewpoint of the center of gravity 48 of the vehicle 20. Step 90 then proceeds to step 92.
  • Step 92 receives the GPS data and wheel speed sensor data from step 74 and then utilizes this data to estimate both the longitudinal and lateral slip of the vehicle. Step 92 then proceeds to step 60 (see also FIGS. 2 and 3) to control one or more systems in the automotive vehicle.
  • the present system does not conduct full image processing on all of the images contained within the pictures acquired by the cameras. Instead, in the present system, unique pixels are identified in each captured camera frame. These unique pixels may be identified in several different ways.
  • where pixels within an image have a unique intensity or grayscale value that is easily distinguished from the surrounding pixels, those pixels are identified as unique pixels.
  • pixels in shadow or covered in water will have a different gray value than the pixels on the street itself. Consequently, any pixels adjacent to such an entity are termed unique pixels with a unique neighborhood and can thus be used for further calculations.
  • pixels representing an edge of an object, like the edge of the road or some other object, will have a different threshold of brightness or color. Pixels adjacent to these edge pixels can be termed unique pixels with a unique neighborhood. These unique pixels are then used for further calculation.
  • still other pictures may have a unique feature, such as a pothole or bump on the road, with a different threshold value for brightness as compared with the surrounding pixels. These pixels with different brightness are then termed unique pixels with a unique neighborhood and are used for further calculation.
  • pixels representing a unique texture, such as cracks in the road surface, differ from the texture of the surrounding road. Groups of pixels adjacent to the cracks are unique pixels with a unique or different neighborhood. These groups of pixels are then used to calculate the motion of the vehicle.
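As an illustration only, the following sketch shows one way such characteristic pixels with unique neighborhoods could be selected, by comparing each pixel against the mean of its local neighborhood; the window size, threshold, and function name are assumptions, not values from the patent.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def characteristic_pixels(gray, window=7, threshold=25.0):
    """Return (row, col) coordinates of pixels whose gray value deviates from
    the mean of their local neighborhood by more than `threshold` gray levels."""
    gray = np.asarray(gray, dtype=np.float32)
    local_mean = uniform_filter(gray, size=window)   # neighborhood average
    deviation = np.abs(gray - local_mean)            # how "unique" each pixel is locally
    rows, cols = np.nonzero(deviation > threshold)
    return np.stack([rows, cols], axis=1)
```

In practice only a small fraction of the image passes such a test, which is what keeps the later 3D reconstruction and motion estimation tractable in real time.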
  • the ECU 28 calibrates the cameras to obtain the relationship between the world coordinate system and the image coordinate system.
  • the mapping of a pixel w_i in the image frame f_j to a line of view l_i in world coordinates is given by the camera calibration.
  • the FOV L_j of each camera j is obtained from the set of image points l_j.
  • the group of world points in the cameras having overlapping field of view can thus be determined and the relation between the world points' position and the camera cluster centroid is obtained.
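The calibration relation itself is not reproduced in this text. Purely for illustration, a standard pinhole back-projection of a pixel to a line of view in world coordinates is sketched below; K, R, and t stand for assumed intrinsic and extrinsic calibration parameters, and the function name is hypothetical.

```python
import numpy as np

def pixel_to_world_ray(u, v, K, R, t):
    """Back-project pixel (u, v) to a line of view in world coordinates.
    K: 3x3 intrinsics; R (3x3), t (3,): world-to-camera rotation and translation."""
    direction_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])   # viewing direction, camera frame
    camera_center = -R.T @ t                                    # camera position in world frame
    direction_world = R.T @ direction_cam
    return camera_center, direction_world / np.linalg.norm(direction_world)
```

Intersecting the rays of the same characteristic point seen by two overlapping cameras of a cluster then gives one way its 3D world position can be recovered for the point map.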
  • the actual or real time dynamic state of the vehicle is defined by the rotational vector and the translation vector.
  • the rotational vector takes into account the three degrees of rotation of the vehicle, namely yaw, roll, and pitch, while the translation vector accounts for the movement of the vehicle in the X, Y, and Z coordinate system.
  • ω_i and t_i are the rotation and translation vectors for camera i, respectively.
  • step 94 proceeds to step 96 where the camera images are captured.
  • step 96 then proceeds to step 98 where the pixels corresponding to features with unique neighboring pixels are identified.
  • step 98 then proceeds to step 100.
  • at step 100, the relationship between the pixel in the camera frame and the world coordinate system is defined.
  • Step 100 then proceeds to step 102 where the relationship between the center of gravity of the vehicle and the points in the camera frame, expressed in the world coordinate system, is defined.
  • Step 102 then proceeds to step 104 where the relationship between the center of gravity of the vehicle and the points defines a three-dimensional point map.
  • such a three-dimensional point map of the unique pixels is shown in FIG. 5A at time t1 and in FIG. 5B at time t2, where t2 is subsequent to t1.
  • these pixels 110 having a unique background area are plotted in the three-dimensional space. Since only a limited number of pixels 110 have unique backgrounds, only a limited number of pixels 110 are plotted within a typical three-dimensional point map. This reduces the amount of computational power required as compared with full image processing.
  • in FIG. 6, a 3D plot of both the images from FIGS. 5A and 5B is shown, illustrating the movement of the centroids x̄ and ȳ.
  • quaternions are used to express rotation of the vehicle and find the optimal rotation matrix having the yaw, pitch, and roll vectors.
  • this minimization problem is transformed into an eigenvalue problem for a matrix N depending upon the sums of products of the centered point coordinates x'_i and y'_i.
  • the eigenvector corresponding to the most positive eigenvalue of N is a quaternion representing the rotation Δω.
  • the rotation matrix R(Δω) can be easily calculated by using the Rodrigues rotation formula. More specifically, the translation velocity v is equal to the difference of the first centroid and the scaled and rotated second centroid, as shown in FIG. 6.
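The matrix N and the centroid relation are not reproduced in this text. For illustration, the sketch below implements the generic quaternion-based absolute-orientation (Horn) procedure that the description appears to follow: a 4x4 matrix is built from the sums of products of the centered point coordinates, its dominant eigenvector gives the rotation quaternion, and the translation follows from the centroids. Function and variable names are illustrative, and point correspondences between the two frames are assumed known.

```python
import numpy as np

def estimate_motion(points_t, points_t1):
    """points_t, points_t1: (N, 3) arrays of matched characteristic 3D points
    at frames t and t+1. Returns the rotation matrix R and translation vector."""
    cx = points_t.mean(axis=0)           # centroid of the first frame
    cy = points_t1.mean(axis=0)          # centroid of the second frame
    X = points_t - cx                    # centered coordinates x'_i
    Y = points_t1 - cy                   # centered coordinates y'_i
    S = X.T @ Y                          # 3x3 sums of products of centered coordinates
    Sxx, Sxy, Sxz = S[0]
    Syx, Syy, Syz = S[1]
    Szx, Szy, Szz = S[2]
    # Symmetric 4x4 matrix whose dominant eigenvector is the rotation quaternion.
    N = np.array([
        [Sxx + Syy + Szz, Syz - Szy,        Szx - Sxz,        Sxy - Syx],
        [Syz - Szy,       Sxx - Syy - Szz,  Sxy + Syx,        Szx + Sxz],
        [Szx - Sxz,       Sxy + Syx,       -Sxx + Syy - Szz,  Syz + Szy],
        [Sxy - Syx,       Szx + Sxz,        Syz + Szy,       -Sxx - Syy + Szz]])
    eigvals, eigvecs = np.linalg.eigh(N)
    w, x, y, z = eigvecs[:, np.argmax(eigvals)]   # quaternion for the most positive eigenvalue
    R = np.array([                                # quaternion to rotation matrix
        [1 - 2*(y*y + z*z), 2*(x*y - z*w),     2*(x*z + y*w)],
        [2*(x*y + z*w),     1 - 2*(x*x + z*z), 2*(y*z - x*w)],
        [2*(x*z - y*w),     2*(y*z + x*w),     1 - 2*(x*x + y*y)]])
    translation = cy - R @ cx            # difference of the centroids after rotation
    return R, translation
```

The returned rotation maps points of the frame at time t onto the frame at time t+1; yaw, roll, and pitch rates then follow by converting R to Euler angles and dividing by the frame interval.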
  • the translation velocity i.e. velocity of vehicle in the X, Y, and Z direction
  • angular rates i.e. rate of change of yaw, pitch, and roll
  • the rate of change in orientation around all three axes, i.e. the vector dω
  • the rate of change in position (velocity) along all three axes, i.e. the vector dt, is determined from the frame at time t to the frame at time t+1
  • each of these rates of change is obtained by dividing the change between the two frames by the time interval ((t+1)-t).
  • the accelerations in the X, Y, and Z directions can then be approximated by obtaining the discrete time derivative of the velocities in the X, Y, and Z directions.
  • the acceleration data obtained is filtered first. Then the lateral and longitudinal jerk required for G-vectoring control is approximated by the discrete time derivative of the accelerations.
  • the displacements of the vehicle along the X, Y, and Z axes can be approximated by the discrete sum of the velocities over time.
  • the vehicle yaw, roll, and pitch angles can be approximated by integrating the yaw, roll, and pitch rates over the discrete time interval, respectively.
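As a purely illustrative sketch of these discrete-time operations (not the patent's own implementation), the following computes the accelerations and jerk as finite differences of the per-frame velocity estimates, and the displacements and orientation angles as cumulative sums of the velocities and angular rates over an assumed fixed frame interval dt.

```python
import numpy as np

def dynamic_state_series(velocities, angular_rates, dt):
    """velocities, angular_rates: (T, 3) per-frame estimates in X, Y, Z / yaw, roll, pitch.
    dt: frame interval in seconds, i.e. (t+1) - t."""
    accelerations = np.diff(velocities, axis=0) / dt     # a[k] = (v[k+1] - v[k]) / dt
    jerk = np.diff(accelerations, axis=0) / dt           # discrete derivative of acceleration
    displacements = np.cumsum(velocities * dt, axis=0)   # discrete sum of velocities over time
    angles = np.cumsum(angular_rates * dt, axis=0)       # yaw, roll, pitch by integrating rates
    return accelerations, jerk, displacements, angles
```

As noted above, the acceleration samples would normally be low-pass filtered before the jerk is taken, since differencing amplifies measurement noise.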
  • vehicle slip is defined as the difference between the velocity of the wheels and the velocity of the vehicle in the X (longitudinal) or Y (lateral) direction.
  • the longitudinal slip, furthermore, may be defined as follows:
  • where V is the velocity of the vehicle
  • the side slip can be determined by using the GPS data in combination with the vehicle dynamic information.
  • the velocity in the longitudinal direction is determined using the vehicle dynamic information, while the total velocity is determined by GPS.
  • the formula to determine the side slip angle β is of the form β = arccos(...).
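The slip formulas are not reproduced in this text, so the sketch below uses commonly assumed definitions: longitudinal slip as the normalized difference between the wheel speed and the vehicle speed, and side slip as the arccosine of the ratio of the longitudinal velocity to the total velocity from GPS, echoing the arccos form mentioned above. The function names and small-value guards are illustrative.

```python
import numpy as np

def longitudinal_slip(wheel_speed, vehicle_speed):
    """Common definition: slip = (V_wheel - V_vehicle) / max(V_wheel, V_vehicle)."""
    denom = max(abs(wheel_speed), abs(vehicle_speed), 1e-6)   # avoid division by zero
    return (wheel_speed - vehicle_speed) / denom

def side_slip_angle(longitudinal_velocity, total_velocity_gps):
    """Angle between the longitudinal velocity and the total velocity vector, in radians."""
    ratio = np.clip(longitudinal_velocity / max(total_velocity_gps, 1e-6), -1.0, 1.0)
    return np.arccos(ratio)
```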
  • step 120 proceeds to step 122 where data is acquired from N number of cameras.
  • step 122 then proceeds to step 124 where the image data from the camera is stored in an image buffer and then to step 126 where the 3D image is reconstructed from the image frame in the previously described fashion.
  • step 126 then proceeds to step 128.
  • at step 128, a multichannel histogram is computed for the images.
  • Step 128 then proceeds to step 130 where the histogram values are identified for the pixels corresponding to the road surface.
  • Step 130 then proceeds to step 132 where the road surface is classified as dry, wet, asphalt, snow, ice, etc. depending upon the histogram values.
  • step 132 proceeds to step 134 where the coefficient of friction of the road is obtained from a database depending upon the road condition.
  • Step 134 then proceeds to step 136 where the data is used to control the vehicle systems.
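For illustration, the sketch below shows one possible form of the histogram comparison at steps 128 through 132 and the friction lookup at step 134; the reference histograms, class labels, friction values, distance measure, and function name are assumptions standing in for the database the text refers to.

```python
import numpy as np

def classify_road_surface(road_pixels, reference_histograms, friction_table, n_bins=32):
    """road_pixels: gray values of pixels identified as road surface.
    reference_histograms: dict label -> (n_bins,) normalized reference histogram.
    friction_table: dict label -> assumed friction coefficient for that condition."""
    hist, _ = np.histogram(road_pixels, bins=n_bins, range=(0, 256), density=True)
    label = min(reference_histograms,                    # nearest reference histogram
                key=lambda k: np.linalg.norm(hist - reference_histograms[k]))
    return label, friction_table[label]
```

A call such as classify_road_surface(pixels, refs, {"dry": 0.9, "wet": 0.6, "snow": 0.3, "ice": 0.1}) would then return the estimated condition together with a friction coefficient for the downstream controllers; the numeric values here are placeholders.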
  • step 120 also proceeds through parallel processing to steps 142-146, which correspond to steps 122-126 previously described.
  • Step 146 then proceeds to step 148.
  • at step 148, pixels or points that are above a threshold limit above the road in the Z direction are rejected.
  • the rejected points would represent, for example, objects that are not on the surface of the road, but rather vehicles or other objects positioned above the road.
  • Step 148 also provides data to step 130 when determining the road surface.
  • Step 148 then proceeds to step 150.
  • the depth of the remaining pixels is then estimated to obtain the road profile in the Z or vertical direction. For example, a gravel road will have a different texture than a paved road.
  • step 150 proceeds to step 152 which applies a correction from the real time vehicle dynamic state in order to eliminate error in the output data.
  • Step 152 then proceeds to step 136.
  • step 120 through parallel processing also proceeds to steps 162-166.
  • Steps 162-166 correspond to steps 122-126 and result in the reconstruction of a 3D point map of the pixels with the unique neighborhoods.
  • Step 166 then proceeds to step 168.
  • Step 168 also receives data from step 144 representative of points in the map that are above a threshold limit in the Z direction.
  • Step 168 then crops the area of the map corresponding to the road surface after elimination of the rejected points from data from step 148 and then proceeds to step 170.
  • at step 170, the average spatial histogram of the cropped image is then computed based upon the depth of the image. Step 170 then proceeds to step 172, which isolates areas in the image having bumps or potholes. Step 172 then proceeds to step 174.
  • at step 174, bumps and potholes are classified by comparing the identified bump or pothole to the average deviation of histograms of bumps and potholes. Step 174 then proceeds to step 176, which approximates the height of the bumps or the depth of the potholes, and then proceeds to step 178. At step 178, the height of the bumps or depth of the potholes is also estimated using the front tire when the front tire hits the bump or pothole. That information is then processed by the ECU 28, which generates output signals to the suspension system for the rear wheel in order to compensate for the bump or pothole.
  • the 3D surface points are used to estimate the height distribution of the road surface.
  • the thresholds depend on the standard deviation of the estimated distribution. Areas or regions with possible bumps or potholes are then processed using morphological thinning algorithms. The image histogram is then computed and the average feature vector f_avg for pavement is estimated using the standard deviation of the road surface gray intensities from the image. The feature vectors of potential potholes or bumps are then determined by the standard deviation of gray intensity values for that region. These feature vectors define the texture of the regions. The feature vectors f_r of potential potholes/bumps are then compared with the average feature vector of the pavement, and potholes and cracks are detected when the comparison satisfies a threshold condition.
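As an illustration of the feature-vector comparison just described (the patent's exact threshold condition is not reproduced in this text), the sketch below takes the standard deviation of gray intensities as the texture feature and flags a candidate region whose feature deviates from the pavement average by more than a tunable relative threshold.

```python
import numpy as np

def pavement_feature(gray_road):
    """Average feature f_avg: standard deviation of gray intensities over the road surface."""
    return float(np.std(gray_road))

def is_pothole_or_bump(gray_region, f_avg, rel_threshold=0.5):
    """Flag a candidate region whose texture feature f_r differs from f_avg."""
    f_r = float(np.std(gray_region))   # feature of the candidate region
    return abs(f_r - f_avg) > rel_threshold * f_avg
```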
  • step 190 proceeds to step 192 where the image data is acquired by the N cameras and then to step 196 where the camera images are stored in a buffer. Step 196 then proceeds to step 221.
  • Step 221 performs the road surface estimation previously described with respect to FIG. 8, resulting in the use of the data to control the vehicle systems at step 136. That description is incorporated by reference.
  • Step 196 also proceeds to step 198 where the 3D point map is reconstructed in the previously described fashion and then to step 200 where both the rotation vectors and translation vectors are determined. Step 200 then proceeds to step 202.
  • at step 202, the vehicle acceleration, jerk, and displacement at the center of gravity are estimated; step 202 then proceeds to step 204.
  • Step 204 also receives data from step 195 representative of the GPS data as well as the wheel speed sensor data and then determines the vehicle lateral and longitudinal slip as previously described. Step 204 then proceeds to step 206.
  • at step 206, the weight transfer is computed for each wheel.
  • Step 206 then proceeds to step 208 where the displacement, acceleration, and velocities of each wheel are computed.
  • Step 208 then proceeds to step 210.
  • Step 210 applies a quarter car transformation for each wheel.
  • Step 210 receives data both from step 136 as well as from step 206.
  • Step 210 proceeds to step 212 where the required dynamic damper coefficient is calculated.
  • in FIG. 10, a simplified model of a vehicle suspension system is shown which is used in conjunction with the damper coefficient computed at step 212 to determine the actuation and control strategy for the suspension system.
  • the model 220 illustrated in FIG. 10 represents a fairly accurate mathematical model of an automotive vehicle suspension system.
  • the road displacement is estimated from the road surface detection technique using the computed 3D points, as shown in the above sections.
  • each wheel is individually analyzed. Dynamic weight transfers are calculated based on the roll and pitch of the vehicle as well as the lateral acceleration determined in the above sections while cornering.
  • c_min and c_max are the minimum and maximum damping values and α is a tunable parameter.
  • the damping coefficient for each damper of the vehicle is estimated using the above equations. Anti-roll and anti-pitch control can easily be realized since independent suspension control is possible.
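The damping equations themselves are not reproduced in this text. As a stand-in illustration, the sketch below uses a common semi-active, skyhook-style law that yields a per-wheel damping coefficient bounded by c_min and c_max with a tunable gain alpha; the control law actually used may differ.

```python
def required_damping(body_velocity, wheel_velocity, c_min, c_max, alpha):
    """Return a damping coefficient in [c_min, c_max] for one quarter-car corner.
    body_velocity, wheel_velocity: vertical velocities of the sprung and unsprung mass."""
    relative_velocity = body_velocity - wheel_velocity
    if abs(relative_velocity) < 1e-6 or body_velocity * relative_velocity <= 0.0:
        return c_min                                 # damping would not help: softest setting
    c = alpha * body_velocity / relative_velocity    # skyhook-equivalent coefficient
    return min(max(c, c_min), c_max)                 # clip to the damper's physical range
```

Because each corner is evaluated independently, stiffening the outer wheels while cornering or the front wheels while pitching gives the anti-roll and anti-pitch behavior mentioned above.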
  • once the required damping coefficient is determined for each wheel at step 212 (FIG. 9), it is used to compute the required actuation input, such as the current input in the case of a current-controlled damper (CCD), magneto-rheological (MR) damper, or electromagnetic damper (EMD), or the voltage input in the case of an electro-rheological (ER) damper or other types of dampers.
  • the damping system changes the damping coefficient for each damper of each wheel of the vehicle in order to reduce at least one of yaw, roll, pitch, or heave, counteracting the yaw, roll, pitch, or heave rate of the vehicle calculated based on the motion of the characteristic 3D points in the three-dimensional point map.
  • step 212 proceeds to step 222 and generates output signals based upon the actuation strategy of the actuators in the vehicle suspension system.
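Purely as an illustration of turning the required damping coefficient into an actuator command, the sketch below inverts an assumed monotonic damping-versus-current characteristic by interpolation; the map values in the usage comment are placeholders, not data from the patent, and a voltage-driven ER damper would use the same idea with a voltage map.

```python
import numpy as np

def damper_current(c_required, current_grid, damping_at_current):
    """Invert a monotonic damping-vs-current characteristic by linear interpolation.
    current_grid: (K,) candidate coil currents [A]; damping_at_current: (K,) measured
    damping coefficients [N*s/m] at those currents (must be increasing)."""
    return float(np.interp(c_required, damping_at_current, current_grid))

# Example with an assumed characteristic (placeholder values):
# currents = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
# damping  = np.array([800.0, 1500.0, 2300.0, 3200.0, 4000.0])
# i_cmd = damper_current(2500.0, currents, damping)
```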
  • the present invention provides both a method and apparatus for the real time control of the vehicle systems, especially the suspension system, based upon the dynamic state of the vehicle.
  • the above mentioned systems may be able to utilize acceleration, inertia, gyro, wheel speed, steering angle, slip sensors, and the like to estimate the vehicle's dynamic state in real time.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Vehicle Body Suspensions (AREA)

Abstract

A vehicle control method and system periodically acquire a plurality of images from a plurality of cameras. Pixels that have a unique characteristic relative to the surrounding pixels in the image are identified in each image. A three-dimensional point map is then reconstructed with the characteristic three-dimensional points as a function of the unique pixels in the image. The motion of the characteristic three-dimensional points from a first frame to a second frame at a subsequent time is determined such that at least one of the yaw, pitch, roll, or heave rate of the vehicle is calculated as a function of the motion of the characteristic three-dimensional points in the three-dimensional point map.
PCT/IB2014/001868 2014-08-07 2014-08-07 Method and apparatus for determining the dynamic state of a vehicle WO2016020718A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/IB2014/001868 WO2016020718A1 (fr) 2014-08-07 2014-08-07 Method and apparatus for determining the dynamic state of a vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/IB2014/001868 WO2016020718A1 (fr) 2014-08-07 2014-08-07 Method and apparatus for determining the dynamic state of a vehicle

Publications (1)

Publication Number Publication Date
WO2016020718A1 true WO2016020718A1 (fr) 2016-02-11

Family

ID=55263216

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2014/001868 WO2016020718A1 (fr) 2014-08-07 2014-08-07 Method and apparatus for determining the dynamic state of a vehicle

Country Status (1)

Country Link
WO (1) WO2016020718A1 (fr)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9744967B2 (en) 2015-11-06 2017-08-29 Mazda Motor Corporation Vehicle behavior control device
US9889846B2 (en) 2015-11-06 2018-02-13 Mazda Motor Corporation Vehicle behavior control device
WO2018195150A1 (fr) * 2017-04-18 2018-10-25 nuTonomy Inc. Perception automatique de feux de circulation
US10220837B2 (en) 2016-06-30 2019-03-05 Mazda Motor Corporation Vehicle behavior control device
US10253711B2 (en) 2015-12-22 2019-04-09 Mazda Motor Corporation Vehicle behavior control device
US10266173B2 (en) 2015-11-06 2019-04-23 Mazda Motor Corporation Vehicle behavior control device
US10569765B2 (en) 2015-11-06 2020-02-25 Mazda Motor Corporation Vehicle behavior control device
US10643084B2 (en) 2017-04-18 2020-05-05 nuTonomy Inc. Automatically perceiving travel signals
US10650256B2 (en) 2017-04-18 2020-05-12 nuTonomy Inc. Automatically perceiving travel signals
US10960886B2 (en) 2019-01-29 2021-03-30 Motional Ad Llc Traffic light estimation


Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH03282710A (ja) * 1990-03-30 1991-12-12 Mazda Motor Corp 移動車の環境認識装置
JPH06183237A (ja) * 1991-11-01 1994-07-05 Unisia Jecs Corp 車両懸架装置
JPH08258588A (ja) * 1995-03-27 1996-10-08 Mazda Motor Corp 車両における路面状態検出装置
JP2007225408A (ja) * 2006-02-23 2007-09-06 Vios System:Kk 移動体の横滑り計測装置
JP2009017462A (ja) * 2007-07-09 2009-01-22 Sanyo Electric Co Ltd 運転支援システム及び車両
JP2010086267A (ja) * 2008-09-30 2010-04-15 Mazda Motor Corp 車両用画像処理装置
JP2012168788A (ja) * 2011-02-15 2012-09-06 Toyota Central R&D Labs Inc 運動推定装置及びプログラム
JP2013156769A (ja) * 2012-01-27 2013-08-15 Toyota Central R&D Labs Inc 運動推定装置及びプログラム
JP2013164643A (ja) * 2012-02-09 2013-08-22 Honda Elesys Co Ltd 画像認識装置、画像認識方法および画像認識プログラム

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10266173B2 (en) 2015-11-06 2019-04-23 Mazda Motor Corporation Vehicle behavior control device
US9889846B2 (en) 2015-11-06 2018-02-13 Mazda Motor Corporation Vehicle behavior control device
US10569765B2 (en) 2015-11-06 2020-02-25 Mazda Motor Corporation Vehicle behavior control device
US9744967B2 (en) 2015-11-06 2017-08-29 Mazda Motor Corporation Vehicle behavior control device
US10253711B2 (en) 2015-12-22 2019-04-09 Mazda Motor Corporation Vehicle behavior control device
US10220837B2 (en) 2016-06-30 2019-03-05 Mazda Motor Corporation Vehicle behavior control device
WO2018195150A1 (fr) * 2017-04-18 2018-10-25 nuTonomy Inc. Perception automatique de feux de circulation
CN111094095A (zh) * 2017-04-18 2020-05-01 优特诺股份有限公司 自动地接收行驶信号
US10643084B2 (en) 2017-04-18 2020-05-05 nuTonomy Inc. Automatically perceiving travel signals
US10650256B2 (en) 2017-04-18 2020-05-12 nuTonomy Inc. Automatically perceiving travel signals
CN111094095B (zh) * 2017-04-18 2021-09-07 动态Ad有限责任公司 自动地感知行驶信号的方法、装置及运载工具
US11182628B2 (en) 2017-04-18 2021-11-23 Motional Ad Llc Automatically perceiving travel signals
US11727799B2 (en) 2017-04-18 2023-08-15 Motional Ad Llc Automatically perceiving travel signals
US10960886B2 (en) 2019-01-29 2021-03-30 Motional Ad Llc Traffic light estimation
US11529955B2 (en) 2019-01-29 2022-12-20 Motional Ad Llc Traffic light estimation

Similar Documents

Publication Publication Date Title
WO2016020718A1 (fr) Procédé et appareil servant à déterminer l'état dynamique d'un véhicule
CN112906449B (zh) 基于稠密视差图的路面坑洼检测方法、系统和设备
JP6785620B2 (ja) ステレオカメラセンサを用いた車両用の予測的サスペンション制御
CN112698302B (zh) 一种颠簸路况下的传感器融合目标检测方法
US20200016952A1 (en) Suspension system using optically recorded information, vehicles including suspension systems, and methods of using suspension systems
CN104220317B (zh) 路面状态推定装置
CN113361121B (zh) 一种基于时空同步与信息融合的路面附着系数估计方法
JP5926228B2 (ja) 自律車両用の奥行き検知方法及びシステム
US9912933B2 (en) Road surface detection device and road surface detection system
US9361696B2 (en) Method of determining a ground plane on the basis of a depth image
CN107750364A (zh) 使用稳定的坐标系的道路垂直轮廓检测
CN104204726A (zh) 移动物体位置姿态估计装置和移动物体位置姿态估计方法
CN106503636B (zh) 一种基于视觉图像的道路视距检测方法及装置
WO2020158219A1 (fr) Système de détection de nids-de-poule
JP6574611B2 (ja) 立体画像に基づいて距離情報を求めるためのセンサシステム
US7623700B2 (en) Stereoscopic image processing apparatus and the method of processing stereoscopic images
CN105593776B (zh) 车辆位置姿势角推定装置及车辆位置姿势角推定方法
JP2007022117A (ja) 車両安定化制御システム
KR102566583B1 (ko) 서라운드 뷰 이미지의 안정화를 위한 방법 및 장치
WO2007017693A1 (fr) Procédé et dispositif de détermination de déplacement d’un véhicule
JP5310027B2 (ja) 車線認識装置、及び車線認識方法
US20210012119A1 (en) Methods and apparatus for acquisition and tracking, object classification and terrain inference
EP3649571A1 (fr) Système et procédé avancés d'aide à la conduite
KR102039801B1 (ko) 스테레오 카메라 기반 도로 구배 예측방법 및 시스템
EP3389015A1 (fr) Procédé et dispositif d'étalonnage d'angle de roulis

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14899367

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14899367

Country of ref document: EP

Kind code of ref document: A1