WO2012172713A1 - Device for determining the profile of a road, in-vehicle image recognition device, imaging-axis adjustment device, and lane recognition method


Publication number
WO2012172713A1
WO2012172713A1 (PCT/JP2012/001576)
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
unit
imaging
lane
road
Prior art date
Application number
PCT/JP2012/001576
Other languages
English (en)
Japanese (ja)
Inventor
高浜 琢
文紀 武田
Original Assignee
日産自動車株式会社
Priority date
Filing date
Publication date
Application filed by 日産自動車株式会社
Priority to US14/125,832 (published as US20140118552A1)
Priority to EP12801148.3A (published as EP2720213A4)
Priority to JP2013520407A (published as JP5733395B2)
Priority to CN201280026555.2A (published as CN103582907B)
Publication of WO2012172713A1


Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems
    • G08G1/167 Driving aids for lane monitoring, lane changing, e.g. blind spot detection
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit, e.g. by using mathematical models
    • B60W40/02 Estimation or calculation of non-directly measurable driving parameters related to ambient conditions
    • B60W40/06 Road conditions
    • B60W40/072 Curvature of the road
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G06T7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/588 Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W2050/0062 Adapting control system settings
    • B60W2050/0075 Automatic parameter input, automatic initialising or calibrating means
    • B60W2050/0083 Setting, resetting, calibration
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/02 Ensuring safety in case of control system failures, e.g. by diagnosing, circumventing or fixing failures
    • B60W50/0205 Diagnosing or detecting failures; Failure detection models
    • B60W2050/0215 Sensor drifts or sensor failures
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00 Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/40 Photo, light or radio wave sensitive means, e.g. infrared sensors
    • B60W2420/403 Image sensing, e.g. optical camera
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2520/00 Input parameters relating to overall vehicle dynamics
    • B60W2520/10 Longitudinal speed
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2540/00 Input parameters relating to occupants
    • B60W2540/18 Steering angle
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00 Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/10 Path keeping
    • B60W30/12 Lane keeping
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10016 Video; Image sequence
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30248 Vehicle exterior or interior
    • G06T2207/30252 Vehicle exterior; Vicinity of vehicle
    • G06T2207/30256 Lane; Road marking

Definitions

  • The present invention relates to a technique for recognizing, with a camera mounted on a vehicle, the shape of the lane in which the vehicle travels.
  • In a known approach, the left and right lane markers of the traveling lane are recognized in the image captured by the camera. The intersection of their extension lines is then computed, and these intersections are accumulated and averaged to obtain the camera mounting-angle error.
  • The present invention has been made in view of the above problems, and its object is to determine, with a smaller calculation load, whether the road ahead is straight so that the imaging angle of an imaging unit provided in the vehicle can be corrected.
  • The periphery of the vehicle is imaged by an imaging unit provided in the vehicle, and the shape of the lane in which the vehicle travels is recognized from the captured image.
  • Based on the recognized lane shape in a near region relatively close to the host vehicle and in a far region distant from it, the road is determined to be a straight road when the deviation between the intersection of the straight-line extensions of the left and right lane markers located in the near region and the intersection of the straight-line extensions of the left and right lane markers located in the far region is equal to or less than a preset threshold.
  • FIG. 1 is a diagram showing an example of a vehicle equipped with the in-vehicle image recognition device according to the first embodiment of the present invention.
  • FIG. 2 is a functional block diagram showing an example of the configuration of the in-vehicle image recognition device according to the first embodiment.
  • FIG. 3 is a functional block diagram illustrating an example of the configuration of the lane shape recognition unit 102.
  • FIG. 4 is a schematic diagram showing the concept of the processing in the lane shape recognition unit. FIG. 5 is a schematic diagram showing the concept of performing lane recognition separately in a near region and a far region.
  • FIG. 6 is a flowchart showing an example of the processing in the in-vehicle image recognition device according to the first embodiment.
  • FIG. 1 is a diagram illustrating an example of a vehicle equipped with the in-vehicle image recognition device according to the present embodiment.
  • The in-vehicle image recognition apparatus according to the present embodiment recognizes the lane in which the vehicle travels from an image captured by an in-vehicle camera.
  • The vehicle 1 includes a camera 10 incorporating an image processing device 10a, a vehicle speed detection device 20, a steering angle detection device 30, a steering angle control device 40, and a steering angle actuator 50.
  • The camera 10 captures images in front of the vehicle 1.
  • The camera 10 is a digital camera with an image sensor such as a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor; more specifically, it is a progressive-scan 3CMOS camera that captures images at high speed.
  • The camera 10 is installed, for example, at the front center of the ceiling of the vehicle 1 so as to image the area ahead of the vehicle, and captures the traveling road ahead through the windshield.
  • Any other installation is possible as long as the camera captures the traveling path of the host vehicle 1.
  • For example, the camera 10 can be attached to the rear of the vehicle 1 like a back-view camera, or to the front end of the vehicle 1 such as the bumper; the vanishing point need not even fall within the camera's field of view.
  • In that case, a virtual vanishing point can be calculated by detecting the lane-marker edges and computing their approximate straight lines.
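As a minimal sketch of this step (the function name and the line parameterization x = m·y + b in image coordinates are assumptions for illustration, not taken from the patent), the virtual vanishing point is simply the intersection of the two approximate straight lines fitted to the left and right lane-marker edges:

```python
def vanishing_point(m_left, b_left, m_right, b_right):
    """Intersection of the left/right lane-marker lines, each modeled as
    x = m*y + b in image coordinates (x rightward, y downward).

    Returns (x, y) of the virtual vanishing point, or None when the two
    lines are parallel and no finite intersection exists.
    """
    if m_left == m_right:
        return None  # parallel lines: no finite intersection
    y = (b_right - b_left) / (m_left - m_right)
    x = m_left * y + b_left
    return (x, y)
```

For two lines converging symmetrically, e.g. `vanishing_point(1.0, 0.0, -1.0, 200.0)`, the result is the midpoint where both extensions meet.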
  • The image processing apparatus 10a executes the lane recognition processing according to the present embodiment; that is, the camera 10 incorporating the image processing apparatus 10a in FIG. 1 corresponds to the in-vehicle image recognition apparatus of the present embodiment. Information output from the image processing device 10a, the vehicle speed detection device 20, and the steering angle detection device 30 is input to the steering angle control device 40, which in turn outputs a signal for realizing the target steering to the steering angle actuator 50.
  • The camera 10 and the steering angle control device 40 each include a microcomputer, its peripheral components, and drive circuits for the various actuators, and exchange information with each other via a communication circuit.
  • The lane recognition processing according to the present embodiment is realized by the hardware configuration described above.
  • The camera 10 incorporating the image processing apparatus 10a functionally includes an imaging unit 101, a lane shape recognition unit 102, a vehicle behavior recognition unit 103, an imaging angle derivation unit 104, an information bias determination unit 105, and an imaging angle correction unit 106.
  • The imaging unit 101 images the periphery of the vehicle 1.
  • The lane shape recognition unit 102 recognizes the lane shape of the traveling lane of the vehicle 1 from the image captured by the imaging unit 101.
  • As a traveling-lane detection method, a known method such as that described in JP-A-2004-252827 may be employed.
  • As a method for calculating the shape of the traveling lane and the position and posture of the vehicle, a known method such as that described in JP-A-2004-318618 may be employed.
  • Using the lane shape recognized in this way, the lane shape recognition unit 102 obtains the intersection coordinates of the extension lines of the pair of left and right lane markers in the far region and in the near region. To this end, the lane shape recognition unit 102 includes an arithmetic unit that analyzes the captured image of the imaging unit 101 and calculates the yaw angle C of the vehicle 1, the pitch angle D of the vehicle 1, the height H of the imaging unit 101 above the road surface, the lateral displacement A from the lane center, and the curvature B of the traveling lane.
  • The lane shape recognition unit 102 outputs the calculated yaw angle C, lateral displacement A, and curvature B to the steering angle control device 40, which enables, for example, automatic steering of the vehicle 1.
  • FIG. 3 is a diagram illustrating a configuration example of the lane shape recognition unit 102.
  • FIG. 4 is a schematic diagram showing the concept of processing in the lane shape recognition unit 102.
  • The lane shape recognition unit 102 includes a white line candidate point detection unit 200, a lane recognition processing unit 300, and an optical axis correction unit 400.
  • The white line candidate point detection unit 200 detects white line candidate points for the lane markings based on the image data captured by the imaging unit 101.
  • Specifically, the white line candidate point detection unit 200 acquires an image of the travel path of the host vehicle 1 from the imaging unit 101 and detects the white line edges Ed by image processing.
  • The position of an image processing frame F is determined, based on the road parameters (road shape and vehicle posture with respect to the road) described later, for each of the lane markings (white lines) located on the left and right of the acquired image.
  • Within the set image processing frame F, first-order spatial differentiation, for example with a Sobel filter, is applied to emphasize the boundary between the white line and the road surface, and the white line edges Ed are then extracted.
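The edge-extraction step can be sketched as follows (a slow but explicit NumPy version for illustration; the function name, threshold value, and the use of only the horizontal Sobel kernel are assumptions, since lane markings produce mostly vertical boundaries):

```python
import numpy as np

# Horizontal Sobel kernel: responds to left/right intensity transitions,
# i.e. the boundary between road surface and a white lane marking.
SOBEL_X = np.array([[-1, 0, 1],
                    [-2, 0, 2],
                    [-1, 0, 1]], dtype=float)

def sobel_edges(gray, threshold):
    """First-order spatial differentiation of the image processing frame F.

    gray: 2-D array of pixel intensities.
    Returns a boolean mask of pixels whose horizontal gradient magnitude
    exceeds `threshold` -- the white line edge candidates Ed.
    """
    h, w = gray.shape
    grad = np.zeros((h, w), dtype=float)
    for r in range(1, h - 1):
        for c in range(1, w - 1):
            grad[r, c] = np.sum(SOBEL_X * gray[r - 1:r + 2, c - 1:c + 2])
    return np.abs(grad) > threshold
```

In practice the convolution would be done with an optimized library routine; the nested loops here only make the operation explicit.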
  • The lane recognition processing unit 300 includes a road shape calculation unit 310 that linearly approximates the road shape and a road parameter estimation unit 320 that estimates the road shape and the vehicle posture with respect to the road.
  • The road shape calculation unit 310 calculates an approximate straight line Rf of the road shape by using a Hough transform to extract a straight line that connects one point on the upper side and one point on the lower side of the detection region and passes through at least a preset number of pixels Pth whose white-line-edge intensity Ed is equal to or greater than a preset threshold Edth.
  • The captured road image data is divided into two areas, a far area and a near area, and the road shape is linearly approximated in each area (see FIG. 5).
  • The road parameter estimation unit 320 estimates the road parameters (road shape and vehicle attitude with respect to the road) as a road model equation, equation (1), from the approximate straight line Rf detected by the road shape calculation unit 310.
  • The parameters A, B, C, D, and H in equation (1) are the road parameters and vehicle state quantities estimated by the road parameter estimation unit 320.
  • They are, respectively, the lateral displacement (A) of the vehicle 1 with respect to the lane, the road curvature (B), the yaw angle (C) of the host vehicle 1 with respect to the lane, the pitch angle (D) of the vehicle 1, and the height (H) of the imaging unit above the road surface.
  • W is a constant indicating the lane width (the distance between the inner edges of the left and right white lines on the actual road).
  • f is the camera perspective-transformation constant.
  • j is a parameter distinguishing the left and right white lines: j = 0 for the left white line and j = 1 for the right white line.
  • (x, y) are the coordinates on the road image of an arbitrary point on the inner edge of the left or right white line, taking the upper left of the road image as the origin, with the rightward direction as the positive x-axis and the downward direction as the positive y-axis.
  • The optical axis correction unit 400 includes a straight road determination unit 410 that determines that the travel path of the host vehicle 1 is a straight road, a parallel travel determination unit 420 that determines that the host vehicle 1 is traveling parallel to the travel path, and a virtual vanishing point calculation unit 430 that calculates a virtual vanishing point from the approximate straight line Rf of the road shape.
  • The straight road determination unit 410 determines whether the travel path of the host vehicle 1 is a straight road by comparing the degree of coincidence between the slopes and intercept values of the approximate straight lines Rf of the far and near road shapes calculated by the road shape calculation unit 310.
  • The parallel travel determination unit 420 determines from the vehicle posture estimated by the road parameter estimation unit 320 that the host vehicle 1 is traveling parallel to the travel path. Specifically, from the difference between the current and past values of the lateral displacement A of the host vehicle 1 with respect to the lane, one of the estimated vehicle state quantities, it calculates the lateral speed (the differential of the lateral displacement A) with respect to the lane. When the calculated lateral speed is equal to or less than a preset threshold, the host vehicle 1 is determined to be traveling parallel to the travel path. When the travel path is a straight road, traveling parallel to it means that the host vehicle 1 is traveling straight.
  • The virtual vanishing point calculation unit 430 calculates the intersection of the left and right approximate straight lines Rf as the virtual vanishing point when the straight road determination unit 410 and the parallel travel determination unit 420 determine that the travel path is a straight road and that the host vehicle 1 is traveling parallel to it.
  • FIG. 5 is a schematic diagram showing the concept of performing lane recognition separately in a near region and a far region.
  • A curved road with a relatively large radius is shown as an example.
  • The lane shape recognition unit 102 divides the image data captured by the imaging unit 101 into a near region (lower part of the image) and a far region (center part of the image), and in each region the white line candidate point detection unit 200 and the lane recognition processing unit 300 detect the white line edges Ed and the approximate straight line Rf.
  • The straight road determination unit 410 then determines whether the traveling road is straight: when the slopes and intercepts of the approximate straight lines Rf in the near and far regions agree to within preset thresholds, the lines are regarded as matching and the travel path is determined to be a straight line.
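The matching test above can be sketched as a small predicate (function name and the default threshold values are illustrative assumptions; the patent only states that preset thresholds are used):

```python
def is_straight_road(near_line, far_line, slope_th=0.05, intercept_th=2.0):
    """Straight-road test of the straight road determination unit 410.

    near_line, far_line: (slope, intercept) of the approximate straight
    line Rf fitted in the near and far image regions.
    The road is judged straight when both the slopes and the intercepts
    agree to within the preset thresholds.
    """
    (m_near, b_near), (m_far, b_far) = near_line, far_line
    return abs(m_near - m_far) <= slope_th and abs(b_near - b_far) <= intercept_th
```

On a curve (FIG. 5), the far-region line bends away from the near-region line, so the slopes diverge and the test fails.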
  • The vehicle behavior recognition unit 103 recognizes the behavior of the vehicle 1 carrying the imaging unit 101. Specifically, it determines the behavior of the vehicle 1 from the vehicle speed (traveling speed) detected by the vehicle speed detection device 20, the steering angle detected by the steering angle detection device 30, the longitudinal and lateral accelerations detected by an acceleration sensor (not shown), the yaw rate detected by a yaw rate sensor, and the like.
  • The imaging angle deriving unit 104 obtains the imaging angle of the imaging unit 101 from the lane shape recognized by the lane shape recognition unit 102.
  • The information bias determination unit 105 determines whether there is a bias in the information from at least one of the lane shape recognition unit 102 and the vehicle behavior recognition unit 103.
  • The imaging angle correction unit 106 corrects the imaging angle of the imaging unit 101 using the imaging angle output by the imaging angle deriving unit 104 when the information bias determination unit 105 determines that the bias is equal to or less than a preset threshold.
  • In step S101, the lane shape recognition unit 102 reads the image in front of the vehicle 1 captured by the imaging unit 101.
  • In step S102, the vehicle behavior recognition unit 103 reads the vehicle speed of the vehicle 1 detected by the vehicle speed detection device 20, the steering angle detected by the steering angle detection device 30, and so on.
  • In step S103, the lane shape recognition unit 102 processes the captured image read in step S101 to recognize the lane in which the vehicle 1 is traveling and to calculate the position and posture of the vehicle 1 with respect to that lane.
  • In step S104, the lane shape recognition unit 102 uses the lane shape recognized in step S103 to obtain the intersection coordinates of the extension lines of the pair of left and right lane markers in the far and near regions. In this step, the intersection coordinates obtained for the near region are denoted (Pn_x, Pn_y) and those for the far region (Pf_x, Pf_y).
  • In step S105, the straight road determination unit 410 determines from the intersection coordinates of the far and near regions obtained in step S104 whether the road is a straight road, using the following expression. If it is satisfied, the process proceeds to step S106; otherwise, it proceeds to step S110.

    abs(Pn_x - Pf_x) < TH_Px   (0-1)

  • abs(A) is a function that returns the absolute value of A.
  • The value of TH_Px is a preset positive value such as 1.0. Satisfying expression (0-1) means that the intersection on the extensions of the left and right lane markers detected in the near region of the camera image and the intersection detected in the far region are close to each other; in other words, the lane markers run in the same direction from far to near.
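Condition (0-1) itself is a one-line check; a hedged sketch (function name assumed, threshold taken from the example value in the text):

```python
TH_PX = 1.0  # preset positive threshold, example value from the text

def intersections_agree(pn_x, pf_x, th_px=TH_PX):
    """Condition (0-1): the x-coordinates of the near-region and far-region
    lane-marker intersections deviate by less than TH_Px, meaning the lane
    markers point in the same direction from far to near."""
    return abs(pn_x - pf_x) < th_px
```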
  • In step S106, the parallel running determination unit 420 takes as input the lateral offset Y of the host vehicle with respect to its lane (the lateral distance to the lane marker) obtained in step S103, performs pseudo time differentiation with the following transfer function, and calculates the lateral speed Ydot of the host vehicle. If expression (0-4) is satisfied, the process proceeds to step S107; otherwise, it proceeds to step S110.

    G(Z^-1) = (c - c*Z^-2) / (1 - a*Z^-1 + b*Z^-2)   (0-2)
    Ydot = G(Z^-1) * Y   (0-3)
    abs(Ydot) < TH_Ydot   (0-4)

  • Z^-1 is the delay operator; the coefficients a, b, and c are all positive numbers, discretized at a sampling period of 50 ms so as to realize a preset frequency characteristic.
  • The value of TH_Ydot is a preset positive value such as 0.03, and may be increased according to the vehicle speed.
  • Satisfying expression (0-4) means that the vehicle has not moved laterally with respect to the lane marker; in other words, it is traveling along the lane marker without wobbling from side to side. When expressions (0-1) and (0-4) are satisfied simultaneously, the vehicle is traveling straight on a straight road.
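The filter of equation (0-2) can be sketched as a second-order difference equation (the coefficient values below are illustrative assumptions; the patent only states that a, b, c are positive and tuned for a desired frequency characteristic at the 50 ms sampling period):

```python
class PseudoDifferentiator:
    """Pseudo time differentiation of the lateral offset Y via

        G(Z^-1) = (c - c*Z^-2) / (1 - a*Z^-1 + b*Z^-2)   (0-2)

    implemented as the difference equation

        Ydot[k] = a*Ydot[k-1] - b*Ydot[k-2] + c*(Y[k] - Y[k-2]).
    """
    def __init__(self, a=0.5, b=0.06, c=1.0):  # illustrative coefficients
        self.a, self.b, self.c = a, b, c
        self.y1 = self.y2 = 0.0   # past inputs  Y[k-1], Y[k-2]
        self.d1 = self.d2 = 0.0   # past outputs Ydot[k-1], Ydot[k-2]

    def step(self, y):
        ydot = self.a * self.d1 - self.b * self.d2 + self.c * (y - self.y2)
        self.y2, self.y1 = self.y1, y
        self.d2, self.d1 = self.d1, ydot
        return ydot
```

Because the numerator c·(1 - Z^-2) vanishes for a constant input, a vehicle holding a fixed lateral offset yields Ydot that decays toward 0, which is exactly what condition (0-4) tests against TH_Ydot.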
  • In step S107, the straight road determination unit 410 determines whether the road curvature Row obtained in step S103 satisfies all of the following conditions. If it does, the process proceeds to step S108; otherwise, it proceeds to step S110.

    abs(Row) < TH_ROW   (1-1)
    abs(SumTotalRow + Row) < TH_ROW   (1-2)

  • abs(A) is a function that returns the absolute value of A.
  • SumTotalRow is the running total of the road curvature Row.
  • TH_ROW is the threshold below which the traveling lane is regarded as a straight road; it is a preset positive value such as 0.0003.
  • The information bias determination unit 105 determines that the traveling lane is a straight road when both the absolute value of Row and the absolute value of SumTotalRow + Row are less than TH_ROW.
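Conditions (1-1) and (1-2) together can be sketched as follows (function name assumed; the threshold is the example value 0.0003 from the text). The second condition keeps many small same-sign curvatures from accumulating unnoticed:

```python
TH_ROW = 0.0003  # example threshold from the text

def curvature_is_straight(row, sum_total_row, th_row=TH_ROW):
    """Conditions (1-1) and (1-2): the instantaneous road curvature Row AND
    the running total SumTotalRow + Row must both stay below TH_ROW."""
    return abs(row) < th_row and abs(sum_total_row + row) < th_row
```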
  • The imaging angle of the imaging unit 101 is corrected in step S108 and subsequent steps.
  • The imaging angle is corrected only in scenes corresponding to a straight road because, on a straight road, the point at infinity of the road coincides with the center coordinates on the image when image processing is performed. That is, obtaining the center coordinates from the intersection of the pair of left and right markers recognized on a straight road is generally more accurate than correcting them from an estimated curvature value on a curved road.
  • In step S108, the running total SumTotalRow of the road curvature is updated by:

    SumTotalRow = SumTotalRow + Row   (1-3)

  • The imaging angle correction unit 106 then corrects the imaging angle of the imaging unit 101 by the following equations and determines the corrected imaging angle:

    FOE_X_est = 0.9 * FOE_X_est + 0.1 * Pn_x   (1-4)
    FOE_Y_est = 0.9 * FOE_Y_est + 0.1 * Pn_y   (1-5)

  • FOE_X_est and FOE_Y_est are the coordinates on the captured image ahead of the vehicle 1 that correspond to the imaging angle of the imaging unit 101. Their initial values are the camera mounting error (also called the camera imaging-angle error) measured against a fixed target during initial adjustment at the factory or the like (calibration of the mounting error, known as factory aiming).
  • The coordinates calculated by equations (1-4) and (1-5) are used as the origin coordinates in the subsequent lane recognition processing of step S103.
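Equations (1-4) and (1-5) are a first-order low-pass (exponential smoothing) update of the estimated focus of expansion; a minimal sketch (function name assumed, gain 0.1 taken from the equations):

```python
ALPHA = 0.1  # fixed smoothing gain from equations (1-4) and (1-5)

def update_foe(foe_est, pn, alpha=ALPHA):
    """Update (FOE_X_est, FOE_Y_est) toward the newly measured near-region
    intersection (Pn_x, Pn_y):  new = 0.9*old + 0.1*measured.

    The small gain means any single noisy measurement moves the estimate
    only slightly, so the estimate converges gradually over many straight-
    road frames."""
    fx, fy = foe_est
    px, py = pn
    return ((1.0 - alpha) * fx + alpha * px,
            (1.0 - alpha) * fy + alpha * py)
```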
  • "Aiming" refers to optical-axis adjustment.
  • In step S110, the past values used in the filter and the counter values used in the timer are updated, and the process ends. Note that the value of SumTotalRow is initialized to 0 before the processing flow of FIG. 6 is executed.
  • The in-vehicle image recognition apparatus described above recognizes the lane shape of the traveling lane of the vehicle 1 from the captured image of the imaging unit 101, which images the road around the vehicle 1, and obtains the imaging angle of the imaging unit 101 from the recognized lane shape. It then determines whether the recognized information is biased and, when it determines that there is no bias, corrects the imaging angle of the imaging unit 101 using the derived imaging angle.
  • As a result, the imaging-angle error of the imaging unit can be detected and corrected accurately with less processing load.
  • In the present embodiment, no standard deviation calculation unit is provided.
  • Instead, only results obtained in the specific state in which the vehicle travels straight on a straight road without bias are input, and the correction may be made from these directly. In this case the input values tend to follow a normal distribution, so the correction accuracy is also high.
  • The in-vehicle image recognition apparatus of the present embodiment is provided in the vehicle 1.
  • The imaging unit 101 images the periphery of the vehicle 1.
  • The lane shape recognition unit 102 recognizes the lane shape of the traveling lane of the vehicle 1 from the captured image of the imaging unit 101.
  • The imaging angle deriving unit 104 obtains the imaging angle of the imaging unit 101 from the lane shape recognized by the lane shape recognition unit 102.
  • Based on the lane shape in a near region relatively close to the host vehicle and the lane shape in a far region distant from it, among the lane shapes recognized by the lane shape recognition unit, the straight road determination unit 410 determines the deviation between the intersection of the straight-line extensions of the left and right lane markers located in the near region and the intersection of the straight-line extensions of the left and right lane markers located in the far region.
  • The imaging angle correction unit 106 corrects the imaging angle of the imaging unit 101 using the imaging angle obtained by the imaging angle deriving unit 104 when the straight road determination unit 410 determines that this deviation is within the threshold.
  • As a result, the imaging angle of the imaging unit can be estimated with a smaller calculation load even when the encountered road shapes are biased, for example when driving only one way on a highway or on a road with many right curves.
  • In addition, using the deviation of the intersections allows a straight road to be determined with higher accuracy.
  • The straight road determination unit 410 determines that the bias is within the threshold when the absolute value of the value indicating the lane shape recognized by the lane shape recognition unit 102 is smaller than a predetermined threshold and the sum of those values is also smaller than the threshold. Further, the information bias determination unit 105 determines the bias using the road curvature recognized by the lane shape recognition unit 102. This makes it possible to estimate the imaging angle of the imaging unit with a smaller calculation load even when the encountered road shapes are biased.
  • the vehicle speed detection device 20 detects the vehicle speed of the vehicle 1.
  • the steering angle detection device 30 detects the steering angle of the vehicle 1.
  • the vehicle behavior recognition unit 103 recognizes the behavior of the vehicle 1 from the vehicle speed detected by the vehicle speed detection device 20 and the steering angle detected by the steering angle detection device 30. If it is determined that the deviation from the behavior of the vehicle 1 recognized by the vehicle behavior recognition unit 103 is within the threshold, the imaging angle of the imaging unit 101 is corrected. As a result, the imaging angle of the imaging unit can be estimated with a smaller calculation load even when the road shape is biased, such as when driving only one way on a highway or on a road with many right curves.
  • the lane shape recognition unit 102 detects a parameter related to road curvature.
  • the information bias determination unit 105 determines that the parameter value related to the road curvature has converged within a predetermined range.
  • the information bias determination unit 105 integrates parameter values within a preset time from the time of determining the convergence.
  • the information bias determination unit 105 determines the straight running state of the vehicle by determining that the integrated value is within a predetermined value. Then, the image recognition apparatus performs an aiming process when the information bias determination unit 105 determines that the vehicle is in a straight traveling state.
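The convergence-then-integrate sequence in the bullets above can be sketched as a small state machine; the class name, window length, and numeric bands below are illustrative assumptions:

```python
class StraightStateDetector:
    """Sketch of the information bias determination described above: wait for
    the curvature parameter to converge into a band, integrate it over a fixed
    window, and declare a straight-running state only when the integrated
    value stays within a small bound."""

    def __init__(self, conv_range=0.0005, window=50, integral_limit=0.005):
        self.conv_range = conv_range        # convergence band for curvature [1/m]
        self.window = window                # samples integrated after convergence
        self.integral_limit = integral_limit
        self.samples = []

    def update(self, curvature):
        """Feed one curvature estimate; True once straight state is confirmed."""
        if abs(curvature) > self.conv_range:
            self.samples.clear()            # left the band: convergence lost
            return False
        self.samples.append(curvature)
        self.samples = self.samples[-self.window:]
        if len(self.samples) < self.window:
            return False                    # still integrating
        return abs(sum(self.samples)) < self.integral_limit
```

Only when `update` returns True would the aiming process be allowed to run.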
  • thereby, the imaging angle of the imaging unit can be estimated with a smaller calculation load even when the road shape is biased, such as when driving only one way on a highway or on a road with many right curves.
  • FIG. 7 is a diagram illustrating an example of the configuration of the in-vehicle image recognition apparatus according to the present embodiment.
  • the in-vehicle image recognition apparatus according to the present embodiment includes an imaging unit 101, a lane shape recognition unit 102, a vehicle behavior recognition unit 103, an imaging angle derivation unit 104, an information bias determination unit 105, an imaging angle correction unit 106, and a standard deviation calculation unit 107.
  • the standard deviation calculating unit 107 calculates the standard deviation of the imaging angle obtained by the imaging angle deriving unit 104 when the information bias determining unit 105 determines that there is no bias.
  • the imaging angle correction unit 106 corrects the imaging angle of the imaging unit 101 according to the standard deviation calculated by the standard deviation calculation unit 107.
  • in step S201, the lane shape recognition unit 102 reads an image in front of the vehicle 1 captured by the imaging unit 101.
  • in step S202, the vehicle behavior recognition unit 103 reads the vehicle speed of the vehicle 1 detected by the vehicle speed detection device 20, the steering angle detected by the steering angle detection device 30, the longitudinal acceleration detected by the acceleration sensor, and the yaw rate value detected by the yaw rate sensor.
  • in step S203, the lane shape recognition unit 102 processes the captured image of the imaging unit 101 read in step S201 to recognize the traveling lane in which the vehicle 1 is traveling, and calculates the position, vehicle posture, and the like of the vehicle 1 with respect to the traveling lane.
  • in step S204, the lane shape recognition unit 102 uses the lane shape recognized in step S203 to obtain the intersection coordinates from the extension lines of a pair of left and right lane markers in the far and near regions. In this step, the intersection coordinates of the neighboring area obtained by the lane shape recognition unit 102 are defined as (Pn_x, Pn_y), and the intersection coordinates of the far area are defined as (Pf_x, Pf_y).
  • in step S205, the straight road determination unit 410 determines whether or not the road is a straight road from the intersection coordinates of the far and near areas obtained in step S204, using the following equations. If both equations are satisfied, the process proceeds to step S206; otherwise, the process proceeds to step S213. This process is the same as step S105 of the first embodiment.
  • abs(Pn_x - Pf_x) < TH_PX (2-1)
  • abs(Pn_y - Pf_y) < TH_PY (2-2)
  • TH_PX is a threshold value for the difference in intersection coordinates between the distant region and the nearby region in the horizontal direction of the imaging screen.
  • TH_PY is a threshold value for a difference in intersection coordinates between a far area and a near area in the vertical direction of the imaging screen.
  • in step S206, the parallel running determination unit 420 receives as input the lateral offset position (the lateral distance to the lane marker) Y of the host vehicle with respect to the host lane obtained in step S203, and pseudo-differentiates it with the transfer function of the above equations (0-2) and (0-3) to calculate the lateral speed Ydot of the host vehicle. If the above equation (0-4) is satisfied, the process proceeds to step S207; otherwise, the process proceeds to step S213.
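Equations (0-2) and (0-3) themselves are not reproduced in this excerpt. A common discrete realization of a pseudo-differentiator s/(tau*s + 1), which would yield the lateral speed Ydot from the lateral offset Y, is the filtered backward difference below; the class name, tau, and dt are illustrative assumptions:

```python
class PseudoDifferentiator:
    """Backward-Euler discretization of G(s) = s / (tau*s + 1):
    ydot[k] = (tau/(tau+dt)) * ydot[k-1] + (1/(tau+dt)) * (y[k] - y[k-1])."""

    def __init__(self, tau=0.2, dt=0.02):
        self.a = tau / (tau + dt)   # pole of the discretized filter
        self.b = 1.0 / (tau + dt)   # gain on the input difference
        self.prev_y = None
        self.ydot = 0.0

    def update(self, y):
        """Return the filtered derivative of the lateral offset y [m] -> [m/s]."""
        if self.prev_y is None:     # first sample: no difference available yet
            self.prev_y = y
            return 0.0
        self.ydot = self.a * self.ydot + self.b * (y - self.prev_y)
        self.prev_y = y
        return self.ydot
```

Fed a steady lateral ramp, the output settles at the true lateral speed while attenuating measurement noise, which is the point of pseudo-differentiation over a raw difference.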
  • in step S207, the information bias determination unit 105 determines whether or not all the conditions of the following equations are satisfied. If all of them are satisfied, the process proceeds to step S208; otherwise, the process proceeds to step S213.
  • YawRate is a yaw rate value representing the speed of the vehicle 1 in the rotational direction.
  • SumTotalYR is the total value of YawRate.
  • TH_YR is a threshold for regarding the vehicle 1 as traveling straight; if the absolute value of YawRate and the absolute value of the total SumTotalYR are both less than TH_YR, the vehicle 1 is regarded as traveling straight (equations (2-5) and (2-6)).
  • VspDot is the acceleration in the longitudinal direction of the vehicle 1.
  • TH_VD is a threshold value when the vehicle 1 is considered to be traveling at a constant speed. When the absolute value of VspDot is less than TH_VD, the vehicle 1 is regarded as traveling at a constant speed. SumTotalVD is the total value of VspDot.
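Taken together, the yaw-rate and acceleration conditions above gate which frames may feed the aiming process. A compact check in the style of equations (2-5) to (2-8); the threshold values are illustrative assumptions:

```python
TH_YR = 0.01   # yaw-rate threshold for "traveling straight" [rad/s]
TH_VD = 0.05   # longitudinal-acceleration threshold for "constant speed" [m/s^2]

def is_steady_straight(yaw_rate, sum_total_yr, vsp_dot, sum_total_vd):
    """True when the instantaneous values and their running totals all stay
    below the thresholds, i.e. the vehicle is treated as traveling straight
    at a constant speed."""
    return (abs(yaw_rate) < TH_YR and abs(sum_total_yr + yaw_rate) < TH_YR and
            abs(vsp_dot) < TH_VD and abs(sum_total_vd + vsp_dot) < TH_VD)
```

Checking the running totals as well as the instantaneous values rejects slow drifts (for example a long, gentle curve) that a per-frame test alone would pass.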
  • in step S208, the imaging angle of the imaging unit 101 is corrected.
  • the reason why the imaging angle of the imaging unit 101 is corrected in a scene where the vehicle 1 travels straight on a straight road and there is no bias between the travel path and the travel behavior is as follows.
  • a time delay in hardware, such as the capture of an image by the imaging unit 101, and a time delay in software, such as image processing, always occur. Even so, in such a scene the intersection coordinates corresponding to the imaging angle of the imaging unit 101 are hardly influenced by disturbance due to the behavior of the vehicle 1 and are calculated with high accuracy. Further, even when the difference in encounter frequency between right curves and left curves is large, the camera mounting angle error can be obtained correctly.
  • in step S208, SumTotalPx, SumTotalPy, SumTotalYR, SumTotalVD, and SumCount are updated, and the coordinate data of the neighboring intersections are stored in the collection memory according to the following equations.
  • SumTotalPx = SumTotalPx + Pn_x - Pf_x (2-9)
  • SumTotalPy = SumTotalPy + Pn_y - Pf_y (2-10)
  • SumTotalYR = SumTotalYR + YawRate (2-11)
  • SumTotalVD = SumTotalVD + VspDot (2-12)
  • SumCount = SumCount + 1 (2-15)
  • FOE_X_DataRcd[] is a parameter for storing the horizontal coordinate on the captured image in front of the traveling direction of the vehicle 1, and FOE_Y_DataRcd[] is a parameter for storing the vertical coordinate on the captured image in front of the traveling direction of the vehicle 1.
  • These parameters are stored in a RAM memory or the like (not shown).
  • SumCount is a counter that counts the number of collected coordinate data of neighboring intersections, and the initial value is “0”. Note that SumCount is initialized before the processing flow of FIG. 8 is executed.
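One collection step of the update equations above might look like the following; a plain dict stands in for the RAM-resident variables, and the names mirror the text:

```python
def new_state():
    """Initial values of the collection variables (SumCount starts at 0)."""
    return {"SumTotalPx": 0.0, "SumTotalPy": 0.0, "SumTotalYR": 0.0,
            "SumTotalVD": 0.0, "FOE_X_DataRcd": [], "FOE_Y_DataRcd": [],
            "SumCount": 0}

def collect_sample(state, pn, pf, yaw_rate, vsp_dot):
    """Accumulate the intersection deviations and vehicle-motion terms,
    store the near-region intersection, and advance the sample counter."""
    state["SumTotalPx"] += pn[0] - pf[0]        # eq. (2-9)
    state["SumTotalPy"] += pn[1] - pf[1]        # eq. (2-10)
    state["SumTotalYR"] += yaw_rate             # eq. (2-11)
    state["SumTotalVD"] += vsp_dot              # eq. (2-12)
    state["FOE_X_DataRcd"].append(pn[0])        # FOE_X_DataRcd[SumCount] = Pn_x
    state["FOE_Y_DataRcd"].append(pn[1])        # FOE_Y_DataRcd[SumCount] = Pn_y
    state["SumCount"] += 1                      # eq. (2-15)
```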
  • in step S210, the imaging angle deriving unit 104 calculates the imaging angle of the imaging unit 101 using the following equations (2-17) and (2-18). Further, the standard deviation calculation unit 107 calculates the standard deviation of the imaging angle of the imaging unit 101 by the following equations (2-19) and (2-20).
  • FOE_X_e_tmp = Σ FOE_X_DataRcd / SumCount (2-17)
  • FOE_Y_e_tmp = Σ FOE_Y_DataRcd / SumCount (2-18)
  • FOE_X_stdev = √(Σ (FOE_X_e_tmp - FOE_X_DataRcd)² / SumCount) (2-19)
  • FOE_Y_stdev = √(Σ (FOE_Y_e_tmp - FOE_Y_DataRcd)² / SumCount) (2-20)
  • Σ in the above equations is an operator that takes the sum over the SumCount coordinate data of the neighboring intersections.
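Equations (2-17) to (2-20) are the ordinary population mean and standard deviation over the SumCount collected coordinates; a direct transcription for one axis:

```python
import math

def foe_candidate_stats(data):
    """Return (mean, stdev) of the recorded FOE coordinates, i.e. the
    counterparts of FOE_X_e_tmp and FOE_X_stdev for one axis."""
    n = len(data)                                    # SumCount
    mean = sum(data) / n                             # eq. (2-17)/(2-18)
    variance = sum((mean - x) ** 2 for x in data) / n
    return mean, math.sqrt(variance)                 # eq. (2-19)/(2-20)
```

The same function is applied once to FOE_X_DataRcd and once to FOE_Y_DataRcd.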
  • in step S211, the variation of the imaging angle candidates of the imaging unit 101 obtained by the imaging angle deriving unit 104 is determined. Specifically, if all the conditions of the following equations are satisfied, the process proceeds to step S212; otherwise, the process proceeds to step S213.
  • TH_STDEV represents a threshold value indicating a variation allowed for the imaging angle candidate of the imaging unit 101 obtained by the imaging angle deriving unit 104.
  • TH_STDEV takes a positive value such as 1.0 pix. That is, when the standard deviations FOE_X_stdev and FOE_Y_stdev obtained in step S210 are smaller than TH_STDEV, it is determined that the variation of the imaging angle candidates of the imaging unit 101 obtained by the imaging angle deriving unit 104 is small, and in step S212 the imaging angle of the imaging unit 101 is corrected.
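The resulting gate of steps S211/S212 is then a plain comparison against TH_STDEV; the function name is an illustrative assumption:

```python
TH_STDEV = 1.0  # allowed variation of the imaging angle candidates [pix]

def adopt_if_converged(foe_x_stats, foe_y_stats):
    """foe_*_stats are (mean, stdev) pairs for the two image axes.
    Return (FOE_X_est, FOE_Y_est) when both spreads are small enough,
    otherwise None (no correction this cycle)."""
    if foe_x_stats[1] < TH_STDEV and foe_y_stats[1] < TH_STDEV:
        return foe_x_stats[0], foe_y_stats[0]   # FOE_X_est = FOE_X_e_tmp, etc.
    return None
```

Returning the averaged candidate only on convergence is what lets the correction accuracy be stated as a specific value.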
  • the correction accuracy can be improved compared with the first embodiment by performing the correction only when the variation indicated by the obtained standard deviation is small. Furthermore, the correction accuracy of the imaging angle based on the present invention, performed after factory aiming, can be defined as a specific value.
  • in step S212, the imaging angle correction unit 106 determines the corrected imaging angle of the imaging unit 101 by the following equations. These coordinates are used as the origin coordinates when performing the lane recognition process in the subsequent step S203.
  • FOE_X_est = FOE_X_e_tmp (2-23)
  • FOE_Y_est = FOE_Y_e_tmp (2-24)
  • in step S213, the past value used in the filter and the counter value used in the timer are updated, and the process ends. Note that SumCount is initialized to "0" before the processing flow of FIG. 8 is executed.
  • the on-vehicle image recognition apparatus of this embodiment is the same as that of the first embodiment except for the configuration of the standard deviation calculation unit 107.
  • the standard deviation calculation unit 107 calculates the standard deviation of the imaging angle of the imaging unit 101 when it is determined that there is no bias. Then, the imaging angle of the imaging unit 101 is corrected according to the calculated standard deviation. Thereby, the accuracy of estimation of the imaging angle of the imaging unit 101 can be improved.
  • the standard deviation calculation unit 107 calculates the standard deviation of the imaging angle obtained by the imaging angle deriving unit 104 when the information bias determination unit 105 determines that the bias is within the threshold.
  • the imaging angle correction unit 106 corrects the imaging angle of the imaging unit 101 according to the standard deviation calculated by the standard deviation calculation unit 107.
  • since the information bias determination unit 105 determines the presence or absence of information bias, and the standard deviation is calculated only from the information determined to have no bias, even a small number of data points (for example, 50) tends strongly toward a normal distribution, and a correct determination of the degree of variation can be realized with a small calculation load.
  • the behavior of the vehicle 1 recognized by the vehicle behavior recognition unit 103 is information regarding the rotational behavior of the vehicle 1 in the vehicle width direction. Further, the vehicle behavior recognition unit 103 recognizes the behavior of the vehicle 1 from the position in the vehicle width direction of the vehicle 1 with respect to the travel lane recognized by the lane shape recognition unit 102 or the time change of the yaw angle. Thereby, the accuracy of estimation of the imaging angle of the imaging unit 101 can be improved.
  • FIG. 9 is a diagram illustrating an example of the configuration of the in-vehicle image recognition apparatus according to the present embodiment.
  • the in-vehicle image recognition apparatus according to the present embodiment includes an imaging unit 101, a lane shape recognition unit 102, a vehicle behavior recognition unit 103, an imaging angle derivation unit 104, an information bias determination unit 105, an imaging angle correction unit 106, and a standard deviation calculation unit. 107 and an end unit 108.
  • the standard deviation calculated by the standard deviation calculating unit 107 is less than a predetermined value
  • the ending unit 108 ends the correction of the imaging angle.
  • in step S301, the lane shape recognition unit 102 reads an image in front of the vehicle 1 captured by the imaging unit 101.
  • in step S302, the vehicle behavior recognition unit 103 reads the vehicle speed of the vehicle 1 detected by the vehicle speed detection device 20, the steering angle detected by the steering angle detection device 30, the longitudinal acceleration detected by the acceleration sensor, and the yaw rate value from the yaw rate sensor.
  • in step S303, the lane shape recognition unit 102 processes the captured image of the imaging unit 101 read in step S301 to recognize the traveling lane in which the vehicle 1 is traveling, and calculates the position, vehicle posture, and the like of the vehicle 1 with respect to the traveling lane.
  • in step S304, the end unit 108 determines whether the imaging angle correction processing of the imaging unit 101 has been completed. If it has not been completed, the process proceeds to step S305; if it has already been completed, the process proceeds to step S314. Specifically, if the condition of the following expression is satisfied, the process proceeds to step S305; otherwise, the process proceeds to step S314.
  • FlgAimComplt is a flag indicating whether or not the imaging angle correction processing of the imaging unit 101 has been completed.
  • the initial value of FlgAimComplt is “0”.
  • in step S305, the lane shape recognition unit 102 uses the lane shape recognized in step S303 to obtain the intersection coordinates from the extension lines of a pair of left and right lane markers in the far and near areas.
  • the intersection coordinates of the neighboring area obtained by the lane shape recognition unit 102 are defined as (Pn_x, Pn_y), and the intersection coordinates of the far area are defined as (Pf_x, Pf_y).
  • in step S306, the straight road determination unit 410 determines whether or not the road is a straight road from the intersection coordinates of the distant and vicinity areas obtained in step S305. If both of the above equations (2-1) and (2-2) are satisfied, the process proceeds to step S307; otherwise, the process proceeds to step S314. This process is the same as step S205 of the second embodiment.
  • in step S307, the parallel running determination unit 420 receives as input the lateral offset position (the lateral distance to the lane marker) Y of the host vehicle with respect to the host lane obtained in step S303, and pseudo-differentiates it with the transfer function of the above equations (0-2) and (0-3) to calculate the lateral speed Ydot of the host vehicle. If the above equation (0-4) is satisfied, the process proceeds to step S308; otherwise, the process proceeds to step S314. This process is the same as step S206 in the second embodiment.
  • in step S308, the information bias determination unit 105 determines whether or not all the conditions of the following equations are satisfied. If all of them are satisfied, the process proceeds to step S309; otherwise, the process proceeds to step S314.
  • ysoku is a parameter indicating the speed of the vehicle 1 in the vehicle width direction.
  • as ysoku, the vehicle-width-direction speed among the state variables of the lane recognition Kalman filter used in the lane recognition process of step S303 may be used as it is, or an equivalent value may be obtained by time-differentiating the position in the vehicle width direction with respect to the travel lane.
  • SumTotalYsoku is the total value of ysoku.
  • TH_YS is a threshold for regarding the vehicle 1 as traveling straight; as shown in equations (3-4) and (3-5), when the absolute value of the vehicle-width-direction speed ysoku and the absolute value of its total SumTotalYsoku + ysoku are both less than TH_YS, it is determined that the vehicle 1 is traveling straight.
  • as for the yaw rate, the corresponding state variable of the lane recognition Kalman filter used in the lane recognition process of step S303 may be used as it is, or the yaw angle with respect to the lane may be time-differentiated.
  • the meanings of expressions other than expressions (3-4) and (3-5) are the same as those in the first embodiment and the second embodiment.
  • SumTotalRow = SumTotalRow + Row (3-10)
  • SumTotalYsoku = SumTotalYsoku + ysoku (3-11)
  • SumTotalYR = SumTotalYR + YawRate (3-12)
  • SumTotalVD = SumTotalVD + VspDot (3-13)
  • FOE_X_DataRcd[SumCount] = Pn_x (3-14)
  • FOE_Y_DataRcd[SumCount] = Pn_y (3-15)
  • SumCount = SumCount + 1 (3-16)
  • in step S313, the imaging angle correction unit 106 sets the completion flag FlgAimComplt of the imaging angle estimation process of the imaging unit 101 and determines the imaging angle of the imaging unit 101 according to the following equations. These coordinates are used as the origin coordinates when performing the lane recognition process in the subsequent step S303.
  • FlgAimComplt = 1 (3-17)
  • FOE_Y_est = FOE_Y_e_tmp (3-19)
  • in step S314, the past value used in the filter and the counter value used in the timer are updated, and the process ends. Note that each value of FlgAimComplt and SumTotalRow is initialized to "0" before the processing flow of FIG. 10 is executed.
  • the on-vehicle image recognition apparatus of this embodiment is the same as that of the second embodiment except for the configuration of the end unit 108.
  • the end unit 108 ends the correction of the imaging angle when the calculated standard deviation is less than a predetermined value. Thereby, once it is determined that the variation of the imaging angle candidates of the imaging unit 101 is small, the processing for correcting the imaging angle can be ended, so the processing load of the in-vehicle image recognition device can be reduced.
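A minimal sketch of this termination logic, with assumed names following the text (FlgAimComplt, a TH_STDEV-style threshold):

```python
class EndUnit:
    """Once the spread of the angle candidates drops below the threshold,
    set the completion flag so later cycles skip the aiming branch."""

    def __init__(self, th_stdev=1.0):
        self.th_stdev = th_stdev
        self.flg_aim_complt = 0   # FlgAimComplt, initial value "0"

    def check(self, stdev_x, stdev_y):
        """Return True while aiming should keep running this cycle."""
        if self.flg_aim_complt:
            return False          # already completed: skip correction
        if stdev_x < self.th_stdev and stdev_y < self.th_stdev:
            self.flg_aim_complt = 1   # cf. equation (3-17)
            return False
        return True
```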
  • FIG. 11 is a diagram illustrating the effects achieved by the in-vehicle image recognition device according to the present embodiment.
  • the graph shown in FIG. 11 is a result of executing the lane recognition process according to the present embodiment in a scene where there are many gentle curves on the highway.
  • the data in a range 70 surrounded by a circle is data indicating the result of Pn_x calculated in step S305.
  • the data in range 71 is the result of Pn_x collected in step S307.
  • the worst values indicated by the broken lines 80 and 81 approach the true value of 120.0 pixels by about 10 pixels. It was also confirmed from the standard deviation that the variation was almost halved (a reduction of about 44%). Furthermore, the number of data points was reduced from 8000 to 50, and the processing load of the standard deviation calculation was also reduced.
  • the behavior of the vehicle 1 recognized by the vehicle behavior recognition unit 103 is information regarding the translational behavior of the vehicle 1 in the vehicle width direction. Further, the vehicle behavior recognition unit 103 recognizes the behavior of the vehicle 1 from the position in the vehicle width direction of the vehicle 1 with respect to the travel lane recognized by the lane shape recognition unit 102 or the time change of the yaw angle. Thereby, the accuracy of estimation of the imaging angle of the imaging unit 101 can be improved.
  • the vehicle speed detection device 20 constitutes a vehicle speed detection unit.
  • the steering angle detection device 30 constitutes a steering angle detection unit.
  • the lane shape recognition unit 102 constitutes a parameter detection unit.
  • the vehicle behavior recognition unit 103, or the vehicle speed detection device 20, the steering angle detection device 30, and the steering angle control device 40 constitute a parameter detection unit.
  • the straight road determination unit 410 constitutes an intersection bias determination unit and a recognition bias determination unit.
  • the information bias determination unit 105 constitutes a convergence determination unit, an integration unit, and a straight traveling state determination unit.
  • the imaging angle derivation unit 104 and the imaging angle correction unit 106 constitute an aiming execution unit.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Mathematical Physics (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)
  • Steering Control In Accordance With Driving Conditions (AREA)

Abstract

The present invention estimates the imaging angle of a vehicle-mounted imaging unit with a small calculation load. An in-vehicle image recognition device mounted on a vehicle (1) recognizes the shape of the lane in which the vehicle (1) is traveling from an image captured by a camera (10) that images the travel path around the vehicle (1). The imaging angle of the camera (10) is obtained from the recognized lane shape. Any bias in the recognized lane shape is determined, and the imaging angle of the camera (10) is corrected using the obtained imaging angle when it is determined that no bias is present. The standard deviation of the imaging angle obtained when no bias was determined to be present is calculated, and the imaging angle of the camera (10) is corrected according to the calculated standard deviation. The vehicle speed and steering angle of the vehicle (1) are detected, the behavior of the vehicle (1) is recognized, and any bias with respect to the behavior of the vehicle (1) is determined, thereby allowing the imaging angle of the camera (10) to be corrected.
PCT/JP2012/001576 2011-06-13 2012-03-07 Road shape determining device, in-vehicle image recognizing device, imaging axis adjusting device, and lane recognizing method WO2012172713A1 (fr)

Priority Applications (4)

Application Number Priority Date Filing Date Title
US14/125,832 US20140118552A1 (en) 2011-06-13 2012-03-07 Road shape determining device, in-vehicle image recognizing device, imaging axis adjusting device, and lane recognizing method
EP12801148.3A EP2720213A4 (fr) 2011-06-13 2012-03-07 Road shape determining device, in-vehicle image recognizing device, imaging axis adjusting device, and lane recognizing method
JP2013520407A JP5733395B2 (ja) 2011-06-13 2012-03-07 In-vehicle image recognition device, imaging axis adjustment device, and lane recognition method
CN201280026555.2A CN103582907B (zh) 2011-06-13 2012-03-07 In-vehicle image recognition device and lane recognition method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2011131222 2011-06-13
JP2011-131222 2011-06-13

Publications (1)

Publication Number Publication Date
WO2012172713A1 true WO2012172713A1 (fr) 2012-12-20

Family

ID=47356734

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/001576 WO2012172713A1 (fr) Road shape determining device, in-vehicle image recognizing device, imaging axis adjusting device, and lane recognizing method

Country Status (5)

Country Link
US (1) US20140118552A1 (fr)
EP (1) EP2720213A4 (fr)
JP (1) JP5733395B2 (fr)
CN (1) CN103582907B (fr)
WO (1) WO2012172713A1 (fr)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103991474A (zh) * 2013-02-14 2014-08-20 本田技研工业株式会社 车辆的转向控制装置
JP2018005617A (ja) * 2016-07-04 2018-01-11 株式会社Soken 走路形状認識装置、走路形状認識方法
US10800412B2 (en) * 2018-10-12 2020-10-13 GM Global Technology Operations LLC System and method for autonomous control of a path of a vehicle
US20200408586A1 (en) * 2018-03-22 2020-12-31 Panasonic Intellectual Property Management Co., Ltd. Axle load measuring apparatus and axle load measuring method
WO2022145054A1 (fr) * 2021-01-04 2022-07-07 日本電気株式会社 Dispositif de traitement d'image, procédé de traitement d'image et support d'enregistrement
JP7359922B1 (ja) 2022-09-26 2023-10-11 株式会社デンソーテン 情報処理装置、情報処理方法およびプログラム

Families Citing this family (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5747482B2 (ja) * 2010-03-26 2015-07-15 日産自動車株式会社 車両用環境認識装置
MY177486A (en) * 2013-04-18 2020-09-16 West Nippon Expressway Engineering Shikoku Company Ltd Device for inspecting shape of road travel surface
JP6093314B2 (ja) * 2014-02-14 2017-03-08 株式会社デンソー 境界線認識装置
JP6299373B2 (ja) * 2014-04-18 2018-03-28 富士通株式会社 撮像方向の正常性の判定方法、撮像方向の正常性の判定プログラムおよび撮像方向の正常性の判定装置
JP6189816B2 (ja) * 2014-11-19 2017-08-30 株式会社Soken 走行区画線認識装置
JP6389119B2 (ja) * 2014-12-25 2018-09-12 株式会社デンソー 車線境界線認識装置
JP6363518B2 (ja) * 2015-01-21 2018-07-25 株式会社デンソー 区画線認識装置
KR101673776B1 (ko) * 2015-06-05 2016-11-07 현대자동차주식회사 자동차용 헤드유닛 및 카메라 유닛의 고장 진단 방법
KR101748269B1 (ko) * 2015-11-11 2017-06-27 현대자동차주식회사 자율 주행 차량의 조향 제어 방법 및 장치
KR102433791B1 (ko) 2015-11-20 2022-08-19 주식회사 에이치엘클레무브 차선 이탈 경고 장치 및 방법
KR102503253B1 (ko) * 2015-12-14 2023-02-22 현대모비스 주식회사 주변 차량 인지 시스템 및 방법
US20170307743A1 (en) * 2016-04-22 2017-10-26 Delphi Technologies, Inc. Prioritized Sensor Data Processing Using Map Information For Automated Vehicles
DE102016207436A1 (de) * 2016-04-29 2017-11-02 Ford Global Technologies, Llc System und Verfahren zum Steuern- und/oder Regeln eines Lenksystems eines Fahrzeugs sowie Fahrzeug
JP6637399B2 (ja) * 2016-09-30 2020-01-29 株式会社デンソー 領域認識装置及び領域認識方法
EP3570262A4 (fr) * 2017-01-10 2019-12-18 Mitsubishi Electric Corporation Dispositif de reconnaissance de trajet de déplacement et procédé de reconnaissance de trajet de déplacement
JP2018173834A (ja) * 2017-03-31 2018-11-08 本田技研工業株式会社 車両制御装置
CN106910358B (zh) * 2017-04-21 2019-09-06 百度在线网络技术(北京)有限公司 用于无人车的姿态确定方法和装置
JP6627822B2 (ja) * 2017-06-06 2020-01-08 トヨタ自動車株式会社 車線変更支援装置
CN108099905B (zh) * 2017-12-18 2020-08-18 深圳大学 车辆偏航检测方法、系统及机器视觉系统
CN108229386B (zh) * 2017-12-29 2021-12-14 百度在线网络技术(北京)有限公司 用于检测车道线的方法、装置和介质
CN108427417B (zh) * 2018-03-30 2020-11-24 北京图森智途科技有限公司 自动驾驶控制系统及方法、计算机服务器和自动驾驶车辆
WO2020004231A1 (fr) * 2018-06-27 2020-01-02 日本電信電話株式会社 Dispositif, procédé et programme d'estimation de voie
CN108921079B (zh) * 2018-06-27 2022-06-10 盯盯拍(深圳)技术股份有限公司 拍摄角度调整方法、拍摄角度调整设备以及车载摄像装置
KR102132899B1 (ko) * 2018-10-08 2020-07-21 주식회사 만도 교차로에서의 경로 생성 장치 및 교차로에서의 차량 제어 장치 및 방법
CN113016179A (zh) * 2018-11-15 2021-06-22 松下知识产权经营株式会社 摄像机系统和车辆
US10728461B1 (en) * 2019-01-31 2020-07-28 StradVision, Inc. Method for correcting misalignment of camera by selectively using information generated by itself and information generated by other entities and device using the same
CN112686904A (zh) * 2020-12-14 2021-04-20 深兰人工智能(深圳)有限公司 车道划分方法、装置、电子设备和存储介质

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000242899A (ja) 1999-02-24 2000-09-08 Mitsubishi Electric Corp 白線認識装置
JP2002259995A (ja) * 2001-03-06 2002-09-13 Nissan Motor Co Ltd 位置検出装置
JP2004252827A (ja) 2003-02-21 2004-09-09 Nissan Motor Co Ltd 車線認識装置
JP2004318618A (ja) 2003-04-17 2004-11-11 Nissan Motor Co Ltd 車線認識装置
JP2008003959A (ja) * 2006-06-23 2008-01-10 Omron Corp 車両用通信システム
JP2009234543A (ja) * 2008-03-28 2009-10-15 Mazda Motor Corp 車両の車線逸脱警報装置
WO2010140578A1 (fr) * 2009-06-02 2010-12-09 日本電気株式会社 Dispositif de traitement d'image, procédé de traitement d'image, et programme de traitement d'image
JP2011221983A (ja) * 2010-03-26 2011-11-04 Nissan Motor Co Ltd 車両用環境認識装置

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5359666A (en) * 1988-09-28 1994-10-25 Honda Giken Kogyo Kabushiki Kaisha Driving way judging device and method
JPH0624035B2 (ja) * 1988-09-28 1994-03-30 本田技研工業株式会社 走行路判別装置
JP3357749B2 (ja) * 1994-07-12 2002-12-16 本田技研工業株式会社 車両の走行路画像処理装置
JP3521860B2 (ja) * 2000-10-02 2004-04-26 日産自動車株式会社 車両の走行路認識装置
JP3645196B2 (ja) * 2001-02-09 2005-05-11 松下電器産業株式会社 画像合成装置
EP1796042B1 (fr) * 2005-12-06 2011-02-23 Nissan Motor Co., Ltd. Appareil et procédé de détection
JP4820221B2 (ja) * 2006-06-29 2011-11-24 日立オートモティブシステムズ株式会社 車載カメラのキャリブレーション装置およびプログラム
JP2008241446A (ja) * 2007-03-27 2008-10-09 Clarion Co Ltd ナビゲーション装置及びその制御方法
JP4801821B2 (ja) * 2007-09-21 2011-10-26 本田技研工業株式会社 道路形状推定装置
WO2010146695A1 (fr) * 2009-06-18 2010-12-23 富士通株式会社 Dispositif de traitement d'image et procédé de traitement d'image


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP2720213A4

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103991474A (zh) * 2013-02-14 2014-08-20 本田技研工业株式会社 车辆的转向控制装置
JP2018005617A (ja) * 2016-07-04 2018-01-11 株式会社Soken 走路形状認識装置、走路形状認識方法
US20200408586A1 (en) * 2018-03-22 2020-12-31 Panasonic Intellectual Property Management Co., Ltd. Axle load measuring apparatus and axle load measuring method
US11976960B2 (en) * 2018-03-22 2024-05-07 Panasonic Intellectual Property Management Co., Ltd. Axle load measuring apparatus and axle load measuring method
US10800412B2 (en) * 2018-10-12 2020-10-13 GM Global Technology Operations LLC System and method for autonomous control of a path of a vehicle
WO2022145054A1 (fr) * 2021-01-04 2022-07-07 日本電気株式会社 Dispositif de traitement d'image, procédé de traitement d'image et support d'enregistrement
JP7359922B1 (ja) 2022-09-26 2023-10-11 株式会社デンソーテン 情報処理装置、情報処理方法およびプログラム

Also Published As

Publication number Publication date
EP2720213A1 (fr) 2014-04-16
JPWO2012172713A1 (ja) 2015-02-23
CN103582907A (zh) 2014-02-12
JP5733395B2 (ja) 2015-06-10
US20140118552A1 (en) 2014-05-01
CN103582907B (zh) 2016-07-20
EP2720213A4 (fr) 2015-03-11

Similar Documents

Publication Publication Date Title
JP5733395B2 (ja) In-vehicle image recognition device, imaging axis adjustment device, and lane recognition method
US8670590B2 (en) Image processing device
EP3179445B1 (fr) Exterior environment recognition device for vehicles, and vehicle behavior control device using the same
JP3711405B2 (ja) Method and system for extracting vehicle road information using a camera
US7542835B2 (en) Vehicle image processing device
JP3780848B2 (ja) Vehicle travel path recognition device
US11398051B2 (en) Vehicle camera calibration apparatus and method
EP2933790A1 (fr) Moving object location/attitude angle estimation device and moving object location/attitude angle estimation method
US20100201814A1 (en) Camera auto-calibration by horizon estimation
EP3282389B1 (fr) Image processing apparatus, image capturing apparatus, mobile body apparatus control system, image processing method, and program
US8730325B2 (en) Traveling lane detector
US20150363653A1 (en) Road environment recognition system
US10235579B2 (en) Vanishing point correction apparatus and method
CN111164648B (zh) Position estimation device and position estimation method for moving body
JP6035095B2 (ja) Vehicle collision determination device
JP3961584B2 (ja) Lane marking detection device
US20120128211A1 (en) Distance calculation device for vehicle
JP3319383B2 (ja) Travel path recognition device
JP5559650B2 (ja) Lane estimation device
JP6963490B2 (ja) Vehicle control device
US20180005051A1 (en) Travel road shape recognition apparatus and travel road shape recognition method
EP3287948B1 (fr) Image processing apparatus, moving body apparatus control system, image processing method, and program
JP7064400B2 (ja) Object detection device
WO2020036039A1 (fr) Stereo camera device
JP3700681B2 (ja) Travel path detection device

Legal Events

Date Code Title Description
121 Ep: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 12801148

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2013520407

Country of ref document: JP

Kind code of ref document: A

REEP Request for entry into the european phase

Ref document number: 2012801148

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2012801148

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 14125832

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE