WO2013186903A1 - Lane marking detection device and driving support system - Google Patents
Lane marking detection device and driving support system
- Publication number
- WO2013186903A1 (PCT/JP2012/065275)
- Authority
- WO
- WIPO (PCT)
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/588—Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R21/00—Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/167—Driving aids for lane monitoring, lane changing, e.g. blind spot detection
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/10—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
- B60R2300/105—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using multiple cameras
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/80—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
- B60R2300/804—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for lane monitoring
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/40—Photo or light sensitive means, e.g. infrared sensors
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2552/00—Input parameters relating to infrastructure
- B60W2552/30—Road curve radius
Definitions
- the present invention relates to a lane marking detection device that detects a lane marking marked on a road surface.
- a stereo camera capable of acquiring distance information is used as a camera for photographing a white line.
- the white line detection device detects the white line by one of the following methods: (1) fixing the image used for white line detection to either the image taken by the left camera or the image taken by the right camera; (2) detecting the white line from the image area captured in common by the left and right cameras.
- Patent Document 1 discloses an image processing apparatus that calculates three-dimensional position information from an image captured by a stereo camera and detects a white line existing on a road surface from each of left and right images.
- when the image for white line detection is fixed to the left camera as in (1) above, if the road curves to the right, the area in which the right white line is captured becomes narrow; conversely, when it is fixed to the right camera, if the road curves to the left, the area in which the left white line is captured becomes narrow.
- in method (2), in which the white line is detected from the image area captured in common by the left and right cameras, the area in which the white line appears within that common image area becomes narrow when the road curves to the right or left.
- FIG. 1 shows an example of images taken by each of the right camera and the left camera when the road curves to the left.
- 1A shows an image of the left camera
- FIG. 1B shows an image of the right camera.
- if the image captured by the right camera is fixed as the image for white line detection, a distant white line on the left side cannot be detected.
- the areas within the dotted lines of the left camera image and the right camera image are the areas captured in common by the left and right cameras. Therefore, when the white line is detected from the image area captured in common by the left and right cameras, a distant white line on the left side likewise cannot be detected.
- this occurs because the left and right cameras are arranged apart from each other in the vehicle width direction (the left and right cameras have a predetermined baseline length), with the right camera located to the right of the center in the vehicle width direction and the left camera located to the left of it.
- an object of the present invention is to provide a lane marking detection device capable of improving the detection accuracy of a lane marking regardless of the road shape.
- the present invention provides a lane marking detection device having: a plurality of photographing means for photographing a lane marking that divides the lane in which the host vehicle is traveling; lane marking detection means for detecting the lane marking from at least one of the image data respectively generated by the plurality of photographing means;
- road shape estimation means for estimating the road shape from the lane marking detected by the lane marking detection means;
- and image data determination means for determining, according to the estimated road shape, the image data used by the detection means to detect the lane marking.
- FIG. 2 is an example of a diagram illustrating the schematic features of the white line detection device of the present embodiment.
- first, a camera for white line detection is preset from among the left and right cameras (S1).
- if the right camera is the white line detection camera, a white line is detected from an image captured by the right camera.
- next, the white line detection device estimates the road parameters from the detected white line (S2). Since the road parameters include, for example, the curvature and radius of the road, the white line detection device determines whether the road curves in the direction opposite to the side on which the current (for example, initially set) white line detection camera is arranged (S3). That is, when the right camera is the current white line detection camera, the white line detection device 100 determines whether or not the road curves to the left.
- if so, the white line detection device switches the white line detection camera (S4). That is, when the right camera is the current white line detection camera, the image captured by the left camera is set as the white line detection image from the next frame onward.
- thus, even though the camera that captures the image for white line detection is set to either the right camera or the left camera, a distant white line can be detected even when the road curves in the direction opposite to the side on which that camera is arranged.
- the white line detection device continues to determine whether or not the road is curved, and switches the white line detection camera whenever the road curves in the direction opposite to the side on which the current white line detection camera is arranged. By repeating this process, a distant white line can be detected from the image of either the left or right camera regardless of the road shape.
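The S1-S4 loop above can be sketched as follows. This is an illustrative sketch, not the patent's implementation; the helper name `select_camera` and the sign convention (negative curvature = left curve) are assumptions made for this example.

```python
# Illustrative sketch of the S1-S4 camera-switching flow described above.
# Camera names and the curvature sign convention (curvature < 0 for a left
# curve) are assumptions made for this example.

def select_camera(current, curvature):
    """Switch the white line detection camera when the road curves away
    from the side on which the current camera is arranged (S3/S4)."""
    if current == "right" and curvature < 0:   # left curve detected
        return "left"
    if current == "left" and curvature > 0:    # right curve detected
        return "right"
    return current                             # straight road: keep camera

camera = "right"                                  # S1: preset detection camera
assert select_camera(camera, -0.002) == "left"    # left curve -> switch
assert select_camera("left", 0.002) == "right"    # right curve -> switch back
assert select_camera("right", 0.0) == "right"     # straight -> keep
```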
- FIG. 3 shows an example of a schematic configuration diagram of a driving support system including the white line detection device 100.
- the driving support system 500 includes a white line detection device 100, a driving support ECU (Electronic Control Unit) 200, and an operation device 300 that are connected to each other via an in-vehicle LAN such as a CAN (Controller Area Network) bus.
- the white line detection device 100 corresponds to the lane marking detection device in the claims.
- the white line detection device 100 transmits, in addition to the road parameters described later, target information about obstacles to the driving support ECU 200.
- the driving support ECU 200 determines the necessity of driving support based on the road parameter and target information, and requests the operation device 300 to operate.
- the operation device 300 is, for example, an alarm device for a meter panel, an electric power steering device, a brake actuator, or the like.
- the road parameters are, for example, the curvature (radius) of the road, the lane width of the road, the lateral position of the host vehicle in the lane, the yaw angle of the host vehicle with respect to the road, and the like.
- the driving assistance ECU 200 performs driving assistance using information (for example, wheel speed, steering angle, yaw rate, etc.) detected by another ECU connected to the in-vehicle LAN or a sensor.
- LDW: Lane Departure Warning
- LKA: Lane Keeping Assist
- LDW refers to driving assistance that alerts the driver with a warning sound or vibration when there is a risk of lane departure, as determined from the lateral position, yaw angle, and vehicle speed.
- LKA is a driving support mode in which the electric power steering device is controlled to apply steering torque to the steering shaft, and the braking of each wheel and the engine output are controlled so that the host vehicle travels along a target driving line in the lane.
- in addition to LKA that controls the steering torque, the braking of each wheel, and the engine output so that the vehicle travels along the target travel line, there is also LKA that controls the vehicle so as to maintain the lane when there is a risk of lane departure. In this way, various forms of driving assistance become possible by detecting the white line.
- the target information is, for example, distance information, relative speed, and direction (lateral position).
- the driving assistance ECU 200 extracts, based on the direction (lateral position), obstacles that may cause a collision, and when the TTC (Time To Collision) calculated from the distance information and the relative speed falls below a predetermined value, it sounds an alarm or decelerates the vehicle.
- TTC: Time To Collision
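The TTC test above can be sketched as follows; the threshold value and function name are assumptions for illustration, since the text only states that support is triggered when the TTC falls below a predetermined value.

```python
# Minimal sketch of the TTC check described above. The threshold value and
# helper name are assumptions; the source only says support is triggered
# when TTC falls below a predetermined value.

def time_to_collision(distance_m, closing_speed_mps):
    """TTC = distance / relative (closing) speed; infinite if not closing."""
    if closing_speed_mps <= 0:
        return float("inf")
    return distance_m / closing_speed_mps

TTC_THRESHOLD_S = 2.0  # assumed "predetermined value"

ttc = time_to_collision(30.0, 20.0)   # 30 m gap, closing at 20 m/s
print(ttc)                            # 1.5 (seconds)
print(ttc < TTC_THRESHOLD_S)          # True -> alarm / deceleration
```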
- the white line detection apparatus 100 includes a right camera 11, a left camera 12, and a camera computer 13.
- the right camera 11 and the left camera 12 constitute one stereo camera.
- the stereo camera is arranged on the rear-view mirror with the optical axis facing the front of the vehicle, but may be arranged at another place such as a roof top.
- the right camera 11 and the left camera 12 are arranged at a predetermined interval (base line length), spaced apart in the vehicle width direction or the horizontal direction. The front and rear positions or height positions of the right camera 11 and the left camera 12 do not have to match.
- Each of the right camera 11 and the left camera 12 has a solid-state image sensor such as a CCD or CMOS or a backside-illuminated CMOS.
- the right camera 11 and the left camera 12 may be a monochrome camera that acquires only luminance information or a color camera.
- the right camera 11 and the left camera 12 periodically photograph a predetermined range in front of the vehicle almost simultaneously.
- since a white line is marked on the road surface, the white line is captured in the image.
- the white line in this embodiment corresponds to the lane marking in the claims.
- the lane markings are not limited to those that are marked with a white color on the road surface, but include, for example, linear signs with a chromatic color such as yellow or orange.
- the white line is not limited to a linear marking drawn as a continuous solid line; it may be a dotted line in which many dots continue linearly, or a broken line in which the solid line is interrupted at equal (or varying) intervals.
- a dotted line can be converted into a straight line by, for example, a Hough transform.
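As a sketch of how a Hough transform can group dotted-line points into a single straight line, consider the following minimal accumulator implementation; the point data and bin resolutions are invented for illustration, and production code would typically use an optimized library routine.

```python
# Minimal Hough transform sketch: votes in (rho, theta) space pick out the
# line rho = x*cos(theta) + y*sin(theta) through a set of dots. The dot
# coordinates and bin sizes below are invented for illustration.
import numpy as np

def hough_line(points, n_theta=180, rho_res=1.0):
    """Return (rho, theta) of the strongest line through the points."""
    pts = np.asarray(points, dtype=float)
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    rho_max = np.hypot(pts[:, 0].max(), pts[:, 1].max())
    n_rho = int(np.ceil(2 * rho_max / rho_res)) + 2
    acc = np.zeros((n_rho, n_theta), dtype=int)
    for x, y in pts:
        rhos = x * np.cos(thetas) + y * np.sin(thetas)
        idx = np.round((rhos + rho_max) / rho_res).astype(int)
        idx = np.clip(idx, 0, n_rho - 1)
        acc[idx, np.arange(n_theta)] += 1     # one vote per (rho, theta) cell
    r, t = np.unravel_index(acc.argmax(), acc.shape)
    return r * rho_res - rho_max, thetas[t]

# Dots along the vertical line x = 5 (a dotted lane marking seen top-down)
dots = [(5, y) for y in range(0, 50, 5)]
rho, theta = hough_line(dots)
print(round(rho), round(np.degrees(theta)))   # 5 0  (x*cos(0) = 5)
```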
- the white line is not limited to a planar marking and may have portions raised above the road surface, such as Botts' dots. Since Botts' dots are continuous (for example, equally spaced) dots, they can be regarded as a line in the same way as a dotted line.
- similarly, road studs that reflect the light of the host vehicle or emit light to mark the boundary between lanes, such as cat's eyes, can also be regarded as a line in the same way as a dotted line if they are laid continuously.
- the camera computer 13 is a computer provided with a CPU, ROM, RAM, CAN controller, input/output I/F, and other general circuits. As will be described later, the camera computer 13 performs distortion correction / parallelization, white line detection, road parameter estimation, and switching of the camera or image used for white line detection.
- FIG. 4 shows an example of a functional block diagram of the camera computer 13.
- the camera computer 13 includes a stereo image acquisition unit 21, a distortion correction / parallelization unit 22, an edge extraction unit 23, a white line detection unit 24, a road parameter estimation unit 25, a processed image switching unit 26, and a parallax calculation unit 27.
- the stereo image acquisition unit 21 acquires the image data of the images periodically captured by the right camera 11 and the images periodically captured by the left camera 12, and stores them in a buffer or the like.
- the distortion correction / parallelization unit 22 corrects and parallelizes the distortion of the left and right images using the external parameters and internal parameters of the camera acquired in advance.
- the internal parameters are, for example, lens distortion and distortion aberration of the right camera 11 and the left camera 12, image sensor distortion, focal length, and the like.
- Lens distortion is reduced by, for example, correcting image data with reference to a correction conversion table generated based on a lens design value.
- Distortion is reduced by correcting image data based on parameter estimation using a radial distortion aberration model.
- external parameters are, for example, values obtained by quantifying the camera mounting position and orientation. Since there are slight differences in the mounting positions (for example, height) and orientations (pitch, roll, yaw) of the left and right cameras, it may not be possible to make the optical axes of the right camera 11 and the left camera 12 perfectly parallel. For this reason, the imaging systems of the right camera 11 and the left camera 12 may, for example, be relatively rotated around the optical axis due to differences in the external parameters. To reduce such rotation, the stereo camera is calibrated by the vehicle manufacturer before shipment. In calibration, the right camera 11 and the left camera 12 each photograph a distortion calibration chart.
- the calibration chart has a checkered pattern in which square white cells and black cells alternate.
- the correspondence between the pixels is specified so that the black and white square of the image of the left camera 12 matches the black and white square of the image of the right camera 11.
- the correspondence relationship is registered in a table; for example, for all the pixels of the right camera 11, the table specifies to which pixel position each pixel position before correction is converted.
- the data for correcting distortion and parallelizing the left and right images based on the internal parameters and the external parameters is held as a lookup table 28. By correcting distortion and parallelizing in this way, an efficient parallax search becomes possible.
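Applying such a precomputed lookup table amounts to sampling each source image at the stored pixel positions; a minimal sketch follows, in which the tiny image and the shift-by-one "correction" map are invented for illustration (the real lookup table 28 would encode the full distortion/parallelization mapping).

```python
# Sketch of applying a precomputed rectification lookup table like the
# lookup table 28 described above: each corrected pixel position maps to a
# source pixel position. The 4x4 "image" and the shift-by-one map are
# invented for illustration.
import numpy as np

def remap(image, map_v, map_u):
    """Build the corrected image by sampling the source at (map_v, map_u)."""
    return image[map_v, map_u]

img = np.arange(16).reshape(4, 4)      # 4x4 source "image"
v, u = np.indices(img.shape)
map_u = np.clip(u + 1, 0, 3)           # pretend correction: shift left by 1 px
rectified = remap(img, v, map_u)
print(rectified[0].tolist())           # [1, 2, 3, 3]
```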
- FIG. 5A is a diagram showing an example of parallelized image data of the right camera 11 and image data of the left camera 12.
- the vehicle width direction is the X axis
- the vehicle height direction is the Y axis
- the traveling direction is the Z axis.
- the coordinates P (x, y, z) are reflected in the pixel Pl of the left camera 12 and the pixel Pr of the right camera 11, respectively. These correspondences will be described later.
- the parallax search may be performed on the left and right image data parallelized by the distortion correction / parallelization unit 22, but the search becomes easier if edge enhancement processing is performed first. Therefore, the edge extraction unit 23 extracts edges from the left and right images.
- various edge extraction filters are known; for example, a Sobel filter is used.
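A minimal sketch of horizontal-gradient Sobel filtering for this edge extraction step, written out as a plain 2-D convolution so the example stays self-contained; the tiny test image is invented.

```python
# Sobel horizontal-gradient sketch for the edge extraction step described
# above. The convolution is written out directly; the 5x7 test image
# (dark road with a bright vertical stripe) is invented for illustration.
import numpy as np

SOBEL_U = np.array([[-1, 0, 1],
                    [-2, 0, 2],
                    [-1, 0, 1]], dtype=float)

def convolve2d_valid(img, kernel):
    """'Valid' 2-D convolution: output only where the kernel fully fits."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for v in range(out.shape[0]):
        for u in range(out.shape[1]):
            out[v, u] = np.sum(img[v:v + kh, u:u + kw] * kernel)
    return out

img = np.full((5, 7), 10.0)     # dark road surface
img[:, 3:5] = 200.0             # bright vertical stripe: a "white line"
edges = np.abs(convolve2d_valid(img, SOBEL_U))
print(edges[2].tolist())        # [0.0, 760.0, 760.0, 760.0, 760.0]
```

Both transitions (road-to-line and line-to-road) produce strong responses, matching the two edges per white line discussed later.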
- FIG. 5B is a diagram illustrating an example of the filtering process.
- the upper part of FIG. 5B is an image of the right camera 11, and the lower part of FIG. 5B is an image of the left camera 12.
- Each edge image generated from the luminance image is shown.
- white lines, guardrails, trees, and the like are photographed, and the filter processing emphasizes the edges at the white lines, guardrails, and the outlines of the trees in the images of the right camera 11 and the left camera 12 (shown as white pixels in the figure).
- FIG. 6A is an example of a diagram illustrating a parallax search.
- the parallax calculation unit 27 calculates SAD (Sum of Absolute Difference) or SSD (Sum of Squared Differences) for each region including the pixel of interest (hereinafter referred to as a window).
- SAD is the sum of the absolute values of the pixel value differences for each pixel,
- and SSD is the sum of the squares of the pixel value differences for each pixel. The smaller the value, the higher the degree of matching between the images included in the windows.
- the parallax calculation unit 27 calculates the SAD or SSD between the window in the left image and the window in the right image, then shifts the window in the right image one pixel to the right and repeats the calculation. Since the images are parallelized, the window only needs to be shifted one pixel at a time in the U direction, without shifting it in the V-axis direction in the right image.
- FIG. 6B shows an example of the relationship between the shift amount (parallax) in the U direction and SAD.
- the SAD shows a minimum value with respect to the shift amount.
- the shift amount of the pixel indicating the minimum value is the parallax at the pixel (u, v) focused on in the left image.
- the parallax calculation unit 27 may compute the SAD or SSD for all the pixels in the U direction of the right image, or, without defining a search range, may stop the SAD or SSD calculation once a minimum value equal to or less than a threshold is found. The former yields the most reliable parallax, while the latter efficiently yields a reasonably reliable parallax.
- the parallax may be obtained from the luminance image as described above, or from both the edge image and the luminance image. When both an edge image and a luminance image are used, the parallax is determined by taking the average of the pixel shift amounts at which the SAD or SSD is minimized, or by weighting the results from the edge image and the luminance image. In this way, the parallax can be obtained more accurately.
- the parallax calculation unit 27 calculates sub-pixel parallax in addition to the integer parallax in units of pixels. Equiangular fitting, parabolic fitting, and the like are known methods for obtaining sub-pixel parallax.
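The SAD search with parabolic sub-pixel refinement can be sketched in one dimension as follows; the 1-D rows stand in for rectified epipolar lines, and the window size, search range, and data are invented for illustration.

```python
# Sketch of the SAD-based parallax search with parabolic sub-pixel fitting
# described above. The 1-D "rows" stand in for rectified epipolar lines;
# window size, search range, and pixel data are invented for illustration.
import numpy as np

def sad_disparity(left, right, u, win=2, max_d=8):
    """SAD over a window around column u of the left row, matched against
    windows shifted along the right row; the integer minimum is refined by
    fitting a parabola through the three neighboring costs."""
    ref = left[u - win:u + win + 1]
    costs = np.array([np.abs(ref - right[u - d - win:u - d + win + 1]).sum()
                      for d in range(max_d)], dtype=float)
    d = int(costs.argmin())
    if 0 < d < max_d - 1:                    # parabolic sub-pixel refinement
        c0, c1, c2 = costs[d - 1], costs[d], costs[d + 1]
        denom = c0 - 2 * c1 + c2
        if denom > 0:
            d += 0.5 * (c0 - c2) / denom
    return d

row = np.array([0, 0, 0, 0, 0, 0, 0, 0, 9, 7, 9, 0], dtype=float)
left = row
right = np.roll(row, -3)     # same pattern shifted 3 px left -> disparity 3
print(sad_disparity(left, right, u=9))   # 3.0
```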
- the distance to the object shown in the pixels can be calculated in units of pixels.
- Distance = (f × m) / (n × d), where
- f: the focal length of the lens
- m: the baseline length
- n: the parallax (number of pixels)
- d: the pixel pitch of the image sensor
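A numeric sketch of the distance formula above, with assumed but plausible stereo camera values (the source gives no concrete numbers):

```python
# Numeric sketch of Distance = (f x m) / (n x d). All parameter values
# below are assumptions chosen for illustration.

def stereo_distance(f_m, baseline_m, disparity_px, pixel_pitch_m):
    """Distance to the object, all lengths in meters."""
    return (f_m * baseline_m) / (disparity_px * pixel_pitch_m)

f = 6e-3       # focal length: 6 mm (assumed)
m = 0.35       # baseline length: 35 cm (assumed)
d = 4.2e-6     # pixel pitch: 4.2 um (assumed)
n = 10         # measured parallax in pixels

print(round(stereo_distance(f, m, n, d), 3))   # 50.0 (meters)
```

Note the inverse relationship: doubling the parallax halves the distance, which is why sub-pixel parallax matters for distant objects.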
- the white line detection unit 24 detects a white line from the image captured by the right camera 11 or the left camera 12. Which image the white line is detected from is set as follows: immediately after the white line detection device 100 is started (initial setting), the image of the right camera 11 is used as the image for white line detection; when a straight line or a right curve is detected from the road parameters, the right camera 11 is used; when a left curve is detected from the road parameters, the left camera 12 is used. Thereafter, when a left curve is no longer detected from the road parameters, the device may return to the right camera 11 or may continue using the left camera 12.
- FIG. 7 is an example of a diagram illustrating a coordinate system in white line detection.
- the above-described road coordinate system in the X, Y, and Z directions and a plane coordinate system for designating plane coordinates of image data are used.
- the horizontal direction from left to right is the X axis
- the vehicle height direction is the Y axis
- the vehicle traveling direction is the Z axis.
- the planar coordinate system is assumed to be a coordinate system having its origin on the Z axis of the road coordinate system, a U axis parallel to the X axis, and a V axis parallel to the Y axis.
- the distance information z of each pixel is obtained by a stereo camera. Therefore, if a known z is used for each pixel, x and y can be obtained uniquely.
- either the right camera 11 or the left camera 12 may be used for white line detection.
- FIG. 8 is an example of a diagram schematically illustrating detection of a white line.
- FIG. 8A schematically shows image data (white line in the plane coordinate system).
- the white line detection unit 24 identifies a pixel having an edge strength equal to or higher than a threshold among the edges extracted by the edge extraction unit 23.
- FIG. 8A for each of the left and right white lines, an edge that changes from the brightness of the road to the brightness of the white line and an edge that changes from the brightness of the white line to the brightness of the road are obtained.
- the white line detection area may be limited to a part on the right side and a part on the left side of the image. Thereby, the processing load of the camera computer 13 can be reduced.
- the white line detection unit 24 repeatedly searches, from the bottom of the screen upward, for pixels whose edge intensity is equal to or greater than a threshold value. Even if the white lines are straight, they are photographed as two straight lines whose gap is wide at the bottom and narrow at the top. If white lines are present, edges are obtained from the bottom of the screen upward for each of the left and right white lines. These edges are detected continuously if a solid white line is captured well, and at intervals if the line is dotted or broken.
- the white line detection unit 24 determines whether the detected edges can be regarded as part of a white line by checking whether the edges are detected almost continuously or at regular intervals. If the edges are determined to be part of a white line, the width between the two left and right edge lines is compared with the width of a typical white line to detect the white line. Note that white lines do not necessarily exist on both the left and right, and only one of them may be detected.
- the white line detection unit 24 plots the edges on the XZ plane using the correspondence formula between the road coordinate system and the plane coordinate system (the plotting is for ease of explanation; the actual processing does not need to plot).
- FIG. 8B is an example of a diagram schematically showing an edge converted into coordinates (x, z) on the XZ plane. “X” corresponds to an edge.
- edges are detected at both ends of one white line, but the approximation accuracy may be better if a single edge line is approximated by the road model.
- therefore, the white line detection unit 24 takes, for each white line, the inner edge, the outer edge, or the midpoint of the two edges in the X direction, and plots it on the XZ plane.
- the road parameter estimation unit 25 estimates the road parameters from the edges in FIG. 8B. There are several methods for estimating the road parameters: for example, substituting the coordinates of the plotted edge lines into a road model equation and determining the coefficients of the road model by the least squares method, or using a maximum likelihood estimation method such as a Kalman filter or a particle filter.
- x = x0 + θ·z + 0.5·c·z² … (1)
- x0 is the distance from the origin of the XZ plane to the white line: x_or to the right white line when obtaining the road parameters of the right white line, and x_ol to the left white line when obtaining the road parameters of the left white line.
- θ is the angle formed by the optical axis of the right camera 11 or the left camera 12 and the white line direction (the yaw angle of the vehicle); θ is positive in the right direction with respect to the forward direction.
- c is the curvature of the road.
- equation (1) starts from x0, corrects the deviation due to the yaw angle (if the vehicle yaws to the right, the white line is photographed inclined to the left, and this is corrected accordingly), and applies the curvature c as the square of z.
- the road parameter estimation unit 25 substitutes the coordinates (x, z) of the edge line on the XZ plane into equation (1) and obtains θ and c by the least squares method. Thereby, the yaw angle θ and the road curvature c (radius R) among the road parameters can be obtained.
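The least squares fit of equation (1) can be sketched as follows; the synthetic edge coordinates and parameter values are invented for illustration.

```python
# Sketch of fitting the road model of equation (1),
#   x = x0 + theta*z + 0.5*c*z^2,
# to edge points (x, z) by least squares, as described above. The synthetic
# edge coordinates and parameter values are invented for illustration.
import numpy as np

def fit_road_model(xs, zs):
    """Least-squares estimate of (x0, theta, c) from edge points."""
    zs = np.asarray(zs, dtype=float)
    A = np.column_stack([np.ones_like(zs), zs, 0.5 * zs ** 2])
    x0, theta, c = np.linalg.lstsq(A, np.asarray(xs, dtype=float), rcond=None)[0]
    return x0, theta, c

# Synthetic right white line: offset 1.6 m, yaw 0.02 rad, curvature 1/500 m
zs = np.arange(5.0, 60.0, 5.0)
xs = 1.6 + 0.02 * zs + 0.5 * (1 / 500.0) * zs ** 2
x0, theta, c = fit_road_model(xs, zs)
print(round(x0, 3), round(theta, 3), round(1 / c))   # 1.6 0.02 500
```

The radius R is recovered as 1/c, matching the curvature/radius relationship in the road parameters.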
- the lane width W of the road may be obtained from x_or + x_ol, or from the x values of the white lines at an arbitrary z coordinate. The lateral position of the host vehicle in the lane (positive to the right) can be expressed as W/2 − x_or, taking the center position of the lane as zero (the reference).
- the processed image switching unit 26 determines whether the curve is a right curve or a left curve based on the road curvature and, if necessary, instructs the white line detection unit 24 which image to use for white line detection.
- the processed image switching unit 26 determines whether or not the following conditions are satisfied (if the absolute value of the curvature is greater than zero, the road can be estimated to be curved): a. a left curve has been detected; b. the absolute value of the curvature of the left curve is equal to or greater than threshold 1. Because threshold 1 is included in condition b, the white line detection camera need not be switched for a gentle left curve.
- since the road parameter estimation unit 25 obtains the road parameters using, for example, a Kalman filter, several frames of image data are required before the road parameters can be estimated stably; by not switching the white line detection camera in such cases, stable road parameter detection can be continued.
- the processed image switching unit 26 likewise determines whether the following conditions are satisfied: c. a right curve has been detected; d. the absolute value of the curvature of the right curve is equal to or greater than threshold 1. Under conditions a to d, after switching to the left camera 12, the image of the left camera 12 tends to continue to be used for white line detection, and after switching back to the right camera 11, the image of the right camera 11 tends to continue to be used. Therefore, frequent switching of the white line detection camera can be prevented.
- Threshold 1 may be fixed, but it is also effective to make it variable according to the vehicle speed.
- in the white line detection device 100, threshold 1 is set smaller as the vehicle speed increases. Therefore, when the vehicle speed is high, the white line detection image can be switched at an earlier stage.
- when the white line detection unit 24 uses the image of the right camera 11 preferentially for white line detection, condition d may be ignored (or its threshold 1 made smaller), so that the device switches to the image of the right camera 11 for white line detection merely when condition c, that a straight line or a right curve has been detected, holds. Conditions a and b remain unchanged. Thereby, for example, in a country or region where the left white line is relatively difficult to detect because it is not a solid line but a dotted or broken line, or because it is faint, the image of the right camera 11 can be used preferentially for white line detection.
- when the image of the left camera 12 is used preferentially, the conditions of a-b and c-d are reversed. That is, condition b is ignored, and the device switches to the image of the left camera 12 for white line detection merely when condition a, that a straight line or a left curve has been detected, holds.
- the white line detection unit 24 detects both white lines if both the left and right white lines of the host vehicle exist. If the left or right white line does not exist, it notifies the processed image switching unit 26 of identification information of the white line that could not be detected. In this case, the processed image switching unit 26 does not instruct the white line detection unit 24 to switch to the left camera 12 if the left white line is not detected, even if a left curve is detected. Similarly, the processed image switching unit 26 does not instruct the white line detection unit 24 to switch to the right camera 11 if the right white line is not detected, even if a right curve is detected.
- this prevents the white line detection camera from being switched when the left or right white line is interrupted at a merging lane, is hard to detect in rainy weather, is degraded, or is hidden by parked vehicles.
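The switching rule described above (conditions a to d, plus the suppression when the white line on the target side is missing) can be sketched as follows. The function and parameter names are illustrative; the sign convention, negative curvature for a left curve and positive for a right curve, follows equation (1) of the description:

```python
def select_camera(current, curvature, threshold,
                  left_line_found=True, right_line_found=True):
    """Return 'right' or 'left' as the camera whose next image is used
    for white line detection.  A switch requires |curvature| >= threshold
    (conditions b and d), which keeps the camera stable on gentle curves,
    and is suppressed when the white line on the target side was not
    detected."""
    if current == "right":
        # conditions a and b: left curve detected, curvature large enough
        if curvature < 0 and abs(curvature) >= threshold and left_line_found:
            return "left"
    elif current == "left":
        # conditions c and d: right curve detected, curvature large enough
        if curvature > 0 and abs(curvature) >= threshold and right_line_found:
            return "right"
    return current  # otherwise keep the current camera (hysteresis)
```

A speed-dependent threshold, smaller at higher vehicle speed as the text suggests, can simply be computed before each call.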
- in such cases, the processed image switching unit 26 does not determine whether to switch the white line detection camera.
- FIG. 9 is an example of a flowchart illustrating a procedure in which the white line detection apparatus 100 detects a white line and switches a white line detection camera.
- the stereo image acquisition unit 21 acquires images taken by the right camera 11 and the left camera 12 of the stereo camera almost simultaneously (S10).
- the stereo camera may have three or more cameras.
- the white line detection device 100 identifies an image in which the white line is detected well among the three cameras, calculates the parallax, and estimates the road parameter.
- the distortion correction / parallelization unit 22 performs distortion correction and parallelization processing on the image of the right camera 11 and the image of the left camera 12 using, for example, the lookup table 28 (S20).
- the edge extraction unit 23 performs edge extraction processing on each of the images of the right camera 11 and the left camera 12, and the white line detection unit 24 detects the white line from either the image of the right camera 11 or the image of the left camera 12 (S30). Note that detecting the white line from both the image of the left camera 12 and the image of the right camera 11 is not prohibited.
- the road parameters may then be estimated from the image in which the white line was detected more favorably. Examples of an image in which the white line was detected favorably include an image with a larger number of edges and an image in which only short gaps had to be interpolated when connecting the edges into a line in the V-axis direction.
- the road parameter estimation unit 25 estimates the road parameter from the detected white line edge (S40).
- the processed image switching unit 26 determines whether the road ahead curves to the right or left and, if necessary, switches the white line detection image from the next frame (S50). That is, when the image of the right camera 11 is being used for white line detection and a left curve is detected, the image of the left camera 12 becomes the image for white line detection.
- one or more of the driver's steering direction, steering speed, and yaw rate may be added to the determination of the road curve direction.
- along a clothoid curve, steering begins slowly and the steering speed gradually increases. Therefore, when the steering speed gradually increases while the steering direction stays constant, it can be estimated that the road ahead curves in the steering direction.
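A minimal sketch of estimating the curve direction from a steering-angle history along a clothoid entry, as described above. The sampling scheme and the `window` parameter are assumptions for illustration, not from the patent:

```python
def estimate_curve_from_steering(angles, window=4):
    """Infer the upcoming curve direction from a recent steering-angle
    history (positive = right, one sample per control cycle).  On a
    clothoid entry the driver steers in one constant direction with a
    gradually increasing angle, so the last `window` samples must share
    one sign and grow strictly in magnitude."""
    if len(angles) < window:
        return None
    recent = angles[-window:]
    signs = {1 if a > 0 else -1 for a in recent if a != 0}
    if len(signs) != 1 or any(a == 0 for a in recent):
        return None  # direction not uniform
    mags = [abs(a) for a in recent]
    if all(m2 > m1 for m1, m2 in zip(mags, mags[1:])):
        return "right" if signs.pop() > 0 else "left"
    return None  # magnitude not monotonically increasing
```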
- the curve direction can be estimated not only from the driver's steering but also from the steering angle of the vehicle itself, such as the steering angle applied by the electric power steering device when LKA or the like is active.
- by switching the white line detection image according to the road shape, the white line detection device 100 can detect the white line far ahead even when the road is curved. Since the distant white line can be detected, the accuracy of the road parameters can be improved.
- in the first embodiment, the presence and direction of a curve are determined from the road parameters; this embodiment describes a white line detection device 100 that determines the presence and direction of a curve from the white line lengths.
- FIG. 10 is an example of a diagram for comparing the lengths of the outer and inner white lines.
- let r be the radius of the inner circle and φ the predetermined angle subtending the arc. The length of the inner arc is rφ and the length of the outer arc is (r + W)φ. Therefore, when the road is curved, the outer white line is expected to be longer by Wφ, so the curve direction can be estimated by comparing the lengths of the left and right white lines.
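The arc-length argument can be checked numerically; the radius, lane width, and angle below are illustrative values, not from the patent:

```python
def arc_lengths(r, lane_width, phi):
    """Lengths of the inner and outer white-line arcs over the same
    subtended angle phi.  The outer arc is longer by exactly
    lane_width * phi, independent of the radius."""
    return r * phi, (r + lane_width) * phi

# e.g. r = 100 m, W = 3.5 m, phi = 0.2 rad:
inner, outer = arc_lengths(100.0, 3.5, 0.2)
# outer exceeds inner by W * phi = 0.7 m
```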
- FIG. 11 shows an example of a functional block diagram of the camera computer 13.
- the camera computer 13 of this embodiment has a white line length calculation unit 29.
- the white line length calculation unit 29 scans the edge positions converted onto the XZ plane, as shown in FIG. 8B, from the first edge to the last edge in the Z direction and calculates the length of the portion corresponding to the white line. That is, it traverses the edges detected as the white line while increasing the Z coordinate from near to far, computes the distance between consecutive edges, and sums those distances. Doing this for the left and right white lines gives the lengths of both.
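A sketch of the summation the white line length calculation unit 29 performs, assuming the edge positions are given as (x, z) points on the XZ plane (the function name is illustrative):

```python
import math

def polyline_length(edge_points):
    """Sum the distances between consecutive white-line edge points on
    the XZ road plane, scanned in increasing Z (near to far)."""
    pts = sorted(edge_points, key=lambda p: p[1])  # order by Z coordinate
    return sum(math.dist(p, q) for p, q in zip(pts, pts[1:]))
```

Applying this to the left and right edge sets yields the two lengths whose comparison drives the camera switch.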
- alternatively, instead of calculating and summing the distances between edges, the length of the white line can be obtained using the road model of the first embodiment.
- the road parameters being known, the road shape can be expressed from the road model and the edge coordinates by a function such as equation (1).
- the processed image switching unit 26 detects that there is a curve ahead when the lengths of the left and right white lines differ by threshold 2 or more.
- as in the first embodiment, the camera used for white line detection can be weighted (prioritized).
- the lengths of the white lines may also be compared on the UV plane instead of the XZ plane. The white line on the outer side of the curve is captured along a diagonal of the screen, so it becomes longer in this case as well.
- FIG. 12 is an example of a flowchart illustrating a procedure in which the white line detection apparatus 100 detects a white line and switches a white line detection camera.
- step S40 in FIG. 9 is replaced with step S42.
- in step S42, the white line length calculation unit 29 calculates the lengths of the left and right white lines (S42).
- the processed image switching unit 26 compares the lengths of the left and right white lines and, if necessary, switches the white line detection image from the next frame (S50). That is, when the image of the right camera 11 is being used for white line detection and a left curve is detected, the image of the left camera 12 becomes the image for white line detection.
- the white line detection camera may be switched.
- as described above, the white line detection device 100 can switch the white line detection image by estimating the road shape from the calculated white line lengths. The method of switching the image used for white line detection has been described by way of embodiments; however, the present invention is not limited to the above embodiments, and various modifications and improvements can be made within its scope.
Abstract
Description
(1) a method of fixing the image used for white line detection to either the image taken by the left camera or the image taken by the right camera
(2) a method of detecting the white line from the image region captured in common by the left and right cameras
There are these two methods (see, for example, Patent Document 1). Patent Document 1 discloses an image processing device that calculates three-dimensional position information from images taken by a stereo camera and detects the white lines present on the road surface from each of the left and right images.
12 left camera
13 camera computer
21 stereo image acquisition unit
22 distortion correction/parallelization unit
23 edge extraction unit
24 white line detection unit
25 road parameter estimation unit
26 processed image switching unit
27 parallax calculation unit
29 white line length calculation unit
100 white line detection device
200 driving support ECU
300 actuation device
FIG. 3 shows an example of a schematic configuration diagram of a driving support system including the white line detection device 100. The driving support system 500 has the white line detection device 100, a driving support ECU (Electronic Control Unit) 200, and an actuation device 300, connected to one another via an in-vehicle LAN such as a CAN (Controller Area Network) bus. The white line detection device 100 corresponds to the lane marking detection device in the claims.
FIG. 4 shows an example of a functional block diagram of the camera computer 13. The camera computer 13 has a stereo image acquisition unit 21, a distortion correction/parallelization unit 22, an edge extraction unit 23, a white line detection unit 24, a road parameter estimation unit 25, a processed image switching unit 26, and a parallax calculation unit 27. The stereo image acquisition unit 21 acquires the image data of the images periodically captured by the right camera 11 and by the left camera 12, and stores them in a buffer or the like.
The distortion correction/parallelization unit 22 corrects the distortion of the left and right images and parallelizes them using the external and internal camera parameters acquired in advance. The internal parameters include, for example, the lens distortion and distortion aberration of each of the right camera 11 and the left camera 12, the distortion of the image sensor, and the focal length. Lens distortion is reduced, for example, by correcting the image data with reference to a correction conversion table generated from the lens design values. Distortion aberration is reduced by correcting the image data based on parameter estimation using a model of radial distortion.
When detecting the parallax, a search is made for the pixel Pr corresponding to the pixel Pl. Since the two images are known to be parallelized, the white line detection device 100 only needs to search the image data along the horizontal direction.
FIG. 6(a) is an example of a diagram explaining the parallax search. Here, focusing on a pixel of the left image data, a region of the right image data that matches the left image data (has the highest correlation, or a correlation above a certain level) is identified. The parallax calculation unit 27 calculates the SAD (Sum of Absolute Differences) or SSD (Sum of Squared Differences) for each region containing the pixel of interest (hereinafter, a window). The SAD is the sum of the absolute values of the per-pixel differences, and the SSD is the sum of the squared per-pixel differences. In either case, the smaller the value, the higher the degree of match between the images contained in the windows.
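A minimal sketch of the SAD window search along a rectified row, as described above. The nested-list window representation, the tiny integer pixel values, and the function names are illustrative assumptions:

```python
def sad(win_a, win_b):
    """Sum of absolute differences between two equal-size windows,
    each given as a list of rows of pixel values."""
    return sum(abs(x - y) for ra, rb in zip(win_a, win_b)
               for x, y in zip(ra, rb))

def match_along_row(left_win, right_strip, win_w):
    """Search only horizontally (the images are rectified), returning
    the column in the right strip whose window minimises the SAD
    against the left window."""
    best_u, best_cost = 0, float("inf")
    for u in range(len(right_strip[0]) - win_w + 1):
        cost = sad(left_win, [row[u:u + win_w] for row in right_strip])
        if cost < best_cost:
            best_u, best_cost = u, cost
    return best_u
```

The difference between the matched column and the column of the pixel of interest in the left image gives the disparity n used in the distance formula below.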
distance = (f × m) / (n × d)
where f is the focal length of the lens, m is the baseline length, n is the parallax (number of pixels), and d is the pixel pitch of the image sensor.
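The distance formula can be written directly; the unit handling (all lengths in millimetres, result converted to metres) and the numeric values in the comment are illustrative assumptions:

```python
def disparity_to_distance_m(f_mm, baseline_mm, disparity_px, pixel_pitch_mm):
    """distance = (f * m) / (n * d): focal length f, baseline m,
    disparity n in pixels, pixel pitch d.  With all lengths in mm the
    quotient is in mm; divide by 1000 for metres."""
    return (f_mm * baseline_mm) / (disparity_px * pixel_pitch_mm) / 1000.0

# e.g. f = 8 mm, m = 300 mm, n = 20 px, d = 0.006 mm  ->  20 m
```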
The white line detection unit 24 detects the white line from an image taken by the right camera 11 or the left camera 12. Which image the white line is detected from is set as follows.
(i) Immediately after startup of the white line detection device 100 (initial setting), when the image of the right camera 11 is the image for white line detection:
- when a straight line or a right curve is detected from the road parameters: right camera 11
- when a left curve is detected from the road parameters: left camera 12
Thereafter, when a left curve is no longer detected from the road parameters, the device may return to the right camera 11 or stay with the left camera 12.
(ii) Immediately after startup of the white line detection device 100 (initial setting), when the image of the left camera 12 is the image for white line detection:
- when a straight line or a left curve is detected from the road parameters: left camera 12
- when a right curve is detected from the road parameters: right camera 11
Thereafter, when a right curve is no longer detected from the road parameters, the device may return to the left camera 12 or stay with the right camera 11.
v=-(y/z)・f
u=(x/z)・f
Since the height of the stereo camera is fixed, the height to the road surface on which the white line is marked can also be regarded as constant, and z in the road coordinate system can be obtained from v in the image plane coordinates. Also, x at the z obtained from v can be calculated using u in the image plane coordinate system.
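A sketch of this back-projection, assuming a pinhole model with focal length f in pixels and a point lying on the road plane at y = -camera height (names and values are illustrative):

```python
def image_to_road(u, v, f, cam_height):
    """Invert v = -(y/z)*f for points on the road plane y = -cam_height:
    z = f * cam_height / v, then x = u * z / f from u = (x/z)*f.
    Assumes v > 0, i.e. the point lies below the optical axis (on the
    road ahead)."""
    z = f * cam_height / v
    x = u * z / f
    return x, z
```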
The road parameter estimation unit 25 estimates the road parameters from the edges in FIG. 8(b). There are several methods for estimating the road parameters: substituting the coordinates of a plurality of edge lines into a road model equation and identifying the coefficients of the road model by the least squares method, or estimating the road parameters with a maximum-likelihood estimation technique such as a Kalman filter or a particle filter.
x = x0 + θ × z + 0.5 × c × z² …(1)
x0 is the distance xor from the origin of the XZ plane to the right white line when obtaining the road parameters of the right white line, and the distance xol from the origin of the XZ plane to the left white line when obtaining those of the left white line. θ is the angle between the optical axis of the right camera 11 or left camera 12 and the direction of the white line (the yaw angle of the vehicle), positive to the right of the forward direction. c is the curvature of the road; by the definition of the XZ plane, c is negative for a left curve and positive for a right curve. According to equation (1), starting from x0, the deviation due to the yaw angle is corrected (when the vehicle has a rightward yaw, the white line is photographed inclined to the left, and this is compensated), and the curvature c acts with the square of z.
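Equation (1) evaluated as code; the parameter names are illustrative:

```python
def road_model_x(z, x0, yaw, curvature):
    """Equation (1): lateral white-line position at distance z,
    x = x0 + theta*z + 0.5*c*z^2, with theta (yaw) positive to the
    right and c negative for a left curve, positive for a right curve."""
    return x0 + yaw * z + 0.5 * curvature * z * z
```

Fitting x0, θ, and c to the detected edge coordinates (e.g. by least squares, as the text mentions) yields the road parameters used for the curve decision.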
When the road parameters have been obtained, the processed image switching unit 26 determines from the road curvature whether the curve is to the right or left and, if switching is necessary, instructs the white line detection unit 24 which image to use for white line detection.
a. A left curve is detected
b. The absolute value of the curvature of the left curve is greater than or equal to threshold 1
Because condition b includes threshold 1, the white line detection camera need not be switched for a gentle left curve. When the road parameter estimation unit 25 obtains the road parameters with, for example, a Kalman filter, several frames of image data are needed before the road parameters can be estimated stably, so by not switching the white line detection camera, stable road parameter estimation can be continued.
c. A right curve is detected
d. The absolute value of the curvature of the right curve is greater than or equal to threshold 1
Under the above conditions a to d, after switching to the left camera 12, the image of the left camera 12 tends to remain in use for white line detection, and after switching back to the right camera 11, the image of the right camera 11 tends to remain in use. Frequent switching of the white line detection camera can therefore be prevented.
FIG. 9 is an example of a flowchart showing the procedure by which the white line detection device 100 detects the white line and switches the white line detection camera.
FIG. 10 is an example of a diagram comparing the lengths of the outer and inner white lines. Let r be the radius of the inner circle and φ the predetermined angle subtending the arc. The length of the inner arc is rφ and that of the outer arc is (r + W)φ. Therefore, when the road is curved, the outer white line is expected to be longer by Wφ. Thus the curve direction can be estimated by comparing the lengths of the left and right white lines.
L = ∫√{1 + (dx/dz)²} dz
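A sketch of evaluating this arc-length integral numerically, with dx/dz = θ + c·z taken from road model (1); the trapezoidal rule and step count are implementation assumptions:

```python
import math

def white_line_length(z_max, yaw, curvature, steps=1000):
    """Evaluate L = integral of sqrt(1 + (dx/dz)^2) dz over [0, z_max]
    with dx/dz = yaw + curvature*z, using the trapezoidal rule."""
    dz = z_max / steps
    total = 0.0
    for i in range(steps):
        z0, z1 = i * dz, (i + 1) * dz
        f0 = math.sqrt(1.0 + (yaw + curvature * z0) ** 2)
        f1 = math.sqrt(1.0 + (yaw + curvature * z1) ** 2)
        total += 0.5 * (f0 + f1) * dz
    return total
```

On a straight road (θ = 0, c = 0) the integrand is 1 and L equals the visible range z_max; any curvature makes the line longer, consistent with the arc comparison above.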
When the lengths of the left and right white lines differ by threshold 2 or more, the processed image switching unit 26 detects that there is a curve ahead. That is, when the left white line is longer than the right white line by threshold 2 or more, a right curve is determined, and when the right white line is longer than the left white line by threshold 2 or more, a left curve is determined. As in the first embodiment, the camera used for white line detection can be weighted.
Claims (8)
- A plurality of photographing means for photographing a lane marking that divides the lane in which the host vehicle travels;
lane marking detection means for detecting a lane marking from at least one piece of the image data respectively generated by the plurality of photographing means;
road shape estimation means for estimating a road shape from the lane marking detected by the lane marking detection means; and
image data determination means for determining, according to the road shape estimated by the road shape estimation means, the image data used by the lane marking detection means to detect the lane marking;
a lane marking detection device comprising the above. - When the road shape estimated by the road shape estimation means indicates that the road ahead is curved, the image data determination means determines the image data generated by the photographing means closest to the curve direction, among the plurality of photographing means mounted on the vehicle spaced apart in the vehicle width direction, as the image data used by the lane marking detection means to detect the lane marking,
the lane marking detection device according to claim 1. - The road shape estimation means estimates the curvature of the road as the road shape, and
when the curvature of the road is greater than or equal to a threshold, the image data determination means determines the image data generated by the photographing means closest to the curve direction, among the plurality of photographing means, as the image data used by the lane marking detection means to detect the lane marking,
the lane marking detection device according to claim 2. - The road shape estimation means estimates the curvature of the road as the road shape,
among the plural pieces of image data respectively generated by the plurality of photographing means, the image data to be preferentially used by the lane marking detection means to detect the lane marking is predetermined,
when the lane marking detection means is detecting the lane marking from the preferentially used image data and the curvature of the road is greater than or equal to a threshold, the image data determination means determines the image data generated by the photographing means closest to the curve direction, among the plurality of photographing means, as the image data used by the lane marking detection means to detect the lane marking, and
when the lane marking detection means is detecting the lane marking from image data other than the preferentially used image data and the curvature of the road is small enough that the road can be regarded as straight, the image data determination means determines the preferentially used image data as the image data for detecting the lane marking,
the lane marking detection device according to claim 2. - The image data determination means estimates the road shape based on the steering angle of the vehicle in addition to the lane marking detected by the lane marking detection means,
the lane marking detection device according to any one of claims 1 to 4. - Length calculation means for calculating the length of the lane marking detected by the lane marking detection means is provided, and
the image data determination means estimates the road shape according to a comparison of the lengths of the left and right lane markings of the host vehicle calculated by the length calculation means,
the lane marking detection device according to claim 1. - The road shape estimation means estimates the width of the road, the position of the host vehicle in the width direction, and the yaw angle with respect to the lane marking,
the lane marking detection device according to claim 1. - A driving support system in which a lane marking detection device and a driving support device are connected via an in-vehicle network, wherein
the lane marking detection device comprises a plurality of photographing means for photographing a lane marking that divides the lane in which the host vehicle travels,
lane marking detection means for detecting a lane marking from at least one piece of the image data respectively generated by the plurality of photographing means,
road shape estimation means for estimating a road shape from the lane marking detected by the lane marking detection means, and
image data determination means for determining, according to the road shape estimated by the road shape estimation means, the image data used by the lane marking detection means to detect the lane marking; and
the driving support device performs lane departure prevention support for the host vehicle based on the road width, the position of the host vehicle in the width direction, and the yaw angle with respect to the lane marking estimated by the road shape estimation means.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201280073463.XA CN104335264A (zh) | 2012-06-14 | 2012-06-14 | Lane marking detection device, driving support system |
PCT/JP2012/065275 WO2013186903A1 (ja) | 2012-06-14 | 2012-06-14 | Lane marking detection device, driving support system |
EP12879087.0A EP2863374A4 (en) | 2012-06-14 | 2012-06-14 | CIRCULATION PATH SEPARATION MARKING DETECTION APPARATUS, AND DRIVER ASSISTANCE SYSTEM |
US14/407,645 US20150165973A1 (en) | 2012-06-14 | 2012-06-14 | Lane Separation Mark Detection Apparatus and Drive Support System |
JP2014521066A JP5880703B2 (ja) | 2012-06-14 | 2012-06-14 | Lane marking detection device, driving support system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2012/065275 WO2013186903A1 (ja) | 2012-06-14 | 2012-06-14 | Lane marking detection device, driving support system |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013186903A1 true WO2013186903A1 (ja) | 2013-12-19 |
Family
ID=49757765
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2012/065275 WO2013186903A1 (ja) | 2012-06-14 | 2012-06-14 | 車線区分標示検出装置、運転支援システム |
Country Status (5)
Country | Link |
---|---|
US (1) | US20150165973A1 (ja) |
EP (1) | EP2863374A4 (ja) |
JP (1) | JP5880703B2 (ja) |
CN (1) | CN104335264A (ja) |
WO (1) | WO2013186903A1 (ja) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2015210764A (ja) * | 2014-04-30 | 2015-11-24 | 日産自動車株式会社 | Travel lane recognition device and travel lane recognition method |
JP2015219569A (ja) * | 2014-05-14 | 2015-12-07 | 株式会社デンソー | Boundary line recognition device and boundary line recognition program |
JP2017102928A (ja) * | 2015-12-03 | 2017-06-08 | Robert Bosch Gmbh | Lean detection on a two-wheeled vehicle |
JP2021508901A (ja) * | 2018-05-31 | 2021-03-11 | Shanghai Sensetime Intelligent Technology Co., Ltd. | Lane-line-based intelligent driving control method and apparatus, and electronic device |
JP7437512B2 (ja) | 2020-01-06 | 2024-02-22 | ルミナー,エルエルシー | Lane detection and tracking method for an imaging system |
Families Citing this family (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EA031127B1 (ru) * | 2011-03-29 | 2018-11-30 | Юра Трейд, Лимитед | Method and apparatus for generating and authenticating secure documents |
CN104736416B (zh) * | 2012-10-04 | 2016-08-24 | 日产自动车株式会社 | Steering control device |
DE102012112104A1 (de) * | 2012-12-11 | 2014-06-12 | Conti Temic Microelectronic Gmbh | Method and device for drivability analysis |
US9424475B1 (en) * | 2014-09-17 | 2016-08-23 | Google Inc. | Construction object detection |
DE112015006622T5 (de) * | 2015-06-15 | 2018-03-29 | Mitsubishi Electric Corporation | Lane determination device and lane determination method |
KR20190012370A (ko) * | 2017-07-27 | 2019-02-11 | 삼성에스디에스 주식회사 | Lane change support method and apparatus |
JP6965739B2 (ja) * | 2017-12-27 | 2021-11-10 | 株式会社デンソー | Vehicle control device |
KR102595897B1 (ko) | 2018-08-27 | 2023-10-30 | 삼성전자 주식회사 | Lane determination method and apparatus |
KR102442230B1 (ko) * | 2018-09-30 | 2022-09-13 | 그레이트 월 모터 컴퍼니 리미티드 | Method for constructing a driving coordinate system and application thereof |
KR20200040374A (ko) * | 2018-10-10 | 2020-04-20 | 삼성전자주식회사 | Distance estimation method and apparatus |
US10735716B2 (en) * | 2018-12-04 | 2020-08-04 | Ford Global Technologies, Llc | Vehicle sensor calibration |
FR3094317B1 (fr) * | 2019-04-01 | 2021-03-05 | Renault Sas | Anticipator module, real-time trajectory control device and associated method |
KR20210148756A (ko) * | 2020-06-01 | 2021-12-08 | 삼성전자주식회사 | Slope estimation apparatus and operation method thereof |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001250199A (ja) * | 2000-03-07 | 2001-09-14 | Toyota Central Res & Dev Lab Inc | Travel course estimation device |
JP2005343417A (ja) * | 2004-06-07 | 2005-12-15 | Auto Network Gijutsu Kenkyusho:Kk | Parking assist device |
JP2008225822A (ja) * | 2007-03-13 | 2008-09-25 | Toyota Motor Corp | Road lane marking detection device |
JP2009041972A (ja) | 2007-08-07 | 2009-02-26 | Toshiba Corp | Image processing device and method |
JP2010069922A (ja) * | 2008-09-16 | 2010-04-02 | Toyota Motor Corp | Lane recognition device |
Family Cites Families (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE69635569T2 (de) * | 1995-04-25 | 2006-08-10 | Matsushita Electric Industrial Co., Ltd., Kadoma | Device for determining the local position of a car on a road |
JPH0995194A (ja) * | 1995-09-29 | 1997-04-08 | Aisin Seiki Co Ltd | Device for detecting objects ahead of a vehicle |
US7038577B2 (en) * | 2002-05-03 | 2006-05-02 | Donnelly Corporation | Object detection system for vehicle |
JP4134939B2 (ja) * | 2004-04-22 | 2008-08-20 | 株式会社デンソー | Vehicle periphery display control device |
WO2006121087A1 (ja) * | 2005-05-10 | 2006-11-16 | Olympus Corporation | Image processing device, image processing method, and image processing program |
US7561032B2 (en) * | 2005-09-26 | 2009-07-14 | Gm Global Technology Operations, Inc. | Selectable lane-departure warning system and method |
CN1804928A (zh) * | 2005-11-24 | 2006-07-19 | 上海交通大学 | Machine-vision-based method for estimating local lane geometry and vehicle position |
JP5124875B2 (ja) * | 2008-03-12 | 2013-01-23 | 本田技研工業株式会社 | Vehicle travel support device, vehicle, and vehicle travel support program |
US8311283B2 (en) * | 2008-07-06 | 2012-11-13 | Automotive Research&Testing Center | Method for detecting lane departure and apparatus thereof |
JP4730406B2 (ja) * | 2008-07-11 | 2011-07-20 | トヨタ自動車株式会社 | Travel support control device |
US8364334B2 (en) * | 2008-10-30 | 2013-01-29 | Honeywell International Inc. | System and method for navigating an autonomous vehicle using laser detection and ranging |
US9459515B2 (en) * | 2008-12-05 | 2016-10-04 | Mobileye Vision Technologies Ltd. | Adjustable camera mount for a vehicle windshield |
JP5441549B2 (ja) * | 2009-07-29 | 2014-03-12 | 日立オートモティブシステムズ株式会社 | Road shape recognition device |
JP5286214B2 (ja) * | 2009-09-30 | 2013-09-11 | 日立オートモティブシステムズ株式会社 | Vehicle control device |
DE102011014139B4 (de) * | 2010-03-17 | 2023-04-06 | Hl Klemove Corp. | Method and system for lane keeping control |
2012
- 2012-06-14 WO PCT/JP2012/065275 patent/WO2013186903A1/ja active Application Filing
- 2012-06-14 US US14/407,645 patent/US20150165973A1/en not_active Abandoned
- 2012-06-14 JP JP2014521066A patent/JP5880703B2/ja active Active
- 2012-06-14 CN CN201280073463.XA patent/CN104335264A/zh active Pending
- 2012-06-14 EP EP12879087.0A patent/EP2863374A4/en not_active Withdrawn
Non-Patent Citations (1)
Title |
---|
See also references of EP2863374A4 |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2015210764A (ja) * | 2014-04-30 | 2015-11-24 | 日産自動車株式会社 | Travel lane recognition device and travel lane recognition method |
JP2015219569A (ja) * | 2014-05-14 | 2015-12-07 | 株式会社デンソー | Boundary line recognition device and boundary line recognition program |
JP2017102928A (ja) * | 2015-12-03 | 2017-06-08 | Robert Bosch Gmbh | Lean detection on a two-wheeled vehicle |
JP7179440B2 (ja) | 2015-12-03 | 2022-11-29 | Robert Bosch Gmbh | Lean detection on a two-wheeled vehicle |
JP2021508901A (ja) * | 2018-05-31 | 2021-03-11 | Shanghai Sensetime Intelligent Technology Co., Ltd. | Lane-line-based intelligent driving control method and apparatus, and electronic device |
JP7024115B2 (ja) | 2018-05-31 | 2022-02-22 | Shanghai Sensetime Intelligent Technology Co., Ltd. | Lane-line-based intelligent driving control method and apparatus, and electronic device |
US11314973B2 | 2018-05-31 | 2022-04-26 | Shanghai Sensetime Intelligent Technology Co., Ltd. | Lane line-based intelligent driving control method and apparatus, and electronic device |
JP7437512B2 (ja) | 2020-01-06 | 2024-02-22 | ルミナー,エルエルシー | Lane detection and tracking method for an imaging system |
Also Published As
Publication number | Publication date |
---|---|
EP2863374A1 (en) | 2015-04-22 |
JPWO2013186903A1 (ja) | 2016-02-01 |
JP5880703B2 (ja) | 2016-03-09 |
CN104335264A (zh) | 2015-02-04 |
EP2863374A4 (en) | 2016-04-20 |
US20150165973A1 (en) | 2015-06-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5880703B2 (ja) | Lane marking detection device and driving support system | |
JP5829980B2 (ja) | Roadside object detection device | |
JP5276637B2 (ja) | Lane estimation device | |
EP2889641B1 (en) | Image processing apparatus, image processing method, program and image processing system | |
JP6459659B2 (ja) | Image processing device, image processing method, driving support system, and program | |
JP5399027B2 (ja) | Device having a system capable of capturing stereoscopic images for driving assistance of an automobile | |
JP3822515B2 (ja) | Obstacle detection device and method | |
JP4930046B2 (ja) | Road surface discrimination method and road surface discrimination device | |
TWI401175B (zh) | Dual vision front vehicle safety warning device and method thereof | |
JP5561064B2 (ja) | Object recognition device for vehicle | |
EP1403615A2 (en) | Apparatus and method for processing stereoscopic images | |
KR20200000953A (ko) | Around view monitoring system and camera tolerance correction method | |
JP2015148887A (ja) | Image processing device, object recognition device, mobile device control system, and object recognition program | |
WO2011016257A1 (ja) | Distance calculation device for vehicle | |
JP7229032B2 (ja) | Vehicle-exterior object detection device | |
JP2007310591A (ja) | Image processing device and parking lot determination method | |
CN110659553A | Marking line detection device for vehicle | |
JP4900377B2 (ja) | Image processing device | |
WO2020121758A1 (ja) | Imaging unit control device | |
US20240103525A1 (en) | Vehicle and control method thereof | |
JP5129094B2 (ja) | Vehicle periphery monitoring device | |
JP7166096B2 (ja) | Image processing device and image processing method | |
JP4381394B2 (ja) | Obstacle detection device and method | |
JP4471881B2 (ja) | Obstacle recognition device and obstacle recognition method | |
JP2014181957A (ja) | In-vehicle camera calibration device, in-vehicle camera calibration method, program, and medium | |
Legal Events
Code | Title | Description
---|---|---
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 12879087; Country of ref document: EP; Kind code of ref document: A1
ENP | Entry into the national phase | Ref document number: 2014521066; Country of ref document: JP; Kind code of ref document: A
REEP | Request for entry into the european phase | Ref document number: 2012879087; Country of ref document: EP
WWE | Wipo information: entry into national phase | Ref document number: 2012879087; Country of ref document: EP
WWE | Wipo information: entry into national phase | Ref document number: 14407645; Country of ref document: US
NENP | Non-entry into the national phase | Ref country code: DE