CN106157289A - Line detecting method and equipment - Google Patents

Line detecting method and equipment

Info

Publication number
CN106157289A
Authority
CN
China
Prior art keywords
line
model
detected
segment
line segment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201510163192.2A
Other languages
Chinese (zh)
Other versions
CN106157289B (en)
Inventor
贺娜
陈超
李静雯
师忠超
鲁耀杰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ricoh Co Ltd
Original Assignee
Ricoh Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ricoh Co Ltd filed Critical Ricoh Co Ltd
Priority to CN201510163192.2A
Publication of CN106157289A
Application granted
Publication of CN106157289B
Expired - Fee Related
Anticipated expiration

Landscapes

  • Image Analysis (AREA)

Abstract

A line detection method and apparatus are provided. The method includes: extracting, in a current frame image, line segments that conform to the features of the line to be detected; acquiring a pre-established line model; updating the line model based on the extracted line segments; and determining the line to be detected according to the updated line model. The method and apparatus can greatly reduce the influence of occlusion and noise on the updating of the line model, thereby improving the accuracy of line detection.

Description

Line detection method and apparatus
Technical Field
The present invention relates generally to image processing, and more particularly to line detection methods and apparatus.
Background
Line detection technology has wide application in the field of image processing, for example, lane line detection is an important application of line detection technology. The existing lane line detection technology can be divided into two types: feature-based methods and model-based methods.
The feature-based approach locates lane lines through a combination of low-level image features. However, such methods impose no global constraint on the edge shape of the lane line and are therefore strongly affected by occlusion and noise.
Compared with feature-based methods, model-based methods represent the lane lines with a set of parameters when detecting them, and are therefore more robust to occlusion and noise. The basic principle of a model-based method is to update a lane line model according to input features and then obtain the lane line to be detected from the updated model. In general, the input features are feature points detected based on the parameters representing the lane lines, such as edge points or filtered disparity points. When the model is updated, feature points are selected for each lane line in the model, and the model is updated by performing line fitting with the selected points. The selection principle is usually based on the distance of a feature point from each lane line in the model. However, the detected feature points are not always accurate and may include noise points. For example, fig. 1 illustrates a case where noise points are included in the detected feature points. Therefore, when too many feature points are selected, and in particular when the selected points include noise points, the update of the lane line model is disturbed, leading to inaccurate lane line detection results.
Disclosure of Invention
The present disclosure aims to solve at least the above problems. Specifically, an object of the present disclosure is to provide a line detection technology, by which the influence of occlusion and noise on updating of a line model can be greatly reduced, thereby improving the accuracy of line detection.
According to an embodiment of the present invention, there is provided a line detection method including: extracting line segments which accord with the characteristics of the lines to be detected in the current frame image; acquiring a pre-established line model; updating the line model based on the extracted line segments; and determining the line to be detected according to the updated line model.
According to another embodiment of the present invention, there is provided a line detecting apparatus including: a line segment extraction section configured to extract a line segment conforming to a feature of a line to be detected in the current frame image; a model acquisition section configured to acquire a line model established in advance; an updating section configured to update the line model based on the extracted line segment; a detection component configured to determine a line to be detected according to the updated line model.
According to the line detection technique of the embodiments of the present invention, model fitting is performed using feature line segments instead of feature points to update the line model. Because both the position distance and the direction distance between a feature line segment and each line in the line model are considered when segments are selected for model fitting, the possibility of selecting a noise segment to update the line model is greatly reduced, the influence of noise on the update of the line model is reduced, and the accuracy of line detection is improved.
Drawings
The above and other objects, features and advantages of the present disclosure will become more apparent by describing in more detail embodiments of the present disclosure with reference to the attached drawings. The accompanying drawings are included to provide a further understanding of the embodiments of the disclosure and are incorporated in and constitute a part of this specification, illustrate embodiments of the disclosure and together with the description serve to explain the principles of the disclosure and not to limit the disclosure. In the drawings, like reference numbers generally represent like parts or steps.
Fig. 1 illustrates an example case where noise points are included in detected feature points.
Fig. 2 shows a flow chart of a line detection method according to an embodiment of the invention.
Fig. 3 shows an example of line segments extracted in the line detection method according to the embodiment of the present invention in the case where the line to be detected is a lane line.
Fig. 4 shows an example of a lane line model.
FIG. 5 illustrates a flow diagram of a process for updating a line model based on extracted line segments, according to one embodiment of the invention.
Fig. 6 illustrates a functional configuration block diagram of a line detecting apparatus according to an embodiment of the present invention.
FIG. 7 illustrates an overall hardware block diagram of a line detection system according to an embodiment of the invention.
Detailed Description
The technical solutions in the embodiments of the present disclosure will be clearly and completely described below with reference to the drawings in the embodiments of the present disclosure, and it is obvious that the described embodiments are only a part of the embodiments of the present disclosure, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments disclosed herein without making any creative effort, shall fall within the protection scope of the present disclosure.
Fig. 2 shows a flow chart of a line detection method according to an embodiment of the invention. For convenience of explanation, the following description will be given taking an example in which the line to be detected is a lane line.
As shown in fig. 2, in step S201, a line segment that conforms to the feature of a line to be detected is extracted in the current frame image.
The characteristic of the line to be detected may be any characteristic capable of characterizing the line. For example, it may be (but is not limited to) a color, grayscale, shape, edge, parallax, etc. feature of a line, or any combination of these features.
In this step, line segments that conform to the features of the line to be detected may be extracted in any suitable manner. As one example, such line segments may be detected directly in the captured current frame image by a straight-line detection method such as the Hough transform. As another example, feature points may first be detected in the current frame image by a detection method corresponding to the feature of the line to be detected (for example, if the feature is an edge feature, the feature points may be detected by an edge detection method), and line segments may then be fitted to the detected feature points. Fig. 3 shows an example of line segments extracted in this step in the case where the line to be detected is a lane line; the bold white segments in the figure represent the extracted line segments.
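As a rough illustration of the first option, the following sketch extracts candidate segments with edge detection followed by a probabilistic Hough transform; the OpenCV parameter values are illustrative assumptions, not values given by the patent.

```python
import cv2
import numpy as np

def extract_segments(frame_gray):
    # Edge features, then probabilistic Hough transform for segments (step S201).
    edges = cv2.Canny(frame_gray, 50, 150)
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=50,
                            minLineLength=30, maxLineGap=10)
    # Each entry is (x1, y1, x2, y2), the endpoints of one candidate segment.
    return [] if lines is None else [tuple(l[0]) for l in lines]
```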
In step S202, a pre-established line model is acquired.
Taking the example in which the line to be detected is a lane line, the line model is accordingly a lane line model. The lane line model may be pre-established using any means known in the art. How the lane line model is built is not critical to the present invention; a brief description of an exemplary building method is given below only for completeness.
To establish the lane line model, lane line detection is performed first. Here, the lane lines may be detected in any existing manner. For example, as a commonly used method, the CHEVP algorithm used in the initialization of the B-Snake model may be applied to detect lane lines in captured image frames. As a more basic approach, lane lines may instead be determined by manual labeling in the image frames.
After the lane line is detected, a lane line model may be established from the lane line. There are many types of existing lane line models, including linear models, isolated point models, parabolic models and their extensions, hyperbolic models, clothoid models, spline models, Snake models, 3D models, and so on. Those skilled in the art can build any type of lane line model according to specific needs. In the present example, a polynomial is employed to represent the lane line model.
Specifically, as shown in fig. 4, it is assumed that the center line of a certain lane is $L_{mid}$, the left lane line of that lane is $L_{-1}$ and its right lane line is $L_1$, the left lane line of the $n$-th lane on the left side is $L_{-n}$, and the left lane line of the $n$-th lane on the right side is $L_n$; the width of the lane is $w_0$, the width of the $n$-th lane on the left is $w_{-n}$, and the width of the $n$-th lane on the right is $w_n$; $vp$ denotes the ordinate of the position where the lane lines vanish; $f$ is the focal length of the lens, and $H$ is the height of the camera.
$L_{mid}$ is expressed using a polynomial as shown in expression (1):

$$x_m = \sum_{i=0}^{n} a_i y^i \qquad (1)$$
The lane line model can then be expressed as:

$$L_j:\quad x = \begin{cases} x_m - (y - vp)\Big(\tfrac{1}{2}k_{w_0} + \sum_{i=j}^{-1} k_{w_i}\Big) & (j < 0) \\[4pt] x_m & (j = 0) \\[4pt] x_m + (y - vp)\Big(\tfrac{1}{2}k_{w_0} + \sum_{i=1}^{j} k_{w_i}\Big) & (j > 0) \end{cases} \qquad (2)$$

where

$$k_{w_i} = \frac{f^2 w_i}{H\,(f^2 + vp^2)} \qquad (3)$$
As described above, lane line detection is performed before the lane line model is established, so the values of $a_i$ and $w_i$ ($i = 0, 1, \ldots, n$) can be calculated by substituting the detection result into expressions (1), (2) and (3). This completes the establishment of the lane line model.
The lane line model shown in the above expression (2) is merely an example, and other forms of expressions may be adopted. For example, the lane line model expression may be established not using the lane center line but directly using the left and right lane lines of the lane. More specifically, the polynomial shown in expression (1) can be directly used to represent the left or right lane line of a certain lane, instead of the center line of the lane, and the expression of each lane in expression (2) is adjusted accordingly.
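As a concrete reading of expressions (1)-(3), the following sketch evaluates the x-coordinate of lane line $L_j$ at a given image row; the function name, the packing of parameters, and all values are illustrative assumptions.

```python
import numpy as np

def lane_x(j, y, a, w, vp, f, H):
    """x-coordinate of lane line L_j at image row y (expression (2)).
    a: coefficients a_0..a_n of the center line polynomial (expression (1)).
    w: dict of lane widths w_i keyed by signed lane index i."""
    x_m = sum(a_i * y ** i for i, a_i in enumerate(a))        # expression (1)
    k = lambda i: f ** 2 * w[i] / (H * (f ** 2 + vp ** 2))    # expression (3)
    if j == 0:
        return x_m
    idx = range(1, j + 1) if j > 0 else range(j, 0)           # i = 1..j or j..-1
    offset = 0.5 * k(0) + sum(k(i) for i in idx)
    return x_m + np.sign(j) * (y - vp) * offset
```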
An exemplary way of building the lane line model has been briefly described above. In this step S202, a lane line model established in advance in this or any other manner is acquired. As an example, when lane lines are detected continuously in a series of video frames and the current frame image is not the first frame, the pre-established line model may be the lane line model used to detect lane lines in the previous frame image. It should be noted that, in actual operation, when lane lines are detected continuously in a series of video frames, the line model is usually established and verified over the first N frames of the sequence (for example, established in the first frame and verified in the subsequent N−1 frames, where N is an integer greater than 1 whose value can be set according to specific needs), rather than used directly after being established in the first frame. Therefore, as another example, when the current frame image is a frame after the N-th frame, the pre-established line model may be the lane line model used to detect lane lines in the previous frame image.
In step S203, the line model is updated based on the extracted line segment.
It can be understood that since the line segments extracted in step S201 are in conformity with the features of the line to be detected, these line segments are candidate line segments forming the line to be detected, that is, the line to be detected can be obtained based on these line segments. Specifically, in this step, the line model acquired in step S202 will be updated with the extracted line segment.
As a basic updating method, model fitting can be directly performed by using each extracted line segment, so as to obtain an updated line model.
Model fitting can be performed using the usual least-squares method. For example, for the lane line model of expression (2), a fit that minimizes the following expression (4) may be selected, and the $a_i$ and $w_i$ determined by that fit yield the updated lane line model:

$$\sum_{j=1}^{m} \mathrm{DIFF}_{ji}^2 \qquad (4)$$

where $\mathrm{DIFF}_{ji}$ is the difference between line segment $S_j$ and line $L_i$ in the line model, which may be represented by any suitable measure such as a similarity or a distance. Of course, the least-squares method is merely an example; other existing methods such as gradient descent may also be used to perform the model fitting.
The above line model update method may not work well in some situations. For example, as shown in fig. 3, there are three lane lines to be detected, i.e. the lane line model includes three lane lines, while the extracted line segments are scattered over the current frame image; when all extracted segments are used for model fitting, it cannot be determined which lane line each segment should correspond to, and the update of the lane line model suffers. As another example, the line segment enclosed by the circle in fig. 3 is noise and should not be used to update the lane line model; yet under the above update method it would still take part in the model fitting and thereby corrupt the update. To address this problem, the line model update method described below may be employed in step S203 as an example. This method is described with reference to fig. 5.
FIG. 5 illustrates a flow diagram of a process for updating a line model based on extracted line segments, according to one embodiment of the invention.
As shown in fig. 5, in step S2031, for each of the line segments, a line corresponding to the line segment in the line model is determined.
In this step, as an example, for each of the line segments, the line corresponding thereto may be determined in consideration of both the position distance and the direction distance thereof from each line in the line model. This exemplary method is described in detail below.
Suppose $D_{ij}$ is the position distance from line segment $S_j$ to line $L_i$ in the line model:

$$D_{ij} = \sum_{k=1}^{n_j} d_{ki} \qquad (5)$$

where $n_j$ is the number of pixels on segment $S_j$, and $d_{ki}$ is the position distance from point $k$ on segment $S_j$ to line $L_i$.

The position distance $d_{ki}$ in expression (5) may be calculated in various suitable ways. As an example, it may be expressed by the Euclidean distance: assuming line $L_i$ is the straight line $a_i x + b_i y + c_i = 0$, then

$$d_{ki} = \frac{\left| a_i x_{ki} + b_i y_{ki} + c_i \right|}{\sqrt{a_i^2 + b_i^2}} \qquad (6)$$

It can be understood that when line $L_i$ in the line model is a curve, the Euclidean distance from segment $S_j$ to $L_i$ may be calculated by methods known in the art, such as differentiation, which is not described again here.

On the other hand, expressing the position distance $d_{ki}$ by the Euclidean distance is just one example; it may also be represented by, for example, the Bhattacharyya distance or the Mahalanobis distance.
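The following sketch computes the position distance of expressions (5)-(6) for the straight-line case; the function name and input layout are illustrative assumptions.

```python
import numpy as np

def position_distance(seg_pixels, a, b, c):
    """D_ij: sum over the pixels of segment S_j of the Euclidean distance
    to the straight line a*x + b*y + c = 0 (expressions (5)-(6))."""
    pts = np.asarray(seg_pixels, dtype=float)   # shape (n_j, 2): columns x, y
    d = np.abs(a * pts[:, 0] + b * pts[:, 1] + c) / np.hypot(a, b)  # (6)
    return d.sum()                                                   # (5)
```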
The direction distance between a line segment and a line in the line model may be calculated by various suitable methods. For example, when the line in the line model is straight, the angle between the direction of the line and the direction of the segment may be used directly as the direction distance. In the present disclosure, as an example, the direction distance is calculated by the following expression (7). Specifically, suppose $T_{ij}$ is the direction distance from line segment $S_j$ to line $L_i$ in the line model (where $L_i$ may be either straight or curved); then

$$T_{ij} = \sum_{k=1}^{n_j} \left| K_{S_j} - K_{T_{k'i}} \right| \qquad (7)$$

where $K_{S_j}$ is the gradient at point $k$ on segment $S_j$ (the gradient is the same at every point of the segment, i.e. it is the gradient of the segment), and $K_{T_{k'i}}$ is the gradient direction of the tangent to line $L_i$ at the point $k'$, which is the point on $L_i$ having the same abscissa or the same ordinate as point $k$ on segment $S_j$.
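For a straight model line the tangent gradient is the same at every point, so expression (7) collapses to $n_j$ identical terms. The following sketch covers just this straight-line case; the names are illustrative assumptions.

```python
def direction_distance(seg_slope, line_slope, n_pixels):
    """T_ij for segment S_j (n_pixels pixels) against a straight line L_i:
    every term |K_Sj - K_Tk'i| in expression (7) equals the slope gap."""
    return n_pixels * abs(seg_slope - line_slope)
```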
After the direction distance and the position distance from each line in the line model are calculated for each line segment as described above, the line having the smallest direction distance and position distance from the line segment may be taken as the line corresponding to the line segment. As an example, a line having the smallest directional distance and positional distance from the line segment may be determined by the following processes: for each line in the line model, the weighted operation result of the direction distance and the position distance of the line segment from the line is calculated as shown in expression (8), and the line corresponding to the smallest weighted operation result is selected as the line having the smallest direction distance and position distance from the line segment:
$$w_1 D_{ij} + w_2 T_{ij} \qquad (8)$$

where $D_{ij}$ and $T_{ij}$ are respectively the position distance and the direction distance from line segment $S_j$ to line $L_i$ in the line model, $w_1$ is the weight of the position distance and $w_2$ the weight of the direction distance; the weights may be set arbitrarily as needed, and usually $w_1 + w_2 = 1$.

In the above example, the line with the smallest direction distance and position distance from the segment is determined from the weighted operation result; this is merely an example, and such a line may instead be determined based on, for example, the variance of the direction distance and the position distance.
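A minimal sketch of the matching rule of expression (8): given a segment's distances to every model line, pick the line with the smallest weighted score. The weight values are illustrative assumptions.

```python
def match_segment(D_row, T_row, w1=0.5, w2=0.5):
    """D_row[i], T_row[i]: position/direction distance of segment S_j to
    line L_i. Returns the index i minimizing w1*D_ij + w2*T_ij."""
    scores = [w1 * d + w2 * t for d, t in zip(D_row, T_row)]
    return min(range(len(scores)), key=scores.__getitem__)
```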
In step S2032, model fitting is performed using the line segments according to the determined correspondence between the line segments and the lines in the line model, thereby obtaining an updated line model.
In step S2031, the line in the line model corresponding to each line segment is determined, that is, which lane line each segment should be used to update. In step S2032, model fitting is performed with the segments based on the determined correspondences.
As previously described, various means such as least squares or gradient descent may be used to perform the model fitting. Taking least-squares fitting as an example, for the lane line model of expression (2), a fit that minimizes the following expression (9) may be selected, and the $a_i$ and $w_i$ determined by that fit yield the updated lane line model:

$$\sum_{i=1}^{n} \sum_{j=1}^{m_i} \left( w_1 D_{ij} + w_2 T_{ij} \right)^2 \qquad (9)$$

where $D_{ij}$ and $T_{ij}$ are respectively the position distance and the direction distance from line segment $S_j$ to line $L_i$, $n$ is the number of lines in the model, $m_i$ is the number of segments corresponding to the $i$-th line, $w_1$ is the weight of the position distance and $w_2$ the weight of the direction distance; the weights may be set arbitrarily as needed, and usually $w_1 + w_2 = 1$.
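A sketch of the fit in expression (9) using SciPy's least-squares solver; `residual()` is a hypothetical callback that evaluates $w_1 D_{ij} + w_2 T_{ij}$ for one (segment, line) pair under candidate model parameters, and the parameter packing is an assumption.

```python
import numpy as np
from scipy.optimize import least_squares

def update_model(params0, matches, residual):
    """params0: initial a_i and w_i packed into one vector.
    matches: (segment, line_index) pairs determined in step S2031.
    residual(params, seg, i) -> w1*D_ij + w2*T_ij for that pair."""
    def residuals(params):
        return np.array([residual(params, seg, i) for seg, i in matches])
    # least_squares minimizes the sum of squared residuals, i.e. expression (9).
    return least_squares(residuals, params0).x
```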
An example of updating the line model based on the extracted line segments has been described above in connection with FIG. 5. Alternatively, in step S2031, if the weighted operation result (or variance, etc.) of the direction distance and the position distance of a segment to every line in the line model is computed, and even the minimum of these results exceeds a preset threshold, the segment is regarded as noise and has no corresponding line in the model. Such a segment is then not used for model fitting in the subsequent step S2032, so the influence of noise on the update of the line model is avoided.
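This noise-rejection variant can be expressed by extending the earlier matching sketch; the threshold value is an illustrative assumption.

```python
def match_segment_or_noise(D_row, T_row, w1=0.5, w2=0.5, threshold=100.0):
    """Like match_segment() above, but returns None (segment is noise) when
    even the best weighted score exceeds the preset threshold."""
    scores = [w1 * d + w2 * t for d, t in zip(D_row, T_row)]
    best = min(range(len(scores)), key=scores.__getitem__)
    return None if scores[best] > threshold else best
```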
Returning to fig. 2, in step S204, a line to be detected is determined based on the updated line model.
The updated line model obtained by the above-described processing describes each line in the current frame image, and therefore the line to be detected in the current frame image can be directly obtained from the updated line model.
The line detection method according to the embodiment of the present invention, in which model fitting is performed using feature line segments to update the line model, has been described above in detail. In the line model update of this embodiment, both the position distance and the direction distance between a feature line segment and each line in the line model are considered when segments are selected for model fitting. Compared with schemes that consider only the position distance between feature points and each line when selecting points for fitting, the possibility of selecting a noise segment to update the line model is greatly reduced, the influence of noise on the update of the line model is reduced, and the accuracy of line detection is improved.
A line detection apparatus according to an embodiment of the present disclosure is described below with reference to fig. 6. Fig. 6 illustrates a functional configuration block diagram of a line detecting apparatus 600 according to an embodiment of the present invention. As shown in fig. 6, the line detecting apparatus 600 may include a line segment extracting part 601, a model obtaining part 602, an updating part 603, and a detecting part 604, which may respectively perform the respective steps/functions of the line detecting method described above in connection with fig. 2. Therefore, only the main functions of the respective components of the line detecting apparatus 600 will be described below, and details that have been described above will be omitted.
The line segment extraction section 601 is configured to extract a line segment conforming to the feature of a line to be detected in the current frame image. The characteristic of the line to be detected may be any characteristic capable of characterizing the line. For example, it may be (but is not limited to) a color, grayscale, shape, edge, parallax, etc. feature of a line, or any combination of these features. The line segment extraction section 601 will extract a line segment that matches the characteristics of the line to be detected in any suitable manner. As one example, a line segment conforming to the feature of a line to be detected may be directly detected by a straight line detection method such as hough transform in a current frame image obtained by photographing. As another example, feature points may be first detected in the current frame image by a detection method corresponding to features of a line to be detected, based on the features, and then the line segment may be fitted using the detected feature points.
The model acquisition component 602 is configured to acquire a pre-established line model. As mentioned above, the line model may be pre-established using any means known in the art; how to establish it is not critical to the present invention and is therefore not described in detail here. As an example, when lane lines are detected continuously in a series of video frames and the current frame image is not the first frame, the pre-established line model may be the lane line model used to detect lane lines in the previous frame image. As another example, when the line model is established in the first frame and verified in the subsequent N−1 frames, then for a current frame image after the N-th frame the pre-established line model may be the lane line model used to detect lane lines in the previous frame image.
The updating section 603 is configured to update the line model acquired by the model acquiring section 602 based on the line segment extracted by the line segment extracting section 601.
As an example, the updating section 603 performs model fitting using the extracted respective line segments, thereby obtaining an updated line model. Specifically, the updating section 603 may perform model fitting using various means such as least squares, gradient descent, and the like.
As another example, the updating component 603 is configured to determine, for each line segment, a line in the line model corresponding to the line segment, and perform model fitting using the determined correspondence between the line segments and the lines in the line model, thereby obtaining an updated line model. Wherein, when determining, for each line segment, the line corresponding thereto in the line model, the updating section 603 may be configured to calculate, for each of the line segments, a direction distance and a position distance thereof from each line in the line model, and regard the line having the smallest direction distance and position distance from the line segment as the line corresponding to the line segment.
The detection component 604 is configured to determine a line to detect from the updated line model.
FIG. 7 illustrates an overall hardware block diagram of a line detection system 700 according to an embodiment of the invention. As shown in fig. 7, the line detection system 700 may include: an input device 710, such as a keyboard, mouse, or camera, for inputting relevant images or information from the outside, such as a depth map or grayscale map (color map) captured by a camera; a processing device 720 for implementing the line detection method according to the embodiments of the present disclosure, or serving as the line detection device described above, which may be any device with processing capability able to implement the described functions, for example a general-purpose processor, a digital signal processor (DSP), an ASIC, a field programmable gate array (FPGA) or other programmable logic device (PLD), discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform these functions; an output device 730, such as a display or printer, for outputting the result of the line detection process, such as the detected lines, to the outside; and a storage device 740 for storing, in a volatile or non-volatile manner, the data involved in the line detection process, such as the depth map, the grayscale map (color map), various thresholds, the pre-established line model, the extracted line segments, and the updated line model, which may be, for example, a random access memory (RAM), a read-only memory (ROM), a hard disk, or a semiconductor memory.
The above description has been made taking as an example the application of the line detection technique according to the embodiment of the present invention to lane line detection, and it can be understood that the line detection technique according to the embodiment of the present invention can also be applied to various other line detection situations that require the use of a line model.
The foregoing describes the general principles of the present invention in conjunction with specific embodiments, however, it is noted that the advantages, effects, etc. mentioned in this disclosure are only examples and not limitations, and should not be considered essential to every embodiment of the present invention. Furthermore, the foregoing disclosure of specific details is for the purpose of illustration and description and is not intended to be limiting, since the invention is not limited to the specific details described above.
The block diagrams of devices, apparatuses, and systems referred to in this disclosure are given only as illustrative examples and are not intended to require or imply that the connections, arrangements, and configurations must be made in the manner shown in the block diagrams. These devices, apparatuses, and systems may be connected, arranged, and configured in any manner, as will be appreciated by those skilled in the art. Words such as "including," "comprising," and "having" are open-ended words that mean "including, but not limited to," and are used interchangeably therewith. The words "or" and "and" as used herein mean, and are used interchangeably with, the word "and/or," unless the context clearly dictates otherwise. The phrase "such as" is used herein to mean, and is used interchangeably with, the phrase "such as but not limited to".
The flowchart of steps in the present disclosure and the above description of the methods are only given as illustrative examples and are not intended to require or imply that the steps of the various embodiments must be performed in the order given, some steps may be performed in parallel, independently of each other or in other suitable orders. Additionally, words such as "thereafter," "then," "next," etc. are not intended to limit the order of the steps; these words are only used to guide the reader through the description of these methods.
Also, as used herein, "or" in a list of items prefaced by "at least one of" indicates a disjunctive list, such that, for example, a list of "at least one of A, B, or C" means A or B or C, or AB or AC or BC, or ABC (i.e., A and B and C). Furthermore, the word "exemplary" does not mean that the described example is preferred or better than other examples.
It should also be noted that the components or steps may be broken down and/or re-combined in the apparatus and method of the present invention. These decompositions and/or recombinations are to be regarded as equivalents of the present invention.
It will be understood by those of ordinary skill in the art that all or any portion of the methods and apparatus of the present disclosure may be implemented in any computing device (including processors, storage media, etc.) or network of computing devices, in hardware, firmware, software, or any combination thereof. The hardware may be implemented with a general purpose processor, a Digital Signal Processor (DSP), an ASIC, a field programmable gate array signal (FPGA) or other Programmable Logic Device (PLD), discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general purpose processor may be a microprocessor, but in the alternative, the processor may be any commercially available processor, controller, microcontroller or state machine. A processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration. The software may reside in any form of computer readable tangible storage medium. By way of example, and not limitation, such computer-readable tangible storage media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other tangible medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. Disk, as used herein, includes Compact Disk (CD), laser disk, optical disk, Digital Versatile Disk (DVD), floppy disk, and Blu-ray disk.
The techniques disclosed herein may also be implemented by running a program or a set of programs on any computing device. The computing device may be a well-known general-purpose device. The disclosed techniques may also be implemented simply by providing a program product containing program code for implementing the methods or apparatus, or by any storage medium having such a program product stored thereon.
Various changes, substitutions and alterations to the techniques described herein may be made without departing from the techniques of the teachings as defined by the appended claims. Moreover, the scope of the claims of the present disclosure is not limited to the particular aspects of the process, machine, manufacture, composition of matter, means, methods and acts described above. Processes, machines, manufacture, compositions of matter, means, methods, or acts, presently existing or later to be developed that perform substantially the same function or achieve substantially the same result as the corresponding aspects described herein may be utilized. Accordingly, the appended claims are intended to include within their scope such processes, machines, manufacture, compositions of matter, means, methods, or acts.
The previous description of the disclosed aspects is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other aspects without departing from the scope of the invention. Thus, the present invention is not intended to be limited to the aspects shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
The foregoing description has been presented for purposes of illustration and description. Furthermore, the description is not intended to limit embodiments of the invention to the form disclosed herein. While a number of example aspects and embodiments have been discussed above, those of skill in the art will recognize certain variations, modifications, alterations, additions and sub-combinations thereof.

Claims (10)

1. A line detection method, comprising:
extracting, in a current frame image, line segments that conform to features of a line to be detected;
acquiring a pre-established line model;
updating the line model based on the extracted line segments; and
determining the line to be detected according to the updated line model.
2. The line detection method according to claim 1, wherein the features of the line to be detected include at least one of a color, grayscale, shape, edge, or parallax feature of the line.
3. The line detection method according to claim 1, wherein said extracting, in the current frame image, a line segment that conforms to a feature of a line to be detected further comprises:
detecting feature points in the current frame image based on features of lines to be detected;
and fitting the detected feature points to obtain the line segments.
4. The line detection method according to claim 1, wherein said extracting, in the current frame image, a line segment that conforms to a feature of a line to be detected further comprises:
detecting, in the current frame image, line segments that conform to the features of the line to be detected.
5. The line detecting method according to claim 1, wherein in a case where the current frame image is an image of a frame subsequent to an nth frame image in the sequence of video frames, the pre-established line model is a line model employed in a previous frame image of the current frame image, where N is an arbitrary integer equal to or greater than 1.
6. The line detection method of claim 1, wherein said updating the line model based on the extracted line segments further comprises:
performing model fitting using the extracted line segments to obtain an updated line model.
7. The line detection method of claim 1, wherein said updating the line model based on the extracted line segments further comprises:
for each line segment, determining a line corresponding to the line segment in a line model;
and performing model fitting using the line segments according to the determined correspondence between each line segment and the lines in the line model, to obtain an updated line model.
8. The line detection method of claim 7, wherein determining, for each of the line segments, the line in the line model corresponding thereto further comprises:
for each line segment, calculating the direction distance and the position distance between the line segment and each line in the line model;
and taking the line having the smallest direction distance and position distance from the line segment as the line corresponding to the line segment.
9. The line detection method of claim 8, wherein the determining a line having the smallest directional distance and positional distance from the line segment as the line corresponding to the line segment further comprises:
for each line in the line model, calculating a weighted operation result of the direction distance and the position distance between the line segment and the line;
and if the minimum value among the weighted operation results is smaller than a preset threshold, taking the line corresponding to that minimum value as the line corresponding to the line segment.
10. A line detection apparatus comprising:
a line segment extraction section configured to extract a line segment conforming to a feature of a line to be detected in the current frame image;
a model acquisition section configured to acquire a line model established in advance;
an updating section configured to update the line model based on the extracted line segment;
a detection component configured to determine a line to be detected according to the updated line model.
CN201510163192.2A 2015-04-08 2015-04-08 Line detecting method and equipment Expired - Fee Related CN106157289B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510163192.2A CN106157289B (en) 2015-04-08 2015-04-08 Line detecting method and equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510163192.2A CN106157289B (en) 2015-04-08 2015-04-08 Line detecting method and equipment

Publications (2)

Publication Number Publication Date
CN106157289A (en) 2016-11-23
CN106157289B CN106157289B (en) 2019-08-09

Family

ID=57336812

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510163192.2A Expired - Fee Related CN106157289B (en) 2015-04-08 2015-04-08 Line detecting method and equipment

Country Status (1)

Country Link
CN (1) CN106157289B (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102529975A (en) * 2010-12-13 2012-07-04 通用汽车环球科技运作有限责任公司 Systems and methods for precise sub-lane vehicle positioning
CN102133876A (en) * 2011-02-23 2011-07-27 郭长有 Anti-deviation alarm device for automobile
CN202345671U (en) * 2011-11-11 2012-07-25 长安大学 Video camera-based device for vehicle lane departure detection and early warning
CN103971081A (en) * 2013-01-25 2014-08-06 株式会社理光 Multi-lane detection method and system
CN103605362A (en) * 2013-09-11 2014-02-26 天津工业大学 Learning and anomaly detection method based on multi-feature motion modes of vehicle traces
CN103942533A (en) * 2014-03-24 2014-07-23 河海大学常州校区 Urban traffic illegal behavior detection method based on video monitoring system
CN104008387A (en) * 2014-05-19 2014-08-27 山东科技大学 Lane line detection method based on feature point piecewise linear fitting

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
JIANG RUYI et al.: "Lane detection and tracking using a new lane model and distance transform", Machine Vision and Applications *
JIN-WOOK LEE et al.: "Effective lane detection and tracking method using statistical modeling of color and lane edge-orientation", 2009 Fourth International Conference on Computer Sciences and Convergence Information Technology *
余厚云 et al.: "Lane line tracking and lane departure detection under a straight-line model" (in Chinese), 《自动化仪表》 *
刘献如 et al.: "Robust detection and tracking of lane lines on structured roads" (in Chinese), 《光电子·激光》 *
张翀 et al.: "Real-time lane line detection method based on a straight-line model" (in Chinese), 《计算机工程与设计》 *
王晓云 et al.: "Lane line detection algorithm based on a linear-hyperbolic model" (in Chinese), 《杭州电子科技大学学报》 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109931884A (en) * 2019-01-31 2019-06-25 上海市质量监督检验技术研究院 A kind of strip water nozzle rotation angle non-contact measurement method

Also Published As

Publication number Publication date
CN106157289B (en) 2019-08-09

Similar Documents

Publication Publication Date Title
CN109829875B (en) Method and apparatus for estimating parallax
US8755630B2 (en) Object pose recognition apparatus and object pose recognition method using the same
US9142011B2 (en) Shadow detection method and device
US9576367B2 (en) Object detection method and device
JP5772821B2 (en) Facial feature point position correction apparatus, face feature point position correction method, and face feature point position correction program
JP6179639B2 (en) Road boundary detection method and detection apparatus
US10853960B2 (en) Stereo matching method and apparatus
EP2993621B1 (en) Method and apparatus for detecting shielding against object
KR20200060194A (en) Method of predicting depth values of lines, method of outputting 3d lines and apparatus thereof
JP2016194925A (en) Method and device of detecting road boundary object
KR20160010120A (en) Stereo matching apparatus and method using unary confidences learning and pairwise confidences learning
TWI504858B (en) A vehicle specification measuring and processing device, a vehicle specification measuring method, and a recording medium
US10726309B2 (en) Subject recognizing method and apparatus
KR20170091496A (en) Method and apparatus for processing binocular image
JP2013186902A (en) Vehicle detection method and apparatus
US20150131853A1 (en) Stereo matching system and method for generating disparity map using same
CN110705330A (en) Lane line detection method, lane line detection apparatus, and computer-readable storage medium
JP6845929B2 (en) 3D measuring device and method
CN108090401B (en) Line detection method and line detection apparatus
CN106157289B (en) Line detecting method and equipment
JP4685711B2 (en) Image processing method, apparatus and program
CN107305688B (en) Method, device and system for detecting road vanishing point
US11475233B2 (en) Image processing device and image processing method
JP6393495B2 (en) Image processing apparatus and object recognition method
CN112686155A (en) Image recognition method, image recognition device, computer-readable storage medium and processor

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee (granted publication date: 20190809)