CN107316331B - Vanishing point automatic calibration method for road image - Google Patents


Info

Publication number
CN107316331B
CN107316331B (application CN201710651702.XA)
Authority
CN
China
Prior art keywords
image
line
similarity
window
vanishing point
Prior art date
Legal status
Expired - Fee Related
Application number
CN201710651702.XA
Other languages
Chinese (zh)
Other versions
CN107316331A (en)
Inventor
陈卫刚
Current Assignee
Hangzhou Adtime Technology Co ltd
Zhejiang Gongshang University
Original Assignee
Zhejiang Gongshang University
Priority date
Filing date
Publication date
Application filed by Zhejiang Gongshang University filed Critical Zhejiang Gongshang University
Priority to CN201710651702.XA priority Critical patent/CN107316331B/en
Publication of CN107316331A publication Critical patent/CN107316331A/en
Application granted granted Critical
Publication of CN107316331B publication Critical patent/CN107316331B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/70: Determining position or orientation of objects or cameras
    • G06T7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00: Scenes; Scene-specific elements
    • G06V20/50: Context or environment of the image
    • G06V20/56: Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/588: Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/10: Image acquisition modality
    • G06T2207/10016: Video; Image sequence
    • G06T2207/30: Subject of image; Context of image processing
    • G06T2207/30248: Vehicle exterior or interior
    • G06T2207/30252: Vehicle exterior; Vicinity of vehicle
    • G06T2207/30256: Lane; Road marking

Abstract

The invention discloses an automatic vanishing point calibration method for road images, which comprises: reading line data from the current input image and updating the line-stacked images; calculating the vertical accumulation of pixel values for each line-stacked image to form a one-dimensional array; calculating and recording, from the one-dimensional array, a similarity value at each scanning position; constructing a set S from the similarities; mapping, for each triplet of the set S, the point (x, y) in the image plane to a straight line in the x_t-x_b parameter plane; searching for the coordinate positions of maxima and establishing a set of intersection points; and updating the vanishing point coordinates of the current input image. By this method, images acquired by a simple image acquisition device can be processed and the position of the vanishing point in the image can be calibrated automatically, without any tools; the calibration is convenient, the calibrated vanishing point position is highly precise, and the requirements of subsequent applications are well satisfied.

Description

Vanishing point automatic calibration method for road image
Technical Field
The invention relates to an image processing technology, in particular to a vanishing point automatic calibration method for a road image.
Background
With the development of sensor and electronic technology, Advanced Driving Assistance Systems (ADAS) have become an important direction of development for the automotive industry. Vision-based advanced driving assistance includes components such as forward vehicle detection, forward collision warning, lane line detection, and lane departure warning. For these applications, the position of the vanishing point in the image is an important input.
Some current ADAS systems require professionals to determine the position of the vanishing point by manual calibration. Such systems inevitably face problems such as high cost, inability to freely change the installation position or the vehicle on which they are installed, and high skill requirements for installers.
Chinese patent 201610492617.9 discloses an automatic vanishing point calibration method that does not depend on preceding-vehicle detection and requires no additional prior conditions; however, this method must define a plurality of horizon-position templates and verify whether these horizon-position hypotheses match actually observed features in order to determine the position of the horizon.
Disclosure of Invention
To overcome the prior-art drawback of determining the vanishing point position by manual calibration, the invention provides an automatic vanishing point calibration method for road images.
The technical problem is solved by the following technical scheme:
a vanishing point automatic calibration method for road images comprises the following steps:
reading line data in the current input image, and updating a line stack image: setting a plurality of horizontal scanning lines at equal intervals from the middle position in the height direction of the image to the bottom of the image, wherein each horizontal scanning line corresponds to a line stacking image with the height of M and the width of N, N is equal to the width of an input image, M is a preset value, reading line data of each horizontal scanning line, and updating the line stacking image corresponding to the horizontal scanning line;
calculating the vertical accumulation of pixel values for each line-stacked image to form a one-dimensional array: calculating and accumulating the pixel values of each line-stacked image in the vertical direction; letting I_n be the n-th line-stacked image, the sum of its pixel values in the vertical direction forms a one-dimensional array h_n, calculated as:

h_n(x) = Σ_{y=0}^{M-1} I_n(x, y)

wherein I_n(x, y) represents the pixel value at the position of coordinates (x, y) in the line-stacked image, and x represents the scanning position;
calculating and recording the similarity at each scanning position from the one-dimensional array: through the one-dimensional array h_n, at each scanning position x, calculating the similarity between the data in sliding windows of a plurality of scales and a reference signal, taking the maximum value of the similarities calculated over the plurality of scales as the similarity r_n(x) of the scanning position, and recording the similarity r_n(x);
constructing a set S through the similarities: finding a maximum value of the recorded similarities r_n(x); the position of the maximum value gives the coordinate value x in the X direction, and the position in the image of the horizontal scanning line corresponding to the line-stacked image I_n gives the coordinate value y in the Y direction; together with the similarity value r_n(x), these constitute a triplet (x, y, r_n(x)), and the triplets constitute a set S;
mapping, for each triplet of the set S, the point (x, y) in the image plane to a straight line in the x_t-x_b parameter plane: two horizontal lines L_mid and L_bot are preset in the image plane; the Y coordinate of the first horizontal line L_mid is equal to H/2, H being the height of the image, and the Y coordinate of the second horizontal line L_bot is equal to H - Δ_H, Δ_H being a preset constant. Taking as parameters the X coordinate x_t of the intersection of a straight line with the horizontal line L_mid and the X coordinate x_b of its intersection with the horizontal line L_bot, for each triplet (x, y, r_n(x)) in the set S and each preset parameter value x_t, the point (x, y) in the image plane is mapped to a straight line in the x_t-x_b parameter plane according to the following formula, and the weight r_n(x) is accumulated at the points through which the straight line passes:

x_b = x_t + k (x - x_t)

wherein

k = (H - Δ_H - H/2) / (y - H/2);
finding the coordinate position of the maximum value and establishing an intersection set: finding the coordinate position of the maximum value in the x_t-x_b parameter plane; the coordinates (t, b) of the maximum position determine a straight line passing through the points (t, H/2) and (b, H - Δ_H); calculating the intersection points of this straight line with the other straight lines to form an intersection set P;
updating vanishing point coordinates of the current input image: and updating the vanishing point coordinates of the current input image through the intersection point set P and the vanishing point calculated by the previous frame of input image.
As an implementation manner, the reading of the line data of each horizontal scanning line and the updating of the line-stacked image corresponding to the horizontal scanning line include:
organizing the M rows of data in the line-stacked image as a circular queue;
when the queue is not full, performing an enqueue operation, copying the row at the horizontal scanning line of the current input image to the position pointed to by the queue tail pointer in the corresponding line-stacked image; or,
when the queue is full, first performing a dequeue operation and then an enqueue operation, copying the row at the horizontal scanning line to the position pointed to by the queue tail pointer.
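The circular-queue update above can be sketched as follows. This is a minimal illustration, not the patent's implementation: a fixed-length deque stands in for the circular queue (when full, the oldest row is dropped before the new one is appended), and all frame contents, sizes, and names are synthetic.

```python
import numpy as np
from collections import deque

M, N = 30, 8                      # M is the preset stack height, N the image width

def update_row_stack(stack, frame, scan_y):
    """Copy the scan-line row of the current frame into the row stack."""
    stack.append(frame[scan_y].copy())

stack = deque(maxlen=M)           # deque with maxlen behaves like the circular queue
for t in range(40):               # 40 synthetic frames
    frame = np.full((10, N), t, dtype=np.uint8)
    update_row_stack(stack, frame, scan_y=5)

row_stack_image = np.stack(stack)  # the M x N line-stacked image I_n
```

After 40 frames, only the rows of the last M = 30 frames remain in the stack, matching the dequeue-then-enqueue behaviour described above.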
As an implementation manner, the calculating of the similarity at each scanning position from the one-dimensional array specifically includes:
the scanning position x satisfies 0 ≤ x ≤ N-1, where N equals the width of the input image;
setting the current scanning position as x and the window size as B, taking B consecutive elements of the array h_n starting from subscript x - (B-1)/2 to form a sub-array h', and mapping each element of h' to a real number in the interval [0, 1], specifically:

h'(i) = (h'(i) - min(h')) / (max(h') - min(h'))

wherein min(h') and max(h') respectively represent the minimum and maximum values of the sub-array elements;
the reference signal is determined according to the size of the current sliding window:

g(i) = exp( -(i - x_m)^2 / (2 s^2) ), i = 0, 1, ..., B-1

wherein x_m = (B-1)/2 and s = (B-1)/2; g(i) represents the i-th element of the reference signal, the reference signal is a one-dimensional array with the same length as the window size, and B represents the window size;
for a scanning window with the window size B, calculating a similarity measure between the data in the window and the reference signal:

η_B = sqrt( (1/B) Σ_{i=0}^{B-1} ( h'(i) - g(i) )^2 )

wherein B represents the window size, g is the reference signal, g(i) represents the i-th element of the reference signal, and h' represents the sub-array; the similarity measures between the data in the windows of the plurality of scales and the corresponding reference signals are calculated at each scanning position x, and the similarity value of the position is calculated according to the following formula:

r_n(x) = max_B { exp( -η_B^2 ) }

wherein η_B represents the similarity measure of the data and the reference signal within a scanning window of window size B, exp(-η_B^2) represents the exponential function with Euler's number as the base, and max denotes taking the maximum value of the data in the set;
finally, the similarity between the data in the sliding windows at each scanning position and the reference signal is obtained.
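The multi-scale similarity computation can be sketched as below. This is a hedged illustration: the Gaussian-shaped reference signal, the RMS form of the distance η_B, and the window sizes (odd B from 17 to 35, step 2, as in the embodiment for a 1280-wide image) are assumptions consistent with the formulas above, and the test signal is synthetic.

```python
import numpy as np

def similarity(h_n, x, sizes=range(17, 37, 2)):
    """r_n(x): max over window sizes B of exp(-eta_B^2)."""
    best = 0.0
    for B in sizes:
        lo = x - (B - 1) // 2
        if lo < 0 or lo + B > len(h_n):
            continue
        sub = h_n[lo:lo + B].astype(float)
        rng = sub.max() - sub.min()
        if rng == 0:                                   # flat window: no response
            continue
        sub = (sub - sub.min()) / rng                  # MIN-MAX normalisation to [0, 1]
        i = np.arange(B)
        x_m = s = (B - 1) / 2
        g = np.exp(-((i - x_m) ** 2) / (2 * s ** 2))   # reference signal (assumed Gaussian)
        eta = np.sqrt(np.mean((sub - g) ** 2))         # distance eta_B (assumed RMS)
        best = max(best, float(np.exp(-eta ** 2)))
    return best

h = np.zeros(200)
h[90:111] += 255.0 * np.exp(-((np.arange(21) - 10) ** 2) / 50.0)  # lane-like bump at x = 100
r_peak = similarity(h, 100)
r_flat = similarity(h, 30)
```

The lane-like bump produces a high similarity at its centre, while a flat stretch of road yields none, which is exactly the selectivity the multi-scale window is meant to provide.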
As one possible implementation, the updating of the vanishing point coordinates of the current input image from the intersection set P and the vanishing point calculated from the previous frame includes:
letting q_0 be the vanishing point calculated from the input image of the previous frame, selecting the points that belong to the intersection set P and lie within a scanning window of preset size centered at the vanishing point q_0; finding the mean of the distance vectors between the selected points and the vanishing point q_0:

d = (1/K) Σ_{i=1}^{K} (p_i - q_0)

wherein P' represents the set of all selected points, p_i is a point of the set P', K is the number of elements in the set P', and d represents the mean of the distance vectors between the selected points and the current center;
correcting the position of the vanishing point q_0: q = q_0 + d, finally obtaining the vanishing point coordinate q of the current input image.
Due to the adoption of the technical scheme, the invention has the remarkable technical effects that:
by the method, the image acquired by the simple image acquisition device can be processed, the position of the vanishing point in the image can be automatically calibrated without any tool, convenience and high precision of the calibrated vanishing point position can be realized, and the requirements of subsequent application can be well met.
Drawings
In order to illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings used in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description are only some embodiments of the present invention; for those skilled in the art, other drawings can be obtained from these drawings without creative effort.
FIG. 1 shows a horizontal line marked in white;
FIG. 2 is a line stacked image formed by sequentially copying data of successive 36 frame images at white horizontal lines shown in FIG. 1;
FIG. 3 illustrates two horizontal lines placed at appropriate positions in the image; the two intersections of a lane line with these horizontal lines determine the straight line on which the lane line lies;
fig. 4 is a schematic flow chart of a method for automatically calibrating a vanishing point of a road image according to an embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to examples, which are illustrative of the present invention and are not to be construed as being limited thereto.
Example 1:
before using the method of the invention, it should be understood that, first, the lane marking appearing in the image has a greater brightness value than the surrounding road area; secondly, when the camera is mounted in a fixed manner at a suitable position, the width of the track line becomes smaller and smaller from near to far; again, the appearance of the trace in the image has persistence, namely: if a solid line trace appears at a certain position of the image, a plurality of subsequent frame traces will continuously appear at the position and the neighborhood position, and if the image is a dotted line trace, the image appears to be repeatedly appeared at intervals of a plurality of frames.
Fig. 1 is a schematic diagram showing the appearance of a trace line in an image with persistence, and referring to fig. 1, the line data of the image and the white horizontal lines of the 35-frame image that follow the image are sequentially copied to the 0 th line to the 35 th line of the image in fig. 2, thereby forming a line-stacked image shown in fig. 2. In the line-stacked image, the solid-line type of lane marking appears continuously in the height direction, and the dashed-line type of lane marking repeats at intervals, and for a large number of normally installed driving assistance systems, the horizon line usually appears at an intermediate position in the image height direction.
A plurality of horizontal scanning lines are arranged at equal intervals from the middle of the image in the height direction to the bottom of the image. Each horizontal scanning line corresponds to a line-stacked image with a height of M and a width of N, where N equals the width of the input image and M is a preset value; the line data of each horizontal scanning line are read and the corresponding line-stacked image is updated. Referring to FIG. 3, let the image height be H, and place horizontal lines L_mid and L_bot at height coordinates H/2 and H - Δ_H respectively, where Δ_H is a small preset constant; the straight lines corresponding to the lane markings then converge in a small region on L_mid. Further, if a point p_1 = (x, y) on a lane line is detected, and the horizontal coordinate of the intersection e_1 of the lane line with L_mid is x_t, then connecting e_1 and p_1 gives the X coordinate x_b of the intersection e_2 of this straight line with L_bot by the following formula:

x_b = x_t + k (x - x_t)    (1)

wherein

k = (H - Δ_H - H/2) / (y - H/2).

In formula (1), x_t and x_b are in a linear relationship: any point in the image plane is mapped by formula (1) to a straight line in the x_t-x_b plane, and points in the image plane that lie on a common straight line are mapped by formula (1) to multiple straight lines in the x_t-x_b plane; since they share the same (x_t, x_b) value, these straight lines intersect at one point in the x_t-x_b plane.
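The linear relationship of formula (1) can be checked numerically. In this sketch the image height, the bottom-line offset, and the sample points are hypothetical; the point is only that two points on the same lane line yield the same x_b for a given x_t, i.e. the same point in the x_t-x_b parameter plane.

```python
H, Delta_H = 720, 20      # hypothetical image height and bottom-line offset

def x_b_of(x_t, x, y):
    """Formula (1): X coordinate on L_bot of the line through (x_t, H/2) and (x, y)."""
    k = (H - Delta_H - H / 2) / (y - H / 2)
    return x_t + k * (x - x_t)

x_t = 640.0
p1 = (700.0, 540.0)       # hypothetical lane-line point
p2 = (730.0, 630.0)       # collinear with p1 and (640, H/2) = (640, 360)
```

Because p1 and p2 lie on one straight line through (x_t, H/2), `x_b_of` returns the same value for both, which is why collinear image points vote for a single cell of the parameter plane.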
According to the above observation, the present invention provides an automatic vanishing point calibration method for a road image, and in order to describe the present invention more specifically, the following detailed description of the technical solution of the present invention is provided with reference to the accompanying drawings and the specific embodiments:
As shown in fig. 4, the vanishing point automatic calibration method for a road image of the present invention includes the following steps:
s1, reading line data in the current input image, and updating the line stack image: setting a plurality of horizontal scanning lines at equal intervals from the middle position in the image height direction to the bottom of the image, wherein each horizontal scanning line corresponds to a row-stacked image with the height of M and the width of N, N is equal to the width of an input image, M is a preset value, reading line data of each horizontal scanning line, updating the row-stacked image corresponding to the horizontal scanning line, and setting the horizontal scanning lines at equal intervals with the step length of delta Y, wherein M is equal to 30; referring to FIG. 1, a scan line L is shownnThe corresponding line-stacked image is InThe M lines of data in the line-stacked image are organized into a circular queue, and when the queue is not full, the enqueue operation is directly executed, and the current input image scanning line LnThe row is copied to InThe position pointed by the middle team tail pointer; when the queue is full, a dequeue operation is performed first,then, the enqueue operation is executed to scan the line LnWhere the row is copied to InThe position pointed by the middle team tail pointer;
s2, calculating the pixel value accumulation sum in the vertical direction for each row of stacked images to form a one-dimensional array: calculating pixel values of each row of stacked images in the vertical direction, accumulating and summing to obtain a one-dimensional multi-scale sliding window scanning array, calculating the similarity between data in each scanning position sliding window and a reference signal, and setting InStacking the image for the nth line, and calculating the pixel value of the vertical direction of the line of stacked image to form a one-dimensional multi-scale sliding window scanning array hnThe calculation formula is as follows:
Figure BDA0001368212150000061
wherein, In(x, y) represents a pixel value of a coordinate (x, y) position in the line-stacked image;
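The column accumulation of step S2 is a one-line reduction; in this sketch the stack contents are synthetic, with a bright vertical stripe standing in for the mark a lane line leaves in the line-stacked image.

```python
import numpy as np

M, N = 30, 6
I_n = np.zeros((M, N), dtype=np.uint8)
I_n[:, 2] = 200                            # bright vertical stripe, as a lane mark leaves
h_n = I_n.sum(axis=0, dtype=np.int64)      # h_n(x) = sum over y of I_n(x, y), formula (2)
```

The stripe column accumulates to a sharp peak in h_n, which the sliding-window scan of step S3 can then match against the reference signal.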
s3, scanning the array through a multi-scale sliding window, further calculating the similarity of the scanning positions and recording: scanning array h through multi-scale sliding windownCalculating the similarity between the data in a plurality of scale sliding windows and the reference signal at each scanning position x, and taking the maximum value of the similarities calculated by the plurality of scale sliding windows as the similarity r of the scanning positionn(x) And for the similarity rn(x) Recording, specifically, the value range of the scanning position is greater than or equal to 0 and less than or equal to the image width, each scale corresponds to a scanning window size in the multiple scales, the minimum and maximum window sizes are determined according to the size of the input image, and the window sizes of two adjacent scales are different by a difference Δ B, wherein Δ B is a preset constant; assuming that the current scan position is x, for an image of 1280 × 720, one implementation of the present invention takes B to range from greater than 15 to less than 37, Δ B ═ 2, starting with subscript x- (B-1)/2 at array hnTaking continuous B elements to form a sub-array h', mapping each element of the array to an interval [0,1 ] according to the following MIN-MAX standardization]Real numbers within the range, the formula:
Figure BDA0001368212150000062
wherein min (h ') and max (h') respectively represent the minimum and maximum values of the logarithmic array elements;
In a specific implementation, for a scanning window with a window size B, the reference signal may be calculated in the exponential form of formula (4), as follows:

g(i) = exp( -(i - x_m)^2 / (2 s^2) ), i = 0, 1, ..., B-1    (4)

wherein x_m = (B-1)/2 and s = (B-1)/2; g(i) represents the i-th element of the reference signal, the reference signal is a one-dimensional array with the same length as the window size, and B represents the window size;
For a scanning window with the window size B, a similarity measure between the data in the window and the reference signal is calculated:

η_B = sqrt( (1/B) Σ_{i=0}^{B-1} ( h'(i) - g(i) )^2 )    (5)

wherein B represents the window size, g is the reference signal, g(i) represents the i-th element of the reference signal, and h' represents the sub-array. The similarity measures between the data in the windows of the plurality of scales and the corresponding reference signals are calculated at each scanning position x, and the maximum of these similarities is taken as the similarity measure of the position. Since the similarity of the data to the corresponding reference signal is calculated in sliding windows of a plurality of scales at each scanning position x, one embodiment of the invention calculates the value of the array r_n at scanning position x as:

r_n(x) = max_B { exp( -η_B^2 ) }    (6)
s4, passing the similarity rn(x) And (3) constructing a set S: finding the similarity r of recordsn(x) In particular, the following formula is used for the similarity value rn(x) InFinding the neighborhood maximum, the formula is as follows:
Figure BDA0001368212150000072
and r isn(x)≥rTH(7)
Where, max { r (x + i) — W1≤i≤WiDenotes the maximization of the data within the set, W1Is a preset value representing the range of the neighborhood, and one embodiment of the present invention takes W1=16,rTHIs a predetermined threshold, one embodiment of the present invention takes rTH0.4, the coordinate value X in the X direction is obtained from the subscript of the array element satisfying the above condition, and the image I is stackednCorresponding scanning line LnObtaining the coordinate value Y of Y direction and the similarity value rn(x) Form a triplet (x, y, r)n(x) And added to the set S;
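Condition (7) amounts to non-maximum suppression with a threshold. The sketch below uses the embodiment's values W_1 = 16 and r_TH = 0.4 on a synthetic similarity array; the boundary handling at the array edges is an assumption of this sketch.

```python
import numpy as np

W_1, r_TH = 16, 0.4   # neighborhood range and threshold from the embodiment

def find_peaks(r):
    """Keep positions that dominate their +/- W_1 neighborhood and exceed r_TH."""
    peaks = []
    for x in range(len(r)):
        lo, hi = max(0, x - W_1), min(len(r), x + W_1 + 1)
        if r[x] >= r_TH and r[x] == r[lo:hi].max():   # condition (7)
            peaks.append(x)
    return peaks

r = np.zeros(100)
r[40], r[42] = 0.9, 0.6   # two nearby responses: only the stronger survives
r[80] = 0.3               # below the threshold r_TH
peaks = find_peaks(r)
```

Of the two responses 16 positions apart or closer, only the stronger one enters the set S, and the sub-threshold response is discarded outright.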
s5, for each triplet of the set S combination, mapping a point (x, y) in the image plane to xt-xbOne straight line of the parameter plane: presetting two horizontal lines L in the image planemidAnd LbotFirst horizontal line LmidIs equal to H/2, H is equal to the height of the image, a second horizontal line LbotIs equal to H-DeltaH,ΔHIs a predetermined constant; in a straight line with the horizontal line LmidX coordinate X of the intersection oftAnd with said horizontal line LbotX coordinate X of the intersection ofbFor parameters, for each triplet (x, y, r) in the set Sn(x) According to a preset parameter x)tIn one embodiment of the invention, x is settThe value range is as follows:
Figure BDA0001368212150000073
wherein ΔWOne embodiment of the present invention takes Δ as a predetermined constantW64, for triplet (x, y, r)n(x) X, y) of said x, taken in turntAll integers in the value range, and calculating corresponding x according to the formula (1)bA value; using a matrix C
Figure BDA0001368212150000075
In (x)t,xb) The position accumulates the weight according to the following formula, and the formula for accumulating the weight is as follows:
C(xt,xb)=C(xt,xb)+rn(x) (8);
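The voting of step S5 can be sketched as below. The image size, the collinear sample points, and the quantisation of x_b to integer bins are assumptions of this sketch; Δ_W = 64 follows the embodiment, and the formulas referenced in the comments are (1) and (8) above.

```python
import numpy as np

H, Delta_H, W = 720, 20, 1280
Delta_W = 64
C = np.zeros((W, W))

def vote(x, y, r):
    """Accumulate weight r along the parameter-plane line of image point (x, y)."""
    k = (H - Delta_H - H / 2) / (y - H / 2)
    for x_t in range(max(0, x - Delta_W), min(W, x + Delta_W + 1)):
        x_b = int(round(x_t + k * (x - x_t)))      # formula (1), quantised
        if 0 <= x_b < W:
            C[x_t, x_b] += r                       # formula (8)

# three points collinear with (600, H/2): they vote for one common cell
for (x, y) in [(636, 540), (654, 630), (660, 660)]:
    vote(x, y, 1.0)

t, b = np.unravel_index(np.argmax(C), C.shape)
```

Each point deposits weight along its own line in the x_t-x_b plane, but only the shared cell collects all three votes, so the argmax recovers the lane line's (x_t, x_b) pair.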
s6, finding the coordinate position of the maximum value, and establishing an intersection set: at xt-xbThe parameter plane finds the coordinate position of the maximum value, a straight line is determined by the coordinate (t, b) of the maximum value position, and the straight line passes through two points
Figure BDA0001368212150000074
And (b, H-Delta)H) Calculating the intersection point of the straight line and other straight lines to form an intersection point set P;
specifically, according to equation (9), the coordinate position having the maximum value is searched in the matrix C in a window scanning manner, and equation (9) is as follows:
C(t,b)=max{C(t+u,b+v),-W2≤u,v≤W2and C (t, b) is not less than CTH(9)
Wherein, W2Is a preset value representing the size of a window, one embodiment of the present invention takes W2=12,CTHIs a predetermined threshold, one embodiment of the present invention takes CTH1.2, for a maximum position satisfying equation (9), its coordinates are added to the coordinate set Z, one straight line is determined from each coordinate in the set Z, and for each straight line, its intersection with other straight lines is calculated, resulting in an intersection set P. Specifically, if there is a coordinate (t)1,b1) Then, with (t)1H/2) and (b)1,H-ΔH) A straight line is determined for the two points. If the line is at any other point in the set (t)2,b2) If the determined straight lines are not parallel straight lines, calculating the intersection points of the straight lines and adding an intersection point set P;
s7, updating vanishing point coordinates of the current input image: updating vanishing point coordinates of the current input image through the intersection point set P and a vanishing point calculated by the previous frame of input image;
specifically, the method is realized by the following steps: step 1, with q0As a center, SHAnd SWSetting a window for the window size in the height and width directions, selecting points belonging to the intersection point set P and within the window, one embodiment of the invention takes SH=Sw=24;
Step 2, the mean of the distance vectors between all points selected in step 1 and the current center is calculated according to formula (10), formula (10) being as follows:

d = (1/K) Σ_{i=1}^{K} (p_i - q_0)    (10)

wherein P' represents the set of all selected points, p_i is one of its points, K is the number of elements in P', and d represents the mean of the distance vectors between the selected points and the current center;
Step 3, the center position is corrected according to the following formula:

q = q_0 + d    (11)

wherein q_0 represents the vanishing point coordinate of the previous frame image, and q represents the vanishing point coordinate of the current input image;
steps 1 to 3 are repeated until a suitable vanishing point is found: the loop ends when the value d calculated according to formula (10) is very small or the preset maximum number of iterations is reached.
In addition, it should be noted that the specific embodiments described in the present specification may differ in the shape of the components, the names of the components, and the like. All equivalent or simple changes of the structure, the characteristics and the principle of the invention which are described in the patent conception of the invention are included in the protection scope of the patent of the invention. Various modifications, additions and substitutions for the specific embodiments described may be made by those skilled in the art without departing from the scope of the invention as defined in the accompanying claims.

Claims (4)

1. A vanishing point automatic calibration method for road images is characterized by comprising the following steps:
reading line data in the current input image, and updating a line stack image: setting a plurality of horizontal scanning lines at equal intervals from the middle position in the height direction of the image to the bottom of the image, wherein each horizontal scanning line corresponds to a line stacking image with the height of M and the width of N, N is equal to the width of an input image, M is a preset value, reading line data of each horizontal scanning line, and updating the line stacking image corresponding to the horizontal scanning line;
calculating the vertical accumulation of pixel values for each line-stacked image to form a one-dimensional array: calculating and accumulating the pixel values of each line-stacked image in the vertical direction; letting I_n be the n-th line-stacked image, the sum of its pixel values in the vertical direction forms a one-dimensional array h_n, calculated as:

h_n(x') = Σ_{y'=0}^{M-1} I_n(x', y')

wherein I_n(x', y') represents the pixel value at the position of coordinates (x', y') in the line-stacked image, and x' represents the scanning position;
calculating and recording the maximum value of the similarity of the data in the multiple scale sliding windows and the reference signal at each scanning position through a one-dimensional array: by a one-dimensional array hnCalculating the similarity between the data in a plurality of scale sliding windows and the reference signal at each scanning position x', and taking the maximum value of the similarity calculated by the data in the plurality of scale sliding windows as the similarity r of the scanning positionn(x') and for the similarity rn(x') recording, wherein the reference signal is determined according to the size of the current sliding window;
constructing a set S from the similarities: finding the maximum of the recorded similarities r_n(x'); obtaining the X-direction coordinate value x from the position of this maximum, obtaining the Y-direction coordinate value y from the position in the image of the horizontal scanning line corresponding to the line stacked image I_n, and forming, with the corresponding similarity value r_n(x), a triplet (x, y, r_n(x)); the triplets (x, y, r_n(x)) constitute a set S;
for each triplet (x, y, r_n(x)) of the set S, mapping the coordinates (x, y) in the image plane to a straight line in the x_t-x_b parameter plane: presetting two horizontal lines L_mid and L_bot in the image plane, the first horizontal line L_mid having a Y coordinate equal to H/2, where H equals the height of the image, and the second horizontal line L_bot having a Y coordinate equal to H - Δ_H, where Δ_H is a predetermined constant; taking as parameters the X coordinate x_t of the intersection of a straight line with the horizontal line L_mid and the X coordinate x_b of its intersection with the horizontal line L_bot; for each triplet (x, y, r_n(x)) in the set S, with x_t as the independent variable taken over a preset value range and precision, calculating the corresponding x_b according to the following formula, thereby mapping the coordinates (x, y) in the image plane to a straight line in the x_t-x_b parameter plane:

x_b = x_t + λ(x - x_t)

wherein

λ = (H - Δ_H - H/2) / (y - H/2);
accumulating a weight at each coordinate position (x_t, x_b) on the mapped straight line using the matrix C, according to the following formula:

C'(x_t, x_b) = C(x_t, x_b) + r_n(x)

wherein C(x_t, x_b) is the value at position (x_t, x_b) of the matrix before updating, C'(x_t, x_b) is the updated value, and r_n(x) is the similarity value recorded in the triplet set;
finding the coordinate positions of maximum values, and determining an intersection set: finding the coordinate positions at which the matrix C attains a local maximum by scanning with a window, according to the formula:

C(t, b) = max{ C(t+u, b+v), -W_2 ≤ u, v ≤ W_2 } and C(t, b) ≥ C_TH

wherein W_2 is a preset value representing the window size, u and v represent coordinate positions within the preset window, and C_TH is a preset threshold;
determining a straight line from the coordinates (t, b) of the position of a maximum value, said straight line passing through the points (t, H/2) and (b, H - Δ_H); and calculating the intersection points of this straight line with other straight lines to form an intersection set P, wherein the other straight lines are the straight lines determined by any other maximum value position (t', b') in the set and not parallel to this straight line;
updating vanishing point coordinates of the current input image: and updating the vanishing point coordinates of the current input image through the intersection point set P and the vanishing point calculated by the previous frame of input image.
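As a non-authoritative illustration of the parameter-plane voting described in claim 1, the following Python sketch maps each weighted image point (x, y, r) to a straight line in the x_t-x_b plane and accumulates its similarity weight into a matrix C. The function name, the value range of x_t, and the integer rounding of x_b are assumptions of this sketch, not specifics of the patent.

```python
import numpy as np

def accumulate_parameter_plane(triples, H, delta_H, xt_range, shape):
    """Accumulate (x, y, r) triples into the x_t-x_b parameter plane.

    A line through (x_t, H/2) and (x, y) intersects the horizontal
    line Y = H - delta_H at x_b = x_t + lam * (x - x_t).
    """
    C = np.zeros(shape)          # accumulator indexed by (x_t, x_b)
    y_mid, y_bot = H / 2.0, H - delta_H
    for x, y, r in triples:
        if abs(y - y_mid) < 1e-6:
            continue             # point lies on L_mid; line undefined
        lam = (y_bot - y_mid) / (y - y_mid)
        for xt in xt_range:
            xb = int(round(xt + lam * (x - xt)))
            if 0 <= xt < shape[0] and 0 <= xb < shape[1]:
                C[xt, xb] += r   # C'(x_t, x_b) = C(x_t, x_b) + r_n(x)
    return C
```

Two points sampled from the same image-plane line vote along two parameter-plane lines that intersect in exactly one cell, so that cell receives the combined weight, which is what the subsequent local-maximum search exploits.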
2. The vanishing point automatic calibration method for road images as claimed in claim 1, wherein the reading of the line data of each horizontal scanning line and the updating of the line stacked image corresponding to each horizontal scanning line specifically comprises:
the M lines of data in the line stacked image are organized into a circular queue;
when the queue is not full, performing enqueuing operation, and copying the row where the horizontal scanning line of the current input image is located to the position pointed by the queue tail pointer in the corresponding row stacked image; or when the queue is full, the dequeue operation is executed first, then the enqueue operation is executed, and the row where the horizontal scanning line is located is copied to the position pointed by the tail pointer of the queue.
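The enqueue/dequeue behaviour of claim 2 can be sketched minimally in Python, where `collections.deque` with `maxlen=M` reproduces the circular queue: appending to a full queue first drops the oldest row, then adds the new one. The class and method names are illustrative assumptions.

```python
from collections import deque

class LineStack:
    """Keep the most recent M copies of one horizontal scan line."""

    def __init__(self, M):
        # maxlen=M: enqueue to a full queue dequeues the oldest row first
        self.rows = deque(maxlen=M)

    def update(self, scan_line_pixels):
        # copy the current scan line into the stack
        self.rows.append(list(scan_line_pixels))

    def column_sums(self):
        # h_n(x'): vertical sum over the stacked rows at each column x'
        return [sum(col) for col in zip(*self.rows)]
```

After more than M updates only the newest M rows contribute to the column sums, so a briefly occluded lane marking still leaves a trace while stale data ages out.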
3. The vanishing point automatic calibration method for road images as claimed in claim 1, wherein the maximum value of the similarity between the data in the multiple scale sliding windows and the reference signal is calculated and recorded at each scanning position through a one-dimensional array, specifically comprising:
the value range of the scanning position x' satisfies 0 ≤ x' ≤ N-1, where N equals the width of the input image;
setting the current scanning position as x' and the window size as B, taking B consecutive elements from the array h_n starting at subscript x' - (B-1)/2 to form a sub-array h', and mapping each element of the sub-array h' to a real number in the interval [0,1] as follows:

h̄'(i) = (h'(i) - min(h')) / (max(h') - min(h'))

wherein min(h') and max(h') respectively denote the minimum and maximum values of the elements of the array h';
the reference signal is a one-dimensional array of length equal to the window size, determined according to the size of the current sliding window, wherein each element is calculated according to the following formula:

g(i) = exp( -((i - x_m)/s)² )

wherein g(i) denotes the i-th element of the reference signal, B denotes the window size, and x_m and s are calculated from the window size, x_m corresponding to the center position of the window, with x_m = (B-1)/2 and s = (B-1)/2;
for a scanning window of window size B, calculating a similarity measure between the data in the window and the reference signal according to the formula:

η_B = (1/B) · Σ_{i=0}^{B-1} | h̄'(i) - g(i) |

wherein B denotes the window size, g(i) denotes the i-th element of the reference signal, and h̄' denotes the array formed after mapping the values of the sub-array h' to the range [0,1]; at each of said scanning positions, calculating the similarity measures between the data in windows of multiple scales and the corresponding reference signals, and calculating the similarity value of the position according to the following formula:

r_n(x') = max_B { exp(-η_B²) }

wherein η_B denotes the similarity measure between the data within a scanning window of window size B and the reference signal, exp(-η_B²) denotes the exponential function with Euler's number as base, and max{·} denotes taking the maximum value over the set;
and finally, the similarity between the data in the sliding window of each scanning position and the reference signal is obtained.
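A minimal Python sketch of the multi-scale window matching in claim 3 follows. The patent only fixes x_m = s = (B-1)/2 for the Gaussian-shaped reference signal and the form r = max_B exp(-η_B²); the exact analytic form of g(i) and the mean-absolute-difference distance η_B used below are illustrative assumptions.

```python
import math

def reference_signal(B):
    """Gaussian-shaped reference of length B (assumed form)."""
    xm = s = (B - 1) / 2.0
    return [math.exp(-((i - xm) / s) ** 2) for i in range(B)]

def similarity_at(h, x, window_sizes):
    """r_n(x'): best match over several window sizes between the
    min-max-normalised window contents and the reference signal."""
    best = 0.0
    for B in window_sizes:
        lo = x - (B - 1) // 2
        if lo < 0 or lo + B > len(h):
            continue                      # window falls off the array
        w = h[lo:lo + B]
        mn, mx = min(w), max(w)
        if mx == mn:
            continue                      # flat window: no contrast
        wn = [(v - mn) / (mx - mn) for v in w]   # map to [0, 1]
        g = reference_signal(B)
        eta = sum(abs(a - b) for a, b in zip(wn, g)) / B
        best = max(best, math.exp(-eta ** 2))    # r = max_B exp(-eta_B^2)
    return best
```

A bright, roughly symmetric bump in h (a lane-marking response) centred at x scores higher than the same window placed off-centre, which is the property the vanishing-point vote relies on.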
4. The vanishing point automatic calibration method for road images as claimed in claim 1, wherein the updating vanishing point coordinates of the current input image by the vanishing point calculated by the intersection set P and the previous frame comprises:
setting the vanishing point calculated from the input image of the previous frame as q_0; selecting the points that belong to the intersection set P and lie within a scanning window of a preset size centered at the vanishing point q_0; and calculating the mean of the distance vectors between the selected points and the vanishing point q_0 according to the formula:

d = (1/K) · Σ_{i=1}^{K} (p_i - q_0)

wherein P' denotes the set of all selected points, p_i is a point in the set P', K is the number of elements in the set P', and d denotes the mean of the distance vectors between the selected points and the current center;

correcting the position of the vanishing point q_0 as q = q_0 + d, finally obtaining the vanishing point coordinate q of the current input image.
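The temporal update of claim 4 can be sketched as follows in Python: keep only the intersection points inside a square window centred on the previous vanishing point q_0, average their offsets from q_0, and shift q_0 by that mean (q = q_0 + d). The window size and function name are illustrative assumptions.

```python
def update_vanishing_point(q0, intersections, window=20):
    """Shift the previous vanishing point q0 by the mean offset of
    nearby intersection points; keep q0 if none are close enough."""
    half = window / 2.0
    selected = [(px, py) for px, py in intersections
                if abs(px - q0[0]) <= half and abs(py - q0[1]) <= half]
    if not selected:
        return q0                      # no nearby votes: keep estimate
    K = len(selected)
    dx = sum(px - q0[0] for px, _ in selected) / K
    dy = sum(py - q0[1] for _, py in selected) / K
    return (q0[0] + dx, q0[1] + dy)    # q = q0 + d
```

Gating on a window around the previous estimate rejects outlier intersections (e.g. from vehicles crossing lane markings), so the tracked vanishing point drifts only with consistent evidence.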
CN201710651702.XA 2017-08-02 2017-08-02 Vanishing point automatic calibration method for road image Expired - Fee Related CN107316331B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710651702.XA CN107316331B (en) 2017-08-02 2017-08-02 Vanishing point automatic calibration method for road image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710651702.XA CN107316331B (en) 2017-08-02 2017-08-02 Vanishing point automatic calibration method for road image

Publications (2)

Publication Number Publication Date
CN107316331A CN107316331A (en) 2017-11-03
CN107316331B true CN107316331B (en) 2020-04-14

Family

ID=60175603

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710651702.XA Expired - Fee Related CN107316331B (en) 2017-08-02 2017-08-02 Vanishing point automatic calibration method for road image

Country Status (1)

Country Link
CN (1) CN107316331B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109272536B (en) * 2018-09-21 2021-11-09 浙江工商大学 Lane line vanishing point tracking method based on Kalman filtering
CN111174796B (en) * 2019-12-31 2022-04-29 驭势科技(浙江)有限公司 Navigation method based on single vanishing point, electronic equipment and storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106204572A (en) * 2016-07-06 2016-12-07 合肥工业大学 The road target depth estimation method mapped based on scene depth
CN106228531A (en) * 2016-06-27 2016-12-14 开易(北京)科技有限公司 Automatic vanishing point scaling method based on horizon search and system

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106228531A (en) * 2016-06-27 2016-12-14 开易(北京)科技有限公司 Automatic vanishing point scaling method based on horizon search and system
CN106204572A (en) * 2016-07-06 2016-12-07 合肥工业大学 The road target depth estimation method mapped based on scene depth

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
"Vanishing point detection algorithm for urban road images based on vertical-line envelopes and parallel line pairs"; Ding Weili et al.; Acta Optica Sinica; Oct. 2014; Vol. 34, No. 10; pp. 186-192 *

Also Published As

Publication number Publication date
CN107316331A (en) 2017-11-03

Similar Documents

Publication Publication Date Title
EP3581890B1 (en) Method and device for positioning
US11734918B2 (en) Object identification apparatus, moving body system, object identification method, object identification model learning method, and object identification model learning apparatus
CN109752701B (en) Road edge detection method based on laser point cloud
US10860871B2 (en) Integrated sensor calibration in natural scenes
CN109829398B (en) Target detection method in video based on three-dimensional convolution network
EP3876141A1 (en) Object detection method, related device and computer storage medium
US8259998B2 (en) Image processing device for vehicle
JP4943034B2 (en) Stereo image processing device
CN112348902B (en) Method, device and system for calibrating installation deviation angle of road-end camera
CN111121849B (en) Automatic calibration method for orientation parameters of sensor, edge calculation unit and roadside sensing system
CN113156407B (en) Vehicle-mounted laser radar external parameter joint calibration method, system, medium and device
CN107316331B (en) Vanishing point automatic calibration method for road image
CN112396044B (en) Method for training lane line attribute information detection model and detecting lane line attribute information
JP2017181476A (en) Vehicle location detection device, vehicle location detection method and vehicle location detection-purpose computer program
CN111738071B (en) Inverse perspective transformation method based on motion change of monocular camera
CN113449692A (en) Map lane information updating method and system based on unmanned aerial vehicle
CN114897669A (en) Labeling method and device and electronic equipment
CN114419165B (en) Camera external parameter correction method, camera external parameter correction device, electronic equipment and storage medium
CN114777768A (en) High-precision positioning method and system for satellite rejection environment and electronic equipment
CN112446353B (en) Video image trace line detection method based on depth convolution neural network
CN113012215A (en) Method, system and equipment for space positioning
US11087150B2 (en) Detection and validation of objects from sequential images of a camera by using homographies
CN117333846A (en) Detection method and system based on sensor fusion and incremental learning in severe weather
CN110880003B (en) Image matching method and device, storage medium and automobile
CN111598956A (en) Calibration method, device and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20201104

Address after: 310000, No. 18 Jiao Tong Street, Xiasha Higher Education Park, Hangzhou, Zhejiang

Patentee after: ZHEJIANG GONGSHANG University

Patentee after: HANGZHOU ADTIME TECHNOLOGY Co.,Ltd.

Address before: 310000, No. 18 Jiao Tong Street, Xiasha Higher Education Park, Hangzhou, Zhejiang

Patentee before: ZHEJIANG GONGSHANG University

CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20200414
