CN115071733A - Auxiliary driving method and device based on computer - Google Patents
Auxiliary driving method and device based on computer
- Publication number
- CN115071733A (application number CN202210856155.XA)
- Authority
- CN
- China
- Prior art keywords
- driving
- image
- road
- traffic
- lane line
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Images
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/02—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
- B60W40/06—Road conditions
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
- B60W2050/143—Alarm means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2552/00—Input parameters relating to infrastructure
- B60W2552/53—Road markings, e.g. lane marker or crosswalk
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2556/00—Input parameters relating to data
- B60W2556/45—External transmission of data to or from the vehicle
- B60W2556/50—External transmission of data to or from the vehicle of positioning data, e.g. GPS [Global Positioning System] data
Landscapes
- Engineering & Computer Science (AREA)
- Automation & Control Theory (AREA)
- Transportation (AREA)
- Mechanical Engineering (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- Mathematical Physics (AREA)
- Traffic Control Systems (AREA)
Abstract
The invention relates to a computer-based driving assistance method and device, wherein the method comprises the following steps: acquiring a driving image in front of the vehicle in real time; detecting the lane lines of the road according to the collected driving image; acquiring vehicle navigation information, and carrying out road planning reminding on the user by combining the vehicle navigation information and the lane lines of the road; detecting traffic signs of the road according to the collected driving image, and identifying the categories of the traffic signs; and carrying out voice reminding of the driver according to the identified traffic sign category. The invention can detect straight and curved lane lines at the same time, can detect multiple lane lines in the field of view simultaneously, and can distinguish dashed lane lines from solid lane lines; it can also recognize the circular traffic signs on the driving road, thereby assisting the driver in obtaining road information, helping the driver analyse road conditions, and improving the driver's alertness for safe driving.
Description
Technical Field
The invention relates to the technical field of computers, in particular to a driving assisting method and device based on a computer.
Background
Computer vision means using a camera and a computer in place of human eyes to identify, track and measure targets, and performing further image processing so that the result is an image better suited to human observation or data that can be transmitted to an instrument for detection.
With the continuous improvement of economic conditions, the number of automobiles keeps increasing; however, not all drivers are well trained, and vehicles that do not comply with traffic regulations are everywhere. Assisting the driver through computer vision has therefore become mainstream in modern driving.
However, when a computer is used to assist driving in the prior art, the following problems exist: lane line detection can only handle straight lines, not curves; only solid lane lines can be detected, not dashed ones; solid and dashed lane lines cannot be reliably distinguished; only a few lane lines can be detected, so multiple lane lines appearing in the field of view are missed; and circular traffic signs on the road cannot be reliably recognized.
The present invention has been made in view of this situation.
Disclosure of Invention
In order to overcome the technical defects in the prior art, the invention provides a driving assisting method and device based on a computer, which can effectively solve the problems in the background art.
In order to solve the technical problems, the technical scheme provided by the invention is as follows:
the embodiment of the invention discloses a driving assisting method based on a computer, which comprises the following steps:
acquiring a driving image in front of a vehicle in real time;
detecting the lane lines of the road according to the collected driving images;
acquiring vehicle navigation information, and carrying out road planning reminding on a user by combining the vehicle navigation information and a lane line of a road;
detecting traffic signs of the road according to the collected driving images, and identifying the types of the traffic signs;
and carrying out voice reminding on the driver according to the identified traffic sign category.
In any one of the above aspects, preferably, the detecting the lane line of the road from the collected driving image includes:
converting the acquired driving image from an RGB color space to an HSV color space;
extracting objects meeting the color threshold value according to the color threshold value where the lane line is located;
and eliminating the non-lane line objects according to the area and angle relation which the lane lines should meet, and reserving the objects which meet the internal relation between the lane lines.
In any of the above schemes, preferably, the extracting, according to the color threshold interval in which the lane line lies, of the objects that satisfy the color threshold includes:
setting the horizon at 1/2 of the height of the driving image, and removing the part of the driving image above the horizon;
setting a white threshold interval s1 and v1 in the HSV color space, and retaining the objects in the collected driving image that satisfy s ∈ s1 and v ∈ v1;
setting a yellow threshold interval h1, s2 and v2 in the HSV color space, and retaining the objects in the collected driving image that satisfy h ∈ h1, s ∈ s2 and v ∈ v2.
In any of the above schemes, preferably, the eliminating of non-lane-line objects according to the area and angle relations that lane lines should satisfy, and the retaining of objects that satisfy the intrinsic relations between lane lines, includes:
removing noise points from the driving image through median filtering;
scanning the binary driving image, grouping connected pixel points into the same class, assigning every point in the image to a class, and giving points of the same class the same label, so as to generate the connected regions of the driving image;
calculating the number of pixel points contained in each connected region, setting a minimum threshold and a maximum threshold, and removing a connected region when its area is smaller than the minimum threshold or larger than the maximum threshold;
and searching the suspected lane line based on the quadratic curve.
In any of the above schemes, preferably, the finding of the suspected lane line based on the quadratic curve includes:
scanning all connected regions, starting from the connected region with the smallest label:
when the diagonal of the bounding box containing a connected region reaches a given length threshold, classifying that connected region directly as a road lane line;
and performing curve fitting on the data points of the connected region, with the curve equation to be fitted set as y = ax² + bx + c; solving a, b and c by the least squares method to obtain the fitted curve of the current connected region; after the curve is obtained, extending it by n pixel points, and regarding the current connected region as part of a lane line when the extension reaches another connected region.
In any of the above schemes, preferably, the acquiring of the vehicle navigation information and the carrying out of road planning reminding on the user by combining the vehicle navigation information and the lane line of the road includes: when the road lane lines in the collected driving image have been detected, acquiring the driving navigation information of the vehicle; judging, according to the real-time position of the vehicle and the road lane lines in front of the vehicle, whether the lane the vehicle is in meets the driving navigation information of the vehicle; and, if not, giving the driver a voice reminder through a vehicle-mounted voice playing device.
In any of the above schemes, preferably, the detecting of traffic signs of the road according to the collected driving images and the identifying of the categories of the traffic signs includes:
extracting the contour of a target object in the acquired driving image based on color characteristics;
removing the contours of non-road-signs in the collected driving image based on shape characteristics, so as to screen out the traffic road sign image in the driving image;
summarizing multiple classes of traffic road sign images through big data, and manually annotating each class of traffic road sign image to generate an identification database;
calculating the similarity γi between the traffic road sign image in the driving image and each class of traffic road sign image in the database, wherein i is the class number in the database;
and sorting the calculated similarities γi between the traffic road sign image and all traffic road sign images in the database, and obtaining the maximum similarity γi_max.
In any of the above embodiments, preferably, after sorting the calculated similarities γi between the traffic road sign image and all traffic road sign images in the database and obtaining the maximum similarity γi_max, the method further comprises:
comparing the maximum similarity γi_max with a similarity threshold γ: if γi_max < γ, judging that the traffic road sign image in the driving image does not match the i-th class of traffic road sign image in the database and is an invalid image; if γi_max > γ, judging that the traffic road sign image in the driving image matches the i-th class of traffic road sign image in the database, and outputting the manual annotation of the i-th class of traffic road sign image in the database.
In any of the above aspects, preferably, the removing of the non-road-sign contours in the captured driving image based on shape features includes:
performing convex hull detection on the extracted contours by the Graham scan method;
and carrying out Hough circle detection on the object subjected to convex hull detection through Hough transformation so as to screen out the circular traffic road sign image in the driving image.
In a second aspect, a computer-based driving assistance apparatus includes:
the acquisition module is used for acquiring a driving image in front of the vehicle in real time;
the lane line detection module is used for detecting lane lines of a road according to the collected driving images;
the GPS navigation module is used for acquiring vehicle navigation information and carrying out road planning reminding on a user by combining the vehicle navigation information and a lane line of a road;
the traffic road sign recognition module is used for detecting the traffic road signs of the road according to the collected driving images and recognizing the types of the traffic road signs;
and the reminding module is used for carrying out voice reminding on the driver according to the identified traffic sign category.
Compared with the prior art, the invention has the beneficial effects that:
the invention provides a computer-based driving assisting method and device, which are used for acquiring a driving image in front of a vehicle in real time; detecting the lane lines of the road according to the collected driving images; acquiring vehicle navigation information, and carrying out road planning reminding on a user by combining the vehicle navigation information and a lane line of a road; detecting traffic signs of the road according to the collected driving images, and identifying the types of the traffic signs; performing voice reminding on the driver according to the identified traffic sign category; can solve the problems existing in the prior art: the method has the advantages that the method can only carry out linear detection, cannot carry out curve detection, can only carry out detection, cannot carry out dotted line detection and cannot detect a plurality of lane lines for lane line detection, can simultaneously detect the linear lane line and the curve lane line, can simultaneously detect a plurality of lane lines in a visual field, and can distinguish the dotted lane line from the solid lane line; meanwhile, the round traffic signs on the driving road can be identified, so that the driver can be assisted to acquire information on the road, the driver can be helped to analyze the road condition, and the safe driving alertness of the driver can be improved.
Drawings
The drawings are included to provide a further understanding of the invention, and are incorporated in and constitute a part of this specification.
FIG. 1 is a schematic flow diagram of a computer-based driving assistance method of the present invention;
fig. 2 is a block schematic diagram of the computer-based driving assistance apparatus of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
It will be understood that when an element is referred to as being "secured to" or "disposed on" another element, it can be directly on the other element or be indirectly on the other element. When an element is referred to as being "connected to" another element, it can be directly connected to the other element or be indirectly connected to the other element.
In the description of the present invention, it is to be understood that the terms "length", "width", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", and the like, indicate orientations or positional relationships based on the orientations or positional relationships illustrated in the drawings, and are used merely for convenience in describing the present invention and for simplicity in description, and do not indicate or imply that the devices or elements referred to must have a particular orientation, be constructed in a particular orientation, and be operated, and thus, are not to be construed as limiting the present invention.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the present invention, "a plurality" means two or more unless specifically defined otherwise.
For better understanding of the above technical solutions, the technical solutions of the present invention will be described in detail below with reference to the drawings and the detailed description of the present invention.
As shown in fig. 1, the present invention provides a computer-based driving assistance method, including:
step 1: acquiring a driving image in front of a vehicle in real time;
step 2: detecting the lane lines of the road according to the collected driving images;
and step 3: acquiring vehicle navigation information, and carrying out road planning reminding on a user by combining the vehicle navigation information and a lane line of a road;
and 4, step 4: detecting traffic signs of a road according to the collected driving images, and identifying the types of the traffic signs;
and 5: and carrying out voice reminding on the driver according to the identified traffic sign category.
As shown in fig. 1, the detecting the lane line of the road according to the collected driving image includes:
step 21: converting the acquired driving image from an RGB color space to an HSV color space; in general, the driving image collected by the vehicle-mounted image collecting device is a color image, and all colors in the color image can be obtained by combining the three primary colors R, G, B, so as to form an RGB color space, in which the hue, brightness and shade of the image color are determined by the values of three different channels (red channel, green channel and blue channel). The division of a target object according to a certain type of color in the RGB color space is very complicated because the change of the value of any channel of a pixel can affect the color value of the pixel.
In the computer-based driving assistance method of the embodiment of the invention, the (r, g, b) values in the RGB color space are first mapped into [0, 1]. With max denoting the largest and min the smallest of (r, g, b), the conversion is: v = max; s = (max - min)/max when max ≠ 0, and s = 0 otherwise; h = 60° × (g - b)/(max - min) when max = r, h = 60° × (b - r)/(max - min) + 120° when max = g, and h = 60° × (r - g)/(max - min) + 240° when max = b, with 360° added when h is negative (and h taken as 0 when max = min).
In the computer-based driving assistance method according to the embodiment of the present invention, in the converted HSV space the hue value h ∈ [0, 360), and the saturation and brightness s, v ∈ [0, 1].
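A minimal per-pixel sketch of this conversion, assuming the (r, g, b) values have already been scaled to [0, 1]; in practice a library routine such as OpenCV's cv2.cvtColor performs the same mapping (with its own output ranges), so this function is for illustration only:

```python
def rgb_to_hsv(r, g, b):
    """Convert one pixel from RGB (each channel already scaled to [0, 1]) to HSV,
    giving h in [0, 360) and s, v in [0, 1]."""
    mx, mn = max(r, g, b), min(r, g, b)
    v = mx
    s = 0.0 if mx == 0 else (mx - mn) / mx
    if mx == mn:
        h = 0.0                                    # achromatic pixel: hue undefined, use 0
    elif mx == r:
        h = (60.0 * (g - b) / (mx - mn)) % 360.0   # wraps negative values into [0, 360)
    elif mx == g:
        h = 60.0 * (b - r) / (mx - mn) + 120.0
    else:                                          # mx == b
        h = 60.0 * (r - g) / (mx - mn) + 240.0
    return h, s, v
```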
Step 22: extracting the objects that satisfy the color threshold according to the color threshold interval in which the lane line lies. In the computer-based driving assistance method of the embodiment of the invention, lane lines have strong color characteristics in the HSV color space, being typically white or yellow; a color threshold interval can therefore be set, the objects in the collected driving image that satisfy the interval are extracted, and the objects that do not satisfy it are removed. Further, when the vehicle-mounted image collecting device captures the road being travelled, roughly one third to one half of the collected driving image lies beyond the end of the road, i.e. above the horizon; an image cutting point can therefore be set to remove the part above the horizon and reduce the later workload.
As shown in fig. 1, the extracting, according to the color threshold where the lane line is located, an object that satisfies the color threshold includes:
step 221: setting the horizon line as 1/2 in the driving image, and removing the part above the horizon line in the driving image;
step 222: setting white threshold interval s in HSV color space 1 And v 1 Satisfying s ∈ s in the collected driving image 1 &v∈v 1 The object of (2) is reserved;
step 223: setting yellow threshold interval h in HSV color space 1 、s 2 And v 2 Satisfying h E h in the collected driving image 1 &s∈s 2 &v∈v 2 The object of (2) is reserved.
Step 23: eliminating non-lane-line objects according to the area and angle relations that lane lines should satisfy, and retaining objects that satisfy the intrinsic relations between lane lines. When the color threshold is used to extract the matching part of the driving image, noise may remain in the driving image and the edges of the white regions may be jagged, so the driving image needs to be denoised to recover the outline of the region of interest, making the white regions in the image smooth and saturated.
In the driving assistance method based on a computer according to the embodiment of the present invention, eliminating an object other than a lane line and reserving an object that satisfies an intrinsic relationship between lane lines includes:
step 231: removing noise points from the driving image through median filtering;
step 232: scanning a binary image of the driving image, classifying pixel points belonging to the same category into the same category, classifying all the points in the image into different categories, and marking the same category identically to generate a plurality of connected regions of the driving image;
step 233: calculating the number of pixel points contained in each connected region, setting a minimum threshold and a maximum threshold, and removing the connected region when the area of the connected region block is smaller than the minimum threshold or the maximum threshold; in the computer-based driving assistance method according to the embodiment of the present invention, the non-conforming object is removed according to the angle between the major axis of the elliptical region where the connected region object is located and the x axis, the angle between the x axis and the x axis is (-15 °, 15 °), that is, the object almost parallel to the x axis can be removed according to the angle, and the page between the x axis and the angle between the x axis and the x axis is (75 °, 90 °) and (-90 °, -75 °), that is, the object almost perpendicular to the x axis can be removed, because the object satisfying these two types of angles cannot be the lane line, the object having no relation to the lane line in the driving image (for example, the rod of the intersection rod at the side of the lane, a large area of sky, a white vehicle in the center of the lane, etc.) can be removed.
Step 234: searching for suspected lane lines based on a quadratic curve. After the above removal of connected regions, an almost pure lane line extraction map is obtained, but some individual non-lane-line objects still remain; however, the scattered segments of a dashed lane line are related to one another, whereas the remaining non-lane-line parts are isolated and have no connection with other parts.
In the computer-based driving assistance method according to the embodiment of the present invention, the finding a suspected lane line based on a quadratic curve includes:
step 2341: scanning all connected regions starting from the small numbered connected regions:
step 2342: when the length of the running diagonal line containing the connected area reaches a given threshold value, directly dividing the connected area into road and lane lines;
step 2343: carrying out curve fitting on data points of the communicated region, and setting a curve equation to be fitted as follows:the method comprises the following steps that a, b and c are three different parameters respectively, solution is carried out through a least square method a, b and c, curve fitting is carried out on a current communication area, after a curve is obtained, the curve is prolonged by n pixel points, and when an extension line can touch other communication areas, the current communication area is considered to be a lane line part;
step 2344: and repeating the steps 2342 and 2343 until all the connected areas are scanned.
In the computer-based driving assistance method of the embodiment of the invention, the acquiring of the vehicle navigation information and the carrying out of road planning reminding on the user by combining the vehicle navigation information and the lane line of the road includes: when the road lane lines in the collected driving image have been detected, acquiring the driving navigation information of the vehicle; judging, according to the real-time position of the vehicle and the road lane lines in front of the vehicle, whether the lane the vehicle is in meets the driving navigation information of the vehicle; and, if not, giving the driver a voice reminder through the vehicle-mounted voice playing device, so as to prevent the vehicle from deviating from its preset route and to assist the driver.
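A hedged sketch of this decision step; the lane-index representation, the allowed-lane set derived from the navigation route, and the text-to-speech engine (a pyttsx3-style object with say/runAndWait) are assumptions, since the embodiment does not specify the navigation interface or the playback API:

```python
def check_lane_against_navigation(current_lane_index, allowed_lane_indices, tts_engine):
    """Warn the driver when the detected ego lane is not one of the lanes the planned
    route requires (lane indices counted from the leftmost detected lane line)."""
    if current_lane_index not in allowed_lane_indices:
        tts_engine.say("Please change to lane %d to follow the planned route."
                       % min(allowed_lane_indices))
        tts_engine.runAndWait()
```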
As shown in fig. 1, the detecting traffic signs of the road according to the collected driving images and identifying the categories of the traffic signs includes:
step 41: extracting the outline of a target object in the acquired driving image based on the color characteristics; in the computer-based driving assistance method according to the embodiment of the present invention, since the traffic sign has a significant characteristic in color, for example, the warning traffic sign is generally yellow, the prohibition-like traffic sign is red, and the indicative traffic sign is blue; the traffic sign may be contour extracted by setting a red threshold interval, a blue threshold interval, etc., as described in step 22.
Step 42: removing non-road-sign contours from the collected driving image based on shape features, so as to screen out the traffic road sign image in the driving image. In the computer-based driving assistance method of the embodiment of the invention, traffic signs generally have only three shapes: circular, triangular and rectangular. The invention mainly detects and identifies circular traffic signs, so the detection and identification of rectangular and triangular traffic signs are not detailed in this embodiment.
as shown in fig. 1, the removing of the non-road-marking contour in the captured driving image based on the shape feature includes:
step 421: searching the extracted contour by a Graham scanning method through a convex hull;
step 422: and carrying out Hough circle detection on the object subjected to convex hull detection through Hough transformation so as to screen out the circular traffic sign image in the driving image.
Step 43: summarizing multiple classes of traffic road sign images through big data, and manually annotating each class of traffic road sign image to generate an identification database;
Step 44: calculating the similarity γi between the traffic road sign image in the driving image and each class of traffic road sign image in the database, wherein i is the class number in the database;
Step 45: sorting the calculated similarities γi between the traffic road sign image and all traffic road sign images in the database, and obtaining the maximum similarity γi_max;
Step 46: comparing the maximum similarity γi_max with a similarity threshold γ: if γi_max < γ, judging that the traffic road sign image in the driving image does not match the i-th class of traffic road sign image in the database and is an invalid image; if γi_max > γ, judging that the traffic road sign image in the driving image matches the i-th class of traffic road sign image in the database, and outputting the manual annotation of the i-th class of traffic road sign image in the database.
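A sketch of steps 44-46 using normalized cross-correlation (cv2.matchTemplate with TM_CCOEFF_NORMED) as the similarity γi; the embodiment does not name a similarity measure, so this choice, the database layout (class id mapped to a template image and its manual annotation), and the threshold value are assumptions:

```python
import cv2

def classify_sign(sign_crop, database, gamma_threshold=0.6):
    """Steps 44-46 sketch: compare the detected sign image with every labelled template
    and return the annotation of the best match, or None if gamma_i_max < gamma."""
    best_id, best_score = None, -1.0
    for class_id, (template, label) in database.items():
        resized = cv2.resize(sign_crop, (template.shape[1], template.shape[0]))
        score = cv2.matchTemplate(resized, template, cv2.TM_CCOEFF_NORMED)[0][0]   # gamma_i
        if score > best_score:
            best_id, best_score = class_id, score
    if best_score < gamma_threshold:          # step 46: maximum similarity below threshold => invalid
        return None
    return database[best_id][1]               # output the manual annotation of the matched class
```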
In the computer-based driving assistance method of the embodiment of the invention, carrying out voice reminding of the driver according to the identified traffic sign category includes: after a circular traffic sign in the collected driving image has been recognized, acquiring the driving information of the vehicle, such as vehicle speed and lights; judging, according to the driving information of the vehicle and the category of the circular traffic sign ahead, whether the vehicle satisfies the driving conditions required by that sign category; and, if not, giving the driver a voice reminder through the vehicle-mounted voice playing device, so as to prevent illegal driving, improve the driver's alertness for safe driving, and assist the driver.
As shown in fig. 2, the present invention also provides a driving assistance apparatus based on a computer, comprising:
the acquisition module is used for acquiring a driving image in front of the vehicle in real time;
the lane line detection module is used for detecting the lane line of the road according to the collected driving image;
the GPS navigation module is used for acquiring vehicle navigation information and carrying out road planning reminding on a user by combining the vehicle navigation information and a lane line of a road;
the traffic road sign recognition module is used for detecting the traffic road signs of the road according to the collected driving images and recognizing the types of the traffic road signs;
and the reminding module is used for carrying out voice reminding on the driver according to the identified traffic sign category.
Although the present invention has been described in detail with reference to the foregoing embodiments, it will be apparent to those skilled in the art that modifications may be made to the embodiments described above, or equivalents may be substituted for elements thereof. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.
Claims (10)
1. A computer-based driving assistance method, characterized by: the method comprises the following steps:
acquiring a driving image in front of a vehicle in real time;
detecting the lane lines of the road according to the collected driving images;
acquiring vehicle navigation information, and carrying out road planning reminding on a user by combining the vehicle navigation information and a lane line of a road;
detecting traffic signs of the road according to the collected driving images, and identifying the types of the traffic signs;
and carrying out voice reminding on the driver according to the identified traffic sign category.
2. The computer-based driving assist method according to claim 1, characterized in that: the detecting of the lane line of the road according to the collected driving image includes:
converting the acquired driving image from an RGB color space to an HSV color space;
extracting objects meeting the color threshold value according to the color threshold value where the lane line is located;
and eliminating the non-lane line objects according to the area and angle relation which the lane lines should meet, and reserving the objects which meet the internal relation between the lane lines.
3. The computer-based driving assist method according to claim 2, characterized in that: the extracting, according to the color threshold interval in which the lane line lies, of the objects that satisfy the color threshold comprises:
setting the horizon at 1/2 of the height of the driving image, and removing the part of the driving image above the horizon;
setting a white threshold interval s1 and v1 in the HSV color space, and retaining the objects in the collected driving image that satisfy s ∈ s1 and v ∈ v1;
setting a yellow threshold interval h1, s2 and v2 in the HSV color space, and retaining the objects in the collected driving image that satisfy h ∈ h1, s ∈ s2 and v ∈ v2.
4. The computer-based driving assist method according to claim 3, characterized in that: the eliminating of non-lane-line objects according to the area and angle relations that lane lines should satisfy, and the retaining of objects that satisfy the intrinsic relations between lane lines, comprises:
removing noise points from the driving image through median filtering;
scanning the binary driving image, grouping connected pixel points into the same class, assigning every point in the image to a class, and giving points of the same class the same label, so as to generate the connected regions of the driving image;
calculating the number of pixel points contained in each connected region, setting a minimum threshold and a maximum threshold, and removing a connected region when its area is smaller than the minimum threshold or larger than the maximum threshold;
and searching the suspected lane line based on the quadratic curve.
5. The computer-based driving assist method according to claim 4, characterized in that: the finding of the suspected lane line based on the quadratic curve comprises:
scanning all connected regions, starting from the connected region with the smallest label:
when the diagonal of the bounding box containing a connected region reaches a given length threshold, classifying that connected region directly as a road lane line;
and performing curve fitting on the data points of the connected region, with the curve equation to be fitted set as y = ax² + bx + c; solving a, b and c by the least squares method to obtain the fitted curve of the current connected region; after the curve is obtained, extending it by n pixel points, and regarding the current connected region as part of a lane line when the extension reaches another connected region.
6. The computer-based driving assist method according to claim 5, characterized in that: the acquiring of the vehicle navigation information and the carrying out of road planning reminding on the user by combining the vehicle navigation information and the lane line of the road comprises: when the road lane lines in the collected driving image have been detected, acquiring the driving navigation information of the vehicle; judging, according to the real-time position of the vehicle and the road lane lines in front of the vehicle, whether the lane the vehicle is in meets the driving navigation information of the vehicle; and, if not, giving the driver a voice reminder through a vehicle-mounted voice playing device.
7. The computer-based driving assist method according to claim 6, characterized in that: the detecting of traffic signs of the road according to the collected driving images and the identifying of the categories of the traffic signs comprises:
extracting the contour of a target object in the acquired driving image based on color characteristics;
removing the contours of non-road-signs in the collected driving image based on shape characteristics, so as to screen out the traffic road sign image in the driving image;
summarizing multiple classes of traffic road sign images through big data, and manually annotating each class of traffic road sign image to generate an identification database;
calculating the similarity γi between the traffic road sign image in the driving image and each class of traffic road sign image in the database, wherein i is the class number in the database;
and sorting the calculated similarities γi between the traffic road sign image and all traffic road sign images in the database, and obtaining the maximum similarity γi_max.
8. The computer-based driving assist method according to claim 7, characterized in that: after the sorting of the calculated similarities γi between the traffic road sign image and all traffic road sign images in the database and the obtaining of the maximum similarity γi_max, the method further comprises:
comparing the maximum similarity γi_max with a similarity threshold γ: if γi_max < γ, judging that the traffic road sign image in the driving image does not match the i-th class of traffic road sign image in the database and is an invalid image; if γi_max > γ, judging that the traffic road sign image in the driving image matches the i-th class of traffic road sign image in the database, and outputting the manual annotation of the i-th class of traffic road sign image in the database.
9. The computer-based driving assist method according to claim 8, characterized in that: the removing of the non-road-sign contours in the collected driving image based on shape features comprises:
performing convex hull detection on the extracted contours by the Graham scan method;
and carrying out Hough circle detection on the object subjected to convex hull detection through Hough transformation so as to screen out the circular traffic road sign image in the driving image.
10. A computer-based driver assistance apparatus, characterized by: the driving assistance apparatus includes:
the acquisition module is used for acquiring a driving image in front of the vehicle in real time;
the lane line detection module is used for detecting lane lines of a road according to the collected driving images;
the GPS navigation module is used for acquiring vehicle navigation information and carrying out road planning reminding on a user by combining the vehicle navigation information and a lane line of a road;
the traffic road sign recognition module is used for detecting the traffic road signs of the road according to the collected driving images and recognizing the types of the traffic road signs;
and the reminding module is used for carrying out voice reminding on the driver according to the identified traffic sign category.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210856155.XA CN115071733B (en) | 2022-07-21 | 2022-07-21 | Auxiliary driving method and device based on computer |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210856155.XA CN115071733B (en) | 2022-07-21 | 2022-07-21 | Auxiliary driving method and device based on computer |
Publications (2)
Publication Number | Publication Date |
---|---|
CN115071733A true CN115071733A (en) | 2022-09-20 |
CN115071733B CN115071733B (en) | 2022-10-25 |
Family
ID=83259397
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210856155.XA Active CN115071733B (en) | 2022-07-21 | 2022-07-21 | Auxiliary driving method and device based on computer |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115071733B (en) |
-
2022
- 2022-07-21 CN CN202210856155.XA patent/CN115071733B/en active Active
Patent Citations (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050169501A1 (en) * | 2004-01-29 | 2005-08-04 | Fujitsu Limited | Method and apparatus for determining driving lane of vehicle, and computer product |
JP2009103539A (en) * | 2007-10-22 | 2009-05-14 | Denso Corp | Vehicle lane guiding device |
CN101608924A (en) * | 2009-05-20 | 2009-12-23 | 电子科技大学 | A kind of method for detecting lane lines based on gray scale estimation and cascade Hough transform |
KR20140071174A (en) * | 2012-12-03 | 2014-06-11 | 현대자동차주식회사 | Lane guide device in vehicle and method thereof |
DE102016100718A1 (en) * | 2016-01-18 | 2017-07-20 | Valeo Schalter Und Sensoren Gmbh | A method for detecting lanes on a roadway based on a frequency distribution of distance values, control device, driver assistance system and motor vehicle |
US20180060677A1 (en) * | 2016-08-29 | 2018-03-01 | Neusoft Corporation | Method, apparatus and device for detecting lane lines |
CN207274647U (en) * | 2017-04-20 | 2018-04-27 | 成都工业职业技术学院 | A kind of dilly lateral parking moves vehicle device |
CN108805065A (en) * | 2018-05-31 | 2018-11-13 | 华南理工大学 | One kind being based on the improved method for detecting lane lines of geometric properties |
US20200026930A1 (en) * | 2018-07-20 | 2020-01-23 | Boe Technology Group Co., Ltd. | Lane line detection method and apparatus |
CN110920604A (en) * | 2018-09-18 | 2020-03-27 | 阿里巴巴集团控股有限公司 | Driving assistance method, driving assistance system, computing device, and storage medium |
KR102103941B1 (en) * | 2018-11-14 | 2020-04-23 | 주식회사 모빌테크 | Road and lane data real-time update method for autonomous driving vehicles based on point cloud map |
US20200180619A1 (en) * | 2018-12-07 | 2020-06-11 | Thinkware Corporation | Method for displaying lane information and apparatus for executing the method |
CN109657632A (en) * | 2018-12-25 | 2019-04-19 | 重庆邮电大学 | A kind of lane detection recognition methods |
US20200391731A1 (en) * | 2019-06-11 | 2020-12-17 | Mando Corporation | Advanced driver assistance system, vehicle having the same, and method of controlling the vehicle |
CN110276971A (en) * | 2019-07-03 | 2019-09-24 | 广州小鹏汽车科技有限公司 | A kind of auxiliary control method of vehicle drive, system and vehicle |
US20210070303A1 (en) * | 2019-09-09 | 2021-03-11 | Honda Motor Co., Ltd. | Vehicle control device, vehicle control method, and storage medium |
WO2021208110A1 (en) * | 2020-04-18 | 2021-10-21 | 华为技术有限公司 | Method for determining lane line recognition abnormal event, and lane line recognition apparatus and system |
CN112906583A (en) * | 2021-02-25 | 2021-06-04 | 北京经纬恒润科技股份有限公司 | Lane line detection method and device |
CN114111811A (en) * | 2021-12-17 | 2022-03-01 | 奇瑞万达贵州客车股份有限公司 | Navigation control system and method for automatically driving public bus |
CN114582153A (en) * | 2022-02-25 | 2022-06-03 | 智己汽车科技有限公司 | Long solid line reminding method and system for ramp entrance and vehicle |
Non-Patent Citations (5)
Title |
---|
JIANYU YANG ET AL: "Lane Detection Based on Classification of Lane Geometrical Model", 2012 IEEE 11TH INTERNATIONAL CONFERENCE ON SIGNAL PROCESSING *
SONG RUI ET AL: "Lane detection algorithm based on geometric moment sampling", SCIENTIA SINICA INFORMATIONIS *
MAO YIFANG ET AL: "Fuzzy enhancement algorithm for surveillance images inside ship cabins", SHIP SCIENCE AND TECHNOLOGY *
HU YANPING ET AL: "A curve recognition algorithm based on chi-square statistics", CHINESE JOURNAL OF AUTOMOTIVE ENGINEERING *
GAO QI ET AL: "Real-time lane departure warning algorithm based on structured roads", COMPUTER SIMULATION *
Also Published As
Publication number | Publication date |
---|---|
CN115071733B (en) | 2022-10-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110197589B (en) | Deep learning-based red light violation detection method | |
CN105678285B (en) | A kind of adaptive road birds-eye view transform method and road track detection method | |
US8750567B2 (en) | Road structure detection and tracking | |
CN106203398B (en) | A kind of method, apparatus and equipment detecting lane boundary | |
CN107798335B (en) | Vehicle logo identification method fusing sliding window and Faster R-CNN convolutional neural network | |
CN103020623B (en) | Method for traffic sign detection and road traffic sign detection equipment | |
CN109670376B (en) | Lane line identification method and system | |
CN108447303B (en) | Peripheral visual field danger identification method based on coupling of human vision and machine vision | |
CN103116751B (en) | A kind of Method of Automatic Recognition for Character of Lcecse Plate | |
US20100110193A1 (en) | Lane recognition device, vehicle, lane recognition method, and lane recognition program | |
CN103824037B (en) | Vehicle anti-tracking alarm device | |
CN104036262B (en) | A kind of method and system of LPR car plates screening identification | |
CN107578012B (en) | Driving assistance system for selecting sensitive area based on clustering algorithm | |
CN107886034B (en) | Driving reminding method and device and vehicle | |
CN107506760A (en) | Traffic signals detection method and system based on GPS location and visual pattern processing | |
CN104899554A (en) | Vehicle ranging method based on monocular vision | |
US9355322B2 (en) | Road environment recognition device | |
CN106709412B (en) | Traffic sign detection method and device | |
CN105023452B (en) | A kind of method and device of multichannel traffic lights signal acquisition | |
CN111723625B (en) | Traffic light image recognition processing method and device, auxiliary traffic system and storage medium | |
CN109190483B (en) | Lane line detection method based on vision | |
CN105654073A (en) | Automatic speed control method based on visual detection | |
CN108446668A (en) | Traffic lights detection recognition method and system based on unmanned platform | |
JP2005316607A (en) | Image processor and image processing method | |
CN108304749A (en) | The recognition methods of road speed line, device and vehicle |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |