US12106459B2 - System for quantitively determining pavement marking quality - Google Patents
- Publication number
- US12106459B2 (application US17/547,580)
- Authority
- US
- United States
- Prior art keywords
- pavement
- color space
- markings
- image frames
- space value
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
- G06V20/582—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads of traffic signs
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/22—Matching criteria, e.g. proximity measures
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/23—Clustering techniques
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/90—Determination of colour characteristics
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/26—Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/36—Applying a local operator, i.e. means to operate on image points situated in the vicinity of a given point; Non-linear local filtering operations, e.g. median filtering
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/56—Extraction of image or video features relating to colour
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/762—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using clustering, e.g. of similar faces in social networks
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/88—Image or video recognition using optical means, e.g. reference filters, holographic masks, frequency domain filters or spatial domain filters
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/588—Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10024—Color image
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30168—Image quality inspection
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
- G06T2207/30256—Lane; Road marking
Definitions
- the present disclosure relates to a system and method for quantitatively determining pavement marking quality by determining a color distance measurement between a current marking color and an original or ideal marking color of the pavement markings, and a marking intensity contrast ratio between the pavement and the pavement markings.
- Pavement markings may be used to convey messages to roadway users. Specifically, pavement markings may indicate a particular part of the road to use, provide information about conditions ahead, and indicate where passing is allowed. For example, yellow lines separate traffic flowing in opposite directions, while white lines separate lanes for which travel is in the same direction.
- Pavement markings wear out over time, and therefore it is necessary to restripe pavement markings on a regular basis.
- a subjective visual inspection may be used to determine the aesthetic quality acceptance for road marking.
- current quality detection has a qualitative component and does not provide uniform assessment, and therefore this approach results in quality variations that are based on the perception of the specific individual performing the visual inspection.
- some government agencies and municipalities may restripe pavement markings based on a manual or pre-set schedule instead of the current quality of the pavement markings. There is presently no approach available for measuring the deterioration of paint color or visibility contrast of a pavement marking compared to the surrounding pavement material.
- a system for quantitatively determining quality for pavement markings disposed along pavement on a roadway includes one or more controllers in wireless communication with a plurality of vehicles, where the one or more controllers receive image data representing the pavement markings disposed along the pavement collected by the plurality of vehicles.
- the one or more controllers execute instructions to convert image frames based on the image data into grayscale image frames, where the grayscale image frames retain data indicating original color space values of the image frames.
- the one or more controllers execute instructions to create a grayscale filter by providing one or more color masks that isolate only the original color space values representing the pavement markings, and then combining the output of the one or more color masks together.
- the one or more controllers execute instructions to isolate, by the grayscale filter, the original color space values representing the pavement markings from the grayscale image frames to determine filtered grayscale image frames.
- the one or more controllers execute instructions to determine a mean color space value corresponding to the pavement markings and the mean color space value corresponding to the pavement based on the filtered grayscale image frames.
- the one or more controllers execute instructions to determine at least one of a color distance measurement between the mean color space value of the pavement markings and an ideal marking color space value and a marking intensity contrast ratio between the pavement markings and the pavement.
- the one or more controllers execute instructions to filter, by a Boolean mask, masked filtered grayscale image frames by assigning binary values to pixels of the filtered grayscale image frames.
- the pixels representing the pavement markings are assigned first binary values and the pixels representing the pavement are assigned second binary values.
- the one or more controllers execute instructions to determine the mean color space value for the pixels of the first binary values representing the pavement markings.
- the one or more controllers execute instructions to determine the mean color space value for the pixels of the second binary values representing the pavement, and identify a color represented by the mean color space value representing the pavement markings.
- the one or more controllers identify boundaries between the pavement markings and the pavement of the filtered grayscale image frames.
- the one or more controllers execute instructions to correct the grayscale image frames for brightness to remove discolorations in the image data representing the pavement of the roadway.
- the one or more color masks include a first color mask isolating only color space values representing a first color and a second color mask isolating only color space values representing a second color.
- the first color is yellow and the second color is white.
- the marking intensity contrast ratio is determined by
- marking intensity contrast ratio = (μ_M − μ_p) / μ_p, where μ_M is a marking intensity and μ_p is a pavement intensity.
- the one or more controllers execute instructions to create a map plotting the color distance measurement of the pavement markings for a specific geographical location, where the map provides a visual indicator of where the pavement markings require repainting.
- the one or more controllers execute instructions to determine the dominant color space values of the pavement markings and the pavement based on the original color space values of the filtered grayscale image frames.
- the one or more controllers determine a number of clusters each representing a dominant color space value for the pavement markings and the pavement.
- the one or more controllers determine a Euclidean distance between the mean color space value for each dominant color space value of the pavement markings and the ideal marking color space value.
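A minimal sketch of the clustering idea named above, assuming scalar grayscale intensities for brevity; the dominant color space values would really be multi-channel cluster centers, and the seed values and iteration count here are illustrative assumptions.

```python
def two_cluster_1d(values, iters=10):
    """Tiny 1-D two-cluster k-means sketch: the two final centers stand
    in for the dominant color space values of pavement and marking.
    Illustrative only; a real system would cluster full color vectors."""
    lo, hi = min(values), max(values)  # assumed seeding: extremes of the data
    for _ in range(iters):
        near_lo = [v for v in values if abs(v - lo) <= abs(v - hi)]
        near_hi = [v for v in values if abs(v - lo) > abs(v - hi)]
        lo = sum(near_lo) / len(near_lo)  # update each center to its cluster mean
        hi = sum(near_hi) / len(near_hi)
    return lo, hi

# assumed pixel intensities: dark pavement around 50, bright marking around 200
pavement_center, marking_center = two_cluster_1d([48, 52, 50, 198, 202, 200])
```

The Euclidean distance to the ideal marking color would then be computed per cluster center rather than per pixel.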
- the pavement markings are lane markings.
- a method for quantitatively determining quality for pavement markings disposed along pavement on a roadway includes receiving, by one or more controllers, image data representing the pavement markings disposed along the pavement collected by a plurality of vehicles.
- the method also includes converting, by the one or more controllers, image frames based on the image data into grayscale image frames, where the grayscale image frames retain data indicating original color space values of the image frames.
- the method also includes creating a grayscale filter by providing one or more color masks that isolate only the original color space values representing the pavement markings, and then combining the output of the one or more color masks together.
- the method further includes isolating, by the grayscale filter, the original color space values representing the pavement markings from the grayscale image frames to determine filtered grayscale image frames.
- the method also includes determining a mean color space value corresponding to the pavement markings and the mean color space value corresponding to the pavement based on the filtered grayscale image frames.
- the method includes determining at least one of a color distance measurement between the mean color space value of the pavement markings and an ideal marking color space value and a marking intensity contrast ratio between the pavement markings and the pavement.
- the method includes filtering, by a Boolean mask, masked filtered grayscale image frames by assigning binary values to pixels of the filtered grayscale image frames, where the pixels representing the pavement markings are assigned first binary values and the pixels representing the pavement are assigned second binary values.
- the method includes determining the mean color space value for the pixels of the first binary values representing the pavement markings.
- the method includes determining the mean color space value for the pixels of the second binary values representing the pavement, and identifying a color represented by the mean color space value representing the pavement markings.
- the method includes identifying boundaries between the pavement markings and the pavement of the filtered grayscale image frames.
- the method includes correcting the grayscale image frames for brightness to remove discolorations in the image data representing the pavement of the roadway.
- the method comprises determining the marking intensity contrast ratio by
- marking intensity contrast ratio = (μ_M − μ_p) / μ_p, where μ_M is a marking intensity and μ_p is a pavement intensity.
- the method comprises creating a map plotting the color distance measurement of the pavement markings for a specific geographical location, where the map provides a visual indicator of where the pavement markings require repainting.
- the method comprises determining dominant color space values of the pavement markings and the pavement based on the original color space values of the filtered grayscale image frames.
- the method also includes determining a number of clusters each representing a dominant color space value for the pavement markings and the pavement, and determining a Euclidean distance between the mean color space value for each dominant color space value of the pavement markings and the ideal marking color space value.
- a system for quantitatively determining quality for lane markings disposed along pavement on a roadway includes one or more controllers in wireless communication with a plurality of vehicles, where the one or more controllers receive image data representing the lane markings disposed along the pavement collected by the plurality of vehicles.
- the one or more controllers execute instructions to convert image frames based on the image data into grayscale image frames, where the grayscale image frames retain data indicating original color space values of the image frames.
- the one or more controllers create a grayscale filter by providing one or more color masks that isolate only the original color space values representing the lane markings, and then combining the output of the one or more color masks together.
- the one or more controllers isolate, by the grayscale filter, the original color space values representing the lane markings from the grayscale image frames to determine filtered grayscale image frames.
- the one or more controllers determine a mean color space value corresponding to the lane markings and the mean color space value corresponding to the pavement based on the filtered grayscale image frames.
- the one or more controllers determine at least one of a Euclidean distance between the mean color space value of the lane markings and an ideal marking color space value and a marking intensity contrast ratio between the lane markings and the pavement.
- FIG. 1 A is a schematic diagram of the disclosed system for quantitatively determining quality of pavement markings including a computing system in wireless communication with a plurality of vehicles, according to an exemplary embodiment;
- FIG. 1 B is a diagram of pavement markings disposed along pavement of a roadway, according to an exemplary embodiment;
- FIG. 2 is a diagram illustrating the computing system shown in FIG. 1 A , according to an exemplary embodiment.
- FIG. 3 is a process flow diagram illustrating a method for determining the quality of the pavement markings shown in FIG. 1 B , according to an exemplary embodiment.
- an exemplary system 10 for quantitatively determining a quality of pavement markings 12 disposed along a pavement 14 of a roadway 16 is disclosed.
- the pavement 14 represents a road surface for vehicular or foot traffic.
- the pavement markings 12 are broken or dashed white or yellow lines representing center lane markings.
- the figures are merely exemplary in nature, and the pavement markings 12 are not limited to lane markings and may include other shapes and colors as well.
- the pavement markings 12 are symbols such as, for example, a diamond indicating a lane reserved for use by high-occupancy vehicles or a bicycle indicating a lane reserved for bicyclists.
- the system 10 may also be used to determine the quality of road signs such as, for example, stop signs, exit ramp signs, and traffic signs as well.
- the system 10 includes a computing system 20 including one or more controllers 26 in wireless communication with a fleet or a plurality of vehicles 22 .
- the one or more controllers 26 receive and aggregate image data 18 collected from the plurality of vehicles 22 .
- the plurality of vehicles 22 may include any type of vehicle having wireless capabilities connected to the computing system 20 such as, but not limited to, a sedan, truck, sport utility vehicle, van, or motor home.
- Each vehicle 22 includes one or more cameras 24 for capturing the image data 18 , where the image data 18 represents the pavement markings 12 disposed along the pavement 14 .
- the computing system 20 executes image processing algorithms and quantitative techniques to determine a color distance measurement between a current marking color and an original or ideal marking color, as well as a current pavement color and an ideal pavement color.
- the ideal marking color and the ideal pavement colors are based on standards derived from various governmental agencies such as, for example, the Department of Transportation (DoT) for federal and state governments.
- the disclosure describes a Euclidean distance model as an exemplary use case for determining the color distance measurement, other color difference formulas may be used as well such as, for example, a city block model or CIELAB color space models.
- the system 20 also determines a marking intensity contrast ratio between the pavement 14 and the pavement markings 12 .
- FIG. 2 is a block diagram of the computing system 20 including the one or more controllers 26 for determining the color distance measurement between the current marking color and the ideal marking color and a marking intensity contrast ratio between the pavement 14 and the pavement markings 12 .
- the computing system 20 includes a collection module 30 , a preprocessing module 32 , a feature extraction module 34 , a filtering module 36 , a Euclidean distance module 38 , a contrast ratio module 40 , and an assessment module 42 .
- the collection module 30 of the computing system 20 receives the image data 18 from the plurality of vehicles 22 , where the image data 18 is representative of the pavement markings 12 for a specific patch of the roadway 16 . It is to be appreciated that since the image data 18 is collected from more than one vehicle 22 , multiple sets of image data 18 may exist for the specific patch of the roadway 16 .
- the image frames 44 are then sent to the preprocessing module 32 of the computing system 20 .
- the preprocessing module 32 performs one or more preprocessing techniques upon the image frames 44 to generate grayscale image frames 50 representative of the pavement markings 12 for the specific patch of the roadway 16 ( FIG. 1 B ) based on the image frames 44 .
- the preprocessing module 32 includes a grayscale block 52 and a brightness block 54 .
- the grayscale block 52 converts the image frames 44 , which are in color space values, into the grayscale image frames 50 . It is to be appreciated that although the image frames 44 are converted to grayscale, the grayscale image frames 50 still indicate original color space values. That is, the grayscale image frames 50 retain data inside each pixel indicating the original color space values of the original image frames 44 .
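The retained-color grayscale conversion described above can be sketched as follows; the Rec. 601 luma weights and the list-of-pixels frame layout are illustrative assumptions, not details specified in the patent.

```python
# Rec. 601 luma weights (an assumed choice; the patent names no formula)
WEIGHTS = (0.299, 0.587, 0.114)

def to_grayscale(frame_rgb):
    """Convert a row-major list of RGB pixels to grayscale intensities,
    returning them alongside the untouched color pixels so downstream
    steps can still read the original color space value of every pixel."""
    gray = [sum(w * c for w, c in zip(WEIGHTS, px)) for px in frame_rgb]
    return gray, frame_rgb

frame = [(255, 0, 0), (255, 255, 255)]  # one red pixel, one white pixel
gray, original = to_grayscale(frame)
```

Returning the original frame unmodified is what lets later blocks recover per-pixel original color space values from the "grayscale" representation.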
- the grayscale image frames 50 are then corrected for brightness to remove discolorations in the image data representing the pavement 14 by the brightness block 54 .
- the brightness block 54 executes a gamma correction algorithm to remove discolorations, however, it is to be appreciated that other techniques may be used as well.
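A minimal sketch of the brightness correction, assuming a simple power-law gamma curve; the patent names gamma correction but does not specify parameters, so the gamma value below is illustrative.

```python
def gamma_correct(gray, gamma=1.5):
    """Gamma-correct grayscale intensities in [0, 255]. A gamma above 1
    darkens mid-tones, which suppresses bright pavement discolorations;
    the specific value 1.5 is an assumption, not from the patent."""
    return [((v / 255.0) ** gamma) * 255.0 for v in gray]

corrected = gamma_correct([0.0, 127.5, 255.0])
```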
- the grayscale image frames 50 corrected for brightness are then sent to the feature extraction module 34 .
- the feature extraction module 34 includes a color space block 60 , a color mask block 62 , a common mask 64 , a grayscale filter 68 , and an edge detection block 70 .
- the color space block 60 of the feature extraction module 34 converts the grayscale image frames 50 into image frames expressed in color space values 72 .
- the darkened grayscale images 50 are converted into image frames where the pavement markings 12 are separated from the pavement 14 based on a common mask 64 .
- the common mask 64 is created expressing the image frames 44 in hue, saturation, and lightness (HSL) color space values, however, it is to be appreciated that other color spaces such as, but not limited to, a hue, saturation, value (HSV), a red, blue, green (RGB) color space or a Y′UV color space may be used as well to determine color difference measurement and contrast ratio.
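The conversion to HSL color space values can be illustrated with Python's standard colorsys module; the scaling of hue to degrees and saturation/lightness to percent is an assumed convention for readability.

```python
import colorsys

def rgb_to_hsl(r, g, b):
    """Convert 8-bit RGB to (hue in degrees, saturation %, lightness %).
    colorsys works on RGB normalized to [0, 1] and returns
    (hue, lightness, saturation), each also in [0, 1]."""
    h, l, s = colorsys.rgb_to_hls(r / 255.0, g / 255.0, b / 255.0)
    return h * 360.0, s * 100.0, l * 100.0

hue, sat, light = rgb_to_hsl(255, 255, 0)  # a pure-yellow marking pixel
```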
- the image frames expressed in the color space values 72 are then sent to the color mask block 62 , which includes one or more color masks to isolate the original color space values representing the pavement markings.
- the color mask block 62 includes a first color mask 62 A and a second color mask 62 B, however, more than two color masks may be used as well.
- the first color mask 62 A isolates only color space values representing a first color from the image frames expressed in the color space values 72 .
- the second color mask 62 B isolates only color space values representing a second color from the image frames expressed in the color space values 72 .
- the first color is yellow
- the second color is white
- the white color mask corresponds to the white markings
- the yellow color mask corresponds to the yellow markings.
- other colors may be used as well for the first and second colors.
- the first color mask 62 A and the second color mask 62 B are combined together by a bitwise OR operation to create the common mask 64 from the two color masks (i.e., white and yellow).
- the common mask 64 isolates only the color space values representing either the first color or the second color from the image frames expressed in the color space values 72 , where output of the common mask 64 is combined together to create the grayscale filter 68 .
- the grayscale filter 68 isolates the original color space values representing the first color and the second color from the grayscale image frames 50 , thereby functioning as a bitwise AND mask.
- the grayscale filter 68 is created by first providing one or more color masks that isolate only the original color space values representing the pavement markings 12 (in the present example yellow and white), and then combining the output of the one or more color masks together (in the present example the output is all yellow or white pavement colors).
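A toy sketch of the two color masks and their bitwise OR combination into the common mask; the hue and lightness thresholds are invented for illustration and would in practice come from marking color standards, not these assumed values.

```python
def yellow_mask(hue, light):
    """First color mask: assumed yellow band in HSL terms."""
    return 20.0 <= hue <= 40.0 and light >= 30.0

def white_mask(hue, light):
    """Second color mask: assumed high-lightness band for white."""
    return light >= 90.0

# per-pixel (hue in degrees, lightness in percent) -- invented toy data
pixels = [(30.0, 60.0), (200.0, 40.0), (30.0, 10.0), (0.0, 95.0)]

# bitwise OR of the two color masks yields the common mask
common_mask = [yellow_mask(h, l) or white_mask(h, l) for h, l in pixels]
```

Applying this common mask back to the grayscale frame (a bitwise AND with the pixel data) is what the text calls the grayscale filter.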
- the grayscale filter 68 receives the grayscale image frames 50 representative of the pavement markings 12 for the specific patch of the roadway 16 from the preprocessing module 32 and isolates the original color space values representing the first color and the second color from the grayscale image frames 50 to determine filtered grayscale image frames 80 , where the pavement markings 12 are separately visible compared to the pavement 14 .
- the grayscale filter 68 isolates the original color space values from the grayscale image frames 50 representing yellow and white color values, which are common colors chosen for pavement markings.
- the filtered grayscale image frames 80 are then sent to the edge detection block 70 .
- the filtered grayscale image frames 80 may be filtered first using any type of image noise reduction technique such as, for example, a Gaussian blur before being sent to the edge detection block 70 .
- Gaussian blur noise reduction techniques have been used to reduce image noise, in the present case by reducing specks that may be visible in the pavement segment of the image frames.
- the edge detection block 70 identifies a boundary between the pavement markings 12 and the pavement 14 of the filtered grayscale image frames 80 .
- the edge detection algorithm is the Canny edge detector, however, it is to be appreciated that other algorithms may be used as well.
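A much-simplified, assumed stand-in for the boundary detection step: the patent names the Canny detector, which this one-dimensional gradient threshold only gestures at (real Canny adds Gaussian smoothing, non-maximum suppression, and hysteresis thresholding).

```python
def boundary_mask(row, threshold=50.0):
    """Flag positions whose horizontal intensity jump exceeds a
    threshold. The threshold of 50 is an illustrative assumption."""
    return [abs(b - a) > threshold for a, b in zip(row, row[1:])]

# a bright marking (200) against dark pavement (50) in one image row
edges = boundary_mask([50.0, 50.0, 200.0, 200.0, 50.0])
```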
- the filtered grayscale image frames 80 are then sent to the filtering module 36 .
- the filtering module 36 includes a Boolean mask 82 , a first filter 84 , and a second filter 86 .
- the filtering module 36 determines mean color space values 88 , 90 corresponding to the pavement markings 12 and the pavement 14 respectively based on the filtered grayscale image frames 80 received from the feature extraction module 34 .
- the Boolean mask 82 determines masked filtered grayscale image frames 92 by assigning binary values to pixels of the filtered grayscale image frames 80 , where the pixels representing the pavement markings 12 are assigned a first binary value, and the pixels representing the pavement 14 are assigned a second binary value. For example, in an embodiment, the pixels representing the pavement markings 12 are assigned a 1, while the pixels representing the pavement 14 are assigned 0. Of course, this embodiment may be reversed so that the pavement markings 12 are assigned 0 and the pavement 14 is assigned a 1.
- the first filter 84 determines the mean color space value 88 for the pavement markings 12 and a mean color space value 90 for the pavement 14 based on the masked filtered grayscale image frames 92 . Specifically, the first filter 84 determines the mean color space value 88 corresponding to the pixels for all the first binary values representing the pavement markings 12 . The first filter 84 then determines the mean color space value 90 corresponding to the pixels for all the second binary values representing the pavement 14 . The mean color space values 88 , 90 are then sent to the second filter 86 . The second filter 86 identifies a color represented by mean color space value 88 representing the pavement markings 12 . For example, if the pavement markings 12 are yellow, then the second filter 86 determines the color represented by the mean color space value representing the pavement markings 12 is yellow.
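The Boolean-mask filtering and mean computation above can be sketched as follows, with invented toy intensities standing in for the masked filtered grayscale image frames.

```python
# Assumed toy data: per-pixel intensities plus the Boolean mask
# produced upstream (True = pavement marking pixel, False = pavement)
intensity = [200.0, 210.0, 60.0, 50.0]
marking_mask = [True, True, False, False]

marking_px = [v for v, m in zip(intensity, marking_mask) if m]
pavement_px = [v for v, m in zip(intensity, marking_mask) if not m]

mean_marking = sum(marking_px) / len(marking_px)     # mean value, markings
mean_pavement = sum(pavement_px) / len(pavement_px)  # mean value, pavement
```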
- the mean color space values 88 , 90 and the color representing the mean color space value 88 representing the pavement markings 12 are then sent to both the Euclidean distance module 38 and the contrast ratio module 40 .
- the Euclidean distance module 38 determines the Euclidean distance between the mean color space value 88 and an ideal marking color space value, where LM denotes the mean color space value 88 of the pavement markings and I denotes the ideal marking color space value. It is to be appreciated that for calculating Euclidean distance and color space values, HSL, RGB, and Y′UV color spaces may be used.
- Euclidean Distance_RGB = √((LM_R − I_R)² + (LM_G − I_G)² + (LM_B − I_B)²) Equation 1
- Euclidean Distance_HSV/HSL = √((LM_H − I_H)² + (LM_S − I_S)² + (LM_V/L − I_V/L)²) Equation 2
- Euclidean Distance_YUV = √((LM_Y − I_Y)² + (LM_U − I_U)² + (LM_V − I_V)²) Equation 3
- Equation 1 is used if the mean color space value 88 is expressed in RGB color space
- Equation 2 is used if the mean color space value 88 is expressed as HSL or HSV color space values
- Equation 3 is used if the mean color space value 88 is expressed in Y′UV color space values.
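The color distance of Equations 1-3 reduces to a single Euclidean distance over three channels, whatever the color space; the RGB values below are assumed examples, not DoT ideal colors.

```python
import math

def color_distance(mean_color, ideal_color):
    """Euclidean color distance between a measured mean color and the
    ideal marking color; works for any 3-channel space (RGB, HSL/HSV,
    or Y'UV), matching Equations 1-3."""
    return math.sqrt(sum((m - i) ** 2 for m, i in zip(mean_color, ideal_color)))

ideal_yellow = (255, 215, 0)  # an assumed ideal RGB marking color
faded = color_distance((200, 180, 60), ideal_yellow)   # worn marking
fresh = color_distance((250, 210, 5), ideal_yellow)    # near-ideal marking
```

A larger distance indicates greater color deterioration of the marking.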
- the contrast ratio module 40 determines the marking intensity contrast ratio between the pavement 14 and the pavement markings 12 based on the mean color space value 88 for the pavement markings 12 and the mean color space value 90 for the pavement 14 , where the mean color space values 88 , 90 represent the mean intensities of the pavement markings 12 and the pavement 14 , respectively.
- the marking intensity contrast ratio is a measure of a difference between an intensity of the pavement markings 12 and an intensity of a surrounding background area, such as the pavement 14 in the present case. Therefore, a higher value marking intensity contrast ratio indicates the pavement markings 12 are easily perceived by individuals and autonomous vehicles.
- the marking intensity contrast ratio is determined based on a pavement marking intensity value and a pavement intensity value, and specifically by Equation 4, which is:
- marking intensity contrast ratio = (μ_M − μ_p) / μ_p Equation 4, where μ_M is the marking intensity and μ_p is the pavement intensity.
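Equation 4 in code, with assumed intensity values for a bright and a worn marking:

```python
def marking_intensity_contrast_ratio(mu_m, mu_p):
    """Equation 4: (marking intensity - pavement intensity) / pavement
    intensity. Higher values mean more easily perceived markings."""
    return (mu_m - mu_p) / mu_p

fresh = marking_intensity_contrast_ratio(200.0, 50.0)  # bright new marking
worn = marking_intensity_contrast_ratio(80.0, 50.0)    # faded marking
```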
- FIG. 3 is a process flow diagram illustrating an exemplary method 200 for determining a quality of pavement markings 12 disposed along a pavement 14 of a roadway 16 ( FIG. 1 B ). Specifically, the method 200 determines the color distance measurement between the current marking color and the ideal marking color and the marking intensity contrast ratio between the pavement 14 and the pavement markings 12 . Referring generally to FIGS. 1 A, 1 B, 2 , and 3 , the method 200 begins at block 202 . In block 202 , the preprocessing module 32 of the one or more controllers 26 converts the image frames 44 into the grayscale image frames 50 . As mentioned above, the grayscale image frames 50 retain the data indicating the original color space values of the original image frames 44 .
- the grayscale block 52 of the preprocessing module 32 converts the image frames 44 , which are in color space values, into the grayscale image frames 50 , and the brightness block 54 of the preprocessing module 32 then corrects the grayscale image frames 50 for brightness to remove discolorations in the image data representing the pavement 14 .
- the method 200 may then proceed to block 204 .
- the color space block 60 of the feature extraction module 34 converts the grayscale image frames 50 into image frames expressed in color space values 72 .
- the method 200 may then proceed to block 206 .
- the feature extraction module 34 of the one or more controllers 26 creates the grayscale filter 68 by first providing one or more color masks that isolate only the original color space values representing the pavement markings 12 , and then combining the output of the one or more color masks together.
- the first color mask 62 A isolates yellow and the second color mask 62 B isolates white, which are common colors for pavement markings.
- the method 200 may then proceed to block 208 .
- the grayscale filter 68 isolates the original color space values representing the pavement markings 12 from the darkened grayscale image frames 50 to determine filtered grayscale image frames 80 .
- the grayscale filter 68 isolates the original color space values representing yellow and white color values, which are common colors chosen for pavement markings. The method 200 may then proceed to block 210.
- the edge detection block 70 of the feature extraction module 34 identifies boundaries between the pavement markings 12 and the pavement 14 of the filtered grayscale image frames 80 .
- the method 200 may then proceed to block 212 .
- the filtering module 36 determines the mean color space value 88 corresponding to the pavement markings 12 and the mean color space value 90 corresponding to the pavement 14 based on the filtered grayscale image frames 80 .
- the Boolean mask of the filtering module 36 determines the masked filtered grayscale image frames 92 by assigning binary values to the pixels of the filtered grayscale image frames 80 .
- the first filter 84 determines the mean color space value 88 corresponding to the pixels for all binary values representing the pavement markings 12 and the mean color space value 90 corresponding to the pixels for all binary values representing the pavement 14 .
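The first filter's computation of the two mean color space values can be sketched with Boolean-mask indexing, assuming a grayscale image and a Boolean mask where True marks marking pixels and False marks pavement pixels:

```python
import numpy as np

def mean_intensities(gray: np.ndarray, marking_mask: np.ndarray):
    """Return the mean value over marking pixels (True in the Boolean
    mask) and over pavement pixels (False), a sketch of the first
    filter's mean color space values."""
    mu_m = float(gray[marking_mask].mean())
    mu_p = float(gray[~marking_mask].mean())
    return mu_m, mu_p
```

These two means are exactly the μM and μp fed into Equation 4.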
- the second filter 86 identifies the color represented by mean color space value representing the pavement markings 12 .
- the method 200 may then proceed to blocks 214 A and 214 B.
- the Euclidean distance module 38 determines the color distance measurement between the mean color space value 88 and the ideal marking color space value. Equations 1-3 as described above may be used to determine the Euclidean distance.
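Equations 1-3 share the same component-wise Euclidean form and differ only in which color space the three components live in, so a single function covers all three cases. The function name is an illustrative choice:

```python
import math

def color_distance(measured, ideal):
    """Component-wise Euclidean distance between a measured color
    space value and the ideal marking color space value (the common
    form of Equations 1-3, whether the tuples are RGB, HSV/HSL,
    or Y'UV components)."""
    return math.sqrt(sum((m - i) ** 2 for m, i in zip(measured, ideal)))
```

A distance of 0 indicates the mean marking color matches the ideal exactly; larger distances indicate more fading or discoloration.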
- the contrast ratio module 40 determines the marking intensity contrast ratio, which is described above in Equation 4. The method 200 may proceed to block 216 .
- the one or more controllers 26 create the map 98 plotting the Euclidean distance and marking intensity contrast ratio of the pavement markings 12 for a specific geographical location, where the map 98 provides a visual indicator of pavement marking quality and visibility.
- the map 98 may indicate where the pavement markings 12 for a specific section or length of roadway need to be restriped or repainted to improve pavement marking quality and/or visibility. The method 200 may then terminate.
- the system 20 further includes a dominant color module 100 that determines dominant color space values for both the pavement markings 12 and the pavement 14 .
- the dominant color module 100 includes a clustering block 102 , a histogram block 104 , and a plotting block 106 .
- the histogram block 104 may then create a histogram illustrating the distribution of the N number of clusters 180 that each represent a dominant color for either the pavement markings 12 or the pavement 14 .
- the plotting block 106 may then create a color plot illustrating all of the dominant colors in the image frames 44 . All the original color space values (i.e., in RGB, HSV, or Y′UV color space) of the dominant colors of the pavement markings 12 and the pavement 14 may be presented as well.
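The clustering block's extraction of N dominant colors can be sketched with a minimal k-means over the pixel colors. The patent does not name its clustering algorithm, so k-means is an assumption here, and the deterministic initialization from the first unique colors is a simplification for illustration:

```python
import numpy as np

def dominant_colors(pixels: np.ndarray, n_clusters: int = 3,
                    iters: int = 20) -> np.ndarray:
    """Minimal k-means over an (N, 3) array of color values; returns
    up to n_clusters centers, each a dominant color space value."""
    pixels = pixels.astype(float)
    # deterministic init: first unique colors (a sketch choice)
    centers = np.unique(pixels, axis=0)[:n_clusters].copy()
    for _ in range(iters):
        # assign each pixel to its nearest center
        dists = np.linalg.norm(pixels[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # move each center to the mean of its cluster (skip empty clusters)
        for k in range(len(centers)):
            members = pixels[labels == k]
            if len(members):
                centers[k] = members.mean(axis=0)
    return centers
```

The histogram block would then count the pixels assigned to each center, and the plotting block would render the centers as swatches.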
- the N number of clusters 180 each representing a dominant color space value are then sent to a Euclidean distance module 138 that determines the Euclidean distance between each dominant color space value of the pavement markings 12 and the ideal marking color space value.
- the Euclidean distance is determined based on Equations 1-3.
- the Euclidean distance between the dominant color space value of the pavement markings 12 and the ideal marking color space value is sent to the assessment module 42 .
- the assessment module 42 creates the map 98 plotting the Euclidean distance of the pavement markings 12 for a specific geographical location.
- the disclosed system provides various technical effects and benefits. Specifically, the disclosed system provides an approach for quantitatively determining pavement marking quality that is devoid of human perception, which in turn may lead to uniform results that do not depend on variations in color perception between different individuals. The disclosed approach allows various municipalities and government agencies to require repainting based on specific quality standards, which in turn may lower repainting costs. Finally, the disclosed approach enables exploration of the variations in color perceived in pavement markings (e.g., the dominant color space approach), which was not previously possible.
- the controllers may refer to, or be part of, an electronic circuit, a combinational logic circuit, a field programmable gate array (FPGA), a processor (shared, dedicated, or group) that executes code, or a combination of some or all of the above, such as in a system-on-chip.
- the controllers may be microprocessor-based, such as a computer having at least one processor, memory (RAM and/or ROM), and associated input and output buses.
- the processor may operate under the control of an operating system that resides in memory.
- the operating system may manage computer resources so that computer program code embodied as one or more computer software applications, such as an application residing in memory, may have instructions executed by the processor.
- the processor may execute the application directly, in which case the operating system may be omitted.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Data Mining & Analysis (AREA)
- Artificial Intelligence (AREA)
- Evolutionary Computation (AREA)
- Software Systems (AREA)
- General Health & Medical Sciences (AREA)
- Medical Informatics (AREA)
- Databases & Information Systems (AREA)
- Health & Medical Sciences (AREA)
- Computing Systems (AREA)
- Life Sciences & Earth Sciences (AREA)
- General Engineering & Computer Science (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Bioinformatics & Computational Biology (AREA)
- Evolutionary Biology (AREA)
- Quality & Reliability (AREA)
- Nonlinear Science (AREA)
- Image Processing (AREA)
- Image Analysis (AREA)
Abstract
Description
where μM is a marking intensity and μp is a pavement intensity.
Euclidean Distance_RGB = √((LM_R − I_R)² + (LM_G − I_G)² + (LM_B − I_B)²)   (Equation 1)
Euclidean Distance_HSV/HSL = √((LM_H − I_H)² + (LM_S − I_S)² + (LM_V/L − I_V/L)²)   (Equation 2)
Euclidean Distance_Y′UV = √((LM_Y − I_Y)² + (LM_U − I_U)² + (LM_V − I_V)²)   (Equation 3)
where Equation 1 is used if the mean color space value 88 is expressed in RGB color space, Equation 2 is used if it is expressed in HSL or HSV color space, and Equation 3 is used if it is expressed in Y′UV color space. LM represents the mean color space value 88 of the pavement marking 12, I represents the ideal marking color, R, G, and B represent values for red, green, and blue, H represents hue, S represents saturation, V represents value, L represents lightness, Y is a luma component, and U and V are chrominance components.
Claims (20)
Priority Applications (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US17/547,580 US12106459B2 (en) | 2021-12-10 | 2021-12-10 | System for quantitively determining pavement marking quality |
| DE102022124853.4A DE102022124853A1 (en) | 2021-12-10 | 2022-09-27 | SYSTEM FOR DETERMINING ROAD MARKING QUALITY QUANTITATIVELY |
| CN202211309299.XA CN116259031A (en) | 2021-12-10 | 2022-10-25 | System for quantitative determination of pavement marking quality |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US17/547,580 US12106459B2 (en) | 2021-12-10 | 2021-12-10 | System for quantitively determining pavement marking quality |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| US20230186450A1 US20230186450A1 (en) | 2023-06-15 |
| US12106459B2 true US12106459B2 (en) | 2024-10-01 |
Family
ID=86498229
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US17/547,580 Active 2043-03-11 US12106459B2 (en) | 2021-12-10 | 2021-12-10 | System for quantitively determining pavement marking quality |
Country Status (3)
| Country | Link |
|---|---|
| US (1) | US12106459B2 (en) |
| CN (1) | CN116259031A (en) |
| DE (1) | DE102022124853A1 (en) |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20130190981A1 (en) * | 2012-01-17 | 2013-07-25 | LimnTech LLC | Roadway mark data acquisition and analysis apparatus, systems, and methods |
| US20200294221A1 (en) * | 2019-03-15 | 2020-09-17 | Toyota Jidosha Kabushiki Kaisha | System and method for specifying lane marking deterioration |
| US20210256311A1 (en) * | 2020-02-14 | 2021-08-19 | Samsung Electronics Co., Ltd. | In-storage-based data processing using machine learning |
Family Cites Families (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN105718860B (en) * | 2016-01-15 | 2019-09-10 | 武汉光庭科技有限公司 | Localization method and system based on driving safety map and binocular Traffic Sign Recognition |
-
2021
- 2021-12-10 US US17/547,580 patent/US12106459B2/en active Active
-
2022
- 2022-09-27 DE DE102022124853.4A patent/DE102022124853A1/en active Pending
- 2022-10-25 CN CN202211309299.XA patent/CN116259031A/en active Pending
Patent Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20130190981A1 (en) * | 2012-01-17 | 2013-07-25 | LimnTech LLC | Roadway mark data acquisition and analysis apparatus, systems, and methods |
| US20200294221A1 (en) * | 2019-03-15 | 2020-09-17 | Toyota Jidosha Kabushiki Kaisha | System and method for specifying lane marking deterioration |
| US20210256311A1 (en) * | 2020-02-14 | 2021-08-19 | Samsung Electronics Co., Ltd. | In-storage-based data processing using machine learning |
Non-Patent Citations (4)
| Title |
|---|
| B. Li, D. Song, H. Li, A. Pike and P. Carlson, "Lane Marking Quality Assessment for Autonomous Driving," 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 2018, pp. 1-9, doi: 10.1109/IROS.2018.8593855. (Year: 2018). * |
| H. H. Handayani and D. Wahiddin, "Digital Image Analysis of Beef Color Using Euclidean Distance Method," 2018 Third International Conference on Informatics and Computing (ICIC), Palembang, Indonesia, 2018, pp. 1-5, doi: 10.1109/IAC.2018.8780457. (Year: 2018). * |
| S. Yelmanov and Y. Romanyshyn, "Quantifying the contrast of objects in a complex image," 2020 IEEE 40th International Conference on Electronics and Nanotechnology (ELNANO), Kyiv, Ukraine, 2020, pp. 541-546, doi: 10.1109/ELNANO50318.2020.9088760. (Year: 2020). * |
| Z. Wang, Y. Fan and H. Zhang, "Lane-line Detection Algorithm for Complex Road Based on OpenCV," 2019 IEEE 3rd Advanced Information Management, Communicates, Electronic and Automation Control Conference (IMCEC), Chongqing, China, 2019, pp. 1404-1407, doi: 10.1109/IMCEC46724.2019.8983919. (Year: 2019). * |
Also Published As
| Publication number | Publication date |
|---|---|
| CN116259031A (en) | 2023-06-13 |
| DE102022124853A1 (en) | 2023-06-15 |
| US20230186450A1 (en) | 2023-06-15 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US10635929B2 (en) | Saliency-based method for extracting road target from night vision infrared image | |
| TWI423166B (en) | Method for determining if an input image is a foggy image, method for determining a foggy level of an input image and cleaning method for foggy images | |
| US9396548B2 (en) | Multi-cue object detection and analysis | |
| CN106651872A (en) | Prewitt operator-based pavement crack recognition method and system | |
| CN109190523B (en) | Vehicle detection tracking early warning method based on vision | |
| CN103425989B (en) | Vehicle color identification method and system based on significance analysis | |
| CN104732227A (en) | Rapid license-plate positioning method based on definition and luminance evaluation | |
| EP2760207A1 (en) | Image processing device | |
| CN106023623A (en) | Recognition and early warning method of vehicle-borne traffic signal and symbol based on machine vision | |
| CN103345766A (en) | Method and device for identifying signal light | |
| CN110443166A (en) | A kind of licence plate recognition method of haze weather | |
| US20140125794A1 (en) | Vehicle environment monitoring device | |
| CN104766286A (en) | Image defogging device and method based on pilotless automobile | |
| US10179487B1 (en) | Method for generating images of predicted tire tread wear | |
| CN105488797A (en) | License plate location method for HSV space | |
| CN112419745A (en) | Highway group fog early warning system based on degree of depth fusion network | |
| CN115601717A (en) | Deep learning-based traffic violation classification detection method and SoC chip | |
| CN106803073B (en) | Auxiliary driving system and method based on stereoscopic vision target | |
| CN114764853A (en) | Method and device for determining occupation information | |
| KR101651061B1 (en) | Method and device for lane detection | |
| US12106459B2 (en) | System for quantitively determining pavement marking quality | |
| CN108288388A (en) | A kind of intelligent traffic monitoring system | |
| CN117456483B (en) | Intelligent traffic driving safety warning method and device based on image processing | |
| US10417518B2 (en) | Vehicle camera system | |
| CN109330833B (en) | Intelligent sensing system and method for assisting visually impaired patients to safely go out |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
| AS | Assignment |
Owner name: GM GLOBAL TECHNOLOGY OPERATIONS LLC, MICHIGAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:AHMED, FAHIM;GORDON, RICHARD;GRIMM, DONALD K.;SIGNING DATES FROM 20211206 TO 20211209;REEL/FRAME:058399/0641 |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
| ZAAB | Notice of allowance mailed |
Free format text: ORIGINAL CODE: MN/=. |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
|
| STCF | Information on status: patent grant |
Free format text: PATENTED CASE |