CN112417993A - Parking space line detection method for parking area and computer equipment

Info

Publication number: CN112417993A (granted as CN112417993B)
Application number: CN202011203274.2A
Authority: CN (China)
Prior art keywords: pixel, image, offset, channel image, determining
Legal status: Granted; Active
Other languages: Chinese (zh)
Inventor: 杨威
Current Assignee: Ecarx Hubei Tech Co Ltd
Original Assignee: Hubei Ecarx Technology Co Ltd
Application filed by Hubei Ecarx Technology Co Ltd; priority to CN202011203274.2A
Related application: PCT/CN2021/114814 (WO2022088900A1)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/56 Extraction of image or video features relating to colour

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The application relates to a parking space line detection method and computer equipment for a parking area. The method comprises the following steps: obtaining an RGB image of the parking area and obtaining an R channel image, a G channel image and a B channel image of the parking area from it; determining a blue region feature and a yellow region feature according to the R channel image, the G channel image and the B channel image, and thereby determining a color space feature map of the parking area; determining a first grayscale map with the color features of the parking area according to the color space feature map and the RGB image; determining a binarized image with the color features of the parking area according to the first grayscale map; detecting straight lines in the binarized image; and determining the parking space lines in the parking area according to the straight lines.

Description

Parking space line detection method for parking area and computer equipment
Technical Field
The application relates to the technical field of automatic parking, in particular to a parking space line detection method and computer equipment for a parking area.
Background
An automatic parking system (APS) is a comprehensive system integrating environment sensing, decision and planning, and intelligent control and execution, and is an important component of intelligent driving assistance systems. In recent years, with the rapid growth in demand for parking assistance systems, various methods for detecting parking space lines have been proposed.
In the related art, most parking space lines in actual scenes are white, while a certain number are blue or yellow. A typical extraction method converts the color image to a grayscale image, binarizes the grayscale image, and extracts the lines, for example through a Hough transform. However, in practice the pixel values corresponding to different color regions in the grayscale image differ: the pixel value of a white parking space line region is greater than the pixel values of other color regions, the pixel value of a blue parking space line region is lower than the pixel values of other color regions, and the pixel value presented by a white parking space line region in the grayscale image fluctuates unpredictably under different illumination conditions, so parking space line detection is inaccurate. In the related art, it is also possible to extract the yellow, blue, and white components by setting different thresholds for the different colors in the RGB image, but this requires either knowing the colors of the parking space lines in advance or spending several times the computation time to extract all the color components.
Disclosure of Invention
The embodiments of the application provide a parking space line detection method and computer equipment for a parking area, to solve the problem in the prior art that parking space line detection is inaccurate when parking space lines are extracted in actual scenes.
In a first aspect, an embodiment of the present application provides a parking space line detection method for a parking area, where the method includes:
acquiring an RGB image of a parking area, and acquiring an R channel image, a G channel image and a B channel image of the parking area according to the RGB image;
determining blue region characteristics according to the pixel value offset of the B channel image to the R channel image and the G channel image;
determining yellow region characteristics according to the pixel value offsets of the G channel image and the R channel image to the B channel image respectively;
superposing the blue area characteristic and the yellow area characteristic to determine a color space characteristic diagram of the parking area;
determining a first gray-scale map with color features of the parking area according to the color space feature map and the RGB image, and determining a binary image with color features of the parking area according to the first gray-scale map;
and determining a parking space line of the parking area in the binary image with the color characteristics of the parking area.
In some of these embodiments, determining blue region features from pixel value offsets of the B channel image to the R channel image and to the G channel image comprises:
determining a first forward pixel value offset of the B channel image to the R channel image according to the B channel image and the R channel image;
determining a second forward pixel value offset of the B channel image to the G channel image according to the B channel image and the G channel image;
and superposing the first forward pixel value offset and the second forward pixel value offset to determine the blue region characteristic.
In some embodiments, determining the yellow region feature according to the pixel value offset of the G channel image and the R channel image to the B channel image respectively comprises:
determining a third forward pixel value offset of the G channel image to the B channel image according to the G channel image and the B channel image;
determining a fourth forward pixel value offset of the R channel image to the B channel image according to the R channel image and the B channel image;
and superposing the third forward pixel value offset and the fourth forward pixel value offset to determine the yellow region characteristic.
In some embodiments, the determining a first grayscale map with color features of the parking area according to the color space feature map and the RGB image, and determining a binarized image with color features of the parking area according to the first grayscale map includes:
carrying out gray level conversion on the RGB image to determine a gray level image of the RGB image;
carrying out binarization processing on the color space characteristic diagram, and determining a binarization image of the color space characteristic diagram;
superposing the binarized image of the color space feature map and the gray scale map of the RGB image to determine the first gray scale map;
and carrying out binarization processing on the first gray-scale image, and determining a binarization image with color characteristics of the parking area.
In some embodiments, the binarizing the color space feature map and/or the binarizing the first grayscale map includes:
taking the color space feature map and/or the first gray scale map as an image to be processed; the image to be processed comprises a plurality of pixel rows and a plurality of pixel columns;
acquiring first binarization results of pixel points in all the pixel rows and second binarization results of pixel points in all the pixel columns;
and overlapping the first binarization result and the second binarization result to determine a binarization image of the image to be processed.
In some embodiments, obtaining the first binarization result of the pixels in all the pixel rows includes:
for each pixel row in the image to be processed, acquiring a horizontal pixel value offset of each pixel point in the pixel row and acquiring a horizontal pixel deviation threshold of the pixel row,
determining a first binarization result of the pixel point according to the horizontal pixel value offset of the pixel point and the horizontal pixel deviation threshold of the pixel line;
obtaining second binarization results of pixel points in all the pixel columns comprises:
for each pixel column in an image to be processed, acquiring a vertical pixel value offset of each pixel point in the pixel column and acquiring a vertical pixel deviation threshold of the pixel column;
and determining a second binarization result of the pixel point according to the vertical pixel value offset of the pixel point and the vertical pixel deviation threshold of the pixel column.
In some of these embodiments, obtaining the horizontal pixel deviation threshold for the row of pixels comprises:
acquiring the horizontal pixel value offset of all pixel points in the pixel row;
determining a horizontal pixel offset mean value of the pixel row according to the horizontal pixel value offsets of all pixel points in the pixel row;
if the horizontal pixel offset mean value is larger than a first preset threshold value, determining that the horizontal pixel offset mean value is the horizontal pixel deviation threshold value of the pixel row, and if the horizontal pixel offset mean value is smaller than the first preset threshold value, determining that the first preset threshold value is the horizontal pixel deviation threshold value of the pixel row;
acquiring a vertical pixel deviation threshold for the pixel column comprises:
acquiring the vertical pixel value offset of all pixel points in the pixel column;
determining a vertical pixel offset mean value of the pixel column according to the vertical pixel value offsets of all pixel points in the pixel column;
if the vertical pixel offset mean value is larger than a second preset threshold value, determining that the vertical pixel offset mean value is the vertical pixel deviation threshold value of the pixel column, and if the vertical pixel offset mean value is smaller than the second preset threshold value, determining that the second preset threshold value is the vertical pixel deviation threshold value of the pixel column.
In some embodiments, obtaining the horizontal pixel value offset of each of the pixels in the pixel row comprises:
calculating a first horizontal offset difference between the pixel point and a first horizontal neighboring pixel point, and calculating a second horizontal offset difference between the pixel point and a second horizontal neighboring pixel point,
the first horizontal neighboring pixel point is a pixel point separated from the pixel point by a first preset distance in a first horizontal direction, the second horizontal neighboring pixel point is a pixel point separated from the pixel point by the first preset distance in a second horizontal direction, and the first horizontal direction is opposite to the second horizontal direction;
obtaining the vertical pixel value offset of each pixel point in the pixel column comprises:
calculating a first vertical offset difference between the pixel point and a first vertical neighboring pixel point, and calculating a second vertical offset difference between the pixel point and a second vertical neighboring pixel point,
the first vertical neighboring pixel point is a pixel point separated from the pixel point by a second preset distance in a first vertical direction, the second vertical neighboring pixel point is a pixel point separated from the pixel point by the second preset distance in a second vertical direction, and the first vertical direction is opposite to the second vertical direction.
In a second aspect, an embodiment of the present application provides a computer device, which includes a memory, a processor, and a computer program stored on the memory and executable on the processor, where the processor, when executing the computer program, implements the parking space line detection method for a parking area as described in the first aspect.
In a third aspect, an embodiment of the present application provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the parking space line detection method for a parking area according to the first aspect.
Compared with the prior art, the parking space line detection method for a parking area provided by the application acquires an RGB image of the parking area and obtains the R channel image, the G channel image and the B channel image of the parking area from the RGB image; determines the blue region feature according to the pixel value offsets of the B channel image to the R channel image and to the G channel image; determines the yellow region feature according to the pixel value offsets of the G channel image and the R channel image to the B channel image respectively; superposes the blue region feature and the yellow region feature to determine a color space feature map of the parking area; determines a first grayscale map with the color features of the parking area according to the color space feature map and the RGB image, and determines a binarized image with the color features of the parking area according to the first grayscale map; and determines the parking space lines of the parking area in the binarized image with the color features of the parking area. This solves the problem of inaccurate parking space line detection in the prior art when parking space lines are extracted in actual scenes, and improves the detection accuracy of parking space lines.
The details of one or more embodiments of the application are set forth in the accompanying drawings and the description below to provide a more thorough understanding of the application.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
fig. 1 is a first flowchart of a parking space line detection method for a parking area according to an embodiment of the present application;
FIG. 2 is a flow chart of a method of determining blue region characteristics according to an embodiment of the present application;
FIG. 3 is a flow chart of a method of determining a yellow region characteristic according to an embodiment of the present application;
FIG. 4 is a flow chart of a method of determining a binarized image with color features for a parking area according to an embodiment of the present application;
fig. 5 is a flowchart of a method of binarization processing according to an embodiment of the present application;
FIG. 6 is a first flowchart of a method for obtaining first binarization results of pixels in all pixel rows according to an embodiment of the present application;
FIG. 7 is a second flowchart of a method for obtaining first binarization results of pixel points in all pixel rows according to an embodiment of the present application;
fig. 8 is a first flowchart of a method for obtaining second binarization results of pixel points in all pixel columns according to an embodiment of the present application;
fig. 9 is a second flowchart of a method for obtaining second binarization results of pixel points in all pixel columns according to an embodiment of the present application;
fig. 10 is a schematic diagram of an internal structure of a computer device according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application will be described and illustrated below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments provided in the present application without any inventive step are within the scope of protection of the present application.
It is obvious that the drawings in the following description are only examples or embodiments of the present application, and that it is also possible for a person skilled in the art to apply the present application to other similar contexts on the basis of these drawings without inventive effort. Moreover, it should be appreciated that in the development of any such actual implementation, as in any engineering or design project, numerous implementation-specific decisions must be made to achieve the developers' specific goals, such as compliance with system-related and business-related constraints, which may vary from one implementation to another.
Reference in the specification to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the specification. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of ordinary skill in the art will explicitly and implicitly appreciate that the embodiments described herein may be combined with other embodiments without conflict.
Unless defined otherwise, technical or scientific terms used herein shall have the ordinary meaning understood by those of ordinary skill in the art to which this application belongs. References to "a," "an," "the," and similar words in this application do not denote a limitation of quantity and may refer to the singular or the plural. The terms "including," "comprising," "having," and any variations thereof used in this application are intended to cover non-exclusive inclusion; for example, a process, method, system, product, or device that comprises a list of steps or modules (units) is not limited to the listed steps or units, but may include other steps or units not expressly listed or inherent to such process, method, product, or device. References to "connected," "coupled," and the like in this application are not limited to physical or mechanical connections, but may include electrical connections, whether direct or indirect. The term "plurality" as used herein means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean: A exists alone, A and B exist simultaneously, or B exists alone. The character "/" generally indicates that the associated objects before and after it are in an "or" relationship. The terms "first," "second," "third," and the like used herein merely distinguish similar objects and do not denote a particular ordering of the objects.
Parking space line detection can be applied to automatic parking. Parking space lines in actual scenes are mostly white, and a certain number are blue or yellow. A rough method for extracting parking space lines is as follows: the image is preprocessed by graying the color image to obtain a grayscale image, the grayscale image is binarized to obtain a binary image, and finally the parking space lines are extracted through a Hough transform. However, in practice the pixel values corresponding to different color regions in the grayscale image differ: the pixel value of a white parking space line region is greater than that of other color regions, the pixel value of a blue parking space line region is lower than that of other color regions, and the pixel value presented by a white parking space line region in the grayscale image fluctuates unpredictably under different illumination conditions. Therefore, different thresholds are usually set on the three channels to extract the yellow, blue, and white components. This way of extracting parking space lines on separate color components has shortcomings: 1. the color of the parking space lines must be known in advance, or the extraction must be performed for all color components, which requires prior condition input or several times the computation time; 2. simple thresholds under different illumination conditions easily fail to extract the color components, causing parking space line extraction to fail, so parking space line detection is inaccurate. The parking space line detection method for a parking area provided by the application adaptively extracts the different color components and fuses the color features, which solves the problem in the related art of inaccurate parking space line detection when parking space lines are extracted in actual scenes and improves the detection accuracy of parking space lines.
The present embodiment provides a method for detecting a parking space line in a parking area, and fig. 1 is a first flowchart of the method for detecting a parking space line in a parking area according to the embodiment of the present application, and as shown in fig. 1, the method includes the following steps:
step S101, acquiring RGB images of a parking area, and acquiring an R channel image, a G channel image and a B channel image of the parking area according to the RGB images;
in the process of obtaining the RGB image of the parking area, depending on the type of camera, an RGB image may be obtained directly, or a YUV-format image may be obtained and converted to an RGB image through color space conversion; finally, a bird's-eye-view RGB image of the parking area is obtained through perspective transformation. An RGB image produces various colors by varying and superposing the three color channels of red (R), green (G) and blue (B), where R, G and B represent the red, green and blue channels; the R channel image, G channel image and B channel image can be obtained by separating the three RGB channels of the obtained RGB image of the parking area.
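As an illustration of step S101, the following is a minimal Python sketch (not from the patent) of obtaining a bird's-eye RGB image and separating its channels with OpenCV; the function name, the homography source and the output size are assumptions for illustration only.

```python
import cv2

def get_channel_images(frame_bgr, homography, out_size=(512, 512)):
    # Warp the camera frame to a bird's-eye view of the parking area;
    # the homography matrix would come from camera calibration.
    birdseye = cv2.warpPerspective(frame_bgr, homography, out_size)
    # A camera that delivers YUV frames would first be converted, e.g.:
    #   frame_bgr = cv2.cvtColor(frame_yuv, cv2.COLOR_YUV2BGR_NV12)
    b, g, r = cv2.split(birdseye)  # OpenCV stores channels in B, G, R order
    return r, g, b
```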
Step S102, determining blue area characteristics according to the pixel value offset of the B channel image to the R channel image and the G channel image;
in an actual scene, the color of a parking space line is generally white, yellow, blue, or a color close to these. If a parking space line in the RGB image is blue or a color close to blue, the pixel value of the B channel image is greater than the pixel value of the R channel image and the pixel value of the G channel image, respectively; therefore, the blue region feature can be obtained from the pixel value offsets of the B channel image to the R channel image and to the G channel image.
Step S103, determining yellow region characteristics according to the pixel value offsets of the G channel image and the R channel image to the B channel image respectively;
it should be noted that, if a parking space line in the RGB image is yellow or a color close to yellow, the pixel value of the R channel image and the pixel value of the G channel image are both greater than the pixel value of the B channel image; therefore, the offset of the G channel image to the B channel image and the offset of the R channel image to the B channel image can be calculated to obtain the yellow region feature.
And step S104, overlapping the blue region characteristic and the yellow region characteristic to determine a color space characteristic diagram of the parking region.
Step S105, determining a first gray-scale map with color features of the parking area according to the color space feature map and the RGB image, and determining a binary image with color features of the parking area according to the first gray-scale map;
wherein the grayscale image is calculated as Gray = R × 0.299 + G × 0.587 + B × 0.114, where Gray represents the calculated gray value, and R, G and B represent the pixel values of the R channel image, the G channel image and the B channel image. It can be seen from the formula that blue has the smallest weight in the grayscale image, so the pixel value of a blue parking space line region in the grayscale image is lower than the pixel values of other color regions. Because binarization can only pick out regions of pixel points with larger pixel values, the binarized image of the color space feature map is superposed with the grayscale map of the RGB image to obtain the first grayscale map; this raises the pixel values of the blue and yellow pixel points in the first grayscale map, so that blue and yellow can be effectively extracted when the first grayscale map is binarized.
And step S106, determining the parking space line of the parking area in the binarized image with the color characteristics of the parking area.
Through steps S101 to S106, given that the pixel values corresponding to different color regions in a grayscale image differ and that binarization can only pick out regions with larger pixel values, the color space feature map of the RGB image is first obtained from the yellow region feature and the blue region feature; the first grayscale map with the color features of the parking area is then determined according to the color space feature map and the RGB image, and the binarized image with the color features of the parking area is determined according to the first grayscale map, so that blue and yellow can be effectively extracted when the first grayscale map is binarized. This solves the failure of parking space line extraction in actual environments caused by the relatively low grayscale pixel values of blue parking space line regions and the fluctuation of the grayscale pixel values of yellow parking space line regions under different illumination, solves the problem in the related art of inaccurate parking space line detection when parking space lines are extracted in actual scenes, improves the detection accuracy of parking space lines, and allows different color components to be extracted adaptively.
In some embodiments, fig. 2 is a flowchart of a method for determining a blue region feature according to an embodiment of the present application, and as shown in fig. 2, the method for determining the blue region feature according to the pixel value offsets of the B channel image to the R channel image and the G channel image includes the following steps:
Step S201, determining a first forward pixel value offset of the B channel image to the R channel image according to the B channel image and the R channel image;
when a region of the RGB image is blue or a color close to blue, the pixel value of the B channel image is greater than the pixel value of the R channel image and the pixel value of the G channel image, respectively;
it should be noted that the first forward pixel value offset can be obtained by the following calculation formula:
diff_br(i,j) = B(i,j) - R(i,j), if B(i,j) > R(i,j); diff_br(i,j) = 0, otherwise    (Formula 1);
in the above Formula 1, (i,j) represents a pixel point in the image coordinate system, diff_br(i,j) is the first forward pixel value offset, B(i,j) is the pixel value of the B channel image at the pixel point (i,j), and R(i,j) is the pixel value of the R channel image at the pixel point (i,j); the larger the difference, the more vivid the blue, and if the difference is less than or equal to 0, the region is not blue.
Step S202, determining a second forward pixel value offset of the B channel image to the G channel image according to the B channel image and the G channel image;
if a region of the RGB image is blue, the pixel value of the B channel image is greater than that of the G channel image, a larger difference represents a more vivid blue, and if the difference is less than or equal to 0, the region is not blue;
the second forward pixel value offset may be obtained by the following calculation formula:
diff_bg(i,j) = B(i,j) - G(i,j), if B(i,j) > G(i,j); diff_bg(i,j) = 0, otherwise    (Formula 2);
in the above Formula 2, diff_bg(i,j) is the second forward pixel value offset, and G(i,j) is the pixel value of the G channel image at the pixel point (i,j).
Formulas 1 and 2 can also be understood as follows: to increase the pixel value of the blue component on the grayscale map, the pixel values of the blue region must be increased in a targeted manner; the pixel values of all blue regions are increased, but by different amounts, a brighter blue part receiving a larger increase. The vividness of blue is reflected in the differences between the pixel value of the B channel and those of the G channel and the R channel. Without the forward pixel value offsets, the pixel values of the blue region displayed on the grayscale map would be close to those of the other color regions; after the forward pixel value offsets of the B channel to the G channel and of the B channel to the R channel are added, the pixel value of the blue region becomes larger than the pixel values of the other color regions.
Step S203, superposing the first forward pixel value offset and the second forward pixel value offset to determine the blue region feature;
wherein the first forward pixel value offset and the second forward pixel value offset can be superposed by the following calculation formula:
diff_b_gr(i,j) = diff_bg(i,j) + diff_br(i,j)    (Formula 3);
in the above Formula 3, diff_b_gr(i,j) is the blue region feature, diff_bg(i,j) is the second forward pixel value offset, and diff_br(i,j) is the first forward pixel value offset; Formula 3 can also be understood as superimposing the pixel value differences calculated according to Formula 1 and Formula 2.
Through steps S201 to S203, the first forward pixel value offset of the B channel image to the R channel image is superposed with the second forward pixel value offset of the B channel image to the G channel image, so that the pixel values of the blue region are increased in a targeted manner; during the increase, brighter blue parts receive larger increases, yielding the blue region feature.
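A minimal NumPy sketch of steps S201 to S203 as reconstructed above, assuming the channel images are 8-bit arrays of equal shape; the function name is illustrative:

```python
import numpy as np

def blue_region_feature(r, g, b):
    # Use a signed type so negative channel differences are representable.
    b16 = b.astype(np.int16)
    # Formula 1: first forward pixel value offset of B over R (0 where B <= R).
    diff_br = np.clip(b16 - r.astype(np.int16), 0, None)
    # Formula 2: second forward pixel value offset of B over G (0 where B <= G).
    diff_bg = np.clip(b16 - g.astype(np.int16), 0, None)
    # Formula 3: superpose the two offsets into the blue region feature.
    return diff_br + diff_bg
```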
In some embodiments, fig. 3 is a flowchart of a method for determining a yellow region feature according to an embodiment of the present application, and as shown in fig. 3, the method for determining the yellow region feature according to the pixel value offsets of the G channel image and the R channel image to the B channel image respectively includes the following steps:
Step S301, determining a third forward pixel value offset of the G channel image to the B channel image according to the G channel image and the B channel image;
Step S302, determining a fourth forward pixel value offset of the R channel image to the B channel image according to the R channel image and the B channel image;
it should be noted that, when a region of the RGB image is yellow or a color close to yellow, the pixel value of the G channel image and the pixel value of the R channel image are both greater than the pixel value of the B channel image; the third forward pixel value offset diff_gb(i,j) and the fourth forward pixel value offset diff_rb(i,j) can be calculated following the calculation methods of Formula 1 and Formula 2, where the larger the difference, the more vivid the yellow, and a difference less than or equal to 0 means the region is not yellow.
Step S303, superposing the third forward pixel value offset and the fourth forward pixel value offset to determine the yellow region feature;
the third forward pixel value offset and the fourth forward pixel value offset may be superposed by the following calculation formula:
diff_gr_b(i,j) = diff_gb(i,j) + diff_rb(i,j)    (Formula 4);
in the above Formula 4, diff_gr_b(i,j) is the yellow region feature, diff_gb(i,j) is the third forward pixel value offset, and diff_rb(i,j) is the fourth forward pixel value offset; Formula 4 can also be understood as superimposing the relative offsets of the different channels.
Through steps S301 to S303, the third forward pixel value offset of the G channel image to the B channel image is superposed with the fourth forward pixel value offset of the R channel image to the B channel image, so that the pixel values of the yellow region are increased in a targeted manner; during the increase, brighter yellow parts receive larger increases, yielding the yellow region feature.
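The yellow region feature of steps S301 to S303 admits the same kind of sketch, again assuming 8-bit channel arrays and an illustrative function name:

```python
import numpy as np

def yellow_region_feature(r, g, b):
    b16 = b.astype(np.int16)
    # Third and fourth forward pixel value offsets: G over B and R over B,
    # following the same clipped-difference pattern as Formulas 1 and 2.
    diff_gb = np.clip(g.astype(np.int16) - b16, 0, None)
    diff_rb = np.clip(r.astype(np.int16) - b16, 0, None)
    # Formula 4: superpose the two offsets into the yellow region feature.
    return diff_gb + diff_rb
```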
In some embodiments, fig. 4 is a flowchart of a method for determining a binarized image with color features of a parking area according to an embodiment of the present application, as shown in fig. 4, the method including the steps of:
step S401, carrying out gray level conversion on the RGB image to determine a gray level image of the RGB image;
the calculation formula of the grayscale map is as follows:
gray(i,j) = B(i,j) × 0.114 + G(i,j) × 0.587 + R(i,j) × 0.299    (Formula 5);
in the above Formula 5, gray(i,j) is the grayscale map, and B(i,j), G(i,j) and R(i,j) are the pixel values of the B channel image, the G channel image and the R channel image at the pixel point (i,j).
Step S402, carrying out binarization processing on the color space characteristic diagram, and determining a binarization image of the color space characteristic diagram;
the color space feature map is determined by superposing the blue region feature and the yellow region feature, so its calculation formula is as follows:
diff(i,j) = diff_gr_b(i,j) + diff_b_gr(i,j)    (Formula 6);
in the above Formula 6, diff_gr_b(i,j) is the yellow region feature and diff_b_gr(i,j) is the blue region feature;
it should be noted that image binarization sets the gray value of each pixel point on an image to 0 or 255, giving the whole image an obvious black-and-white appearance. In the related art, a common binarization method sets a single threshold T for the whole image; the threshold T divides the image data into two parts, the pixel group not less than T and the pixel group less than T, with the gray values of the former set to 255 and those of the latter set to 0. Using one threshold T for the whole image can segment the image data poorly. In the embodiments of the application, the color space feature map is therefore binarized with adaptive luminance differences: when different parts of the color space feature map have different luminances, adaptive thresholds are used, that is, different thresholds are applied to different regions of the same image, so that a better binarization result is obtained under uneven luminance. The adaptive threshold in the embodiments of the application may be a threshold calculated for each small region of the image, or a plurality of different thresholds may be preset according to the different luminances of the color space feature map.
Step S403, superposing the binarized image of the color space feature map and the gray scale map of the RGB image to determine a first gray scale map;
the calculation formula of the first grayscale map is as follows:
Agray(i,j) = gray(i,j) + diff(i,j) × mask(i,j)    (Formula 7);
in the above Formula 7, Agray(i,j) is the first grayscale map, gray(i,j) is the grayscale map of the RGB image, diff(i,j) is the color space feature map, and mask(i,j) is the binarized image of the color space feature map, which can be expressed by the following formula:
mask(i,j) = 1, if the binarization result of the color space feature map at the pixel point (i,j) is 255; mask(i,j) = 0, otherwise    (Formula 8).
And step S404, performing binarization processing on the first gray map, and determining a binarized image with color features of the parking area.
Through steps S401 to S404, the binarized image of the color space feature map and the grayscale map of the RGB image are superposed to determine the first grayscale map with color features; in obtaining the binarized image of the color space feature map, binarization is performed on the basis of adaptive luminance differences, so that the threshold adapts itself and the interference caused by noise is reduced.
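Putting Formulas 5 to 8 together, a sketch of steps S401 to S404 might look as follows; it assumes the blue_region_feature and yellow_region_feature helpers from the earlier sketches and takes the binarization scheme of the next section as a callable parameter:

```python
import numpy as np

def first_gray_map(r, g, b, binarize):
    # Formula 5: grayscale map of the RGB image.
    gray = b * 0.114 + g * 0.587 + r * 0.299
    # Formula 6: color space feature map as the sum of the two region features.
    diff = blue_region_feature(r, g, b) + yellow_region_feature(r, g, b)
    # Formula 8: 0/1 mask from the binarized color space feature map;
    # `binarize` stands in for the adaptive scheme described below.
    mask = (binarize(diff) == 255).astype(np.float64)
    # Formula 7: lift the blue and yellow pixels in the grayscale map.
    agray = gray + diff * mask
    return np.clip(agray, 0, 255).astype(np.uint8)
```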
In some embodiments, fig. 5 is a flowchart of a method of binarization processing according to an embodiment of the present application, and as shown in fig. 5, the method further includes the following steps:
step S501, using the color space feature map and/or the first gray scale map as an image to be processed; the image to be processed includes a plurality of pixel rows and a plurality of pixel columns.
Based on the horizontal direction, the image to be processed is formed by stitching pixel rows, each pixel row consisting of sequentially arranged pixel points on the same row; based on the vertical direction, the image to be processed is formed by stitching pixel columns, each pixel column consisting of sequentially arranged pixel points on the same column. The pixel rows in the horizontal direction and the pixel columns in the vertical direction are the two mutually perpendicular directions of the image coordinate system. Each image to be processed is binarized by pixel rows and by pixel columns respectively to obtain the binarized image of the image to be processed.
Step S502, obtaining first binarization results of pixel points in all pixel rows and obtaining second binarization results of pixel points in all pixel columns;
step S503, overlapping the first binarization result and the second binarization result, and determining a binarization image of the image to be processed;
and (3) carrying out binarization processing on each image to be processed according to pixel rows, specifically carrying out binarization processing on pixel points in each pixel row to obtain a first binarization result of the pixel points. And (3) carrying out binarization processing on each image to be processed according to the pixel columns, specifically, carrying out binarization processing on pixel points in each pixel column to obtain a second binarization result of the pixel points. And after each pixel point of the image to be processed is subjected to binarization processing according to pixel rows and pixel columns, superposing the first binarization result and the second binarization result of each pixel point to finally form a binarization image of the image to be processed.
Through steps S501 to S503, the adaptive luminance-difference binarization applied to the color space feature map and to the first grayscale map produces a good binarization effect even for images with uneven illumination.
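A compact sketch of steps S501 to S503, assuming a binarize_rows helper (sketched after step S603 below) that performs the per-row binarization; running the row pass on the transposed image to process columns is an implementation shortcut, not something the patent prescribes:

```python
import numpy as np

def binarize_adaptive(img):
    # First binarization result: adaptive pass over every pixel row.
    first = binarize_rows(img)
    # Second binarization result: the same pass over every pixel column,
    # obtained here by running the row pass on the transposed image.
    second = binarize_rows(img.T).T
    # Superpose the two results: foreground (255) if either pass marks the pixel.
    return np.maximum(first, second)
```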
In some embodiments, fig. 6 is a first flowchart of a method for obtaining a first binarization result of pixel points in all pixel rows according to an embodiment of the present application, and as shown in fig. 6, the method further includes the following steps:
step S601: and acquiring the horizontal pixel value offset of each pixel point in the pixel row aiming at each pixel row in the image to be processed.
The horizontal pixel value offset of each pixel point comprises a first horizontal offset difference and a second horizontal offset difference: the first horizontal offset difference is the deviation between the pixel value of the pixel point and that of the first horizontal neighboring pixel point, and the second horizontal offset difference is the deviation between the pixel value of the pixel point and that of the second horizontal neighboring pixel point. The set of the first horizontal offset difference and the second horizontal offset difference is the horizontal pixel value offset of the pixel point.
In an embodiment, the first horizontal neighboring pixel point is a pixel point separated from the pixel point by a first preset distance in a first horizontal direction, and the second horizontal neighboring pixel point is a pixel point separated from the pixel point by the first preset distance in a second horizontal direction, where the first preset distance may be set according to experience, and the first horizontal direction and the second horizontal direction are opposite directions in the pixel row.
The first horizontal offset difference diff(i,j)-l and the second horizontal offset difference diff(i,j)-r in the horizontal pixel value offset of the pixel point can be calculated by the following formulas:
diff(i,j)-l = a(i,j) - (S(i,j) - S(i-k,j)) / k    (Formula 9);
diff(i,j)-r = a(i,j) - (S(i+k,j) - S(i,j)) / k    (Formula 10);
wherein a(i,j) is the pixel value of the ith pixel point in the jth row, and the value k is the first preset distance, which can be adjusted according to the actual situation. S(i,j) is the horizontal pixel accumulated value of the pixel point (i,j) on its pixel row, namely the accumulation of the pixel values from the first pixel point to the ith pixel point in the jth row; S(i-k,j) is the horizontal pixel accumulated value of the first horizontal neighboring pixel point (i-k,j), namely the accumulation of the pixel values from the first pixel point to the (i-k)th pixel point in the jth row; S(i+k,j) is the horizontal pixel accumulated value of the second horizontal neighboring pixel point (i+k,j), namely the accumulation of the pixel values from the first pixel point to the (i+k)th pixel point in the jth row. The (i-k)th pixel point is the pixel point that is k pixels away from the ith pixel point in the first horizontal direction (to the left) in the jth row, and the (i+k)th pixel point is the pixel point that is k pixels away from the ith pixel point in the second horizontal direction (to the right) in the jth row. The horizontal pixel value offset of the pixel point is the set {diff(i,j)-l, diff(i,j)-r} of the first horizontal offset difference and the second horizontal offset difference.
In a specific embodiment, after the pixel values of each pixel point in the horizontal direction in the image to be processed are obtained, the horizontal pixel accumulated value for the jth row is
S(i,j) = a(1,j) + a(2,j) + ... + a(i,j),
where a(n,j) is the pixel value of the nth pixel of the jth row.
Step S602: acquiring a horizontal pixel deviation threshold value of a pixel row;
in some embodiments, fig. 7 is a second flowchart of a method for obtaining the first binarization results of pixel points in all pixel rows according to an embodiment of the present application; as shown in fig. 7, obtaining the horizontal pixel deviation threshold of a pixel row includes the following steps:
step S701: acquiring horizontal pixel value offset of all pixel points in a pixel row; for example, the horizontal pixel value offset of all the pixels in the pixel row is counted to obtain the set { diff(i,j)-1,diff(i,j)-r}j=1,2,3。。。n
Step S702: determining a horizontal pixel offset mean value of the pixel row according to the horizontal pixel value offsets of all pixel points in the pixel row; for example, calculate a set of horizontal pixel value offsets { diff(i,j)-1,diff(i,j)-r}j=1,2,3。。。nObtaining the average value diff of the horizontal pixel offset of the pixel row by the average value of all the elements in the pixel rowmean1
Step S703: if the horizontal pixel offset mean diffmean1If the difference is larger than the first preset threshold value T1, the average value diff of the horizontal pixel offset is determinedmean1Horizontal pixel deviation threshold for pixel rowThe value diff-T1; if the horizontal pixel offset mean diffmean1Less than the first predetermined threshold T1, the first predetermined threshold T1 is determined as the horizontal pixel deviation threshold diff-T1 of the pixel row, and the first predetermined threshold T1 is an empirical value. In one embodiment, the average horizontal pixel offset diff of the pixel row may also be determinedmean1Directly as the horizontal pixel deviation threshold diff-T1 for the pixel row.
Step S603: determining a first binarization result of the pixel points according to the horizontal pixel value offset of the pixel points and the horizontal pixel deviation threshold of the pixel rows;
determining the binarization result of a pixel point on a pixel row according to the horizontal pixel value offset of the pixel point and the horizontal pixel deviation threshold diff-T1 of the pixel row; for example, if the first horizontal offset difference or the second horizontal offset difference corresponding to the pixel point is greater than the horizontal pixel deviation threshold of the pixel row, that is, diff(i,j)-l > diff-T1 or diff(i,j)-r > diff-T1, the pixel value of the pixel point is set to 255, and the pixel values of the other pixel points are set to 0, so that the binarization result in the horizontal direction on the image to be processed can be expressed by the following formula:
binary1(i,j) = 255, if diff(i,j)-l > diff-T1 or diff(i,j)-r > diff-T1; binary1(i,j) = 0, otherwise    (Formula 11);
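A sketch of steps S601 to S603, implementing Formulas 9 to 11 as reconstructed above with cumulative row sums; the values of k and T1 are illustrative placeholders, and pixels within k of the row ends are simply left as background:

```python
import numpy as np

def binarize_rows(img, k=5, t1=10.0):
    # k: first preset distance; t1: first preset threshold T1. Both are
    # empirical in the patent; the values here are placeholders.
    a = img.astype(np.float64)
    height, width = a.shape
    s = np.cumsum(a, axis=1)  # S(i,j): horizontal pixel accumulated values
    out = np.zeros((height, width), dtype=np.uint8)
    for j in range(height):
        row, cs = a[j], s[j]
        diff_l = np.zeros(width)
        diff_r = np.zeros(width)
        win = cs[k:] - cs[:-k]  # sum of k consecutive pixel values
        diff_l[k:] = row[k:] - win / k    # Formula 9 (left neighborhood)
        diff_r[:-k] = row[:-k] - win / k  # Formula 10 (right neighborhood)
        mean = np.concatenate([diff_l, diff_r]).mean()
        thr = mean if mean > t1 else t1   # horizontal pixel deviation threshold
        out[j][(diff_l > thr) | (diff_r > thr)] = 255  # Formula 11
    return out
```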
in some embodiments, fig. 8 is a first flowchart of a method for obtaining second binarization results of pixel points in all pixel rows according to an embodiment of the present application, and as shown in fig. 8, the method includes the following steps:
step S801: acquiring a vertical pixel value offset of each pixel point in a pixel row aiming at each pixel row in an image to be processed;
the vertical pixel offset of each pixel point comprises a first vertical offset difference value and a second vertical offset difference value, wherein the first vertical offset difference value is the deviation of the pixel value between the pixel point and a first vertical similar pixel point, and the second vertical offset difference value is the deviation of the pixel value between the pixel point and a second vertical similar pixel point. And the set of the first vertical offset difference value and the second vertical offset difference value is the vertical pixel offset of the pixel point.
In an embodiment, the first vertical neighboring pixel point is a pixel point separated from the pixel point by a second preset distance in a first vertical direction, and the second vertical neighboring pixel point is a pixel point separated from the pixel point by the second preset distance in a second vertical direction, where the second preset distance may be set according to experience, and the first vertical direction and the second vertical direction are opposite directions on the pixel column. The first vertical offset difference diff(i,j)-u and the second vertical offset difference diff(i,j)-d in the vertical pixel value offset of the pixel point can be calculated by the following formulas:
diff(i,j)-u = a(i,j) - (C(i,j) - C(i-h,j)) / h    (Formula 12);
diff(i,j)-d = a(i,j) - (C(i+h,j) - C(i,j)) / h    (Formula 13);
wherein a(i,j) is the pixel value of the ith pixel point in the jth column, and the value h is the second preset distance, which may be equal to the first preset distance k and can be adjusted according to the actual situation. C(i,j) is the vertical pixel accumulated value of the pixel point (i,j) on its pixel column, namely the accumulation of the pixel values from the first pixel point to the ith pixel point in the jth column; C(i-h,j) is the vertical pixel accumulated value of the first vertical neighboring pixel point (i-h,j), namely the accumulation of the pixel values from the first pixel point to the (i-h)th pixel point in the jth column; C(i+h,j) is the vertical pixel accumulated value of the second vertical neighboring pixel point (i+h,j), namely the accumulation of the pixel values from the first pixel point to the (i+h)th pixel point in the jth column. The (i-h)th pixel point is the pixel point that is h pixels away from the ith pixel point in the first vertical direction (upward) in the jth column, and the (i+h)th pixel point is the pixel point that is h pixels away from the ith pixel point in the second vertical direction (downward) in the jth column. The vertical pixel value offset of the pixel point is the set {diff(i,j)-u, diff(i,j)-d} of the first vertical offset difference and the second vertical offset difference.
In a specific embodiment, after the pixel values of the pixel points in the vertical direction in the image to be processed are obtained, the vertical pixel accumulated value for the jth column is
C(i,j) = a(1,j) + a(2,j) + ... + a(i,j),
where a(n,j) is the pixel value of the nth pixel in the jth column.
Step S802: acquiring a vertical pixel deviation threshold of a pixel column;
in some embodiments, fig. 9 is a second flowchart of a method for obtaining second binarization results of pixel points in all pixel columns according to an embodiment of the present application; as shown in fig. 9, obtaining the vertical pixel deviation threshold of a pixel column includes the following steps:
step S901: acquiring vertical pixel value offset of all pixel points in a pixel column; for example, the vertical pixel value offset of all the pixels in the pixel column is counted to obtain the set { diff(i,j)-u,diff(i,j)-d}i=1,2,3。。。n
Step S902: determining a vertical pixel offset mean value of the pixel column according to the vertical pixel value offsets of all pixel points in the pixel column; for example, calculate the vertical pixel value offset set { diff(i,j)-u,diff(i,j)-d}i=1,2,3。。。nObtaining the average value diff of vertical pixel offset of the pixel column by the average value of all the elements in the pixel columnmean2
Step S903, if the vertical pixel offset mean value diffmean2If the average value is greater than a second preset threshold value T2, the average value diff of the vertical pixel offset is determinedmean2Vertical pixel deviation threshold diff for pixel column-T2; if the vertical pixel offset mean value diffmean2If the second threshold value is less than the second preset threshold value T2, the second preset threshold value T2 is determined as the vertical pixel deviation threshold value diff of the pixel column-T2, and the second preset threshold T2 is an empirical value. In one embodiment, the pixel may also be a pixelVertical pixel offset mean diff of a columnmean2Vertical pixel deviation threshold diff directly as pixel column-T2。
Step S803: determining the second binarization result of the pixel point according to the vertical pixel value offset of the pixel point and the vertical pixel deviation threshold of the pixel column.
According to the vertical pixel value offset of the pixel point on the pixel column and the vertical pixel deviation threshold diff-T2 of the pixel column, the binarization result of the pixel point on the pixel column is determined; for example, if the first vertical offset difference or the second vertical offset difference corresponding to the pixel point is greater than the vertical pixel deviation threshold of the pixel column, that is, diff(i,j)-u > diff-T2 or diff(i,j)-d > diff-T2, the pixel value of the pixel point is set to 255, and the pixel values of the other pixel points are all set to 0, so that the binarization result in the vertical direction on the image to be processed can be expressed by the following formula:
binary2(i,j) = 255, if diff(i,j)-u > diff-T2 or diff(i,j)-d > diff-T2; binary2(i,j) = 0, otherwise    (Formula 14).
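Since Formulas 12 to 14 mirror Formulas 9 to 11 with rows and columns exchanged, the column pass can reuse the row routine from the previous sketch on the transposed image; `image_to_process` is a hypothetical input array, and h and T2 are assumed equal to k and T1 here:

```python
# Column pass reusing the row routine on the transposed image; the second
# preset distance h and threshold T2 are assumed equal to k and T1.
second_result = binarize_rows(image_to_process.T, k=5, t1=10.0).T
```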
In some of these embodiments, the method of determining a parking space line within the parking area may be as follows:
extracting straight line segments in the binarized image based on a straight line extraction algorithm, and determining parallel line groups from the straight line segments. The straight line extraction algorithm may be the Hough transform or an LSD (Line Segment Detector), applied to the binarized image with color features for regional straight line detection; straight line segments within a set angle range in a designated region are obtained, and the segments are compared pairwise to determine the parallel line groups, that is, if the angle between two straight lines is within a certain tolerance, the pair of lines is considered a group of parallel lines;
acquiring the initial point and the end point of the straight line segments in each parallel line group. For each group of parallel lines, a region covering each straight line and 2 pixel points on either side of it is reserved on the binarized image with color features, and the number of pixel points in each row of this region is counted; if the number is greater than 0, the row is considered to contain valid pixel points, and if N consecutive rows contain valid pixel points, the first and last of these N rows are taken as the start and end positions of the line, the two corresponding points being the initial point and the end point of the straight line;
determining a rectangular frame area according to the initial points and end points, and determining a parking space area within the rectangular frame area based on a recognition algorithm. To judge whether each group of parallel lines is a suspected parking space area, checks such as the distance between the parallel lines, the length of the parallel line segments, and whether the line connecting the bottom end points of the group is perpendicular to the parallel lines can be applied. For example, the corresponding area in the binarized image with color features is cut out using the initial points and end points of the parallel line group; from the coordinates (x, y) of the four end points of the parallel lines (two initial points and two end points), x_min, y_min, x_max and y_max are obtained by comparison, the four points (x_min, y_min), (x_max, y_min), (x_min, y_max) and (x_max, y_max) are determined, and the rectangular frame area is obtained from these four points. Whether the rectangular frame area is a parking space area is then judged by a recognition algorithm; an SVM (Support Vector Machine) classifier can be used, chosen for its speed, or replaced by another machine learning classification algorithm. If the judgment is positive, the parking space is output; if negative, the next area is judged.
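The following is a minimal sketch of this search in Python with OpenCV and NumPy, assuming a build in which cv2.createLineSegmentDetector is available (cv2.HoughLinesP works similarly). The function names, the 5-degree angle tolerance, and the classify_region callable standing in for the trained SVM classifier are illustrative assumptions, not specified in the patent text.

```python
import cv2
import numpy as np

def find_parking_spaces(binary_img: np.ndarray, angle_tol: float = 5.0,
                        classify_region=None):
    """Group detected segments into parallel pairs and test each rectangle."""
    lsd = cv2.createLineSegmentDetector()
    lines = lsd.detect(binary_img)[0]
    if lines is None:
        return []
    segs = lines.reshape(-1, 4)                       # rows of (x1, y1, x2, y2)
    # Direction-less segment angles in [0, 180).
    angles = np.degrees(np.arctan2(segs[:, 3] - segs[:, 1],
                                   segs[:, 2] - segs[:, 0])) % 180.0
    spaces = []
    for a in range(len(segs)):
        for b in range(a + 1, len(segs)):
            # Two segments form a parallel line group when their angles agree.
            diff = abs(angles[a] - angles[b])
            if min(diff, 180.0 - diff) > angle_tol:
                continue
            # Rectangular frame area from the four end points:
            # (x_min, y_min), (x_max, y_min), (x_min, y_max), (x_max, y_max).
            pts = np.vstack([segs[a].reshape(2, 2), segs[b].reshape(2, 2)])
            x_min, y_min = pts.min(axis=0).astype(int)
            x_max, y_max = pts.max(axis=0).astype(int)
            roi = binary_img[y_min:y_max + 1, x_min:x_max + 1]
            # A trained classifier (e.g. an SVM) decides whether the region
            # is a parking space; None accepts every candidate rectangle.
            if classify_region is None or classify_region(roi):
                spaces.append((x_min, y_min, x_max, y_max))
    return spaces
```

Injecting the classifier as a callable keeps the geometric search independent of how the SVM (or a replacement classifier) is trained.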
It should be noted that the steps illustrated in the above flowcharts or in the flowcharts of the figures may be performed in a computer system, such as one executing a set of computer-executable instructions, and that, although a logical order is shown in the flowcharts, in some cases the steps may be performed in an order different from the order shown or described here.
In one embodiment, a computer device is provided, which may be a terminal. The computer device includes a processor, a memory, a network interface, a display screen, and an input device connected by a system bus. The processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device comprises a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of the operating system and the computer program in the non-volatile storage medium. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program, when executed by a processor, implements the parking space line detection method for a parking area. The display screen of the computer device can be a liquid crystal display or an electronic ink display, and the input device of the computer device can be a touch layer covering the display screen, a key, trackball or touchpad arranged on the housing of the computer device, or an external keyboard, touchpad or mouse.
In one embodiment, a computer device is provided, which may be a server; fig. 10 is a schematic diagram of its internal structure according to an embodiment of the present application. The computer device includes a processor, a memory, a network interface, and a database connected by a system bus. The processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device comprises a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, a computer program, and a database. The internal memory provides an environment for the operation of the operating system and the computer program in the non-volatile storage medium. The database of the computer device is used for storing data. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program, when executed by a processor, implements the parking space line detection method for a parking area.
Those skilled in the art will appreciate that the architecture shown in fig. 10 is merely a block diagram of some of the structures associated with the disclosed aspects and does not limit the computer devices to which the disclosed aspects apply; a particular computer device may include more or fewer components than shown, combine certain components, or have a different arrangement of components.
In one embodiment, a computer device is provided, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor; when the processor executes the computer program, the steps of the parking space line detection method for a parking area provided in the above embodiments are implemented.
In one embodiment, a computer-readable storage medium is provided, on which a computer program is stored; when the computer program is executed by a processor, the steps of the parking space line detection method for a parking area provided in the above embodiments are implemented.
It will be understood by those skilled in the art that all or part of the processes of the methods of the above embodiments can be implemented by a computer program instructing related hardware; the program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, storage, database, or other medium used in the embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
The technical features of the above embodiments can be combined arbitrarily; for the sake of brevity, not all possible combinations of these technical features are described, but as long as a combination contains no contradiction, it should be considered within the scope of this specification.
The above examples express only several embodiments of the present application, and their description is relatively specific and detailed, but they should not be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and modifications without departing from the concept of the present application, all of which fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (10)

1. A parking space line detection method for a parking area is characterized by comprising the following steps:
acquiring an RGB image of a parking area, and acquiring an R channel image, a G channel image and a B channel image of the parking area according to the RGB image;
determining blue region characteristics according to the pixel value offset of the B channel image to the R channel image and the G channel image;
determining yellow region characteristics according to the pixel value offsets of the G channel image and the R channel image to the B channel image respectively;
superposing the blue area characteristic and the yellow area characteristic to determine a color space characteristic diagram of the parking area;
determining a first gray-scale map with color features of the parking area according to the color space feature map and the RGB image, and determining a binary image with color features of the parking area according to the first gray-scale map;
and determining a parking space line of the parking area in the binary image with the color characteristics of the parking area.
2. The method of claim 1, wherein determining blue region features from pixel value offsets of the B channel image to the R channel image and to the G channel image comprises:
determining a first forward pixel value offset of the B channel image to the R channel image according to the B channel image and the R channel image;
determining a second forward pixel value offset of the B channel image to the G channel image according to the B channel image and the G channel image;
and superposing the first forward pixel value offset and the second forward pixel value offset to determine the blue region characteristic.
3. The method of claim 1, wherein determining yellow region features according to pixel value offsets of the G channel image and the R channel image to the B channel image respectively comprises:
determining a third forward pixel value offset of the G channel image to the B channel image according to the G channel image and the B channel image;
determining a fourth forward pixel value offset of the R channel image to the B channel image according to the R channel image and the B channel image;
and superposing the third forward pixel value offset and the fourth forward pixel value offset to determine the yellow region characteristic.
4. The method according to claim 1, wherein the determining a first gray scale map with color features of the parking area from the color space feature map and the RGB image and determining a binarized image with color features of the parking area from the first gray scale map comprises:
carrying out gray level conversion on the RGB image to determine a gray level image of the RGB image;
carrying out binarization processing on the color space characteristic diagram, and determining a binarization image of the color space characteristic diagram;
superposing the binarized image of the color space feature map and the gray scale map of the RGB image to determine the first gray scale map;
and carrying out binarization processing on the first gray-scale image, and determining a binarization image with color characteristics of the parking area.
5. The method according to claim 4, wherein binarizing the color space feature map and/or binarizing the first grayscale map comprises:
taking the color space feature map and/or the first gray scale map as an image to be processed; the image to be processed comprises a plurality of pixel rows and a plurality of pixel columns;
acquiring first binarization results of pixel points in all the pixel rows and second binarization results of pixel points in all the pixel columns;
and overlapping the first binarization result and the second binarization result to determine a binarization image of the image to be processed.
6. The method of claim 5, wherein obtaining the first binarization results for the pixels in all of the pixel rows comprises:
for each pixel row in the image to be processed, acquiring a horizontal pixel value offset of each pixel point in the pixel row and acquiring a horizontal pixel deviation threshold of the pixel row;
determining a first binarization result of the pixel point according to the horizontal pixel value offset of the pixel point and the horizontal pixel deviation threshold of the pixel row;
acquiring second binarization results of the pixel points in all the pixel columns comprises the following steps:
for each pixel column in the image to be processed, acquiring a vertical pixel value offset of each pixel point in the pixel column and acquiring a vertical pixel deviation threshold of the pixel column;
and determining a second binarization result of the pixel point according to the vertical pixel value offset of the pixel point and the vertical pixel deviation threshold of the pixel column.
7. The method of claim 6, wherein obtaining the horizontal pixel deviation threshold for the row of pixels comprises:
acquiring the horizontal pixel value offset of all pixel points in the pixel row;
determining a horizontal pixel offset mean value of the pixel row according to the horizontal pixel value offsets of all pixel points in the pixel row;
if the horizontal pixel offset mean value is larger than a first preset threshold value, determining that the horizontal pixel offset mean value is the horizontal pixel deviation threshold of the pixel row, and if the horizontal pixel offset mean value is smaller than the first preset threshold value, determining that the first preset threshold value is the horizontal pixel deviation threshold of the pixel row;
acquiring a vertical pixel deviation threshold for the pixel column comprises:
acquiring the vertical pixel value offset of all pixel points in the pixel column;
determining a vertical pixel offset mean value of the pixel column according to the vertical pixel value offsets of all pixel points in the pixel column;
if the vertical pixel offset mean value is larger than a second preset threshold value, determining that the vertical pixel offset mean value is the vertical pixel deviation threshold of the pixel column, and if the vertical pixel offset mean value is smaller than the second preset threshold value, determining that the second preset threshold value is the vertical pixel deviation threshold of the pixel column.
8. The method of claim 6, wherein obtaining a horizontal pixel value offset for each of the pixels in the pixel row comprises:
calculating a first horizontal offset difference between the pixel point and a first horizontal neighboring pixel point, and calculating a second horizontal offset difference between the pixel point and a second horizontal neighboring pixel point,
the first horizontal neighboring pixel points are pixel points which are separated from the pixel point by a first preset distance in a first horizontal direction, the second horizontal neighboring pixel points are pixel points which are separated from the pixel point by the first preset distance in a second horizontal direction, and the first horizontal direction is opposite to the second horizontal direction;
obtaining the vertical pixel value offset of each pixel point in the pixel column comprises:
calculating a first vertical offset difference between the pixel point and a first vertical neighboring pixel point, and calculating a second vertical offset difference between the pixel point and a second vertical neighboring pixel point,
the first vertical neighboring pixel points are pixel points which are separated from the pixel point by a second preset distance in a first vertical direction, the second vertical neighboring pixel points are pixel points which are separated from the pixel point by the second preset distance in a second vertical direction, and the first vertical direction is opposite to the second vertical direction.
9. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the parking space line detection method for a parking area according to any one of claims 1 to 8 when executing the computer program.
10. A computer-readable storage medium, on which a computer program is stored, characterized in that the computer program, when executed by a processor, implements the parking space line detection method for a parking area according to any one of claims 1 to 8.
CN202011203274.2A 2020-11-02 2020-11-02 Parking space line detection method for parking area and computer equipment Active CN112417993B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202011203274.2A CN112417993B (en) 2020-11-02 2020-11-02 Parking space line detection method for parking area and computer equipment
PCT/CN2021/114814 WO2022088900A1 (en) 2020-11-02 2021-08-26 Parking space line detection method for parking area, and computer device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011203274.2A CN112417993B (en) 2020-11-02 2020-11-02 Parking space line detection method for parking area and computer equipment

Publications (2)

Publication Number Publication Date
CN112417993A true CN112417993A (en) 2021-02-26
CN112417993B CN112417993B (en) 2021-06-08

Family

ID=74828512

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011203274.2A Active CN112417993B (en) 2020-11-02 2020-11-02 Parking space line detection method for parking area and computer equipment

Country Status (2)

Country Link
CN (1) CN112417993B (en)
WO (1) WO2022088900A1 (en)


Families Citing this family (4)

Publication number Priority date Publication date Assignee Title
CN115116033A (en) * 2022-06-16 2022-09-27 阿里云计算有限公司 Processing method and computing device for parking space range
CN116503481B (en) * 2023-06-25 2023-08-29 天津中德应用技术大学 Automatic parking position and orientation detecting system based on image visual guidance
CN117853346B (en) * 2024-03-08 2024-05-14 杭州湘亭科技有限公司 Radiation source three-dimensional radiation image intelligent enhancement method based on decontamination robot
CN117893457B (en) * 2024-03-18 2024-05-14 深圳市塔联科技有限公司 PCB intelligent detection method, device and computer equipment


Family Cites Families (3)

Publication number Priority date Publication date Assignee Title
CN101770646B (en) * 2010-02-25 2012-07-04 昆山锐芯微电子有限公司 Edge detection method based on Bayer RGB images
CN110570347B (en) * 2019-09-05 2023-01-17 延锋伟世通电子科技(上海)有限公司 Color image graying method for lane line detection
CN112417993B (en) * 2020-11-02 2021-06-08 湖北亿咖通科技有限公司 Parking space line detection method for parking area and computer equipment

Patent Citations (3)

Publication number Priority date Publication date Assignee Title
US20150130640A1 (en) * 2013-11-14 2015-05-14 Hyundai Motor Company Parking area detecting apparatus and method thereof
CN108242178A (en) * 2018-02-26 2018-07-03 北京车和家信息技术有限公司 A kind of method for detecting parking stalls, device and electronic equipment
CN111611930A (en) * 2020-05-22 2020-09-01 华域汽车系统股份有限公司 Parking space line detection method based on illumination consistency

Non-Patent Citations (2)

Title
PENG LIU et al.: "A Parking-Lines Recognition Algorithm Based on Freeman", 2015 7th International Conference on Intelligent Human-Machine Systems and Cybernetics
LU Guiming et al.: "Automatic detection and recognition of parking space lines in a parking assist system", Electronic Science and Technology

Cited By (4)

Publication number Priority date Publication date Assignee Title
WO2022088900A1 (en) * 2020-11-02 2022-05-05 亿咖通(湖北)技术有限公司 Parking space line detection method for parking area, and computer device
CN114550129A (en) * 2022-01-26 2022-05-27 江苏联合职业技术学院苏州工业园区分院 Machine learning model processing method and system based on data set
CN115063414A (en) * 2022-08-05 2022-09-16 深圳新视智科技术有限公司 Method, device and equipment for detecting lithium battery pole piece gummed paper and storage medium
CN115063414B (en) * 2022-08-05 2022-12-20 深圳新视智科技术有限公司 Method, device and equipment for detecting lithium battery pole piece gummed paper and storage medium

Also Published As

Publication number Publication date
WO2022088900A1 (en) 2022-05-05
CN112417993B (en) 2021-06-08

Similar Documents

Publication Publication Date Title
CN112417993B (en) Parking space line detection method for parking area and computer equipment
EP3082066B1 (en) Road surface gradient detection device
RU2540849C2 (en) Device for detecting three-dimensional object and method of detecting three-dimensional object
US10041791B2 (en) Object detection apparatus method
WO2012039496A1 (en) Track estimation device and program
CN105005758A (en) Image processing apparatus
JP2001101415A (en) Image recognizing device and image processor
US9990549B2 (en) Complex marking determining device and complex marking determining method
CN105335955A (en) Object detection method and object detection apparatus
CN111179291B (en) Edge pixel point extraction method and device based on neighborhood relation
CN109163775B (en) Quality measurement method and device based on belt conveyor
CN112597846B (en) Lane line detection method, lane line detection device, computer device, and storage medium
US8619162B2 (en) Image processing apparatus and method, and image processing program
CN103888690B (en) Device and method for detecting defect pixel
JP5264457B2 (en) Object detection device
US10977803B2 (en) Correlation value calculation device
WO2014054124A1 (en) Road surface markings detection device and road surface markings detection method
JP2018137667A (en) Camera calibration method, program and device
JP4605582B2 (en) Stereo image recognition apparatus and method
CN112070081B (en) Intelligent license plate recognition method based on high-definition video
CN111435080B (en) Water level measuring method, device and system
CN107886550B (en) Image editing propagation method and system
US9430959B2 (en) Character region pixel identification device and method thereof
CN113112502B (en) Cable detection method, robot and device with storage function
CN112115784B (en) Lane line identification method and device, readable storage medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20220330

Address after: 430000 No. b1336, chuanggu startup area, taizihu cultural Digital Creative Industry Park, No. 18, Shenlong Avenue, Wuhan Economic and Technological Development Zone, Wuhan, Hubei Province

Patentee after: Yikatong (Hubei) Technology Co.,Ltd.

Address before: No.c101, chuanggu start up zone, taizihu cultural Digital Industrial Park, No.18 Shenlong Avenue, Wuhan Economic and Technological Development Zone, Hubei Province

Patentee before: HUBEI ECARX TECHNOLOGY Co.,Ltd.
