CN112966782A - Multi-view-angle feature-fused road surface water detection and identification method - Google Patents
- Publication number: CN112966782A (application CN202110380331.2A)
- Authority
- CN
- China
- Prior art keywords: calculating, hog, area, color, detecting
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G06F18/253: Fusion techniques of extracted features
- G06F18/22: Matching criteria, e.g. proximity measures
- G06V10/462: Salient features, e.g. scale invariant feature transforms (SIFT)
- G06V10/50: Extraction of image or video features using histograms, e.g. histogram of oriented gradients (HOG)
- G06V10/56: Extraction of image or video features relating to colour
Abstract
The invention discloses a method for detecting and identifying road surface water through multi-view feature fusion, which comprises the following steps. Step S1: perform offline surround-view calibration to obtain the image coordinates corresponding to the ground common area under different viewing angles. Step S3: input a candidate area and judge whether it lies in the common area. If it does, perform step S2: detect surface water in the common area based on color histograms and information entropy. If it does not, perform step S4: detect surface water in the non-common area based on HOG features. Step S5: output the result. The method has the advantage, among others, of detecting road surface water quickly.
Description
Technical Field
The invention relates to the technical field of vehicle-mounted detection, and in particular to a method for detecting and identifying road surface water through multi-view feature fusion.
Background
With the continuous adoption of AI technology in vehicle-mounted applications, automatic driving and driver-assistance functions based on camera detection and recognition are gradually being deployed in passenger cars. Standing water that collects in depressions of the road surface strongly affects vehicle-mounted visual perception: it causes false recognitions that lead to erroneous vehicle maneuvers and creates a safety hazard. Accurate identification of surface water is therefore essential for safer and more reliable intelligent vehicle vision.
At present, the main methods for identifying standing water are the following. First, target classification based on machine learning, which requires a large amount of training data and considerable computing power; it is mostly applied in security-monitoring scenes and performs poorly on driving roads with dynamically changing backgrounds. Second, methods based on polarized-light measurement; vehicle-mounted lenses cannot currently accommodate the required optical structure, and smooth, concave ground reflections also introduce polarization interference. Third, standing-water detection based on a reflection optical model and a white-point prior; this approach easily mistakes lane lines for water reflection surfaces, resulting in many false detections.
Disclosure of Invention
The invention provides a method for detecting and identifying road surface water through multi-view feature fusion, aiming to solve the above problems in existing surface water detection.
According to an embodiment of the application, a method for detecting and identifying road surface water through multi-view feature fusion is provided, comprising the following steps:
step S1: performing offline surround-view calibration to obtain the image coordinates corresponding to the ground common area under different viewing angles;
step S3: inputting a candidate area and judging whether the candidate area lies in the common area;
if the candidate area is in the common area, performing step S2: detecting surface water in the common area based on color histograms and information entropy;
if the candidate area is not in the common area, performing step S4: detecting surface water in the non-common area based on HOG features;
step S5: outputting the result.
Preferably, the step S1 includes:
step S11: based on a calibration field, calibrating the two-dimensional top view and obtaining from the calibration the correspondence between ground space points and original image coordinates;
step S12: based on the two-dimensional top view, obtaining the ground coordinates of the common area under the different viewing angles, applying the inverse of the mapping obtained in step S11, and outputting the image coordinates corresponding to the common area;
in step S11, F_i denotes the conversion from image coordinates to ground coordinates, where i = 0, 1, 2, 3 indexes the four viewing angles: right, rear, left, and front.
Preferably, the step S2 includes:
step S21: extracting a target image of a first viewing angle and a target image of a second viewing angle based on the common area coordinates;
step S22: calculating the color histograms of the two target images;
step S23: matching the color histograms and calculating their similarity;
step S24: calculating the information entropy deviation;
step S25: judging whether water is accumulated on the road surface based on the color histogram similarity and the information entropy deviation.
Preferably, the step S21 includes: extracting the target image I_left of the first viewing angle and the target image I_right of the second viewing angle. The step S22 includes step S221: color reduction, I_left_color = I_left / 15 and I_right_color = I_right / 15; and step S222: counting the RGB-channel gray-level histograms H_rgb_left and H_rgb_right of the reduced images I_left_color and I_right_color.
preferably, the step S23 includes:
step S231: retrieving the dominant colors C_left and C_right (the gray levels with the highest counts) and calculating their difference:
d_C = abs(C_left - C_right);
step S232: if d_C is greater than 30, setting the histogram similarity S_hist = 0 and returning it;
step S233: finding the histogram interval centered on C_left that covers no less than 70% of all pixels, with total occupancy p_left, and calculating the proportion p_right of pixels in H_rgb_right over the same interval;
step S234: calculating the similarity S_hist = 1 - abs(p_left - p_right).
The step S24 includes: calculating the information entropies of I_left and I_right to obtain En_left and En_right, and calculating the information entropy deviation D_En from them.
In step S25, the condition for judging that water is accumulated on the road surface is: S_hist less than 0.4 and D_En less than 0.4.
Preferably, the step S4 includes:
step S41: calculating the HOG feature HOG_cur of the target region R_cur in the current frame;
step S42: calculating the target regions R_pre_i corresponding to the previous 5 frames based on motion compensation;
step S43: calculating the HOG features HOG_pre_i over the regions R_pre_i of the previous 5 frames;
step S44: calculating the differences between the feature vectors HOG_cur and HOG_pre_i, and their mean value D_hog_mean;
step S45: calculating the variance δ_hog of the feature vector group composed of HOG_cur and HOG_pre_i;
step S46: judging whether water is accumulated on the road surface of the area: if D_hog_mean is greater than 0.7 and δ_hog is greater than 0.2, accumulated water exists on the road surface of the target area.
The technical scheme provided by the embodiments of the application can have the following beneficial effects. The scheme addresses the shortcomings of existing standing-water detection methods when applied in a vehicle-mounted environment. Based on the characteristics of the optical sensing system formed by the vehicle-mounted surround-view cameras, the invention designs a multi-view feature-fusion method for detecting and identifying road surface water that is simple to implement and stable in performance. Standing water reflects illumination specularly, so the same flooded surface observed from different angles, and over time from the same angle, presents corresponding changes in the video images (different gray levels, colors, and textures across viewing angles; changing gray levels, colors, and textures at the same viewing angle as the water surface moves). Based on this prior knowledge, the method efficiently measures the difference of ground image features across viewing angles and the variation of image features within the video stream at the same viewing angle, and combines the two to distinguish normal ground from flooded ground, thereby achieving rapid detection of road surface water.
Because the four lenses differ in color and brightness under varying illumination, the surround-view system corrects chromaticity and brightness for consistency, so that under normal conditions the color and brightness of the same ground area observed from different viewing angles are essentially similar. For the common area observed from different viewing angles, the invention judges whether water is accumulated by calculating local color histograms and information entropies and fusing the two features. For non-common areas, it extracts the local histogram-of-oriented-gradients (HOG) features of the same ground area across video frames, calculates the magnitude of the inter-frame HOG change, and judges from this magnitude whether water is accumulated on the corresponding ground.
Drawings
To more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below show some embodiments of the present invention; those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a schematic flow chart of a method for detecting and identifying surface water through multi-view feature fusion according to the present invention;
fig. 2 is a schematic flow chart of step S1 in the method for detecting and identifying surface water with multi-view feature fusion according to the present invention;
fig. 3 is a schematic flowchart of step S2 in the method for detecting and identifying surface water with multi-view feature fusion according to the present invention;
fig. 4 is a schematic flowchart of step S23 in the method for detecting and identifying surface water with multi-view feature fusion according to the present invention;
fig. 5 is a schematic flow chart of step S4 in the method for detecting and identifying surface water with multi-view feature fusion according to the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It is also to be understood that the terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in this specification and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
Referring to fig. 1, the invention discloses a method 10 for detecting and identifying road surface water through multi-view feature fusion, which comprises the following steps:
step S1: performing offline surround-view calibration to obtain the image coordinates corresponding to the ground common area under different viewing angles;
step S3: inputting a candidate area and judging whether the candidate area lies in the common area;
if the candidate area is in the common area, performing step S2: detecting surface water in the common area based on color histograms and information entropy;
if the candidate area is not in the common area, performing step S4: detecting surface water in the non-common area based on HOG features;
step S5: outputting the result.
By adopting this design, the scheme addresses the shortcomings of existing standing-water detection methods when applied in a vehicle-mounted environment. Based on the characteristics of the optical sensing system formed by the vehicle-mounted surround-view cameras, the invention provides a multi-view feature-fusion method for detecting and identifying road surface water that is simple to implement and stable in performance. Standing water reflects illumination specularly, so the same flooded surface observed from different angles, and over time from the same angle, presents corresponding changes in the video images (different gray levels, colors, and textures across viewing angles; changing gray levels, colors, and textures at the same viewing angle as the water surface moves). Based on this prior knowledge, the method efficiently measures the difference of ground image features across viewing angles and the variation of image features within the video stream at the same viewing angle, and combines the two to distinguish normal ground from flooded ground, achieving rapid detection of road surface water.
Because the four lenses differ in color and brightness under varying illumination, the surround-view system corrects chromaticity and brightness for consistency, so that under normal conditions the color and brightness of the same ground area observed from different viewing angles are essentially similar. For the common area observed from different viewing angles, the invention judges whether water is accumulated by calculating local color histograms and information entropies and fusing the two features. For non-common areas, it extracts the local histogram-of-oriented-gradients (HOG) features of the same ground area across video frames, calculates the magnitude of the inter-frame HOG change, and judges from this magnitude whether water is accumulated on the corresponding ground.
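The overall flow (steps S1, S3, S2/S4, S5) amounts to a small dispatch routine. The sketch below is an illustration only; the common-area test and the two detector callables are hypothetical placeholders, not the patent's implementation:

```python
from typing import Callable

def detect_standing_water(candidate: str,
                          is_common_area: Callable[[str], bool],
                          common_detector: Callable[[str], bool],
                          hog_detector: Callable[[str], bool]) -> bool:
    """Step S3 routes the candidate area: common areas go to the cross-view
    color/entropy check (step S2), all others to the temporal HOG check
    (step S4); the returned boolean is the step S5 output."""
    if is_common_area(candidate):
        return common_detector(candidate)
    return hog_detector(candidate)

# Hypothetical usage with stub detectors:
wet = detect_standing_water("front-left patch",
                            is_common_area=lambda a: True,
                            common_detector=lambda a: False,
                            hog_detector=lambda a: True)
```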
Referring to fig. 2, the step S1 includes:
step S11: based on a calibration field, calibrating the two-dimensional top view and obtaining from the calibration the correspondence between ground space points and original image coordinates;
step S12: based on the two-dimensional top view, obtaining the ground coordinates of the common area under the different viewing angles, applying the inverse of the mapping obtained in step S11, and outputting the image coordinates corresponding to the common area;
in step S11, F_i denotes the conversion from image coordinates to ground coordinates, where i = 0, 1, 2, 3 indexes the four viewing angles: right, rear, left, and front.
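For a flat ground plane, a conversion F_i of this kind is typically a planar homography, although the patent does not name a specific form. A minimal numpy sketch of applying such a mapping and its inverse (the step S12 direction), with a made-up calibration matrix standing in for a real calibration result:

```python
import numpy as np

def apply_homography(H, pts):
    """Apply a 3x3 planar homography to an (N, 2) array of points."""
    pts = np.asarray(pts, dtype=float)
    homog = np.hstack([pts, np.ones((len(pts), 1))])  # to homogeneous coords
    mapped = homog @ H.T
    return mapped[:, :2] / mapped[:, 2:3]             # back to Euclidean

# Hypothetical calibration result F (image -> ground) for one view;
# real matrices would come from the calibration field of step S11.
F = np.array([[0.05, 0.0, -10.0],
              [0.0, 0.05, -5.0],
              [0.0, 0.0, 1.0]])

ground = apply_homography(F, [[200.0, 100.0]])        # step S11 direction
back = apply_homography(np.linalg.inv(F), ground)     # step S12: inverse mapping
```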
Referring to fig. 3, the step S2 includes:
step S21: extracting a target image of a first viewing angle and a target image of a second viewing angle based on the common area coordinates;
step S22: calculating the color histograms of the two target images;
step S23: matching the color histograms and calculating their similarity;
step S24: calculating the information entropy deviation;
step S25: judging whether water is accumulated on the road surface based on the color histogram similarity and the information entropy deviation.
Wherein the step S21 includes: extracting the target image I_left of the first viewing angle and the target image I_right of the second viewing angle. The step S22 includes step S221: color reduction, I_left_color = I_left / 15 and I_right_color = I_right / 15 (in the present embodiment, the reference gray interval is 15); and step S222: counting the RGB-channel gray-level histograms H_rgb_left and H_rgb_right of the reduced images I_left_color and I_right_color.
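A minimal numpy sketch of the color reduction and histogram counting of steps S221 and S222, assuming 8-bit RGB input and the reference gray interval of 15 (the synthetic random patch merely stands in for the extracted target image):

```python
import numpy as np

STEP = 15  # the embodiment's reference gray interval

def reduced_histogram(img, step=STEP):
    """Steps S221-S222: quantize each RGB channel by integer division
    ('color reduction'), then count one histogram per channel over the
    reduced gray levels (255 // 15 = 17, so levels 0..17)."""
    reduced = img // step
    n_bins = 256 // step + 1
    hists = [np.bincount(reduced[..., c].ravel().astype(int), minlength=n_bins)
             for c in range(3)]
    return reduced, np.stack(hists)

# Synthetic 8-bit RGB patch standing in for the target image I_left:
rng = np.random.default_rng(0)
I_left = rng.integers(0, 256, size=(32, 32, 3), dtype=np.uint8)
I_left_color, H_rgb_left = reduced_histogram(I_left)
```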
referring to fig. 4, the step S23 includes:
step S231: retrieving the dominant colors C_left and C_right (the gray levels with the highest counts) and calculating their difference:
d_C = abs(C_left - C_right);
step S232: if d_C is greater than 30, setting the histogram similarity S_hist = 0 and returning it;
step S233: finding the histogram interval centered on C_left that covers no less than 70% of all pixels, with total occupancy p_left, and calculating the proportion p_right of pixels in H_rgb_right over the same interval;
step S234: calculating the similarity S_hist = 1 - abs(p_left - p_right).
The step S24 includes: calculating the information entropies of I_left and I_right to obtain En_left and En_right, and calculating the information entropy deviation D_En from them.
In step S25, the condition for judging that water is accumulated on the road surface is: S_hist less than 0.4 and D_En less than 0.4.
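Steps S231-S234, S24, and S25 can be sketched as follows. This is our reading of the text, not a verbatim reproduction of the patent's implementation: the window-growing around the dominant level is one plausible interpretation of "centered on C_left", and since the exact deviation formula for D_En is elided in the text, a plain absolute difference is assumed:

```python
import numpy as np

def shannon_entropy(gray_img):
    """Information entropy of an 8-bit image's gray-level distribution."""
    counts = np.bincount(gray_img.ravel(), minlength=256)
    p = counts / counts.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def histogram_similarity(h_left, h_right, d_max=30, coverage=0.70):
    """Steps S231-S234: compare dominant gray levels, then compare the
    mass both histograms put on the window around the dominant level."""
    c_left, c_right = int(np.argmax(h_left)), int(np.argmax(h_right))
    if abs(c_left - c_right) > d_max:      # step S232: dominant colors differ
        return 0.0
    lo = hi = c_left                       # step S233: grow window to >= 70%
    total = h_left.sum()
    while h_left[lo:hi + 1].sum() < coverage * total:
        lo, hi = max(lo - 1, 0), min(hi + 1, len(h_left) - 1)
    p_left = h_left[lo:hi + 1].sum() / total
    p_right = h_right[lo:hi + 1].sum() / h_right.sum()
    return 1.0 - abs(p_left - p_right)     # step S234: S_hist

def entropy_deviation(en_left, en_right):
    """Step S24. Assumed form: the patent text elides the exact formula."""
    return abs(en_left - en_right)

def is_water(s_hist, d_en):
    """Step S25: water is declared when S_hist < 0.4 and D_En < 0.4."""
    return s_hist < 0.4 and d_en < 0.4
```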
Referring to fig. 5, the step S4 includes:
step S41: calculating the HOG feature HOG_cur of the target region R_cur in the current frame;
step S42: calculating the target regions R_pre_i corresponding to the previous 5 frames based on motion compensation;
step S43: calculating the HOG features HOG_pre_i over the regions R_pre_i of the previous 5 frames;
step S44: calculating the differences between the feature vectors HOG_cur and HOG_pre_i, and their mean value D_hog_mean;
step S45: calculating the variance δ_hog of the feature vector group composed of HOG_cur and HOG_pre_i;
step S46: judging whether water is accumulated on the road surface of the area: if D_hog_mean is greater than 0.7 and δ_hog is greater than 0.2, accumulated water exists on the road surface of the target area.
In this embodiment, the motion compensation of step S42 is not an innovative point of the invention and is not described further here. In step S46, the thresholds 0.7 and 0.2 are taken as reference values.
While the invention has been described with reference to specific embodiments, it is not limited thereto, and various equivalent modifications and substitutions can readily be made by those skilled in the art within the technical scope of the invention. The protection scope of the present invention shall therefore be subject to the claims.
Claims (6)
1. A method for detecting and identifying road surface water through multi-view feature fusion, characterized by comprising the following steps:
step S1: performing offline surround-view calibration to obtain the image coordinates corresponding to the ground common area under different viewing angles;
step S3: inputting a candidate area and judging whether the candidate area lies in the common area;
if the candidate area is in the common area, performing step S2: detecting surface water in the common area based on color histograms and information entropy;
if the candidate area is not in the common area, performing step S4: detecting surface water in the non-common area based on HOG features;
step S5: outputting the result.
2. The method for detecting and identifying multi-view feature-fused road surface water as claimed in claim 1, wherein the step S1 comprises:
step S11: based on a calibration field, calibrating the two-dimensional top view and obtaining from the calibration the correspondence between ground space points and original image coordinates;
step S12: based on the two-dimensional top view, obtaining the ground coordinates of the common area under the different viewing angles, applying the inverse of the mapping obtained in step S11, and outputting the image coordinates corresponding to the common area;
in step S11, F_i denotes the conversion from image coordinates to ground coordinates, where i = 0, 1, 2, 3 indexes the four viewing angles: right, rear, left, and front.
3. The method for detecting and identifying multi-view feature-fused road surface water as claimed in claim 2, wherein the step S2 comprises:
step S21: extracting a target image of a first viewing angle and a target image of a second viewing angle based on the common area coordinates;
step S22: calculating the color histograms of the two target images;
step S23: matching the color histograms and calculating their similarity;
step S24: calculating the information entropy deviation;
step S25: judging whether water is accumulated on the road surface based on the color histogram similarity and the information entropy deviation.
4. The method for detecting and identifying multi-view feature-fused road surface water as claimed in claim 3, wherein the step S21 comprises: extracting the target image I_left of the first viewing angle and the target image I_right of the second viewing angle; and the step S22 comprises step S221: color reduction, I_left_color = I_left / 15 and I_right_color = I_right / 15; and step S222: counting the RGB-channel gray-level histograms H_rgb_left and H_rgb_right of the reduced images I_left_color and I_right_color.
5. The method for detecting and identifying multi-view feature-fused road surface water as claimed in claim 4, wherein the step S23 comprises:
step S231: retrieving the dominant colors C_left and C_right (the gray levels with the highest counts) and calculating their difference:
d_C = abs(C_left - C_right);
step S232: if d_C is greater than 30, setting the histogram similarity S_hist = 0 and returning it;
step S233: finding the histogram interval centered on C_left that covers no less than 70% of all pixels, with total occupancy p_left, and calculating the proportion p_right of pixels in H_rgb_right over the same interval;
step S234: calculating the similarity S_hist = 1 - abs(p_left - p_right);
the step S24 comprises: calculating the information entropies of I_left and I_right to obtain En_left and En_right, and calculating the information entropy deviation D_En from them;
in step S25, the condition for judging that water is accumulated on the road surface is: S_hist less than 0.4 and D_En less than 0.4.
6. The method for detecting and identifying multi-view feature-fused road surface water as claimed in claim 1, wherein the step S4 comprises:
step S41: calculating the HOG feature HOG_cur of the target region R_cur in the current frame;
step S42: calculating the target regions R_pre_i corresponding to the previous 5 frames based on motion compensation;
step S43: calculating the HOG features HOG_pre_i over the regions R_pre_i of the previous 5 frames;
step S44: calculating the differences between the feature vectors HOG_cur and HOG_pre_i, and their mean value D_hog_mean;
step S45: calculating the variance δ_hog of the feature vector group composed of HOG_cur and HOG_pre_i;
step S46: judging whether water is accumulated on the road surface of the area: if D_hog_mean is greater than 0.7 and δ_hog is greater than 0.2, accumulated water exists on the road surface of the target area.
Priority Applications (1)
- CN202110380331.2A, priority/filing date 2021-04-09: Multi-view-angle feature-fused road surface water detection and identification method
Publications (1)
- CN112966782A, published 2021-06-15
Family
- ID: 76280175
- Family application: CN202110380331.2A, filed 2021-04-09, status: Pending (China)
Cited By (1)
- CN113887412A (priority 2021-09-30, published 2022-01-04, 中国科学院过程工程研究所): Detection method, detection terminal, monitoring system and storage medium for pollution emission
Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101673403A (en) * | 2009-10-10 | 2010-03-17 | 安防制造(中国)有限公司 | Target following method in complex interference scene |
CN105809131A (en) * | 2016-03-08 | 2016-07-27 | 宁波裕兰信息科技有限公司 | Method and system for carrying out parking space waterlogging detection based on image processing technology |
CN106568403A (en) * | 2016-09-28 | 2017-04-19 | 深圳市元征科技股份有限公司 | Road surface accumulated water detection method and device |
CN107909070A (en) * | 2017-11-24 | 2018-04-13 | 天津英田视讯科技有限公司 | A kind of method of road water detection |
CN108109162A (en) * | 2018-01-08 | 2018-06-01 | 中国石油大学(华东) | A kind of multiscale target tracking merged using self-adaptive features |
CN108198422A (en) * | 2018-01-22 | 2018-06-22 | 中国地质大学(武汉) | A kind of road ponding extraction system and method based on video image |
CN109741391A (en) * | 2018-12-26 | 2019-05-10 | 斑马网络技术有限公司 | Detection method, device and the storage medium of surface gathered water depth |
CN109766924A (en) * | 2018-12-20 | 2019-05-17 | 东南大学 | Image detecting method based on image information entropy Yu adaptive threshold DAISY characteristic point |
CN110490194A (en) * | 2019-07-24 | 2019-11-22 | 广东工业大学 | A kind of recognition methods of the multiple features segment fusion traffic sign of adaptive weight |
CN111860277A (en) * | 2020-07-14 | 2020-10-30 | 南京航空航天大学 | Civil aviation aircraft airspeed pipe sleeve safety warning method based on color histogram features |
CN112070740A (en) * | 2020-09-05 | 2020-12-11 | 王坚 | BIM-based port storage yard accumulated water imaging method, system and control equipment |
- 2021-04-09: CN application CN202110380331.2A filed, published as CN112966782A (status: Pending)
Patent Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101673403A (en) * | 2009-10-10 | 2010-03-17 | 安防制造(中国)有限公司 | Target tracking method in complex interference scenes |
CN105809131A (en) * | 2016-03-08 | 2016-07-27 | 宁波裕兰信息科技有限公司 | Method and system for parking space waterlogging detection based on image processing technology |
CN106568403A (en) * | 2016-09-28 | 2017-04-19 | 深圳市元征科技股份有限公司 | Road surface accumulated water detection method and device |
WO2018058856A1 (en) * | 2016-09-28 | 2018-04-05 | 深圳市元征科技股份有限公司 | Road-surface ponding detection method and device |
CN107909070A (en) * | 2017-11-24 | 2018-04-13 | 天津英田视讯科技有限公司 | Road surface water detection method |
CN108109162A (en) * | 2018-01-08 | 2018-06-01 | 中国石油大学(华东) | Multi-scale target tracking method using adaptive feature fusion |
CN108198422A (en) * | 2018-01-22 | 2018-06-22 | 中国地质大学(武汉) | Road ponding extraction system and method based on video images |
CN109766924A (en) * | 2018-12-20 | 2019-05-17 | 东南大学 | Image detection method based on image information entropy and adaptive-threshold DAISY feature points |
CN109741391A (en) * | 2018-12-26 | 2019-05-10 | 斑马网络技术有限公司 | Method, device and storage medium for detecting road surface water depth |
CN110490194A (en) * | 2019-07-24 | 2019-11-22 | 广东工业大学 | Adaptive-weight multi-feature fusion traffic sign recognition method |
CN111860277A (en) * | 2020-07-14 | 2020-10-30 | 南京航空航天大学 | Safety warning method for civil aviation aircraft pitot tube covers based on color histogram features |
CN112070740A (en) * | 2020-09-05 | 2020-12-11 | 王坚 | BIM-based port storage yard accumulated water imaging method, system and control equipment |
Non-Patent Citations (1)
Title |
---|
朱爽 (Zhu Shuang): "Image Similarity Calculation Using the Histogram Area Method", 《测绘通报》 (Bulletin of Surveying and Mapping), no. 12 *
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113887412A (en) * | 2021-09-30 | 2022-01-04 | 中国科学院过程工程研究所 | Detection method, detection terminal, monitoring system and storage medium for pollution emission |
CN113887412B (en) * | 2021-09-30 | 2022-06-07 | 中国科学院过程工程研究所 | Detection method, detection terminal, monitoring system and storage medium for pollution emission |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Sun et al. | HSI color model based lane-marking detection | |
JP5867807B2 (en) | Vehicle identification device | |
TWI302879B (en) | Real-time nighttime vehicle detection and recognition system based on computer vision | |
CN109190523B (en) | Vehicle detection tracking early warning method based on vision | |
CN109299674B (en) | Tunnel illegal lane change detection method based on car lamp | |
JP4577153B2 (en) | Environment recognition device | |
TWI401473B (en) | Night time pedestrian detection system and method | |
CN114639078A (en) | Vehicle type recognition method, device and system | |
Kawai et al. | A method to distinguish road surface conditions for car-mounted camera images at night-time | |
Lin et al. | Adaptive IPM-based lane filtering for night forward vehicle detection | |
CN109919062A (en) | Road scene weather recognition method based on feature fusion | |
JP2006318059A (en) | Apparatus, method, and program for image processing | |
CN113200052A (en) | Intelligent road condition identification method for unmanned driving | |
CN112419745A (en) | Highway group fog early warning system based on degree of depth fusion network | |
CN107045630B (en) | RGBD-based pedestrian detection and identity recognition method and system | |
CN112966782A (en) | Multi-view-angle feature-fused road surface water detection and identification method | |
Skodras et al. | Rear lights vehicle detection for collision avoidance | |
KR101620425B1 (en) | System for lane recognition using environmental information and method thereof | |
CN107506739B (en) | Night forward vehicle detection and distance measurement method | |
Sanberg et al. | Color-based free-space segmentation using online disparity-supervised learning | |
Barua et al. | An Efficient Method of Lane Detection and Tracking for Highway Safety | |
Dai et al. | A driving assistance system with vision based vehicle detection techniques | |
JP4070450B2 (en) | Forward vehicle recognition device and recognition method | |
CN110688876A (en) | Lane line detection method and device based on vision | |
Shen | Complex Lane Line Detection Under Autonomous Driving |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||