CN115755097A - Weather condition detection method, device, equipment and storage medium

Weather condition detection method, device, equipment and storage medium

Info

Publication number
CN115755097A
Authority
CN
China
Prior art keywords: point cloud, candidate point, pixel, target, candidate
Legal status: Pending (assumption, not a legal conclusion; Google has not performed a legal analysis)
Application number
CN202211390299.7A
Other languages
Chinese (zh)
Inventor
蔡禹丞
桂晨光
许新玉
Current Assignee (the listed assignee may be inaccurate)
Beijing Jingdong Qianshi Technology Co Ltd
Original Assignee
Beijing Jingdong Qianshi Technology Co Ltd
Application filed by Beijing Jingdong Qianshi Technology Co Ltd filed Critical Beijing Jingdong Qianshi Technology Co Ltd
Priority to CN202211390299.7A
Publication of CN115755097A
Legal status: Pending

Classifications

    • Y — GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 — TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A — TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A 90/00 — Technologies having an indirect contribution to adaptation to climate change
    • Y02A 90/10 — Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Landscapes

  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The disclosure provides a weather condition detection method, apparatus, device, and storage medium, which can be applied to the fields of Internet of Vehicles, intelligent driving, and smart cities. The method comprises the following steps: processing an initial point cloud data set based on a preset conversion rule to obtain a candidate depth image containing a candidate point cloud pixel set, wherein the initial point cloud data set comprises data obtained by a target detection device detecting a space to be detected; screening out a first target point cloud pixel from the candidate point cloud pixel set according to the candidate point cloud pixel spacing between different candidate point cloud pixels in the set, wherein the first target point cloud pixel represents at least part of the features of a target object in the space to be detected; and determining the abnormal weather condition of the space to be detected based on the first target point cloud pixel.

Description

Weather condition detection method, device, equipment and storage medium
Technical Field
The present disclosure relates to the field of car networking, the field of intelligent driving, and the field of intelligent cities, and more particularly, to a weather condition detection method, apparatus, device, storage medium, and program product.
Background
With the rapid development of science and technology, intelligent driver-assistance technologies based on detection devices such as lidar are widely applied in passenger vehicles to improve vehicle safety and driving convenience. Meanwhile, intelligent driver-assistance technology can also be applied to many other scenarios, such as goods transportation by unmanned vehicles, giving it a very wide application range. In the related art, intelligent driver-assistance technology may perform target detection, such as obstacle detection, on the surrounding environment by using a detection device such as a lidar, and determine target objects around the vehicle based on the detection result, so that the vehicle can be controlled according to the detected target objects to implement automatic driving or driver-assistance functions.
In implementing the disclosed concept, the inventors found at least the following problems in the related art: under abnormal weather conditions, such as rainfall and snowfall, raindrops, snowflakes, and the like are often mistakenly recognized as obstacles by the intelligent driver-assistance technology, leading to subsequent erroneous driving states such as emergency avoidance. The related art, however, provides no method capable of accurately detecting abnormal weather, so the intelligent driver-assistance system frequently enters such erroneous driving states.
Disclosure of Invention
In view of the above, the present disclosure provides a weather condition detection method, apparatus, device, storage medium, and program product.
One aspect of the present disclosure provides a weather condition detection method, including:
processing an initial point cloud data set based on a preset conversion rule to obtain a candidate depth image containing a candidate point cloud pixel set, wherein the initial point cloud data set comprises data obtained after a target detection device detects a space to be detected;
screening out a first target point cloud pixel from the candidate point cloud pixel set according to the candidate point cloud pixel spacing between different candidate point cloud pixels in the candidate point cloud pixel set, wherein the first target point cloud pixel represents at least part of the features of a target object in the space to be detected; and
determining the abnormal weather condition of the space to be detected based on the first target point cloud pixel.
According to an embodiment of the present disclosure, the weather condition detection method further includes:
calculating the candidate point cloud pixel distance between adjacent candidate point cloud pixels according to the candidate point cloud pixel parameters of the adjacent candidate point cloud pixels in the candidate depth image;
wherein, according to the candidate point cloud pixel set and the candidate point cloud pixel distance between different candidate point cloud pixels, screening out a first target point cloud pixel from the candidate point cloud pixel set comprises:
determining the candidate point cloud pixels corresponding to a target point cloud pixel spacing as the first target point cloud pixel according to a comparison result between the candidate point cloud pixel spacing and the screening spacing threshold corresponding to that spacing, wherein the target point cloud pixel spacing is a candidate point cloud pixel spacing whose comparison result with the corresponding screening spacing threshold satisfies a preset condition.
According to an embodiment of the disclosure, the candidate point cloud pixel pitch includes a first direction candidate point cloud pixel pitch between candidate point cloud pixels adjacent to each other in a first direction and a second direction candidate point cloud pixel pitch between candidate point cloud pixels adjacent to each other in a second direction in the candidate depth image, and the first direction and the second direction are two intersecting directions in the candidate depth image;
determining a candidate point cloud pixel corresponding to a target point cloud pixel spacing as the first target point cloud pixel according to a comparison result of the candidate point cloud pixel spacing and a screening spacing threshold corresponding to the candidate point cloud pixel spacing includes:
determining a first direction candidate point cloud pixel interval meeting the preset condition as a first direction target point cloud pixel interval according to a first direction comparison result of the first direction candidate point cloud pixel interval and a screening interval threshold corresponding to the first direction candidate point cloud pixel interval;
deleting first-direction target point cloud pixels in the candidate point cloud pixel set to obtain a target candidate point cloud pixel set, wherein the first-direction target point cloud pixels are candidate point cloud pixels corresponding to the first-direction target point cloud pixel pitch;
determining a second direction candidate point cloud pixel pitch satisfying the preset condition as a second direction target point cloud pixel pitch according to a second direction candidate point cloud pixel pitch between adjacent target candidate point cloud pixels in the target candidate point cloud pixel set and a second direction comparison result of a screening pitch threshold corresponding to the second direction candidate point cloud pixel pitch; and
respectively screening out, from the candidate point cloud pixel set, the first target point cloud pixels corresponding to the first direction target point cloud pixel spacing and the second direction target point cloud pixel spacing.
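The patent gives no code for this two-direction screening; the following is a minimal sketch of one plausible reading, in which a pixel whose depth is close to a horizontally or vertically adjacent pixel is kept as a first target (object) pixel, while isolated returns (e.g. rain or snow) are not. The function name and the single fixed threshold are illustrative assumptions; the embodiments instead derive a per-spacing screening threshold from the reference ray geometry.

```python
import numpy as np

def screen_target_pixels(depth, thresh):
    """Flag pixels whose spacing (absolute depth difference) to a horizontal
    or vertical neighbour is below the screening threshold.

    depth  : 2-D range image, 0 marks empty pixels
    thresh : screening spacing threshold (a fixed scalar in this sketch)
    """
    valid = depth > 0
    target = np.zeros_like(valid)

    # first direction: horizontally adjacent candidate pixels
    dh = np.abs(depth[:, 1:] - depth[:, :-1])
    close_h = (dh < thresh) & valid[:, 1:] & valid[:, :-1]
    target[:, 1:] |= close_h
    target[:, :-1] |= close_h

    # second direction: vertically adjacent candidate pixels
    dv = np.abs(depth[1:, :] - depth[:-1, :])
    close_v = (dv < thresh) & valid[1:, :] & valid[:-1, :]
    target[1:, :] |= close_v
    target[:-1, :] |= close_v
    return target & valid
```

On a toy range image, two adjacent pixels at similar depth survive the screening, while a lone return does not.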
According to the embodiment of the disclosure, the screening pitch threshold corresponding to the pixel pitch of the candidate point cloud is calculated in the following manner:
determining a reference candidate point cloud pixel forming the candidate point cloud pixel pitch, wherein the reference candidate point cloud pixel is a candidate point cloud pixel closest to the target detection device in the candidate point cloud pixels forming the candidate point cloud pixel pitch;
calculating a reference distance between the reference candidate point cloud pixel and an adjacent ray according to the reference pixel position of the reference candidate point cloud pixel, wherein the adjacent ray is a ray corresponding to each of other candidate point cloud pixels adjacent to the reference candidate point cloud pixel;
processing the reference distance based on a preset screening distance rule to calculate the screening distance threshold corresponding to the candidate point cloud pixel distance.
According to an embodiment of the present disclosure, calculating a reference distance between the reference candidate point cloud pixel and an adjacent ray according to the reference pixel position of the reference candidate point cloud pixel includes:
determining a virtual candidate point cloud pixel with the reference pixel depth position on the adjacent ray according to the reference pixel depth position of the reference candidate point cloud pixel;
calculating a reference arc distance between the reference candidate point cloud pixel and the virtual candidate point cloud pixel according to the included angle between a reference ray and the adjacent ray, with the reference pixel depth position as a preset radius, wherein the reference arc distance is used as the reference distance; or
calculating a reference straight-line distance between the reference candidate point cloud pixel and the virtual candidate point cloud pixel according to the included angle between the reference ray and the adjacent ray, with the reference pixel depth position as a side length of a preset triangle, wherein the reference straight-line distance is used as the reference distance;
the reference ray is a ray corresponding to the reference candidate point cloud pixel.
According to the embodiment of the disclosure, processing the initial point cloud data set based on the preset conversion rule to obtain the candidate depth image containing the candidate point cloud pixel set comprises:
determining a candidate point cloud area according to the detection performance parameters of the target detection device;
determining candidate point cloud data located in the candidate point cloud area from the initial point cloud data set according to the candidate point cloud area, the initial point cloud positions of the initial point cloud data set and the initial point cloud data; and
performing data conversion on the candidate point cloud data to obtain the candidate depth image containing the candidate point cloud pixel set.
According to an embodiment of the present disclosure, determining a candidate point cloud region according to the detection performance parameter of the target detection apparatus includes:
determining a first detection distance and a second detection distance according to the detection performance parameters; and
determining a candidate point cloud area in the space to be detected based on the difference distance between the first detection distance and the second detection distance in the space to be detected.
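One way to realize this step is to treat the candidate point cloud area as the spherical shell between the first and second detection distances and keep only points whose range from the sensor falls inside it. This is a hedged sketch, not the patent's definitive construction; the function name and the idea that d_near/d_far come directly from the detection performance parameters are assumptions.

```python
import numpy as np

def select_candidate_points(points, d_near, d_far):
    """Keep points whose range from the sensor lies in [d_near, d_far].

    points : N x 3 array of initial point cloud positions (sensor frame)
    d_near, d_far : first and second detection distances (assumed to come
                    from the sensor's detection performance parameters)
    """
    r = np.linalg.norm(points, axis=1)
    return points[(r >= d_near) & (r <= d_far)]
```

Points nearer than d_near or farther than d_far are excluded from the candidate set before conversion to the depth image.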
According to an embodiment of the present disclosure, the above detection performance parameter includes at least one of:
angular resolution, detection distance, number of points.
According to the embodiment of the disclosure, determining the abnormal weather condition of the space to be detected based on the first target point cloud pixel comprises:
determining a second target number according to the difference value between the candidate point cloud pixel number of the candidate point cloud pixels in the candidate point cloud pixel set and the first target number of the first target point cloud pixels; and
determining that the weather condition in the space to be detected is an abnormal weather condition when the second target number is greater than or equal to a preset weather screening threshold.
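The decision rule just described reduces to a count and a comparison: the second target number is the number of candidate pixels not attributed to a target object, and abnormal weather is flagged when it reaches the threshold. A minimal sketch (the function name is illustrative):

```python
def detect_abnormal_weather(num_candidates, num_first_target, weather_threshold):
    """Second target number = candidate pixels minus first target (object)
    pixels; abnormal weather is flagged when it reaches the threshold."""
    second_target = num_candidates - num_first_target
    return second_target >= weather_threshold
```

For example, 1000 candidate pixels of which only 400 belong to objects leave 600 unexplained returns, which exceeds a threshold of 500 and flags abnormal weather.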
According to an embodiment of the present disclosure, the weather condition detection method further includes:
filtering out, from the candidate point cloud pixel set, the candidate point cloud pixels that are the same as second target point cloud pixels, to obtain a target detection point cloud pixel set, wherein the second target point cloud pixels are the candidate point cloud pixels other than the first target point cloud pixels;
according to target detection point cloud pixels in the target detection point cloud pixel set, screening target detection point cloud data corresponding to the target detection point cloud pixels from the initial point cloud data set; and
detecting the target object in the space to be detected based on the target detection point cloud data.
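The filtering step above can be sketched as a mask lookup: each point maps back to its depth-image pixel, and only points whose pixel survived the spacing-based screening are kept for downstream obstacle detection. Function and parameter names are illustrative assumptions.

```python
import numpy as np

def filter_for_target_detection(points, pixel_index, is_first_target):
    """Drop points whose depth-image pixel is a second target (noise) pixel.

    points          : N x 3 array of initial point cloud data
    pixel_index     : N x 2 array mapping each point to its (row, col) pixel
    is_first_target : boolean mask from the spacing-based screening
    """
    keep = is_first_target[pixel_index[:, 0], pixel_index[:, 1]]
    return points[keep]
```

The returned subset corresponds to the target detection point cloud data fed to the object detector.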
According to an embodiment of the present disclosure, the object detection device includes at least one of:
laser radar detection device, millimeter wave radar detection device.
According to an embodiment of the present disclosure, the above abnormal weather condition includes at least one of:
rainfall weather conditions, snowfall weather conditions, hail weather conditions, sand blowing weather conditions.
Another aspect of the present disclosure also provides a weather condition detecting apparatus, including:
the point cloud data conversion module is used for processing an initial point cloud data set based on a preset conversion rule to obtain a candidate depth image containing a candidate point cloud pixel set, wherein the initial point cloud data set comprises data obtained by detecting a space to be detected through a target detection device;
the first screening module is used for screening out first target point cloud pixels from the candidate point cloud pixel set according to the candidate point cloud pixel spacing between different candidate point cloud pixels in the candidate point cloud pixel set, wherein the first target point cloud pixels represent at least part of the features of a target object in the space to be detected; and
the weather condition determining module is used for determining the abnormal weather condition of the space to be detected based on the first target point cloud pixel.
Another aspect of the present disclosure provides an electronic device including:
one or more processors;
a memory for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the weather condition detection method as described above.
Another aspect of the present disclosure provides a computer-readable storage medium storing computer-executable instructions for implementing the weather condition detection method as described above when executed.
Another aspect of the disclosure provides a computer program product comprising computer executable instructions for implementing the weather condition detection method as described above when executed.
According to embodiments of the present disclosure, the initial point cloud data set detected by the target detection device is processed according to the preset conversion rule, and first target point cloud pixels that at least partially represent the target object are screened from the resulting candidate point cloud pixel set according to the candidate point cloud pixel spacing. The factors in the initial point cloud data set that are unrelated to the weather condition of the space to be detected can thereby be effectively analyzed and, according to the first target number of the first target point cloud pixels, accurately filtered out. This at least partially improves the detection accuracy of the weather condition in the space to be detected, and responding to the detected abnormal weather condition can improve the working stability of the related intelligent driver-assistance system.
Drawings
The above and other objects, features and advantages of the present disclosure will become more apparent from the following description of embodiments of the present disclosure with reference to the accompanying drawings, in which:
FIG. 1 schematically illustrates an exemplary system architecture to which the weather condition detection method and apparatus may be applied, according to an embodiment of the disclosure;
FIG. 2 schematically illustrates a flow chart of a method of weather condition detection according to an embodiment of the present disclosure;
FIG. 3 schematically illustrates a flow chart of processing an initial point cloud data set based on a predetermined transformation rule to obtain a candidate depth image including a candidate point cloud pixel set according to an embodiment of the disclosure;
FIG. 4 schematically illustrates a flow chart for determining candidate point cloud pixels corresponding to a target point cloud pixel spacing as first target point cloud pixels according to a comparison of the candidate point cloud pixel spacing and a screening spacing threshold corresponding to the candidate point cloud pixel spacing, in accordance with an embodiment of the present disclosure;
fig. 5 schematically illustrates a schematic diagram of calculating a screening pitch threshold according to an embodiment of the present disclosure;
FIG. 6 schematically shows a flow of a weather condition detection method according to another embodiment of the disclosure;
FIG. 7 schematically illustrates a block diagram of a weather condition detection apparatus according to an embodiment of the present disclosure; and
fig. 8 schematically shows a block diagram of an electronic device adapted to implement a weather condition detection method according to an embodiment of the present disclosure.
Detailed Description
Hereinafter, embodiments of the present disclosure will be described with reference to the accompanying drawings. It should be understood that the description is illustrative only and is not intended to limit the scope of the present disclosure. In the following detailed description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the embodiments of the disclosure. It may be evident, however, that one or more embodiments may be practiced without these specific details. Moreover, in the following description, descriptions of well-known structures and techniques are omitted so as to not unnecessarily obscure the concepts of the present disclosure.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. The terms "comprises," "comprising," and the like, as used herein, specify the presence of stated features, steps, operations, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, or components.
All terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art unless otherwise defined. It is noted that the terms used herein should be interpreted as having a meaning that is consistent with the context of this specification and should not be interpreted in an idealized or overly formal sense.
Where a convention analogous to "at least one of A, B, and C, etc." is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B, and C" would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.).
In the technical scheme of the disclosure, before the personal information of the user is acquired or collected, the authorization or the consent of the user is acquired.
In the technical scheme of the disclosure, the collection, storage, use, processing, transmission, provision, disclosure, application and other processing of the personal information of the related user are all in accordance with the regulations of related laws and regulations, necessary confidentiality measures are taken, and the customs of the public order is not violated.
In a related intelligent driver-assistance system, a target object in the space to be detected is usually detected with a target detection device such as a lidar or millimeter-wave radar, so that a vehicle or other means of transport can be controlled to perform operating-state transitions such as emergency avoidance according to the detection result. However, under abnormal weather conditions such as rainfall and snowfall, abnormal weather factors such as raindrops and snowflakes introduce noise into the target detection device. Such noise can trigger the intelligent driver-assistance system to make a collision judgment and then issue an erroneous braking or sudden-stop control signal. If the intelligent driver-assistance system were informed that the space to be detected is under an abnormal weather condition such as rain or snow, it could respond with a different strategy, but the related art lacks a reliable method for determining abnormal weather conditions. For example, the system may be told the weather condition by manual external input, or an image may be captured with a camera and then processed with a visual algorithm to identify rain or snow.
However, such weather judgment methods have poor real-time performance and low accuracy. For example, when rain and snow are identified from camera images with a visual algorithm, the method places high demands on camera imaging quality and on the algorithm; in rainy weather in particular, the camera can hardly capture a clear image, the algorithm struggles to make an accurate identification, and the weather judgment accuracy is low.
The reason is that the scene in rainy and snowy weather is complex, so the intelligent driver-assistance system can hardly sense the current weather condition automatically. Tiny raindrops or snowflakes are difficult for a camera to capture and remove. When a target detection device such as a lidar collects point cloud data in the space to be detected in rainy or snowy weather, the intelligent driver-assistance system can hardly distinguish the noise point cloud formed by rain and snow from real obstacles.
Embodiments of the present disclosure provide a weather condition detection method, apparatus, device, storage medium, and program product. The weather condition detection method comprises the following steps: processing an initial point cloud data set based on a preset conversion rule to obtain a candidate depth image containing a candidate point cloud pixel set, wherein the initial point cloud data set comprises data obtained by a target detection device detecting a space to be detected; screening out a first target point cloud pixel from the candidate point cloud pixel set according to the candidate point cloud pixel spacing between different candidate point cloud pixels in the set, wherein the first target point cloud pixel represents at least part of the features of a target object in the space to be detected; and determining the abnormal weather condition of the space to be detected based on the first target point cloud pixel.
According to embodiments of the present disclosure, the initial point cloud data set detected by the target detection device is processed according to the preset conversion rule, and first target point cloud pixels that at least partially represent the target object are screened from the resulting candidate point cloud pixel set according to the candidate point cloud pixel spacing. The factors in the initial point cloud data set that are unrelated to the weather condition of the space to be detected can thereby be effectively analyzed and, according to the first target number of the first target point cloud pixels, accurately filtered out. This at least partially improves the detection accuracy of the weather condition in the space to be detected, and responding to the detected abnormal weather condition can improve the working stability of the related intelligent driver-assistance system.
It should be noted that, the application scenarios of the embodiments of the present disclosure may include application in an intelligent vehicle such as an unmanned vehicle and an unmanned aerial vehicle to control or assist the operation of the intelligent vehicle. However, the present disclosure is not limited to this, and may also be applied to application scenarios such as an intelligent security monitoring system in a city, for example, to an intelligent detection device such as a security monitoring camera device.
Fig. 1 schematically illustrates an exemplary system architecture to which the weather condition detection method and apparatus may be applied, according to an embodiment of the disclosure. It should be noted that fig. 1 is only an example of a system architecture to which the embodiments of the present disclosure may be applied to help those skilled in the art understand the technical content of the present disclosure, and does not mean that the embodiments of the present disclosure may not be applied to other devices, systems, environments or scenarios.
As shown in fig. 1, a system architecture 100 according to this embodiment may include a vehicle 101, a target object 102, raindrops 103, a network 104, and a server 105. The network 104 is used to provide the medium for the communication link between the vehicle 101 and the server 105. Network 104 may include various connection types, such as wired and/or wireless communication links, and so forth.
The vehicle 101 may include a vehicle mounted with a target detection device such as a lidar, and the target object 102 may be a movable object such as a pedestrian in the space 110 to be detected, or a stationary object in the space to be detected. Accordingly, the vehicle 101 may have electronic devices therein, including but not limited to electronic devices having chips, processors, and the like for data processing, capable of processing the initial point cloud data set detected by the target detection device.
It should be understood that the vehicle 101 may have an object detection device mounted thereon for detecting the space 110 to be detected to obtain an initial point cloud data set. Or other target detection devices installed outside the vehicle 101 may be used to detect the space to be detected 110 to obtain an initial point cloud data set, and the initial point cloud data set is sent to the vehicle 101 and/or the server 105, so as to implement the weather condition detection method provided by the embodiment of the present disclosure.
The vehicle 101 interacts with a server 105 over a network 104 to receive or send messages or the like. Various communication client applications may be installed on vehicle 101.
The server 105 may be a server providing various services, for example, a background management server (merely an example) that provides support for requests sent from the vehicle 101. The background management server may analyze and otherwise process received data such as user requests, and feed back a processing result (e.g., information or data obtained or generated according to the request) to the vehicle.
It should be noted that the weather condition detection method provided in the embodiment of the present disclosure may be generally executed by an electronic device in the vehicle 101. Accordingly, the weather condition detection apparatus provided by the embodiment of the present disclosure may be generally provided in the vehicle 101. Alternatively, the weather condition detection method provided by the embodiment of the present disclosure may be executed by the server 105 capable of being communicatively connected to the vehicle 101. Accordingly, the weather condition detection device provided by the embodiment of the present disclosure may also be disposed in the server 105. The weather condition detection method provided by embodiments of the present disclosure may also be performed by a server or cluster of servers that is different from server 105 and is capable of communicating with vehicle 101 and/or server 105. Alternatively, the weather condition detection device provided in the embodiment of the present disclosure may be disposed in a server or a server cluster that is different from the server 105 and is capable of communicating with the vehicle 101 and/or the server 105.
It should be understood that the number of terminal devices, networks, and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
Fig. 2 schematically shows a flow chart of a weather condition detection method according to an embodiment of the present disclosure.
As shown in fig. 2, the weather condition detection method may include operations S210 to S230.
In operation S210, the initial point cloud data set is processed based on a preset transformation rule to obtain a candidate depth image including a candidate point cloud pixel set, where the initial point cloud data set includes data obtained by detecting a space to be detected by a target detection apparatus.
According to the embodiment of the disclosure, the initial point cloud data set can represent any object to be detected in the space to be detected, for example, target objects such as obstacles, and noise objects such as raindrops and snowflakes under abnormal weather conditions.
According to the embodiments of the present disclosure, the target detection device may include a device capable of detecting a target object in a space to be detected, for example, a laser radar device, and the like.
According to the embodiment of the disclosure, candidate point cloud pixels contained in the candidate point cloud pixel set can constitute a candidate depth image. The candidate point cloud pixel may be pixel data obtained by converting part or all of the initial point cloud data set based on a point cloud data conversion method in the related art.
It should be noted that the embodiment of the present disclosure does not limit the specific data conversion method for converting the initial point cloud data into candidate point cloud pixels. For example, the method may include, but is not limited to, a coordinate conversion method based on a camera intrinsic parameter matrix, and a person skilled in the art may select a specific data conversion method according to actual needs.
In operation S220, a first target point cloud pixel is screened from the candidate point cloud pixel set according to a candidate point cloud pixel distance between different candidate point cloud pixels in the candidate point cloud pixel set, where the first target point cloud pixel represents at least part of features of a target object in a space to be detected.
It should be noted that the candidate point cloud pixel spacing may be calculated from the respective point cloud pixel positions of the different candidate point cloud pixels, or from the point cloud data positions of the initial point cloud data corresponding to the candidate point cloud pixels.
In operation S230, an abnormal weather condition of the space to be detected is determined based on the first target point cloud pixels.
According to the embodiment of the disclosure, the candidate point cloud pixel spacing between different candidate point cloud pixels can represent the distance between the corresponding initial point cloud data. By means of this spacing, the first target point cloud pixels that are densely distributed in the candidate depth image can be screened out and regarded as the target object in the space to be detected. Screening the target object out of the candidate depth image by means of the first target point cloud pixels eliminates the interference of the target object, so that the point cloud pixels other than the first target point cloud pixels in the candidate depth image can be analyzed more accurately. This improves the detection precision of noise objects such as raindrops and snowflakes, and thereby the detection accuracy of abnormal weather conditions.
According to the embodiment of the disclosure, the initial point cloud data set detected by the target detection device is processed according to the preset conversion rule, and the first target point cloud pixels that at least partially represent the target object are screened out of the resulting candidate point cloud pixel set according to the candidate point cloud pixel spacing. In this way, the factors in the initial point cloud data set that are irrelevant to the weather condition in the space to be detected can be effectively analyzed and, according to the first target number of the first target point cloud pixels, accurately filtered out. This at least partially improves the detection accuracy of the weather condition in the space to be detected, and the detected abnormal weather condition can in turn be used to improve the working stability of a related intelligent driving assistance system.
According to an embodiment of the present disclosure, an object detection apparatus includes at least one of:
laser radar detection device, millimeter wave radar detection device.
According to the embodiment of the disclosure, the laser radar detection device may include any type of laser radar detection device such as a phased array laser radar in the related art, and the specific device type of the laser radar detection device is not limited by the embodiment of the disclosure. Accordingly, the embodiment of the present disclosure does not limit the specific device type of the millimeter wave radar detection device, and a person skilled in the art may select the millimeter wave radar detection device according to actual needs.
It should be noted that, in the embodiments of the present disclosure, the specific installation position of the target detection device and the specific performance parameter of the target detection device are not limited, and those skilled in the art may select the installation position and the performance parameter according to actual requirements.
According to an embodiment of the present disclosure, the abnormal weather condition includes at least one of:
a rainfall weather condition, a snowfall weather condition, a hail weather condition, a sand blowing weather condition.
It should be understood that, when an abnormal weather condition exists in the space to be detected, in the candidate depth image, the point cloud pixels other than the first target point cloud pixel may represent at least part of noise objects such as raindrops, snowflakes, hailstones or sand dust under the abnormal weather condition.
Fig. 3 schematically shows a flowchart of processing an initial point cloud data set based on a preset conversion rule to obtain a candidate depth image including a candidate point cloud pixel set according to an embodiment of the present disclosure.
As shown in fig. 3, the operation S210 of processing the initial point cloud data set based on the predetermined transformation rule to obtain the candidate depth image including the candidate point cloud pixel set may include operations S310 to S330.
In operation S310, a candidate point cloud area is determined according to a detection performance parameter of the target detection apparatus.
In operation S320, candidate point cloud data located in the candidate point cloud region is determined from the initial point cloud data set according to the candidate point cloud region and the respective initial point cloud positions of the initial point cloud data in the initial point cloud data set.
In operation S330, data conversion is performed on the candidate point cloud data to obtain a candidate depth image including a candidate point cloud pixel set.
According to the embodiment of the disclosure, the detection performance parameters may affect the detection results such as the point cloud data quality, the point cloud data quantity and the like of the initial point cloud data set obtained by the target detection device. For example, a target detection device with stronger detection performance represented by the detection performance parameters may detect a larger amount of point cloud data, while a target detection device with weaker detection performance represented by the detection performance parameters may detect a smaller amount of point cloud data. An appropriate candidate point cloud area can be selected in the space to be detected according to the detection performance parameters so as to balance the influence of the detection performance parameters on the detection result of the subsequent weather condition.
According to the embodiment of the present disclosure, the initial point cloud position of each of the initial point cloud data may be represented by the point cloud data coordinate of each of the initial point cloud data. And then candidate point cloud data in the candidate point cloud area can be screened out from the initial point cloud data set by using the candidate point cloud area.
It should be noted that, the embodiment of the present disclosure does not limit the specific manner of data conversion, and those skilled in the art may design the data conversion according to actual needs.
In one embodiment of the present disclosure, the candidate point cloud data may be converted into candidate point cloud pixels based on the following equation (1).
u = ((FOV_up − arcsin(z/R)) / FOV) × row_scal; v = (1/2) × (1 − arctan2(y, x)/π) × col_scale; (1)
In formula (1), (u, v) is the coordinate of the candidate point cloud pixel (i.e., the candidate point cloud pixel position), and (x, y, z) is the candidate point cloud data coordinate of the candidate point cloud data (i.e., the candidate point cloud data position). row_scal represents the image height of the candidate depth image, which may be the same as the number of laser beams of the target detection device; for example, for a lidar detection device with 16 beams, row_scal = 16. col_scale represents the image width of the candidate depth image, which may coincide with the number of scan points obtained after each beam completes one scanning period. R represents the distance of the candidate point cloud data from the target detection device and can be calculated by formula (2).
R = √(x² + y² + z²); (2)
In formula (1), FOV_up represents the angle from the horizontal plane of the target detection device to the uppermost boundary of its field of view, and FOV represents the vertical field of view angle of the target detection device.
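As a minimal sketch of the conversion described by formulas (1) and (2), the projection onto the candidate depth image can be written as follows. The 16-beam geometry, the field-of-view values, and the flooring and clipping of indices are illustrative assumptions, not parameters fixed by the disclosure.

```python
import numpy as np

def project_to_range_image(points, row_scal=16, col_scale=1800,
                           fov_up_deg=15.0, fov_deg=30.0):
    """Project lidar points of shape (N, 3) onto a (row_scal, col_scale)
    candidate depth image, following formulas (1) and (2)."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    r = np.sqrt(x**2 + y**2 + z**2)              # formula (2): range R
    fov_up = np.radians(fov_up_deg)
    fov = np.radians(fov_deg)
    pitch = np.arcsin(np.clip(z / r, -1.0, 1.0))
    yaw = np.arctan2(y, x)
    u = (fov_up - pitch) / fov * row_scal        # row index, formula (1)
    v = 0.5 * (1.0 - yaw / np.pi) * col_scale    # column index, formula (1)
    u = np.clip(np.floor(u), 0, row_scal - 1).astype(int)
    v = np.clip(np.floor(v), 0, col_scale - 1).astype(int)
    depth = np.zeros((row_scal, col_scale), dtype=np.float32)
    depth[u, v] = r                              # candidate depth image
    return depth, u, v, r
```

With these assumed parameters, a point at (3, 4, 0) has range R = 5 and lands on the middle beam row, since its pitch angle is zero.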
According to an embodiment of the present disclosure, the probing performance parameter includes at least one of:
angle resolution, detection distance, number of points.
According to an embodiment of the present disclosure, in a case where the target detection device is a laser radar detection device, the angular resolution may be a scan interval angle at which the laser radar emits the scanning laser ray. The angular resolution may include a horizontal angular resolution and/or a vertical angular resolution.
According to the embodiment of the present disclosure, in the case that the target detection device is a laser radar detection device, the number of points may be the number of laser points emitted by the laser radar per second, that is, the number of laser rays.
According to an embodiment of the present disclosure, the operation S310 of determining the candidate point cloud region according to the detection performance parameter of the target detection device may include the following operations:
determining a first detection distance and a second detection distance according to the detection performance parameters; and determining a candidate point cloud area in the space to be detected based on the difference distance between the first detection distance and the second detection distance in the space to be detected.
In one embodiment of the present disclosure, the first detection distance and the second detection distance may be determined according to the angular resolution, so that a candidate point cloud region corresponding to the angular resolution may be selected in the space to be detected. The candidate point cloud region can be determined based on the difference distance: where a represents the first detection distance and b represents the second detection distance, the difference distance r satisfies b ≤ r ≤ a, and the candidate point cloud region in the space to be detected can then be determined according to the difference distance r.
It should be noted that, in the case of a small angular resolution, a correspondingly small difference distance is selected to determine a relatively small candidate point cloud region, and in the case of a large angular resolution, a correspondingly large difference distance is selected to determine a relatively large candidate point cloud region, thereby balancing errors caused by different angular resolutions.
According to the embodiment of the disclosure, the difference distance may also be determined based on other detection performance parameters, and the embodiment of the disclosure does not limit the type of the specific detection performance parameter for determining the difference distance.
According to the embodiment of the disclosure, the candidate point cloud region is determined according to the difference distance in the space to be detected. The candidate point cloud region thus filters out the initial point cloud data that is closer to the target detection device and represents noise objects, as well as the initial point cloud data that is farther from the target detection device and represents target objects. This improves the accuracy with which the subsequently determined first target point cloud pixels represent the target object, avoids confusing the target object and the noise objects in the space to be detected, and improves the detection accuracy of the weather condition.
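The candidate point cloud region selection described above can be sketched as a simple range filter. The bounds `near` and `far` stand in for the second detection distance b and the first detection distance a respectively; their default values are illustrative, and in practice would be derived from the detection performance parameters.

```python
import numpy as np

def select_candidate_points(points, near=2.0, far=30.0):
    """Keep only points whose range r satisfies near <= r <= far,
    i.e. the condition b <= r <= a on the difference distance."""
    r = np.linalg.norm(points[:, :3], axis=1)   # distance to the detector
    mask = (r >= near) & (r <= far)
    return points[mask], mask
```

Points very close to the detector (often raindrops or dust near the sensor) and very distant sparse returns are both excluded before the depth-image conversion.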
According to an embodiment of the present disclosure, the weather condition detection method may further include the operations of:
and calculating the candidate point cloud pixel distance between the adjacent candidate point cloud pixels according to the respective candidate point cloud pixel parameters of the adjacent candidate point cloud pixels in the candidate depth image.
Operation S220 of screening the first target point cloud pixels from the candidate point cloud pixel set according to the candidate point cloud pixel spacing between different candidate point cloud pixels in the candidate point cloud pixel set may include the following operations:
and determining candidate point cloud pixels corresponding to the target point cloud pixel spacing as first target point cloud pixels according to the comparison result of the candidate point cloud pixel spacing and the screening spacing threshold corresponding to the candidate point cloud pixel spacing, wherein the target point cloud pixel spacing is the candidate point cloud pixel spacing meeting the preset condition with the comparison result of the corresponding screening spacing threshold.
According to embodiments of the present disclosure, the candidate point cloud pixel parameters may include the respective pixel coordinates (u, v) of the candidate point cloud pixels. Among the scanning detection rays emitted by the target detection device, the angle between adjacent rays is fixed, that is, the angular resolution is fixed. Therefore, the farther a point is from the target detection device, the larger the distance between the initial point cloud data detected by two adjacent rays, and the larger the inherent spacing between the point cloud pixels formed after the initial point cloud data is projected onto the spherical surface. The corresponding screening spacing threshold can therefore be determined according to the point cloud pixel coordinates of the candidate point cloud pixels that form the candidate point cloud pixel spacing, so as to balance the error of adjacent candidate point cloud pixels caused by their different distances from the target detection device.
According to the embodiment of the disclosure, the target point cloud pixel spacing may be a spacing that is less than or equal to the screening spacing threshold, that is, the preset condition on the comparison result may be "less than or equal to". Adjacent candidate point cloud pixels that are close to each other in the candidate depth image can thus be screened out, the sparseness or density of the candidate point cloud pixels can be characterized by the comparison result, and the first target point cloud pixels representing the relatively dense target object can be screened out, thereby improving the accuracy of determining the first target point cloud pixels.
According to an embodiment of the disclosure, the candidate point cloud pixel pitch includes a first direction candidate point cloud pixel pitch between candidate point cloud pixels adjacent in a first direction and a second direction candidate point cloud pixel pitch between candidate point cloud pixels adjacent in a second direction in the candidate depth image, the first direction and the second direction being two directions intersecting in the candidate depth image.
Fig. 4 schematically illustrates a flow chart for determining a candidate point cloud pixel corresponding to a target point cloud pixel spacing as a first target point cloud pixel according to a comparison of the candidate point cloud pixel spacing and a screening spacing threshold corresponding to the candidate point cloud pixel spacing, according to an embodiment of the disclosure.
As shown in fig. 4, in the above operations, determining the candidate point cloud pixel corresponding to the target point cloud pixel pitch as the first target point cloud pixel according to the comparison result between the candidate point cloud pixel pitch and the screening pitch threshold corresponding to the candidate point cloud pixel pitch may include operations S410 to S440.
In operation S410, according to a first direction comparison result of the first direction candidate point cloud pixel spacing and the screening spacing threshold corresponding to the first direction candidate point cloud pixel spacing, the first direction candidate point cloud pixel spacing whose first direction comparison result meets a preset condition is determined as a first direction target point cloud pixel spacing.
In operation S420, first direction target point cloud pixels in the candidate point cloud pixel set are deleted to obtain a target candidate point cloud pixel set, where the first direction target point cloud pixels are candidate point cloud pixels corresponding to a first direction target point cloud pixel pitch.
In operation S430, according to a second direction comparison result of the second direction candidate point cloud pixel spacing between adjacent target candidate point cloud pixels and the screening spacing threshold corresponding to the second direction candidate point cloud pixel spacing in the target candidate point cloud pixel set, the second direction candidate point cloud pixel spacing of which the second direction comparison result satisfies a preset condition is determined as a second direction target point cloud pixel spacing.
In operation S440, first target point cloud pixels corresponding to the first direction target point cloud pixel interval and the second direction target point cloud pixel interval are respectively screened out from the candidate point cloud pixel set.
According to the embodiment of the disclosure, a direction included angle may be formed between the first direction and the second direction, and the direction included angle may be any angle value.
In one embodiment of the present disclosure, the first direction and the second direction may be two mutually perpendicular directions, for example, the horizontal direction and the vertical direction in the candidate depth image. Adjacent candidate point cloud pixels in the candidate depth image can then be traversed: a first direction candidate point cloud pixel spacing in the horizontal direction that is less than or equal to its corresponding screening spacing threshold is determined as a first direction target point cloud pixel spacing, a second direction candidate point cloud pixel spacing in the vertical direction that is less than or equal to its corresponding screening spacing threshold is determined as a second direction target point cloud pixel spacing, and the first target point cloud pixels can be screened out of the candidate point cloud pixel set accordingly.
According to the embodiment of the disclosure, by deleting the first direction target point cloud pixels from the candidate point cloud pixel set, the amount of computation subsequently needed to calculate the second direction candidate point cloud pixel spacings can be reduced, saving calculation cost and improving the overall efficiency of weather condition detection.
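A simplified sketch of the two-direction screening in operations S410 to S440 might look as follows. For clarity it screens both directions independently rather than deleting first-direction targets before the second pass, it approximates the candidate point cloud pixel spacing by the absolute range difference of adjacent depth-image pixels, and it uses the dynamic threshold m = k·d with the chord d between adjacent rays at the nearer pixel's range; all parameter values (k and the per-direction angles) are illustrative assumptions.

```python
import numpy as np

def screen_target_pixels(depth, k=3.0, alpha_h=0.2, alpha_v=2.0):
    """Mark depth-image pixels whose neighbour spacing is within the
    screening threshold as (dense) first target pixels; the remaining
    valid pixels are candidate noise pixels."""
    valid = depth > 0
    target = np.zeros_like(depth, dtype=bool)
    rows, cols = depth.shape
    for dr, dc, ang in ((0, 1, np.radians(alpha_h)),   # first (horizontal) direction
                        (1, 0, np.radians(alpha_v))):  # second (vertical) direction
        a = depth[: rows - dr, : cols - dc]
        b = depth[dr:, dc:]
        pair = valid[: rows - dr, : cols - dc] & valid[dr:, dc:]
        r_near = np.minimum(a, b)                  # reference (nearer) pixel range
        m = k * 2.0 * r_near * np.sin(ang / 2.0)   # threshold m = k * chord d
        close = pair & (np.abs(a - b) <= m)
        target[: rows - dr, : cols - dc] |= close  # both pixels of a close
        target[dr:, dc:] |= close                  # pair become targets
    return target
```

A contiguous patch of similar ranges (a wall or obstacle) is marked as target pixels, while an isolated return with no valid neighbours, typical of a raindrop or snowflake, is not.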
According to the embodiment of the disclosure, the screening interval threshold corresponding to the candidate point cloud pixel interval is calculated in the following way:
determining the reference candidate point cloud pixel forming the candidate point cloud pixel spacing, where the reference candidate point cloud pixel is the candidate point cloud pixel that is closest to the target detection device among the candidate point cloud pixels forming the candidate point cloud pixel spacing; calculating a reference distance between the reference candidate point cloud pixel and an adjacent ray according to the reference pixel position of the reference candidate point cloud pixel, where the adjacent ray is the ray corresponding to each of the other candidate point cloud pixels adjacent to the reference candidate point cloud pixel; and processing the reference distance based on a preset screening spacing rule to calculate the screening spacing threshold corresponding to the candidate point cloud pixel spacing.
According to the embodiment of the disclosure, since the candidate point cloud pixels are obtained by performing data conversion on the initial point cloud data, the rays corresponding to the candidate point cloud pixels can be determined according to the respective rays corresponding to the initial point cloud data.
According to an embodiment of the present disclosure, the reference pixel position may be represented by pixel coordinates of the reference candidate point cloud pixels.
According to an embodiment of the present disclosure, calculating a reference distance between a reference candidate point cloud pixel and an adjacent ray according to a reference pixel position of the reference candidate point cloud pixel may include the following operations:
and determining virtual candidate point cloud pixels with the reference pixel depth positions on the adjacent rays according to the reference pixel depth positions of the reference candidate point cloud pixels.
Taking the depth position of the reference pixel as a preset radius, and calculating a reference circular arc distance between the reference candidate point cloud pixel and the virtual candidate point cloud pixel according to an included angle between the reference ray and an adjacent ray, wherein the reference circular arc distance is taken as a reference distance; or alternatively
And taking the depth position of the reference pixel as the side length of a preset triangle, and calculating the reference linear distance between the reference candidate point cloud pixel and the virtual candidate point cloud pixel according to the included angle between the reference ray and the adjacent ray, wherein the reference linear distance is taken as the reference distance.
And the reference ray is a ray corresponding to the pixel of the reference candidate point cloud.
Fig. 5 schematically illustrates a diagram of calculating a screening pitch threshold according to an embodiment of the disclosure.
As shown in fig. 5 (a), the candidate point cloud area 510 may include candidate point cloud data corresponding to horizontally adjacent candidate point cloud pixels in the candidate depth image. The candidate point cloud data in the candidate point cloud area 510 may correspond to adjacent candidate point cloud pixel A and candidate point cloud pixel B, where the ray corresponding to candidate point cloud pixel A is ray 511 and the ray corresponding to candidate point cloud pixel B is ray 512. Candidate point cloud pixel A may be determined as the reference candidate point cloud pixel. The virtual candidate point cloud pixel C on ray 512 has the same reference pixel depth position (i.e., distance from the target detection device) as candidate point cloud pixel A.
From the reference pixel depth position and the angular resolution α of the target detection device, the reference arc distance 51AC can be calculated, and the reference spacing determined by the reference arc distance can thus be obtained.
Alternatively, the reference straight-line distance 52AC may be calculated by the following equation (3).
52AC = 2 × r × sin(α/2); (3)
In formula (3), r may represent the distance between the candidate point cloud pixel a and the target detection device.
As shown in fig. 5 (b), the candidate point cloud region 520 may include candidate point cloud data corresponding to horizontally adjacent candidate point cloud pixels in the candidate depth image. The candidate point cloud data in the candidate point cloud area 520 may correspond to adjacent candidate point cloud pixel C and candidate point cloud pixel D, the ray corresponding to the candidate point cloud pixel C is a ray 521, the ray corresponding to the candidate point cloud pixel D is a ray 522, and an included angle between the ray 521 and the ray 522 is γ.
According to the same or similar method, the reference straight line distance and/or the reference circular arc distance corresponding to the candidate point cloud pixel C and the candidate point cloud pixel D can be calculated through the corresponding algorithm of the formula (3), that is, the reference distance corresponding to the candidate point cloud pixel C and the candidate point cloud pixel D is determined.
After the reference pitch is determined, the screening pitch threshold may be calculated based on the following formula (4).
m=k·d; (4)
In the formula (4), m represents a screening pitch threshold, k represents a preset screening pitch rule parameter, and d represents a reference pitch.
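Formulas (3) and (4) can be combined into one small helper: the chord between the reference pixel and the virtual pixel at the same range on the adjacent ray gives the reference spacing d, and the screening spacing threshold is m = k·d. The default value of k is an illustrative assumption for the preset screening spacing rule parameter.

```python
import math

def screening_threshold(r, angle_deg, k=3.0):
    """Screening spacing threshold m = k * d (formula (4)), where
    d = 2 * r * sin(angle / 2) is the straight-line (chord) reference
    distance of formula (3) between adjacent rays separated by
    angle_deg at range r."""
    d = 2.0 * r * math.sin(math.radians(angle_deg) / 2.0)  # formula (3)
    return k * d                                           # formula (4)
```

For small angles the chord is close to the arc r·α, so for a 0.2° horizontal resolution the threshold grows essentially linearly with range.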
According to the embodiment of the disclosure, referring to diagram (a) in fig. 5, in the case that the spacing between candidate point cloud pixel A and candidate point cloud pixel B is greater than the corresponding screening spacing threshold, candidate point cloud pixel A and candidate point cloud pixel B may be determined as point cloud pixels representing noise objects such as raindrops and snowflakes in the space to be detected. Accordingly, in the case that the spacing between candidate point cloud pixel C and candidate point cloud pixel D in diagram (b) in fig. 5 is less than or equal to the corresponding screening spacing threshold, candidate point cloud pixel C and candidate point cloud pixel D may be determined as first target point cloud pixels. It can also be seen intuitively that candidate point cloud pixel C and candidate point cloud pixel D represent partial features of the target object 531.
It should be noted that in an actual application scenario, an angle between adjacent rays emitted by the object detection device is generally small, and the setting positions of the rays in fig. 5 are only for clearly illustrating the weather detection method provided by the embodiment of the present disclosure, and do not represent the intervals between the actual rays.
It should be understood that, for a multiline lidar detection apparatus, the number of beams of the lidar detection apparatus may represent the height of the candidate depth image in the vertical direction, and the range of the length of the laser beam after scanning in the horizontal direction for one period may be taken as the horizontal width of the candidate depth image.
According to an embodiment of the present disclosure, the operation S230 of determining an abnormal weather condition of the space to be detected based on the first target point cloud pixel may include the following operations:
determining a second target number according to the difference value between the candidate point cloud pixel number of the candidate point cloud pixels in the candidate point cloud pixel set and the first target number of the first target point cloud pixels; and determining the weather condition in the space to be detected as an abnormal weather condition when the second target number is larger than or equal to the preset weather screening threshold value.
According to the embodiment of the disclosure, the second target number of second target point cloud pixels, which at least partially represent noise objects such as raindrops in the space to be detected, can be obtained from the difference between the number of candidate point cloud pixels and the first target number. Therefore, in the case that the second target number is greater than or equal to the preset weather screening threshold, the weather condition in the space to be detected can be determined to be an abnormal weather condition.
According to the embodiment of the disclosure, the preset weather screening threshold may be preset, for example, a specific value range of the preset weather screening threshold may be set according to the performance parameters of the corresponding target detection device, the candidate point cloud area representing the candidate depth image, and other factors.
In an embodiment of the present disclosure, where the target detection device is a laser radar detection device, the initial preset weather screening threshold may be corrected based on the amount of radar device noise point cloud data detected by the laser radar detection device, so as to obtain the preset weather screening threshold. According to the embodiment of the disclosure, in the case that the target detection device detects multiple frames of initial point cloud data sets, the same or a similar weather condition detection method is applied to each frame of initial point cloud data set, obtaining the first target number and then the second target number corresponding to each frame. The second target numbers corresponding to the multiple frames are then accumulated, and the accumulated second target number is compared with an accumulated preset weather screening threshold; in the case that the accumulated second target number is greater than or equal to the accumulated preset weather screening threshold, the weather condition in the space to be detected can be determined to be an abnormal weather condition.
According to the embodiment of the present disclosure, the abnormal level of the abnormal weather condition may further be graded based on the difference between the second target number (or the accumulated second target number) and the preset weather screening threshold (or the accumulated preset weather screening threshold). For example, after this difference is calculated, it is used as the abnormal weather detection result; in the case that the abnormal weather detection result is greater than a first-level threshold, the abnormal level of the abnormal weather condition is determined as a first-level abnormality, and in the case that the abnormal weather detection result lies between the first-level threshold and a second-level threshold, the abnormal level may be determined as a second-level abnormality. Accordingly, the abnormal level can be associated with the weather levels used in the related weather forecasting field, so that the weather level corresponding to the abnormal weather in the space to be detected can be accurately determined. This makes it convenient for an intelligent driving assistance system or a related algorithm system to adjust its algorithm strategy in time, improving the stability and adaptability of vehicle control in the related unmanned driving and intelligent assisted driving fields.
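The counting and grading logic of operation S230 can be sketched as follows. The second target number is the candidate-pixel count minus the first target count; all threshold values and the three-level grading scheme are illustrative assumptions, not values fixed by the disclosure.

```python
def detect_abnormal_weather(num_candidate, num_first_target,
                            weather_threshold=200,
                            level_1=1000, level_2=400):
    """Return (is_abnormal, level). The second target number is the
    count of candidate pixels not identified as first target pixels;
    the excess over the preset weather screening threshold is graded."""
    num_second_target = num_candidate - num_first_target
    if num_second_target < weather_threshold:
        return False, None                 # normal weather condition
    excess = num_second_target - weather_threshold
    if excess > level_1:
        return True, 1                     # first-level (most severe)
    if excess > level_2:
        return True, 2                     # second-level abnormality
    return True, 3                         # mild abnormality
```

For multi-frame detection, the per-frame second target numbers would simply be accumulated before the comparison, as described above.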
Fig. 6 schematically shows a flow of a weather condition detection method according to another embodiment of the present disclosure.
As shown in fig. 6, the weather condition detection method may further include operations S610 to S630.
In operation S610, candidate point cloud pixels in the candidate point cloud pixel set that are the same as second target point cloud pixels are filtered out to obtain a target detection point cloud pixel set, where the second target point cloud pixels are the candidate point cloud pixels in the candidate point cloud pixel set other than the first target point cloud pixels.
In operation S620, target detection point cloud data corresponding to the target detection point cloud pixels is screened out from the initial point cloud data set according to the target detection point cloud pixels in the target detection point cloud pixel set.
In operation S630, target object detection is performed on the space to be detected based on the target detection point cloud data.
According to the embodiment of the disclosure, the second target point cloud pixels can represent noise objects in the space to be detected, and filtering out the second target point cloud pixels at least partially removes noise data from the space to be detected. By performing target object detection with the target detection point cloud data corresponding to the target detection point cloud pixel set, interference from noise objects such as raindrops and snowflakes under abnormal weather conditions can be at least partially avoided, which improves the detection accuracy of target object detection in the space to be detected and the operational stability of intelligent assisted driving systems and unmanned intelligent control systems.
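Operations S610 and S620 above can be sketched as set operations (an illustrative assumption: pixels are represented as `(row, col)` tuples and points are mapped to pixels via a hypothetical `pixel_of_point` lookup):

```python
def filter_noise_pixels(candidate_pixels, first_target_pixels):
    """S610: drop the second target (noise) pixels, i.e. every candidate
    pixel that is not a first target pixel, keeping the detection set."""
    first_targets = set(first_target_pixels)
    return [p for p in candidate_pixels if p in first_targets]

def select_detection_points(initial_points, pixel_of_point, detection_pixels):
    """S620: keep only the initial point cloud data whose depth-image pixel
    belongs to the target detection point cloud pixel set."""
    detection_set = set(detection_pixels)
    return [pt for pt in initial_points if pixel_of_point[pt] in detection_set]
```

The detection point cloud returned by `select_detection_points` would then feed the target object detection of operation S630.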
Fig. 7 schematically illustrates a block diagram of a weather condition detection apparatus according to an embodiment of the present disclosure.
As shown in fig. 7, the weather condition detecting apparatus 700 includes a point cloud data converting module 710, a first filtering module 720, and a weather condition determining module 730.
The point cloud data conversion module 710 is configured to process an initial point cloud data set based on a preset conversion rule to obtain a candidate depth image including a candidate point cloud pixel set, where the initial point cloud data set includes data obtained by detecting a space to be detected by a target detection device.
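The preset conversion rule is not pinned down by the disclosure; a common way to turn rotating-lidar point clouds into a depth (range) image is a spherical projection. The sketch below assumes that rule, and its parameter names (`h_res_deg`, `v_res_deg`, `width`, `height`) are illustrative assumptions:

```python
import math

def to_depth_image(points, h_res_deg, v_res_deg, width, height):
    """Project 3-D points (x, y, z) into a depth image whose pixel value is
    the range to the detector; one common range-image conversion for lidar."""
    image = [[0.0] * width for _ in range(height)]
    for x, y, z in points:
        r = math.sqrt(x * x + y * y + z * z)
        if r == 0.0:
            continue
        azimuth = math.degrees(math.atan2(y, x))    # horizontal angle
        elevation = math.degrees(math.asin(z / r))  # vertical angle
        col = int(azimuth / h_res_deg) % width
        row = min(height - 1, max(0, int(elevation / v_res_deg) + height // 2))
        image[row][col] = r
    return image
```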
The first screening module 720 is configured to screen out first target point cloud pixels from the candidate point cloud pixel set according to the candidate point cloud pixel spacing between different candidate point cloud pixels in the candidate point cloud pixel set, where the first target point cloud pixels represent at least some features of a target object in the space to be detected.
The weather condition determining module 730 is configured to determine an abnormal weather condition of the space to be detected based on the first target point cloud pixel.
According to an embodiment of the present disclosure, the weather condition detection apparatus further includes: a first calculation module.
The first calculation module is used for calculating the candidate point cloud pixel spacing between adjacent candidate point cloud pixels according to the respective candidate point cloud pixel parameters of the adjacent candidate point cloud pixels in the candidate depth image.
The first screening module includes: a first determination submodule.
The first determination submodule is used for determining candidate point cloud pixels corresponding to a target point cloud pixel spacing as the first target point cloud pixels according to the comparison result of the candidate point cloud pixel spacing and the screening spacing threshold corresponding to the candidate point cloud pixel spacing, where the target point cloud pixel spacing is a candidate point cloud pixel spacing whose comparison result with the corresponding screening spacing threshold meets a preset condition.
According to the embodiment of the disclosure, the candidate point cloud pixel spacing comprises a first direction candidate point cloud pixel spacing between candidate point cloud pixels adjacent to each other in a first direction and a second direction candidate point cloud pixel spacing between candidate point cloud pixels adjacent to each other in a second direction in the candidate depth image, wherein the first direction and the second direction are two intersecting directions in the candidate depth image.
The first determination sub-module includes: the device comprises a first determining unit, a first deleting unit, a second determining unit and a first screening unit.
The first determining unit is used for determining the first direction candidate point cloud pixel spacing meeting the preset conditions as the first direction target point cloud pixel spacing according to the first direction comparison result of the first direction candidate point cloud pixel spacing and the screening spacing threshold corresponding to the first direction candidate point cloud pixel spacing.
The first deleting unit is used for deleting first-direction target point cloud pixels in the candidate point cloud pixel set to obtain a target candidate point cloud pixel set, wherein the first-direction target point cloud pixels are candidate point cloud pixels corresponding to the first-direction target point cloud pixel pitch.
The second determining unit is used for determining, according to a second direction comparison result between the second direction candidate point cloud pixel spacing of adjacent target candidate point cloud pixels in the target candidate point cloud pixel set and the screening spacing threshold corresponding to the second direction candidate point cloud pixel spacing, the second direction candidate point cloud pixel spacing that meets the preset condition as the second direction target point cloud pixel spacing.
The first screening unit is used for respectively screening out, from the candidate point cloud pixel set, the first target point cloud pixels corresponding to the first direction target point cloud pixel spacing and the second direction target point cloud pixel spacing.
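The two-pass screening above can be sketched on a depth image as follows. Two points are assumptions of this sketch, not statements of the disclosure: the pixel spacing is approximated by the depth difference of adjacent pixels, and the preset condition is taken to be "spacing less than or equal to the threshold" (points on a solid object project to closely spaced depth-image pixels, while raindrops and snowflakes tend to be isolated):

```python
def screen_first_targets(depth_image, threshold):
    """First pass over horizontally adjacent pixels, deletion of the matched
    pixels, second pass over vertically adjacent remaining pixels; the union
    of both passes is the first target point cloud pixel set."""
    height, width = len(depth_image), len(depth_image[0])
    first_dir = set()
    for r in range(height):
        for c in range(width - 1):
            spacing = abs(depth_image[r][c] - depth_image[r][c + 1])
            if spacing <= threshold:               # assumed preset condition
                first_dir.update({(r, c), (r, c + 1)})
    second_dir = set()
    for r in range(height - 1):
        for c in range(width):
            if (r, c) in first_dir or (r + 1, c) in first_dir:
                continue                           # deleted after first pass
            spacing = abs(depth_image[r][c] - depth_image[r + 1][c])
            if spacing <= threshold:
                second_dir.update({(r, c), (r + 1, c)})
    return first_dir | second_dir
```

Deleting the first-direction hits before the second pass avoids re-examining pixels that have already been classified.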
According to the embodiment of the disclosure, the screening spacing threshold corresponding to the candidate point cloud pixel spacing is calculated in the following manner:
determining a reference candidate point cloud pixel forming the candidate point cloud pixel spacing, wherein the reference candidate point cloud pixel is the candidate point cloud pixel, among those forming the candidate point cloud pixel spacing, that is closest to the target detection device; calculating a reference distance between the reference candidate point cloud pixel and an adjacent ray according to the reference pixel position of the reference candidate point cloud pixel, wherein the adjacent ray is the ray corresponding to each of the other candidate point cloud pixels adjacent to the reference candidate point cloud pixel; and processing the reference distance based on a preset screening spacing rule to obtain the screening spacing threshold corresponding to the candidate point cloud pixel spacing.
According to an embodiment of the present disclosure, calculating a reference distance between a reference candidate point cloud pixel and an adjacent ray according to a reference pixel position of the reference candidate point cloud pixel comprises:
determining a virtual candidate point cloud pixel located at the reference pixel depth position on the adjacent ray according to the reference pixel depth position of the reference candidate point cloud pixel; taking the reference pixel depth position as a preset radius, and calculating a reference arc distance between the reference candidate point cloud pixel and the virtual candidate point cloud pixel according to the included angle between the reference ray and the adjacent ray, the reference arc distance being taken as the reference distance; or taking the reference pixel depth position as the side length of a preset triangle, and calculating a reference linear distance between the reference candidate point cloud pixel and the virtual candidate point cloud pixel according to the included angle between the reference ray and the adjacent ray, the reference linear distance being taken as the reference distance; wherein the reference ray is the ray corresponding to the reference candidate point cloud pixel.
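The two alternatives above reduce to elementary geometry: with the reference pixel depth r as radius and θ the included angle between the reference ray and the adjacent ray, the arc length is r·θ, while the linear (chord) length of the isosceles triangle whose two equal sides are r and whose apex angle is θ is 2·r·sin(θ/2). A minimal sketch:

```python
import math

def reference_arc_distance(depth, included_angle_rad):
    """Arc length between the reference pixel and the virtual pixel on the
    adjacent ray, with the reference pixel depth as the radius."""
    return depth * included_angle_rad

def reference_linear_distance(depth, included_angle_rad):
    """Chord length of the isosceles triangle whose two equal sides are the
    reference pixel depth and whose apex angle is the ray included angle."""
    return 2.0 * depth * math.sin(included_angle_rad / 2.0)
```

For the small angular resolutions of a lidar the two values are nearly identical, with the arc distance always slightly larger than the chord.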
According to an embodiment of the present disclosure, a point cloud data conversion module includes: a second determining submodule, a third determining submodule and a point cloud data conversion submodule.
And the second determining submodule is used for determining a candidate point cloud area according to the detection performance parameters of the target detection device.
And the third determining submodule is used for determining candidate point cloud data located in the candidate point cloud area from the initial point cloud data set according to the candidate point cloud area and the initial point cloud positions of the initial point cloud data in the initial point cloud data set.
And the point cloud data conversion sub-module is used for performing data conversion on the candidate point cloud data to obtain a candidate depth image containing the candidate point cloud pixel set.
According to an embodiment of the present disclosure, the second determination submodule includes: a detection distance determination unit and a point cloud region determination unit.
The detection distance determining unit is used for determining a first detection distance and a second detection distance according to the detection performance parameter.
The point cloud area determining unit is used for determining a candidate point cloud area in the space to be detected based on the difference distance between the first detection distance and the second detection distance in the space to be detected.
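A membership test for the candidate point cloud region described above might look like the following sketch (the interpretation of the region as the shell between the two detection distances is an assumption for illustration):

```python
def in_candidate_region(point, first_detection_distance, second_detection_distance):
    """Keep a point if its range from the detector lies between the two
    detection distances derived from the detection performance parameters."""
    x, y, z = point
    r = (x * x + y * y + z * z) ** 0.5
    near, far = sorted((first_detection_distance, second_detection_distance))
    return near <= r <= far
```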
According to an embodiment of the present disclosure, the probing performance parameter includes at least one of:
angular resolution, detection distance, number of points.
According to an embodiment of the present disclosure, a weather condition determination module includes: a fourth determination submodule and a fifth determination submodule.
The fourth determining submodule is used for determining the second target quantity according to the difference value of the candidate point cloud pixel quantity of the candidate point cloud pixels in the candidate point cloud pixel set and the first target quantity of the first target point cloud pixels.
The fifth determining submodule is used for determining the weather condition in the space to be detected as an abnormal weather condition when the second target number is larger than or equal to the preset weather screening threshold value.
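The fourth and fifth determining submodules together amount to a subtraction and a threshold comparison, which can be sketched as (the function name is an illustrative assumption):

```python
def detect_abnormal_weather(num_candidate_pixels, num_first_targets,
                            weather_screening_threshold):
    """Second target number = candidate pixel count minus first target count;
    abnormal weather when it reaches the preset weather screening threshold."""
    second_target_number = num_candidate_pixels - num_first_targets
    return second_target_number >= weather_screening_threshold
```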
According to an embodiment of the present disclosure, the weather condition detection apparatus further includes: the device comprises a point cloud pixel filtering module, a target point cloud data screening module and a target object detection module.
The point cloud pixel filtering module is used for filtering out candidate point cloud pixels in the candidate point cloud pixel set that are the same as the second target point cloud pixels to obtain a target detection point cloud pixel set, where the second target point cloud pixels are the candidate point cloud pixels in the candidate point cloud pixel set other than the first target point cloud pixels.
And the target point cloud data screening module is used for screening target detection point cloud data corresponding to the target detection point cloud pixels from the initial point cloud data set according to the target detection point cloud pixels in the target detection point cloud pixel set.
And the target object detection module is used for detecting a target object in the space to be detected based on the target detection point cloud data.
According to an embodiment of the present disclosure, an object detection apparatus includes at least one of:
laser radar detection device, millimeter wave radar detection device.
According to an embodiment of the present disclosure, the abnormal weather condition includes at least one of:
rainfall weather conditions, snowfall weather conditions, hail weather conditions, sand blowing weather conditions.
Any number of modules, sub-modules, units, sub-units, or at least part of the functionality of any number thereof according to embodiments of the present disclosure may be implemented in one module. Any one or more of the modules, sub-modules, units, and sub-units according to the embodiments of the present disclosure may be implemented by being split into a plurality of modules. Any one or more of the modules, sub-modules, units, sub-units according to embodiments of the present disclosure may be implemented at least in part as a hardware circuit, such as a Field Programmable Gate Array (FPGA), a Programmable Logic Array (PLA), a system on a chip, a system on a substrate, a system on a package, an Application Specific Integrated Circuit (ASIC), or may be implemented in any other reasonable manner of hardware or firmware by integrating or packaging a circuit, or in any one of or a suitable combination of software, hardware, and firmware implementations. Alternatively, one or more of the modules, sub-modules, units, sub-units according to embodiments of the disclosure may be implemented at least partly as a computer program module, which when executed, may perform a corresponding function.
For example, any plurality of the point cloud data conversion module 710, the first screening module 720 and the weather condition determination module 730 may be combined and implemented in one module/sub-module/unit/sub-unit, or any one of the modules/sub-modules/units/sub-units may be split into a plurality of modules/sub-modules/units/sub-units. Alternatively, at least part of the functionality of one or more of these modules/sub-modules/units/sub-units may be combined with at least part of the functionality of other modules/sub-modules/units/sub-units and implemented in one module/sub-module/unit/sub-unit. According to an embodiment of the present disclosure, at least one of the sub-modules may be implemented at least partially as a hardware circuit, such as a Field Programmable Gate Array (FPGA), a Programmable Logic Array (PLA), a system on a chip, a system on a substrate, a system on a package, an Application Specific Integrated Circuit (ASIC), or may be implemented in hardware or firmware in any other reasonable manner of integrating or packaging a circuit, or in any one of the three implementations of software, hardware, and firmware, or in a suitable combination of any of them. Alternatively, at least one of the sub-modules may be at least partly implemented as a computer program module, which, when executed, may perform a corresponding function.
It should be noted that the weather condition detection device portion in the embodiment of the present disclosure corresponds to the weather condition detection method portion in the embodiment of the present disclosure, and the description of the weather condition detection device portion specifically refers to the weather condition detection method portion, which is not described herein again.
Fig. 8 schematically shows a block diagram of an electronic device adapted to implement a weather condition detection method according to an embodiment of the present disclosure. The electronic device shown in fig. 8 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 8, an electronic device 800 according to an embodiment of the present disclosure includes a processor 801 that can perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 802 or a program loaded from a storage section 808 into a Random Access Memory (RAM) 803. The processor 801 may include, for example, a general purpose microprocessor (e.g., CPU), an instruction set processor and/or related chip sets and/or a special purpose microprocessor (e.g., an Application Specific Integrated Circuit (ASIC)), among others. The processor 801 may also include on-board memory for caching purposes. The processor 801 may include a single processing unit or multiple processing units for performing different actions of the method flows according to embodiments of the present disclosure.
In the RAM 803, various programs and data necessary for the operation of the electronic apparatus 800 are stored. The processor 801, the ROM 802, and the RAM 803 are connected to each other by a bus 804. The processor 801 performs various operations of the method flows according to the embodiments of the present disclosure by executing programs in the ROM 802 and/or RAM 803. Note that the programs may also be stored in one or more memories other than the ROM 802 and RAM 803. The processor 801 may also perform various operations of method flows according to embodiments of the present disclosure by executing programs stored in the one or more memories.
According to an embodiment of the present disclosure, the electronic device 800 may also include an input/output (I/O) interface 805, which is also connected to the bus 804. The electronic device 800 may also include one or more of the following components connected to the I/O interface 805: an input section 806 including a keyboard, a mouse, and the like; an output section 807 including a display such as a Cathode Ray Tube (CRT) or a Liquid Crystal Display (LCD), and a speaker; a storage section 808 including a hard disk and the like; and a communication section 809 including a network interface card such as a LAN card or a modem. The communication section 809 performs communication processing via a network such as the Internet. A drive 810 is also connected to the I/O interface 805 as necessary. A removable medium 811, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, is mounted on the drive 810 as necessary, so that a computer program read therefrom is installed into the storage section 808 as needed.
According to embodiments of the present disclosure, method flows according to embodiments of the present disclosure may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer-readable storage medium, the computer program containing program code for performing the method illustrated by the flow chart. In such an embodiment, the computer program can be downloaded and installed from a network through the communication section 809 and/or installed from the removable medium 811. The computer program, when executed by the processor 801, performs the above-described functions defined in the system of the embodiments of the present disclosure. The systems, devices, apparatuses, modules, units, etc. described above may be implemented by computer program modules according to embodiments of the present disclosure.
The present disclosure also provides a computer-readable storage medium, which may be contained in the apparatus/device/system described in the above embodiments; or may exist separately and not be assembled into the device/apparatus/system. The computer-readable storage medium carries one or more programs which, when executed, implement the method according to an embodiment of the disclosure.
According to an embodiment of the present disclosure, the computer-readable storage medium may be a non-volatile computer-readable storage medium. Examples may include, but are not limited to: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
For example, according to embodiments of the present disclosure, a computer-readable storage medium may include the ROM 802 and/or RAM 803 described above and/or one or more memories other than the ROM 802 and RAM 803.
Embodiments of the present disclosure also include a computer program product comprising a computer program containing program code for performing the method provided by embodiments of the present disclosure, which, when the computer program product is run on an electronic device, is adapted to cause the electronic device to carry out the method of weather condition detection provided by embodiments of the present disclosure.
The computer program, when executed by the processor 801, performs the above-described functions defined in the system/apparatus of the embodiments of the present disclosure. The systems, apparatuses, modules, units, etc. described above may be implemented by computer program modules according to embodiments of the present disclosure.
In one embodiment, the computer program may be hosted on a tangible storage medium such as an optical storage device, a magnetic storage device, and the like. In another embodiment, the computer program may also be transmitted in the form of a signal on a network medium, distributed, downloaded and installed via communication section 809, and/or installed from removable media 811. The computer program containing program code may be transmitted using any suitable network medium, including but not limited to: wireless, wired, etc., or any suitable combination of the foregoing.
In accordance with embodiments of the present disclosure, the program code for carrying out the computer programs provided by embodiments of the present disclosure may be written in any combination of one or more programming languages; in particular, these computer programs may be implemented using high level procedural and/or object oriented programming languages, and/or assembly/machine languages. The programming languages include, but are not limited to, Java, C++, Python, the "C" language, and the like. The program code may execute entirely on the user's computing device, partly on the user's device, partly on a remote computing device, or entirely on the remote computing device or server. In the latter case, the remote computing device may be connected to the user's computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or by combinations of special purpose hardware and computer instructions. It will be appreciated by those skilled in the art that various combinations and/or associations of the features recited in the various embodiments and/or claims of the present disclosure may be made, even if such combinations or associations are not explicitly recited in the present disclosure, provided they do not depart from the spirit and teachings of the present disclosure. All such combinations and/or associations are within the scope of the present disclosure.
The embodiments of the present disclosure have been described above. However, these examples are for illustrative purposes only and are not intended to limit the scope of the present disclosure. Although the embodiments are described separately above, this does not mean that the measures in the embodiments cannot be used in advantageous combination. The scope of the disclosure is defined by the appended claims and equivalents thereof. Various alternatives and modifications can be devised by those skilled in the art without departing from the scope of the present disclosure, and such alternatives and modifications are intended to be within the scope of the present disclosure.

Claims (16)

1. A weather condition detection method, comprising:
processing an initial point cloud data set based on a preset conversion rule to obtain a candidate depth image containing a candidate point cloud pixel set, wherein the initial point cloud data set comprises data obtained after a target detection device detects a space to be detected;
screening out first target point cloud pixels from the candidate point cloud pixel set according to a candidate point cloud pixel spacing between different candidate point cloud pixels in the candidate point cloud pixel set, wherein the first target point cloud pixels represent at least some features of a target object in the space to be detected; and
determining the abnormal weather condition of the space to be detected based on the first target point cloud pixels.
2. The method of claim 1, further comprising:
calculating the candidate point cloud pixel distance between adjacent candidate point cloud pixels according to the respective candidate point cloud pixel parameters of the adjacent candidate point cloud pixels in the candidate depth image;
wherein screening out the first target point cloud pixels from the candidate point cloud pixel set according to the candidate point cloud pixel spacing between different candidate point cloud pixels in the candidate point cloud pixel set comprises:
determining candidate point cloud pixels corresponding to a target point cloud pixel spacing as the first target point cloud pixels according to a comparison result of the candidate point cloud pixel spacing and a screening spacing threshold corresponding to the candidate point cloud pixel spacing, wherein the target point cloud pixel spacing is a candidate point cloud pixel spacing whose comparison result with the corresponding screening spacing threshold meets a preset condition.
3. The method of claim 2, wherein the candidate point cloud pixel spacings comprise a first direction candidate point cloud pixel spacing between candidate point cloud pixels that are adjacent in a first direction and a second direction candidate point cloud pixel spacing between candidate point cloud pixels that are adjacent in a second direction in the candidate depth image, the first direction and the second direction being two directions that intersect in the candidate depth image;
determining candidate point cloud pixels corresponding to a target point cloud pixel spacing as the first target point cloud pixel according to a comparison result of the candidate point cloud pixel spacing and a screening spacing threshold corresponding to the candidate point cloud pixel spacing comprises:
determining a first direction candidate point cloud pixel spacing meeting the preset condition as a first direction target point cloud pixel spacing according to the first direction comparison result of the first direction candidate point cloud pixel spacing and a screening spacing threshold corresponding to the first direction candidate point cloud pixel spacing;
deleting first-direction target point cloud pixels in the candidate point cloud pixel set to obtain a target candidate point cloud pixel set, wherein the first-direction target point cloud pixels are candidate point cloud pixels corresponding to the first-direction target point cloud pixel pitch;
determining, according to a second direction comparison result between a second direction candidate point cloud pixel spacing of adjacent target candidate point cloud pixels in the target candidate point cloud pixel set and a screening spacing threshold corresponding to the second direction candidate point cloud pixel spacing, the second direction candidate point cloud pixel spacing that meets the preset condition as a second direction target point cloud pixel spacing; and
respectively screening out, from the candidate point cloud pixel set, first target point cloud pixels corresponding to the first direction target point cloud pixel spacing and the second direction target point cloud pixel spacing.
4. The method of claim 2, wherein the screening pitch threshold corresponding to the candidate point cloud pixel pitch is calculated by:
determining a reference candidate point cloud pixel forming the candidate point cloud pixel spacing, wherein the reference candidate point cloud pixel is the candidate point cloud pixel that is closest to the target detection device among the candidate point cloud pixels forming the candidate point cloud pixel spacing;
calculating a reference distance between the reference candidate point cloud pixel and an adjacent ray according to the reference pixel position of the reference candidate point cloud pixel, wherein the adjacent ray is a ray corresponding to each of other candidate point cloud pixels adjacent to the reference candidate point cloud pixel;
and processing the reference distance based on a preset screening spacing rule to obtain the screening spacing threshold corresponding to the candidate point cloud pixel spacing.
5. The method of claim 4, wherein calculating the reference distance between the reference candidate point cloud pixel and the adjacent ray according to the reference pixel position of the reference candidate point cloud pixel comprises:
determining a virtual candidate point cloud pixel located at the reference pixel depth position on the adjacent ray according to the reference pixel depth position of the reference candidate point cloud pixel;
taking the depth position of the reference pixel as a preset radius, and calculating a reference arc distance between the reference candidate point cloud pixel and the virtual candidate point cloud pixel according to an included angle between a reference ray and the adjacent ray, wherein the reference arc distance is taken as the reference distance; or
taking the depth position of the reference pixel as the side length of a preset triangle, and calculating a reference linear distance between the reference candidate point cloud pixel and the virtual candidate point cloud pixel according to an included angle between the reference ray and the adjacent ray, wherein the reference linear distance is taken as the reference distance;
wherein the reference ray is the ray corresponding to the reference candidate point cloud pixel.
6. The method of claim 1, wherein processing the initial point cloud data set based on a predetermined transformation rule to obtain a candidate depth image comprising a candidate point cloud pixel set comprises:
determining a candidate point cloud area according to the detection performance parameters of the target detection device;
determining candidate point cloud data located in the candidate point cloud area from the initial point cloud data set according to the candidate point cloud area and the respective initial point cloud positions of the initial point cloud data in the initial point cloud data set; and
performing data conversion on the candidate point cloud data to obtain the candidate depth image containing the candidate point cloud pixel set.
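The conversion in claim 6 is, in effect, a spherical (range-image) projection: each return is binned by azimuth and elevation, and each bin keeps a depth. A hedged sketch assuming unorganized (x, y, z) points; the field-of-view and resolution defaults are made-up illustrative values, not from the patent:

```python
import math

def to_depth_image(points, h_res=0.25, v_res=0.5,
                   h_fov=(-180.0, 180.0), v_fov=(-15.0, 15.0)):
    """Project 3-D points into a sparse (row, col) -> depth map.

    Each cell keeps the nearest return, matching the later claims'
    interest in the candidate pixel closest to the detection device.
    """
    width = int((h_fov[1] - h_fov[0]) / h_res)
    height = int((v_fov[1] - v_fov[0]) / v_res)
    image = {}
    for x, y, z in points:
        depth = math.sqrt(x * x + y * y + z * z)
        if depth == 0.0:
            continue
        azimuth = math.degrees(math.atan2(y, x))
        elevation = math.degrees(math.asin(z / depth))
        col = int((azimuth - h_fov[0]) / h_res)
        row = int((elevation - v_fov[0]) / v_res)
        if 0 <= col < width and 0 <= row < height:
            cell = (row, col)
            image[cell] = min(image.get(cell, depth), depth)
    return image
```

Two returns on the same ray collapse into one cell holding the nearer depth, e.g. `to_depth_image([(1.0, 0.0, 0.0), (2.0, 0.0, 0.0)])` yields a single cell with depth 1.0.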
7. The method of claim 6, wherein determining the candidate point cloud area according to the detection performance parameters of the target detection device comprises:
determining a first detection distance and a second detection distance according to the detection performance parameters; and
determining the candidate point cloud area in the space to be detected based on the distance difference between the first detection distance and the second detection distance in the space to be detected.
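One plausible reading of claim 7 is an annular candidate region: only points whose range falls between a near distance and a far distance are kept. A small sketch under that assumption; the near/far defaults are illustrative, not values from the patent:

```python
def in_candidate_region(point, near=0.5, far=5.0):
    """True if the point's range lies between the first and second
    detection distances (an annular candidate region)."""
    x, y, z = point
    r = (x * x + y * y + z * z) ** 0.5
    return near <= r <= far
```

Filtering the initial point cloud data set through such a predicate would yield the candidate point cloud data of claim 6.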
8. The method of claim 6, wherein the detection performance parameters comprise at least one of:
angular resolution, detection distance, number of points.
9. The method of claim 1, wherein determining an abnormal weather condition of the space to be detected based on the first target point cloud pixel comprises:
determining a second target number according to the difference between the number of candidate point cloud pixels in the candidate point cloud pixel set and the first target number of first target point cloud pixels; and
determining that the weather condition in the space to be detected is an abnormal weather condition when the second target number is greater than or equal to a preset weather screening threshold.
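Claim 9's decision reduces to counting: candidate pixels not attributed to a solid object are presumed to be airborne returns (rain, snow, and so on), and enough of them flags abnormal weather. A sketch; the threshold value is illustrative, since the patent leaves it open:

```python
def is_abnormal_weather(num_candidate_pixels: int,
                        num_first_target_pixels: int,
                        screening_threshold: int) -> bool:
    """Second target number = candidate pixels not explained by the
    target object; weather is abnormal when it reaches the threshold."""
    second_target_number = num_candidate_pixels - num_first_target_pixels
    return second_target_number >= screening_threshold
```

For example, 100 candidate pixels of which only 60 belong to objects leaves 40 presumed-weather pixels, which exceeds a threshold of 30.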
10. The method of claim 9, further comprising:
filtering second target point cloud pixels out of the candidate point cloud pixel set to obtain a target detection point cloud pixel set, wherein the second target point cloud pixels are the candidate point cloud pixels other than the first target point cloud pixels in the candidate point cloud pixel set;
screening target detection point cloud data corresponding to the target detection point cloud pixels from the initial point cloud data set according to the target detection point cloud pixels in the target detection point cloud pixel set; and
detecting a target object in the space to be detected based on the target detection point cloud data.
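Claim 10 reuses the weather classification as a denoising filter: the second-target (presumed precipitation) pixels are dropped, and only the raw points behind the surviving pixels reach the object detector. A hedged sketch assuming pixels are hashable (row, col) keys mapped back to their source points; the mapping and function names are illustrative:

```python
def clean_for_detection(candidate_pixels, first_target_pixels, pixel_to_point):
    """Drop second-target (noise) pixels, then gather the raw points
    behind the remaining pixels for downstream object detection."""
    keep = set(first_target_pixels)
    surviving = [p for p in candidate_pixels if p in keep]
    return [pixel_to_point[p] for p in surviving if p in pixel_to_point]
```

This keeps the weather detector and the object detector on the same cleaned data, rather than filtering twice.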
11. The method of any one of claims 1 to 10, wherein the target detection device comprises at least one of:
laser radar detection device, millimeter wave radar detection device.
12. The method of any one of claims 1 to 10, wherein the abnormal weather condition comprises at least one of:
a rainfall weather condition, a snowfall weather condition, a hail weather condition, a sand blowing weather condition.
13. A weather condition detection apparatus comprising:
a point cloud data conversion module, wherein the point cloud data conversion module is used for processing an initial point cloud data set based on a preset conversion rule to obtain a candidate depth image containing a candidate point cloud pixel set, and the initial point cloud data set comprises data obtained by detecting a space to be detected through a target detection device;
the first screening module is used for screening out a first target point cloud pixel from the candidate point cloud pixel set according to the candidate point cloud pixel distance among different candidate point cloud pixels in the candidate point cloud pixel set, wherein the first target point cloud pixel represents at least part of characteristics of a target object in the space to be detected; and
the weather condition determining module is used for determining the abnormal weather condition of the space to be detected based on the first target point cloud pixel.
14. An electronic device, comprising:
one or more processors;
a memory for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any of claims 1-12.
15. A computer readable storage medium having stored thereon executable instructions which, when executed by a processor, cause the processor to carry out the method of any one of claims 1 to 12.
16. A computer program product comprising a computer program which, when executed by a processor, implements the method according to any one of claims 1 to 12.
CN202211390299.7A 2022-11-08 2022-11-08 Weather condition detection method, device, equipment and storage medium Pending CN115755097A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211390299.7A CN115755097A (en) 2022-11-08 2022-11-08 Weather condition detection method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211390299.7A CN115755097A (en) 2022-11-08 2022-11-08 Weather condition detection method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN115755097A true CN115755097A (en) 2023-03-07

Family

ID=85357451

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211390299.7A Pending CN115755097A (en) 2022-11-08 2022-11-08 Weather condition detection method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN115755097A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116259028A (en) * 2023-05-06 2023-06-13 杭州宏景智驾科技有限公司 Abnormal scene detection method for laser radar, electronic device and storage medium
CN117647852A (en) * 2024-01-29 2024-03-05 吉咖智能机器人有限公司 Weather state detection method and device, electronic equipment and storage medium
CN117647852B (en) * 2024-01-29 2024-04-09 吉咖智能机器人有限公司 Weather state detection method and device, electronic equipment and storage medium

Similar Documents

Publication Publication Date Title
CN109212543B (en) Calibration verification method for autonomous vehicle operation
CN109829351B (en) Method and device for detecting lane information and computer readable storage medium
CN110458854B (en) Road edge detection method and device
US10678260B2 (en) Calibration methods for autonomous vehicle operations
CN109635685B (en) Target object 3D detection method, device, medium and equipment
CN112634181B (en) Method and device for detecting ground point cloud points
US11250288B2 (en) Information processing apparatus and information processing method using correlation between attributes
CN115755097A (en) Weather condition detection method, device, equipment and storage medium
KR20200018612A (en) Method, apparatus and apparatus for object detection
US11204610B2 (en) Information processing apparatus, vehicle, and information processing method using correlation between attributes
CN112753038B (en) Method and device for identifying lane change trend of vehicle
CN113435237B (en) Object state recognition device, recognition method, and computer-readable recording medium, and control device
CN115273039B (en) Small obstacle detection method based on camera
KR101030317B1 (en) Apparatus for tracking obstacle using stereo vision and method thereof
CN112630798B (en) Method and apparatus for estimating ground
CN115457505A (en) Small obstacle detection method, device and equipment for camera and storage medium
CN115359332A (en) Data fusion method and device based on vehicle-road cooperation, electronic equipment and system
CN115761668A (en) Camera stain recognition method and device, vehicle and storage medium
US20220121859A1 (en) System and method for detecting an object collision
US20220221585A1 (en) Systems and methods for monitoring lidar sensor health
CN114565906A (en) Obstacle detection method, obstacle detection device, electronic device, and storage medium
CN112526477B (en) Method and device for processing information
CN114694106A (en) Extraction method and device of road detection area, computer equipment and storage medium
CN113158864B (en) Method and device for determining included angle between truck head and trailer
CN113379591B (en) Speed determination method, speed determination device, electronic device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination