CN117647852B - Weather state detection method and device, electronic equipment and storage medium - Google Patents


Info

Publication number
CN117647852B
CN117647852B (application CN202410115223.6A)
Authority
CN
China
Prior art keywords
point cloud
weather
weather state
reflectivity
cloud data
Prior art date
Legal status
Active
Application number
CN202410115223.6A
Other languages
Chinese (zh)
Other versions
CN117647852A (en)
Inventor
毛威 (Mao Wei)
曹亮 (Cao Liang)
韦松 (Wei Song)
Current Assignee
Jika Intelligent Robot Co ltd
Original Assignee
Jika Intelligent Robot Co ltd
Priority date
Filing date
Publication date
Application filed by Jika Intelligent Robot Co ltd
Priority to CN202410115223.6A
Publication of CN117647852A
Application granted
Publication of CN117647852B
Legal status: Active

Classifications

    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A: TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00: Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/10: Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Landscapes

  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a weather state detection method and device, electronic equipment, and a storage medium. The method comprises: receiving current-frame point cloud data and determining at least one clustered point cloud corresponding to it; determining a reflectivity distribution vector for each clustered point cloud; processing each reflectivity distribution vector with a pre-trained weather state detection model to obtain a weather state vector for each clustered point cloud; and determining the weather state corresponding to the current-frame point cloud data based on the weather state vectors. With this technical scheme, a vehicle can accurately detect the weather state of its environment without depending on a cloud server and with reduced communication cost, thereby improving the vehicle's response speed and driving safety.

Description

Weather state detection method and device, electronic equipment and storage medium
Technical Field
The present invention relates to the field of automatic driving technologies, and in particular, to a weather state detection method, a weather state detection device, an electronic device, and a storage medium.
Background
In urban or highway scenes, extreme weather such as rain, snow, or fog degrades driving conditions: the road surface becomes wet and slippery and visibility drops compared with sunny days, so a human driver usually slows down to keep the vehicle safe. An unmanned vehicle, however, cannot directly perceive the weather during driving and therefore cannot slow down in time, which increases the risk of driving accidents. Perceiving the weather state is thus of great importance for the driving safety of an autonomous vehicle.
At present, the weather state can be pushed to the autonomous vehicle in real time by a cloud server, but this requires substantial investment to build the cloud server and a real-time communication link between the vehicle and the cloud. Moreover, under poor communication conditions the vehicle may fail to receive the weather state at all, which can lead to safety accidents.
Disclosure of Invention
The invention provides a weather state detection method and device, electronic equipment, and a storage medium, so that a vehicle can accurately detect the weather state of its environment without depending on a cloud server and with reduced communication cost, thereby improving the vehicle's response speed and driving safety.
According to an aspect of the present invention, there is provided a weather condition detection method, the method comprising:
receiving current frame point cloud data, and determining at least one clustering point cloud corresponding to the current frame point cloud data;
respectively determining reflectivity distribution vectors corresponding to the clustering point clouds;
processing each reflectivity distribution vector based on a weather state detection model obtained through pre-training to obtain a weather state vector corresponding to each clustering point cloud;
And determining weather states corresponding to the current frame point cloud data based on the weather state vectors, wherein the weather states comprise weather categories and/or weather levels.
According to another aspect of the present invention, there is provided a weather condition detection apparatus, the apparatus comprising:
the point cloud data receiving module is used for receiving the point cloud data of the current frame and determining at least one clustering point cloud corresponding to the point cloud data of the current frame;
the reflectivity distribution vector determining module is used for respectively determining reflectivity distribution vectors corresponding to the clustering point clouds;
the weather state vector determining module is used for processing each reflectivity distribution vector based on a weather state detection model obtained through training in advance to obtain weather state vectors corresponding to each clustering point cloud;
and the weather state determining module is used for determining weather states corresponding to the current frame point cloud data based on the weather state vectors, wherein the weather states comprise weather categories and/or weather levels.
According to another aspect of the present invention, there is provided an electronic apparatus including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
The memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the weather condition detection method according to any one of the embodiments of the present invention.
According to another aspect of the present invention, there is provided a computer-readable storage medium storing computer instructions for causing a processor to execute the weather condition detection method according to any embodiment of the present invention.
According to the technical scheme, current-frame point cloud data is received and at least one corresponding clustered point cloud is determined; a reflectivity distribution vector is determined for each clustered point cloud; each reflectivity distribution vector is processed by a pre-trained weather state detection model to obtain a weather state vector for each clustered point cloud; and the weather state corresponding to the current-frame point cloud data is determined from these weather state vectors. This solves the problem in the related art that the vehicle side cannot receive the weather state in time, or at all, and may consequently suffer safety accidents; it achieves accurate on-vehicle detection of the ambient weather state without cloud dependence and with reduced communication cost, improving the vehicle's response speed and driving safety.
It should be understood that the description in this section is not intended to identify key or critical features of the embodiments of the invention or to delineate the scope of the invention. Other features of the present invention will become apparent from the description that follows.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required for the description of the embodiments will be briefly described below, and it is apparent that the drawings in the following description are only some embodiments of the present invention, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of a method for detecting weather conditions according to a first embodiment of the present invention;
FIG. 2 is a schematic view of a corresponding reflectance distribution histogram under sunny conditions according to a first embodiment of the present invention;
FIG. 3 is a schematic view of a corresponding reflectance distribution histogram under rainy conditions according to a first embodiment of the present invention;
FIG. 4 is a flow chart of a method for detecting weather conditions according to a second embodiment of the present invention;
FIG. 5 is a flow chart of a method for detecting weather conditions according to a third embodiment of the present invention;
Fig. 6 is a schematic structural diagram of a weather condition detection device according to a fourth embodiment of the present invention;
fig. 7 is a schematic structural diagram of an electronic device implementing a weather condition detection method according to an embodiment of the present invention.
Detailed Description
In order that those skilled in the art will better understand the present invention, a technical solution in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in which it is apparent that the described embodiments are only some embodiments of the present invention, not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the present invention without making any inventive effort, shall fall within the scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present invention and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the invention described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Example 1
Fig. 1 is a flowchart of a weather state detection method according to a first embodiment of the present invention, where the method may be applicable to a case of detecting a weather state of an environment where a vehicle is located, and the method may be performed by a weather state detection device, where the weather state detection device may be implemented in a form of hardware and/or software, and the weather state detection device may be configured in a terminal and/or a server. As shown in fig. 1, the method includes:
s110, receiving the point cloud data of the current frame, and determining at least one clustering point cloud corresponding to the point cloud data of the current frame.
In this embodiment, the current-frame point cloud data may be understood as the point cloud data received at the current time. Point cloud data may be transmitted from the radar device to the receiver at a fixed frequency, for example 10 Hz, i.e., one frame of point cloud every 0.1 s. The current-frame point cloud data may be the data obtained by the vehicle-mounted radar device scanning the vehicle's surroundings during driving and transmitted to the on-board processor; the data received by the processor at the current time then serves as the current-frame point cloud data. It may be three-dimensional point cloud data, composed of the data corresponding to a plurality of spatial points. Optionally, the vehicle-mounted radar device may be an ultrasonic radar, a laser radar, a microwave radar, a millimeter-wave radar, or the like. A clustered point cloud may be understood as a spatial cluster region contained in the point cloud data, i.e., any cluster of objects in the data; optionally, clustered point clouds may include road-boundary clusters and/or vehicle clusters, etc.
In practical application, in order to detect the weather state at the current moment in the running process of the vehicle, current frame point cloud data capable of representing the surrounding environment of the vehicle can be obtained. Furthermore, in order to extract meaningful feature points from the current frame point cloud data, the current frame point cloud data may be clustered, and at least one clustered point cloud corresponding to the current frame point cloud data may be obtained. It should be noted that, there may be various ways of determining the clustered point cloud, for example, processing the point cloud data based on a three-dimensional clustering algorithm or processing the two-dimensional view corresponding to the point cloud data based on a two-dimensional clustering algorithm. In general, the amount of data of the point cloud data is huge, the point cloud data is directly processed, the requirements on computing power and memory are high, and the technical scheme is difficult to deploy in a vehicle-mounted processing platform, so that the point cloud data of the current frame can be converted into a two-dimensional image for the embodiment. Furthermore, the two-dimensional images can be clustered to obtain image data corresponding to at least one clustering area corresponding to the two-dimensional images, and further, the image data corresponding to each clustering area can be converted into three-dimensional point cloud data, so that at least one clustering point cloud corresponding to the point cloud data of the current frame can be obtained.
Optionally, determining at least one cluster point cloud corresponding to the current frame point cloud data includes: performing ground point cloud elimination processing on the point cloud data of the current frame to obtain point cloud data to be processed, and performing preset area point cloud extraction processing on the point cloud data to be processed to obtain target point cloud data; carrying out projection processing on the target point cloud data according to a preset view angle to obtain an image to be processed and image point cloud association information; processing the image to be processed based on a preset clustering algorithm to obtain image data corresponding to at least one clustering area included in the image to be processed; and determining at least one clustering point cloud corresponding to the point cloud data of the current frame according to the image data corresponding to the at least one clustering region and the image point cloud association information.
In this embodiment, the ground point cloud may be understood as point cloud data representing the road surface on which the vehicle is traveling. The preset region may be understood as a preset region of interest; it may be any area within the radar scanning range of the vehicle, optionally a cuboid region defined ahead of the vehicle. The preset-region point cloud is the non-ground point cloud within that region. Usually, after the ground point cloud is removed, one frame of point cloud data still contains non-traffic-participant categories such as buildings and vegetation, which do not participate in object detection and tracking tasks. Therefore, to reduce the number of input points in subsequent processing steps and improve noise-filtering efficiency, a preset region can be defined, the non-ground points within it extracted, and the extracted points used as the target point cloud data. The preset viewing angle may be understood as a preset projection viewing angle for three-dimensional data; it may be any viewing angle, optionally a bird's eye view (BEV). A BEV views an object or scene from above, as a bird looks down on the ground from the sky. In autonomous driving and robotics, data acquired by sensors (e.g., radar and cameras) is typically converted to a BEV representation for better object detection, path planning, and the like; a BEV simplifies a complex three-dimensional environment into a two-dimensional image, which is particularly important for efficient computation in real-time systems. The image to be processed may be understood as the two-dimensional image corresponding to the point cloud data; it may be a two-dimensional image of any form.
The image point cloud association information is used for indicating an association relationship between a pixel point in the image to be processed and a spatial point in the target point cloud data. The image point cloud association information may be understood as information characterizing a mapping relationship between pixel points in an image and spatial points in the point cloud data.
In this embodiment, the preset clustering algorithm may be understood as a preset image clustering algorithm. It may be any algorithm; optionally, it may be the connected component labeling (Connected Component Labeling) algorithm, a classical binary-image clustering algorithm in computer vision. A cluster region may be understood as a cluster region of pixels in the image, i.e., any cluster of objects contained in the image; optionally, cluster regions may include road-boundary clusters, vehicle clusters, and the like. The image data corresponding to a cluster region may be understood as the corresponding image region (in computer vision terms, an image patch).
In practical applications, after receiving the current frame point cloud data, principal component analysis (Principal Components Analysis, PCA) may be used to calculate a normal vector of the current frame point cloud data, and a point with a normal vector approximately vertical is selected as a candidate ground point set. And (3) carrying out plane fitting on the candidate ground point set by applying a random sampling consistency (Random Sample Consensus, RANSAC) algorithm to obtain a ground point cloud set. Furthermore, the ground point cloud set can be removed from the point cloud data of the current frame to obtain point cloud data to be processed.
Furthermore, after the point cloud data to be processed is obtained, the data volume of the point cloud data is still huge, in order to reduce the number of input point clouds in the subsequent processing step and improve the noise filtering efficiency, a preset area can be determined from a scanning area corresponding to the vehicle-mounted radar equipment, and non-ground point clouds in the area can be extracted. Further, the extracted non-ground point cloud may be used as target point cloud data.
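The preset-region extraction described above amounts to a cuboid crop of the non-ground points. A minimal sketch follows; the axis bounds are hypothetical values for illustration, since the patent does not fix the region's dimensions:

```python
import numpy as np

def extract_roi(points, x=(0.0, 40.0), y=(-10.0, 10.0), z=(-1.0, 3.0)):
    """Keep only points inside a preset cuboid region of interest ahead of
    the vehicle; the surviving points are the target point cloud data."""
    m = ((points[:, 0] >= x[0]) & (points[:, 0] < x[1]) &
         (points[:, 1] >= y[0]) & (points[:, 1] < y[1]) &
         (points[:, 2] >= z[0]) & (points[:, 2] < z[1]))
    return points[m]

pts = np.array([[5.0,  0.0, 0.5],    # inside the region
                [50.0, 0.0, 0.5],    # too far ahead
                [5.0, 20.0, 0.5]])   # too far to the side
target = extract_roi(pts)
```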
Furthermore, after the target point cloud data is obtained, it is still three-dimensional; processing it directly places high demands on the computing power and memory of the device and is difficult to execute on a vehicle-mounted processing platform. Therefore, the point cloud data can be converted into a two-dimensional image by projection. Specifically, the target point cloud data is projected into the preset viewing angle and discretized into a two-dimensional image: pixels occupied by the target point cloud data are set to a first preset pixel value, and unoccupied pixels to a second preset pixel value. This yields the image to be processed, and mapping the image to the target point cloud data yields the image point cloud association information.
Further, after the image to be processed is obtained, the image to be processed may be processed by adopting a preset clustering algorithm, and further, a set of pixel points corresponding to at least one cluster included in the image to be processed may be output. Thus, each pixel point set can be used as image data corresponding to a corresponding clustering area. And then, the image to be processed is projected based on the target point cloud data, a certain mapping relation exists between the image to be processed and the target point cloud data, and the image point cloud association information can be determined based on the mapping relation. After obtaining the image data corresponding to the at least one clustering area, determining the point cloud data corresponding to each clustering area by combining the image point cloud association information, and taking the obtained point cloud data as at least one clustering point cloud corresponding to the point cloud data of the current frame.
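The projection-and-cluster pipeline above can be sketched as follows: points are discretized into a BEV occupancy grid, occupied cells are labeled by 4-connected component search (a breadth-first flood fill stands in for the connected component labeling algorithm), and the cell-to-point map plays the role of the image point cloud association information. Grid resolution, ranges, and connectivity are illustrative assumptions:

```python
import numpy as np
from collections import deque

def bev_cluster(points, res=0.5, x_range=(0, 20), y_range=(-10, 10)):
    """Project points to a BEV grid, label 4-connected components, and
    return one array of original point indices per clustered point cloud."""
    xi = ((points[:, 0] - x_range[0]) / res).astype(int)
    yi = ((points[:, 1] - y_range[0]) / res).astype(int)
    h = int((x_range[1] - x_range[0]) / res)
    w = int((y_range[1] - y_range[0]) / res)
    keep = (xi >= 0) & (xi < h) & (yi >= 0) & (yi < w)
    grid = np.zeros((h, w), dtype=int)   # 0 = empty, 1 = occupied
    cell_points = {}                     # image <-> point cloud association
    for idx in np.flatnonzero(keep):
        grid[xi[idx], yi[idx]] = 1
        cell_points.setdefault((xi[idx], yi[idx]), []).append(idx)
    clusters, label = [], 1
    for sx, sy in zip(*np.nonzero(grid == 1)):
        if grid[sx, sy] != 1:
            continue                     # already absorbed into a cluster
        label += 1
        grid[sx, sy] = label
        members, queue = [], deque([(sx, sy)])
        while queue:                     # flood fill over 4-neighbors
            cx, cy = queue.popleft()
            members.extend(cell_points[(cx, cy)])
            for nx, ny in ((cx+1, cy), (cx-1, cy), (cx, cy+1), (cx, cy-1)):
                if 0 <= nx < h and 0 <= ny < w and grid[nx, ny] == 1:
                    grid[nx, ny] = label
                    queue.append((nx, ny))
        clusters.append(np.array(members))
    return clusters

# Two nearby points form one cluster; a distant point forms another.
pts = np.array([[1.0, 0.0, 1.0], [1.2, 0.1, 1.1], [10.0, 5.0, 1.0]])
clusters = bev_cluster(pts)
```

Mapping each index array back into the original point array recovers the clustered point clouds in three dimensions.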
S120, respectively determining reflectivity distribution vectors corresponding to the clustering point clouds.
In this embodiment, the reflectance distribution vector may be understood as a vector characterizing the reflectance distribution situation of each spatial point in the point cloud data. The reflectivity distribution vector may describe the reflectivity characteristics of the clustered point cloud quantitatively in its entirety. For the point cloud data, each spatial point can comprise reflectivity, and the reflectivity reflects the signal intensity of laser emitted by the laser radar and reflected back after encountering an object. The point cloud data scanned by the laser radar may include, in addition to the position information of each spatial point from the current vehicle, the reflectivity of each spatial point.
In practical application, for each cluster point cloud, the reflectivity of each spatial point in the current cluster point cloud and the total number of spatial points included in the current cluster point cloud may be obtained. Furthermore, the reflectivity distribution vector corresponding to the current cluster point cloud can be determined according to the reflectivities and the total number of the space points.
Optionally, determining the reflectivity distribution vector corresponding to each cluster point cloud respectively includes: for each cluster point cloud, acquiring the reflectivity of each space point in the current cluster point cloud, and carrying out normalization processing on each reflectivity to obtain a target reflectivity; for each target reflectivity, determining the target reflectivity as the space point number of the current target reflectivity based on the current cluster point cloud, determining the total space point number of the current cluster point cloud, and determining the reflectivity frequency corresponding to the current target reflectivity based on the space point number and the total space point number; and determining a reflectivity distribution vector corresponding to the current cluster point cloud based on each reflectivity frequency.
In general, the reflectivities of the spatial points in the point cloud data may lie in different numeric ranges; therefore, in this embodiment, each reflectivity may be normalized to restrict it to a preset integer interval. For example, each reflectivity may be normalized to the integer interval 0 to 255, and the normalized value taken as the target reflectivity. The reflectivity frequency is a value characterizing how often a given reflectivity occurs within the clustered point cloud it belongs to.
In practical application, for each cluster point cloud, the reflectivity of each spatial point in the current cluster point cloud can be obtained based on the current frame point cloud data. Further, the reflectances may be normalized to limit the reflectances to a predetermined integer interval, and a target reflectivity corresponding to each spatial point may be obtained. Further, for each target reflectivity, the number of spatial points whose target reflectivity is the current target reflectivity may be determined from the current cluster point cloud, and the total number of spatial points included in the current cluster point cloud is determined. Further, the reflectivity frequency corresponding to the current target reflectivity may be determined according to the determined number of spatial points and the total number of spatial points.
Optionally, determining the reflectivity frequency corresponding to the current target reflectivity based on the number of spatial points and the total number of spatial points includes: and determining the ratio of the number of the space points to the total number of the space points as the reflectivity frequency corresponding to the current target reflectivity.
In practical applications, after the number of spatial points and the total number of spatial points are obtained, the ratio between them may be determined and used as the reflectivity frequency corresponding to the current target reflectivity. Illustratively, assume the current clustered point cloud contains 100 spatial points in total, the current target reflectivity is 4, and 10 spatial points in the cluster have target reflectivity 4. The reflectivity frequency for target reflectivity 4 is then 10/100 = 0.1.
Further, after obtaining the reflectivity frequencies corresponding to the reflectivities, the reflectivity frequencies corresponding to the reflectivities may be stored in a pre-constructed multidimensional vector. Further, a reflectance distribution vector can be obtained. For example, the reflectivity frequency may be stored in a 256-dimensional vector, where the 1 st dimension to the 256 th dimension of the 256-dimensional vector sequentially correspond to the reflectivity normalized integer intervals 0 to 255, and a 256-dimensional reflectivity distribution vector may be obtained, where the 1 st dimension corresponds to a value of the reflectivity frequency with the target reflectivity of 0, the 2 nd dimension corresponds to a value of the reflectivity frequency with the target reflectivity of 1, and so on.
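The normalization, frequency counting, and 256-dimensional vector construction can be sketched together. Min-max scaling is an assumption here; the patent only requires mapping reflectivities into the integer interval 0 to 255:

```python
import numpy as np

def reflectivity_vector(reflectivities, bins=256):
    """Normalize raw reflectivities to integers 0..bins-1 (min-max, an
    illustrative choice) and return per-value frequencies (count / total
    points) as the 256-D reflectivity distribution vector."""
    r = np.asarray(reflectivities, dtype=float)
    lo, hi = r.min(), r.max()
    scaled = (np.zeros(len(r), dtype=int) if hi == lo
              else np.round((r - lo) / (hi - lo) * (bins - 1)).astype(int))
    return np.bincount(scaled, minlength=bins) / len(r)

# 100 points, 10 of which share the highest raw reflectivity (4):
# that value maps to bin 255 with frequency 10/100 = 0.1.
vec = reflectivity_vector([4.0] * 10 + [0.0] * 90)
```

Plotting `vec` against its bin index reproduces the reflectivity distribution histogram of Figs. 2 and 3.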
After obtaining the reflectance frequencies corresponding to the respective reflectances, a reflectance distribution histogram may be generated. The reflectance distribution histogram is in one-to-one correspondence with the reflectance distribution vector. In the reflectivity distribution histogram, the abscissa can be used for representing reflectivity, and the value range of the reflectivity is an integer interval (for example, integer interval 0-255) corresponding to normalization; the ordinate represents the reflectivity frequency, i.e. the ratio between the number of points of the abscissa corresponding value and the total number of points of the cluster point cloud. For example, fig. 2 is a reflectance distribution histogram corresponding to a vehicle cluster point cloud under a sunny condition, and fig. 3 is a reflectance distribution histogram corresponding to a vehicle cluster point cloud under a rainy condition. As can be seen from fig. 2 and 3, raindrops in the rainy day air attenuate the energy of the returned laser radar point cloud, reducing the reflectivity of the returned spatial points so that the reflectivity of most spatial points is at a level close to 0. Similarly, in extreme weather conditions such as foggy and snowy days, the reflectivity of the point cloud is also weakened, and similar features with reflectivity close to 0 are exhibited.
And S130, processing each reflectivity distribution vector based on a weather state detection model obtained through pre-training to obtain weather state vectors corresponding to each cluster point cloud.
In this embodiment, the weather state detection model may be a neural network model that takes a reflectivity distribution vector as input and detects the weather state corresponding to each clustered point cloud. It may be a neural network of any structure; optionally, it may be a multi-layer perceptron (MLP). Because an MLP has far fewer parameters than mainstream deep learning models and is a lightweight network, it is very efficient to quantize and deploy on a vehicle-mounted processing platform, where the weather state detection model can achieve real-time weather perception. A weather state vector may be understood as a vector characterizing a weather state. It may include at least two elements characterizing the weather state, optionally a weather category element and a weather level element: the former is a numerical value characterizing the weather category, the latter a numerical value characterizing the weather level. Illustratively, each element may take any value between 0 and 1.
In practical application, after the reflectivity distribution vectors corresponding to the clustering point clouds are obtained, the reflectivity distribution vectors can be input into a weather state detection model obtained through training in advance. Further, each reflectance distribution vector may be processed based on the weather condition detection model. Thus, the weather state vector corresponding to each cluster point cloud can be obtained.
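The model interface described above can be sketched as follows. The layer sizes, random weights and numpy implementation are purely illustrative stand-ins for the trained MLP, chosen only to show a 256-dimensional reflectivity distribution vector being mapped to a 2-element weather state vector whose elements lie in (0, 1).

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class WeatherMLP:
    """Toy stand-in for the weather state detection model: a two-layer
    perceptron mapping a reflectivity distribution vector to a 2-element
    weather state vector (category element, level element), both squashed
    into (0, 1) by the sigmoid output."""

    def __init__(self, in_dim=256, hidden=32, seed=0):
        rng = np.random.default_rng(seed)
        self.w1 = rng.normal(0.0, 0.1, (in_dim, hidden))
        self.b1 = np.zeros(hidden)
        self.w2 = rng.normal(0.0, 0.1, (hidden, 2))
        self.b2 = np.zeros(2)

    def forward(self, x):
        h = np.tanh(x @ self.w1 + self.b1)
        return sigmoid(h @ self.w2 + self.b2)  # weather state vector

model = WeatherMLP()
state = model.forward(np.ones(256) / 256.0)  # uniform dummy input
```

Such a network has only a few thousand parameters, which is consistent with the text's point that a lightweight MLP is easy to quantize and deploy on a vehicle-mounted platform.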
And S140, determining weather states corresponding to the point cloud data of the current frame based on the weather state vectors.
In this embodiment, the weather status may include a weather category and a weather level. Weather category may be understood as the category of weather. Weather level may be understood as the degree to which a weather category corresponds.
It should be noted that the current frame point cloud data may correspond to one or more clustered point clouds, and therefore to one or more reflectivity distribution vectors and weather state vectors. When determining the weather state corresponding to the current frame point cloud data, the determination may depend on the number of clustered point clouds: the case of a single clustered point cloud and the case of multiple clustered point clouds use different determination modes, which are described below.
Optionally, in the case that the clustered point cloud is one, determining, based on each weather state vector, a weather state corresponding to the current frame point cloud data includes: and under the condition that the clustering point cloud is one, taking the weather state vector corresponding to the clustering point cloud as the weather state vector corresponding to the current frame point cloud data, and determining the weather state based on the weather state vector.
In practical application, under the condition that the number of the clustered point clouds corresponding to the current frame point cloud data is one, the weather state vector corresponding to the clustered point clouds can be used as the weather state vector corresponding to the current frame point cloud data. Further, the weather state corresponding to the current frame point cloud data can be obtained by determining the corresponding weather category and weather level according to the values corresponding to the elements in the weather state vector.
Optionally, determining the weather state based on the weather state vector includes: if the value corresponding to the weather category element is smaller than a first preset threshold value, the weather category is the first category; if the value corresponding to the weather category element is not smaller than the first preset threshold value, the weather category is the second category; if the value corresponding to the weather grade element is smaller than the second preset threshold value, the weather grade is the first grade; if the value corresponding to the weather grade element is larger than the second preset threshold value and smaller than the third preset threshold value, the weather grade is the second grade; if the value corresponding to the weather level element is larger than a third preset threshold value, the weather level is a third level.
In this embodiment, the first preset threshold may be any value, and optionally, may be 0.5. The first category may be any weather category, and optionally, may be sunny. The second category may be any weather category, and optionally, may be a rainy, foggy or snowy day. The second preset threshold may be any value, alternatively, may be 0.4. The first level may be any weather level, and optionally may be light. The third preset threshold may be any value, alternatively, may be 0.7. The second level may be any weather level, and optionally, may be moderate. The third level may be any weather level, and optionally, may be severe.
In practical applications, when determining the weather state corresponding to the current frame point cloud data based on the weather state vector, the weather state may be determined according to the numerical values of the respective elements included in the weather state vector. For the weather category, if the value corresponding to the weather category element in the weather state vector is smaller than a first preset threshold value, the weather category can be determined as a first category; if the value corresponding to the weather category element in the weather state vector is not smaller than the first preset threshold, the weather category can be determined to be the second category. For the weather grade, if the value corresponding to the weather grade element in the weather state vector is smaller than the second preset threshold value, the weather grade can be determined to be the first grade; if the value corresponding to the weather grade element in the weather state vector is greater than the second preset threshold value and less than the third preset threshold value, determining the weather grade as the second grade; if the value corresponding to the weather level element in the weather state vector is greater than the third preset threshold, the weather level may be determined to be the third level.
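The threshold rules above can be written as a small decoding function. The threshold values and the category/level names follow the optional values given in this embodiment (0.5, 0.4 and 0.7; sunny vs. rain/fog/snow; light, moderate, severe), and the handling of values exactly equal to a threshold, which the text leaves open, is an assumption here.

```python
def decode_weather_state(state, t1=0.5, t2=0.4, t3=0.7):
    """Map a weather state vector (category element, level element) to a
    readable weather state using the first, second and third preset
    thresholds described in the text."""
    category_elem, level_elem = state
    # Category: below the first preset threshold -> first category.
    category = "sunny" if category_elem < t1 else "rain/fog/snow"
    # Level: three bands split by the second and third preset thresholds.
    if level_elem < t2:
        level = "light"
    elif level_elem < t3:
        level = "moderate"
    else:
        level = "severe"
    return category, level
```

For example, a vector of (0.8, 0.5) would decode to a moderate rain/fog/snow state.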
Optionally, in the case that the clustered point cloud is plural, determining, based on each weather state vector, a weather state corresponding to the current frame point cloud data includes: when the number of the clustered point clouds is multiple, weather states corresponding to the clustered point clouds are determined based on weather state vectors corresponding to the clustered point clouds, and weather states corresponding to the current frame point cloud data are determined based on the weather states.
In practical application, when the number of the clustered point clouds corresponding to the current frame point cloud data is multiple, the weather state corresponding to each clustered point cloud may be determined according to the weather state vector corresponding to each clustered point cloud. The weather state corresponding to each cluster point cloud can be determined based on the mode of determining the weather state corresponding to the current frame point cloud data under the condition that the cluster point cloud is one. Further, weather states corresponding to the clustering point clouds can be obtained. Further, each weather state can be processed based on a voting method to obtain a weather state corresponding to the current frame point cloud data, namely, the weather category with the largest occurrence number in each weather state is used as the weather category corresponding to the current frame point cloud data; and taking the weather grade with the largest occurrence number in each weather state as the weather grade corresponding to the current frame point cloud data. Furthermore, the determined weather category and weather level can be used as the weather state corresponding to the current frame point cloud data.
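The voting method for multiple clustered point clouds can be sketched as follows; the per-cluster weather states in the example are hypothetical.

```python
from collections import Counter

def vote_weather_state(states):
    """Fuse per-cluster weather states by voting: the most frequent
    weather category and the most frequent weather level each win
    independently, giving the frame-level weather state."""
    categories = Counter(category for category, _level in states)
    levels = Counter(level for _category, level in states)
    return categories.most_common(1)[0][0], levels.most_common(1)[0][0]

# Three clusters: categories rainy x2 / sunny x1, levels moderate x2 /
# light x1, so the frame-level state is (rainy, moderate).
frame_state = vote_weather_state([
    ("rainy", "moderate"), ("rainy", "light"), ("sunny", "moderate"),
])
```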
In order to further improve the stability of the weather state judgment for the current frame point cloud data, when historical frame point cloud data exists before the current frame point cloud data, the weather state corresponding to the current frame point cloud data may be updated in combination with the weather states corresponding to the historical frame point cloud data.
Based on this, the above technical scheme further includes: if a preset number of historical frame point cloud data exist before the current frame point cloud data, acquiring the weather states corresponding to the historical frame point cloud data; and updating the weather state corresponding to the current frame point cloud data based on the weather states corresponding to the historical frame point cloud data.
In this embodiment, the preset number may be any number, alternatively, may be 5, 10, 15, or the like. The historical frame point cloud data before the current frame point cloud data can be point cloud data selected from multi-frame point cloud data between the first frame point cloud data and the current frame point cloud data according to a preset screening standard. The preset screening criteria may be any criteria preset for selecting the historical frame point cloud data, and optionally, may be selected according to a preset step size, or selected randomly, etc. In practical applications, the vehicle speed may be slow and/or the scanning frequency of the laser radar may be too high, which may result in too high similarity between adjacent frame point cloud data, so the historical frame point cloud data may be point cloud data selected from the received multi-frame point cloud data.
In practical application, after obtaining the weather state corresponding to the current frame point cloud data, it may also be determined whether a preset number of historical frame point cloud data exist before the current frame point cloud data. If so, the weather states corresponding to the historical frame point cloud data may be acquired. Furthermore, the weather category and weather level that occur most frequently among the weather states of the historical frames and of the current frame may be determined, and the determined weather category and weather level may be used as the updated weather state of the current frame point cloud data. For example, if there are 5 frames of historical point cloud data whose weather categories are sunny, sunny, sunny, rainy and snowy respectively, and the weather category corresponding to the current frame point cloud data is rainy, then sunny occurs most frequently, so the updated weather category of the current frame point cloud data is sunny, which is taken as the weather category finally output for the current frame.
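The history-based update can be sketched in the same voting style as the per-cluster fusion; the frame categories below are hypothetical, and the weather level would be smoothed identically.

```python
from collections import Counter

def update_with_history(current_category, history_categories):
    """Smooth the current frame's weather category using a preset number
    of historical frames: the most frequent category across the history
    plus the current frame is kept as the updated category."""
    counts = Counter(history_categories + [current_category])
    return counts.most_common(1)[0][0]

# Five historical frames vote sunny three times, so a rainy current
# frame is updated to sunny.
updated = update_with_history(
    "rainy", ["sunny", "sunny", "sunny", "rainy", "snowy"])
```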
According to the technical scheme of this embodiment, at least one clustered point cloud corresponding to the current frame point cloud data is determined, and the reflectivity distribution vector corresponding to each clustered point cloud is determined; each reflectivity distribution vector is processed based on a pre-trained weather state detection model to obtain the weather state vector corresponding to each clustered point cloud; and the weather state corresponding to the current frame point cloud data is determined based on the weather state vectors. This solves the problem in the related art that the vehicle end may fail to receive the weather state in time, or at all, and the vehicle may consequently suffer a safety accident; the vehicle can accurately detect the weather state of its environment without depending on the cloud and with reduced communication cost, thereby improving its response rate and driving safety.
Example two
Fig. 4 is a flowchart of a weather condition detection method according to a second embodiment of the present invention, where, based on the foregoing embodiment, before the reflectivity distribution vector is processed based on the weather condition detection model, the weather condition detection model may be trained based on training samples, so as to obtain a trained weather condition detection model. The specific implementation manner can be seen in the technical scheme of the embodiment. Wherein, the technical terms identical or similar to those of the above embodiments are not repeated herein.
As shown in fig. 4, the method includes:
s210, training to obtain a weather state detection model.
It should be noted that, before the weather state detection model provided by the embodiment of the present invention is applied, the pre-built model to be trained needs to be trained first. Before training the model, a plurality of training samples may be constructed so that the model can be trained on them. To improve the accuracy of the weather state detection model, the training samples should be as numerous and as diverse as possible.
Optionally, training to obtain a weather state detection model includes: acquiring a plurality of training samples; for each training sample, the reflectivity distribution vector of at least one clustering point cloud corresponding to sample point cloud data in the current training sample is input into a weather state detection model to detect weather states, and a predicted weather state vector is output; and determining a loss value based on the predicted weather state vector and the actual weather state vector in the current training sample, and correcting model parameters in the weather state detection model based on the loss value until a loss function in the weather state detection model converges.
The training samples comprise reflectivity distribution vectors of at least one clustering point cloud corresponding to sample frame point cloud data and actual weather state vectors corresponding to the clustering point clouds.
In this embodiment, the sample frame point cloud data may be point cloud data obtained by radar scanning, point cloud data reconstructed by a point cloud reconstruction model, or point cloud data stored in advance in a storage space. The actual weather state vector may be the weather state vector determined from the actual weather condition at the scanning time of the corresponding frame point cloud data. Illustratively, the actual weather category element may encode sunny as 1 and rainy, foggy or snowy as 2, and the actual weather level element may encode light, moderate and severe as 1, 2 and 3, respectively. The predicted weather state vector may be the weather state vector output after the reflectivity distribution vector of at least one clustered point cloud corresponding to the sample frame point cloud data is input to the weather state detection model. The loss value may be a value characterizing the degree of difference between the predicted output and the actual output, and the loss function may be a function, determined from the loss values, that characterizes that difference.
In practical application, a plurality of sample frame point cloud data can be obtained, and for each sample frame point cloud data, at least one cluster point cloud corresponding to the current sample frame point cloud data can be determined, and further, the reflectivity distribution vector corresponding to each cluster point cloud can be respectively determined. And the actual weather state vector corresponding to each cluster point cloud can be determined according to the actual weather condition corresponding to each cluster point cloud. Further, a training sample can be constructed according to the reflectivity distribution vector of at least one clustering point cloud corresponding to the sample frame point cloud data and the actual weather state vector corresponding to each clustering point cloud. Further, a plurality of training samples can be obtained.
Further, for each training sample, the reflectivity distribution vector of at least one clustered point cloud corresponding to the sample frame point cloud data in the current training sample may be input into the weather state detection model for weather state detection, and the predicted weather state vector corresponding to each clustered point cloud may be output. Each predicted weather state vector may then be compared with the actual weather state vector of the corresponding clustered point cloud in the current training sample to determine a loss value, and the model parameters in the weather state detection model may be corrected based on the loss value. The training error of the loss function, i.e. the loss value, may be used as the condition for detecting convergence: for example, whether the training error is smaller than a preset error, whether the error trend has stabilized, or whether the current number of iterations equals a preset number. If the convergence condition is reached, for example the training error is smaller than the preset error or the error change has stabilized, training of the weather state detection model is complete and the iterative training may be stopped. Otherwise, further training samples may be obtained to continue training until the training error of the loss function falls within the preset range. When the training error of the loss function reaches convergence, the trained weather state detection model is obtained.
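The training procedure of S210 can be sketched with a self-contained numpy example. The single dense layer is a deliberately simplified stand-in for the MLP, and the synthetic reflectivity distribution vectors, target vectors, learning rate and convergence threshold are all assumptions for illustration; rainy-like samples concentrate their mass in the low-reflectivity bins, as described in embodiment one.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Synthetic training set: rainy-like reflectivity distribution vectors
# pile mass into the low-reflectivity bins, sunny-like vectors spread it
# out. Targets are illustrative (category element, level element) pairs.
X, Y = [], []
for _ in range(32):
    rainy = rng.dirichlet(np.full(256, 0.05))
    rainy[:8] += 0.5                       # mass near reflectivity 0
    X.append(rainy / rainy.sum()); Y.append([1.0, 0.8])
    sunny = rng.dirichlet(np.full(256, 1.0))
    X.append(sunny); Y.append([0.0, 0.1])
X, Y = np.array(X), np.array(Y)

# Single dense layer + sigmoid trained with MSE loss and plain gradient
# descent, stopping when the loss (the "training error") falls below a
# preset error or the iteration budget runs out.
W, b = np.zeros((256, 2)), np.zeros(2)
loss_history = []
for step in range(2000):
    P = sigmoid(X @ W + b)
    loss = float(np.mean((P - Y) ** 2))
    loss_history.append(loss)
    if loss < 1e-3:                        # convergence condition
        break
    grad = 2.0 * (P - Y) * P * (1.0 - P) / len(X)   # dLoss/dLogits
    W -= 5.0 * (X.T @ grad)
    b -= 5.0 * grad.sum(axis=0)
```

Because the two synthetic classes are separable by their low-bin mass, the loss drops steadily, mirroring the convergence check described above.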
S220, receiving the point cloud data of the current frame, and determining at least one clustering point cloud corresponding to the point cloud data of the current frame.
S230, respectively determining reflectivity distribution vectors corresponding to the clustering point clouds.
S240, processing each reflectivity distribution vector based on a weather state detection model obtained through training in advance to obtain weather state vectors corresponding to each cluster point cloud.
S250, determining weather states corresponding to the point cloud data of the current frame based on the weather state vectors.
After the trained weather state detection model is obtained, the reflectivity distribution vector corresponding to each cluster point cloud corresponding to the current frame point cloud data can be processed based on the weather state detection model. Further, weather state vectors corresponding to the clustering point clouds can be obtained. Further, a weather state corresponding to the current frame point cloud data may be determined based on each weather state vector. The specific process of determining the weather state corresponding to the current frame point cloud data may be described in steps S110 to S140.
According to the technical scheme of this embodiment, at least one clustered point cloud corresponding to the current frame point cloud data is determined, and the reflectivity distribution vector corresponding to each clustered point cloud is determined; each reflectivity distribution vector is processed based on a pre-trained weather state detection model to obtain the weather state vector corresponding to each clustered point cloud; and the weather state corresponding to the current frame point cloud data is determined based on the weather state vectors. This solves the problem in the related art that the vehicle end may fail to receive the weather state in time, or at all, and the vehicle may consequently suffer a safety accident; the vehicle can accurately detect the weather state of its environment without depending on the cloud and with reduced communication cost, thereby improving its response rate and driving safety.
Example III
Fig. 5 is a flowchart of a weather condition detection method according to a third embodiment of the present invention, and this embodiment is an alternative embodiment of the foregoing embodiments. As shown in fig. 5, the method according to the embodiment of the present invention may include the following steps:
1. acquiring current frame point cloud data;
2. removing ground point clouds in the point cloud data of the current frame to obtain point cloud data to be processed;
3. extracting non-ground point clouds of a preset area in point cloud data to be processed to obtain target point cloud data;
4. projecting the target point cloud data onto the BEV view to obtain an image to be processed;
5. clustering the images to be processed to obtain image data corresponding to at least one clustering area, and obtaining image point cloud association information;
6. determining at least one clustering point cloud corresponding to the point cloud data of the current frame according to the image data corresponding to the at least one clustering region and the image point cloud association information;
7. determining reflectivity distribution vectors corresponding to the clustering point clouds;
8. processing each reflectivity distribution vector based on the weather state detection model to obtain weather state vectors corresponding to each cluster point cloud;
9. determining the weather state corresponding to the current frame point cloud data based on each weather state vector.
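Steps 4 and 5 of the flow above can be sketched as follows: non-ground points are projected onto a BEV grid while recording the image-point-cloud association information, i.e. which 3-D point indices fall in each occupied pixel. The grid resolution, extent and binary occupancy encoding are assumptions for illustration; clustering of the occupied pixels (e.g. by connected components) would follow on the returned image.

```python
import numpy as np

def bev_project(points, cell=0.5, extent=20.0):
    """Project (x, y, z) points onto a BEV occupancy image and return the
    image together with the image-point-cloud association mapping each
    occupied pixel to the indices of the spatial points it contains."""
    n = int(2 * extent / cell)
    assoc = {}
    for idx, (x, y, _z) in enumerate(points):
        if -extent <= x < extent and -extent <= y < extent:
            px = int((x + extent) / cell)
            py = int((y + extent) / cell)
            assoc.setdefault((px, py), []).append(idx)
    image = np.zeros((n, n), dtype=np.uint8)
    for (px, py) in assoc:
        image[py, px] = 1
    return image, assoc

# Two nearby points share a pixel; the third lands elsewhere.
pts = [(0.1, 0.1, 1.0), (0.2, 0.2, 1.2), (5.0, 5.0, 0.8)]
image, assoc = bev_project(pts)
```

The association dictionary is what step 6 uses to map each clustered image region back to its clustered point cloud.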
According to the technical scheme of this embodiment, at least one clustered point cloud corresponding to the current frame point cloud data is determined, and the reflectivity distribution vector corresponding to each clustered point cloud is determined; each reflectivity distribution vector is processed based on a pre-trained weather state detection model to obtain the weather state vector corresponding to each clustered point cloud; and the weather state corresponding to the current frame point cloud data is determined based on the weather state vectors. This solves the problem in the related art that the vehicle end may fail to receive the weather state in time, or at all, and the vehicle may consequently suffer a safety accident; the vehicle can accurately detect the weather state of its environment without depending on the cloud and with reduced communication cost, thereby improving its response rate and driving safety.
Example IV
Fig. 6 is a schematic structural diagram of a weather condition detection device according to a fourth embodiment of the present invention. As shown in fig. 6, the apparatus includes: the point cloud data receiving module 310, the reflectivity distribution vector determination module 320, the weather state vector determination module 330, and the weather state determination module 340.
The point cloud data receiving module 310 is configured to receive current frame point cloud data and determine at least one clustered point cloud corresponding to the current frame point cloud data; the reflectivity distribution vector determining module 320 is configured to determine the reflectivity distribution vector corresponding to each clustered point cloud; the weather state vector determining module 330 is configured to process each reflectivity distribution vector based on a pre-trained weather state detection model to obtain the weather state vector corresponding to each clustered point cloud; and the weather state determining module 340 is configured to determine, based on each weather state vector, the weather state corresponding to the current frame point cloud data, where the weather state includes a weather category and/or a weather level.
According to the technical scheme of this embodiment, at least one clustered point cloud corresponding to the current frame point cloud data is determined, and the reflectivity distribution vector corresponding to each clustered point cloud is determined; each reflectivity distribution vector is processed based on a pre-trained weather state detection model to obtain the weather state vector corresponding to each clustered point cloud; and the weather state corresponding to the current frame point cloud data is determined based on the weather state vectors. This solves the problem in the related art that the vehicle end may fail to receive the weather state in time, or at all, and the vehicle may consequently suffer a safety accident; the vehicle can accurately detect the weather state of its environment without depending on the cloud and with reduced communication cost, thereby improving its response rate and driving safety.
Optionally, the point cloud data receiving module 310 includes: the device comprises a ground point cloud eliminating unit, a point cloud projection unit, an image clustering unit and a clustered point cloud determining unit.
The ground point cloud eliminating unit is used for carrying out ground point cloud eliminating processing on the point cloud data of the current frame to obtain point cloud data to be processed, and carrying out preset area point cloud extraction processing on the point cloud data to be processed to obtain target point cloud data;
The point cloud projection unit is used for carrying out projection processing on the target point cloud data according to a preset view angle to obtain an image to be processed and image point cloud association information, wherein the image point cloud association information is used for indicating an association relationship between a pixel point in the image to be processed and a space point in the target point cloud data;
the image clustering unit is used for processing the image to be processed based on a preset clustering algorithm to obtain image data corresponding to at least one clustering area included in the image to be processed;
and the cluster point cloud determining unit is used for determining at least one cluster point cloud corresponding to the current frame point cloud data according to the image data corresponding to the at least one cluster region and the image point cloud association information.
Optionally, the reflectivity distribution vector determination module 320 includes: a reflectivity acquisition unit, a reflectivity frequency determination unit, and a reflectivity distribution vector determination unit.
The reflectivity acquisition unit is used for acquiring the reflectivity of each space point in the current cluster point cloud for each cluster point cloud, and carrying out normalization processing on each reflectivity to obtain a target reflectivity;
a reflectivity frequency determining unit, configured to determine, for each target reflectivity, a number of spatial points for which a target reflectivity is a current target reflectivity based on the current cluster point cloud, determine a total number of spatial points for the current cluster point cloud, and determine a reflectivity frequency corresponding to the current target reflectivity based on the number of spatial points and the total number of spatial points;
And the reflectivity distribution vector determining unit is used for determining a reflectivity distribution vector corresponding to the current clustering point cloud based on each reflectivity frequency.
Optionally, the reflectivity frequency determining unit is specifically configured to determine a ratio between the number of spatial points and the total number of spatial points, so as to serve as a reflectivity frequency corresponding to the current target reflectivity.
Optionally, the weather state determination module 340 includes: a weather state first determining unit and a weather state second determining unit.
A weather state first determining unit, configured to, when the clustered point cloud is one, take a weather state vector corresponding to the clustered point cloud as a weather state vector corresponding to the current frame point cloud data, and determine the weather state based on the weather state vector;
and the weather state second determining unit is used for determining weather states corresponding to the clustered point clouds based on weather state vectors corresponding to the clustered point clouds when the clustered point clouds are multiple, and determining weather states corresponding to the current frame point cloud data based on the weather states.
Optionally, the weather state first determining unit includes: the weather type determining unit comprises a weather type first determining subunit, a weather type second determining subunit, a weather level first determining subunit, a weather level second determining subunit and a weather level third determining subunit.
A weather category first determining subunit, configured to, if a value corresponding to the weather category element is smaller than a first preset threshold, determine that the weather category is a first category;
a weather category second determining subunit, configured to, if the value corresponding to the weather category element is not less than the first preset threshold, determine that the weather category is a second category; and,
a weather grade first determining subunit, configured to, if a value corresponding to the weather grade element is smaller than a second preset threshold, set the weather grade to be a first grade;
a weather grade second determining subunit, configured to, if a value corresponding to the weather grade element is greater than the second preset threshold and less than a third preset threshold, determine that the weather grade is a second grade;
and the weather grade third determining subunit is configured to, if the value corresponding to the weather grade element is greater than the third preset threshold, determine that the weather grade is a third grade.
Optionally, the apparatus further includes: the weather state acquisition module and the weather state update module.
The weather state acquisition module is used for acquiring weather states corresponding to the historical frame point cloud data if a preset number of the historical frame point cloud data exist before the current frame point cloud data;
And the weather state updating module is used for updating the weather state corresponding to the current frame point cloud data based on the weather state corresponding to each history frame point cloud data.
Optionally, the apparatus further includes: and a model training module.
The model training module is used for training to obtain the weather state detection model;
optionally, the model training module includes: the model parameter correction device comprises a sample acquisition unit, a predicted weather state vector determination unit and a model parameter correction unit.
The system comprises a sample acquisition unit, a storage unit and a storage unit, wherein the sample acquisition unit is used for acquiring a plurality of training samples, wherein the training samples comprise reflectivity distribution vectors of at least one clustering point cloud corresponding to sample point cloud data and actual weather state vectors corresponding to the clustering point clouds;
the predicted weather state vector determining unit is used for inputting the reflectivity distribution vector of at least one clustered point cloud corresponding to sample frame point cloud data in the current training sample into the weather state detection model for weather state detection and outputting the predicted weather state vector corresponding to each clustered point cloud;
and the model parameter correction unit is used for determining a loss value based on each predicted weather state vector and the actual weather state vector corresponding to the corresponding cluster point cloud in the current training sample, and correcting the model parameters in the weather state detection model based on the loss value until a loss function in the weather state detection model converges.
The weather state detection device provided by the embodiment of the invention can execute the weather state detection method provided by any embodiment of the invention, and has the corresponding functional modules and beneficial effects of the execution method.
Embodiment Five
Fig. 7 shows a schematic structural diagram of an electronic device 10 that may be used to implement an embodiment of the invention. Electronic devices are intended to represent various forms of digital computers, such as laptop computers, desktop computers, workstations, personal digital assistants, servers, blade servers, mainframe computers, and other appropriate computers. Electronic devices may also represent various forms of mobile devices, such as personal digital assistants, cellular telephones, smartphones, wearable devices (e.g., helmets, glasses, watches, etc.), and other similar computing devices. The components shown herein, their connections and relationships, and their functions are meant to be exemplary only, and are not meant to limit implementations of the invention described and/or claimed herein.
As shown in fig. 7, the electronic device 10 includes at least one processor 11 and a memory communicatively connected to the at least one processor 11, such as a Read Only Memory (ROM) 12 and a Random Access Memory (RAM) 13. The memory stores a computer program executable by the at least one processor, and the processor 11 may perform various appropriate actions and processes according to the computer program stored in the ROM 12 or the computer program loaded from the storage unit 18 into the RAM 13. The RAM 13 may also store various programs and data required for the operation of the electronic device 10. The processor 11, the ROM 12, and the RAM 13 are connected to each other via a bus 14. An input/output (I/O) interface 15 is also connected to the bus 14.
Various components in the electronic device 10 are connected to the I/O interface 15, including: an input unit 16 such as a keyboard, a mouse, etc.; an output unit 17 such as various types of displays, speakers, and the like; a storage unit 18 such as a magnetic disk, an optical disk, or the like; and a communication unit 19 such as a network card, modem, wireless communication transceiver, etc. The communication unit 19 allows the electronic device 10 to exchange information/data with other devices via a computer network, such as the internet, and/or various telecommunication networks.
The processor 11 may be any of a variety of general-purpose and/or special-purpose processing components having processing and computing capabilities. Some examples of the processor 11 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various dedicated Artificial Intelligence (AI) computing chips, various processors running machine learning model algorithms, Digital Signal Processors (DSPs), and any suitable processor, controller, or microcontroller. The processor 11 performs the various methods and processes described above, such as the weather state detection method.
In some embodiments, the weather state detection method may be implemented as a computer program tangibly embodied on a computer-readable storage medium, such as the storage unit 18. In some embodiments, part or all of the computer program may be loaded and/or installed onto the electronic device 10 via the ROM 12 and/or the communication unit 19. When the computer program is loaded into the RAM 13 and executed by the processor 11, one or more steps of the weather state detection method described above may be performed. Alternatively, in other embodiments, the processor 11 may be configured to perform the weather state detection method in any other suitable way (e.g., by means of firmware).
Various implementations of the systems and techniques described above may be implemented in digital electronic circuitry, integrated circuit systems, Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), Systems on Chip (SOCs), Complex Programmable Logic Devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: being implemented in one or more computer programs, where the one or more computer programs may be executed and/or interpreted on a programmable system including at least one programmable processor, which may be a special-purpose or general-purpose programmable processor that may receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
A computer program for carrying out methods of the present invention may be written in any combination of one or more programming languages. These computer programs may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the computer programs, when executed by the processor, cause the functions/acts specified in the flowchart and/or block diagram block or blocks to be implemented. The computer program may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of the present invention, a computer-readable storage medium may be a tangible medium that can contain, or store a computer program for use by or in connection with an instruction execution system, apparatus, or device. The computer readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. Alternatively, the computer readable storage medium may be a machine readable signal medium. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on an electronic device having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) through which a user can provide input to the electronic device. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: Local Area Networks (LANs), Wide Area Networks (WANs), blockchain networks, and the internet.
The computing system may include clients and servers. A client and a server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server, also called a cloud computing server or a cloud host, which is a host product in a cloud computing service system and overcomes the defects of difficult management and weak service scalability in traditional physical hosts and VPS (Virtual Private Server) services.
It should be appreciated that steps may be reordered, added, or deleted using the various forms of flows shown above. For example, the steps described in the present invention may be performed in parallel, sequentially, or in a different order, so long as the desired results of the technical solution of the present invention are achieved; no limitation is imposed herein.
The above embodiments do not limit the scope of the present invention. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present invention should be included in the scope of the present invention.

Claims (7)

1. A weather state detection method, comprising:
receiving current frame point cloud data, and determining at least one clustering point cloud corresponding to the current frame point cloud data;
respectively determining reflectivity distribution vectors corresponding to the clustering point clouds;
processing each reflectivity distribution vector based on a weather state detection model obtained through pre-training to obtain a weather state vector corresponding to each clustering point cloud;
determining a weather state corresponding to the current frame point cloud data based on each of the weather state vectors, wherein the weather state comprises a weather category and/or a weather level;
the determining the reflectivity distribution vector corresponding to each cluster point cloud respectively comprises the following steps:
for each cluster point cloud, acquiring the reflectivity of each space point in the current cluster point cloud, and carrying out normalization processing on each reflectivity to obtain a target reflectivity;
for each target reflectivity, determining, based on the current clustering point cloud, the number of space points whose target reflectivity equals the current target reflectivity, determining the total number of space points of the current clustering point cloud, and determining the reflectivity frequency corresponding to the current target reflectivity based on the number of space points and the total number of space points;
determining a reflectivity distribution vector corresponding to the current cluster point cloud based on each reflectivity frequency;
wherein the determining a reflectivity frequency corresponding to the current target reflectivity based on the number of spatial points and the total number of spatial points includes:
determining the ratio of the number of the space points to the total number of the space points to serve as the reflectivity frequency corresponding to the current target reflectivity;
The weather state detection method further comprises the following steps:
training to obtain the weather state detection model;
the training obtains the weather state detection model, comprising:
obtaining a plurality of training samples, wherein the training samples comprise reflectivity distribution vectors of at least one clustering point cloud corresponding to sample frame point cloud data and actual weather state vectors corresponding to the clustering point clouds;
for each training sample, inputting the reflectivity distribution vector of at least one clustering point cloud corresponding to sample frame point cloud data in the current training sample into the weather state detection model for weather state detection, and outputting the predicted weather state vector corresponding to each clustering point cloud;
and determining a loss value based on each predicted weather state vector and an actual weather state vector corresponding to a corresponding cluster point cloud in the current training sample, and correcting model parameters in the weather state detection model based on the loss value until a loss function in the weather state detection model converges.
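The reflectivity distribution vector construction in claim 1 — normalize each point's reflectivity to a "target reflectivity", count the points at each target reflectivity value, and divide by the total point count — can be sketched as below. Min-max normalization and a fixed histogram binning of the normalized values are illustrative assumptions; the claim does not fix either choice.

```python
import numpy as np

def reflectivity_distribution_vector(reflectivities, n_bins=16):
    """Build the per-cluster reflectivity distribution vector of claim 1.

    1. normalize each reflectivity to [0, 1] ("target reflectivity");
    2. for each target reflectivity value (here: a histogram bin), count
       the space points having it and divide by the total number of
       space points ("reflectivity frequency");
    3. stack the frequencies into the distribution vector.
    """
    r = np.asarray(reflectivities, dtype=float)
    span = r.max() - r.min()
    target = (r - r.min()) / span if span > 0 else np.zeros_like(r)
    counts, _ = np.histogram(target, bins=n_bins, range=(0.0, 1.0))
    return counts / len(r)  # frequencies sum to 1
```

Because each frequency is the ratio of a per-bin point count to the total point count, the resulting vector always sums to 1 regardless of cluster size.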
2. The method according to claim 1, wherein determining at least one cluster point cloud corresponding to the current frame point cloud data comprises:
performing ground point cloud elimination processing on the point cloud data of the current frame to obtain point cloud data to be processed, and performing preset area point cloud extraction processing on the point cloud data to be processed to obtain target point cloud data;
carrying out projection processing on the target point cloud data according to a preset view angle to obtain an image to be processed and image point cloud association information, wherein the image point cloud association information is used for indicating an association relationship between a pixel point in the image to be processed and a space point in the target point cloud data;
processing the image to be processed based on a preset clustering algorithm to obtain image data corresponding to at least one clustering area included in the image to be processed;
and determining at least one clustering point cloud corresponding to the current frame point cloud data according to the image data corresponding to the at least one clustering region and the image point cloud association information.
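The pipeline of claim 2 — ground removal, preset-area extraction, projection, clustering over the projected image, and mapping cluster regions back to space points via the image-point-cloud association — might be sketched as follows. The height-threshold ground filter, top-view grid projection, and 4-connected flood fill stand in for the unspecified "preset clustering algorithm" and are assumptions.

```python
import numpy as np

def cluster_point_cloud(points, ground_z=0.2, roi=((-20, 20), (-20, 20)), cell=0.5):
    """Sketch of claim 2. Returns a list of index arrays, one per
    clustering point cloud; the per-cell point indices play the role of
    the image-point-cloud association information."""
    pts = np.asarray(points, dtype=float)
    keep = pts[:, 2] > ground_z                      # ground point elimination
    (x0, x1), (y0, y1) = roi
    keep &= (pts[:, 0] >= x0) & (pts[:, 0] < x1)     # preset-area extraction
    keep &= (pts[:, 1] >= y0) & (pts[:, 1] < y1)
    # project to a top-view grid ("image to be processed"), remembering
    # which space points fall in which pixel (association information)
    cells = {}
    for i in np.flatnonzero(keep):
        key = (int((pts[i, 0] - x0) // cell), int((pts[i, 1] - y0) // cell))
        cells.setdefault(key, []).append(i)
    # flood-fill connected occupied cells into clustering regions
    clusters, seen = [], set()
    for start in cells:
        if start in seen:
            continue
        stack, members = [start], []
        seen.add(start)
        while stack:
            c = stack.pop()
            members.extend(cells[c])
            for d in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                n = (c[0] + d[0], c[1] + d[1])
                if n in cells and n not in seen:
                    seen.add(n)
                    stack.append(n)
        clusters.append(np.array(members))
    return clusters
```

Two well-separated groups of above-ground points come back as two clusters, while ground-height points are filtered out before projection.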
3. The weather state detection method as claimed in claim 1, wherein the determining a weather state corresponding to the current frame point cloud data based on each of the weather state vectors comprises:
under the condition that the clustering point cloud is one, taking a weather state vector corresponding to the clustering point cloud as a weather state vector corresponding to the current frame point cloud data, and determining the weather state based on the weather state vector;
and when the number of the clustering point clouds is multiple, determining the weather state corresponding to each clustering point cloud based on the weather state vector corresponding to that clustering point cloud, and determining the weather state corresponding to the current frame point cloud data based on the respective weather states.
4. The weather state detection method as claimed in claim 3, wherein the weather state vector comprises a weather category element and/or a weather level element, the determining the weather state based on the weather state vector comprising:
if the value corresponding to the weather category element is smaller than a first preset threshold value, the weather category is a first category;
if the value corresponding to the weather category element is not smaller than the first preset threshold, the weather category is a second category; the method comprises the steps of,
if the value corresponding to the weather grade element is smaller than a second preset threshold value, the weather grade is a first grade;
if the value corresponding to the weather grade element is larger than the second preset threshold value and smaller than a third preset threshold value, the weather grade is a second grade;
and if the value corresponding to the weather grade element is larger than the third preset threshold value, the weather grade is a third grade.
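The threshold decoding of a weather state vector in claim 4 can be illustrated as follows. The threshold values and the human-readable category/level labels ("clear", "rain/snow", "light", etc.) are assumptions, since the claim leaves the categories and grades abstract.

```python
def decode_weather_state(vec, t1=0.5, t2=0.3, t3=0.7):
    """Decode (category_element, level_element) per claim 4.

    Thresholds t1/t2/t3 and the labels are illustrative assumptions.
    """
    # first category if below the first preset threshold, else second
    category = "clear" if vec[0] < t1 else "rain/snow"
    if vec[1] < t2:          # below second preset threshold: first grade
        level = "light"
    elif vec[1] < t3:        # between second and third: second grade
        level = "moderate"
    else:                    # above third preset threshold: third grade
        level = "heavy"
    return category, level
```

Note that the claim's wording ("larger than the second preset threshold and smaller than a third") leaves the exact boundary values unassigned; the sketch arbitrarily folds each boundary into the higher grade.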
5. The weather condition detection method as claimed in claim 1, further comprising:
if a preset number of historical frame point cloud data exist before the current frame point cloud data, acquiring weather states corresponding to the historical frame point cloud data;
and updating the weather state corresponding to the current frame point cloud data based on the weather state corresponding to each history frame point cloud data.
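One plausible realization of the history-based update in claim 5 is a majority vote over the current frame's weather state and the states of the preceding frames; the voting rule itself is an assumption, as the claim does not fix the update function.

```python
from collections import Counter

def update_weather_state(current_state, history_states):
    """Smooth the current frame's weather state with the preceding
    frames (claim 5) via a simple majority vote - an illustrative
    assumption, not the patent's mandated rule."""
    votes = Counter(history_states)
    votes[current_state] += 1          # current frame also gets a vote
    return votes.most_common(1)[0][0]
```

This suppresses single-frame misclassifications: one anomalous frame cannot flip the reported state against a consistent recent history.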
6. A weather condition detection apparatus, comprising:
the point cloud data receiving module is used for receiving the point cloud data of the current frame and determining at least one clustering point cloud corresponding to the point cloud data of the current frame;
the reflectivity distribution vector determining module is used for respectively determining reflectivity distribution vectors corresponding to the clustering point clouds;
the weather state vector determining module is used for processing each reflectivity distribution vector based on a weather state detection model obtained through training in advance to obtain weather state vectors corresponding to each clustering point cloud;
a weather state determining module, configured to determine a weather state corresponding to the current frame point cloud data based on each weather state vector, where the weather state includes a weather category and/or a weather level;
wherein the reflectivity distribution vector determination module includes: a reflectivity acquisition unit, a reflectivity frequency determination unit, and a reflectivity distribution vector determination unit;
the reflectivity acquisition unit is used for acquiring the reflectivity of each space point in the current cluster point cloud for each cluster point cloud, and carrying out normalization processing on each reflectivity to obtain a target reflectivity;
the reflectivity frequency determining unit is used for determining, for each target reflectivity, the number of space points whose target reflectivity equals the current target reflectivity based on the current clustering point cloud, determining the total number of space points of the current clustering point cloud, and determining the reflectivity frequency corresponding to the current target reflectivity based on the number of space points and the total number of space points;
the reflectivity distribution vector determining unit is used for determining a reflectivity distribution vector corresponding to the current clustering point cloud based on each reflectivity frequency;
the reflectivity frequency determining unit is specifically configured to determine a ratio between the number of spatial points and the total number of spatial points, so as to serve as a reflectivity frequency corresponding to the current target reflectivity;
wherein the weather state detection apparatus further includes: a model training module;
The model training module is used for training to obtain the weather state detection model;
the model training module comprises: the system comprises a sample acquisition unit, a predicted weather state vector determination unit and a model parameter correction unit;
the sample acquisition unit is used for acquiring a plurality of training samples, wherein the training samples comprise reflectivity distribution vectors of at least one clustering point cloud corresponding to sample frame point cloud data and actual weather state vectors corresponding to the clustering point clouds;
the predicted weather state vector determining unit is used for inputting, for each training sample, the reflectivity distribution vector of at least one clustering point cloud corresponding to sample frame point cloud data in the current training sample into the weather state detection model for weather state detection, and outputting the predicted weather state vector corresponding to each clustering point cloud;
the model parameter correction unit is configured to determine a loss value based on each of the predicted weather state vectors and actual weather state vectors corresponding to the corresponding cluster point cloud in the current training sample, and correct model parameters in the weather state detection model based on the loss value until a loss function in the weather state detection model converges.
7. An electronic device, the electronic device comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the weather condition detection method of any of claims 1-5.
CN202410115223.6A 2024-01-29 2024-01-29 Weather state detection method and device, electronic equipment and storage medium Active CN117647852B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410115223.6A CN117647852B (en) 2024-01-29 2024-01-29 Weather state detection method and device, electronic equipment and storage medium


Publications (2)

Publication Number Publication Date
CN117647852A CN117647852A (en) 2024-03-05
CN117647852B true CN117647852B (en) 2024-04-09

Family

ID=90043644


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113031010A (en) * 2021-03-31 2021-06-25 小马易行科技(上海)有限公司 Method and device for detecting weather, computer readable storage medium and processor
CN115755097A (en) * 2022-11-08 2023-03-07 北京京东乾石科技有限公司 Weather condition detection method, device, equipment and storage medium
CN115991195A (en) * 2023-02-21 2023-04-21 九识(苏州)智能科技有限公司 Automatic detection and compensation method, device and system for wheel slip in automatic driving
CN116184438A (en) * 2023-02-08 2023-05-30 山东正中信息技术股份有限公司 Data processing method for identifying bad weather based on laser radar
CN116660934A (en) * 2022-12-05 2023-08-29 宇通客车股份有限公司 Vehicle, rain/snow weather identification method, vehicle control method and system

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
US20230206653A1 (en) * 2021-12-26 2023-06-29 Hyundai Mobis Co., Ltd. Method and device for classifying end-to-end weather and road conditions in real time by using lidar




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant