CN117962774A - Weather state prediction method and device, electronic equipment and storage medium - Google Patents

Weather state prediction method and device, electronic equipment and storage medium

Info

Publication number
CN117962774A
Authority
CN
China
Prior art keywords
point cloud
weather state
weather
cloud data
vector
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202410122350.9A
Other languages
Chinese (zh)
Inventor
毛威
曹亮
韦松
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jika Intelligent Robot Co ltd
Original Assignee
Jika Intelligent Robot Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jika Intelligent Robot Co ltd filed Critical Jika Intelligent Robot Co ltd
Priority to CN202410122350.9A priority Critical patent/CN117962774A/en
Publication of CN117962774A publication Critical patent/CN117962774A/en
Pending legal-status Critical Current

Classifications

    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02ATECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/10Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation


Abstract

The invention discloses a weather state prediction method and device, electronic equipment, and a storage medium. The method comprises the following steps: acquiring a reflectivity distribution vector set corresponding to the current frame of point cloud data and reflectivity distribution vector sets corresponding to a plurality of historical frames of point cloud data; processing each reflectivity distribution vector set based on a pre-trained weather state prediction model to determine a weather state vector corresponding to the prediction time; and determining the weather state corresponding to the prediction time based on the weather state vector. With this technical scheme, the vehicle end can accurately predict the weather state at a future time without depending on the cloud end and while reducing communication cost, thereby improving the driving safety of the vehicle in extreme weather conditions.

Description

Weather state prediction method and device, electronic equipment and storage medium
Technical Field
The present invention relates to the field of automatic driving technologies, and in particular, to a weather state prediction method, a weather state prediction device, an electronic device, and a storage medium.
Background
In urban or highway scenes, under extreme weather conditions such as rain, snow, or fog, road conditions deteriorate: the road surface becomes wet and slippery, and visibility drops relative to sunny days, so a human driver usually slows down to ensure the vehicle travels safely. An unmanned vehicle, however, cannot directly perceive the weather conditions during driving and therefore cannot reduce its speed in time, which increases the risk of driving accidents. Perceiving the weather state is thus of great importance for improving the driving safety of an autonomous vehicle.
At present, weather state prediction is usually performed on a cloud server: historical weather states are analyzed with a manual statistical algorithm to obtain the weather state, which is then transmitted to the vehicle end.
However, this whole prediction process is cumbersome and time-consuming, which greatly limits the algorithm iteration speed. In addition, manual analysis often introduces bias, so the accuracy of the predicted weather state may be low. Moreover, under poor communication conditions, the vehicle may fail to receive the weather state transmitted by the cloud server, which can lead to safety accidents.
Disclosure of Invention
The invention provides a weather state prediction method and device, electronic equipment, and a storage medium, which enable the vehicle end to accurately predict the weather state at a future time without depending on the cloud end and while reducing communication cost, thereby improving the driving safety of the vehicle in extreme weather conditions.
According to an aspect of the present invention, there is provided a weather state prediction method, the method comprising:
Acquiring a reflectivity distribution vector set corresponding to the point cloud data of the current frame and reflectivity distribution vector sets corresponding to the point cloud data of a plurality of historical frames, wherein the reflectivity distribution vector sets comprise reflectivity distribution vectors corresponding to at least one clustering point cloud contained in the point cloud data of the corresponding frame;
Processing each reflectivity distribution vector set based on a weather state prediction model obtained through pre-training, and determining a weather state vector corresponding to a prediction time, wherein the weather state prediction model comprises at least one module, the at least one module comprises a vector fusion module and a weather prediction module, and the prediction time comprises at least one future time after the point cloud scanning time corresponding to the current frame of point cloud data;
And determining a weather state corresponding to the predicted time based on the weather state vector, wherein the weather state comprises a weather category and/or a weather level.
According to another aspect of the present invention, there is provided a weather state prediction apparatus, the apparatus comprising:
a set acquisition module, configured to acquire a reflectivity distribution vector set corresponding to the current frame of point cloud data and reflectivity distribution vector sets corresponding to a plurality of historical frames of point cloud data, wherein each reflectivity distribution vector set comprises reflectivity distribution vectors corresponding to at least one clustered point cloud contained in the corresponding frame of point cloud data;
The weather state vector determining module is used for processing each reflectivity distribution vector set based on a weather state prediction model obtained through pre-training, to determine a weather state vector corresponding to a prediction time, wherein the weather state prediction model comprises at least one module, the at least one module comprises a vector fusion module and a weather prediction module, and the prediction time comprises at least one future time after the point cloud scanning time corresponding to the current frame of point cloud data;
And the weather state determining module is used for determining a weather state corresponding to the predicted time based on the weather state vector, wherein the weather state comprises a weather category and/or a weather level.
According to another aspect of the present invention, there is provided an electronic apparatus including:
At least one processor; and
A memory communicatively coupled to the at least one processor; wherein,
The memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the weather state prediction method according to any one of the embodiments of the present invention.
According to another aspect of the present invention, there is provided a computer readable storage medium storing computer instructions for causing a processor to execute the weather state prediction method according to any embodiment of the present invention.
According to the technical scheme, the reflectivity distribution vector set corresponding to the current frame of point cloud data and the reflectivity distribution vector sets corresponding to a plurality of historical frames of point cloud data are obtained; each reflectivity distribution vector set is then processed by a weather state prediction model obtained through pre-training to determine the weather state vector corresponding to the prediction time; finally, the weather state corresponding to the prediction time is determined based on the weather state vector. This solves the problems in the related art that weather state prediction can only be carried out at the cloud end, that the prediction process is cumbersome, and that prediction accuracy is low. It achieves the effect that the vehicle end can accurately predict the weather state at a future time without depending on the cloud end and while reducing communication cost, thereby improving the driving safety of the vehicle in extreme weather conditions.
It should be understood that the description in this section is not intended to identify key or critical features of the embodiments of the invention or to delineate the scope of the invention. Other features of the present invention will become apparent from the description that follows.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required for the description of the embodiments will be briefly described below, and it is apparent that the drawings in the following description are only some embodiments of the present invention, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of a weather state prediction method according to a first embodiment of the present invention;
FIG. 2 is a flow chart of a weather state prediction method according to a second embodiment of the present invention;
FIG. 3 is a flow chart of a weather state prediction method according to a third embodiment of the present invention;
fig. 4 is a schematic structural diagram of a weather state prediction apparatus according to a fourth embodiment of the present invention;
fig. 5 is a schematic structural diagram of an electronic device implementing a weather state prediction method according to an embodiment of the present invention.
Detailed Description
In order that those skilled in the art will better understand the present invention, a technical solution in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in which it is apparent that the described embodiments are only some embodiments of the present invention, not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the present invention without making any inventive effort, shall fall within the scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present invention and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the invention described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Example 1
Fig. 1 is a flowchart of a weather state prediction method according to a first embodiment of the present invention, where the present embodiment is applicable to a case of predicting a weather state of a future time in an environment where a vehicle is located, and the method may be performed by a weather state prediction device, which may be implemented in the form of hardware and/or software, and the weather state prediction device may be configured in a terminal and/or a server. As shown in fig. 1, the method includes:
S110, acquiring a reflectivity distribution vector set corresponding to the point cloud data of the current frame and reflectivity distribution vector sets corresponding to the point cloud data of a plurality of historical frames.
The reflectivity distribution vector set comprises reflectivity distribution vectors corresponding to at least one clustering point cloud contained in corresponding frame point cloud data.
In this embodiment, the current frame of point cloud data may be understood as the point cloud data obtained after the vehicle-mounted radar device scans the vehicle's surroundings at the current time, or as the point cloud data received at the current time. It will be appreciated that point cloud data is transmitted from the vehicle radar device to the receiver at a fixed frequency, for example 10 Hz, i.e., one frame every 0.1 s. In practical applications, the vehicle's surroundings may be scanned by the vehicle-mounted radar device while the vehicle is running; the resulting point cloud data representing the surroundings is transmitted to the vehicle-mounted processing platform, and the frame received by the platform at the current time may be taken as the current frame of point cloud data. Relative to the current frame, any point cloud data whose receiving time and/or scanning time precedes it is historical frame point cloud data. It should be noted that the historical frames may be selected, according to a preset screening standard, from the multiple frames between the first frame and the current frame. The preset screening standard may be any preconfigured criterion for selecting historical frames, for example selection at a preset step length, or random selection. In practical applications, a slow vehicle speed and/or an excessively high lidar scanning frequency may make adjacent frames too similar, so the historical frame point cloud data may be a subset selected from the received multi-frame point cloud data.
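As a minimal sketch of the screening step described above, the stride-based variant ("selection according to a preset step length") might look as follows; the function name, buffer layout, and parameter values are illustrative, since the patent leaves the screening standard open:

```python
from collections import deque

def select_history_frames(frame_buffer, num_history, step):
    """Pick up to `num_history` historical frames at a fixed stride,
    walking backwards from the current (last) frame so that
    near-duplicate adjacent frames are skipped."""
    picked = []
    idx = len(frame_buffer) - 1 - step  # first candidate before the current frame
    while idx >= 0 and len(picked) < num_history:
        picked.append(frame_buffer[idx])
        idx -= step
    return list(reversed(picked))  # oldest first

# Frame ids 0..9 in arrival order; frame 9 is the current frame.
buffer = deque(range(10))
print(select_history_frames(buffer, num_history=3, step=2))  # [3, 5, 7]
```

A larger `step` trades temporal coverage for fewer, more dissimilar frames, which matches the motivation given above.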
In this embodiment, the set of reflectance distribution vectors may be understood as a set composed of reflectance distribution vectors corresponding to each cluster point cloud included in the corresponding frame point cloud data. The reflectance distribution vector may be understood as a vector characterizing the reflectance distribution of each spatial point in the point cloud data. The reflectivity distribution vector may describe the reflectivity characteristics of the clustered point cloud quantitatively in its entirety. As will be appreciated by those skilled in the art, for point cloud data, each spatial point may comprise a reflectivity reflecting the signal strength of the laser light emitted by the radar device after encountering an object. The point cloud data scanned by the radar device may include, in addition to the relative spatial position information between each spatial point and the current vehicle, the reflectivity of each spatial point.
In practical application, under different weather conditions, the signal intensity of the laser emitted by the vehicle-mounted radar device after encountering an object is different, that is, under different weather conditions, the reflectivity distribution vector of the same frame of point cloud data may be different, for example, under the conditions of rain, fog and snow, water drops in the air weaken the energy of returned laser radar point clouds, the reflectivity of space points is reduced, and the reflectivity of most space points is at a level close to 0; under the sunny condition, the water content in the air is low, and the attenuation capability on the energy of the laser radar point cloud is low, so that the reflectivity of the space points may not be reduced, and the reflectivity of most space points is at a normal level. Therefore, the weather condition of the environment where the vehicle is located can be predicted based on the reflectivity distribution vector corresponding to the point cloud data. Specifically, current frame point cloud data may be acquired and historical frame point cloud data determined. Further, at least one clustered point cloud corresponding to the current frame point cloud data and at least one clustered point cloud corresponding to the historical frame point cloud data may be determined. Then, the reflectivity distribution vector corresponding to each cluster point cloud can be determined. Further, a reflectivity distribution vector set corresponding to the corresponding frame point cloud data can be constructed according to the reflectivity distribution vector corresponding to each cluster point cloud.
Optionally, obtaining a set of reflectivity distribution vectors corresponding to the point cloud data of the current frame and a set of reflectivity distribution vectors corresponding to the point cloud data of the plurality of historical frames includes: acquiring current frame point cloud data, and determining at least one cluster point cloud corresponding to the current frame point cloud data; determining reflectivity distribution vectors corresponding to the clustered point clouds, and determining a reflectivity distribution vector set corresponding to the point cloud data of the current frame based on the reflectivity distribution vectors; determining a preset number of historical frame point cloud data before the current frame point cloud data; for each historical frame point cloud data, determining at least one clustered point cloud corresponding to the current historical frame point cloud data, determining a reflectivity distribution vector corresponding to each clustered point cloud, and determining a reflectivity distribution vector set corresponding to the current historical frame point cloud data based on each reflectivity distribution vector.
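The optional steps above reduce to a small driver loop: cluster each frame, compute one reflectivity distribution vector per clustered point cloud, and collect a per-frame set. The sketch below assumes pluggable `cluster_fn` and `vector_fn` callables (hypothetical names); the later sections describe what those functions would actually compute:

```python
def acquire_vector_sets(current_frame, history_frames, cluster_fn, vector_fn):
    """For each frame (history frames first, current frame last), return that
    frame's reflectivity distribution vector set: one vector per cluster."""
    vector_sets = []
    for frame in history_frames + [current_frame]:
        clusters = cluster_fn(frame)          # at least one clustered point cloud
        vector_sets.append([vector_fn(c) for c in clusters])
    return vector_sets
```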
In this embodiment, clustering the point cloud may be understood as a spatial point clustering region included in the point cloud data. The clustered point cloud may be any cluster of objects included in the point cloud data. Alternatively, the cluster point cloud may include road boundary clusters and/or vehicle clusters, etc.
In practical application, the current frame point cloud data can be acquired. Furthermore, in order to extract meaningful feature points from the current frame point cloud data, the current frame point cloud data may be clustered, and at least one clustered point cloud corresponding to the current frame point cloud data may be obtained. It should be noted that, there may be various ways of determining the clustered point cloud, for example, processing the point cloud data based on a three-dimensional clustering algorithm or processing a two-dimensional image corresponding to the point cloud data based on a two-dimensional clustering algorithm. In general, the amount of data of the point cloud data is huge, the point cloud data is directly processed, the requirements on computing power and memory are high, and the technical scheme is difficult to deploy in a vehicle-mounted processing platform, so that the point cloud data of the current frame can be converted into a two-dimensional image for the embodiment. Furthermore, the two-dimensional images can be clustered to obtain image data corresponding to at least one clustering area corresponding to the two-dimensional images, and further, the image data corresponding to each clustering area can be converted into three-dimensional point cloud data, so that at least one clustering point cloud corresponding to the point cloud data of the current frame can be obtained.
Optionally, determining at least one cluster point cloud corresponding to the current frame point cloud data includes: performing ground point cloud elimination processing on the point cloud data of the current frame to obtain point cloud data to be processed, and performing preset area point cloud extraction processing on the point cloud data to be processed to obtain target point cloud data; carrying out projection processing on the target point cloud data according to a preset view angle to obtain an image to be processed and image point cloud mapping information; processing the image to be processed based on a preset clustering algorithm to obtain image data corresponding to at least one clustering area included in the image to be processed; and determining at least one clustering point cloud corresponding to the point cloud data of the current frame according to the image data corresponding to the at least one clustering region and the image point cloud mapping information.
In this embodiment, the ground point cloud may be understood as the point cloud data representing the road surface on which the vehicle is traveling. The preset area may be understood as a preset region of interest: any area within the radar scanning region of the vehicle, optionally a cuboid area defined in front of the vehicle. The preset-area point cloud is then the non-ground point cloud within that area. In general, after the ground point cloud is removed, a frame of point cloud data still contains non-traffic-participant categories such as buildings and vegetation, which do not participate in object detection and tracking perception tasks. To reduce the input to subsequent processing steps and improve noise-filtering efficiency, a preset area may be determined, the non-ground point cloud within it extracted, and the extracted points used as the target point cloud data. The preset viewing angle may be understood as a preset projection viewing angle for three-dimensional data. It may be any viewing angle, optionally a bird's eye view (BEV); BEV simplifies complex three-dimensional data into a two-dimensional image, which is particularly important for efficient computation in a real-time system. The image to be processed may be understood as the two-dimensional image corresponding to the three-dimensional point cloud data.
The image point cloud mapping information may be understood as information characterizing the mapping relationship between pixel points in the image to be processed and spatial points in the target point cloud data.
In this embodiment, the preset clustering algorithm may be understood as a preset image clustering algorithm. It may be any algorithm, optionally the connected component labeling algorithm, a classical binary-image clustering algorithm in the field of computer vision. A cluster region may be understood as a cluster of pixel points in the image and may be any object cluster the image contains; optionally, cluster regions may include road boundary clusters, vehicle clusters, and the like. The image data corresponding to a cluster region may be understood as the image region (an image patch, in computer-vision terms) covered by that cluster.
In practical applications, after the current frame of point cloud data is received, principal component analysis (PCA) may be used to compute normal vectors for the point cloud, and points whose normal vectors are approximately vertical are selected as the candidate ground point set. A random sample consensus (RANSAC) plane fit is then applied to the candidate ground point set to obtain the ground point cloud set. Finally, the ground point cloud set may be removed from the current frame of point cloud data to obtain the point cloud data to be processed.
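The RANSAC plane-fitting step can be sketched in a few lines. The version below omits the PCA normal pre-filter for candidate ground points and uses illustrative iteration and distance-threshold values:

```python
import numpy as np

def ransac_ground_plane(points, iters=100, dist_thresh=0.2, seed=0):
    """Fit a ground plane by RANSAC and return a boolean inlier mask over
    `points` (an (N, 3) array). The PCA candidate pre-filter is omitted."""
    rng = np.random.default_rng(seed)
    best_inliers = np.zeros(len(points), dtype=bool)
    for _ in range(iters):
        # Sample three distinct points and form the plane through them.
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(n)
        if norm < 1e-9:
            continue  # degenerate (collinear) sample
        n /= norm
        dist = np.abs((points - p0) @ n)  # point-to-plane distances
        inliers = dist < dist_thresh
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return best_inliers
```

Removing the masked points from the frame yields the point cloud data to be processed.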
Furthermore, after the point cloud data to be processed is obtained, the data volume of the point cloud data is still huge, in order to reduce the number of input point clouds in the subsequent processing step and improve the noise filtering efficiency, a preset area can be determined from a scanning area corresponding to the vehicle-mounted radar equipment, and non-ground point clouds in the area can be extracted. Further, the extracted non-ground point cloud may be used as target point cloud data.
Furthermore, after the target point cloud data is obtained, the point cloud data is three-dimensional data, the point cloud data is directly processed, the requirements on the computing power and the memory of the equipment are high, and the point cloud data is difficult to execute in a vehicle-mounted processing platform. Therefore, the point cloud data can be converted into a two-dimensional image by performing projection processing on the point cloud data. Specifically, the target point cloud data is projected into a preset viewing angle and is discretized into a two-dimensional image. In the two-dimensional image, the pixel value of the pixel occupied by the target point cloud data is set to a first preset pixel value, and the pixel value of the pixel not occupied by the target point cloud data is set to a second preset pixel value. Further, an image to be processed can be obtained. And, the image to be processed and the target point cloud data can be mapped to obtain image point cloud associated information.
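A minimal version of the BEV discretization step is sketched below, with illustrative grid extent and resolution (the patent fixes neither), occupied pixels set to a first preset pixel value (1) and empty pixels to a second (0), and a pixel-to-point-index dictionary kept as the image point cloud association information:

```python
import numpy as np

def project_bev(points, x_range=(0.0, 40.0), y_range=(-10.0, 10.0), res=0.2):
    """Discretize 3-D points into a binary bird's-eye-view image and record
    which point indices fall into each occupied pixel."""
    h = int(round((x_range[1] - x_range[0]) / res))
    w = int(round((y_range[1] - y_range[0]) / res))
    image = np.zeros((h, w), dtype=np.uint8)   # second preset pixel value: 0
    pixel_to_points = {}                       # image-point-cloud mapping info
    for i, (x, y, _z) in enumerate(points):
        r = int((x - x_range[0]) / res)
        c = int((y - y_range[0]) / res)
        if 0 <= r < h and 0 <= c < w:
            image[r, c] = 1                    # first preset pixel value: 1
            pixel_to_points.setdefault((r, c), []).append(i)
    return image, pixel_to_points
```

The mapping returned alongside the image is what later lets each pixel cluster be converted back into a three-dimensional clustered point cloud.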
Further, after the image to be processed is obtained, the image to be processed may be processed by adopting a preset clustering algorithm, and further, a set of pixel points corresponding to at least one cluster included in the image to be processed may be output. Thus, each pixel point set can be used as image data corresponding to a corresponding clustering area. And then, the image to be processed is projected based on the target point cloud data, a certain mapping relation exists between the image to be processed and the target point cloud data, and the image point cloud association information can be determined based on the mapping relation. After obtaining the image data corresponding to the at least one clustering area, determining the point cloud data corresponding to each clustering area by combining the image point cloud association information, and taking the obtained point cloud data as at least one clustering point cloud corresponding to the point cloud data of the current frame.
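The connected-component clustering and the back-projection through the mapping information can be sketched as follows; this uses a plain BFS labeling rather than any particular library implementation, and the helper names are illustrative:

```python
import numpy as np
from collections import deque

def connected_components(image):
    """4-connected component labeling on a binary image (BFS flood fill).
    Returns an integer label image (0 = background) and the cluster count."""
    labels = np.zeros_like(image, dtype=int)
    current = 0
    h, w = image.shape
    for sr in range(h):
        for sc in range(w):
            if image[sr, sc] and not labels[sr, sc]:
                current += 1
                labels[sr, sc] = current
                queue = deque([(sr, sc)])
                while queue:
                    r, c = queue.popleft()
                    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        nr, nc = r + dr, c + dc
                        if 0 <= nr < h and 0 <= nc < w and image[nr, nc] and not labels[nr, nc]:
                            labels[nr, nc] = current
                            queue.append((nr, nc))
    return labels, current

def clusters_to_point_clouds(labels, n, pixel_to_points):
    """Map each pixel cluster back to 3-D point indices via the
    image-point-cloud mapping information."""
    clusters = [[] for _ in range(n)]
    for (r, c), idxs in pixel_to_points.items():
        lab = labels[r, c]
        if lab:
            clusters[lab - 1].extend(idxs)
    return clusters
```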
Further, after determining at least one cluster point cloud corresponding to the current frame point cloud data, a reflectivity distribution vector corresponding to each cluster point cloud can be determined. For each cluster point cloud, the reflectivity of each spatial point in the current cluster point cloud and the total number of spatial points included in the current cluster point cloud may be obtained. Furthermore, the reflectivity distribution vector corresponding to the current cluster point cloud can be determined according to the reflectivities and the total number of the space points.
Optionally, determining the reflectivity distribution vector corresponding to each cluster point cloud includes: for each cluster point cloud, obtaining the reflectivity corresponding to each point in the current cluster point cloud and normalizing each reflectivity to obtain the target reflectivity corresponding to each point; and determining the reflectivity distribution vector corresponding to the current cluster point cloud based on each target reflectivity and the total number of points in the current cluster point cloud.
In general, the reflectivities of the spatial points in the point cloud data may lie in different numeric ranges; therefore, in this embodiment, each reflectivity may be normalized so as to limit it to a preset integer interval, for example the integer interval 0 to 255, and the normalized value may be taken as the target reflectivity. The reflectivity frequency is understood to be a value characterizing how often a given target reflectivity occurs within the cluster point cloud to which it belongs.
In practical application, for each cluster point cloud, the reflectivity of each spatial point in the current cluster point cloud can be obtained based on the current frame point cloud data. Further, the reflectances may be normalized to limit the reflectances to a predetermined integer interval, and a target reflectivity corresponding to each spatial point may be obtained. Further, for each target reflectivity, the number of spatial points whose target reflectivity is the current target reflectivity may be determined from the current cluster point cloud, and the total number of spatial points included in the current cluster point cloud is determined. Further, the reflectivity frequency corresponding to the current target reflectivity may be determined according to the determined number of spatial points and the total number of spatial points.
Optionally, determining the reflectivity frequency corresponding to the current target reflectivity based on the number of spatial points and the total number of spatial points includes: and determining the ratio of the number of the space points to the total number of the space points as the reflectivity frequency corresponding to the current target reflectivity.
In practical applications, after the number of spatial points and the total number of spatial points are obtained, the ratio between them may be determined and taken as the reflectivity frequency corresponding to the current target reflectivity. Illustratively, assume the current cluster point cloud contains 100 spatial points in total, the current target reflectivity is 4, and 10 spatial points in the cluster have target reflectivity 4; the reflectivity frequency for target reflectivity 4 is then 10/100 = 0.1.
Further, after the reflectivity frequencies corresponding to the respective target reflectivities are obtained, they may be stored in a pre-constructed multidimensional vector to obtain the reflectivity distribution vector. For example, the reflectivity frequencies may be stored in a 256-dimensional vector whose 1st to 256th dimensions correspond in order to the normalized integer reflectivities 0 to 255: the 1st dimension holds the reflectivity frequency for target reflectivity 0, the 2nd dimension holds the reflectivity frequency for target reflectivity 1, and so on, yielding a 256-dimensional reflectivity distribution vector.
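The normalization and frequency-counting steps above can be sketched as follows. This is a minimal illustration only: the function name and the min-max normalization rule are assumptions, since the patent only requires mapping reflectivities into the integer interval 0 to 255 and counting the ratio of points per value.

```python
def reflectivity_distribution_vector(reflectivities, bins=256):
    """Map raw reflectivities to target reflectivities in 0..bins-1,
    then store each value's frequency (ratio of points) in a bins-dimensional vector."""
    lo, hi = min(reflectivities), max(reflectivities)
    span = (hi - lo) or 1.0  # avoid division by zero when all reflectivities are equal
    targets = [round((r - lo) / span * (bins - 1)) for r in reflectivities]
    total = len(targets)
    vec = [0.0] * bins  # dimension i holds the frequency of target reflectivity i
    for t in targets:
        vec[t] += 1.0 / total
    return vec
```

For a cluster in which 1 of 4 points maps to target reflectivity 0 and the rest to 255, dimension 0 holds 0.25 and dimension 255 holds 0.75; the frequencies always sum to 1 over the cluster.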
Further, after the reflectivity distribution vectors corresponding to the respective cluster point clouds are obtained, the reflectivity distribution vector set corresponding to the current frame point cloud data can be determined based on these vectors. Optionally, the manner of determining the set of reflectivity distribution vectors may include vector addition, vector concatenation, or the like. In practical applications, vector addition may be performed on the reflectivity distribution vectors and the resulting vector used as the reflectivity distribution vector set; alternatively, the reflectivity distribution vectors may be concatenated and the resulting vector used as the reflectivity distribution vector set.
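The two combination strategies mentioned above can be sketched as one helper (the function name and the `mode` parameter are illustrative assumptions):

```python
def combine_distribution_vectors(vectors, mode="add"):
    """Combine per-cluster reflectivity distribution vectors into one set,
    either by element-wise addition or by concatenation."""
    if mode == "add":
        # element-wise sum across all cluster vectors (all same length)
        return [sum(col) for col in zip(*vectors)]
    if mode == "concat":
        # flatten the vectors end-to-end
        return [x for vec in vectors for x in vec]
    raise ValueError("mode must be 'add' or 'concat'")
```

Addition keeps the set at a fixed dimension regardless of the number of clusters, while concatenation preserves per-cluster detail at the cost of a variable total dimension.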
Further, after the current frame point cloud data is determined, a preset number of point cloud data frames preceding the point cloud scanning time corresponding to the current frame point cloud data may be acquired, taking that scanning time as the reference, and the acquired point cloud data may be used as the historical frame point cloud data. Then, for each historical frame point cloud data, at least one cluster point cloud corresponding to the current historical frame point cloud data can be determined, together with the reflectivity distribution vector corresponding to each cluster point cloud. The reflectivity distribution vector set corresponding to the current historical frame point cloud data can then be determined from these vectors. In this way, the reflectivity distribution vector set corresponding to each historical frame point cloud data can be obtained.
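One way to keep the current frame together with the preceding preset number of frames is a fixed-length sliding window ordered by scanning time. The class name and layout below are assumptions for illustration:

```python
from collections import deque

class PointCloudFrameWindow:
    """Keep the current frame plus the preceding n_history frames, in scan order."""

    def __init__(self, n_history):
        # capacity = history frames + the current frame
        self.frames = deque(maxlen=n_history + 1)

    def push(self, frame):
        self.frames.append(frame)  # the oldest frame is dropped automatically

    def current_and_history(self):
        *history, current = self.frames
        return current, list(history)
```

Each time a new frame is scanned it is pushed in, so the window always holds the latest frame and the preset number of historical frames before it.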
It should be noted that, for each historical frame point cloud data, a manner of determining at least one clustered point cloud corresponding to the current historical frame point cloud data, a manner of determining a reflectivity distribution vector corresponding to each clustered point cloud, and a manner of determining a set of reflectivity distribution vectors corresponding to the current historical frame point cloud data based on each reflectivity distribution vector are consistent with the above-described determination manner associated with the current frame point cloud data, which is not described in detail herein.
And S120, processing each reflectivity distribution vector set based on a weather state prediction model obtained through pre-training, and determining a weather state vector corresponding to the prediction moment.
In this embodiment, the weather state prediction model may be a neural network model that takes sets of reflectivity distribution vectors as input and predicts the weather state at a future time based on them. The weather state prediction model may include at least one module, and the at least one module may include a vector fusion module and a weather prediction module. The vector fusion module may be understood as a neural network that processes a set of reflectivity distribution vectors to convert it into a vector of a preset dimension. The vector fusion module can be a neural network of any structure; optionally, it can be a fully connected network or a feed-forward neural network. The weather prediction module may be understood as a neural network that predicts weather states. The weather prediction module may be a neural network of any structure; optionally, it may be a sequence model based on the Transformer, a model built on an attention mechanism that can accelerate deep learning algorithms. For example, the weather prediction module may be a long-sequence time-series model based on pyramid attention (a Pyraformer model). The prediction time can be understood as any future time; it is at least one time after the point cloud scanning time corresponding to the current frame point cloud data, and may be a single time or a plurality of consecutive times. For example, if the point cloud scanning time corresponding to the current frame point cloud data is time t, the prediction time may be time t+1, or times t+1, t+2, …, t+n. A weather state vector may be understood as a vector characterizing a weather state. The weather state vector may include at least two elements characterizing the weather state; optionally, it may include a weather category element and a weather level element.
Wherein the weather category element may be understood as a numerical value characterizing the weather category. The weather level element may be understood as a numerical value characterizing the weather level. Illustratively, the value ranges corresponding to the weather category element and the weather level element may each be any value between 0 and 1.
In practical applications, each set of acquired reflectance distribution vectors may be input into a weather state prediction model. Furthermore, the vector fusion module in the weather state prediction model can be used for carrying out fusion processing on each reflectivity distribution vector set to obtain reflectivity distribution vectors with preset dimensions and corresponding to each frame of point cloud data. Further, the weather prediction module in the weather state prediction model may process each reflectivity distribution vector to predict the weather state corresponding to the prediction time based on each reflectivity distribution vector. Thus, a weather state vector corresponding to the predicted time can be obtained.
S130, determining weather states corresponding to the prediction time based on the weather state vectors.
In this embodiment, the weather state may include a weather category and a weather level. The weather category may be understood as the type of weather; optionally, weather categories may include sunny, rainy, foggy, snowy, and the like. The weather level may be understood as the severity corresponding to a weather category; optionally, weather levels may include light, moderate, severe, and the like.
In practical application, after obtaining the weather state vector corresponding to the predicted time, the weather state corresponding to the predicted time can be obtained by determining the corresponding weather category and weather grade according to the numerical value corresponding to each element in the weather state vector.
Optionally, determining the weather state based on the weather state vector includes: if the value corresponding to the weather category element is smaller than a first preset threshold value, the weather category is the first category; if the value corresponding to the weather category element is not smaller than the first preset threshold value, the weather category is the second category; if the value corresponding to the weather grade element is smaller than the second preset threshold value, the weather grade is the first grade; if the value corresponding to the weather grade element is larger than the second preset threshold value and smaller than the third preset threshold value, the weather grade is the second grade; if the value corresponding to the weather level element is larger than a third preset threshold value, the weather level is a third level.
In this embodiment, the first preset threshold may be any value, and optionally, may be 0.5. The first category may be any weather category, and optionally, may be sunny. The second category may be any weather category, and optionally, may be a rainy, foggy or snowy day. The second preset threshold may be any value, alternatively, may be 0.4. The first level may be any weather level, and optionally may be light. The third preset threshold may be any value, alternatively, may be 0.7. The second level may be any weather level, and optionally, may be moderate. The third level may be any weather level, and optionally, may be severe.
In practical applications, when determining the weather state corresponding to the predicted time based on the weather state vector, the weather state may be determined according to the numerical values of the respective elements included in the weather state vector. For the weather category, if the value corresponding to the weather category element in the weather state vector is smaller than a first preset threshold value, the weather category can be determined as a first category; if the value corresponding to the weather category element in the weather state vector is not smaller than the first preset threshold, the weather category can be determined to be the second category. For the weather grade, if the value corresponding to the weather grade element in the weather state vector is smaller than the second preset threshold value, the weather grade can be determined to be the first grade; if the value corresponding to the weather grade element in the weather state vector is greater than the second preset threshold value and less than the third preset threshold value, determining the weather grade as the second grade; if the value corresponding to the weather level element in the weather state vector is greater than the third preset threshold, the weather level may be determined to be the third level.
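The threshold logic above can be sketched as follows, using the optional thresholds 0.5, 0.4, and 0.7 from this embodiment. The function name and the handling of values exactly equal to the second and third thresholds are assumptions, since the patent leaves those boundary cases unspecified:

```python
def decode_weather_state(state_vector, t1=0.5, t2=0.4, t3=0.7):
    """Map a (category_element, level_element) weather state vector to labels."""
    category_elem, level_elem = state_vector
    # category: below the first threshold -> first category (e.g. sunny),
    # otherwise -> second category (e.g. rain/fog/snow)
    category = "first" if category_elem < t1 else "second"
    if level_elem < t2:
        level = "first"    # e.g. light
    elif level_elem < t3:
        level = "second"   # e.g. moderate
    else:
        level = "third"    # e.g. severe
    return category, level
```

For example, a weather state vector of (0.3, 0.8) would decode to the first category at the third level, i.e. a severe episode of the first weather type under the optional mapping above.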
According to the technical scheme of this embodiment, the reflectivity distribution vector set corresponding to the current frame point cloud data and the reflectivity distribution vector sets corresponding to a plurality of historical frame point cloud data are acquired; each reflectivity distribution vector set is then processed based on a weather state prediction model obtained through pre-training to determine the weather state vector corresponding to the prediction time; finally, the weather state corresponding to the prediction time is determined based on the weather state vector. This solves problems in the related art, such as weather state prediction being possible only in the cloud, a complicated prediction process, and low prediction accuracy, and achieves the effect that the vehicle end can accurately predict the weather state at a future time without depending on the cloud and with reduced communication cost, thereby improving the driving safety of the vehicle in extreme weather states.
Example two
Fig. 2 is a flowchart of a weather state prediction method according to a second embodiment of the present invention, where, on the basis of the foregoing embodiment, a set of reflectance distribution vectors corresponding to current frame point cloud data and a plurality of historical frame point cloud data may be input to a weather state prediction model, and the set of reflectance distribution vectors may be sequentially processed based on a vector fusion module and a weather prediction module in the weather state prediction model. Further, a weather state vector corresponding to the predicted time can be obtained. The specific implementation manner can be seen in the technical scheme of the embodiment. Wherein, the technical terms identical or similar to those of the above embodiments are not repeated herein.
As shown in fig. 2, the method includes:
S210, acquiring a reflectivity distribution vector set corresponding to the point cloud data of the current frame and reflectivity distribution vector sets corresponding to the point cloud data of a plurality of historical frames.
S220, processing each reflectivity distribution vector set based on a vector fusion module in the weather state prediction model to obtain a reflectivity characteristic vector corresponding to the current frame point cloud data and a reflectivity characteristic vector corresponding to each historical frame point cloud data.
In this embodiment, the vector fusion module may be a fully connected network composed of at least one fully connected layer, or a feed-forward neural network composed of an input layer, a hidden layer, and an output layer. The vector fusion module fuses a vector set into a feature vector of a specific dimension. The reflectivity feature vector may be understood as a vector characterizing the reflectivity feature information of the point cloud data. The dimension of the reflectivity feature vector may be smaller than that of the reflectivity distribution vector, which has the advantage that the efficiency of subsequent feature extraction can be improved, speeding up model computation.
In practical application, in order to reduce the total dimension of all the reflectivity distribution vectors included in single-frame point cloud data, vector fusion processing may be performed on the reflectivity distribution vector set corresponding to each frame of point cloud data, yielding the reflectivity feature vector corresponding to each frame. Specifically, after the set of reflectivity distribution vectors corresponding to the current frame point cloud data and the sets corresponding to each historical frame point cloud data are obtained, each set may be input into the weather state prediction model and processed by its vector fusion module into a feature vector of a specific dimension. In this way, the reflectivity feature vector corresponding to the current frame point cloud data and the reflectivity feature vectors corresponding to the historical frame point cloud data can be obtained.
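A toy sketch of such a fusion step, reducing a 256-dimensional distribution vector to a lower-dimensional feature vector through a single fully connected (linear) layer. The dimensions, the random initialization, and the function names are assumptions for illustration; a real module would be trained end to end with the rest of the model:

```python
import random

def make_linear_layer(in_dim, out_dim, seed=0):
    """Return a forward function for one fully connected layer: y = W x + b."""
    rng = random.Random(seed)
    weights = [[rng.uniform(-0.1, 0.1) for _ in range(in_dim)] for _ in range(out_dim)]
    bias = [0.0] * out_dim

    def forward(x):
        return [sum(w * xi for w, xi in zip(row, x)) + b
                for row, b in zip(weights, bias)]

    return forward

# fuse a 256-d reflectivity distribution vector into a 32-d feature vector
fuse = make_linear_layer(in_dim=256, out_dim=32)
feature = fuse([0.0] * 255 + [1.0])  # example: all frequency mass at reflectivity 255
```

The same layer is applied to the set from every frame, so each frame contributes one fixed-size feature vector to the downstream sequence model.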
S230, processing each reflectivity characteristic vector based on a weather prediction module in the weather state prediction model to obtain a weather state vector.
In practical application, after obtaining the reflectivity feature vector corresponding to the current frame point cloud data and the reflectivity feature vector corresponding to each historical frame point cloud data, each reflectivity feature vector may be input to a weather prediction module in the weather state prediction model. Further, each reflectance feature vector may be processed based on the weather prediction module to predict weather conditions at a future time based on each reflectance feature vector. Thus, a weather state vector corresponding to the predicted time can be obtained.
S240, determining weather states corresponding to the prediction time based on the weather state vectors.
According to the technical scheme of this embodiment, the reflectivity distribution vector set corresponding to the current frame point cloud data and the reflectivity distribution vector sets corresponding to a plurality of historical frame point cloud data are acquired; each reflectivity distribution vector set is then processed based on a weather state prediction model obtained through pre-training to determine the weather state vector corresponding to the prediction time; finally, the weather state corresponding to the prediction time is determined based on the weather state vector. This solves problems in the related art, such as weather state prediction being possible only in the cloud, a complicated prediction process, and low prediction accuracy, and achieves the effect that the vehicle end can accurately predict the weather state at a future time without depending on the cloud and with reduced communication cost, thereby improving the driving safety of the vehicle in extreme weather states.
Example III
Fig. 3 is a flowchart of a weather state prediction method according to a third embodiment of the present invention, where, based on the foregoing embodiment, before the set of reflectance distribution vectors is processed based on the weather state prediction model, the weather state prediction model may be trained based on a training sample, so as to obtain a trained weather state prediction model. The specific implementation manner can be seen in the technical scheme of the embodiment. Wherein, the technical terms identical or similar to those of the above embodiments are not repeated herein.
As shown in fig. 3, the method includes:
S310, training to obtain a weather state prediction model.
It should be noted that, before the weather state prediction model provided by the embodiment of the present invention is applied, a pre-constructed neural network model needs to be trained. Before training, a plurality of training samples may be constructed so that the model can be trained on them. To improve the accuracy of the weather state prediction model, as many and as diverse training samples as possible may be constructed.
Optionally, training to obtain the weather state prediction model includes: acquiring a plurality of training sample data; and training a pre-constructed weather state prediction model based on the plurality of training sample data to obtain the trained weather state prediction model.
The training sample data comprises a sample reflectivity distribution vector set corresponding to a plurality of historical frame point cloud data and an actual weather state vector corresponding to the prediction moment.
In this embodiment, the historical frame point cloud data may be point cloud data obtained by scanning with a radar device, point cloud data reconstructed by a point cloud reconstruction model, point cloud data stored in advance in a storage space, or the like. The sample reflectivity distribution vector set may include the reflectivity distribution vector corresponding to at least one cluster point cloud included in the corresponding historical frame point cloud data. The prediction time is the time for which the weather state prediction is to be made. The actual weather state vector may be a state vector determined based on the actual weather condition at the prediction time. Illustratively, the actual weather category element may take the value 1 for sunny days and 2 for rainy, foggy, or snowy days, and the actual weather level element may take the value 1, 2, or 3 for light, moderate, or severe, respectively; for example, when the actual weather condition at the prediction time is a severe rainy, foggy, or snowy day, the actual weather state vector may be (2, 3).
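Under the labelling scheme described above, constructing the actual weather state vector for a training sample might look like the following. The dictionary names and the grouping of rain/fog/snow under a single "adverse" key are illustrative assumptions:

```python
# illustrative label encodings; the patent's example groups rain, fog and snow together
CATEGORY_CODE = {"sunny": 1, "adverse": 2}
LEVEL_CODE = {"light": 1, "moderate": 2, "severe": 3}

def actual_weather_state_vector(category, level):
    """Encode the observed weather at the prediction time as (category, level)."""
    return (CATEGORY_CODE[category], LEVEL_CODE[level])
```

A severe rainy/foggy/snowy day at the prediction time thus yields the vector (2, 3), matching the example in this embodiment.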
In practical application, a plurality of historical frame point cloud data can be obtained, and for each historical frame point cloud data, at least one cluster point cloud corresponding to the current historical frame point cloud data can be determined. Further, the reflectance distribution vectors corresponding to the respective clusters can be determined. Thereafter, a set of sample reflectance distribution vectors corresponding to the current historical frame point cloud data may be determined based on the respective reflectance distribution vectors. Further, a predicted time may be determined, and an actual weather state vector corresponding to the predicted time may be determined based on an actual weather condition corresponding to the predicted time. Furthermore, training sample data can be constructed according to a sample reflectivity distribution vector set corresponding to each historical frame point cloud data and an actual weather state vector corresponding to a prediction time, and a plurality of training sample data can be obtained. Further, the weather state prediction model may be trained according to the plurality of training sample data to obtain a trained weather state prediction model.
Optionally, training the weather state prediction model based on the plurality of training sample data to obtain the weather state prediction model, including: for each piece of training sample data, inputting each sample reflectivity distribution vector set in the current training sample data into a weather state prediction model, and predicting to obtain a model prediction weather state vector corresponding to the current training sample data; and determining a loss value based on the model predictive weather state vector and the actual weather state vector in the current training sample data, and correcting model parameters in the weather state predictive model based on the loss value until a loss function in the weather state predictive model converges.
In this embodiment, the loss value may be a value that characterizes the degree of difference between the predicted output and the actual output. The loss function may be a function determined based on the loss value and used to characterize the degree of difference between the predicted output and the actual output.
In practical applications, for each training sample data, each sample reflectivity distribution vector set in the current training sample data may be input into the weather state prediction model. Vector fusion may then be performed on each sample reflectivity distribution vector set by the vector fusion module in the weather state prediction model, yielding the sample reflectivity feature vector corresponding to each historical frame point cloud data. The sample reflectivity feature vectors can then be processed by the weather prediction module in the weather state prediction model to obtain the model-predicted weather state vector corresponding to the current training sample data. The model-predicted weather state vector may be compared with the actual weather state vector in the current training sample data to determine a loss value, and the model parameters in the weather state prediction model may be corrected based on the loss value. The training error of the loss function may then be used as the condition for detecting whether the loss function has converged, for example, whether the training error is smaller than a preset error, whether the error trend has stabilized, or whether the current number of iterations equals a preset number. If the convergence condition is detected, for example the training error of the loss function is smaller than the preset error or the error change has stabilized, the training of the weather state prediction model is complete, and the iterative training can be stopped. If the condition is not met, further training samples can be obtained to continue training the weather state prediction model until the training error of the loss function falls within the preset range.
And when the training error of the loss function reaches convergence, obtaining a weather state prediction model after training.
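A minimal sketch of the loss value and the convergence checks described above. Mean squared error is an assumed choice here; the patent does not fix a particular loss function, only that the loss characterizes the difference between the predicted and actual weather state vectors:

```python
def mse_loss(predicted, actual):
    """Mean squared error between predicted and actual weather state vectors."""
    return sum((p - a) ** 2 for p, a in zip(predicted, actual)) / len(predicted)

def has_converged(errors, eps=1e-4, window=5):
    """Stop when the latest training error is below eps, or when the recent
    errors have plateaued (error change trend has stabilized)."""
    if errors and errors[-1] < eps:
        return True
    if len(errors) >= window:
        recent = errors[-window:]
        return max(recent) - min(recent) < eps
    return False
```

After each parameter correction the new training error is appended to the history; iterative training stops once `has_converged` reports that the error is small enough or no longer changing.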
S320, acquiring a reflectivity distribution vector set corresponding to the point cloud data of the current frame and reflectivity distribution vector sets corresponding to the point cloud data of a plurality of historical frames.
S330, processing each reflectivity distribution vector set based on a weather state prediction model obtained through pre-training, and determining a weather state vector corresponding to the prediction moment.
S340, determining the weather state corresponding to the prediction time based on the weather state vector.
According to the technical scheme of this embodiment, the reflectivity distribution vector set corresponding to the current frame point cloud data and the reflectivity distribution vector sets corresponding to a plurality of historical frame point cloud data are acquired; each reflectivity distribution vector set is then processed based on a weather state prediction model obtained through pre-training to determine the weather state vector corresponding to the prediction time; finally, the weather state corresponding to the prediction time is determined based on the weather state vector. This solves problems in the related art, such as weather state prediction being possible only in the cloud, a complicated prediction process, and low prediction accuracy, and achieves the effect that the vehicle end can accurately predict the weather state at a future time without depending on the cloud and with reduced communication cost, thereby improving the driving safety of the vehicle in extreme weather states.
Example IV
Fig. 4 is a schematic structural diagram of a weather status prediction apparatus according to a fourth embodiment of the present invention. As shown in fig. 4, the apparatus includes: the collection acquisition module 410, the weather state vector determination module 420, and the weather state determination module 430.
The set acquisition module 410 is configured to acquire a set of reflectivity distribution vectors corresponding to the current frame point cloud data and sets of reflectivity distribution vectors corresponding to a plurality of historical frame point cloud data, where each set of reflectivity distribution vectors includes the reflectivity distribution vector corresponding to at least one cluster point cloud included in the corresponding frame point cloud data; the weather state vector determining module 420 is configured to process each set of reflectivity distribution vectors based on a weather state prediction model obtained by pre-training and determine a weather state vector corresponding to the prediction time, where the weather state prediction model includes at least one module, the at least one module includes a vector fusion module and a weather prediction module, and the prediction time includes at least one future time after the point cloud scanning time corresponding to the current frame point cloud data; the weather state determining module 430 is configured to determine the weather state corresponding to the prediction time based on the weather state vector, where the weather state includes a weather category and/or a weather level.
According to the technical scheme of this embodiment, the reflectivity distribution vector set corresponding to the current frame point cloud data and the reflectivity distribution vector sets corresponding to a plurality of historical frame point cloud data are acquired; each reflectivity distribution vector set is then processed based on a weather state prediction model obtained through pre-training to determine the weather state vector corresponding to the prediction time; finally, the weather state corresponding to the prediction time is determined based on the weather state vector. This solves problems in the related art, such as weather state prediction being possible only in the cloud, a complicated prediction process, and low prediction accuracy, and achieves the effect that the vehicle end can accurately predict the weather state at a future time without depending on the cloud and with reduced communication cost, thereby improving the driving safety of the vehicle in extreme weather states.
Optionally, the weather state vector determination module 420 includes: a reflectivity feature vector determination unit and a weather state vector determination unit.
The reflectivity characteristic vector determining unit is used for processing each reflectivity distribution vector set based on a vector fusion module in the weather state prediction model to obtain a reflectivity characteristic vector corresponding to the current frame point cloud data and a reflectivity characteristic vector corresponding to each history frame point cloud data;
and the weather state vector determining unit is used for processing each reflectivity characteristic vector based on a weather prediction module in the weather state prediction model to obtain the weather state vector.
Optionally, the set acquisition module 410 includes: the device comprises a clustering point cloud determining unit, a reflectivity distribution vector determining unit, a history frame point cloud data determining unit and a reflectivity distribution vector set determining unit.
The clustering point cloud determining unit is used for obtaining the current frame point cloud data and determining at least one clustering point cloud corresponding to the current frame point cloud data;
The reflectivity distribution vector determining unit is used for determining reflectivity distribution vectors corresponding to the clustered point clouds and determining a reflectivity distribution vector set corresponding to the current frame point cloud data based on the reflectivity distribution vectors; and
A historical frame point cloud data determining unit, configured to determine a preset number of historical frame point cloud data before the current frame point cloud data;
The reflectivity distribution vector set determining unit is used for determining, for each historical frame point cloud data, at least one cluster point cloud corresponding to the current historical frame point cloud data, determining the reflectivity distribution vector corresponding to each cluster point cloud, and determining the reflectivity distribution vector set corresponding to the current historical frame point cloud data based on each reflectivity distribution vector.
Optionally, the cluster point cloud determining unit includes: a ground point cloud rejection subunit, a point cloud projection subunit, an image clustering subunit, and a cluster point cloud determination subunit.
The ground point cloud rejection subunit is configured to perform ground point cloud rejection processing on the current frame point cloud data to obtain point cloud data to be processed, and to perform preset-area point cloud extraction processing on the point cloud data to be processed to obtain target point cloud data;
The point cloud projection subunit is used for carrying out projection processing on the target point cloud data according to a preset view angle to obtain an image to be processed and image point cloud mapping information, wherein the image point cloud mapping information is used for indicating the association relationship between pixel points in the image to be processed and space points in the target point cloud data;
the image clustering subunit is used for processing the image to be processed based on a preset clustering algorithm to obtain image data corresponding to at least one clustering area included in the image to be processed;
And the cluster point cloud determining subunit is used for determining at least one cluster point cloud corresponding to the current frame point cloud data according to the image data corresponding to the at least one cluster region and the image point cloud mapping information.
Optionally, the reflectance distribution vector determination unit includes: the reflectivity normalization subunit and the reflectivity distribution vector determination subunit.
The reflectivity normalization subunit is used for acquiring the reflectivity corresponding to each point in the current clustered point cloud for each clustered point cloud, and normalizing the reflectivity to obtain the target reflectivity corresponding to each point;
And the reflectivity distribution vector determination subunit is used for determining the reflectivity distribution vector corresponding to the current clustering point cloud based on the target reflectivity and the total point number corresponding to the current clustering point cloud.
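The two reflectivity subunits can be read as: min-max normalize the per-point reflectivities of a cluster to obtain target reflectivities, then summarize them as a histogram scaled by the cluster's total point count. A minimal sketch under that reading; the bin count and the min-max normalization scheme are assumptions, as the patent does not fix them:

```python
import numpy as np

def reflectivity_distribution_vector(cluster, n_bins=8):
    """Build a reflectivity distribution vector for one clustered point
    cloud, given as an (N, 4) array whose last column is raw reflectivity."""
    refl = cluster[:, 3].astype(float)
    lo, hi = refl.min(), refl.max()
    # Normalize each point's reflectivity to a target reflectivity in [0, 1];
    # a cluster with constant reflectivity maps to all zeros here.
    norm = (refl - lo) / (hi - lo) if hi > lo else np.zeros_like(refl)
    # Histogram the target reflectivities and divide by the total number of
    # points, so the vector describes the distribution independent of size.
    hist, _ = np.histogram(norm, bins=n_bins, range=(0.0, 1.0))
    return hist / len(refl)
```

Dividing by the point count makes vectors from clusters of different sizes directly comparable, which matters because rain or fog changes the shape of the reflectivity distribution rather than the absolute point counts.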
Optionally, the apparatus further includes: a model training module, which is used for training to obtain the weather state prediction model.
The model training module comprises: and the training sample data acquisition unit and the model training unit.
The training sample data acquisition unit is used for acquiring a plurality of training sample data, wherein each training sample data includes sample reflectivity distribution vector sets corresponding to a plurality of historical frame point cloud data and an actual weather state vector corresponding to the prediction time;
and the model training unit is used for training the weather state prediction model based on the plurality of training sample data to obtain the weather state prediction model.
Optionally, the model training unit includes: a weather state vector prediction subunit and a model parameter correction subunit.
The weather state vector prediction subunit is used for inputting, for each training sample data, each sample reflectivity distribution vector set in the current training sample data into the weather state prediction model for prediction, to obtain a model-predicted weather state vector corresponding to the current training sample data;
And the model parameter correction subunit is used for determining a loss value based on the model prediction weather state vector and the actual weather state vector in the current training sample data, and correcting the model parameter in the weather state prediction model based on the loss value until a loss function in the weather state prediction model converges.
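The training units and subunits above amount to a standard supervised loop: predict a weather state vector from each sample's reflectivity distribution vector sets, compute a loss against the actual weather state vector, and correct model parameters until the loss converges. A minimal sketch, assuming a single linear map in place of the unspecified vector fusion and weather prediction modules, mean squared error as the loss, and plain gradient descent as the parameter-correction step; none of these choices are fixed by the patent:

```python
import numpy as np

def train_weather_model(samples, dim_in, dim_out, lr=0.1, tol=1e-9, max_iter=10000):
    """Train a stand-in weather state prediction model. Each sample is a pair
    (list of reflectivity distribution vectors, actual weather state vector)."""
    rng = np.random.default_rng(0)
    W = rng.normal(scale=0.01, size=(dim_out, dim_in))
    # Concatenation stands in for the patent's vector fusion module.
    X = np.stack([np.concatenate(vecs) for vecs, _ in samples])
    Y = np.stack([y for _, y in samples])  # actual weather state vectors
    prev_loss = np.inf
    for _ in range(max_iter):
        pred = X @ W.T                         # model-predicted weather state vectors
        loss = float(np.mean((pred - Y) ** 2)) # loss value (MSE)
        if prev_loss - loss < tol:             # stop once the loss has converged
            break
        prev_loss = loss
        grad = 2.0 * (pred - Y).T @ X / len(X)
        W -= lr * grad                         # correct the model parameters
    return W, loss
```

In practice the vector fusion module would more likely be a learned temporal encoder (for example a recurrent network or transformer over the frame sequence) rather than simple concatenation, and the loss would depend on whether the weather state vector encodes categories, levels, or both.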
The weather state prediction device provided by the embodiment of the invention can execute the weather state prediction method provided by any embodiment of the invention, and has the corresponding functional modules and beneficial effects of the execution method.
Example five
Fig. 5 shows a schematic diagram of the structure of an electronic device 10 that may be used to implement an embodiment of the invention. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. Electronic equipment may also represent various forms of mobile devices, such as personal digital processing, cellular telephones, smartphones, wearable devices (e.g., helmets, glasses, watches, etc.), and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the inventions described and/or claimed herein.
As shown in fig. 5, the electronic device 10 includes at least one processor 11 and a memory communicatively connected to the at least one processor 11, such as a Read Only Memory (ROM) 12 and a Random Access Memory (RAM) 13, the memory storing a computer program executable by the at least one processor. The processor 11 may perform various appropriate actions and processes according to the computer program stored in the ROM 12 or the computer program loaded from the storage unit 18 into the RAM 13. In the RAM 13, various programs and data required for the operation of the electronic device 10 may also be stored. The processor 11, the ROM 12 and the RAM 13 are connected to each other via a bus 14. An input/output (I/O) interface 15 is also connected to the bus 14.
Various components in the electronic device 10 are connected to the I/O interface 15, including: an input unit 16 such as a keyboard, a mouse, etc.; an output unit 17 such as various types of displays, speakers, and the like; a storage unit 18 such as a magnetic disk, an optical disk, or the like; and a communication unit 19 such as a network card, modem, wireless communication transceiver, etc. The communication unit 19 allows the electronic device 10 to exchange information/data with other devices via a computer network, such as the internet, and/or various telecommunication networks.
The processor 11 may be a variety of general and/or special purpose processing components having processing and computing capabilities. Some examples of the processor 11 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various specialized Artificial Intelligence (AI) computing chips, various processors running machine learning model algorithms, Digital Signal Processors (DSPs), and any suitable processor, controller, microcontroller, etc. The processor 11 performs the various methods and processes described above, such as the weather state prediction method.
In some embodiments, the weather state prediction method may be implemented as a computer program tangibly embodied on a computer-readable storage medium, such as storage unit 18. In some embodiments, part or all of the computer program may be loaded and/or installed onto the electronic device 10 via the ROM 12 and/or the communication unit 19. When the computer program is loaded into RAM 13 and executed by processor 11, one or more steps of the weather state prediction method described above may be performed. Alternatively, in other embodiments, the processor 11 may be configured to perform the weather state prediction method in any other suitable way (e.g., by means of firmware).
Various implementations of the systems and techniques described here above may be implemented in digital electronic circuitry, integrated circuit systems, Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), Systems On Chip (SOCs), Complex Programmable Logic Devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implementation in one or more computer programs that may be executed and/or interpreted on a programmable system including at least one programmable processor, which may be a special purpose or general-purpose programmable processor that may receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
A computer program for carrying out methods of the present invention may be written in any combination of one or more programming languages. These computer programs may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the computer programs, when executed by the processor, cause the functions/acts specified in the flowchart and/or block diagram block or blocks to be implemented. The computer program may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of the present invention, a computer-readable storage medium may be a tangible medium that can contain, or store a computer program for use by or in connection with an instruction execution system, apparatus, or device. The computer readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. Alternatively, the computer readable storage medium may be a machine readable signal medium. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on an electronic device having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) through which a user can provide input to the electronic device. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a background component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such background, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), wide Area Networks (WANs), blockchain networks, and the internet.
The computing system may include clients and servers. A client and a server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server, also called a cloud computing server or cloud host, which is a host product in a cloud computing service system and overcomes the defects of high management difficulty and weak service scalability found in traditional physical hosts and VPS (virtual private server) services.
It should be appreciated that various forms of the flows shown above may be used to reorder, add, or delete steps. For example, the steps described in the present invention may be performed in parallel, sequentially, or in a different order, so long as the desired results of the technical solution of the present invention are achieved, and the present invention is not limited herein.
The above embodiments do not limit the scope of the present invention. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present invention should be included in the scope of the present invention.

Claims (10)

1. A method of weather condition prediction, comprising:
Acquiring a reflectivity distribution vector set corresponding to the point cloud data of the current frame and reflectivity distribution vector sets corresponding to the point cloud data of a plurality of historical frames, wherein the reflectivity distribution vector sets comprise reflectivity distribution vectors corresponding to at least one clustering point cloud contained in the point cloud data of the corresponding frame;
Processing each reflectivity distribution vector set based on a weather state prediction model obtained through pre-training, and determining a weather state vector corresponding to a prediction time, wherein the weather state prediction model comprises at least one module, the at least one module comprises a vector fusion module and a weather prediction module, and the prediction time comprises at least one future time after a point cloud scanning time corresponding to the point cloud data of the current frame;
And determining a weather state corresponding to the predicted time based on the weather state vector, wherein the weather state comprises a weather category and/or a weather level.
2. The method according to claim 1, wherein the processing each set of reflectance distribution vectors based on the weather state prediction model obtained by training in advance to obtain a weather state vector corresponding to the predicted time comprises:
processing each reflectivity distribution vector set based on a vector fusion module in the weather state prediction model to obtain a reflectivity characteristic vector corresponding to the current frame point cloud data and a reflectivity characteristic vector corresponding to each historical frame point cloud data;
And processing each reflectivity characteristic vector based on a weather prediction module in the weather state prediction model to obtain the weather state vector.
3. The method according to claim 1, wherein the obtaining a set of reflectance distribution vectors corresponding to the current frame point cloud data and a set of reflectance distribution vectors corresponding to the plurality of historical frame point cloud data includes:
acquiring current frame point cloud data, and determining at least one clustering point cloud corresponding to the current frame point cloud data;
Determining a reflectivity distribution vector corresponding to each clustered point cloud, and determining a reflectivity distribution vector set corresponding to the current frame point cloud data based on each reflectivity distribution vector; and
Determining a preset number of historical frame point cloud data before the current frame point cloud data;
For each historical frame point cloud data, determining at least one clustered point cloud corresponding to the current historical frame point cloud data, determining a reflectivity distribution vector corresponding to each clustered point cloud, and determining a reflectivity distribution vector set corresponding to the current historical frame point cloud data based on each reflectivity distribution vector.
4. A method according to claim 3, wherein said determining at least one clustered point cloud corresponding to the current frame point cloud data comprises:
Performing ground point cloud elimination processing on the point cloud data of the current frame to obtain point cloud data to be processed, and performing preset area point cloud extraction processing on the point cloud data to be processed to obtain target point cloud data;
Carrying out projection processing on the target point cloud data according to a preset view angle to obtain an image to be processed and image point cloud mapping information, wherein the image point cloud mapping information is used for indicating the association relationship between pixel points in the image to be processed and space points in the target point cloud data;
Processing the image to be processed based on a preset clustering algorithm to obtain image data corresponding to at least one clustering area included in the image to be processed;
and determining at least one clustering point cloud corresponding to the current frame point cloud data according to the image data corresponding to the at least one clustering region and the image point cloud mapping information.
5. The method of claim 3, wherein determining a reflectance distribution vector for each of the clustered point clouds comprises:
for each clustering point cloud, obtaining the reflectivity corresponding to each point in the current clustering point cloud, and carrying out normalization processing on each reflectivity to obtain the target reflectivity corresponding to each point;
And determining a reflectivity distribution vector corresponding to the current clustering point cloud based on the target reflectivity and the total number of points corresponding to the current clustering point cloud.
6. The method as recited in claim 1, further comprising:
training to obtain the weather state prediction model;
the training to obtain the weather state prediction model includes:
acquiring a plurality of training sample data, wherein the training sample data comprises a sample reflectivity distribution vector set corresponding to a plurality of historical frame point cloud data and an actual weather state vector corresponding to a prediction moment;
And training the weather state prediction model based on the plurality of training sample data to obtain the weather state prediction model.
7. The method of claim 6, wherein training the weather state prediction model based on the plurality of training sample data results in the weather state prediction model, comprising:
for each piece of training sample data, inputting each sample reflectivity distribution vector set in the current training sample data into a weather state prediction model, and predicting to obtain a model prediction weather state vector corresponding to the current training sample data;
Determining a loss value based on the model predictive weather state vector and the actual weather state vector in the current training sample data, and correcting model parameters in the weather state predictive model based on the loss value until a loss function in the weather state predictive model converges.
8. A weather condition prediction apparatus, comprising:
the system comprises a set acquisition module, a data acquisition module and a data processing module, wherein the set acquisition module is used for acquiring a reflectivity distribution vector set corresponding to the point cloud data of the current frame and a reflectivity distribution vector set corresponding to the point cloud data of a plurality of historical frames, and the reflectivity distribution vector set comprises reflectivity distribution vectors corresponding to at least one clustering point cloud contained in the point cloud data of the corresponding frame;
The weather state vector determining module is used for processing each reflectivity distribution vector set based on a weather state prediction model obtained through training in advance to determine a weather state vector corresponding to a prediction time, wherein the weather state prediction model comprises at least one module, the at least one module comprises a vector fusion module and a weather prediction module, and the prediction time comprises at least one future time after a point cloud scanning time corresponding to the current frame point cloud data;
And the weather state determining module is used for determining a weather state corresponding to the predicted time based on the weather state vector, wherein the weather state comprises a weather category and/or a weather level.
9. An electronic device, the electronic device comprising:
At least one processor; and
A memory communicatively coupled to the at least one processor; wherein,
The memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the weather state prediction method of any of claims 1-7.
10. A computer readable storage medium storing computer instructions for causing a processor to perform the weather state prediction method of any one of claims 1-7.
CN202410122350.9A 2024-01-29 2024-01-29 Weather state prediction method and device, electronic equipment and storage medium Pending CN117962774A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410122350.9A CN117962774A (en) 2024-01-29 2024-01-29 Weather state prediction method and device, electronic equipment and storage medium


Publications (1)

Publication Number Publication Date
CN117962774A true CN117962774A (en) 2024-05-03

Family

ID=90863771



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination