CN115204026A - Training method, device, vehicle, medium and program product for vehicle data model
- Publication number
- CN115204026A (application CN202110388981.1A)
- Authority
- CN
- China
- Prior art keywords
- vehicle
- local model
- local
- parameter
- model
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F30/00—Computer-aided design [CAD]
- G06F30/20—Design optimisation, verification or simulation
- G06F30/27—Design optimisation, verification or simulation using machine learning, e.g. artificial intelligence, neural networks, support vector machines [SVM] or training a model
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
- G06N20/20—Ensemble learning
Abstract
A training method, apparatus, vehicle, medium, and program product for a vehicle data model are provided. The method includes: acquiring local data of a vehicle; training a local model of the vehicle on the acquired local data to obtain a first local model update parameter of the vehicle; performing matrix estimation on the first local model update parameter to obtain a second local model update parameter of reduced parameter scale; and sending the second local model update parameter to a server, so that the server integrates the second local model update parameters reported by the participating vehicles and feeds the integrated update parameters back to the vehicles.
Description
Technical Field
The present disclosure relates to the field of machine learning and federated learning technologies, and in particular, to a method and an apparatus for training a vehicle data model, a vehicle, a computer-readable storage medium, and a computer program product.
Background
In the related art, a vehicle data model is usually trained on raw or previously processed (e.g., previously integrated) vehicle data. The vehicle therefore needs to transmit a large amount of data to a backend device such as a server, which places high demands on communication bandwidth and on backend data storage and processing resources, and leads to long training times. In addition, because the raw data is exposed during transmission, the user's privacy cannot be well protected.
The approaches described in this section are not necessarily approaches that have been previously conceived or pursued. Unless otherwise indicated, it should not be assumed that any of the approaches described in this section qualify as prior art merely by virtue of their inclusion in this section. Similarly, unless otherwise indicated, the problems mentioned in this section should not be considered as having been acknowledged in any prior art.
Disclosure of Invention
According to an aspect of the present disclosure, there is provided a training method of a vehicle data model, including: acquiring local data of a vehicle; training a local model of the vehicle on the acquired local data to obtain a first local model update parameter of the vehicle; performing matrix estimation on the first local model update parameter to obtain a second local model update parameter of reduced parameter scale; and sending the second local model update parameter to a server, so that the server integrates the second local model update parameters reported by the participating vehicles and feeds the integrated update parameters back to each vehicle.
According to another aspect of the present disclosure, there is provided another training method of a vehicle data model, including: receiving second local model update parameters reported by a plurality of vehicles participating in the training of a vehicle data model, wherein the second local model update parameter of each of the plurality of vehicles is a model update parameter of reduced parameter scale obtained by performing matrix estimation on a first local model update parameter of that vehicle, and wherein the first local model update parameter is a model update parameter obtained by the vehicle training its local model on its local data; and integrating the second local model update parameters reported by the plurality of vehicles and feeding the integrated update parameters back to each of the plurality of vehicles.
According to yet another aspect of the present disclosure, there is provided an apparatus for training of a vehicle data model, comprising: a data acquisition module configured to acquire local data of a vehicle; a local training module configured to train a local model of the vehicle on the acquired local data to obtain a first local model update parameter of the vehicle; a matrix estimation module configured to perform matrix estimation on the first local model update parameter to obtain a second local model update parameter of reduced parameter scale; and a parameter sending module configured to send the second local model update parameter to a server, so that the server integrates the second local model update parameters reported by the participating vehicles and feeds the integrated update parameters back to each vehicle.
According to yet another aspect of the present disclosure, there is provided another apparatus for training of a vehicle data model, comprising: a receiving module configured to receive second local model update parameters reported by a plurality of vehicles participating in the training of a vehicle data model, wherein the second local model update parameter of each of the plurality of vehicles is a model update parameter of reduced parameter scale obtained by performing matrix estimation on a first local model update parameter of that vehicle, and wherein the first local model update parameter is a model update parameter obtained by the vehicle training its local model on its local data; an integration module configured to integrate the second local model update parameters reported by the plurality of vehicles; and a transmitting module configured to feed back the integrated update parameters to each of the plurality of vehicles.
According to yet another aspect of the present disclosure, an apparatus for training of a vehicle data model is provided. The device includes: a processor, and a memory storing a program. The program includes instructions which, when executed by a processor, cause the processor to perform a first method of training a vehicle data model as described in the present disclosure.
According to yet another aspect of the present disclosure, a vehicle is provided. The vehicle comprises the device for training the vehicle data model disclosed by the disclosure.
According to yet another aspect of the present disclosure, another apparatus for training of a vehicle data model is provided. The device comprises: a processor, and a memory storing a program. The program includes instructions that when executed by a processor cause the processor to perform a second method of training a vehicle data model as described in the present disclosure.
According to yet another aspect of the disclosure, a non-transitory computer-readable storage medium storing a program is provided. The program includes instructions that, when executed by one or more processors, cause the one or more processors to perform the methods described in the present disclosure.
According to another aspect of the disclosure, a computer program product is provided, comprising program code portions for performing the method according to the disclosure when the computer program product is run on one or more computing devices.
Further features and advantages of the present disclosure will become apparent from the following description of exemplary embodiments, which is to be read in connection with the accompanying drawings.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate exemplary embodiments and, together with the description, serve to explain exemplary implementations of the embodiments. The illustrated embodiments are for purposes of illustration only and do not limit the scope of the claims. Throughout the drawings, identical reference numbers designate similar, but not necessarily identical, elements.
FIG. 1 is a network diagram showing a training process of a vehicle data model in the related art;
FIG. 2 is a flow chart illustrating a method of training a vehicle data model according to an exemplary embodiment;
FIG. 3 is a network diagram illustrating a training process for a vehicle data model according to an exemplary embodiment;
FIG. 4 is a flow chart illustrating another method of training a vehicle data model in accordance with an exemplary embodiment;
FIG. 5 is a block diagram illustrating an apparatus for training of a vehicle data model according to an exemplary embodiment;
FIG. 6 is a block diagram illustrating another apparatus for training of a vehicle data model in accordance with an exemplary embodiment;
FIG. 7 is a schematic view of an application scenario for a motor vehicle according to an exemplary embodiment of the present disclosure; and
fig. 8 is a block diagram illustrating an exemplary computing device to which exemplary embodiments of the present disclosure can be applied.
Detailed Description
In the present disclosure, unless otherwise specified, the use of the terms "first", "second", etc. to describe various elements is not intended to define a positional relationship, a temporal relationship, or an importance relationship of the elements, and such terms are used only to distinguish one element from another. In some examples, a first element and a second element may refer to the same instance of the element, while in some cases they may refer to different instances based on the context of the description.
The terminology used in the description of the various described examples in this disclosure is for the purpose of describing the particular examples only and is not intended to be limiting. Unless the context clearly indicates otherwise, if the number of elements is not specifically limited, there may be one or more of the element. Furthermore, the term "and/or" as used in this disclosure is intended to encompass any and all possible combinations of the listed items.
In the related art, as shown in fig. 1, a vehicle data model is usually trained by integrating, at a backend such as a server, the local data reported by each vehicle (for example, integrating the local data reported by vehicle 1, vehicle 2 through vehicle n on the central database platform shown in fig. 1) and then training the model on the integrated data. Because the amount of reported data is large (for example, there may be original local data reported by hundreds of vehicles), this approach places high demands on communication bandwidth and on backend data storage and processing resources, and leads to long training times. In addition, each vehicle uploads its original local data to the server, so the user's privacy is exposed during transmission and cannot be well protected.
In view of this, the present disclosure provides a training method of a vehicle data model. FIG. 2 shows a flow diagram of a method of training a vehicle data model, which, as shown in FIG. 2, may include:
step S201: acquiring local data of a vehicle;
step S202: training a local model of the vehicle according to the acquired local data of the vehicle to obtain a first local model update parameter of the vehicle;
step S203: performing matrix estimation on the first local model update parameter to obtain a second local model update parameter of reduced parameter scale; and
step S204: sending the second local model update parameter to a server, so that the server integrates the second local model update parameters reported by each vehicle and feeds the integrated update parameters back to each vehicle.
With this method, when training the vehicle data model, the vehicle data can first be trained on locally in the vehicle, and only the corresponding update parameters are sent to a backend such as a server for summarizing or integration (for example, as shown in fig. 3, vehicle 1, vehicle 2 through vehicle n can each first train their local model on their own local data, and then send the update parameters obtained by training to the backend for integration). Moreover, the update parameters that are sent are parameters of reduced scale (for example, a scale much smaller than that of the vehicle's original local data) obtained by matrix estimation of the original update parameters from local training. Therefore, even when the network connection is unreliable, the communication overhead with the backend can be greatly reduced and fewer backend data storage and processing resources are occupied, which reduces expense, shortens the development cycle, and improves the robustness of the aggregated model. In addition, the vehicles participating in the training of the vehicle data model do not need to upload their original local data to a backend such as a server, so the problem of user privacy leakage can be avoided, the user's data privacy can be well protected, and the user's application experience is improved.
Further, it is understood that the vehicle data model training methods described in the present disclosure may employ various statistical and machine learning methods, including, but not limited to, one or more of neural network modeling methods (e.g., trained with stochastic gradient descent or the Adam optimizer), support vector machines, graph networks, extreme gradient boosting trees, and the like.
The steps of the training method of the vehicle data model according to the exemplary embodiment of the present disclosure will be described in detail with reference to fig. 2 and 3.
According to some embodiments, the local data of the vehicle acquired in step S201 includes one or more of vehicle driving behavior data, vehicle driving state data, and vehicle fault data.
According to some embodiments, the vehicle driving behavior data comprises data relating to the driver's operation of the vehicle, where the operated components include one or more of: the steering wheel, the brake pedal, and the accelerator pedal. The vehicle driving state data includes one or more of the following: travel speed, travel trajectory, travel time, travel direction, acceleration, angular velocity, wheel steering angle, gear state, and the like. The vehicle fault data includes one or more of the number of times a fault occurred, the component that failed (e.g., an engine, a brake, a transmission, or a clutch), and the like.
According to some embodiments, local data of the vehicle may be acquired (e.g., collected) by various sensors in the vehicle. For example, taking the example of collecting vehicle driving behavior data, the sensors used include, but are not limited to: an angle sensor for detecting the angle of rotation of the steering wheel, a displacement sensor or an acceleration sensor for detecting the displacement of the brake pedal and/or the accelerator pedal, and/or a force sensor for detecting the force on the brake pedal and/or the accelerator pedal, etc.
In addition, it is understood that the vehicle in the present disclosure includes motor vehicles having an automatic driving function, and may include, for example, unmanned vehicles and other motor vehicles that have an automatic driving function and have been switched into it. The unmanned vehicle may include a private unmanned vehicle, a vehicle providing automated vehicle mobility as a service, and the like.
According to some embodiments, the local model of the vehicle on which the local training in step S202 is based may be acquired in advance by the vehicle from a server, or deployed on the vehicle in advance by the server. It is understood that, in the present disclosure, the local models of the vehicles participating in the training of the vehicle data model have a unified model structure (for example, the initial local model of each vehicle is a unified model obtained in advance from the server, or a unified model deployed on each vehicle in advance by the server), so as to implement the corresponding horizontal federated learning mechanism. In addition, depending on actual needs, the local model of the vehicle may be a model used for one or more of various purposes such as automatic driving, driving behavior analysis, and early failure prediction.
According to some embodiments, in step S202, the vehicle may train its current local model on the acquired local data to obtain a trained local model, and obtain the first local model update parameter of the vehicle from the trained local model and the local model before this round of training. For example, let the current local model of vehicle i be W^t (where t denotes the current round of model training, e.g., of federated learning), and let the trained local model obtained by training on local data be W_i^t. The first local model update parameter H_i^t of vehicle i can then be expressed as: H_i^t = W_i^t - W^t.
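As an illustration of this step, the following is a minimal sketch in Python/NumPy. It assumes the model weights are flattened into arrays; the names compute_local_update and train_fn are illustrative and do not come from this disclosure.

```python
import numpy as np

def compute_local_update(global_weights: np.ndarray, local_data, train_fn) -> np.ndarray:
    """Sketch of step S202: derive H_i^t = W_i^t - W^t.

    global_weights: W^t, the current model for training round t.
    train_fn: any local optimizer (e.g., stochastic gradient descent)
              that returns trained weights W_i^t given W^t and local data.
    """
    trained_weights = train_fn(global_weights, local_data)  # W_i^t
    return trained_weights - global_weights                 # H_i^t
```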
According to some embodiments, in step S203, performing matrix estimation on the first local model update parameter to obtain a second local model update parameter of reduced parameter scale includes:
decomposing the first local model update parameter into the product of two low-order matrices (which may also be referred to as low-rank matrices or small-rank matrices) based on a matrix decomposition algorithm; and using, as the second local model update parameter of reduced parameter scale, the one of the two low-order matrices whose parameters are updated during local model training.
Illustratively, let the first local model update parameter of vehicle i be H_i^t, and assume H_i^t is a d1-by-d2 matrix, i.e., H_i^t ∈ R^(d1×d2). Then H_i^t can be decomposed into the product of A_i^t and B_i^t, i.e., H_i^t = A_i^t B_i^t, where A_i^t can be a d1-by-k matrix (R^(d1×k)) and B_i^t can be a k-by-d2 matrix (R^(k×d2)). The values of d1, d2 and k can be set according to the actual situation; generally speaking, d1 and d2 are determined by the vehicle data on which the model is trained, and k is chosen smaller than d1 and d2 so that A_i^t and B_i^t satisfy the requirement of being low-order matrices.
Assuming the low-order matrix A_i^t is held constant (i.e., remains unchanged) during local model training, the other matrix B_i^t, whose parameters are updated during local model training (i.e., the matrix that is continually optimized), can be used as the second local model update parameter of reduced parameter scale and sent to the corresponding server.
Therefore, the update parameter that is sent is a low-order factor of the original update parameter obtained by local training, produced by the matrix decomposition algorithm, and its parameter scale is far smaller than that of the vehicle's original local data. This greatly reduces the communication overhead with the backend and the backend data storage and processing resources occupied, thereby reducing expense and shortening the development cycle. Illustratively, sending the above-mentioned matrix B_i^t to the server saves a communication factor of approximately d1/k.
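The following is a minimal sketch of this low-rank scheme, under two stated assumptions that go beyond the text above: A_i^t is generated from a seed shared with the server (so only B_i^t and the seed need to travel), and B_i^t is fitted to H_i^t post hoc by least squares rather than being optimized during local training.

```python
import numpy as np

def low_rank_update(H: np.ndarray, k: int, seed: int) -> np.ndarray:
    """Decompose the d1 x d2 update H into A @ B and return B (k x d2).

    A is held constant and regenerated from the shared seed on the
    server, so transmitting B saves a factor of roughly d1 / k.
    """
    d1, _ = H.shape
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((d1, k)) / np.sqrt(k)   # fixed low-order factor
    B, *_ = np.linalg.lstsq(A, H, rcond=None)       # best-fit k x d2 factor
    return B                                        # second update parameter

def server_reconstruct(B: np.ndarray, d1: int, seed: int) -> np.ndarray:
    """Server side: regenerate A from the same seed and rebuild H as A @ B."""
    k = B.shape[0]
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((d1, k)) / np.sqrt(k)
    return A @ B
```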
According to other embodiments, in step S203, performing matrix estimation on the first local model update parameter to obtain a second local model update parameter of reduced parameter scale includes:
constraining the first local model update parameter to be a random sparse matrix based on a random mask; and using the updated non-zero elements of the random sparse matrix, together with the seed used to generate the random sparse matrix, as the second local model update parameter of reduced parameter scale.
In this way, the zero elements of the random sparse matrix do not need to be transmitted, stored, or processed, which saves a large amount of transmission resources, storage resources, computing resources and computing time. The communication overhead with the backend and the backend data storage and processing resources occupied can thus be greatly reduced, lowering expense and shortening the development cycle.
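A minimal sketch of the random-mask variant follows, assuming the mask is a Bernoulli pattern regenerated from the transmitted seed; the density parameter and function names are illustrative, not from the disclosure.

```python
import numpy as np

def sparse_mask_update(H: np.ndarray, density: float, seed: int):
    """Keep only the entries of H selected by a seeded random mask.

    Only the selected values and the seed are transmitted; the server
    regenerates the identical mask from the same seed.
    """
    rng = np.random.default_rng(seed)
    mask = rng.random(H.shape) < density   # e.g. density = 0.05
    return H[mask], seed                   # non-zero elements + seed

def server_expand(values: np.ndarray, shape, density: float, seed: int) -> np.ndarray:
    """Server side: rebuild the sparse update from the values and the seed."""
    rng = np.random.default_rng(seed)
    mask = rng.random(shape) < density     # identical mask as on the vehicle
    H = np.zeros(shape)
    H[mask] = values
    return H
```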
According to some embodiments, before sending the second local model update parameter to a server, the method further comprises: compressing (also referred to as compression encoding) the second local model update parameter. This further reduces the size of the transmitted parameter, saving transmission resources and reducing the communication overhead with the backend.
According to some embodiments, compressing the second local model update parameter comprises: compressing the second local model update parameter based on probabilistic quantization.
According to some embodiments, the second local model update parameter may be compressed by quantizing each scalar h_j in the parameter to 1 bit. Let h_max and h_min denote the maximum and minimum values over the scalars of the parameter. The compressed scalar then takes the value h_max with probability (h_j - h_min)/(h_max - h_min), and the value h_min otherwise, so that the quantization is unbiased in expectation.
According to some embodiments, the second local model update parameter may also be compressed by quantizing each of its scalars to b bits (b being a positive integer greater than 1). For this, the above 1-bit quantization can be generalized to b-bit quantization as follows: divide [h_min, h_max] equally into 2^b intervals; if h_j falls within the interval bounded by h' and h'', the scalar h_j can be quantized by replacing h_min and h_max in the above expression with h' and h'', respectively.
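A minimal sketch of both quantizers, assuming NumPy arrays; the stochastic rounding follows the expressions above, and the function names are illustrative.

```python
import numpy as np

def quantize_1bit(h: np.ndarray, rng=None) -> np.ndarray:
    """Unbiased 1-bit probabilistic quantization of the update vector h."""
    rng = rng or np.random.default_rng()
    h_min, h_max = h.min(), h.max()
    if h_max == h_min:                        # constant vector: nothing to do
        return h.copy()
    p = (h - h_min) / (h_max - h_min)         # probability of rounding up
    return np.where(rng.random(h.shape) < p, h_max, h_min)

def quantize_bbit(h: np.ndarray, b: int, rng=None) -> np.ndarray:
    """b-bit variant: split [h_min, h_max] into 2**b equal intervals and
    round each scalar stochastically to its interval endpoints h', h''."""
    rng = rng or np.random.default_rng()
    h_min, h_max = h.min(), h.max()
    if h_max == h_min:
        return h.copy()
    n = 2 ** b
    width = (h_max - h_min) / n
    idx = np.minimum((h - h_min) // width, n - 1).astype(int)
    lo = h_min + idx * width                  # h'
    hi = lo + width                           # h''
    p = (h - lo) / width                      # probability of rounding up
    return np.where(rng.random(h.shape) < p, hi, lo)
```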
It is to be understood that, in addition to the above-mentioned probabilistic quantization methods such as 1-bit or b-bit quantization, the second local model update parameter may be compressed in other manners. In addition, besides the second local model update parameter, the first local model update parameter may also be compressed, for example, before matrix estimation of the first local model update parameter.
According to some embodiments, prior to compressing the second local model update parameter, the method further comprises: performing random rotation processing on the second local model update parameter, so as to reduce the error of the subsequent compression, e.g., compression based on probabilistic quantization.
According to some embodiments, randomly rotating the second local model update parameters comprises: and multiplying the second local model updating parameter by a random orthogonal matrix to realize random rotation processing of the second local model updating parameter. The random orthogonal matrix may be generated in various manners in the related art, which is not limited in this regard.
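As a sketch of this step, the rotation below draws a random orthogonal matrix from the QR decomposition of a Gaussian matrix, one common construction among the various ways mentioned above; sharing the seed with the server, so it can invert the rotation with the transpose, is an assumption of this example.

```python
import numpy as np

def random_orthogonal(d: int, seed: int) -> np.ndarray:
    """Random d x d orthogonal matrix via QR of a Gaussian matrix."""
    rng = np.random.default_rng(seed)
    Q, R = np.linalg.qr(rng.standard_normal((d, d)))
    return Q * np.sign(np.diag(R))        # sign fix for a uniform distribution

def rotate(h: np.ndarray, seed: int) -> np.ndarray:
    """Rotate the flattened update parameter before quantization."""
    return random_orthogonal(h.size, seed) @ h

def unrotate(y: np.ndarray, seed: int) -> np.ndarray:
    """Server side: invert the rotation with the transpose (R^-1 = R^T)."""
    return random_orthogonal(y.size, seed).T @ y
```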
According to some embodiments, the training method of the vehicle data model of the present disclosure further comprises:
updating the local model based on the integrated update parameters from the server (e.g., as shown in fig. 3, the integrated update parameters are transmitted back to vehicles 1 through n for updating their respective local models);
determining whether a model training end condition of the vehicle data model has been satisfied;
in response to the model training end condition not being met, retraining the updated local model on the local data to obtain a new second local model update parameter, and sending the new second local model update parameter to the server for update parameter integration; and in response to the model training end condition being met, using the updated local model as the vehicle data model required by the vehicle.
According to some embodiments, the model training end condition includes one or more of a number of iterations of the local model reaching a set number threshold (which may be set according to actual conditions) and a loss function convergence of the local model.
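Putting the steps above together, the following is a minimal sketch of one vehicle's view of the training rounds; all five callables are illustrative stand-ins for the operations described in this disclosure, not a prescribed interface.

```python
def federated_training_loop(train_local, estimate, send_and_receive,
                            apply_update, converged, max_rounds: int) -> None:
    """Vehicle-side training loop: train locally, upload a reduced-scale
    update, apply the integrated update fed back by the server, and stop
    when an end condition is met."""
    for _ in range(max_rounds):            # iteration-count end condition
        H = train_local()                  # first local model update parameter
        B = estimate(H)                    # second (reduced-scale) update parameter
        integrated = send_and_receive(B)   # server integrates and feeds back
        apply_update(integrated)
        if converged():                    # e.g. loss-function convergence
            break
```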
According to the method of the exemplary embodiments of the disclosure, when training the vehicle data model, the vehicle data is first trained on locally in the vehicle, and only the corresponding update parameters are sent to a backend such as a server for summarizing or integration (for example, as shown in fig. 3, vehicle 1, vehicle 2 through vehicle n each first train their local model on their own local data, and then send the update parameters obtained by training to the backend for integration). The update parameters that are sent are parameters of reduced scale (for example, a scale much smaller than that of the vehicle's original local data) obtained by matrix estimation of the original update parameters from local training. Therefore, even when the network connection is unreliable, the communication overhead with the backend can be greatly reduced and fewer backend data storage and processing resources are occupied, which reduces expense, shortens the development cycle, and improves the robustness of the aggregated model. In addition, because the vehicles participating in the training of the vehicle data model do not need to upload their original local data to a backend such as a server, the problem of user privacy leakage can be avoided, the user's data privacy can be well protected, and the user's application experience is improved.
FIG. 4 shows a flow chart of another method of training a vehicle data model according to an exemplary embodiment of the present disclosure. Unlike the method shown in fig. 2, the method shown in fig. 4 is executed by a backend device such as a server; for the other operations, reference may be made to the description of the related embodiments above. Specifically, as shown in fig. 4, this training method for a vehicle data model may include:
step S401: receiving second local model update parameters reported by a plurality of vehicles participating in training of a vehicle data model, wherein the second local model update parameter of each vehicle in the plurality of vehicles is a model update parameter with reduced parameter scale obtained by performing matrix estimation on a first local model update parameter of the vehicle, and wherein the first local model update parameter is a model update parameter obtained by the vehicle training a local model thereof based on local data thereof; and
step S402: integrating the second local model update parameters reported by the plurality of vehicles, and feeding the integrated update parameters back to each of the plurality of vehicles.
According to some embodiments, integrating the second local model update parameters reported by the plurality of vehicles comprises: performing a weighted summation of the second local model update parameters reported by the vehicles, where the weight coefficient corresponding to each vehicle can be set according to the actual situation, without limitation here.
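A minimal sketch of this integration step, with uniform weights as the default; weighting, for example, by each vehicle's amount of local data is a plausible choice but an assumption of this example.

```python
import numpy as np

def integrate_updates(updates, weights=None):
    """Weighted summation of the second local model update parameters
    reported by the participating vehicles; uniform weights by default."""
    if weights is None:
        weights = [1.0 / len(updates)] * len(updates)
    return sum(w * u for w, u in zip(weights, updates))
```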
According to some embodiments, if the second local model update parameters reported by the vehicles are compressed update parameters, the received update parameters in compressed form may be decompressed before the update parameters are integrated.
According to some embodiments, if the second local model update parameters reported by the vehicles have additionally undergone random rotation processing, the received update parameters may be subjected to inverse rotation processing before the update parameters are integrated. According to some embodiments, the product of a Walsh-Hadamard matrix and a binary diagonal matrix may be used as a structured rotation matrix, and the inverse rotation may be performed based on this structured rotation matrix to reduce the computational cost.
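A minimal sketch of such a structured rotation and its inverse, assuming the dimension is a power of two and the seed of the binary diagonal is shared between vehicle and server; with the fast Walsh-Hadamard transform below, both directions cost O(d log d) rather than the O(d^2) of a dense random orthogonal matrix.

```python
import numpy as np

def fwht(x: np.ndarray) -> np.ndarray:
    """Fast Walsh-Hadamard transform; len(x) must be a power of two."""
    x = x.copy()
    h = 1
    while h < len(x):
        for i in range(0, len(x), 2 * h):
            a, b = x[i:i + h].copy(), x[i + h:i + 2 * h].copy()
            x[i:i + h], x[i + h:i + 2 * h] = a + b, a - b
        h *= 2
    return x

def structured_rotate(v: np.ndarray, seed: int) -> np.ndarray:
    """Apply R = H D / sqrt(d): H Walsh-Hadamard, D a random +/-1 diagonal."""
    rng = np.random.default_rng(seed)
    D = rng.choice([-1.0, 1.0], size=v.size)
    return fwht(D * v) / np.sqrt(v.size)

def structured_unrotate(y: np.ndarray, seed: int) -> np.ndarray:
    """Server side: R^-1 = D H / sqrt(d), regenerating D from the seed."""
    rng = np.random.default_rng(seed)
    D = rng.choice([-1.0, 1.0], size=y.size)
    return D * fwht(y) / np.sqrt(y.size)
```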
According to some embodiments, as can be seen in conjunction with fig. 3, the integrated update parameters may also be verified to determine their accuracy before being fed back to each vehicle. The verification may be performed in various ways known in the related art, which is not limited here.
FIG. 5 is a block diagram illustrating an apparatus for training of a vehicle data model according to an exemplary embodiment. The apparatus 500 for training of a vehicle data model according to this exemplary embodiment may include:
a data acquisition module 501 configured to acquire local data of a vehicle;
a local training module 502 configured to train a local model of the vehicle according to the acquired local data of the vehicle, so as to obtain a first local model update parameter of the vehicle;
a matrix estimation module 503 configured to perform matrix estimation on the first local model update parameter to obtain a second local model update parameter with a reduced parameter size; and
a parameter sending module 504, configured to send the second local model update parameter to a server, so that the server integrates the update parameter based on the second local model update parameter reported by each vehicle, and feeds back the integrated update parameter to each vehicle.
It can be understood that the foregoing description of the method steps in conjunction with fig. 2 and fig. 3 is applicable to the unit or module in fig. 5 for executing the corresponding method steps, and is not repeated here.
FIG. 6 is a block diagram illustrating another apparatus for training of a vehicle data model according to an exemplary embodiment. The apparatus 600 for training of a vehicle data model according to this exemplary embodiment may include:
a receiving module 601 configured to receive second local model update parameters reported by a plurality of vehicles participating in training of a vehicle data model, wherein the second local model update parameter of each vehicle in the plurality of vehicles is a model update parameter with a reduced parameter scale obtained by performing matrix estimation on a first local model update parameter of the vehicle, and wherein the first local model update parameter is a model update parameter obtained by the vehicle training its local model based on its local data;
an integration module 602 configured to integrate the second local model update parameters reported by the plurality of vehicles; and
a transmitting module 603 configured to feed back the integrated update parameters to each of the plurality of vehicles.
It is understood that the foregoing description of the method steps in conjunction with fig. 4 is applicable to the unit or module in fig. 6 for executing the corresponding method steps, and is not repeated here.
Additionally, while particular functionality is discussed above with reference to particular modules, it should be noted that the functionality of the various modules discussed herein can be separated into multiple modules and/or at least some of the functionality of multiple modules can be combined into a single module. Performing an action by a particular module discussed herein includes the particular module itself performing the action, or alternatively the particular module invoking or otherwise accessing another component or module that performs the action (or performs the action in conjunction with the particular module). Thus, a particular module that performs an action can include the particular module that performs the action itself and/or another module that the particular module invokes or otherwise accesses that performs the action.
More generally, various techniques may be described herein in the general context of software hardware elements or program modules. The various modules described above with respect to fig. 5 and 6 may be implemented in hardware or in hardware in combination with software and/or firmware. For example, the modules may be implemented as computer program code/instructions configured to be executed in one or more processors and stored in a computer-readable storage medium. Alternatively, the modules may be implemented as hardware logic/circuitry. For example, in some embodiments, one or more of the data acquisition module 501, the local training module 502, the matrix estimation module 503, and the parameter sending module 504 may be implemented together in a system on a chip (SoC). The SoC may include an integrated circuit chip including one or more components of a processor (e.g., a central processing unit (CPU), microcontroller, microprocessor, digital signal processor (DSP), etc.), memory, one or more communication interfaces, and/or other circuitry, and may optionally execute received program code and/or include embedded firmware to perform functions.
In accordance with another aspect of the present disclosure, an apparatus for training of a vehicle data model is provided. The device includes: a processor, and a memory storing a program. The program includes instructions that when executed by a processor cause the processor to perform a first method of training a vehicle data model as described in this disclosure, such as that shown in fig. 2.
According to another aspect of the present disclosure, a vehicle is provided. The vehicle comprises the device for training the vehicle data model disclosed by the disclosure.
According to another aspect of the present disclosure, another apparatus for training of a vehicle data model is provided. The device comprises: a processor, and a memory storing a program. The program includes instructions that when executed by a processor cause the processor to perform a second method of training a vehicle data model as described in this disclosure (e.g., as shown in fig. 4).
According to another aspect of the present disclosure, a non-transitory computer-readable storage medium storing a program is provided. The program includes instructions that, when executed by one or more processors, cause the one or more processors to perform the training methods of the various vehicle data models described in the present disclosure.
According to another aspect of the present disclosure, there is also provided a computer program product comprising program code portions for performing the method of training various vehicle data models described in the present disclosure, when the computer program product is run on one or more computing devices. The computer program product may be stored on a computer readable storage medium.
Fig. 7 shows a schematic diagram of an application scenario including a motor vehicle 2010 and a communication and control system for the motor vehicle 2010. It is noted that the structure and function of the vehicle 2010 shown in fig. 7 is only one example, and the vehicle of the present disclosure may include one or more of the structure and function of the vehicle 2010 shown in fig. 7 according to a specific implementation form. According to some embodiments, the vehicle 2010 may be the vehicle described above with respect to fig. 2, 3, etc.
In addition, the motor vehicle 2010 includes a powertrain, a steering system, a brake system, and the like, which are not shown in fig. 7, for implementing a motor vehicle driving function.
Referring to fig. 8, a block diagram of the structure of a computing device 2000 that can serve as a server of the present disclosure will now be described.
Referring to fig. 8, computing device 2000 may be any machine configured to perform processing and/or computing, and may be, but is not limited to, a workstation, a server, a desktop computer, a laptop computer, a tablet computer, a personal digital assistant, a smart phone, an on-board computer, or any combination thereof. The apparatus 600 shown in fig. 6 described above may be implemented in whole or at least in part by a computing device 2000 or similar device or system.
The computing device 2000 may also include a working memory 2014, which may be any type of working memory that can store programs (including instructions) and/or data useful for the operation of the processor 2004, and may include, but is not limited to, random access memory and/or read only memory devices.
Software elements (programs) may reside in the working memory 2014, including, but not limited to, an operating system 2016, one or more application programs 2018, drivers, and/or other data and code. Instructions for performing the above-described methods and steps may be included in the one or more application programs 2018, and the sub-modules of the above-described apparatus 600, including the receiving module 601, the integration module 602, and the transmitting module 603, may be implemented by the processor 2004 reading and executing the instructions of the one or more application programs 2018. Executable code or source code of the instructions of the software elements (programs) may be stored in a non-transitory computer-readable storage medium (such as the storage device 2010 described above) and, upon execution, may be stored (possibly after compilation and/or installation) in the working memory 2014. Executable code or source code of the instructions of the software elements (programs) may also be downloaded from a remote location.
It will also be appreciated that various modifications may be made according to specific requirements. For example, customized hardware might also be used and/or particular elements might be implemented in hardware, software, firmware, middleware, microcode, hardware description languages, or any combination thereof. For example, some or all of the disclosed methods and apparatus may be implemented by programming hardware (e.g., programmable logic circuitry including field programmable gate arrays (FPGAs) and/or programmable logic arrays (PLAs)) in an assembly language or a hardware programming language such as VERILOG, VHDL, or C++, using logic and algorithms in accordance with the present disclosure.
It should also be understood that the foregoing method may be implemented in a server-client mode. For example, a client may receive data input by a user and send the data to a server. The client may also receive data input by the user, perform part of the processing in the foregoing method, and transmit the data obtained by the processing to the server. The server may receive data from the client and perform the aforementioned method or another part of the aforementioned method and return the results of the execution to the client. The client may receive the results of the execution of the method from the server and may present them to the user, for example, through an output device.
It should also be understood that the components of computing device 2000 may be distributed across a network. For example, some processes may be performed using one processor while other processes may be performed by another processor that is remote from the one processor. Other components of computing device 2000 may also be similarly distributed. As such, the computing device 2000 may be interpreted as a distributed computing system that performs processing at multiple locations.
Although embodiments or examples of the present disclosure have been described with reference to the accompanying drawings, it is to be understood that the above-described methods, systems and apparatus are merely exemplary embodiments or examples and that the scope of the present invention is not limited by these embodiments or examples, but only by the claims as issued and their equivalents. Various elements in the embodiments or examples may be omitted or may be replaced with equivalents thereof. Further, the steps may be performed in an order different from that described in the present disclosure. Further, the various elements in the embodiments or examples may be combined in various ways. As the technology evolves, many of the elements described herein may be replaced with equivalent elements that appear after the present disclosure.
Claims (19)
1. A method of training a vehicle data model, comprising:
acquiring local data of a vehicle;
training a local model of the vehicle on the acquired local data of the vehicle to obtain a first local model update parameter of the vehicle;
performing matrix estimation on the first local model update parameter to obtain a second local model update parameter of reduced parameter scale; and
sending the second local model update parameter to a server, so that the server integrates the second local model update parameters reported by each vehicle and feeds the integrated update parameters back to each vehicle.
2. The method of claim 1, wherein the local data of the vehicle comprises one or more of vehicle driving behavior data, vehicle driving status data, and vehicle fault data.
3. The method of claim 1, wherein the local model of the vehicle is pre-acquired by the vehicle from the server or pre-deployed by the server for the vehicle.
4. The method of any one of claims 1-3, wherein performing matrix estimation on the first local model update parameter to obtain a second local model update parameter of reduced parameter scale comprises:
decomposing the first local model update parameter into a product of two low-order matrices based on a matrix decomposition algorithm; and
using, as the second local model update parameter of reduced parameter scale, the one of the two low-order matrices whose parameters are updated during local model training.
5. The method of any one of claims 1-3, wherein performing matrix estimation on the first local model update parameter to obtain a second local model update parameter of reduced parameter scale comprises:
based on a random mask mode, constraining the first local model updating parameter to be a random sparse matrix; and
using the updated non-zero elements of the random sparse matrix, together with the seed used to generate the random sparse matrix, as the second local model update parameter of reduced parameter scale.
6. The method of any of claims 1-3, wherein prior to sending the second local model update parameter to a server, the method further comprises:
compressing the second local model update parameters.
7. The method of claim 6, wherein compressing the second local model update parameters comprises:
compressing the second local model update parameter based on probabilistic quantization.
8. The method of claim 6, wherein prior to compressing the second local model update parameters, the method further comprises:
performing random rotation processing on the second local model update parameter.
9. The method of claim 8, wherein randomly rotating the second local model update parameters comprises:
multiplying the second local model update parameter by a random orthogonal matrix to implement the random rotation processing of the second local model update parameter.
10. The method of any of claims 1-3, further comprising:
updating the local model based on the integrated update parameters from the server;
determining whether a model training end condition of the vehicle data model has been satisfied;
in response to the model training end condition not being met, retraining the updated local model on the local data to obtain a new second local model update parameter, and sending the new second local model update parameter to the server for update parameter integration; and
in response to the model training end condition being met, using the updated local model as the vehicle data model required by the vehicle.
11. The method of claim 10, wherein the model training end condition comprises one or more of a number of iterations of the local model reaching a set number threshold and a loss function convergence of the local model.
12. A method of training a vehicle data model, comprising:
receiving second local model update parameters reported by a plurality of vehicles participating in training of a vehicle data model, wherein the second local model update parameter of each vehicle in the plurality of vehicles is a model update parameter with reduced parameter scale obtained by performing matrix estimation on a first local model update parameter of the vehicle, and wherein the first local model update parameter is a model update parameter obtained by the vehicle training a local model thereof based on local data thereof; and
integrating the second local model update parameters reported by the plurality of vehicles and feeding the integrated update parameters back to each of the plurality of vehicles.
13. An apparatus for training of a vehicle data model, comprising:
a data acquisition module configured to acquire local data of a vehicle;
a local training module configured to train a local model of the vehicle on the acquired local data of the vehicle to obtain a first local model update parameter of the vehicle;
a matrix estimation module configured to perform matrix estimation on the first local model update parameter to obtain a second local model update parameter of reduced parameter scale; and
a parameter sending module configured to send the second local model update parameter to a server, so that the server integrates the second local model update parameters reported by each vehicle and feeds the integrated update parameters back to each vehicle.
14. An apparatus for training of a vehicle data model, comprising:
a receiving module configured to receive second local model update parameters reported by a plurality of vehicles participating in training of a vehicle data model, wherein the second local model update parameter of each of the plurality of vehicles is a model update parameter with a reduced parameter scale obtained by performing matrix estimation on a first local model update parameter of the vehicle, and wherein the first local model update parameter is a model update parameter obtained by the vehicle training its local model based on its local data;
an integration module configured to integrate the second local model update parameters reported by the plurality of vehicles; and
a transmitting module configured to feed back the integrated update parameters to each of the plurality of vehicles.
15. An apparatus for training of a vehicle data model, comprising:
a processor, and
a memory storing a program comprising instructions that, when executed by the processor, cause the processor to perform the method of any of claims 1 to 11.
16. A vehicle, comprising:
the apparatus of claim 15.
17. An apparatus for training of a vehicle data model, comprising:
a processor, and
a memory storing a program comprising instructions that, when executed by the processor, cause the processor to perform the method of claim 12.
18. A non-transitory computer-readable storage medium storing a program, the program comprising instructions that when executed by one or more processors cause the one or more processors to perform the method of any of claims 1-12.
19. A computer program product comprising program code portions for performing the method of any one of claims 1-12 when the computer program product is run on one or more computing devices.
Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
---|---|---|---
CN202110388981.1A | 2021-04-12 | 2021-04-12 | Training method, device, vehicle, medium and program product for vehicle data model
Publications (1)
Publication Number | Publication Date |
---|---|
CN115204026A (en) | 2022-10-18
Family
ID=83571579
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110388981.1A Pending CN115204026A (en) | 2021-04-12 | 2021-04-12 | Training method, device, vehicle, medium and program product for vehicle data model |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115204026A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116580448A (en) * | 2023-04-11 | 2023-08-11 | 深圳市大数据研究院 | Behavior prediction method, system, electronic equipment and storage medium |
CN116580448B (en) * | 2023-04-11 | 2024-04-16 | 深圳市大数据研究院 | Behavior prediction method, system, electronic equipment and storage medium |
Legal Events

Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |