CN116991101A - Algorithm model loading method and device, electronic equipment and storage medium - Google Patents

Algorithm model loading method and device, electronic equipment and storage medium

Info

Publication number
CN116991101A
CN116991101A
Authority
CN
China
Prior art keywords
model
loading
algorithm
control unit
unit
Prior art date
Legal status
Pending
Application number
CN202310952654.3A
Other languages
Chinese (zh)
Inventor
陈长贵
陈旭
Current Assignee
Faw Nanjing Technology Development Co ltd
FAW Group Corp
Original Assignee
Faw Nanjing Technology Development Co ltd
FAW Group Corp
Priority date
Filing date
Publication date
Application filed by Faw Nanjing Technology Development Co ltd, FAW Group Corp filed Critical Faw Nanjing Technology Development Co ltd
Priority to CN202310952654.3A
Publication of CN116991101A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00: Arrangements for program control, e.g. control units
    • G06F 9/06: Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44: Arrangements for executing specific programs
    • G06F 9/445: Program loading or initiating
    • G06F 9/44557: Code layout in executable memory
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 19/00: Programme-control systems
    • G05B 19/02: Programme-control systems electric
    • G05B 19/04: Programme control other than numerical control, i.e. in sequence controllers or logic controllers
    • G05B 19/042: Programme control using digital processors
    • G05B 19/0423: Input/output
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 2219/00: Program-control systems
    • G05B 2219/20: Pc systems
    • G05B 2219/25: Pc structure of the system
    • G05B 2219/25257: Microcontroller

Landscapes

  • Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Stored Programmes (AREA)

Abstract

The invention discloses an algorithm model loading method, an algorithm model loading device, an electronic device and a storage medium, relating to computer technology. The algorithm model loading method comprises the following steps: a service control unit sends a model loading instruction, which includes a target algorithm name, to a data loading unit according to the current service requirement; and the data loading unit loads the corresponding target algorithm model from a data storage unit into DRAM according to the target algorithm name, so that the service control unit can process the target algorithm model. In the invention, the service control unit and the data loading unit communicate across CPU cores, and the data loading unit loads the target algorithm model from the data storage unit into the DRAM, so the target model can be responded to and processed quickly in the DRAM, and every algorithm model stored in the DRAM is one that was loaded to satisfy an actual service requirement.

Description

Algorithm model loading method and device, electronic equipment and storage medium
Technical Field
The present invention relates to the field of computer technologies, and in particular, to an algorithm model loading method, an algorithm model loading device, an electronic device, and a storage medium.
Background
As service scenarios grow more complex (for example, autonomous driving), scenario implementations built on AI (Artificial Intelligence) algorithms involve a very large number of algorithm models, and loading and running these models in an orderly way is an important problem to be solved.
In the existing approach to loading a large number of algorithm models, their loading order is configured in advance according to the service implementation flow; at run time, the service system controls the model loading unit to load all of the algorithm models from the model storage unit into dynamic random access memory (Dynamic Random Access Memory, DRAM) in the pre-configured order. For sub-services that contain judgment logic, every algorithm model associated with that judgment logic is loaded into the DRAM, and the target algorithm model is then selected in the DRAM according to the actual judgment result.
Loading algorithm models in such a fixed, pre-configured order loads unneeded models into the DRAM and therefore wastes DRAM memory space. Worse, when the memory required by all the algorithm models exceeds the DRAM capacity, not all of them can be loaded, and some algorithm models cannot be run at all.
Disclosure of Invention
The invention provides an algorithm model loading method, an algorithm model loading device, electronic equipment and a storage medium, which can improve the existing scheme for loading an algorithm model.
In a first aspect, the present invention provides an algorithm model loading method, applied to an algorithm model loading system, where the system includes a service control unit, a data loading unit, and a data storage unit, where the service control unit and the data loading unit are respectively disposed in different CPU cores, and the method includes:
the service control unit sends a model loading instruction to the data loading unit according to the current service demand, wherein the model loading instruction comprises a target algorithm name;
and the data loading unit loads the corresponding target algorithm model from the data storage unit into the DRAM according to the target algorithm name so that the service control unit processes the target algorithm model.
Optionally, a model loading linked list is stored in the DRAM, the model loading linked list includes at least two algorithm names, and each algorithm name is associated according to a service implementation logic sequence;
the service control unit sends a model loading instruction to the data loading unit according to the current service demand, and the method comprises the following steps:
The service control unit determines a target algorithm name according to the current service requirement and the model loading linked list;
and the service control unit generates a model loading instruction according to the target algorithm name and sends the model loading instruction to the data loading unit.
Optionally, while the algorithm models corresponding to at least two algorithm names contained in the model loading linked list have not yet been loaded, the method further includes:
receiving an update instruction for the logical sequence among the unloaded algorithm names in the model loading linked list, wherein the update instruction includes at least one of a modification instruction, a deletion instruction and an addition instruction.
Optionally, the algorithm model loading system further comprises a model calculation unit;
the service control unit processes the target algorithm model, and the processing includes the following steps:
obtaining the model category of the target algorithm model;
when the model category is the judgment category, repeating the operation in which the service control unit sends a model loading instruction to the data loading unit according to the current service requirement;
and when the model category is not the judgment category, the service control unit controls the model calculation unit to calculate the target algorithm model to obtain a model calculation result.
Optionally, the method further comprises:
after the service control unit receives the loading-completion identifier for the target algorithm model sent by the data loading unit, the step in which the service control unit controls the model calculation unit to calculate the target algorithm model is executed.
Optionally, the service control unit sends a model loading instruction to the data loading unit according to the current service requirement, including:
when the interval distance between the vehicle and the lane stop line is smaller than the preset distance, acquiring a weather type, wherein the weather type comprises a sunny day and a rainy day;
when the weather type is rainy days, the service control unit sends a loading instruction for loading a rainy day signal lamp model to the data loading unit;
and when the weather type is sunny, the service control unit sends a loading instruction for loading the sunny signal lamp model to the data loading unit.
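The rainy/sunny branch above amounts to a small decision function. The following sketch is illustrative only: the function name, the model-name strings, and the 50 m preset distance are assumptions, not values taken from the patent.

```python
def choose_signal_lamp_model(distance_to_stop_line_m, weather, preset_distance_m=50.0):
    """Return the name of the signal lamp model to request once the vehicle is
    within the preset distance of the lane stop line; weather is 'rainy' or 'sunny'.
    All names and the default threshold are hypothetical."""
    if distance_to_stop_line_m >= preset_distance_m:
        return None  # not close enough yet: no loading instruction is sent
    if weather == "rainy":
        return "rainy-day signal lamp model"
    return "sunny-day signal lamp model"
```

Under this sketch, the service control unit would wrap the returned name into a model loading instruction only when the function returns a non-None value.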
Optionally, the method further comprises:
the service control unit obtains a logic relationship between each algorithm name according to the model loading linked list;
when the logical relationship among the algorithm names does not contain a judgment category, the service control unit sends a model group loading instruction to the data loading unit, wherein the model group loading instruction includes at least two algorithm names;
And the data loading unit loads at least two corresponding algorithm models from the data storage unit into the DRAM according to at least two algorithm names.
In a second aspect, the present invention provides an algorithm model loading device, integrated in an algorithm model loading system, where the system includes a service control unit, a data loading unit, and a data storage unit, where the service control unit and the data loading unit are respectively disposed in different CPU cores, and the device includes:
the instruction sending module is used for sending a model loading instruction to the data loading unit according to the current service requirement by the service control unit, wherein the model loading instruction comprises a target algorithm name;
and the model loading module is used for loading the corresponding target algorithm model from the data storage unit into the DRAM by the data loading unit according to the target algorithm name so that the business control unit processes the target algorithm model.
In a third aspect, the present invention also provides an electronic device, including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the algorithm model loading method of any one of the embodiments of the present invention.
In a fourth aspect, the present invention further provides a computer readable storage medium, where computer instructions are stored, where the computer instructions are configured to cause a processor to implement the algorithm model loading method according to any embodiment of the present invention when executed.
The algorithm model loading scheme of the embodiment of the invention is applied to an algorithm model loading system, the system comprises a service control unit, a data loading unit and a data storage unit, the service control unit and the data loading unit are respectively arranged in different CPU (central processing unit) cores, specifically, the service control unit sends a model loading instruction to the data loading unit according to the current service requirement, and the model loading instruction comprises a target algorithm name; and the data loading unit loads the corresponding target algorithm model from the data storage unit into the DRAM according to the target algorithm name so that the service control unit processes the target algorithm model. According to the scheme provided by the embodiment, the business control unit and the data loading unit are in inter-core communication, and the target algorithm model is loaded from the data storage unit to the DRAM through the data loading unit, so that the target model is rapidly responded and processed in the DRAM, and the algorithm model stored in the DRAM is the algorithm model which is loaded according to business requirements.
It should be understood that the description in this section is not intended to identify key or critical features of the embodiments of the invention or to delineate the scope of the invention. Other features of the present invention will become apparent from the description that follows.
Drawings
In order to more clearly illustrate the technical solutions of the present invention, the drawings that are needed in the embodiments will be briefly described below, it being understood that the following drawings only illustrate some embodiments of the present invention and should not be considered as limiting the scope, and that other related drawings can be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic flow chart of an algorithm model loading method provided by the invention;
FIG. 2 is a schematic diagram of an algorithm model loading system according to the present invention;
FIG. 3 is another flow chart of the algorithm model loading method provided by the invention;
FIG. 4 is another schematic diagram of an algorithm model loading system provided by the present invention;
FIG. 5 is a schematic diagram of an algorithm model loading device according to the present invention;
fig. 6 is a schematic structural diagram of an electronic device according to the present invention.
Detailed Description
In order that those skilled in the art will better understand the present invention, a detailed description of the same will be given below with reference to the accompanying drawings in this embodiment, and it is apparent that the described embodiment is only a partial embodiment of the present invention, not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the present invention without making any inventive effort, shall fall within the scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present invention and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the invention described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
The invention is described in further detail below with reference to the drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting thereof. It should be further noted that, for convenience of description, only some, but not all of the structures related to the present invention are shown in the drawings.
Fig. 1 is a schematic flow chart of an algorithm model loading method provided by the invention, and the embodiment can be applied to the situation of loading a large number of algorithm models in a service scene. The method may be performed by an algorithm model loading apparatus, which may be implemented in hardware and/or software, and which may be configured in a computer device such as a server. The algorithm model loading method provided by the embodiment is applied to an algorithm model loading system. Specifically, referring to fig. 1, the method may include the steps of:
s110, the service control unit sends a model loading instruction to the data loading unit according to the current service requirement, wherein the model loading instruction comprises a target algorithm name.
First, referring to fig. 2, fig. 2 is a schematic structural diagram of an algorithm model loading system provided by the present invention, where the algorithm model loading system provided by the present embodiment includes a service control unit, a data loading unit and a data storage unit. The business control unit and the data loading unit are respectively arranged in different CPU cores, the business control unit and the data loading unit are in inter-core communication, the data loading unit and the data storage unit are in bus addressing communication or serial port communication, and the data loading unit and the business control unit are respectively in memory bus addressing communication with the DRAM.
The service control unit stores a service control program, based on which its control functions are implemented; the data loading unit stores a data loading program, based on which its loading function is implemented; the dynamic random access memory (DRAM) stores the algorithm models loaded by the model loading unit so that the loaded models can run; and the data storage unit, which may be implemented with a nonvolatile storage device (e.g. a magnetic disk), stores the large set of algorithm models required to implement the service scenario. Taking an autopilot scenario as an example, the models involved may include a driving road condition determination model, a signal lamp state determination model, a vehicle acceleration/deceleration control model, a weather determination model, and so on. The algorithm model loading scheme provided in this embodiment is not limited to the autopilot scenario, and the algorithm models required for the autopilot scenario are not limited to the above examples.
The service requirement indicates the current need in the actual service application. Taking an autonomous driving scenario as an example, the current service requirement is determined from the real-time road scene data received while the vehicle is running, and may be, for example, a braking requirement, a parking requirement, a lane-changing requirement, or a signal-lamp waiting requirement. Taking an e-commerce scenario as an example, the current service requirement is determined from a control instruction sent by the user, and may be, for example, a requirement to search for a target commodity, filter commodities, check an order, or place an order. Each service requirement corresponds to one algorithm model.
The model loading instruction is used for instructing the data loading unit to load the target model so as to realize the current business requirement by loading the target model. Wherein the model load instruction includes a target algorithm name. Specifically, each algorithm name may be named according to a corresponding service requirement, and taking the above autopilot service scenario as an example, the corresponding algorithm names may be a braking algorithm, a parking algorithm, a lane-changing algorithm, a parking waiting algorithm, and the like; alternatively, each algorithm name may be identified in a combination of numbers, letters and symbols according to a coding manner, and the specific identification manner of each algorithm name is not limited herein, so long as a unique corresponding target algorithm model can be obtained in a subsequent step according to the current algorithm name.
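As a concrete, non-normative sketch of step S110, the instruction can be modeled as a small message carrying only the target algorithm name. The names `ModelLoadInstruction` and `send_load_instruction`, the requirement-to-algorithm table, and the list standing in for the inter-core channel are all illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass
class ModelLoadInstruction:
    target_algorithm_name: str  # the only payload step S110 requires

def send_load_instruction(requirement_to_algorithm, current_requirement, channel):
    """Map the current service requirement to its algorithm name and enqueue
    the model loading instruction on the inter-core channel (here: a list)."""
    name = requirement_to_algorithm[current_requirement]
    instruction = ModelLoadInstruction(target_algorithm_name=name)
    channel.append(instruction)  # stands in for inter-core messaging
    return instruction
```

In a real system the channel would be the inter-core communication mechanism between the two CPU cores rather than an in-process list.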
And S120, the data loading unit loads the corresponding target algorithm model from the data storage unit into the DRAM according to the target algorithm name, so that the service control unit processes the target algorithm model.
The data storage unit stores at least one algorithm model, and each algorithm model includes the program code implementing the corresponding model together with its algorithm name. Optionally, a model-code mapping table may be maintained when the algorithm models are stored in the data storage unit; the target algorithm model corresponding to the target algorithm name can then be found through the model-code mapping table, and the data loading unit stores the target algorithm model found in the data storage unit into the DRAM.
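Step S120 with the optional model-code mapping table might be sketched as follows. The function name `load_model_to_dram`, the dictionary stand-ins for the storage unit and the DRAM, and the `"/loaded"` flag key are assumptions for illustration only:

```python
def load_model_to_dram(model_code_map, storage, dram, target_name):
    """Resolve the target algorithm name through the model-code mapping table,
    copy the model from nonvolatile storage into DRAM, and set the
    loading-completion identifier that the service control unit checks."""
    code = model_code_map.get(target_name)
    if code is None or code not in storage:
        return False  # unknown name or missing model: nothing loaded
    dram[target_name] = storage[code]
    dram[target_name + "/loaded"] = True  # loading-completion identifier
    return True
```

The returned boolean mirrors the success/failure of the load; the identifier entry is what lets the service control unit, sharing the same DRAM, detect completion.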
In this scheme, the service control program of the service control unit runs in one processor core and the model loading program of the data loading unit runs in another, while the two units share the DRAM; once the service control unit determines that the target algorithm model has finished loading into the DRAM, it can process the loaded model. The scheme provided by this embodiment can instruct the data loading unit to load a specific algorithm model at any time according to the service requirement of the current scenario. Its advantages are that models are loaded dynamically, driven by the current service requirement, which makes loading more flexible and more efficient; and that every algorithm model stored in the DRAM is one actually used to implement a service, so no loaded model goes unused and DRAM utilization improves.
The service control unit may determine that the target algorithm model has finished loading as follows: a loading-completion identifier is generated after the target algorithm model is loaded; if the service control unit obtains this identifier, it determines that loading is complete and may proceed to process the target algorithm model; if the identifier has not been generated, the service control unit keeps waiting until it is. The loading-completion identifier may be indicated, for example, by an instruction code corresponding to "completion", or simply by the value "1"; its specific form is not limited here.
The service control unit may process the target algorithm model by deciding, based on the currently loaded target algorithm model, whether to continue loading the next algorithm model or to calculate the current one. For example, when the target algorithm model is a judgment model related to the current requirement, the algorithm model corresponding to the judgment result must still be loaded based on the actual judgment result, so the service control unit continues to send model loading instructions to the data loading unit; if the target algorithm model is not a judgment model, it needs to be calculated to obtain a model calculation result, from which, together with the current actual scene, the next algorithm model to load is determined. The specific way the service control unit processes the target algorithm model is not limited here.
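Combining the completion identifier with the two processing branches, the service control unit's side could look roughly like the sketch below; the function names, the `"category"` field, and the dict-based DRAM are illustrative assumptions rather than the patent's actual interfaces:

```python
def process_target_model(dram, name, request_next_load, compute):
    """Check the loading-completion identifier, then either request the next
    load (judgment category) or hand the model to the calculation unit."""
    if not dram.get(name + "/loaded"):
        return None  # identifier not generated yet: keep waiting
    model = dram[name]
    if model.get("category") == "judgment":
        # judgment model: the actual judgment result drives the next load
        return request_next_load(model)
    # any other category: run the model calculation unit
    return compute(model)
```

`request_next_load` and `compute` stand in for sending another model loading instruction and for invoking the model calculation unit, respectively.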
The algorithm model loading method provided by the embodiment is applied to an algorithm model loading system, the system comprises a service control unit, a data loading unit and a data storage unit, the service control unit and the data loading unit are respectively arranged in different CPU (central processing unit) cores, specifically, the service control unit sends a model loading instruction to the data loading unit according to the current service requirement, and the model loading instruction comprises a target algorithm name; and the data loading unit loads the corresponding target algorithm model from the data storage unit into the DRAM according to the target algorithm name so that the service control unit processes the target algorithm model. According to the scheme provided by the embodiment, the business control unit and the data loading unit are in inter-core communication, and the target algorithm model is loaded from the data storage unit to the DRAM through the data loading unit, so that the target model is rapidly responded and processed in the DRAM, and the algorithm model stored in the DRAM is the algorithm model which is loaded according to business requirements.
Fig. 3 is another flow chart of the algorithm model loading method provided by the present invention; this embodiment further refines the corresponding features of the embodiment above. As shown in fig. 3, the method may include the following steps:
s210, the service control unit determines the name of the target algorithm according to the current service demand and the model loading linked list.
A model loading linked list is stored in the DRAM; it contains at least two algorithm names, and the algorithm names are associated according to the service implementation logic sequence.
In this scheme, through the DRAM sharing mechanism, both the service control unit and the data loading unit can read the contents of the model loading linked list, in which the algorithm names are associated according to the service implementation logic sequence. For example, if the linked list contains model A, model B, model C and model D, and the implementation logic of the actual service requirement is model A, then model B, then model C or model D, the algorithm names can be pre-associated according to that logic sequence.
The service control unit may determine the target algorithm name from the current service requirement and the model loading linked list as follows: for example, if, while the vehicle is driving on a road, the pre-loaded driving map shows a signal lamp ahead, the current service requirement is to handle the signal lamp, and the target algorithm name can then be determined to be the signal lamp processing algorithm according to the vehicle driving logic sequence predetermined in the model loading linked list.
Optionally, the model load linked list may further include information such as an offset of each model, a memory size occupied by the model, whether the model is loaded, and the like, in addition to each algorithm name, and the information included in the specific model load linked list is not limited herein.
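A node of such a model loading linked list might carry the fields the text mentions (algorithm name, offset, occupied memory, loaded flag) plus successor links to express branching logic. This layout, including the class name `ModelListNode`, is an illustrative assumption:

```python
from dataclasses import dataclass, field

@dataclass
class ModelListNode:
    algorithm_name: str
    offset: int = 0            # model offset in the data storage unit
    size_bytes: int = 0        # DRAM memory the model will occupy
    loaded: bool = False       # whether the model has been loaded
    successors: list = field(default_factory=list)  # >1 successor = judgment branch

# The text's example chain: model A -> model B -> model C or model D
node_d = ModelListNode("model D")
node_c = ModelListNode("model C")
node_b = ModelListNode("model B", successors=[node_c, node_d])
node_a = ModelListNode("model A", successors=[node_b])
```

A node with two successors (model B here) is exactly where a judgment model decides which branch to load next.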
S211, the service control unit generates a model loading instruction according to the name of the target algorithm, and sends the model loading instruction to the data loading unit.
In one implementation of the solution provided in this embodiment, while the algorithm models corresponding to at least two algorithm names contained in the model loading linked list have not yet been loaded, the method further includes:
and receiving an update instruction of a logical sequence among the unloaded algorithm names in the model loading linked list, wherein the update instruction comprises at least one of a modification instruction, a deletion instruction and an addition instruction.
For the large set of algorithm models required by the service scenario, if the algorithm model corresponding to an algorithm name in the model loading linked list has not been loaded yet, the logical sequence among the unloaded algorithm names can be updated, so that the model loading linked list can be adjusted according to actual requirements.
And the update instruction comprises at least one of a modification instruction, a deletion instruction and an addition instruction, and the model loading linked list correspondingly modifies, deletes or adds the realization logic sequence among the pre-associated algorithm names according to the actual meaning indicated by the update instruction.
When the update instruction is an addition instruction, an algorithm model name needs to be newly added to the model loading linked list and logically associated; the data storage unit must therefore contain the algorithm model actually corresponding to the newly added name, so that the loading operation for that model can be carried out.
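The three update instructions, restricted to not-yet-loaded entries, can be sketched over a flat list of `{'name', 'loaded'}` records; the function name `apply_update` and the record layout are illustrative assumptions:

```python
def apply_update(chain, instruction, name, new_name=None, position=None):
    """Apply a modification/deletion/addition instruction to the model loading
    linked list; only entries whose model is not yet loaded may change."""
    if instruction == "add":
        # the data storage unit must actually contain the model for `name`
        idx = len(chain) if position is None else position
        chain.insert(idx, {"name": name, "loaded": False})
    elif instruction == "delete":
        # keep every loaded entry; drop only unloaded entries matching `name`
        chain[:] = [n for n in chain if n["loaded"] or n["name"] != name]
    elif instruction == "modify":
        for n in chain:
            if not n["loaded"] and n["name"] == name:
                n["name"] = new_name
    return chain
```

Note that already-loaded entries are deliberately immune to deletion and modification, matching the text's restriction to the unloaded part of the list.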
S220, the data loading unit loads the corresponding target algorithm model from the data storage unit into the DRAM according to the target algorithm name.
S230, obtaining model types of the target algorithm model.
The model categories provided in this embodiment may include the judgment category and other categories; the purpose of determining the category is to make it easy, in subsequent steps, to decide how the service control unit should process the target algorithm model.
The judgment category indicates that the target algorithm model embodies judgment logic, so the next algorithm model to load is determined from actual data; the other categories indicate concrete algorithm models that are calculated in subsequent steps to obtain calculation results.
Referring to fig. 4, fig. 4 is another schematic structural diagram of the algorithm model loading system provided by the present invention. The algorithm model loading system provided in this embodiment further includes a model calculation unit. The model calculation unit stores a model calculation program to realize the corresponding reasoning operation function of the model calculation unit based on the model calculation program.
The model calculation unit is communicated with the DRAM based on bus addressing, the model calculation unit is connected with the service control unit through a bus or a serial port, and the service control unit can control the model calculation unit to operate the algorithm model loaded in the DRAM so as to obtain an operation result of the model.
S231, determining whether the model class of the target algorithm model is a judging class.
If yes, the service control unit returns to execute step S210 according to the current service result.
If not, step S240 is performed.
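The branch in steps S231–S240 can be sketched as a small dispatch function. This is an assumed shape, not the patent's implementation: `run_judgment` stands in for executing a judgment-class model (whose output decides the next model to load, returning control to S210), and `run_compute` stands in for the model calculation unit computing any other class of model (S240).

```python
def process_model(model_class, run_judgment, run_compute):
    """Dispatch on the model class of the loaded target algorithm model."""
    if model_class == "judgment":
        # judgment logic: its result selects the next model, back to S210
        return ("load_next", run_judgment())
    # any other class: hand the model to the model calculation unit (S240)
    return ("computed", run_compute())
```

For example, a judgment-class weather model might return the name of the signal lamp model to load next, while a detector model simply yields its calculation result.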
In another alternative embodiment, the solution provided in this embodiment further includes: the service control unit obtains the logical relationship among the algorithm names according to the model loading linked list; when the logical relationship among the algorithm names does not contain a judgment class, the service control unit sends a model group loading instruction to the data loading unit, the model group loading instruction comprising at least two algorithm names; and the data loading unit loads the corresponding at least two algorithm models from the data storage unit into the DRAM according to the at least two algorithm names.
This embodiment may be understood as follows: if the logical relationship among the plurality of algorithm names includes a judgment class, the next algorithm model to be loaded must be determined from the actual judgment result of that judgment-class model. Conversely, when the logical relationship among the plurality of algorithm names contains no judgment-class model, the service control unit sends a model group loading instruction to the data loading unit, so that the data loading unit loads the algorithm models corresponding to the at least two algorithm names contained in the model group loading instruction into the DRAM. The advantage of this arrangement is that, when the logical order is unchanged, the service control unit does not need to send model loading instructions to the data loading unit multiple times, which improves model loading efficiency.
According to the scheme improved by this embodiment, the service control unit can flexibly decide, according to service requirements, whether to load a series of models as a group or to load the next algorithm model based on the operation result of the current one. Through the to-be-loaded models managed by the model loading linked list, the model loading program provides a universal, flexible, unified interface for the service, and can load a sequence of algorithm models as a group or load a single algorithm model independently.
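The group-loading decision described above can be sketched as follows, under the assumption that the pending part of the model loading linked list is available as `(name, model_class)` pairs; the function name and instruction tuples are illustrative, not from the patent.

```python
def build_load_instructions(pending):
    """pending: ordered (algorithm_name, model_class) pairs not yet loaded.
    Returns the loading instruction(s) the service control unit should send."""
    if any(cls == "judgment" for _, cls in pending):
        # a judgment model gates what comes next: load one model at a time
        return [("load", name) for name, _ in pending[:1]]
    # no branching logic in the chain: one model group loading instruction
    # carrying every algorithm name, sent to the data loading unit once
    return [("load_group", [name for name, _ in pending])]
```

When no judgment class appears, a single group instruction replaces several per-model instructions, matching the efficiency argument above.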
S240, after the service control unit receives the loading completion identification about the target algorithm model sent by the data loading unit, the service control unit controls the model calculation unit to calculate the target algorithm model.
The loading completion identification is an identification sent by the data loading unit indicating whether loading of the target algorithm model has completed. After the service control unit receives the loading completion identification for the target algorithm model, the model calculation unit may compute the target algorithm model in the DRAM.
As an example of an actual application scenario, the service control unit sending a model loading instruction to the data loading unit according to the current service requirement includes:
when the interval distance between the vehicle and the lane stop line is smaller than the preset distance, acquiring weather types, wherein the weather types comprise sunny days and rainy days; when the weather type is rainy days, the service control unit sends a loading instruction for loading the rainy day signal lamp model to the data loading unit; and when the weather type is sunny, the service control unit sends a loading instruction for loading the sunny signal lamp model to the data loading unit.
In this embodiment, loading the signal lamp model corresponding to the weather type is taken as the example. Suppose the signal lamp monitoring models in the current automatic driving scene comprise a rainy-day signal lamp model and a sunny-day signal lamp model. In the existing approach, models are loaded into the DRAM in a fixed order: both the rainy-day and sunny-day signal lamp models are loaded before it is known whether the current weather is rainy or sunny, and a weather judgment model is then used to determine the actual weather type while the vehicle is running. If the actual weather is determined to be rainy, the loaded rainy-day signal lamp model is computed and the sunny-day signal lamp model is never used. The memory required by the rainy-day and sunny-day signal lamp models is large, for example 5G, and the other algorithm models require a further 4G in total, while the DRAM provides only 8G of memory owing to project cost. Situations of insufficient memory therefore arise.
According to the scheme provided by this embodiment, the weather type can be acquired when the vehicle is within a preset distance (for example, 100 meters) of the signal lamp. The weather type is acquired as follows: the service control unit sends an instruction to the data loading unit to load the weather-type algorithm model; once the weather-type algorithm model is loaded, environment data obtained while the vehicle is running can be input to it to determine the target weather type of the current actual environment. When the target weather type is determined to be rainy, the service control unit sends a loading instruction for loading the rainy-day signal lamp model to the data loading unit; when the target weather type is determined to be sunny, the service control unit sends a loading instruction for loading the sunny-day signal lamp model to the data loading unit. In this example, the algorithm model can be loaded dynamically according to the service requirement, so that the corresponding algorithm model is available in both sunny and rainy weather, and the service control unit can flexibly control the data loading unit according to the current service requirement. Moreover, the process of controlling the data loading unit to load one algorithm model and the process of controlling the model calculation unit to compute another algorithm model can run in parallel, making full use of parallel processing by different system software on different cores of the multi-core system, thereby improving operation efficiency.
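The traffic-light scenario can be condensed into a short sketch. The preset distance, model names, and function shapes below are illustrative assumptions; `classify_weather` stands in for running the loaded weather-judgment model on environment data.

```python
PRESET_DISTANCE_M = 100   # assumed preset distance to the lane stop line

def select_load_instruction(distance_m, classify_weather, env_data):
    """Decide which model loading instruction the service control unit
    should send to the data loading unit, if any."""
    if distance_m >= PRESET_DISTANCE_M:
        return None                        # not yet near the stop line
    weather = classify_weather(env_data)   # weather-judgment model output
    if weather == "rainy":
        return ("load", "rainy_signal_model")
    return ("load", "sunny_signal_model")
```

Only one of the two signal lamp models is ever requested, which is exactly how the scheme avoids the insufficient-memory situation described above.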
According to the algorithm model loading method provided by this embodiment, the service control program in the service control unit runs in one processor core, the model loading program in the data loading unit runs in another processor core, the model loading linked list is stored in the shared DRAM, and the service control unit can instruct the data loading unit to load a specific model according to actual requirements, thereby achieving the effect of flexibly loading the algorithm models that are actually required. In addition, the logical relationship among the algorithm names is obtained through the model loading linked list, and when that logical relationship does not contain a judgment class, the service control unit sends a model group loading instruction to the data loading unit. The service control unit can thus flexibly decide, according to service requirements, whether to load a series of models as a group or to load the next algorithm model based on the operation result of the current one, which improves model loading efficiency.
Fig. 5 is a schematic structural diagram of an algorithm model loading device provided by the present invention, which is suitable for executing the algorithm model loading method provided in this embodiment. The device can be integrated in an algorithm model loading system; the system comprises a service control unit, a data loading unit and a data storage unit, the service control unit and the data loading unit being respectively arranged in different CPU cores. The device may specifically comprise an instruction sending module 310 and a model loading module 320, wherein:
The instruction sending module 310 is configured to send, by the service control unit, a model loading instruction to the data loading unit according to a current service requirement, where the model loading instruction includes a target algorithm name;
and the model loading module 320 is configured to load, by the data loading unit, a corresponding target algorithm model from the data storage unit into the DRAM according to the target algorithm name, so that the service control unit processes the target algorithm model.
The algorithm model loading device is applied to an algorithm model loading system; the system comprises a service control unit, a data loading unit and a data storage unit, the service control unit and the data loading unit being respectively arranged in different CPU cores. Specifically, the service control unit sends a model loading instruction to the data loading unit according to the current service requirement, the model loading instruction comprising a target algorithm name; and the data loading unit loads the corresponding target algorithm model from the data storage unit into the DRAM according to the target algorithm name, so that the service control unit processes the target algorithm model. According to the scheme provided by this embodiment, the service control unit and the data loading unit communicate between cores, and the data loading unit loads the target algorithm model from the data storage unit into the DRAM, so that the target model can be quickly responded to and processed in the DRAM, and the algorithm models stored in the DRAM are those loaded according to service requirements.
In one embodiment, a model loading linked list is stored in the DRAM, the model loading linked list including at least two algorithm names, each algorithm name being associated according to a service implementation logic order;
the instruction sending module 310 includes: a name determining unit and an instruction transmitting unit, wherein:
the name determining unit is used for determining a target algorithm name according to the current service requirement and the model loading linked list by the service control unit;
the instruction sending unit is used for generating a model loading instruction according to the target algorithm name by the service control unit and sending the model loading instruction to the data loading unit.
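The name-determining step performed by the service control unit can be sketched as picking the next unloaded entry from the model loading linked list (represented here, for brevity, as an ordered list of `(name, loaded)` pairs). The function name and data shape are assumptions for illustration only.

```python
def next_target_name(load_list):
    """Walk the model loading linked list in its logical order and return
    the first algorithm name whose model has not yet been loaded."""
    for name, loaded in load_list:
        if not loaded:
            return name
    return None                 # every model in the chain is already loaded
```

The returned name would then be wrapped into a model loading instruction and sent to the data loading unit.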
In one embodiment, the apparatus further comprises: an instruction receiving module, wherein:
the instruction receiving module is used for receiving an update instruction of the logical sequence among the unloaded algorithm names in the model loading chain table, wherein the update instruction comprises at least one of a modification instruction, a deletion instruction and an addition instruction.
In one embodiment, the algorithm model loading system further includes a model calculation unit; the apparatus further comprises a model processing module, wherein:
the model processing module is used for obtaining model types of the target algorithm model; when the model class is the judging class, repeating the operation that the service control unit sends a model loading instruction to the data loading unit according to the current service requirement; and when the model class is not the judgment class, the service control unit controls the model calculation unit to calculate the target algorithm model to obtain a model calculation result.
In an embodiment, the apparatus further comprises an identification receiving module, wherein:
and the identification receiving module is used for receiving the loading completion identification about the target algorithm model sent by the data loading unit by the service control unit.
In an embodiment, the instruction sending module 310 is specifically configured to obtain a weather type when the distance between the vehicle and the lane stop line is smaller than a preset distance, where the weather type includes a sunny day and a rainy day; when the weather type is rainy days, the service control unit sends a loading instruction for loading a rainy day signal lamp model to the data loading unit; and when the weather type is sunny, the service control unit sends a loading instruction for loading the sunny signal lamp model to the data loading unit.
In one embodiment, the apparatus further comprises: a relationship acquisition module, wherein:
the relation acquisition module is used for acquiring a logic relation between each algorithm name by the service control unit according to the model loading linked list;
the instruction sending module is further configured to send a model group loading instruction to the data loading unit when the logical relationship among the plurality of algorithm names does not include a judgment class, where the model group loading instruction includes at least two algorithm names;
And the model loading module is also used for loading at least two corresponding algorithm models from the data storage unit into the DRAM according to at least two algorithm names by the data loading unit.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional modules is illustrated, and in practical application, the above-described functional allocation may be performed by different functional modules according to needs, i.e. the internal structure of the apparatus is divided into different functional modules to perform all or part of the functions described above. The specific working process of the functional module described above may refer to the corresponding process in the foregoing method embodiment, and will not be described herein.
The invention also provides an electronic device, which comprises: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the algorithm model loading method according to any one of the embodiments of the present invention.
The present invention also provides a computer readable storage medium storing computer instructions for causing a processor to implement the algorithm model loading method according to any one of the embodiments of the present invention when executed.
Referring now to FIG. 6, there is illustrated a schematic diagram of a computer system 500 suitable for use in implementing the electronic device of the present invention. The electronic device shown in fig. 6 is only one example, and should not impose any limitation on the functions and the scope of use of the present embodiment.
As shown in fig. 6, the computer system 500 includes a Central Processing Unit (CPU) 501, which can perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 502 or a program loaded from a storage section 508 into a Random Access Memory (RAM) 503. In the RAM 503, various programs and data required for the operation of the system 500 are also stored. The CPU 501, ROM 502, and RAM 503 are connected to each other through a bus 504. An input/output (I/O) interface 505 is also connected to bus 504.
The following components are connected to the I/O interface 505: an input section 506 including a keyboard, a mouse, and the like; an output portion 507 including a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), and the like, and a speaker, and the like; a storage portion 508 including a hard disk and the like; and a communication section 509 including a network interface card such as a LAN card, a modem, or the like. The communication section 509 performs communication processing via a network such as the internet. The drive 510 is also connected to the I/O interface 505 as needed. A removable medium 511 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is mounted on the drive 510 as needed so that a computer program read therefrom is mounted into the storage section 508 as needed.
In particular, according to embodiments of the present disclosure, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method shown in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication portion 509, and/or installed from the removable media 511. The above-described functions defined in the system of the present invention are performed when the computer program is executed by a Central Processing Unit (CPU) 501.
The computer readable medium shown in the present invention may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present invention, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The modules and/or units involved in the present embodiment may be implemented by software, or may be implemented by hardware. The described modules and/or units may also be provided in a processor, e.g., may be described as: a processor includes an instruction issue module and a model loading module. The names of these modules do not constitute a limitation on the module itself in some cases.
As another aspect, the present invention also provides a computer-readable medium that may be contained in the apparatus described in the above embodiments; or may be present alone without being fitted into the device. The computer readable medium carries one or more programs which, when executed by a device, cause the device to include: the service control unit sends a model loading instruction to the data loading unit according to the current service demand, wherein the model loading instruction comprises a target algorithm name; and the data loading unit loads the corresponding target algorithm model from the data storage unit into the DRAM according to the target algorithm name so that the service control unit processes the target algorithm model.
According to the technical scheme of the embodiment, the business control unit and the data loading unit carry out inter-core communication, and the target algorithm model is loaded into the DRAM from the data storage unit through the data loading unit, so that the target model is rapidly responded and processed in the DRAM, and the algorithm model stored in the DRAM is the algorithm model which is loaded according to business requirements.
The above embodiments do not limit the scope of the present invention. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives can occur depending upon design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present invention should be included in the scope of the present invention.

Claims (10)

1. The algorithm model loading method is characterized by being applied to an algorithm model loading system, wherein the system comprises a service control unit, a data loading unit and a data storage unit, the service control unit and the data loading unit are respectively arranged in different CPU cores, and the method comprises the following steps:
the service control unit sends a model loading instruction to the data loading unit according to the current service demand, wherein the model loading instruction comprises a target algorithm name;
and the data loading unit loads the corresponding target algorithm model from the data storage unit into the DRAM according to the target algorithm name so that the service control unit processes the target algorithm model.
2. The algorithm model loading method according to claim 1, wherein a model loading linked list is stored in the DRAM, the model loading linked list including at least two algorithm names, each algorithm name being associated according to a service implementation logic order;
The service control unit sends a model loading instruction to the data loading unit according to the current service demand, and the method comprises the following steps:
the service control unit determines a target algorithm name according to the current service requirement and the model loading linked list;
and the service control unit generates a model loading instruction according to the target algorithm name and sends the model loading instruction to the data loading unit.
3. The algorithm model loading method according to claim 2, wherein before the algorithm models corresponding to at least two algorithm names contained in the model loading linked list are not loaded, the method further comprises:
and receiving an update instruction of a logical sequence among unloaded algorithm names in the model loading chain table, wherein the update instruction comprises at least one of a modification instruction, a deletion instruction and an addition instruction.
4. The algorithm model loading method according to claim 1, wherein the algorithm model loading system further comprises a model calculation unit;
the service control unit processes the target algorithm model, and the service control unit comprises the following steps:
obtaining model categories of the target algorithm model;
when the model class is the judging class, repeating the operation that the service control unit sends a model loading instruction to the data loading unit according to the current service requirement;
And when the model class is not the judgment class, the service control unit controls the model calculation unit to calculate the target algorithm model to obtain a model calculation result.
5. The algorithm model loading method according to claim 4, wherein the method further comprises:
after the service control unit receives the loading completion identification about the target algorithm model sent by the data loading unit, the service control unit is executed to control the model calculation unit to calculate the target algorithm model.
6. The algorithm model loading method according to claim 1, wherein the service control unit sends a model loading instruction to the data loading unit according to the current service requirement, comprising:
when the interval distance between the vehicle and the lane stop line is smaller than the preset distance, acquiring a weather type, wherein the weather type comprises a sunny day and a rainy day;
when the weather type is rainy days, the service control unit sends a loading instruction for loading a rainy day signal lamp model to the data loading unit;
and when the weather type is sunny, the service control unit sends a loading instruction for loading the sunny signal lamp model to the data loading unit.
7. The algorithm model loading method according to claim 2, wherein the method further comprises:
the service control unit obtains a logic relationship between each algorithm name according to the model loading linked list;
when the logical relationship among the algorithm names does not contain a judgment class, the service control unit sends a model group loading instruction to the data loading unit, wherein the model group loading instruction comprises at least two algorithm names;
and the data loading unit loads at least two corresponding algorithm models from the data storage unit into the DRAM according to at least two algorithm names.
8. An algorithm model loading device, characterized by being integrated in an algorithm model loading system, wherein the system comprises a service control unit, a data loading unit and a data storage unit, the service control unit and the data loading unit are respectively arranged in different CPU cores, and the device comprises:
the instruction sending module is used for sending a model loading instruction to the data loading unit according to the current service requirement by the service control unit, wherein the model loading instruction comprises a target algorithm name;
and the model loading module is used for loading the corresponding target algorithm model from the data storage unit into the DRAM by the data loading unit according to the target algorithm name so that the business control unit processes the target algorithm model.
9. An electronic device, the electronic device comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the algorithm model loading method of any one of claims 1-7.
10. A computer readable storage medium having stored thereon a computer program, which when executed by a processor implements an algorithm model loading method according to any of claims 1-7.
CN202310952654.3A 2023-07-31 2023-07-31 Algorithm model loading method and device, electronic equipment and storage medium Pending CN116991101A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310952654.3A CN116991101A (en) 2023-07-31 2023-07-31 Algorithm model loading method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN116991101A true CN116991101A (en) 2023-11-03

Family

ID=88533265

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310952654.3A Pending CN116991101A (en) 2023-07-31 2023-07-31 Algorithm model loading method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN116991101A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination