CN110503180A - Model processing method and apparatus, and electronic device - Google Patents

Model processing method and apparatus, and electronic device

Info

Publication number
CN110503180A
Authority
CN
China
Prior art keywords
model
operator
processed
stored
chart
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910750241.0A
Other languages
Chinese (zh)
Other versions
CN110503180B (en)
Inventor
方攀
陈岩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201910750241.0A
Publication of CN110503180A
Application granted
Publication of CN110503180B
Legal status: Active (Current)
Anticipated expiration


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00: Computing arrangements based on biological models
    • G06N 3/02: Neural networks
    • G06N 3/08: Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Molecular Biology (AREA)
  • Artificial Intelligence (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Debugging And Monitoring (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The embodiments of the present application disclose a model processing method, a model processing apparatus, and an electronic device. The method includes: obtaining a to-be-processed model file storing an algorithm model, the to-be-processed model file being generated in the flatbuffer format; reading a root table in the to-be-processed model file, the root table storing the storage location of a model table included in the to-be-processed model file; obtaining the model table according to the storage location stored in the root table; reading a subgraph in the model table, reading an operator table from the subgraph, and parsing the operator table to obtain a parsing result including operator information; and storing the parsing result in a specified format. In this way, the statistics of operator information can be collected automatically and accurately.

Description

Model processing method and apparatus, and electronic device
Technical field
The present application relates to the field of computer technology, and more particularly to a model processing method, a model processing apparatus, and an electronic device.
Background
An algorithm model, for example a neural network model, is a complex network system formed by a large number of simple processing units (called neurons) that are widely interconnected. Some algorithm models have massively parallel, distributed storage and processing, self-organizing, adaptive, and self-learning capabilities. An algorithm model usually contains a large number of operators. In related operator statistics methods, neither efficiency nor accuracy can be effectively guaranteed.
Summary of the invention
In view of the above problems, the present application proposes a model processing method, a model processing apparatus, and an electronic device to improve the above problems.
In a first aspect, the present application provides a model processing method applied to a model processing apparatus. The method includes: obtaining a to-be-processed model file storing an algorithm model, the to-be-processed model file being generated in the flatbuffer format; reading a root table in the to-be-processed model file, the root table storing the storage location of a model table included in the to-be-processed model file; obtaining the model table according to the storage location stored in the root table; reading a subgraph in the model table, reading an operator table from the subgraph, and parsing the operator table to obtain a parsing result including operator information; and storing the parsing result in a specified format.
In a second aspect, the present application provides a model processing apparatus. The apparatus includes: a file obtaining unit configured to obtain a to-be-processed model file storing an algorithm model, the to-be-processed model file being generated in the flatbuffer format; a file reading unit configured to read a root table in the to-be-processed model file, the root table storing the storage location of a model table included in the to-be-processed model file; a data obtaining unit configured to obtain the model table according to the storage location stored in the root table; a data parsing unit configured to read a subgraph in the model table, read an operator table from the subgraph, and parse the operator table to obtain a parsing result including operator information; and a data storage unit configured to store the parsing result in a specified format.
In a fourth aspect, the present application provides an electronic device including a processor and a memory, wherein one or more programs are stored in the memory and configured to be executed by the processor to implement the above method.
In a fifth aspect, the present application provides a computer-readable storage medium storing program code, wherein the above method is executed when the program code is run by a processor.
According to the model processing method, apparatus, and electronic device provided by the present application, a to-be-processed model file storing an algorithm model is obtained, the to-be-processed model file being generated in the flatbuffer format; a root table in the to-be-processed model file is read, the root table storing the storage location of a model table included in the to-be-processed model file; the model table is obtained according to the storage location stored in the root table; a subgraph in the model table is read, an operator table is read from the subgraph, and the operator table is parsed to obtain a parsing result including operator information; and the parsing result is stored in a specified format. In this way, the statistics of operator information can be collected automatically and accurately.
Brief description of the drawings
In order to explain the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. It is apparent that the drawings in the following description are only some embodiments of the present application; for those skilled in the art, other drawings can also be obtained from these drawings without creative effort.
Fig. 1 shows a flowchart of a model processing method proposed by an embodiment of the present application;
Fig. 2 shows a schematic diagram of an organization form of an algorithm model file in a model processing method proposed by an embodiment of the present application;
Fig. 3 shows a flowchart of a model processing method proposed by another embodiment of the present application;
Fig. 4 shows a flowchart of a model processing method proposed by yet another embodiment of the present application;
Fig. 5 shows a structural block diagram of a model processing apparatus proposed by an embodiment of the present application;
Fig. 6 shows a structural block diagram of a model processing apparatus proposed by another embodiment of the present application;
Fig. 7 shows a structural block diagram of a model processing apparatus proposed by yet another embodiment of the present application;
Fig. 8 shows a structural block diagram of an electronic device for executing the model processing method according to an embodiment of the present application;
Fig. 9 shows a storage unit of an embodiment of the present application for saving or carrying program code that implements the model processing method according to the embodiments of the present application.
Detailed description of the embodiments
The technical solutions in the embodiments of the present application are described below clearly and completely with reference to the drawings in the embodiments of the present application. It is apparent that the described embodiments are only some, rather than all, of the embodiments of the present application. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present application without creative effort shall fall within the protection scope of the present application.
An algorithm model, such as a neural network (Neural Networks, NN), is a complex network system formed by a large number of simple processing units (called neurons) that are widely interconnected. Neural networks have massively parallel, distributed storage and processing, self-organizing, adaptive, and self-learning capabilities. A neural algorithm model usually contains a large number of operators. It can be understood that an operator can be regarded as a piece of algorithmic processing in a neural algorithm model; an operator can be a mapping from one function space to another function space, or a mapping from a function to a number.
However, the inventors found in research that in related operator statistics methods, neither efficiency nor accuracy can be effectively guaranteed. For example, in one approach, a model file is opened with the netron software, and the parameter information of the operators in the model file is counted one by one. In this approach, however, the user usually counts the number of operators and the operator information of each operator manually, which requires a large amount of human time and sufficient care. Only simple algorithm models are likely to be counted correctly; many algorithm models are rather complex, and the statistics of their operator information simply cannot be completed manually. With wrong operator information, the implemented operator code is necessarily wrong, a failed result will inevitably be produced when the underlying layer is called in the whole algorithm model, and finding the erroneous operator in error analysis is extremely difficult.
Therefore, the inventors propose the model processing method, apparatus, and electronic device of the present application, which can improve the above problems. With the approach provided by the present application, the statistics of operator information can be collected automatically and accurately.
The embodiments of the present application are described in detail below with reference to the drawings.
Referring to Fig. 1, a model processing method provided by an embodiment of the present application is applied to a model processing apparatus. The method includes:
Step S110: obtaining a to-be-processed model file storing an algorithm model, the to-be-processed model file being generated in the flatbuffer format.
The to-be-processed model file may be obtained locally, or it may be a neural algorithm model file obtained directly from the network side. In addition, the to-be-processed model file may be a neural algorithm model file obtained by optimizing a neural algorithm model file obtained from the network side. Optimizing a neural algorithm model file can be understood as performing operations such as operator fusion, network pruning, model quantization, and network cutting on the neural algorithm model. Operator fusion can be understood as merging some operators to reduce computation or reduce memory copies (for example, merging Conv2D and BatchNorm). Network pruning can be understood as removing some unnecessary operators to simplify the network (for example, removing redundant operators that are only used during training). Furthermore, the internal computation of a general neural algorithm model uses floating-point numbers, and floating-point computation consumes more computing resources (memory space and CPU/GPU time). If, without affecting the accuracy of the neural algorithm model, simpler value types are used for the internal computation, the computation speed can be greatly improved and the consumed computing resources can be greatly reduced, which is particularly important for mobile devices. Quantization compresses the original neural algorithm model by reducing the number of bits needed to represent each weight.
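By way of illustration of the quantization idea just described, the following is a minimal sketch of symmetric 8-bit weight quantization in Python using NumPy; the function names and the example array are illustrative assumptions rather than part of the method described above.

```python
import numpy as np

def quantize_weights_int8(weights: np.ndarray):
    """Map float32 weights to int8 with a single symmetric scale."""
    scale = np.abs(weights).max() / 127.0 if weights.size else 1.0
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate float32 weights from the int8 representation."""
    return q.astype(np.float32) * scale

# Example: a 4x storage reduction (float32 -> int8) at the cost of rounding error.
w = np.random.randn(4, 4).astype(np.float32)
q, s = quantize_weights_int8(w)
print(np.abs(dequantize(q, s) - w).max())
```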
It can be understood that an algorithm model, for example a neural algorithm model, usually generates a corresponding model file after training, and the model file is then stored or distributed to other devices. For example, a Tflite algorithm model generates a model file in the flatbuffer format after training.
How a model file generated in the flatbuffer format is organized is described first by way of example. A flatbuffer is a buffer that stores a series of scalars and vectors, and the scalars and vectors in this buffer can be accessed directly. The file structure shown in Fig. 2 includes a root table, a T vtable (configuration table), and T (a model table). The offset root may be a 32-bit offset pointing to ROOT DATA, where ROOT DATA may be field1 or field2 in the aforementioned model table. It should be noted that one T vtable (configuration table) can provide description information for multiple model tables, and each model table contains a field offset vtable that specifies which T vtable (configuration table) describes its field information. Each T vtable (configuration table) has a 16-bit vtable size that records its own size (including the vtable size field itself), and the object size therein records the size of the table object (including the 4 bytes of the vtable offset). The offset field records the offset of a field in the model table relative to the table object.
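To make this layout concrete, the following is a minimal Python sketch that reads the 32-bit root offset and the vtable header directly from the raw bytes of a file in the flatbuffer format; the file path is a placeholder.

```python
import struct

with open("model.tflite", "rb") as f:  # placeholder path to a flatbuffer-format model file
    buf = f.read()

# The buffer starts with a little-endian uint32 offset to the root table.
root_table_pos = struct.unpack_from("<I", buf, 0)[0]

# At the root table, a signed 32-bit value is *subtracted* from the table
# position to locate the vtable that describes the table's fields.
vtable_rel = struct.unpack_from("<i", buf, root_table_pos)[0]
vtable_pos = root_table_pos - vtable_rel

# The vtable begins with two uint16s: its own size and the table object's size.
vtable_size, object_size = struct.unpack_from("<HH", buf, vtable_pos)
print(root_table_pos, vtable_size, object_size)
```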
Based on the above, the method provided by this embodiment further includes the following steps.
Step S120: reading a root table in the to-be-processed model file, the root table storing the storage location of a model table included in the to-be-processed model file.
It can be understood that after the to-be-processed model file is obtained, the root table in the to-be-processed model file can be obtained by reading the file. As described above, the storage location of the subsequent model table is stored in the root table. For example, if the to-be-processed model file is loaded directly into memory and then obtained from memory, the storage location here can be understood as an offset address in memory, and the apparatus executing this embodiment of the present application can then read the model table by obtaining the offset address.
Step S130: obtaining the model table according to the storage location stored in the root table.
It should be noted that the operator information of the operators included in the to-be-processed model file is stored in the model table, so the operator information can be read further after the model table is obtained.
Step S140: reading a subgraph in the model table, reading an operator table from the subgraph, and parsing the operator table to obtain a parsing result including operator information.
It can be understood that the subgraph is nested in the model table, and the nesting can be realized through a specified data type. For example, the model table, the subgraph, and the operator table described above are all realized by the table data type, and data of the table type can be nested in one another; that is, another table can be nested in a table, so that the subgraph is nested in the model table and the operator table is nested in the subgraph.
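For a Tflite-style model file, the nesting just described (model table, subgraph, operator table) can be traversed with accessors generated from the schema by the FlatBuffers compiler. The sketch below assumes such a generated Python module (imported here as tflite, as provided for example by the community tflite package); the accessor names follow the usual FlatBuffers code-generation pattern and should be verified against the schema actually in use.

```python
import tflite  # assumed: schema-generated bindings for the TFLite flatbuffer schema

with open("model.tflite", "rb") as f:  # placeholder path
    buf = f.read()

model = tflite.Model.GetRootAsModel(buf, 0)      # root table
for i in range(model.SubgraphsLength()):         # subgraphs nested in the model table
    subgraph = model.Subgraphs(i)
    for j in range(subgraph.OperatorsLength()):  # operator tables nested in the subgraph
        op = subgraph.Operators(j)
        opcode = model.OperatorCodes(op.OpcodeIndex())
        print(i, j, opcode.BuiltinCode())        # numeric operator type
```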
Step S150: storing the parsing result in a specified format.
To facilitate subsequent re-processing of the operator information, the operator information can be stored in a certain format when it is stored.
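As one possible realization of such a specified format (an illustrative assumption; the method above does not prescribe a concrete format), the parsing result could be written out as JSON records, one per operator:

```python
import json

# Hypothetical parsing result: one record per operator.
parsing_result = [
    {"type": "CONV_2D", "inputs": [0, 1, 2], "outputs": [3], "options": {"stride": 1}},
    {"type": "MAX_POOL_2D", "inputs": [3], "outputs": [4], "options": {"filter": 2}},
]

with open("operator_stats.json", "w") as f:  # placeholder output path
    json.dump(parsing_result, f, indent=2)
```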
According to the model processing method provided by the present application, a to-be-processed model file storing an algorithm model is obtained, the to-be-processed model file being generated in the flatbuffer format; a root table in the to-be-processed model file is read, the root table storing the storage location of a model table included in the to-be-processed model file; the model table is obtained according to the storage location stored in the root table; a subgraph in the model table is read, an operator table is read from the subgraph, and the operator table is parsed to obtain a parsing result including operator information; and the parsing result is stored in a specified format. In this way, the statistics of operator information can be collected automatically and accurately.
Referring to Fig. 3, a model processing method provided by an embodiment of the present application is applied to a model processing apparatus. The method includes:
Step S210: loading the to-be-processed model file storing an algorithm model into memory, the to-be-processed model file being generated in the flatbuffer format.
Step S220: reading a root table in the to-be-processed model file, the root table storing the storage location of a model table included in the to-be-processed model file.
Step S230: obtaining the model table according to the storage location stored in the root table.
Step S240: obtaining a configuration table in the to-be-processed model file, the configuration table storing the storage location of a subgraph.
It can be understood that there may be multiple configuration tables and multiple model tables in the to-be-processed model file, and different configuration tables can describe the configuration information of different model tables. In this case, each model table contains a field offset vtable used to specify which T vtable (configuration table) describes its field information.
Step S250: obtaining the subgraph in the model table based on the storage location of the subgraph.
Step S260: reading an operator table from the subgraph, and detecting the content of the operator table in a specified order.
Step S270: if a content block including a target identifier is detected, taking the content block as one operator, and taking the operator parameters corresponding to the target identifier as operator information.
As one mode, the operator parameters include the input data and output data of the operator and additional information used when the operator runs.
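A minimal sketch of a record holding these operator parameters could look as follows; the field names are illustrative and not prescribed above.

```python
from dataclasses import dataclass, field
from typing import Any, Dict, List

@dataclass
class OperatorInfo:
    """One parsed operator: the operator information referred to above."""
    op_type: str                                            # e.g. "CONV_2D"
    inputs: List[int] = field(default_factory=list)         # input tensor indices (input data)
    outputs: List[int] = field(default_factory=list)        # output tensor indices (output data)
    options: Dict[str, Any] = field(default_factory=dict)   # additional run-time information

info = OperatorInfo("CONV_2D", inputs=[0, 1, 2], outputs=[3],
                    options={"stride_w": 1, "stride_h": 1})
print(info)
```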
Step S280: generating a parsing result including the operator information.
Step S290: storing the parsing result in a specified format.
According to the model processing method provided by the present application, a to-be-processed model file storing an algorithm model is obtained, the to-be-processed model file being generated in the flatbuffer format; a root table in the to-be-processed model file is read, the root table storing the storage location of a model table included in the to-be-processed model file; the model table is obtained according to the storage location stored in the root table; a subgraph in the model table is read, an operator table is read from the subgraph, and the operator table is parsed to obtain a parsing result including operator information; and the parsing result is stored in a specified format. In this way, the statistics of operator information can be collected automatically and accurately.
Referring to Fig. 4, a model processing method provided by an embodiment of the present application is applied to a model processing apparatus. The method includes:
Step S310: loading the to-be-processed model file storing an algorithm model into memory, the to-be-processed model file being generated in the flatbuffer format.
The model processing apparatus can be triggered to process the algorithm model in several different situations.
As one mode, the processing of the algorithm model can be triggered manually by the user. It can be understood that the model processing apparatus may include two parts, an interface module and a kernel module. The kernel module is used to execute the operator statistics process in the model processing methods provided by the embodiments of the present application; for example, it can execute the content of step S110 to step S150, the content of step S210 to step S280, or the content of step S310 to step S390. The interface module can be used for the user to perform some control operations.
For example, the interface module may be configured with a load control, a start control, a pause control, and an end control. The load control can be used by the user to select the to-be-processed model file by means of a storage path; after the model processing apparatus finds the to-be-processed model file based on the storage path determined by the user, it can load the found to-be-processed model file into memory. The start control can be used by the user to trigger the model processing apparatus to start executing the step of obtaining the to-be-processed model file storing an algorithm model in the model processing method. It can be understood that the user's operation of the load control only lets the model processing apparatus determine which file is the to-be-processed model file and does not actually load the determined to-be-processed model file into memory. In this mode, after detecting that the user has touched the start control, the model processing apparatus loads the determined to-be-processed model file into memory, so as to obtain the to-be-processed model file storing the algorithm model.
Correspondingly, the pause control can be used to trigger the model processing apparatus to pause the operator statistics operation. For example, as one mode, in the case where the aforementioned subgraph includes multiple operator tables and each operator table includes multiple operators, multiple threads can be configured, with each thread corresponding to one or more operator tables. In this mode, after the model processing apparatus detects that the start control is triggered, it creates a queue for storing the aforementioned threads. A thread that has been dequeued can execute the operation of detecting its corresponding operator table, while a thread still in the queue needs to wait for a dequeued thread to finish detecting its corresponding operator table before it is dequeued in turn. In this mode, if the pause control is detected to be touched, the model processing apparatus pauses the dequeue operation of the queue; after the start control is detected to be touched again, the previous dequeue operation is resumed, and a thread that has already been dequeued during the pause continues to complete its unfinished detection task, so as to avoid data errors.
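A minimal Python sketch of this queue-plus-pause arrangement is given below; the number of worker threads, the sentinel convention, and the detect_operator_table placeholder are illustrative assumptions.

```python
import queue
import threading

task_queue = queue.Queue()            # holds operator tables waiting for a worker thread
dequeue_allowed = threading.Event()   # set: dequeuing runs; cleared: dequeuing is paused
dequeue_allowed.set()

def detect_operator_table(table):
    """Placeholder for parsing one operator table into operator information."""
    ...

def worker():
    while True:
        dequeue_allowed.wait()        # pausing only blocks new dequeues
        table = task_queue.get()
        if table is None:             # sentinel used by the end control to stop a worker
            task_queue.task_done()
            break
        detect_operator_table(table)  # a table already dequeued is always finished
        task_queue.task_done()

workers = [threading.Thread(target=worker, daemon=True) for _ in range(2)]
for w in workers:
    w.start()

# Pause control: dequeue_allowed.clear(); pressing start again: dequeue_allowed.set().
# End control: pause, put one None per worker, join the queue, then discard it.
```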
It can be understood that the end control is used to trigger the model processing apparatus to end the operator statistics operation. In the aforementioned mode in which threads are stored based on a queue, if the end control is detected to be touched, the dequeue operation can be paused directly, the threads that have already been dequeued continue to complete their unfinished detection tasks, and the established queue can then be destroyed to free up storage space.
It can be understood that detecting an operator table here can be understood as parsing the operator table to obtain a parsing result including operator information.
As another mode, the model processing apparatus can, after detecting a target event, trigger loading the to-be-processed model file storing an algorithm model into memory, and then perform the subsequent operator detection operation on the to-be-processed model file.
It should be noted that the operator order of a neural network can directly use the operator order trained by the neural network framework on the server side. For example, frameworks such as Tensorflow train the operator order of a neural algorithm model, and the trained neural algorithm model is sometimes further optimized.
For example, for deploying a neural algorithm model on a mobile terminal, a trained model is generally solidified into a file by a computer such as a PC, and the neural network framework of the mobile terminal then parses this file, reads it into memory, and sequentially executes the operators in the neural algorithm model. However, because the resources of the mobile terminal are limited, often only small models can be run, and a large model needs a series of optimizations, such as operator fusion, network pruning, model quantization, and network cutting, so that the optimized model can run on the mobile terminal. The number and order of the operators in the model file change correspondingly after such optimization. Optionally, the completion of an optimization of the algorithm model can therefore be taken as the target event. In this mode, after the model processing apparatus detects that an optimization operation has been executed on the model file, it determines that the target event is detected, and then triggers loading the to-be-processed model file storing the algorithm model into memory.
The model processing apparatus can determine that the model file has been optimized in several ways.
As one mode, a model optimization information storage file can be configured. After the model optimization module completes a model optimization, a designated identifier in the model optimization information storage file is updated so that the updated value can represent that a model update has just been completed. The model processing apparatus can periodically read the value of the designated identifier in the model optimization information storage file, and after detecting that the value of the designated identifier represents that a model update has just been completed, execute the model processing method provided by this embodiment or the model processing method in the previous embodiments on the optimized model file, so as to complete the statistics of the operator information of the operators in the optimized model file and store them in the specified format. After the storage is completed, the value of the designated identifier in the model optimization information storage file can be updated to a state representing that no update has been executed.
As another mode, whether a model update has just been completed can be determined by detecting the size of the model file storing the algorithm model. It can be understood that operators may be added or deleted during model optimization, which may affect the content of the model file and in turn the size of the storage space occupied by the model file. Therefore, when the model processing apparatus detects that the size of the model file storing the algorithm model has changed, it starts timing a specified duration; if the size of the model file does not change within the specified duration, it determines that a model update has just been completed.
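A small sketch of this size-stability check is given below; the path, polling interval, and stability duration are illustrative assumptions.

```python
import os
import time

def wait_until_optimized(path: str, stable_for: float = 5.0, poll: float = 0.5) -> None:
    """Return once the file size has stopped changing for `stable_for` seconds."""
    last_size = os.path.getsize(path)
    stable_since = time.monotonic()
    while True:
        time.sleep(poll)
        size = os.path.getsize(path)
        if size != last_size:                 # still being rewritten by the optimizer
            last_size = size
            stable_since = time.monotonic()
        elif time.monotonic() - stable_since >= stable_for:
            return                            # size unchanged for the specified duration

# wait_until_optimized("model.tflite")  # then run the operator statistics of this embodiment
```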
It should be noted that the aforementioned model optimization module may be a module in the model processing apparatus, or may be a module independent of the model processing apparatus.
Step S320: reading a root table in the to-be-processed model file, the root table storing the storage location of a model table included in the to-be-processed model file.
Step S330: obtaining the model table according to the storage location stored in the root table.
Step S340: obtaining a configuration table in the to-be-processed model file, the configuration table storing the storage location of a subgraph.
Step S350: obtaining the subgraph in the model table based on the storage location of the subgraph.
Step S360: identifying the number of operators in the operator table based on the table name of the operator table.
Step S370: if the identified number is greater than a target number, starting multiple threads to detect the content of the operator table from different locations in the operator table.
It can be understood that in some scenarios the statistics of operator information need to be finished relatively quickly. As one mode, the model processing apparatus can therefore establish multiple threads and trigger them to detect the content of the operator table from different locations in the operator table at the same time, so as to improve the efficiency of the statistics. For example, two threads can be established, one detecting from the starting position of the operator table towards the end position, and the other detecting from the end position of the operator table towards the starting position. As another example, two threads can be established, one detecting from the starting position of the operator table towards a position between the starting position and the end position, and the other detecting from that position towards the end position, as illustrated in the sketch below.
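A minimal sketch of the second splitting scheme, with two threads each covering half of an operator table, could look as follows; the operator list and the detect function are illustrative placeholders.

```python
import threading

operators = list(range(1000))        # stand-in for the entries of a large operator table
results = [None] * len(operators)

def detect(start: int, end: int) -> None:
    """Detect operator content in [start, end); each thread owns a disjoint range."""
    for idx in range(start, end):
        results[idx] = ("operator", operators[idx])   # placeholder for real parsing

mid = len(operators) // 2
t1 = threading.Thread(target=detect, args=(0, mid))                # start -> middle
t2 = threading.Thread(target=detect, args=(mid, len(operators)))   # middle -> end
t1.start(); t2.start()
t1.join(); t2.join()
```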
As another mode, there may be multiple subgraphs in the model table. For example, as shown in Fig. 2, field2 and field1 therein can each serve as one subgraph, and data of the table type can be further nested in a subgraph; that is, a configuration table and the corresponding model table can be nested again. In this case, one thread can be configured for each of field2 and field1 to detect the operator table therein.
Step S380: if a content block including a target identifier is detected, taking the content block as one operator, and taking the operator parameters corresponding to the target identifier as operator information.
As one mode, the operator parameters include the input data and output data of the operator and additional information used when the operator runs.
Step S390: generating a parsing result including the operator information.
Step S391: storing the parsing result in a specified format.
As one mode, the counted operators may belong to different types; for example, an operator may be a convolution-type operator, a pooling-type operator, or an operator of another type. Different storage formats can therefore be configured for different types of operators, and the parameters corresponding to different formats may differ.
It should be noted that after the parsing result is stored in the specified format, the stored operator information may still contain duplicate information; for example, operators of the same type may be counted multiple times. Therefore, after the parsing result is stored in the specified format, the stored operator information can be deduplicated in response to a further received deduplication instruction.
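A minimal sketch of such a deduplication pass over the stored records is given below; the record layout follows the illustrative JSON format used in the earlier sketch and is likewise an assumption.

```python
import json

def deduplicate(records):
    """Keep the first occurrence of each identical operator record."""
    seen = set()
    unique = []
    for rec in records:
        key = json.dumps(rec, sort_keys=True)   # canonical form of the whole record
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique

with open("operator_stats.json") as f:          # placeholder path from the earlier sketch
    stats = json.load(f)
with open("operator_stats.json", "w") as f:
    json.dump(deduplicate(stats), f, indent=2)
```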
According to the model processing method provided by the present application, a to-be-processed model file storing an algorithm model is obtained, the to-be-processed model file being generated in the flatbuffer format; a root table in the to-be-processed model file is read, the root table storing the storage location of a model table included in the to-be-processed model file; the model table is obtained according to the storage location stored in the root table; a subgraph in the model table is read, an operator table is read from the subgraph, and the operator table is parsed to obtain a parsing result including operator information; and the parsing result is stored in a specified format. In this way, the statistics of operator information can be collected automatically and accurately. Moreover, in this embodiment, the specific detection mode can be determined according to the number of operators in the operator table, which improves the flexibility of the operator statistics.
Referring to Fig. 5, a model processing apparatus 400 provided by an embodiment of the present application includes:
a file obtaining unit 410, configured to obtain a to-be-processed model file storing an algorithm model, the to-be-processed model file being generated in the flatbuffer format.
Specifically, as one mode, the file obtaining unit 410 is configured to load the to-be-processed model file storing an algorithm model into memory, wherein the storage location is the offset address of the model table in memory.
a file reading unit 420, configured to read a root table in the to-be-processed model file, the root table storing the storage location of a model table included in the to-be-processed model file.
a data obtaining unit 430, configured to obtain the model table according to the storage location stored in the root table.
a data parsing unit 440, configured to read a subgraph in the model table, read an operator table from the subgraph, and parse the operator table to obtain a parsing result including operator information.
Specifically, as one mode, the data parsing unit 440 is configured to obtain a configuration table in the to-be-processed model file, the configuration table storing the storage location of the subgraph, and to obtain the subgraph in the model table based on the storage location of the subgraph.
Optionally, as shown in Fig. 6, the data parsing unit 440 includes:
a detection subunit 441, configured to read an operator table from the subgraph and detect the content of the operator table in a specified order.
an information obtaining subunit 442, configured to, if a content block including a target identifier is detected, take the content block as one operator and take the operator parameters corresponding to the target identifier as operator information.
Optionally, the operator parameters include the input data and output data of the operator and additional information used when the operator runs.
a result generating subunit 443, configured to generate the parsing result including the operator information.
Optionally, the data parsing unit 440 is configured to identify the number of operators in the operator table based on the table name of the operator table, and, if the identified number is greater than a target number, to start multiple threads to detect the content of the operator table from different locations in the operator table.
a data storage unit 450, configured to store the parsing result in a specified format.
As one mode, as shown in Fig. 7, the data storage unit 450 includes:
a type detection subunit 451, configured to identify the type of the operator to which the parsing result belongs.
a storage subunit 452, configured to store the operator according to the storage format corresponding to the type.
According to the model processing apparatus provided by the present application, a to-be-processed model file storing an algorithm model is obtained, the to-be-processed model file being generated in the flatbuffer format; a root table in the to-be-processed model file is read, the root table storing the storage location of a model table included in the to-be-processed model file; the model table is obtained according to the storage location stored in the root table; a subgraph in the model table is read, an operator table is read from the subgraph, and the operator table is parsed to obtain a parsing result including operator information; and the parsing result is stored in a specified format. In this way, the statistics of operator information can be collected automatically and accurately.
It should be noted that the apparatus embodiments in the present application correspond to the foregoing method embodiments; for the specific principles of the apparatus embodiments, reference may be made to the content of the foregoing method embodiments, which is not repeated here.
An electronic device provided by the present application is described below with reference to Fig. 8.
Referring to Fig. 8, based on the above model processing method and apparatus, an embodiment of the present application further provides an electronic device 200 capable of executing the foregoing model processing method. The electronic device 200 includes one or more (only one is shown in the figure) processors 102, a memory 104, and a network module 106 that are coupled to one another. A program capable of executing the content of the foregoing embodiments is stored in the memory 104, and the processor 102 can execute the program stored in the memory 104.
The processor 102 may include one or more cores for processing data. The processor 102 connects the various parts of the entire electronic device 200 through various interfaces and lines, and executes the various functions of the electronic device 200 and processes data by running or executing the instructions, programs, code sets, or instruction sets stored in the memory 104 and calling the data stored in the memory 104. Optionally, the processor 102 may be realized in at least one hardware form of Digital Signal Processing (DSP), Field-Programmable Gate Array (FPGA), and Programmable Logic Array (PLA). The processor 102 may integrate one or a combination of a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a modem, and the like. The CPU mainly handles the operating system, the user interface, application programs, and the like; the GPU is responsible for rendering and drawing display content; and the modem handles wireless communication. It can be understood that the modem may also not be integrated into the processor 102 and may instead be realized separately through a communication chip.
The memory 104 may include a Random Access Memory (RAM) and may also include a Read-Only Memory. The memory 104 can be used to store instructions, programs, code, code sets, or instruction sets. The memory 104 may include a program storage area and a data storage area, wherein the program storage area can store instructions for realizing the operating system, instructions for realizing at least one function (such as a touch function, a sound playing function, and an image playing function), instructions for realizing the following method embodiments, and the like. The data storage area can also store data created by the terminal 100 in use (such as a phone book, audio and video data, and chat record data) and the like.
The network module 106 is used to receive and transmit electromagnetic waves and to realize the mutual conversion between electromagnetic waves and electrical signals, so as to communicate with a communication network or other devices, for example with an audio playing device. The network module 106 may include various existing circuit elements for executing these functions, such as an antenna, an RF transceiver, a digital signal processor, an encryption/decryption chip, a Subscriber Identity Module (SIM) card, and a memory. The network module 106 can communicate with various networks such as the Internet, an intranet, or a wireless network, or communicate with other devices through a wireless network. The aforementioned wireless network may include a cellular telephone network, a wireless local area network, or a metropolitan area network. For example, the network module 106 can exchange information with a base station.
Referring to Fig. 9, a structural block diagram of a computer-readable storage medium provided by an embodiment of the present application is shown. Program code is stored in the computer-readable medium 1100, and the program code can be called by a processor to execute the methods described in the above method embodiments.
The computer-readable storage medium 1100 may be an electronic memory such as a flash memory, an EEPROM (Electrically Erasable Programmable Read-Only Memory), an EPROM, a hard disk, or a ROM. Optionally, the computer-readable storage medium 1100 includes a non-transitory computer-readable storage medium. The computer-readable storage medium 1100 has storage space for program code 810 that executes any of the method steps in the above methods. The program code can be read from or written into one or more computer program products. The program code 1110 can, for example, be compressed in an appropriate form.
In conclusion a kind of model treatment method, apparatus provided by the present application and electronic equipment, acquisition is stored with algorithm The model file to be processed of model, the model file to be processed are to be generated based on flatbuffer mode;It reads described wait locate The root table in model file is managed, described table is stored with the storage location of model table included by the model file to be processed; The model table is got according to the storage location that described table stores;The sub-chart in the model table is read, from institute State in sub-chart reading operator table, and the operator table parsed, obtain include operator information parsing result;It will be described Parsing result is stored according to specified format, and automation may be implemented through the above way and accurately carry out operator information Statistics.
Finally, it should be noted that above embodiments are only to illustrate the technical solution of the application, rather than its limitations;Although The application is described in detail with reference to the foregoing embodiments, those skilled in the art are when understanding: it still can be with It modifies the technical solutions described in the foregoing embodiments or equivalent replacement of some of the technical features;And These are modified or replaceed, do not drive corresponding technical solution essence be detached from each embodiment technical solution of the application spirit and Range.

Claims (10)

1. A model processing method, applied to a model processing apparatus, the method comprising:
obtaining a to-be-processed model file storing an algorithm model, the to-be-processed model file being generated in the flatbuffer format;
reading a root table in the to-be-processed model file, the root table storing the storage location of a model table included in the to-be-processed model file;
obtaining the model table according to the storage location stored in the root table;
reading a subgraph in the model table, reading an operator table from the subgraph, and parsing the operator table to obtain a parsing result including operator information; and
storing the parsing result in a specified format.
2. The method according to claim 1, wherein the step of obtaining the to-be-processed model file storing an algorithm model comprises:
loading the to-be-processed model file storing an algorithm model into memory;
wherein the storage location is the offset address of the model table in memory.
3. The method according to claim 1, wherein the step of reading the subgraph in the model table comprises:
obtaining a configuration table in the to-be-processed model file, the configuration table storing the storage location of the subgraph; and
obtaining the subgraph in the model table based on the storage location of the subgraph.
4. The method according to claim 3, wherein the step of reading the operator table from the subgraph and parsing the operator table to obtain the parsing result including operator information comprises:
reading the operator table from the subgraph, and detecting the content of the operator table in a specified order;
if a content block including a target identifier is detected, taking the content block as one operator, and taking the operator parameters corresponding to the target identifier as operator information; and
generating the parsing result including the operator information.
5. The method according to claim 4, wherein the operator parameters include input data and output data of the operator and additional information used when the operator runs.
6. The method according to claim 4, wherein the step of reading the operator table from the subgraph and detecting the content of the operator table in a specified order comprises:
identifying the number of operators in the operator table based on the table name of the operator table; and
if the identified number is greater than a target number, starting multiple threads to detect the content of the operator table from different locations in the operator table.
7. The method according to any one of claims 1 to 6, wherein the step of storing the parsing result in a specified format comprises:
identifying the type of the operator to which the parsing result belongs; and
storing the operator according to the storage format corresponding to the type.
8. A model processing apparatus, comprising:
a file obtaining unit, configured to obtain a to-be-processed model file storing an algorithm model, the to-be-processed model file being generated in the flatbuffer format;
a file reading unit, configured to read a root table in the to-be-processed model file, the root table storing the storage location of a model table included in the to-be-processed model file;
a data obtaining unit, configured to obtain the model table according to the storage location stored in the root table;
a data parsing unit, configured to read a subgraph in the model table, read an operator table from the subgraph, and parse the operator table to obtain a parsing result including operator information; and
a data storage unit, configured to store the parsing result in a specified format.
9. An electronic device, comprising a processor and a memory;
wherein one or more programs are stored in the memory and configured to be executed by the processor to implement the method according to any one of claims 1 to 7.
10. A computer-readable storage medium, wherein program code is stored in the computer-readable storage medium, and the method according to any one of claims 1 to 7 is executed when the program code is run by a processor.
CN201910750241.0A 2019-08-14 2019-08-14 Model processing method and device and electronic equipment Active CN110503180B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910750241.0A CN110503180B (en) 2019-08-14 2019-08-14 Model processing method and device and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910750241.0A CN110503180B (en) 2019-08-14 2019-08-14 Model processing method and device and electronic equipment

Publications (2)

Publication Number Publication Date
CN110503180A true CN110503180A (en) 2019-11-26
CN110503180B CN110503180B (en) 2021-09-14

Family

ID=68587378

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910750241.0A Active CN110503180B (en) 2019-08-14 2019-08-14 Model processing method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN110503180B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111782403A (en) * 2020-07-17 2020-10-16 Oppo广东移动通信有限公司 Data processing method and device and electronic equipment
CN113326942A (en) * 2020-02-28 2021-08-31 上海商汤智能科技有限公司 Model reasoning method and device, electronic equipment and storage medium
CN117521856A (en) * 2023-12-29 2024-02-06 南京邮电大学 Large model cutting federal learning method and system based on local features

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101610319A (en) * 2008-06-17 2009-12-23 Datang Mobile Communications Equipment Co Ltd Information recording and statistics method, and information recording and statistics device
US20160314626A1 (en) * 2015-04-23 2016-10-27 Siemens Product Lifecycle Management Software Inc. Massive model visualization with a product lifecycle management system
CN107992408A (en) * 2017-11-16 2018-05-04 南京轩世琪源软件科技有限公司 A kind of software probe method of software probe
CN108764040A (en) * 2018-04-24 2018-11-06 Oppo广东移动通信有限公司 A kind of image detecting method, terminal and computer storage media
CN108764487A (en) * 2018-05-29 2018-11-06 北京百度网讯科技有限公司 For generating the method and apparatus of model, the method and apparatus of information for identification
CN109902819A (en) * 2019-02-12 2019-06-18 Oppo广东移动通信有限公司 Neural computing method, apparatus, mobile terminal and storage medium

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101610319A (en) * 2008-06-17 2009-12-23 Datang Mobile Communications Equipment Co Ltd Information recording and statistics method, and information recording and statistics device
US20160314626A1 (en) * 2015-04-23 2016-10-27 Siemens Product Lifecycle Management Software Inc. Massive model visualization with a product lifecycle management system
CN107992408A (en) * 2017-11-16 2018-05-04 南京轩世琪源软件科技有限公司 A kind of software probe method of software probe
CN108764040A (en) * 2018-04-24 2018-11-06 Oppo广东移动通信有限公司 A kind of image detecting method, terminal and computer storage media
CN108764487A (en) * 2018-05-29 2018-11-06 北京百度网讯科技有限公司 For generating the method and apparatus of model, the method and apparatus of information for identification
CN109902819A (en) * 2019-02-12 2019-06-18 Oppo广东移动通信有限公司 Neural computing method, apparatus, mobile terminal and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Xiong Yameng: "Image recognition method for mobile terminals based on TensorFlow", Wireless Internet Technology (《无线互联科技》) *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113326942A (en) * 2020-02-28 2021-08-31 上海商汤智能科技有限公司 Model reasoning method and device, electronic equipment and storage medium
CN111782403A (en) * 2020-07-17 2020-10-16 Oppo广东移动通信有限公司 Data processing method and device and electronic equipment
CN111782403B (en) * 2020-07-17 2022-04-19 Oppo广东移动通信有限公司 Data processing method and device and electronic equipment
CN117521856A (en) * 2023-12-29 2024-02-06 南京邮电大学 Large model cutting federal learning method and system based on local features
CN117521856B (en) * 2023-12-29 2024-03-15 南京邮电大学 Large model cutting federal learning method and system based on local features

Also Published As

Publication number Publication date
CN110503180B (en) 2021-09-14

Similar Documents

Publication Publication Date Title
CN110378413A (en) Neural network model processing method, device and electronic equipment
US11016673B2 (en) Optimizing serverless computing using a distributed computing framework
CN110503180A (en) Model processing method and apparatus, and electronic device
KR102094066B1 (en) Method and device for processing data for mobile games
CN111415358A (en) Image segmentation method and device, electronic equipment and storage medium
CN112532530B (en) Method and device for adjusting congestion notification information
CN109062715A (en) The determination method, apparatus and terminal of memory clock frequency
CN106686545B (en) A kind of application method and device of Wireless Fidelity national code
WO2021232958A1 (en) Method and apparatus for executing operation, electronic device, and storage medium
CN111182332B (en) Video processing method, device, server and storage medium
CN112070213A (en) Neural network model optimization method, device, equipment and storage medium
CN110705531A (en) Missing character detection and missing character detection model establishing method and device
CN114842307A (en) Mask image model training method, mask image content prediction method and device
CN110908797A (en) Call request data processing method, device, equipment, storage medium and system
CN110069217A (en) A kind of data storage method and device
CN107870862B (en) Construction method, traversal testing method and computing device of new control prediction model
CN109034176B (en) Identification system and identification method
CN107196857B (en) Moving method and network equipment
CN113377998A (en) Data loading method and device, electronic equipment and storage medium
CN111315026B (en) Channel selection method, device, gateway and computer readable storage medium
CN110532448B (en) Document classification method, device, equipment and storage medium based on neural network
CN112163473A (en) Multi-target tracking method and device, electronic equipment and computer storage medium
CN117014507A (en) Training method of task unloading model, task unloading method and device
CN113487041B (en) Transverse federal learning method, device and storage medium
CN112783574B (en) Application development method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant