CN111290785B - Method, device, electronic equipment and storage medium for evaluating compatibility of deep learning framework system - Google Patents

Method, device, electronic equipment and storage medium for evaluating compatibility of deep learning framework system

Info

Publication number
CN111290785B
Authority
CN
China
Prior art keywords
deep learning
application program
learning framework
framework system
compatibility
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010153719.4A
Other languages
Chinese (zh)
Other versions
CN111290785A (en)
Inventor
骆涛
曾锦乐
胡晓光
高铁柱
田硕
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd filed Critical Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN202010153719.4A priority Critical patent/CN111290785B/en
Publication of CN111290785A publication Critical patent/CN111290785A/en
Application granted granted Critical
Publication of CN111290785B publication Critical patent/CN111290785B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 8/00 - Arrangements for software engineering
    • G06F 8/70 - Software maintenance or management
    • G06F 8/76 - Adapting program code to run in a different environment; Porting

Abstract

The disclosure relates to a method, a device, electronic equipment and a storage medium for evaluating the compatibility of a deep learning framework system, and relates to the technical field of deep learning. The method provided by the present disclosure includes: acquiring a first code associated with a first deep learning framework system and a second code associated with a second deep learning framework system, wherein the second code is an iteration of the first code; identifying at least one first application program interface called in the first code, and generating a first character string file corresponding to the at least one first application program interface; identifying at least one second application program interface called in the second code, and generating a second character string file corresponding to the at least one second application program interface; comparing the first character string file with the second character string file; and evaluating the compatibility of the second deep learning framework system based on the result of the comparison. The method provided by the disclosure helps make the assessment of deep learning framework system compatibility more convenient and reliable.

Description

Method, device, electronic equipment and storage medium for evaluating compatibility of deep learning framework system
Technical Field
The present disclosure relates to computer technology, and more particularly to deep learning technology.
Background
With the development of artificial intelligence technology, deep learning has become a highly influential key common technology of artificial intelligence. A deep learning framework system enables users to build customized neural networks from the functional components provided by the system, and these networks are used in many different application fields such as natural language processing, computer vision recognition and speech recognition, which greatly facilitates scientific experiments and product research and development based on deep learning.
Accommodating algorithm optimization, functional migration, and the expansion of application scenarios for artificial intelligence applications requires fast iteration and expansion of the deep learning framework. Taking PaddlePaddle (an open-source deep learning platform) as an example, major versions are released every three months on average, and minor versions are released every month. In addition, a large number of developers are involved in the development and maintenance of the system. Thus, both the new code submitted by each developer and the deep learning framework system as a whole may be subject to compatibility issues as they iterate.
Disclosure of Invention
According to one aspect of the present disclosure, a method for evaluating deep learning framework system compatibility is provided. The method may include: obtaining a first code associated with a first deep learning framework system and a second code associated with a second deep learning framework system, wherein the second code is an iteration of the first code; identifying at least one first application program interface called in the first code, and generating a first character string file corresponding to the at least one first application program interface; identifying at least one second application program interface called in the second code, and generating a second character string file corresponding to the at least one second application program interface; comparing the first character string file with the second character string file; and evaluating compatibility of the second deep learning framework system based on the result of the comparison.
According to another aspect of the present disclosure, an apparatus for evaluating deep learning framework system compatibility is provided. The apparatus may include: a receiving unit configured to acquire a first code related to a first deep learning framework system and a second code related to a second deep learning framework system, wherein the second code is an iteration of the first code; an identifying unit, configured to identify at least one first application program interface that is invoked in the first code and identify at least one second application program interface that is invoked in the second code; a generating unit, configured to generate a first string file corresponding to the at least one first application program interface and generate a second string file corresponding to the at least one second application program interface; a comparison unit for comparing the first character string file and the second character string file; and an evaluation unit configured to evaluate compatibility of the second deep learning framework system based on a result of the comparison.
According to another aspect of the present disclosure, an electronic device is provided. The electronic device includes: a processor and a memory storing a program comprising instructions that when executed by the processor cause the processor to perform the method for evaluating deep learning framework system compatibility described above.
According to another aspect of the present disclosure, there is provided a computer readable storage medium storing a program comprising instructions that, when executed by a processor of an electronic device, cause the electronic device to perform a method for evaluating deep learning framework system compatibility according to the above.
Drawings
The accompanying drawings illustrate exemplary embodiments and, together with the description, serve to explain exemplary implementations of the embodiments. The illustrated embodiments are for exemplary purposes only and do not limit the scope of the claims. In the drawings, like reference numerals designate similar, but not necessarily identical, elements throughout the several views:
FIG. 1 is a flowchart illustrating a method for evaluating deep learning framework system compatibility in accordance with one exemplary embodiment;
FIG. 2 is a flowchart illustrating a method for evaluating deep learning framework system compatibility in accordance with another exemplary embodiment;
FIG. 3 is a block diagram illustrating an apparatus for evaluating deep learning framework system compatibility in accordance with one exemplary embodiment;
FIG. 4 is a block diagram illustrating an exemplary computing device that may be used in connection with the exemplary embodiments.
Detailed Description
In the present disclosure, the use of the terms "first," "second," and the like to describe various elements is not intended to limit the positional relationship, timing relationship, or importance relationship of the elements, unless otherwise indicated, and such terms are merely used to distinguish one element from another. In some examples, a first element and a second element may refer to the same instance of the element, and in some cases, they may also refer to different instances based on the description of the context.
The terminology used in the description of the various illustrated examples in this disclosure is for the purpose of describing particular examples only and is not intended to be limiting. Unless the context clearly indicates otherwise, if the number of elements is not specifically limited, the elements may be one or more. Furthermore, the term "and/or" as used in this disclosure encompasses any and all possible combinations of the listed items. For example, A and/or B may represent: A exists alone, A and B exist together, or B exists alone. In addition, the character "/" herein generally indicates that the associated objects before and after it are in an "or" relationship.
A deep learning framework system is characterized by a large number of user-facing application program interfaces (Application Programming Interface, API) and by rapid iteration. For example, the PaddlePaddle deep learning platform provides more than 500 operator APIs, and the number continues to grow in order to effectively support the rapid construction of models for various directions such as natural language processing, computer vision recognition, and speech recognition. The applicant has found that the stability of the application program interfaces during iteration is an important index for evaluating the compatibility of a deep learning framework system; how to ensure the stability of the application program interfaces during iteration has therefore become an issue that must be considered when a deep learning framework system iterates rapidly.
FIG. 1 is a flowchart illustrating a method for evaluating deep learning framework system compatibility according to one exemplary embodiment.
In step 101, a first code associated with a first deep learning framework system and a second code associated with a second deep learning framework system are acquired. The second code is an iteration of the first code.
The first code and the second code may be the entire system code of the first and second deep learning framework systems, respectively, or may each be a part of the code of the corresponding framework system. When numerous developers are involved in the overall system development, each developer can promptly check whether the portion of code he or she has completed is compatible with the corresponding part of the previous version, so that potential compatibility problems can be found early in program development.
In step 102, identifying at least one first application program interface called in the first code and generating a first string file corresponding to the at least one first application program interface; at least one second application program interface called in the second code is identified, and a second character string file corresponding to the at least one second application program interface is generated.
The first code associated with the first deep learning framework system and the second code associated with the second deep learning framework system iteratively developed based on the first deep learning framework system may include a plurality of application program interfaces (e.g., Python interfaces and C++ interfaces in the deep learning system). An application program interface is a set of definitions, programs, and protocols through which general functions are provided and shared between pieces of computer software. An application program interface can also serve as middleware that provides data sharing between different software platforms. If an application program interface in the second code is inconsistent with that in the first code, incompatibility may arise, making it difficult for users to upgrade, for example because a neural network built by a user in the first deep learning framework system cannot operate normally in the second deep learning framework system. Furthermore, an application program interface that proves unusable may be discarded or modified in a subsequent upgrade stage, again making it difficult for users to upgrade.
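As a minimal sketch, assuming the code under evaluation is Python source (the function name below is illustrative and not part of the disclosure), the invoked interfaces might be collected with the standard ast module roughly as follows:
import ast

def collect_called_interfaces(source_code):
    # Collect the dotted names of all calls found in the source,
    # e.g. 'paddle.fluid.layers.uniform_random'.
    called = set()
    for node in ast.walk(ast.parse(source_code)):
        if isinstance(node, ast.Call):
            parts = []
            target = node.func
            # Walk attribute chains such as paddle.fluid.layers.uniform_random.
            while isinstance(target, ast.Attribute):
                parts.append(target.attr)
                target = target.value
            if isinstance(target, ast.Name):
                parts.append(target.id)
                called.add(".".join(reversed(parts)))
    return called

# Example: prints {'paddle.fluid.layers.uniform_random'}
print(collect_called_interfaces("out = paddle.fluid.layers.uniform_random(shape=[1, 2])"))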
The application program interface invoked in the first code and the second code may include: application program interface name, parameter list, parameter default values, and interface comments.
The following is code of one exemplary application program interface:
paddle.fluid.layers.uniform_random(shape, dtype='float32', min=-1.0, max=1.0, seed=0)
This OP initializes a Tensor with random values sampled from a uniform distribution within the range [min, max).
Example 1:
given:
shape=[1,2]
the output is:
result=[[0.8505902,0.8397286]]
where "pad.fluid.filters.units_random" represents the interface name of the exemplary application program interface; "shape", "dtype", "min", "max" and "seed" represent parameter lists of the exemplary application program interface, wherein 'float32', -1.0,1.0,0 represent parameter default values of the respective parameters of the exemplary application program interface, respectively; the remaining portion of code represents interface comments of the exemplary application program interface.
According to some embodiments, the first string file comprises a string corresponding to the interface name and parameter list of each of the at least one first application program interface, and the second string file comprises a string corresponding to the interface name and parameter list of each of the at least one second application program interface. The terms "first application program interface" and "second application program interface" are mainly used to distinguish whether the corresponding application program interface belongs to the first deep learning framework system or the second deep learning framework system, and do not necessarily denote different application program interfaces. For example, the application program interface named "paddle.fluid.layers.uniform_random" may exist both in the first code associated with the first deep learning framework system and in the second code associated with the second deep learning framework system.
When the character string file contains only the character strings corresponding to the interface names and parameter lists of the application program interfaces, information related to the application program interfaces can be acquired quickly and subsequent judgments can be made, which is especially advantageous in the case where the two character string files are judged not to be identical.
According to some embodiments, the first string file and the second string file may further include parameter default values of the corresponding application program interface, respectively. Different parameter defaults may cause the performance of the deep learning framework system to change, thereby causing incompatibility of the new and old systems. Therefore, the parameter default value is considered when comparing the character string files, which is helpful for improving the accuracy of the system compatibility evaluation.
According to some embodiments, the first string file and the second string file may further include interface annotations of the corresponding application program interfaces, respectively. It will be appreciated that annotations help the user select and invoke the appropriate application program interface. Thus, a change in an annotation may cause a user's choice of application program interface to differ between the new and old versions, thereby affecting the compatibility and usability of the system. Considering annotations when comparing the character string files therefore helps to further improve the accuracy of the system compatibility evaluation.
Thus, for each application program interface, a string may be generated in the order of the interface name of the application program interface, the list of parameters of the application program interface, the parameter default value, and the comment. Of course, the present disclosure is not limited to this order, and other orders may be set in advance.
In this way, the character strings corresponding to all the application program interfaces contained in the first code and in the second code can be obtained respectively, and these character strings are then combined to generate the corresponding character string files.
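As a minimal sketch, assuming the interfaces are importable Python callables (the function names below are illustrative, not part of the disclosure), such strings might be assembled and written out with the standard inspect module:
import inspect

def interface_to_string(qualified_name, func):
    # One line per interface: interface name, parameter list, parameter default values.
    spec = inspect.getfullargspec(func)
    return "%s(args=%s, defaults=%s)" % (qualified_name, spec.args, spec.defaults)

def write_string_file(interfaces, path):
    # 'interfaces' maps qualified interface names to the corresponding callables;
    # lines are sorted so the resulting file is stable between runs.
    with open(path, "w") as f:
        for name in sorted(interfaces):
            f.write(interface_to_string(name, interfaces[name]) + "\n")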
According to some embodiments, a cryptographic hash function may be used to calculate the string corresponding to a parameter default value and/or an interface annotation. In other words, the character string corresponding to the application program interface does not directly include the parameter default value and/or the annotation content; instead, these are processed with the cryptographic hash function, and the resulting hash string, which represents the corresponding parameter default value and/or annotation, is added to the character string representing the application program interface. For example, by invoking a cryptographic hashing algorithm to reduce the annotations of the application program interfaces in the first code and the second code to hash strings, a change in an annotation of the second code relative to an application program interface can be detected. This is particularly advantageous when the annotation content is large: by reducing large explanatory texts to their corresponding cryptographic hash values, the annotation content can be taken into account comprehensively while the corresponding character string files are still obtained efficiently when judging the compatibility of the deep learning framework system. In addition, this also simplifies the subsequent comparison of the character string files. It will be appreciated that applying a cryptographic hash function to other content of the application program interface is also contemplated.
According to some embodiments, the message digest algorithm (MD5) is invoked to perform the hashing operation described above. The present embodiment is not limited to a specific algorithm. For example, the character string corresponding to the above exemplary application program interface may be represented as follows:
paddle.fluid.layers.uniform_random (ArgSpec(args=['shape', 'dtype', 'min', 'max', 'seed'], defaults=('float32', -1.0, 1.0, 0)), ('document', '6de6775d9e885056e764982130cfd'))
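By way of a minimal sketch (assuming Python callables and using the standard hashlib module; the function names are illustrative), the annotation hash appended to such a record might be computed as follows:
import hashlib
import inspect

def annotation_hash(func):
    # Reduce the interface's annotation (its docstring) to an MD5 hex digest.
    doc = inspect.getdoc(func) or ""
    return hashlib.md5(doc.encode("utf-8")).hexdigest()

def interface_record(qualified_name, func):
    # Assemble one record line: name, parameter list, defaults, annotation hash.
    spec = inspect.getfullargspec(func)
    return "%s (ArgSpec(args=%s, defaults=%s), ('document', '%s'))" % (
        qualified_name, spec.args, spec.defaults, annotation_hash(func))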
Furthermore, it should be understood that in the above steps 101 and 102, the operations related to the first code and the second code may be independent of each other, and the respective operations may be completed simultaneously or in different time sequences.
In step 103, the first string file is compared with the second string file.
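As a minimal sketch of one possible comparison (assuming each line of a character string file begins with the interface name followed by '('; the function name is illustrative), the two files might be compared as follows:
def compare_string_files(first_path, second_path):
    # Load each file as a mapping from interface name to its full record line.
    def load(path):
        records = {}
        with open(path) as f:
            for line in f:
                name = line.split("(", 1)[0].strip()  # interface name precedes '('
                records[name] = line.strip()
        return records

    first, second = load(first_path), load(second_path)
    removed = sorted(set(first) - set(second))
    added = sorted(set(second) - set(first))
    changed = sorted(n for n in set(first) & set(second) if first[n] != second[n])
    return removed, added, changed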
Part or all of the execution subjects of steps 101 to 103 may be an application located in the local terminal, or may be a functional unit such as a plug-in unit or a software development kit (Software Development Kit, SDK) provided in the application located in the local terminal, or may be a processing engine located in a server on the network side, or may be a distributed system on the network side, for example, a processing engine or a distributed system in a test platform on the network side, which is not particularly limited in this embodiment. The application may be a native program (native app) installed on the terminal, or may also be a web page program (webApp) of a browser on the terminal, which is not limited in this embodiment.
In step 104, compatibility of the second deep learning framework system is evaluated based on the comparison result.
It will be appreciated that the second deep learning framework system may be more likely to be compatible with the first deep learning framework system when the first string file and the second string file are identical than when they are not. In addition, by comparing whether the character string file related to at least one application program interface in the second code is the same as the character string file related to at least one application program interface in the first code, the modifications made to the application program interfaces in the second code can be determined, which helps improve the consistency and stability of the application program interfaces of the second code and makes them easier to upgrade, maintain and expand.
Fig. 2 is a flowchart illustrating a method for evaluating deep learning framework system compatibility according to another exemplary embodiment.
Steps 201 to 203 may correspond to steps 101 to 103 described based on fig. 1, respectively, and will not be described here again.
Based on the result of the comparison, the method proceeds as follows: in response to determining that the first string file is not identical to the second string file, compatibility of the second deep learning framework system is evaluated at a first audit level; and in response to determining that the first string file is identical to the second string file, compatibility of the second deep learning framework system is evaluated at a second audit level, wherein the requirement of the second audit level is lower than the requirement of the first audit level.
Any of the following non-exhaustive scenarios may cause the first string file and the second string file to differ: 1) the total number of APIs in the string file increases or decreases; 2) the name of one or more APIs changes; 3) the parameter names of one or more APIs change; 4) the parameter list of one or more APIs grows or shrinks; 5) the parameter default values of one or more APIs change; or 6) the annotations of one or more APIs change.
In step 204, when it is determined that the first string file is not identical to the second string file, compatibility of the second deep learning framework system is evaluated at the first audit level. The first audit level assessment may be performed, for example, by personnel with a high level of expertise. Such senior personnel can judge more quickly and accurately the reason why the first string file differs from the second string file, and further judge whether that reason affects the compatibility of the second deep learning framework system. For example, although a change in a parameter default value of an application program interface will cause the first string file to differ from the second, if the changed parameter relates only to computation speed, the change will only affect the performance of the system and will not cause a compatibility problem.
In step 205, when it is determined that the first string file is identical to the second string file, compatibility of the second deep learning framework system is evaluated at the second audit level. The second audit level is less demanding than the first audit level. The second audit level assessment may be performed, for example, by personnel with relatively less specialized expertise.
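As a minimal sketch of this routing (the function name and the returned labels are illustrative):
def required_audit_level(first_path, second_path):
    # Identical character string files can go to the lighter second audit level;
    # any difference is escalated to the stricter first audit level.
    with open(first_path) as f1, open(second_path) as f2:
        identical = f1.read() == f2.read()
    return "second audit level" if identical else "first audit level"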
According to the embodiments of the present disclosure, character string files related to the application program interfaces are generated for the first code and the second code respectively, the audit task is graded and assigned based on whether the two character string files are identical, and the compatibility of the code under test is evaluated at different audit levels, thereby balancing the reliability and efficiency of the overall evaluation process.
In addition, it is contemplated that the first code and the second code may each be subjected to automatic test verification before the operations according to the embodiments of the present disclosure are performed on them, thereby eliminating obvious errors in the code and further improving the efficiency of the subsequent comparison step.
It can be seen that the method of evaluating deep learning framework system compatibility provided by the present disclosure may have one or more of the following advantages: 1) the operation is simple; 2) the development iteration efficiency of the whole system is improved, so that the overall software development system has stronger extensibility and maintainability; 3) the code quality of the whole system, and particularly of its many application program interfaces, is improved, reducing possible future technical problems. Specifically, by comparing whether the character string files of the application program interfaces based on the first code and the second code are the same and assigning audit tasks at the corresponding level, consistency and stability between the application program interfaces of the second code and those of the first code are ensured. This avoids the incompatibility that arises when the application program interfaces of subsequent code become inconsistent with those of the first code, which would make it difficult for users to upgrade, and avoids interfaces being discarded or modified in a later upgrade stage because they have become unusable, which would again make it difficult for users to upgrade.
FIG. 3 is a block diagram illustrating an apparatus for evaluating deep learning framework system compatibility according to an example embodiment.
The apparatus 300 for evaluating the compatibility of a deep learning framework system according to the exemplary embodiment may include: a receiving unit 301, an identifying unit 302, a generating unit 303, a comparing unit 304, and an evaluating unit 305. The receiving unit 301 may be configured to obtain a first code associated with the first deep learning framework system and a second code associated with the second deep learning framework system, wherein the second code is an iteration of the first code. The identifying unit 302 may be used to identify at least one first application program interface (API) that is invoked in the first code and to identify at least one second application program interface that is invoked in the second code. The generating unit 303 may be configured to generate a first string file corresponding to the at least one first application program interface and to generate a second string file corresponding to the at least one second application program interface. The comparing unit 304 may be used to compare whether the first string file and the second string file are identical. The evaluating unit 305 may be configured to evaluate the compatibility of the second deep learning framework system based on the result of this comparison.
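Purely for illustration (the class and method names below are hypothetical and not part of the disclosure), the described units might be organized along the following lines:
class CompatibilityEvaluationApparatus:
    def receive(self, first_code, second_code):
        # Receiving unit: hold the two versions of the code under evaluation.
        self.first_code, self.second_code = first_code, second_code

    def identify(self, code):
        # Identifying unit: return the interfaces invoked in the given code
        # (for example, using the ast-based collection sketched earlier).
        raise NotImplementedError

    def generate(self, interfaces):
        # Generating unit: one character string per interface, one line each.
        return "\n".join(sorted(interfaces))

    def compare(self, first_file, second_file):
        # Comparison unit: report whether the two character string files match.
        return first_file == second_file

    def evaluate(self, identical):
        # Evaluation unit: route to the stricter audit level when the files differ.
        return "second audit level" if identical else "first audit level"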
According to some embodiments, the evaluation unit is further configured to: in response to determining that the first string file is not identical to the second string file, evaluate compatibility of the second deep learning framework system at a first audit level; and in response to determining that the first string file is identical to the second string file, evaluate compatibility of the second deep learning framework system at a second audit level, wherein the requirement of the second audit level is lower than the requirement of the first audit level.
It should be understood that the description of the method steps described above in connection with fig. 1 and 2 is equally applicable to the unit performing the corresponding method steps in fig. 3, and will not be repeated here.
According to one aspect of the present disclosure, there is also provided an electronic device, which may include: a processor; and a memory storing a program comprising instructions that when executed by the processor cause the processor to perform the method for evaluating deep learning framework system compatibility described above.
According to another aspect of the present disclosure, there is also provided a computer readable storage medium storing a program comprising instructions that, when executed by a processor of an electronic device, cause the electronic device to perform the above-described method for evaluating deep learning framework system compatibility.
With reference to FIG. 4, a computing device 2000 will now be described as an example of an electronic device that may be applied to aspects of the present disclosure. The computing device 2000 may be any machine configured to perform processes and/or calculations and may be, but is not limited to, a workstation, a server, a desktop computer, a laptop computer, a tablet computer, a personal digital assistant, a robot, a smart phone, an on-board computer, or any combination thereof. The method for evaluating deep learning framework system compatibility described above may be implemented in whole or at least in part by the computing device 2000 or a similar device or system.
The computing device 2000 may include elements that are connected to the bus 2002 (possibly via one or more interfaces) or that communicate with the bus 2002. For example, the computing device 2000 may include a bus 2002, one or more processors 2004, one or more input devices 2006, and one or more output devices 2008. The one or more processors 2004 may be any type of processor and may include, but are not limited to, one or more general-purpose processors and/or one or more special-purpose processors (e.g., special processing chips). The input device 2006 may be any type of device capable of inputting information to the computing device 2000 and may include, but is not limited to, a mouse, a keyboard, a touch screen, a microphone, and/or a remote control. The output device 2008 may be any type of device capable of presenting information and may include, but is not limited to, a display, speakers, video/audio output terminals, vibrators, and/or printers. The computing device 2000 may also include, or be connected to, a non-transitory storage device 2010, which may be any storage device that is non-transitory and enables data storage, including but not limited to a magnetic disk drive, an optical storage device, solid-state memory, a floppy disk, a flexible disk, a hard disk, a magnetic tape or any other magnetic medium, an optical disk or any other optical medium, a ROM (read-only memory), a RAM (random access memory), a cache memory, and/or any other memory chip or cartridge, and/or any other medium from which a computer may read data, instructions, and/or code. The non-transitory storage device 2010 may be detachable from an interface. The non-transitory storage device 2010 may have data/programs (including instructions)/code for implementing the methods and steps described above. The computing device 2000 may also include a communication device 2012. The communication device 2012 may be any type of device or system that enables communication with external devices and/or with a network, and may include, but is not limited to, a modem, a network card, an infrared communication device, a wireless communication device, and/or a chipset, such as Bluetooth™ devices, 802.11 devices, WiFi devices, WiMax devices, cellular communication devices, and/or the like.
Computing device 2000 may also include a working memory 2014, which may be any type of working memory that may store programs (including instructions) and/or data useful for the operation of processor 2004 and may include, but is not limited to, random access memory and/or read-only memory devices.
Software elements (programs) may reside in the working memory 2014, including, but not limited to, an operating system 2016, one or more application programs 2018, drivers, and/or other data and code. Instructions for performing the above-described methods and steps may be included in the one or more application programs 2018, and the above-described method for evaluating deep learning framework system compatibility may be implemented by the instructions of the one or more application programs 2018 being read and executed by the processor 2004. More specifically, in the above-described method, steps 101 to 103 may be implemented, for example, by the processor 2004 executing the application program 2018 having the instructions of steps 101 to 103. Further, other steps of the above-described method may be implemented, for example, by the processor 2004 executing the application program 2018 having instructions to perform the corresponding steps. Executable code or source code of the instructions of the software elements (programs) may be stored in a non-transitory computer-readable storage medium (such as the storage device 2010 described above) and, when executed, may be stored (possibly after being compiled and/or installed) in the working memory 2014. Executable code or source code of the instructions of the software elements (programs) may also be downloaded from a remote location.
It should also be understood that various modifications may be made according to specific requirements. For example, custom hardware may also be used, and/or particular elements may be implemented in hardware, software, firmware, middleware, microcode, hardware description languages, or any combination thereof. For example, some or all of the disclosed methods and apparatus may be implemented by programming hardware (e.g., programmable logic circuits including Field Programmable Gate Arrays (FPGAs) and/or Programmable Logic Arrays (PLAs)) in an assembly language or a hardware programming language such as VERILOG, VHDL, or C++, using logic and algorithms according to the present disclosure.
It should also be appreciated that the foregoing method may be implemented by a server-client mode. For example, a client may receive data entered by a user and send the data to a server. The client may also receive data input by the user, perform a part of the foregoing processes, and send the processed data to the server. The server may receive data from the client and perform the aforementioned method or another part of the aforementioned method and return the execution result to the client. The client may receive the result of the execution of the method from the server and may present it to the user, for example, via an output device.
It should also be appreciated that the components of computing device 2000 may be distributed over a network. For example, some processes may be performed using one processor while other processes may be performed by another processor remote from the one processor. Other components of computing system 2000 may also be similarly distributed. As such, computing device 2000 may be construed as a distributed computing system that performs processing in multiple locations.
Although embodiments or examples of the present disclosure have been described with reference to the accompanying drawings, it is to be understood that the foregoing methods, systems, and apparatus are merely exemplary embodiments or examples, and that the scope of the present invention is not limited by these embodiments or examples but only by the granted claims and their equivalents. Various elements of the embodiments or examples may be omitted or replaced with equivalent elements. Furthermore, the steps may be performed in an order different from that described in the present disclosure. Further, the various elements of the embodiments or examples may be combined in various ways. Importantly, as technology evolves, many of the elements described herein may be replaced by equivalent elements that appear after this disclosure.

Claims (14)

1. A method for evaluating deep learning framework system compatibility, the method comprising:
obtaining a first code associated with a first deep learning framework system and a second code associated with a second deep learning framework system, wherein the second code is an iteration of the first code;
identifying at least one first application program interface called in the first code, and generating a first character string file corresponding to the at least one first application program interface, wherein the first character string file comprises character strings corresponding to interface names and parameter lists of each first application program interface in the at least one first application program interface;
identifying at least one second application program interface called in the second code, and generating a second character string file corresponding to the at least one second application program interface, wherein the second character string file comprises character strings corresponding to interface names and parameter lists of each second application program interface in the at least one second application program interface;
comparing the first character string file with the second character string file; and
based on the result of the comparison, compatibility of the second deep learning framework system is evaluated.
2. The method for evaluating compatibility of a deep learning framework system of claim 1, wherein the evaluating compatibility of the second deep learning framework system comprises:
in response to determining that the first string file is not identical to the second string file, evaluating compatibility of the second deep learning framework system at a first audit level; and
in response to determining that the first string file is identical to the second string file, evaluating compatibility of the second deep learning framework system at a second audit level, wherein,
the second audit level is less demanding than the first audit level.
3. The method for assessing the compatibility of a deep learning framework system of claim 1 wherein,
the first string file further includes a string corresponding to a parameter default value of each of the at least one first application program interface; and wherein,
the second string file further includes a string corresponding to a parameter default value of each of the at least one second application program interface.
4. The method for assessing the compatibility of a deep learning framework system of claim 3 wherein,
the first string file further includes a string corresponding to an interface annotation of each of the at least one first application program interface; and wherein,
the second string file also includes a string corresponding to an interface annotation of each of the at least one second application program interface.
5. The method for assessing the compatibility of a deep learning framework system of claim 4 wherein,
a string corresponding to at least one of the parameter default value and the interface annotation is calculated using a cryptographic hash function.
6. The method for evaluating deep learning framework system compatibility of claim 5 wherein the cryptographic hash function uses a message digest algorithm MD5.
7. An apparatus for evaluating deep learning framework system compatibility, the apparatus comprising:
a receiving unit configured to acquire a first code related to a first deep learning framework system and a second code related to a second deep learning framework system, wherein the second code is an iteration of the first code;
an identifying unit, configured to identify at least one first application program interface that is invoked in the first code and identify at least one second application program interface that is invoked in the second code;
a generating unit configured to generate a first string file corresponding to the at least one first application program interface and generate a second string file corresponding to the at least one second application program interface, where the first string file includes a string corresponding to an interface name and a parameter list of each of the at least one first application program interface, and the second string file includes a string corresponding to an interface name and a parameter list of each of the at least one second application program interface;
a comparison unit for comparing the first character string file and the second character string file; and
and an evaluation unit configured to evaluate compatibility of the second deep learning framework system based on a result of the comparison.
8. The apparatus for evaluating deep learning framework system compatibility of claim 7, wherein the evaluation unit is configured to:
in response to determining that the first string file is not identical to the second string file, evaluating compatibility of the second deep learning framework system at a first audit level; and
in response to determining that the first string file is identical to the second string file, evaluating compatibility of the second deep learning framework system at a second audit level, wherein,
the second audit level is less demanding than the first audit level.
9. The apparatus for evaluating compatibility of a deep learning framework system of claim 8 wherein,
the first string file further includes a string corresponding to a parameter default value of each of the at least one first application program interface; and wherein,
the second string file further includes a string corresponding to a parameter default value of each of the at least one second application program interface.
10. The apparatus for evaluating compatibility of a deep learning framework system of claim 9 wherein,
the first string file further includes a string corresponding to an interface annotation of each of the at least one first application program interface; and wherein,
the second string file also includes a string corresponding to an interface annotation of each of the at least one second application program interface.
11. The apparatus for evaluating deep learning framework system compatibility of claim 10, wherein the generation unit is configured to:
a string corresponding to at least one of the parameter default value and the interface annotation is calculated using a cryptographic hash function.
12. The apparatus for evaluating deep learning framework system compatibility of claim 11 wherein the cryptographic hash function uses a message digest algorithm MD5.
13. An electronic device, comprising:
a processor; and
a memory storing a program comprising instructions that when executed by the processor cause the processor to perform the method for evaluating deep learning framework system compatibility of any one of claims 1 to 6.
14. A computer readable storage medium storing a program comprising instructions that when executed by a processor of an electronic device cause the electronic device to perform the method for evaluating deep learning framework system compatibility of any one of claims 1-6.
CN202010153719.4A 2020-03-06 2020-03-06 Method, device, electronic equipment and storage medium for evaluating compatibility of deep learning framework system Active CN111290785B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010153719.4A CN111290785B (en) 2020-03-06 2020-03-06 Method, device, electronic equipment and storage medium for evaluating compatibility of deep learning framework system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010153719.4A CN111290785B (en) 2020-03-06 2020-03-06 Method, device, electronic equipment and storage medium for evaluating compatibility of deep learning framework system

Publications (2)

Publication Number Publication Date
CN111290785A (en) 2020-06-16
CN111290785B 2023-06-06

Family

ID=71026958

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010153719.4A Active CN111290785B (en) 2020-03-06 2020-03-06 Method, device, electronic equipment and storage medium for evaluating compatibility of deep learning framework system

Country Status (1)

Country Link
CN (1) CN111290785B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112101571A (en) * 2020-09-25 2020-12-18 北京百度网讯科技有限公司 Method and device for monitoring operator compatibility under deep learning framework
CN112257856A (en) * 2020-12-18 2021-01-22 鹏城实验室 Deep learning framework determination method and device and readable storage medium
CN116257286B (en) * 2023-03-13 2023-09-15 北京百度网讯科技有限公司 File processing method and device, electronic equipment and storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110018820A (en) * 2019-04-08 2019-07-16 浙江大学滨海产业技术研究院 A method of the Graph2Seq based on deeply study automatically generates Java code annotation
CN110688098A (en) * 2019-09-02 2020-01-14 深圳壹账通智能科技有限公司 Method and device for generating system framework code, electronic equipment and storage medium
CN110709816A (en) * 2017-06-03 2020-01-17 苹果公司 Integrating learning models into software development systems

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102103537A (en) * 2009-12-17 2011-06-22 珠海市君天电子科技有限公司 Method and device for finding compatibility problem among safety software
US10540606B2 (en) * 2014-06-30 2020-01-21 Amazon Technologies, Inc. Consistent filtering of machine learning data
WO2019028269A2 (en) * 2017-08-02 2019-02-07 Strong Force Iot Portfolio 2016, Llc Methods and systems for detection in an industrial internet of things data collection environment with large data sets
AU2017272252A1 (en) * 2016-06-08 2018-01-04 Accenture Global Solutions Limited Resource evaluation for complex task execution
WO2018094099A1 (en) * 2016-11-17 2018-05-24 The Mathworks, Inc. Systems and methods for automatically generating code for deep learning systems
CN108170468B (en) * 2017-12-28 2021-04-20 中山大学 Method and system for automatically detecting annotation and code consistency
CN110162556A (en) * 2018-02-11 2019-08-23 陕西爱尚物联科技有限公司 A kind of effective method for playing data value
CN109376041A (en) * 2018-09-19 2019-02-22 广州优亿信息科技有限公司 A kind of Benchmark test system and its workflow for AI chip for cell phone
CN109871686A (en) * 2019-01-31 2019-06-11 中国人民解放军战略支援部队信息工程大学 Rogue program recognition methods and device based on icon representation and software action consistency analysis

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110709816A (en) * 2017-06-03 2020-01-17 苹果公司 Integrating learning models into software development systems
CN110018820A (en) * 2019-04-08 2019-07-16 浙江大学滨海产业技术研究院 A method of the Graph2Seq based on deeply study automatically generates Java code annotation
CN110688098A (en) * 2019-09-02 2020-01-14 深圳壹账通智能科技有限公司 Method and device for generating system framework code, electronic equipment and storage medium

Also Published As

Publication number Publication date
CN111290785A (en) 2020-06-16

Similar Documents

Publication Publication Date Title
US10481884B2 (en) Systems and methods for dynamically replacing code objects for code pushdown
US10289959B2 (en) Artificial intelligence and knowledge based automation enhancement
CN111290785B (en) Method, device, electronic equipment and storage medium for evaluating compatibility of deep learning framework system
US8706771B2 (en) Systems and methods for analyzing and transforming an application from a source installation to a target installation
US8997065B2 (en) Automatic modularization of source code
US9323851B1 (en) Collaborative modeling environment
US11327874B1 (en) System, method, and computer program for orchestrating automatic software testing
US11263113B2 (en) Cloud application to automatically detect and solve issues in a set of code base changes using reinforcement learning and rule-based learning
US11886792B1 (en) Model documentation generation system
US11681607B2 (en) System and method for facilitating performance testing
US11698825B2 (en) Application programming interface compatibility
US9507592B2 (en) Analysis of data integration job
US10733540B2 (en) Artificial intelligence and knowledge based automation enhancement
US11394668B1 (en) System and method for executing operations in a performance engineering environment
US11636022B2 (en) Server and control method thereof
US11249749B2 (en) Automatic generation of configuration files
US9250870B2 (en) Automated creation of shim programs and interfaces
CA3106998C (en) System and method for executing operations in a performance engineering environment
US11714743B2 (en) Automated classification of defective code from bug tracking tool data
US20230297880A1 (en) Cognitive advisory agent
CA3107004C (en) System and method for facilitating performance testing
US20240126976A1 (en) Model documentation generation system
JP2024052547A (en) Version update recommendations for software packages
CN117435490A (en) Test script data filling method, device, computer equipment and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant